Mindfulness Versus Positive Evaluation
Abstract
Mindfulness is a flexible state of mind that is characterized by openness to novelty, sensitivity to context, and engagement with the present moment. In this chapter, mindfulness is proposed to be a critical factor in determining individual performance and shaping our learning experiences. In particular, mindfulness appears to be crucial in helping us deal with the inevitable uncertainties in our lives and environments. The role that mindfulness plays in shaping evaluations, both positive and negative, is also discussed. Finally, it is suggested that the mindful individual is likely to choose to be positive and will experience both the advantages of positivity and the advantages of perceived control for well-being.
Life is a battle. On this point optimists and pessimists agree. Evil is insolent and strong; beauty enchanting but rare; goodness very apt to be weak; folly very apt to be defiant; wickedness to carry the day; imbeciles to be in very great places, people of sense in small, and mankind generally, unhappy. … In this there is mingled pain and delight, but over the mysterious mixture there hovers a visible rule, that bids us learn to will and seek to understand.
What is considered evil, beautiful, good, folly, and wickedness are products of our mind. It is surely easier to be happy living in a world full of beauty and goodness. Just as surely, it is easier to be happy if we think these things of ourselves. This chapter will consider the ways our mindless use of evaluation, be it positive or negative, leads to our unhappiness; the direct effects of mindfulness on happiness; and why teaching mindfulness may reap more benefits than trying to teach people to be positive. Most will agree that pessimism is virtually synonymous with unhappiness. What might be worth considering is how positive evaluations may lead to the same result.
Before proceeding, however, it is important to take at least a brief look at what mindfulness is and is not: It is a flexible state of mind—an openness to novelty, a process of actively drawing novel distinctions. When we are mindful, we become sensitive to context and perspective; we are situated in the present. When we are mindless, we are trapped in rigid mind-sets, oblivious to context or perspective. When we are mindless, our behavior is governed by rule and routine. In contrast, when we are mindful, our behavior may be guided rather than governed by rules and routines. Mindfulness is not vigilance or attention when what is meant by those concepts is a stable focus on an object or idea. When mindful, we are actively varying the stimulus field. It is not controlled processing (e.g., 31 × 267), in that mindfulness requires or generates novelty. Mindlessness is not habit, although habit is mindless. Mindlessness need not arise as a function of repeated experience. As demonstrated subsequently, mindlessness may come about on a single exposure to information.
For those of us who learned to drive many years ago, we were taught that if we needed to stop the car on a slippery surface, the safest way was to slowly, gently pump the brake. Today most new cars have antilock brakes. To stop on a slippery surface, now the safest thing to do is to step on the brake firmly and hold it down. Most of us caught on ice will still gently pump the brakes. What was once safe is now dangerous. The context has changed, but our behavior remains the same.
Much of the time we are mindless. Of course we are unaware when we are in that state of mind because we are “not there” to notice. To notice, we would have had to have been mindful. Yet over 25 years of research reveals that mindlessness may be very costly to us. In these studies we have found that an increase in mindfulness results in greater competence, health and longevity, positive affect, creativity, and charisma and reduced burnout, to name a few of the findings (see Langer, 1989, 1997).
Mindlessness comes about in two ways: either through repetition or on a single exposure to information. The first case is the more familiar. Most of us have had the experience, for example, of driving and then realizing, only because of the distance we have come, that we made part of the trip on “automatic pilot,” as we sometimes call mindless behavior. Another example of mindlessness through repetition is when we learn something by practicing it so that it becomes like “second nature” to us. We try to learn the new skill so well that we do not have to think about it. The problem is that if we have been successful, it will not occur to us to think about it even when it would be to our advantage to do so.
Whether we become mindless over time or on initial exposure to information, we unwittingly lock ourselves into a single understanding of that information. For example, I learned that horses do not eat meat. I was at an equestrian event, and someone asked me to watch his horse while he went to get the horse a hot dog. I shared my fact with him. I learned the information in a context-free, absolute way and never thought to question when it might or might not be true. This is the way we learn most things. It is why we are frequently in error but rarely in doubt. He brought back the hot dog. The horse ate it.
When information is given by an authority, appears irrelevant, or is presented in absolute language, it typically does not occur to us to question it. We accept it and become trapped in the mind-set, oblivious to how it could be otherwise. Authorities are sometimes wrong or overstate their case, and what is irrelevant today may be relevant tomorrow. When do we want to close the future? Moreover, virtually all the information we are given is presented to us in absolute language. As such, we tend to mindlessly accept it. Too often we mindlessly learn what we should love, hate, fear, respect, and so forth. Our learned emotional responses to people, things, ideas, and even ourselves control our well-being. Yet many of these responses are taken at face value. It seems easier that way than to question the underlying values and premises on which our evaluations are built.
Mindfulness, Uncertainty, and Automatic Behavior
Most aspects of our culture currently lead us to try to reduce uncertainty: We learn so that we will know what things are. In this endeavor we confuse the stability of our mind-sets with the stability of the underlying phenomena. We hold things still purportedly to feel in control, yet because they are always changing, we give up the very control we seek. Instead of looking for invariance, perhaps we should consider exploiting the power of uncertainty so that we can learn what things can become rather than what they are. If we made a universal attribution for our uncertainty, rather than a personal attribution for incompetence, much of the stress and incompetence we experience would diminish. Mindfulness, characterized by novel distinction-drawing, leads us in this direction. It makes clear that things change and loosens the grip of our evaluative mind-sets so that these changes need not be feared.
A large body of experimental studies, including our own, makes a cogent case for the automaticity of most human behavior (see Bargh & Chartrand, 1999). The costs of the unconsidered nature of most social behavior are either overlooked or weighed against presumed benefits. The argument given for these benefits can be broken up into a normative part and a descriptive part. Neither is unproblematic. The normative part of the argument is a classic "resource constraint" proposition. Because cognitive work is costly, as the argument goes, cognitive commitments to the values and perspectives that we will bring to bear on a particular predicament are efficient. Such cognitive behavior is well adapted to the circumstances of having to react to an environment that requires quick, decisive action. But for cognitive commitments, we would be "stuck," just like the ass in the medieval tale of Buridan, whose obsessive dithering between two stacks of hay leads him to starve to death. The alternative, a mindful engagement with the situation, is erroneously believed to lead to "analysis paralysis," which stifles decisive, purposive action.
This argument only works, however, if we accept that the environment is static and our understanding of it complete, or that we have discovered the “one best way” to deal with all possible eventualities. Both of these assumptions, however, are unrealistic. In our world, a world that is constantly changing in unpredictable ways, letting our beliefs die in our stead (Popper, 1973) is the hallmark of the successful individual. To be mindless is to close the future. At what point do we want to do this? Twentieth-century writing in epistemology teaches that scientific theories and models are regularly replaced by successors (Popper, 1959) whose premises are radically different from those of the incumbent theories. The succession of “paradigms” of scientific knowledge does not follow a path of “linear progress” toward more truthlike theories over time (Kuhn, 1981; Miller, 1994). Theories—or the models of the world or cognitive schemata that people use in order to choose between different courses of action—regularly change in fundamental ways, and the hallmark of rationality is not being able to salvage a theory from apparent refutation by the addition of fortifying hypotheses, but rather the ability to specify the conditions under which a theory will be abandoned (Lakatos, 1970).
Certainly, it is no less important for the individual to question her theories than it is for science. When information is processed mindlessly, the potential for reconsideration is abandoned. This typically happens by default and not design, so that even if it were to the individual's advantage to question her theories, it will not occur to her to do so. Bargh and his colleagues focus on the fact that the environment often requires “prompt” action (Bargh & Chartrand, 1999). What they fail to consider is that “adaptive,” and therefore changing, actions also are required. Now, the emphasis is no longer on cognitive precommitments that aid actionability but rather on the ability to act while remaining open to the possibility that the theory on which the action is predicated may shortly be supplanted by a different theory. But if we are open to a potentially new theory, how can we take action? At this point, some are tempted to say analysis can paralyze us. Analysis paralysis, however, only follows if we assume that there is some level of analysis at which we may be able to identify a best theory. In this case we would keep searching for the “right” decision. Otherwise, there is no reason to be paralyzed by the process of reframing and reinterpreting the environment in terms of new models, because we know that all models are ultimately mistaken or can be significantly improved upon. We can take action in the face of uncertainty. Even a seemingly unassailable theory like Newton's formulation of classical mechanics was abandoned by modern physicists in favor of the relativistic picture of space and time, in spite of the fact that Newton's theory may very well be salvaged from refutation by the addition of “fortifying” hypotheses (Lakatos, 1970).
The ability to refine one's theory—or to alter it dramatically in the face of new circumstances—seems to be critically dependent upon our ability to withhold judgment about the “best one.” We never abandon a “theory” because it has been refuted but rather because we have a better theory that has been more severely tested and has withstood those tests more competently (Lakatos, 1970).
Studies of learning behavior suggest that keeping multiple perspectives on the same phenomenon "alive" at any given time is critical to the process of learning from "experience." As an example of this, Thomas Kuhn (1981) noted in his analysis of Piaget's studies of the ways in which children learn about the concept of "speed" that being able to simultaneously hold the mental model of "speed as blurriness of the moving object" and the mental model of "speed as minimum time of the object to its destination" is critical to children's ability to make correct inferences about the rates of motion of moving objects. At the least, it is clear that learning is not likely to take place if we are closed to new information.
Our studies (e.g., Bodner, Waterfield, & Langer, 2000; Chanowitz & Langer, 1981; Langer, Hatem, Joss, & Howell, 1989; Langer & Piper, 1987) showing that conditionalized presentation of information ("x could be seen as y") to students leads to better "performance" of the students on subsequent tests than does the unconditionalized presentation of information ("x is nothing but y," or "x is y") lend support to the notion that successful adaptive behavior depends on the "loosening" of the grip that our cognitive commitments have on our minds.
The argument for automatic behavior also relies on the questionable belief that automatic behavior is faster and somehow "easier" for people to engage in. This deserves several comments. We might consider how often speed is really of the essence. To answer this, we may want to consider what the difference in speed is between mindless and mindful responses. We may produce the same response either mindfully or mindlessly, and when we choose the mindful route, the difference in speed is likely to be trivial. On this point, my original work failed to make clear that while mindlessness closes us off to change, we also cannot be in a constant state of mindfully drawing distinctions about everything at once. To argue that mindlessness is rarely if ever beneficial means that we do not want to close ourselves off to possibility. Instead, we want to be either specifically mindful with respect to some particular content or "potentially" mindful. We may not want to notice the myriad ways each corn flake is different from the others, for example, but we do not want to be so automatic in what we do notice that we fail to see the metal nut that slipped into the bowl, either. A mindful breakfast, then, can take the same time as a mindless breakfast.
Furthermore, the need for exceptional speed, where milliseconds might matter, as in swerving the car to avoid hitting a child, may be avoided altogether by mindful behavior. When mindful, we often avert the danger not yet arisen.
In fact, mindfulness can increase, rather than decrease, one's performance. Consider states of “flow” (Csikszentmihalyi, 1990), which are characterized by a decrease in the effort required to process information and by an “enjoyment” of the experience of performance, “without” a loss of engagement or of the sense of being-in-the-present. It is difficult to argue that people who were found most likely to experience states of flow—such as surgeons and musicians—are also most likely to be automatic in their processing of the stimuli with which they interact. Similarly, our studies on the prevention of mindlessness (see Langer, 1989, 1997) show that when people learned mindfully, they were more likely to enjoy the learning experience than were people presented with an unconditional version of the same information.
The proposition that automaticity is "easier" to engage in than is a conscious awareness of and engagement with the present rests on a faulty comparison, whereby "automatic" processing is contrasted with "controlled" processing of information (which by definition is effortful). Mindfulness is orthogonal to controlled processing (see Langer, 1992). In the former, one is actively engaged in drawing novel distinctions (e.g., when does 1 + 1 = 1?—when adding one wad of gum to one wad of gum); in the latter, one relies on distinctions previously drawn (e.g., as when we multiply 237 × 36). Thus, mindfulness may seem effortful when it is confused with controlled processing. Similarly, it may seem effortful when it is confused with stressful thinking. Events appear stressful when we are certain that a particular occurrence will necessarily lead to an outcome that is negative for us. It is hard to think about negative things happening to us. It is the mindless presumption that the outcome will be negative that is hard, not mindfulness. Perhaps the ease of mindfulness becomes apparent when we consider that when we are fully engaged in our work, just as when we are at play, we seek novelty rather than certainty. Indeed, humor itself relies on mindfulness (which is why a joke already heard and remembered, without being newly considered, is rarely funny). Mindfulness is not a cold cognitive process. We may be mindful when we simply notice our peaceful reactions to the world around us.
Most of us have the mind-set that practice makes perfect. We often take as a given that we should learn “the” basics of complex skills so well that we do not have to think about them again, so that we can go on to master the finer points of the task. In an earlier work (Langer, 1997), I raised the question “Who decided what ‘the’ basics are?” To the extent that the learner differs from whoever that decision maker was, it may be advantageous to question the basics so that we can take advantage of our idiosyncratic strengths. Such questioning is ruled out when we are mindless. For example, it seems odd that a very small hand should hold a tennis racket the very same way a very large hand should and that in either case, the way should remain unchanged despite the weight of the racket. Does it make sense to freeze our understanding of a task at the point when we know the least about it? It is unlikely that experts do this; instead, they question basics. Our data (e.g., Langer & Imber, 1979) suggest that mindless practice leads to imperfect performance.
The costs of mindlessness go beyond performance decrements (see Langer, 1989, 1997). Even if our world (personal, interpersonal, and impersonal) were governed by certainty, it would be to our advantage to "be there" to experience it. In an uncertain world, mindlessness sets us up potentially to incur costs every time things change.
Uncertainty keeps us situated in the present. The perception of uncertainty leads to mindfulness, and mindfulness, in turn, leads to greater uncertainty. As such, mindfulness leads to engagement with the task at hand. Being situated in the present and involved in what we are doing are two ways mindfulness enables us to be content. Moreover, by drawing novel distinctions, we become sensitive to perspective, and in so doing, we come to see that evaluation is a function of our view rather than an inherent part of the stimulus. Our mindlessness regarding evaluation is perhaps the greatest cause of our unhappiness.
Mindlessness and Evaluation
We take for granted that evaluations exist independently of us. Each day we think and feel and act as if people, objects, and events were good or bad in themselves. For example, potholes, tax collectors, and divorce are bad, whereas caviar, philanthropists, and holidays are good. But we are essentially mindless to the fact that we have accepted value judgments that we ourselves have attached to various events, objects, and states of the world. We find something pleasing or displeasing because we choose to see it in a particular way. Such judgments are in our control, yet too often we are oblivious to this fact.
Things "out there" are not self-evidently good or bad. Sometimes we say this (e.g., "One man's meat is another man's poison"), yet our everyday experience signals otherwise. Potholes make cars slow down; a tax collector can be someone's beloved husband; divorce can be the best outcome for the child living in unspoken tension. When we are not locked into fixed evaluations, we have far more control than we think over our well-being. We have control over the experience of the present. The prevalence of value judgments in our lives reveals nothing about the world, but much about our minds. We judge and evaluate in order to do the "good" thing, to have the "right" thing, or to do the "right" thing. The resulting feelings we identify with happiness. We are rarely immediately conscious of the purpose of our evaluations. Evaluation is something we use to make ourselves happy. As we shall see, however, the use of the evaluative mind-set is self-defeating, for it brings us unhappiness instead.
Many of our thoughts are concerned with whether what we or others are doing or thinking is good or bad. Evaluation is central to the way we make sense of our world, yet in most cases, evaluation is mindless. We say that there are two sides to the proverbial coin. Although we acknowledge that everything has advantages and disadvantages, we tend to treat things as good or bad in the balance. A more mindful approach would entail understanding not only that there are advantages and disadvantages to anything we may consider but that each disadvantage is “simultaneously” an advantage from a different perspective (and vice versa). With this type of mindful approach, virtually every unpleasant aspect of our lives could change.
"All behavior makes sense from the actor's perspective, or else the actor would not do it." This realization makes all negative evaluations of people suspect, and all action based on such evaluations of questionable worth. If we are trying to predict what others will do in the future, and we believe the past is the best predictor, then it would behoove us to know better what the action meant to the actor.
A frog is put into a pot of water. The pot is slowly heated. The frog keeps adjusting and finally dies. Another frog is put into a pot of water. The heat is turned on very high. The frog notices the change and jumps out of the pot. When things “drastically” change for us, we notice a difference. Up until that time, we accommodate our experience into the extant frame we are using, and we seem to do this even when it is to our disadvantage. It does not occur to us to consider that the situation, our behavior, or the behavior of other people may be understood differently from the way we originally framed it. If we did, we could take advantage of cues that are less extreme to avoid the “heat.”
Often negative evaluations lead us to give up. “Tomorrow will be better.” “It's always darkest before the dawn.” Implicit in these messages is the idea that we should give up the moment and accept that there are bad things.
Evaluation, positive or negative, is a state of mind. That does not mean that consequences are not real. It means that the number of consequences one could enumerate for any action are dependent on the individual's interest in noting them, and the evaluation of each of these consequences is dependent on the view taken of them. Events do not come with evaluations; we impose them on our experiences, and in so doing create our experience of the event. For pleasure, in winter Finns dip themselves in ice-cold water, and some Americans swim in the cold ocean; many watch horror movies and ride roller coasters for the purpose of becoming afraid.
Consider the following three different perspectives: (a) bad things are intolerable; (b) bad things happen, but if we just hold on, they will pass; and (c) bad things are context dependent—shift the context, and the evaluation changes. It is the third perspective that brings us most of what we currently value. Western culture currently teaches us only the second perspective. Even the saying "Every cloud has a silver lining" does not quite lead us to the third view. The implication here is that the bad thing will result in something good. Again we are expected to give up the moment and wait for it to pass, but now what will result is not just the passing of the bad but the arrival of something good. An optimist is said to be the one who, when surrounded by manure, knows there must be a pony in there somewhere. Again this is not what the third view is about. In this view there is an awareness that the very thing that is evaluated as negative is also positive. It is not that there may be five negative things and five positive—which surely is better than just seeing the negative—but that the ten things are both negative and positive, depending on the context we impose on them.
The previously noted cultural expressions are encouragement to “hope.” Typically the encouragement to hope implicitly regards the present as necessarily bad. It is fine to want tomorrow to be good and to expect that it will be. When this is what we mean by encouraging hope, there is no problem. All too often, however, words of hope are expressed when people are feeling bad and they indirectly are led to accept that set of feelings. It is not fine to passively give up today. Such giving up follows from the view that is implicitly reinforced by the previous statements that events themselves are good or bad, rather than that our views make them good or bad.
Although the culture encourages us to be able to "delay gratification," waiting is mindless in that it suggests that there is no way to enjoy what is being done at the moment. Mindless hoping and learned waiting both pull us out of the present. In one experiment aimed at testing these ideas, research participants were asked to evaluate cartoons, with the same task defined as either work or play. When it was framed as work, it was found to be unpleasant, leading participants' minds to wander as they tried to just get through it. Although it was the same task, their response to it was very different as a function of the way they viewed it (Snow & Langer, 1997).
The downside of evaluation to intrapersonal processes is prodigious. We try to get through the “bad” times; we hesitate to decide because the “negative” consequences may be overwhelming. We try to feel better by comparing ourselves with those “worse off.” We suffer guilt and regret because of the negative consequences we experience or have perpetrated on others. We lie because we see the negative aspects to our behavior and try to hide them from others. Each of these processes—social comparing, experiencing regret and guilt, and lying—implies that events are good or bad and that we must learn to accept them as they are and learn to deal with them, rather than to question our evaluation of them in the first place.
The implicit message given by the culture is that there is one yardstick by which to measure not just outcomes but ourselves and others. We look for new explanations only when all seems to fail. And, as with the frog, it may be too little, too late. For evaluation to be meaningful, we need to use a common metric. The problem enters when we are oblivious to the fact that many other potential yardsticks can be used, with very different results. The prevailing view in the coping literature is to allow a period of grieving so that the person can thereafter reengage in life-goal pursuits. Indeed, the very worst thing that one can do to persons who have just undergone a tragedy or loss is to have them see the situation differently (see Snyder, 1996). After loss, people may need to go through a period of grief and depression, and then after a period of time, goal-directed hopeful thought can be useful (see Feldman & Snyder, 2000). People who have undergone traumas want to be heard and have others listen to their “pains” rather than trying to “see” those pains differently. In this latter regard, friends or helpers will lose credibility as listeners if they become too prescriptive (Tennen & Affleck, 1999).
This view is not incompatible with the position being argued here. If the event is already negatively evaluated, it should be treated respectfully. Nevertheless, many “tragedies” initially could have been understood as opportunities at best, or inconveniences at worst.
The Multiple Meanings of Behavior
When the stories we tell ourselves are compelling and so much information seems to fit our interpretation, it is hard to understand why the other person just doesn't get it. And so we become evaluative. Presently, for many of us to feel right, someone else must be wrong. This dichotomous reasoning is the cause and consequence of an implicit acceptance of a single perspective. Behavior makes sense from the actor's perspective, or else it would not have occurred. I am right, and so are you. The task of successful interpersonal relating, then, may be to search for the information to make this point clear to us or simply accept that the behavior in question must have made sense.
Psychologists (e.g., Jones & Nisbett, 1972) have long described differences that result from the differences in perspective depending on whether one is responsible for some action, the actor, or whether one is an observer of that action. The findings suggest that as observers, we are more likely to attribute other people's behavior to dispositions and our own to situations. Situational attributions help keep us in the present. Dispositional attributions hold things still, presumably to enable us to predict the future. Because of our tendencies to confirm our hypotheses (Langer & Abelson, 1974), they instead may become self-fulfilling prophecies, creating a world less pleasant than it otherwise would be. Negative dispositional attributions keep us at a distance from people and thus reduce the chance to see that the attribution was wrong.
Past researchers have pointed out that behavior engulfs the field of our observation. Thus, as observers we see most clearly the action taken, while the situational constraints affecting those actions are less visible. As actors, those situational constraints are felt more keenly. As actors we often know why we had to do whatever we did. We also know that in other circumstances we have behaved differently. Observers usually do not have this information.
While the research on attribution theory has certainly yielded important findings, there is another factor that needs to be highlighted, one that has not yet been examined and that may account for even more of the interpersonal misunderstanding and concomitant unhappiness that people experience. Not only do people see different information depending on their vantage points and motivation, but, as implied earlier, people often see the "same" information differently. All of the behavior is accounted for, but with a different label that carries with it a very different evaluative tone. Consider, for example, serious versus grim, flexible versus unpredictable, spontaneous versus impulsive, private versus secretive, and so on. All behavior is vulnerable to labels connoting these different evaluative tones. If our behavior is mindlessly engaged, so that we are essentially oblivious to why we did whatever we did, then even as actors we become vulnerable to negative dispositional attributions.
We often think we know other people, and because of this assumption we don't ask, and because we don't ask, we don't learn that the “same” event may look very different to someone else.
Often we don't know how other people feel unless we ask; we don't ask because we think we know. We think we know because we know how we would feel in the same situation. That is, we overestimate how similar other people are to ourselves. Lee Ross and colleagues have called this the “false consensus effect” (Ross, Greene, & House, 1977). We presume that our behavior makes sense and that all well-adjusted people would do the same thing. If someone does something different, he or she must then be “that kind of a person.” For example, people in various experiments were asked to predict the opinions and attitudes of others about topics as varied as defense spending, soup, and what constituted appropriate behavior in various situations. Time and again, people overestimated the proportion of other people who feel or would behave as they do. Again, if I assume that all of us feel the same and I find out that you feel differently, it is your strange behavior that calls for explanation.
It may not be so much that we overestimate how similar others are to ourselves. The mistake we may make is that when we look at ourselves as observers, we see ourselves the same way we see others (Storms, 1973). However, when we take action, we may do so as mindful actors and not observers. Often we see the “same” behavior from different vantage points, but we label it quite differently—different with respect, primarily, to its evaluative tone. “We” may be interested in getting along with others, for example, but he may be seen as conforming.
One major problem with our tendency toward false consensus occurs when we turn it on ourselves. When we look back at our own behavior, now from the observer's vantage point, we may see ourselves as having behaved like “one of them.” Because, as Kierkegaard noted, we live our lives going forward but understand them looking back, it is important to consider what we do as observers of other people. When we look back, we, too, are the objects of our inquiries and may treat ourselves the way others might. Those who are less evaluative of others will be less evaluative of themselves. This is the hidden cost of making downward social comparisons. We may feel temporarily good at seeing ourselves as superior to someone else, but when we turn things around, we become “him,” the observed.
Consider a person's decision: X is an unpleasant feeling for me. If I do Y, the unpleasantness goes away. It would, then, seem sensible to do Y. Let us briefly consider drinking in this light. Going forward in time, we may feel depressed and empty. We learn that drinking eases the pain. If we do not acknowledge that the behavior initially made sense and only attend to the negative consequences of “excessive” drinking, after the emptiness passes, we do ourselves (p. 286) and others an injustice. The negative feelings that result from the awareness of these consequences probably lead to more drinking, and the cycle continues (Snyder, Higgins, & Stucky, 1983).
From the observers' perspective, for example, “too much” drinking clearly creates unwanted problems for the drinker. The drinker does not say to him- or herself, “I have had enough, but I think I'll drink more.” He drinks as much as is deemed necessary to accomplish whatever his goal may be. The behavior is not irrational. It is undertaken to achieve a state of mind, and it most often accomplishes this. On the other hand, when we become observers of our actions, we may become more aware of the negative consequences of those actions. Looking back as observers of ourselves, we may see that we have caused harm to our livers or hurt our loved ones. Typically, we did not drink to bring about these ends. “Going forward” in time, the behavior was not driven by weakness. For most of us, it is easier to learn something new when we are feeling strong. At those times we feel up to the challenges that face us. It would seem, then, that learning how to manage stress or to understand alternative ways of dealing with emptiness, for example, if those are what prompted the drinking, would be easier if we felt good about ourselves. The point here is that we should feel better about ourselves if we see that in its own context, that is, from a going-forward mode, our behavior made sense. With that understanding, less costly alternatives for achieving our goal may be sought. In her doctoral research, Sharon Popp (personal communication, November 15, 1999) found that construction workers drank “excessively.” Upon questioning, Popp found that when they were drinking, they opened up with each other and put their macho concerns aside. From these drinking interactions they discovered who they felt they could trust. Trust in their line of work is important. Should they drink or not? In more mundane circumstances, simply asking the question, “How may this behavior be sensible?” will quickly reveal reasonable understandings of our own behavior and that of other people. 
When we see behavior in a right and wrong frame, however, this is a question we do not think to ask.
Couples often come to feel that they see the same world, thus obviating the necessity for attention to actor-observer differences. Divorce statistics suggest otherwise.
Husband and wife are in two different rooms. Thinking it reasonable, she yells, "What are these?" She expects him to get up and go see what she is talking about, and usually she is not disappointed. But one friend once struck back at his wife: "When his wife returned home one day, and shouted to him in his study, 'Did they come?,' the husband, not knowing what she was talking about, nevertheless said, 'Yes!' The wife shouted to him again, 'Where did you put them?' He shouted back, 'With the others'" (Fairle, 1978, p. 43).
Usually, couples do not get to see that they are seeing the same thing differently. “If we have the same frame of reference, we will respond in the same way.”
The power of most great literature and movies is that we come to see the sense of the actor's behavior when the actions are in some way deplorable to us. The tension between the two may be the power of the work. Consider Lolita. If we could just have disgust for Humbert, there would be no problem. After all, grown men are not supposed to become sexually aroused and active with adolescent girls. Nabokov's skill reveals itself in drawing us inside this character so that we cannot so easily dismiss him. Behavior makes sense from the actor's perspective. Oedipus did not just kill his father. That would not have been interesting. We come to see how we could have made the same awful mistake. We tend to enjoy literature and film when we can identify with the characters. Simply being observers barely justifies the price we have paid for the popcorn. But great pieces, perhaps, let us identify with the protagonist and take us places we thought we would never go.
If we have the same experience, we will respond similarly. When we respond differently, we would be wise to conclude that the experience was different. This suggests that individual differences may be more differences in experience than differences in individuals. You and I have our hands on a hot radiator. I have to remove my hand more quickly than you. Are you braver or more able to endure pain? No. If you felt what I felt, you would remove your hand when I removed mine.
I see 10 horses running toward us. I am pleased they are coming to say hello. There are six of us, and everyone else runs for protection. They say I'm in denial. I compare myself to others and wonder what is wrong with me. I see 10 horses running toward us. I am pleased they are coming to say hello. There are six of us, and all but one are equally pleased the horses are approaching. One of us runs away for protection. That person is seen as cowardly. In both cases, the odd person makes an excuse for the difference in behavior. The rest do not get to learn that another perspective exists.
(p. 287) Interestingly, our culture provides us with norms that help us to misunderstand. If we or someone else commits an “error,” we become contrite or indignant. Our response depends on whether, for example, we tell ourselves that “patience is a virtue” or “the early bird catches the worm.” We may think we should have been satisfied with some outcome and not greedy if we think, “A bird in the hand is worth two in the bush” unless we think, “Nothing ventured nothing gained.” We should not have been cowardly, “an eye for an eye,” unless we think we should have “turned the other cheek.” Even our most mundane behavior is hard to pin down: “Clothes make the man” versus “You can't judge a book by its cover.” We can always make sense out of our behavior, or we can take ourselves to task, and the culture provides some of our evidence for whichever we choose. The problem is that much of the time most of us do not realize that there is a choice to be made.
Often, unaware of our motives, we tend to feel even more culpable or blaming when we call to mind any of these or similar refrains that suggest we should have known better. Just as each individual behavior has an individual perspective on it that lends reason to the action taken, so, too, does the opposite behavior.
Several seemingly mundane behaviors, both those taken as "bad" and some taken to be "good," look different when examined through this nonevaluative lens. Consider regret, making excuses, blame, and forgiveness in this new light.
Regret happens under two conditions: when we are unhappy, and when we obscure the difference between our perspective at time one, when we took some action, and time two, when we evaluate the action we took. Regret is a prediction of our emotions: If we had chosen differently then, we would feel better now. If we feel fine now, the need for the prediction would not arise. When it does arise, it depends on the lack of awareness of the reasonableness of the action given the circumstances we faced at the time.
Much of the regret people experience concerns actions not taken. Perhaps the best way to feel bad is to see oneself as not having done anything when something could have been done. This is the most difficult case to deal with because any action taken can be used as at least some justification for not having taken some alternative action. “I couldn't get the phone because I was in the other room going through the mail, so I missed out on finding out about the trip in time to go” versus “I wasn't doing anything, and I missed the call and didn't find out in time to go.” Are we ever really doing nothing?
To test whether future regrets could be prevented and "cured," we (Langer, Marcatonis, & Golub, 2000) conducted the following investigations. First, research participants showed up for a study on gambling in which they could win more than $100 and risk none of their own money. After the person showed up, he or she was asked to wait until it was his or her turn, which would be indicated by a light above the waiting room door. Upon seeing the flashing light, the participant was to go to another floor in the building for the experiment. Participants were then randomly assigned to one of four conditions. We arranged it so that everyone missed an opportunity. Only those participants who were aware of spending their time well were expected not to feel regret.
Group 1: Participants in this group were simply told, “We do not need you to do anything at this time, so just wait until it is your turn.”
Group 2: This was the same as Group 1, except that a Civil War documentary was playing on a VCR in the room.
Group 3: These subjects were invited to watch taped episodes of Seinfeld while they waited.
Group 4: These participants were asked to wait until it was their turn, but it was suggested that they spend the time thinking and feeling.
It was expected that Groups 1 and 2 would suffer the most regret; after all, to their minds, we reasoned, they were doing nothing and missed out on an opportunity. Although the mindful group, Group 4, was expected to feel the least regret, Group 3, based on popular ratings of the TV show, was expected to enjoy themselves and thus not regret the opportunity missed as much as the first two groups.
After waiting 20 min, the experimenter returned and informed the participants that they had missed their turn, and that two people who showed up won $200 and everyone else won at least $50. The experimenter checked the light above the door to show participants that it did indeed work.
Individuals in the Mindful condition (who were told to “be aware of what you are thinking and feeling” while they waited) (a) had a more positive experience as subjects; (b) found the experience of being a subject more beneficial; and (c) expressed less regret about the way they spent their time (p. 288) compared with individuals in the Civil War (CW) condition (who were allowed to watch a Civil War documentary) or those in the “Nothing” condition. Individuals in the Mindful condition were also more willing to participate in future investigations than were those in the CW or Nothing conditions.
It may be unreasonable, given the world we currently live in, to be happy or mindful all the time. If we were, as this research suggests, we might never have to experience regret. This work suggests that if the regretted action cannot be undone, any engaging activity may remove the negative feelings. Rethinking why the regretted action or inaction occurred, however, has the clear advantage of preventing its return. But the best alternative, it would seem, is to start out with the assumption that our behavior made sense to us at the time given the circumstances as we saw them, or else we would have behaved differently. This research tells us that when we are aware of why we are doing what we are doing, there is little room for self-recrimination.
Counterfactual thinking is the generation of alternatives that run contrary to the facts of what happened (Roese, 1997). Typically, the individual thinks that had he done otherwise, the outcome would have been better. Whereas upward comparisons, as in the social comparison literature, may initially result in feeling bad because of a realization of some positive alternative that might have been, over the long haul they may provide useful information and motivation for engaging in different behavior in the future: “If only I had taken her advice, this whole mess would have been avoided.” Downward comparisons, in which we breathe a sigh of relief that we did not behave otherwise and thus avoided some negative outcome, result in positive feelings by contrast to what might have been: “Thank goodness I remembered to call, or else I, too, would have been fired.”
It is true that if one imagines what else one might have done, there may be some experience of relief and the consideration of new information that may be of later use. Nevertheless, I would argue that there is a hidden downside to this kind of thinking. Surely if we proceed mindlessly, experience negative outcomes, and think about how we could have behaved differently, it is far better than believing we had no choice. But it is too easy for people to jump from "could have been" to "should have been," and then there arises the problem of how we could have been so stupid or incompetent not to have done it that way in the first place.
Counterfactual thinking occurs after the behavior has been engaged in. Mindful thinking occurs before the activity has been engaged in. This way people know why they did what they decided to do and why they did not do otherwise. The consideration of alternatives may still lead to information that may be useful in the future but without the self-recriminations.
Moreover, every time we say to ourselves that we "should have," we implicitly reinforce the illusion of certainty and the single-minded evaluation of consequences. For example, "If only I had gone on Thursday instead, all would have been right with the world." Many mishaps could have occurred on Thursday that probably went into the reasoning to go on Wednesday. We no more know all the consequences that could have arisen now than we could have known then. The difference is that if we thought about it then, we would be aware that we could not have foreseen everything. The consequences we did consider may have looked different to us at the time before the decision was made. Now, after the fact, we experience regret by freezing the evaluation of these consequences. Before: "I told him about your appointment because he was getting angry at you, and I thought that would upset you." After: "I'm sorry I violated your privacy by telling him about your appointment." If we are mindful of alternative courses of action and alternative views of the potential consequences, then we are more likely to see the uncertainty inherent in the situation. If we proceed mindlessly, we often take the current view as the only reasonable view. Our self-respect suffers because we then feel we should have known.
Counterfactual thinking is more likely to occur after the experience of negative outcomes than positive outcomes (e.g., Klauer & Migulla, 1996; Sanna & Turley, 1996). Anger, depression, boredom, or essentially any unhappiness can trigger thoughts of how we might have done things differently in the past. If we proceed more mindfully, our perspective is forward-looking, not backward-looking.
Counterfactual thinking or regret is also more likely to occur the closer the person is to the sought-after goal. This tendency is so great that we even do it in situations that we take to be largely chance-determined. For instance, if we choose a lottery number that misses by one number, we experience more regret than if we miss by many numbers. The reason for this is that the closer we are to the goal, the easier it is to see how we might have behaved differently. Missing a train by 5 min seems worse to most of us than missing it by half an hour. The more mutable the situation, the more regret we feel. But all the while, we ignore that our actions made some sense to us and that is why we so engaged in them, whether the distance to some other goal was great or small.
(p. 289) The more normative the “appropriate” behavior is taken to be, the more regret we experience for our behavior (Kahneman & Miller, 1986). But, again, this analysis is after the fact. It is important to realize that many norms can be brought to bear on any situation before the fact. If after the fact we learn that “everyone” went to the party, we will feel more regret than if going to the party was not as typical. Before the fact we may consider many different parties and conclude that there is no norm for attendance; people are as likely to go as to not go. Before the fact, there are many sources of reasonable information to consider in making our decisions, with many possible consequences. After the fact, there are fewer paths that make sense because consequences are now apparent. Regret denies the utility of our past experience to our present situation.
Ask 10 people if making excuses is good or bad. Next, ask them why. Virtually all will answer that making excuses is bad, and most will offer a view that amounts to saying that the excuse maker is not owning up to what she did. What does it mean to take responsibility for our actions? For which action should we be responsible?
Consider, again, that behavior always makes sense from the actor's perspective. If it did not, the actor would have done differently. People do not get up in the morning and say, “Today I'm going to be clumsy, inconsiderate, and hurtful.” What were they intending when we experienced them that way? If we are not mindful of our intentions going forward, we become vulnerable to other people's characterization of our behavior looking back. The same behavior looks different from different perspectives. A negative view of our behavior necessitates an excuse.
What is the difference between an excuse and a reason? If I give an explanation that makes sense to us both, the explanation is taken as a reason for my action. When it is not accepted, a reason becomes an excuse. When it is not accepted, the actor's perspective is denied. The attribution of excuse making allows the person to whom the excuse is made to feel superior, at least for the moment. The cost is loss of genuine interaction and understanding.
If our behavior made sense going forward in time and we were mindless to our intentions, we may offer a reason that is unacceptable to the blamer, and it will be taken as an excuse. If we respect ourselves enough to know that what we did must have made sense to us even if we cannot remember why we did it, we will reject the accusation. The alternative is that the behavior was the person's fault, engaged because he is bad. If he is not bad, then why did he do it? When people live in a world of absolute right and wrong without regard to perspective, any explanation different from their own is taken to be an excuse.
Our culture has confused reasons and excuses, with the result that the blamer has a ready reason not to listen. What does this mean when you think I am making excuses? Does it mean that I had no reason for what I did? I find myself saying, in such situations, “I know I care about you. Are you trying to persuade me that I don't?” “I know I'm a nice person. Are you trying to make me think I am not?”
The word “excuse” conveys an accusation on the part of the person to whom an explanation is given. It implies distrust regarding the speaker's motives and intentions. Our culture has become so tolerant of excuses that the difference between a reason and an excuse is not likely to be easily noticed. By obscuring the difference between the two, we unwittingly act as though our actions have no reasons, or that the only acceptable reason is one in which someone must look bad. As a result, self-respect suffers as others are given the final word over our intentions. If I paid attention to my actions before engaging in them, I would know why I did what I did after the action was completed. The cost of my mindlessness is that now I am more likely to accept your understanding of my behavior. What might have been a reason becomes an excuse.
If behavior makes sense from the actor's perspective, then we as actors become less vulnerable to other people's attributions of excuse making. Moreover, the very idea of an “excuse” reinforces the view that consequences are inherently negative.
Blame and Forgiveness
“To err is human, to forgive divine.” Or is it? Again, ask 10 people whether forgiveness is good or bad. All will probably tell you that it is good. Forgiveness is something to which we should aspire. The more wronged we have been, the more divine it is to be able to forgive. Now ask 10 people if blame is good or bad. All will probably tell you that blame is bad. And yet to forgive, we have to blame. If we do not blame in the first place, there is nothing to forgive.
But there is a step before blame and forgiveness that needs our consideration. Before we blame, we have to experience the outcome as negative. If your behavior resulted in something positive for me, blame would hardly make sense. Those who see (p. 290) more negativity in the world are then those more likely to blame.
The same behavior can make sense in many different ways. If we do not appreciate that we may look at a situation and see different things or see the same things differently, then we will remain stuck in an evaluative mind-set. If we remain in this mind-set, then we will experience negative outcomes that could have been experienced as positive. If we experience negative outcomes, then we will be tempted to find someone to blame. If we blame, at least we can try to forgive. To be forgiving is "better" than to be unforgiving. Understanding that the action made sense to the actor obviates the necessity for forgiveness.
Discrimination Is Not Evaluation
We can be discriminating without being evaluative. Noticing new things is the essence of mindfulness. Unquestioningly accepting a single-minded evaluation of what is noticed is mindless. Our culture is replete with examples of mindless evaluation. Sadly, it is hard to conjure up examples of a more mindful stance. We take an evaluative component as an essential part of our beliefs. Without knowing if something is good or bad, after all, how would we know whether to approach or avoid ideas, people, places, and things? Yet by accepting evaluation, rather than mindful discrimination, as essential, we set ourselves up for the experience of feeling inadequate. By mindlessly attaching this evaluative component to our beliefs, we become victims of our mind-sets. We experience this reactivity only when things go wrong. These are the times we try to change; yet these are the times we are least equipped to do so.
With the awareness that we are responsible for our evaluations, we are more likely to use them in a conditional way. As such, we can stay responsive to our circumstances rather than become reactive to them as absolute evaluations lead us to be.
“When no news is bad news.” If we give up evaluation, we give up the compliment but we are no longer vulnerable to the insult.
If someone compliments us, what is our reaction? If we are very pleased, it would suggest a certain amount of uncertainty about our level of skill. It also suggests a degree of vulnerability we would experience if we did not succeed. Imagine that somebody whose opinion you respected told you that you were great at spelling three-letter words. Chances are if you are over 10 years old, you would not be moved by this compliment. You know you can do it, so the feedback is essentially unimportant to you. Imagine the same respected person told you that the way that you pronounce vowels is extraordinary. Again, you would be unlikely to be very moved. This time you are not taken in by the compliment because the issue probably does not matter much to you. In both cases, when you were not testing yourself, the compliment was unimportant. Here, then, is a way to protect ourselves from negative remarks: “If we don't take the compliment, we are not vulnerable to the insult.”
The behaviorist literature tells us that there is positive and negative reinforcement. And there is positive and negative punishment. Positive reinforcement is the presentation of a positive stimulus—for example, a compliment. Negative reinforcement is the cessation of an aversive stimulus—for example, if someone is always insulting you, and now you do something and no insult follows, that behavior will be negatively reinforced. Reinforcement increases the response leading to it; whether it is positive or negative reinforcement, reinforcement feels good. Conversely, punishment is meant to stop the behavior leading to it. Positive punishment is the presentation of an aversive stimulus—for example, an insult. The interesting but less well known case is negative punishment: “the cessation of a positive stimulus”—time out from compliments. Because compliments feel so good, we are not inclined to look beyond them. Because compliments may help control us, there is little motivation for others to see their costs to us. Compliments, like insults, generally concern what we do and not who we are. As such, they help keep us in an evaluative frame of mind. Evaluating the self takes one out of the experience; self becomes object rather than actor. Ironically, with less experience, there is less of a self to evaluate.
If we give up evaluation, we give up downward social comparing.
If you never assume importance
You never lose it.
(Lao Tse, The Way of Life)
The most frequent evaluation people make is to compare themselves with other people. When we want information about how to do a task better, we compare ourselves with others who are slightly better at the task. When we want to bolster our self-images, we compare ourselves with those who are less able than we are, that is, we make downward social comparisons.
A good deal of the work in psychology that deals with the “self” takes as a given that evaluation will occur and then proceeds to examine what information will be used and how it is used in making that (p. 291) evaluation. When evaluation does take place, it is carried out in the manner suggested by these researchers. The literature does not question whether or not there can be another way of being that is nonevaluative.
Leon Festinger (1954) went so far as to say that people have a drive to evaluate their opinions and abilities. When objective means for this comparison are not available, people make comparisons with others. People choose similar others for these comparisons. To feel good about themselves, people often make downward social comparisons. Regarding abilities, we make upward comparisons. I am more likely to compare my tennis skills with those of someone better than with those of someone I know cannot play as well. As Festinger was quick to note, of course in these upward comparisons I am not likely to compare myself with those far better than I am. I try, according to Festinger, to close the gap and become as similar to others as I can. There is also a tendency to reduce discrepancies regarding opinions. Both of these tendencies, regarding ability and opinion, implicitly reinforce the idea that there is a single view (a right and wrong), and that it is in our best interest to be like everyone else.
Is there any evidence that we can be completely nonevaluative? I do not know of any. Nevertheless, our own research suggests that this is the direction in which we might want to move. Johnny Welch, Judith White, and I (Langer, White, & Welch, 2000) conducted an investigation to look at the effect of being evaluative on negative emotions such as guilt, regret, stress, and the tendency to blame, keep secrets, and lie. First, we gave a questionnaire to people that simply asked them how often they compared themselves with other people, regardless of whether they saw themselves as better or worse. We then asked them to indicate how often they experienced the feelings or behaviors just noted (guilt, etc.). We divided the participants into two groups—those who answered that they frequently compared themselves with others and those who made these comparisons less often—and then looked at how often the two groups experienced the listed emotions and behaviors. The results were clear. Those who were less evaluative experienced less guilt, regret, and so on. Moreover, in response to the question, "In general, how well do you like yourself?" the less evaluative group was found to like themselves more.
The next step in our research was an experiment in which we (Yariv & Langer, 2000) either encouraged or discouraged people to make evaluations and found that, as in the questionnaire study, those who were more evaluative also suffered more.
We may stay evaluative because positive evaluation helps us feel good in the short run. As soon as we agree to accept a positive evaluation as reason to feel good about ourselves, however, we open the door for the damaging consequences of perceived failure. Surely, depression, suicide, and just feeling bad all result in whole or part from an evaluative stance.
James Joyce's famous book Dubliners was rejected by 22 publishers. Gertrude Stein submitted poems to editors for about 20 years before one was accepted. Fred Astaire and the Beatles were also initially turned down. The list goes on (Bandura, 1997).
We have much control over the valence of our experience. Research participants actively drew 0, 3, or 6 novel distinctions while being engaged in disliked activities (listening to rap or classical music; watching football). We found that the more distinctions drawn, the more the activity was liked (Langer, 1997). That evaluations are malleable may also be seen in the classic research on the “mere exposure effect”: The more often you see something, the more you like it. We had hypothesized that this effect would obtain primarily if, on repetition, participants drew novel distinctions—that is, if they were mindful. In this research, exposure was held constant, but participants drew several or few distinctions. The mere exposure effect seems to rely on mindful distinction drawing and not on exposure alone (Fox, Langer, & Kulessa, 2000). In either case, however, we have control over the valence of our experience.
Consider how we look for change when we observe our children and thus constantly draw novel distinctions. Our affection for our children only grows. We look for stability when we observe our spouses or partners. Sadly, too often our affection diminishes for our spouses or partners. Positive affect, thus, seems to depend on our willingness to mindfully engage another person.
The Myth of Inaction
Let us return to the question of analysis paralysis. Would giving up evaluation lead to inaction? After all, if you cannot believe your action will be successful or that you will want the final outcome you have worked for, why engage in it at all? The short answer is, “Why not?”
Consider what many experience as a midlife crisis. At some point in life, many people come to realize that nothing has any intrinsic meaning. There are three possible responses to this. Those who do not successfully emerge from this belief stay depressed and cynical at the meaninglessness of it all. Some ignore this belief and proceed as if they never had it, although all the while it lurks in the background. Finally, there are those who accept that everything can be equally meaningless or meaningful. This last group is the most likely to stay situated in a self-constructed meaningful present.
Similarly, a person can take action falsely believing that the action will result in a singularly desirable end state, and repeatedly suffer surprise and disappointment. Alternatively, the same person can come to see that the action may not lead to the outcome, and that the outcome may not be desirable anyway, and thus decide not to act at all. But a third option also presents itself. The person may be freed to take action because feared negative consequences are just as unpredictable as desired positive consequences, and even if they do occur, they too have another side. It is more satisfying to do something than to do nothing. Action is the way we get to experience ourselves. And so we act not to bring about an outcome but to bring about ourselves. In fact, when asked whether they would hesitate to act if assured of a positive outcome, people overwhelmingly say no. The fear of inaction thus hides within it the evaluative belief that making the "wrong" decision may be costly. Recent research we have conducted (Bodner et al., 2000; Langer & Lee, 2000) shows, in fact, that people who are taught to reframe positive as negative, and vice versa, make their decisions more, not less, quickly. Giving up our evaluative tendencies does not seem to lead to inaction.
This is not to say that "inaction" is necessarily bad. Typically, we see inaction as the absence of a particular action. That is, we do not make the phone call, buy the item, or attend the event. But we need not see ourselves as inactive; rather, we are actively pursuing another course. If we realized this, we might be less afraid of giving up the illusion of evaluative stability.
In our attempts to understand behavior, psychologists may have unwittingly contributed to the unhappiness we are now directly attempting to change. As researchers, we have more often than not taken the perspective of the observer, ourselves oblivious to how the same behavior may carry a different meaning when understood from the actor's perspective in addition to our own.
Mindfulness versus Positive Evaluation
Many years ago, we conducted an experimental investigation aimed at teaching people to be positive (Langer, Janis, & Wolfer, 1975). We examined the effects of holding a positive view of one's situation on preoperative stress. We found that patients became less stressed, took fewer pain relievers and sedatives, and were able to leave the hospital sooner than comparison groups. Surely a single-mindedly positive view is likely to be more beneficial to health and well-being than a mindlessly negative view. But there are problems endemic to teaching people to view things positively. First, in the way we normally use language, if some things can be positive, other things would then seem to be inherently negative. Second, because evaluation is taken to be an inherent part of outcomes, changing one evaluation from negative to positive may suggest that the person was originally mistaken, rather than that all outcomes may be viewed in a positive way. As such, positivity training may be less likely to generalize to new situations. Third, it would seem to follow that to be positive would be to accept positive statements by others (i.e., compliments), but to do so sets us up to accept negative evaluations as well. Fourth, if being positive means we should be grateful that we are not as badly off as others might be (i.e., make downward social comparisons), then such gratitude comes at a very high cost. Fifth, and most important, if we teach people to be positive, we may unintentionally teach them to keep evaluation tied to events, ideas, and people, and thus we promote mindlessness.
When mindful, we may find solutions to problems that made us feel incompetent. We may avert the danger not yet arisen. By becoming less judgmental, we are likely to come to value other people and ourselves. All told, it would seem that being mindful would lead us to be optimistic, obviating the necessity for learning how to be positive.
Although well-being is more likely to be related to positive than to negative evaluations, positive evaluation makes negative evaluations appear to be independent of us. "Positive" experiences like hoping and forgiving, regret over past actions, and delaying gratification for a future goal all implicitly suggest that there is still potential negativity that one may have to confront. Positive evaluations, then, may implicitly rob us of control. Similarly, downward social comparisons may work in the short run to alleviate negative affect, but they set the stage for upward comparisons in the future. By contrast, mindfulness keeps us engaged and situated in the present. The mindful individual comes to recognize that each outcome is potentially simultaneously positive and negative (as is each aspect of each outcome), and that choices can be made with respect to our affective experience. Thus, the mindful individual is likely to choose to be positive and will experience both the advantages of positivity and the advantages of perceived control for well-being.
Questions for Future Research
1. To what extent are individual differences in mindfulness shaped by culture?
2. What behaviors and psychological processes mediate the effects of mindfulness?
3. How can we design interventions to promote mindfulness and thereby improve well-being?
References
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
Bargh, J., & Chartrand, T. (1999). The unbearable automaticity of being. American Psychologist, 54, 462–479.
Bodner, T., Waterfield, R., & Langer, E. (2000). Mindfulness in finance. Manuscript in preparation, Harvard University, Cambridge, MA.
Chanowitz, B., & Langer, E. (1981). Premature cognitive commitment. Journal of Personality and Social Psychology, 41, 1051–1063.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper and Row.
Fairlie, H. (1978, October 7). My favorite sociologist. The New Republic, 43.
Feldman, D. B., & Snyder, C. R. (2000). Hope, goals, and meaning in life: Shedding new light on an old problem. Unpublished manuscript, University of Kansas, Lawrence.
Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7, 117–140.
Fox, B., Langer, E., & Kulessa, G. (2000). Mere exposure versus mindful exposure. Unpublished manuscript, Harvard University, Cambridge, MA.
Jones, E., & Nisbett, R. (1972). The actor and the observer: Divergent perceptions of the causes of behavior. In E. E. Jones, D. E. Kanouse, H. H. Kelley, R. E. Nisbett, S. Valins, & B. Weiner (Eds.), Attribution: Perceiving the causes of behavior (pp. 79–94). Morristown, NJ: General Learning Press.
Kahneman, D., & Miller, D. T. (1986). Norm theory: Comparing reality to its alternatives. Psychological Review, 93, 136–153.
Klauer, K. C., & Migulla, K. J. (1996). Spontaneous counterfactual processing. Zeitschrift für Sozialpsychologie, 26, 34–42.
Kuhn, T. (1981). A function for thought experiments. In I. Hacking (Ed.), Scientific revolutions (pp. 6–27). New York: Oxford University Press.
Lakatos, I. (1970). Falsification and the methodology of scientific research programmes. In I. Lakatos & A. Musgrave (Eds.), Criticism and the growth of knowledge (pp. 91–196). New York: Cambridge University Press.
Langer, E. (1989). Mindfulness. Reading, MA: Addison-Wesley.
Langer, E. (1992). Interpersonal mindlessness and language. Communication Monographs, 59, 324–327.
Langer, E. (1997). The power of mindful learning. Reading, MA: Addison-Wesley.
Langer, E., & Abelson, R. (1974). A patient by any other name …: Clinician group differences in labeling bias. Journal of Consulting and Clinical Psychology, 42, 4–9.
Langer, E., & Dweck, C. (1973). Personal politics. Englewood Cliffs, NJ: Prentice-Hall.
Langer, E., Hatem, M., Joss, J., & Howell, M. (1989). Conditional teaching and mindful learning: The role of uncertainty in education. Creativity Research Journal, 2, 139–150.
Langer, E., & Imber, L. (1979). When practice makes imperfect. Journal of Personality and Social Psychology, 37, 2014–2025.
Langer, E., Janis, I., & Wolfer, J. (1975). Reduction of psychological stress in surgical patients. Journal of Experimental Social Psychology, 11, 155–165.
Langer, E., & Lee, Y. (2000). The myth of analysis paralysis. Unpublished data, Harvard University, Cambridge, MA.
Langer, E., Marcatonis, E., & Golub, S. (2000). No regrets: The ameliorative effect of mindfulness. Unpublished manuscript, Harvard University, Cambridge, MA.
Langer, E., & Piper, A. (1987). The prevention of mindlessness. Journal of Personality and Social Psychology, 53, 280–287.
Langer, E., White, J., & Welch, J. (2000). Negative effects of social comparison. Unpublished manuscript, Harvard University, Cambridge, MA.
Lao Tzu. (1962). The way of life (W. Bynner, Trans.). New York: Capricorn Books.
Miller, D. (1994). Critical rationalism. Chicago: Open Court.
Popper, K. R. (1959). The logic of scientific discovery. London: Hutchinson.
Popper, K. R. (1973). Objective knowledge. London: Routledge.
Roese, N. J. (1997). Counterfactual thinking. Psychological Bulletin, 121, 133–148.
Ross, L., Greene, D., & House, P. (1977). The "false consensus effect": An egocentric bias in social perception and attribution processes. Journal of Experimental Social Psychology, 13, 279–301.
Sanna, L. J., & Turley, K. J. (1996). Antecedents to spontaneous counterfactual thinking: Effects of expectancy violation and outcome valence. Personality and Social Psychology Bulletin, 22, 906–919.
Snow, S., & Langer, E. (1997). Unpublished data. Reported in E. Langer, The power of mindful learning. Reading, MA: Addison-Wesley.
Snyder, C. R. (1996). To hope, to lose, and hope again. Journal of Personal and Interpersonal Loss, 1, 3–16.
Snyder, C. R., Higgins, R. L., & Stucky, R. (1983). Excuses: Masquerades in search of grace. New York: Wiley-Interscience.
Storms, M. (1973). Videotape and the attribution process: Reversing actors' and observers' points of view. Journal of Personality and Social Psychology, 27, 165–175.
Tennen, H., & Affleck, G. (1999). Finding benefits in adversity. In C. R. Snyder (Ed.), Coping: The psychology of what works (pp. 278–304). New York: Oxford University Press.
Yariv, L., & Langer, E. (2000). Negative effects of social comparison. Unpublished manuscript, Harvard University, Cambridge, MA.