Open-mindedness and Non-attachment to Views
*… Seeing that harmful actions arise from anger, fear, greed, and intolerance, which in turn come from dualistic and discriminative thinking, I will cultivate openness, non-discrimination, and non-attachment to views in order to transform violence, fanaticism, and dogmatism in myself and in the world.*

Thich Nhat Hanh, First Mindfulness Training “Reverence for Life”1

The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate.

Bacon, 1620, Novum Organum, Aphorism 46
“When the facts change, I change my mind. What do you do, sir?”

John Maynard Keynes, Economist (apocryphal)
Consider that one of the greatest obstacles to our well-being is our attachment to views — the conviction that we are right and others are wrong. As a result, we find it hard to change our views and to hear others. This can result in dogmatism, fanaticism and even violence, both physical and emotional. At a personal level, our attachment to views may be one of the greatest obstacles to our own well-being and enlightenment, because of the difficulty we face relinquishing deep-rooted beliefs in an inherently existing ‘self’ or ‘I’.2
In this primer we look at three areas related to this issue. First, we look at what we mean by attachment and non-attachment to views and their relationship to engagement and detachment. We clarify that non-attachment is not **de-**attachment but rather un-attached engagement and open-mindedness.
Second, we present the scientific evidence that we are attached to views: that we resist changing our views despite strong, contrary evidence. We discuss various reasons why that might be and the impact of this materially and spiritually.
Third, we explore the deeper connections with ontology and our ideas of self in philosophy and Buddhism.
What are Attachment and Non-Attachment?
Ordinarily we think of attachment as something positive or even neutral: I’m attached to this old watch because my father gave it to me, or the boat is attached to the shore by a rope. Conversely, to be unattached sounds a bit negative. For example, if you say “I’m unattached” it means you are without a romantic relationship — whereas to be attached is to have one (observe that common slang for getting married is to “get hitched”, which roughly approximates to “get attached”).
And this sense is still there when it comes to views. Not to be attached to a view is to be de-tached. Whilst there can be a positive sense of being dispassionate and independent, as in “the judge considered the arguments with a sense of detachment”, there is also the sense of being uninvolved and uncaring: “the man watched the dogs attack the fox with an air of detachment”.
Thus our use of attachment may be surprising. In ordinary English, attachment is often used in a positive context: we are attached to places, people and things that we like and care about. Conversely, the opposite of attachment — detachment — has a mildly negative sense of emotionless unconcern, anomie, lifelessness — “he kissed her with an air of detachment”, “he lived detached, absent, as if something were permanently missing”.
Our use of attachment and non-attachment is somewhat special and rather specific. It derives from the Buddhist tradition, where “attachment” translates a key concept describing the way that we “cling” to things: experiences, objects, ideas, even consciousness. It can be found as a key phrase in translations of the Four Noble Truths, the core teachings of the Buddha3:
- Life involves suffering
- Suffering arises from attachment [also translated as “craving”, “clinging to” …]
- Suffering ceases when attachment ceases
- Freedom from suffering is possible by practicing the Eightfold Path
The special usage also explains why we use *non-*attachment rather than de-tachment as the contrast to attachment. Non-attachment, which is our focus here, is not detachment. It is not simply an absence, a lack of attachment. Rather it is something positive, a positive choice that makes true engagement and commitment possible.
Consider an analogy with listening. When we listen to another person we can listen in several ways. One way is to listen passively: we are not talking, but nor are we really engaging with what the person is saying. What they say comes in our ears but we do not really hear it or listen to it in a true sense. This is “detached” listening. On the other hand, there are times when we truly listen, and listen deeply. This is an active, not a passive, act: we actively engage ourselves with what they are saying, opening our mind to it, positively welcoming it.
Common-or-garden examples of attachment
Research on Attachment to Views
Misinformation, False Beliefs and Cognition
If we look around us: at newspapers, at our friends, even in our own lives, it becomes clear that misinformation is ubiquitous, and that false and erroneous beliefs are widespread and persistent.
For example, over half of all voters in Republican primaries in 2011 were “birthers” who believed that President Obama was not born in the United States, despite overwhelming evidence to the contrary. Or consider that a sizeable minority in Britain and the United States now believe, erroneously, in a link between vaccination and autism, thanks to a single 1998 study which was widely reported at the time but subsequently discredited. This has had a large impact on public health, as parents refuse to vaccinate their children, resulting in a significant number of preventable deaths and illnesses.4
There is now a sizeable body of scientific research on why false beliefs persist.
What are we interested in?
- Conclusive evidence that a) people have false beliefs, b) these beliefs persist in the face of contradictory evidence, c) why that is — what cognitive and ontological mechanisms are at work, and d) what practical recommendations come out of this for creating open-mindedness and non-attachment to views
- Questions: framing in terms of “false” belief and misinformation
- People don’t change views -> they discard, ignore or dismiss contradictory evidence -> they do this because it conflicts with / threatens the security of their identity (ego) -> this leads us into ontology -> Buddhist attachment to views -> becoming more open-minded requires addressing our attachment -> Buddhist practice (e.g. meditation), as this generates awareness and equanimity, which can then lead to a falling away of our attachment
- Guarding the senses
A real-world impact of “attachment” (Lewandowsky et al. 2012, pp. 106-107):
On August 4, 1961, a young woman gave birth to a healthy baby boy in a hospital at 1611 Bingham St., Honolulu. That child, Barack Obama, later became the 44th president of the United States. Notwithstanding the incontrovertible evidence for the simple fact of his American birth—from a Hawaiian birth certificate to birth announcements in local papers to the fact that his pregnant mother went into the Honolulu hospital and left it cradling a baby—a group known as “birthers” claimed Obama had been born outside the United States and was therefore not eligible to assume the presidency. Even though the claims were met with skepticism by the media, polls at the time showed that they were widely believed by a sizable proportion of the public (Travis, 2010), including a majority of voters in Republican primary elections in 2011 (Barr, 2011).

In the United Kingdom, a 1998 study suggesting a link between a common childhood vaccine and autism generated considerable fear in the general public concerning the safety of the vaccine. The UK Department of Health and several other health organizations immediately pointed to the lack of evidence for such claims and urged parents not to reject the vaccine. The media subsequently widely reported that none of the original claims had been substantiated. Nonetheless, in 2002, between 20% and 25% of the public continued to believe in the vaccine-autism link, and a further 39% to 53% continued to believe there was equal evidence on both sides of the debate (Hargreaves, Lewis, & Speers, 2003). More worryingly still, a substantial number of health professionals continued to believe the unsubstantiated claims (Petrovic, Roberts, & Ramsay, 2001). Ultimately, it emerged that the first author of the study had failed to disclose a significant conflict of interest; thereafter, most of the coauthors distanced themselves from the study, the journal officially retracted the article, and the first author was eventually found guilty of misconduct and lost his license to practice medicine (Colgrove & Bayer, 2005; Larson, Cooper, Eskola, Katz, & Ratzan, 2011)5.
It is not just the facts, it is the narrative we tell about the facts (from “I Don’t Want to Be Right”):
Even when we think we’ve properly corrected a false belief, the original exposure often continues to influence our memory and thoughts. In a series of studies, Lewandowsky and his colleagues at the University of Western Australia asked university students to read the report of a liquor robbery that had ostensibly taken place in Australia’s Northern Territory. Everyone read the same report, but in some cases racial information about the perpetrators was included and in others it wasn’t. In one scenario, the students were led to believe that the suspects were Caucasian, and in another that they were Aboriginal. At the end of the report, the racial information either was or wasn’t retracted. Participants were then asked to take part in an unrelated computer task for half an hour. After that, they were asked a number of factual questions (“What sort of car was found abandoned?”) and inference questions (“Who do you think the attackers were?”). After the students answered all of the questions, they were given a scale to assess their racial attitudes toward Aboriginals.

Everyone’s memory worked correctly: the students could all recall the details of the crime and could report precisely what information was or wasn’t retracted. But the students who scored highest on racial prejudice continued to rely on the racial misinformation that identified the perpetrators as Aboriginals, even though they knew it had been corrected. They answered the factual questions accurately, stating that the information about race was false, and yet they still relied on race in their inference responses, saying that the attackers were likely Aboriginal or that the store owner likely had trouble understanding them because they were Aboriginal. This was, in other words, a laboratory case of the very dynamic that Nyhan identified: strongly held beliefs continued to influence judgment, despite correction attempts—even with a supposedly conscious awareness of what was happening.

*In a follow-up, Lewandowsky presented a scenario that was similar to the original experiment, except now, the Aboriginal was a hero who disarmed the would-be robber. This time, it was students who had scored lowest in racial prejudice who persisted in their reliance on false information, in spite of any attempt at correction. In their subsequent recollections, they mentioned race more frequently, and incorrectly, even though they knew that piece of information had been retracted. …*
Relation to self-identity
False beliefs, it turns out, have little to do with one’s stated
political affiliations and far more to do with self-identity: What
kind of person am I, and what kind of person do I want to be? All
ideologies are similarly affected. [emphasis added]
Denial vs healthy scepticism
A Loving Father Rejects His Son
From The Art of Power by Thich Nhat Hanh, pp.87-89
The Buddha told the story of a merchant, a widower, who went away on a business trip and left his little boy at home. While he was away, bandits came and burned down the whole village. When the merchant returned, he didn’t find his house, it was just a heap of ash. There was the charred body of a child close by. He threw himself on the ground and cried and cried. He beat his chest and pulled his hair. The next day, he had the little body cremated. Because his beloved son was his only reason for existence, he sewed a beautiful velvet bag and put the ashes inside. Wherever he went, he took that bag of ashes with him. Eating, sleeping, working, he always carried it with him.
In fact, his son had been kidnapped by the bandits. Three months later, the boy escaped and returned home. When he arrived, it was two o’clock in the morning. He knocked on the door of the new house his father had built. The poor father was lying on his bed crying, holding the bag of ashes, and he asked, ‘Who is there?’ ‘It’s me, Daddy, your son.’ The father answered, ‘That’s not possible. My son is dead. I’ve cremated his body and I carry his ashes with me. You must be some naughty boy who’s trying to fool me. Go away, don’t disturb me!’ He refused to open the door, and there was no way for the little boy to come in. The boy had to go away, and the father lost his son forever.
After telling the story, the Buddha said, ‘If at some point in your life you adopt an idea or a perception as the absolute truth, you close the door of your mind. This is the end of seeking the truth. And not only do you no longer seek the truth, but even if the truth comes in person and knocks on your door, you refuse to open it. Attachment to views, attachment to ideas, attachment to perceptions are the biggest obstacle to the truth.’
Links
- I Don’t Want to Be Right - New Yorker
- Memory for Fact, Fiction, and Misinformation: The Iraq War 2003. Psychological Science, March 2005, 16: 190-195
- Nyhan, Brendan, Jason Reifler, and Peter A. Ubel. 2013. ‘The Hazards of Correcting Myths About Health Care Reform’. Medical Care 51 (2): 127–32. doi:10.1097/MLR.0b013e318279486b. http://journals.lww.com/lww-medicalcare/Abstract/2013/02000/The_Hazards_of_Correcting_Myths_About_Health_Care.2.aspx
  - Context: Misperceptions are a major problem in debates about health care reform and other controversial health issues. Methods: We conducted an experiment to determine if more aggressive media fact-checking could correct the false belief that the Affordable Care Act would create “death panels.” Participants from an opt-in Internet panel were randomly assigned to either a control group in which they read an article on Sarah Palin’s claims about “death panels” or an intervention group in which the article also contained corrective information refuting Palin. Findings: The correction reduced belief in death panels and strong opposition to the reform bill among those who view Palin unfavorably and those who view her favorably but have low political knowledge. However, it backfired among politically knowledgeable Palin supporters, who were more likely to believe in death panels and to strongly oppose reform if they received the correction. Conclusions: These results underscore the difficulty of reducing misperceptions about health care reform among individuals with the motivation and sophistication to reject corrective information.
- Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, December 2012, 13: 106-131. http://psi.sagepub.com/content/13/3/106.full
- Lord C, Lepper MR, Ross L. Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology 1979; 37: 2098-2110. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.372.1743&rep=rep1&type=pdf
  - People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and as a result to draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization. To test these assumptions and predictions, subjects supporting and opposing capital punishment were exposed to two purported studies, one seemingly confirming and one seemingly disconfirming their existing beliefs about the deterrent efficacy of the death penalty. As predicted, both proponents and opponents of capital punishment rated those results and procedures that confirmed their own beliefs to be the more convincing and probative ones, and they reported corresponding shifts in their beliefs as the various results and procedures were presented. The net effect of such evaluations and opinion shifts was the postulated increase in attitude polarization.
Hart, P. Sol, and Erik C. Nisbet. 2012. ‘Boomerang Effects in
Science Communication How Motivated Reasoning and Identity Cues
Amplify Opinion Polarization About Climate Mitigation Policies’.
Communication Research 39 (6): 701–23.
doi:10.1177/0093650211416646.-
  - The deficit-model of science communication assumes increased communication about science issues will move public opinion toward the scientific consensus. However, in the case of climate change, public polarization about the issue has increased in recent years, not diminished. In this study, we draw from theories of motivated reasoning, social identity, and persuasion to examine how science-based messages may increase public polarization on controversial science issues such as climate change. Exposing 240 adults to simulated news stories about possible climate change health impacts on different groups, we found the influence of identification with potential victims was contingent on participants’ political partisanship. This partisanship increased the degree of political polarization on support for climate mitigation policies and resulted in a boomerang effect among Republican participants. Implications for understanding the role of motivated reasoning within the context of science communication are discussed.
- Fishing a Superfund Site: Dissonance and Risk Perception of Environmental Hazards by Fishermen in Puerto Rico. http://onlinelibrary.wiley.com/doi/10.1111/j.1539-6924.1991.tb00603.x/full
  - Risk perception studies show that individuals tend to underestimate significant risks, overestimate negligible ones, and distrust authorities. They also rely on a variety of strategies or heuristics to reach decisions regarding their risk-taking behavior. We report on a survey of fishermen and crabbers engaged in recreational and subsistence fishing in a Puerto Rican estuary (near Humacao), which has been declared a “Superfund site” because of suspected contamination by mercury, and at ecologically similar control sites. Nearly everyone interviewed at the Humacao site was aware of the mercury contamination, but either denied its importance, believed the contamination was restricted to a distant part of the estuary, or assumed that the estuary would be closed by the authorities if the threat was real. All site-users consumed the fish and crabs they caught.
- Garrett, R. Kelly, and Brian E. Weeks. 2013. ‘The Promise and Peril of Real-Time Corrections to Political Misperceptions’. In Proceedings of the 2013 Conference on Computer Supported Cooperative Work, 1047–1058. CSCW ’13. New York, NY, USA: ACM. doi:10.1145/2441776.2441895.
  - Computer scientists have responded to the high prevalence of inaccurate political information online by creating systems that identify and flag false claims. Warning users of inaccurate information as it is displayed has obvious appeal, but it also poses risk. Compared to post-exposure corrections, real-time corrections may cause users to be more resistant to factual information. This paper presents an experiment comparing the effects of real-time corrections to corrections that are presented after a short distractor task. Although real-time corrections are modestly more effective than delayed corrections overall, closer inspection reveals that this is only true among individuals predisposed to reject the false claim. In contrast, individuals whose attitudes are supported by the inaccurate information distrust the source more when corrections are presented in real time, yielding beliefs comparable to those never exposed to a correction. We find no evidence of real-time corrections encouraging counterargument. Strategies for reducing these biases are discussed.
- Lewandowsky, Stephan, Michael E. Mann, Nicholas J. L. Brown, and Harris Friedman. 2016. ‘Science and the Public: Debate, Denial, and Skepticism’. Journal of Social and Political Psychology 4 (2): 537–53. doi:10.5964/jspp.v4i2.604.
General Cognitive Mechanisms
- Belief in the Law of Small Numbers. Amos Tversky and Daniel Kahneman, Hebrew University of Jerusalem. Psychological Bulletin, 1971, Vol. 76, No. 2, 105-110.
  - Abstract: People have erroneous intuitions about the laws of chance. In particular, they regard a sample randomly drawn from a population as highly representative, that is, similar to the population in all essential characteristics. The prevalence of the belief and its unfortunate consequences for psychological research are illustrated by the responses of professional psychologists to a questionnaire concerning research decisions. (A minimal simulation of this intuition is sketched below.)
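The Tversky and Kahneman abstract above makes a statistical point that is easy to see by simulation: small random samples stray from the population far more often than intuition expects. The following sketch is purely illustrative and not from the paper; it flips a fair coin in samples of different sizes and counts how often the observed share of heads misses 50% by more than ten points.

```python
import random

# Illustrative simulation of the "law of small numbers": small random samples
# are far less representative of the population than intuition suggests.
# We flip a fair coin in samples of various sizes and count how often the
# observed proportion of heads strays more than 10 points from the true 50%.

random.seed(42)

def often_far_from_half(sample_size, trials=5_000, tolerance=0.10):
    """Fraction of samples whose heads-rate deviates from 0.5 by more than `tolerance`."""
    far = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if abs(heads / sample_size - 0.5) > tolerance:
            far += 1
    return far / trials

for n in (10, 30, 100, 1000):
    print(f"sample size {n:4d}: {often_far_from_half(n):6.1%} of samples miss 50% heads by >10 points")
```

With ten flips, a result as lopsided as 7:3 turns up in roughly a third of samples; with a thousand flips it is vanishingly rare, which is the gap in intuition the abstract describes.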
Buddhism and Ontology
- The Emerging Role of Buddhism in Clinical Psychology: Toward Effective Integration [PDF]
*** TODO - finish reading
Appendix: Braindump
- Central role of Buddhism in all of this
- Buddhism and non-attachment and misunderstandings
- **Addiction: extreme pathological attachment. Substance and behavioural addiction. What about “ontological addiction”:** the unwillingness to relinquish an erroneous and deep-rooted belief in an inherently existing ‘self’ or ‘I’, as well as the ‘impaired functionality’ that arises from such a belief
Buddhism, non-attachment and misunderstandings …
People seem to hear the Four Noble Truths and the attachment part as: “life is about the cessation of suffering, detachment is the way to do it, just hook yourself up to a morphine drip”. And they normally react to that with: “No, I don’t want to have a lobotomy, I want to live and experience, and I accept suffering, etc.” See e.g. this fairly nuanced look at the issue, which still seems to go down this route: http://gretachristina.typepad.com/greta_christinas_weblog/2010/03/secular-buddhism.html
E.g. you have translations like:

- Life is suffering.
- The origin of suffering is attachment.
- The cessation of suffering is attainable.
- The path to the cessation of suffering is the Eightfold Path.
In my view, this is a misinterpretation. It equates non-attachment with detachment rather than with engagement.
Mis-interpretation
- Getting a lobotomy
- Living in a hut in the forest
- Having no possessions
- Never doing what you enjoy because you get attached to it …
The truth is that peace and ultimate freedom can be attained right here within everyday life. So if stripping yourself of all possessions and worldly responsibility isn’t the point of non-attachment, what is?
Metaphors
Clench / unclench
Attachment ... Detachment
          |
          V
     Engagement
People think of non-attachment = opposite = detachment = apathy, disengagement, absence etc …
Not so!
Non-attachment != Detachment but rather = Engagement without attachment
Random Stuff
http://gretachristina.typepad.com/greta_christinas_weblog/2010/03/secular-buddhism.html
In the world of clinical psychology and social work, among attachment theorists and clinicians who study crying and grief, there are some who make a distinction between "sad crying" and "protest crying." "Protest crying" expresses the refusal to accept loss. It treats the fact of loss as a terrible injustice, and demands an immediate return of whatever it is that's been lost. It says, "I don't want this, and I don't accept it." (Not coincidentally, "protest crying" is more likely to elicit a hostile or irritated reaction from others, since it's out of proportion, disconnected from reality, and makes people feel manipulated.)
"Sad crying," on the other hand, expresses despair over loss. It expresses our recognition that whatever's been lost is really gone, and expresses our feelings of grief about it. It says, "I don't want this, but I understand that this is how it is." (And it's more likely to elicit sympathy and compassion and attachment from other people… the good kind of attachment, the clinical- psychology "connecting with others" definition of attachment, not the bad Buddhist definition.)
Appendix
Cognitive Dissonance
http://psychclassics.yorku.ca/Festinger/index.htm
Einstellung Effect
Crudely: Prior experience gets in the way of finding new ways of doing things? (Attachment to existing knowledge)
The counter-intuitive possibility that prior knowledge can have a negative effect on future performance is a theme in a range of areas of psychology that at first sight might seem unrelated. For example, in negative transfer paradigms previous experience makes it more difficult to adapt to a new setting than it would be without such experience.
http://dspace.brunel.ac.uk/bitstream/2438/2276/1/Einstellung-Cognition.pdf
Why Good Thoughts Block Better Ones: The Mechanism of the Pernicious Einstellung (set) Effect
Nice Example of Retained Beliefs
http://www.quackwatch.org/01QuackeryRelatedTopics/ideomotor.html
Some years ago I participated in a test of applied kinesiology at Dr. Wallace Sampson's medical office in Mountain View, California. A team of chiropractors came to demonstrate the procedure. Several physician observers and the chiropractors had agreed that chiropractors would first be free to illustrate applied kinesiology in whatever manner they chose. Afterward, we would try some double-blind tests of their claims.
The chiropractors presented as their major example a demonstration they believed showed that the human body could respond to the difference between glucose (a "bad" sugar) and fructose (a "good" sugar). The differential sensitivity was a truism among "alternative healers," though there was no scientific warrant for it. The chiropractors had volunteers lie on their backs and raise one arm vertically. They then would put a drop of glucose (in a solution of water) on the volunteer's tongue. The chiropractor then tried to push the volunteer's upraised arm down to a horizontal position while the volunteer tried to resist. In almost every case, the volunteer could not resist. The chiropractors stated the volunteer's body recognized glucose as a "bad" sugar. After the volunteer's mouth was rinsed out and a drop of fructose was placed on the tongue, the volunteer, in just about every test, resisted movement to the horizontal position. The body had recognized fructose as a "good" sugar.
After lunch a nurse brought us a large number of test tubes, each one coded with a secret number so that we could not tell from the tubes which contained fructose and which contained glucose. The nurse then left the room so that no one in the room during the subsequent testing would consciously know which tubes contained glucose and which fructose. The arm tests were repeated, but this time they were double-blind – neither the volunteer, the chiropractors, nor the onlookers was aware of whether the solution being applied to the volunteer's tongue was glucose or fructose. As in the morning session, sometimes the volunteers were able to resist and other times they were not. We recorded the code number of the solution on each trial. Then the nurse returned with the key to the code. When we determined which trials involved glucose and which involved fructose, there was no connection between ability to resist and whether the volunteer was given the "good" or the "bad" sugar.
When these results were announced, the head chiropractor turned to me and said, "You see, that is why we never do double-blind testing anymore. It never works!" At first I thought he was joking. It turned out he was quite serious. Since he "knew" that applied kinesiology works, and the best scientific method shows that it does not work, then – in his mind – there must be something wrong with the scientific method. This is both a form of loopholism as well as an illustration of what I call the plea for special dispensation. Many pseudo- and fringe-scientists often react to the failure of science to confirm their prized beliefs, not by gracefully accepting the possibility that they were wrong, but by arguing that science is defective.
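The double-blind protocol described above lends itself to a simple formal check: tally resist/yield outcomes against the decoded sugar type and test for association. The sketch below is hypothetical; the article reports no raw counts, so the numbers are invented purely to show what "no connection" looks like once the code is broken and the table is tested.

```python
from scipy.stats import fisher_exact

# Hypothetical tally of the double-blind trials described above. The article
# reports no raw counts, so these numbers are invented purely to illustrate
# the analysis: rows are the sugar actually given (decoded afterwards),
# columns are whether the volunteer resisted the arm push.
#                      resisted   did not resist
table = [[7, 8],   # glucose  ("bad" sugar)
         [6, 9]]   # fructose ("good" sugar)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")
# A p-value nowhere near significance is what "no connection between ability
# to resist and which sugar was given" looks like in a 2x2 tally.
```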
Speculation on Why We Are Attached to Views [Game Theory and Evolutionary Biology]
Why do we have such a strong commitment to views? In particular, why do we seem so emotionally involved in what we believe, finding ourselves angry and upset when our beliefs are threatened? (Think of the common agreement to have no religion or politics at the dinner table or in bars because it leads to unpleasant disagreements — even violence.)
Here I’m going to offer up some suggestions. These are a bit tongue in cheek:
Commitment problem
Basic idea: emotional “attachment” to our views is a solution to a commitment problem of credibility with others.
Abe and Ben are hunter-gatherers. Abe says to Ben: “I’ve seen lots of nice moose over that hill.”
Should Ben believe Abe and spend a day trekking over the hill to find the moose? After all:

A. Mistaken
   a. Abe could be talking out of his ass (he just says stuff like this all the time)
   b. Yes, he has some evidence, but not a lot
B. Lying

If “talk is cheap” then Ben probably should not believe Abe, as there is no reason for Abe to have very high credibility — one of those three options is probably true.
However, suppose instead that talk is not cheap: that Ben knows that Abe really likes to be right, that he gets upset and angry when he is wrong, or that he gets really guilty and stressed. That fact adds real weight to Abe’s claim. After all, if Ben does go over the hill and finds no moose, there will be an emotional cost for Abe.
=> This makes Ben more likely to believe Abe.
And, of course, this emotional cost to being wrong and payoff to being right will not be limited to the location of moose. It will naturally spread to include all beliefs and opinions we hazard.
This will quickly become a barrier to updating one’s views and opinions if updating is associated with “being wrong”. Such commitment to one’s pre-existing views and opinions will even spread to abstract ideas like belief in God. It may become so emotionally welded to our sense of identity and self that we will kill for our views and opinions — and admire those who do.
Furthermore, such strong feelings create problems for changing our views, especially when combined with the inevitable uncertainty of life. To go back to our story: perhaps the moose are not always in the same place, so the moose being absent when Ben goes over the hill does not necessarily mean that Abe was mistaken, and Abe has a strong incentive to interpret the data that way. With uncertainty, we now all have a valid reason to ignore and even discard new data points that are inconsistent with our pre-existing beliefs.
Aside on lying: the above explanation speaks more to option (A), the case of genuine mistake. However, the logic also applies in the lying case. Lying would also potentially have evolutionary benefits — Abe wants Ben to go to the wrong place so that Abe gets the food and Ben does not. However, lying well requires you to be credible. Once again, either having or being able to fake a real emotional cost to being wrong is valuable. In addition, with lying the ability to deceive yourself is very valuable — the best liars are the ones who believe their own lies. Thus, the lying option would also encourage a mechanism for deceiving ourselves, which would contribute even more to our tendency to ignore or dismiss evidence that contradicts our pre-existing views.
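To make the "talk is cheap" argument concrete, here is a minimal expected-value sketch of Ben's decision. All payoff numbers are invented for illustration; the only point is the direction of the effect: an emotional cost to Abe for being wrong makes his claim more informative, which can tip Ben from staying home to trekking.

```python
# A toy expected-value model of the Abe/Ben moose story. All numbers are
# invented for illustration; only the direction of the effect matters.

TREK_COST = 2.0      # Ben's cost of spending a day trekking over the hill
MOOSE_VALUE = 10.0   # Ben's payoff if the moose really are there

def ben_treks(prob_abe_is_right: float) -> bool:
    """Ben treks only if the expected value of the trip beats staying home."""
    return prob_abe_is_right * MOOSE_VALUE - TREK_COST > 0

# If talk is cheap, Abe says "moose!" whether or not he has good evidence,
# so his report carries little information about the world.
cheap_talk_credibility = 0.15
# If being wrong is emotionally costly to Abe, he only makes the claim when
# he has solid evidence, so the same words now carry much more information.
costly_talk_credibility = 0.60

for label, p in [("talk is cheap ", cheap_talk_credibility),
                 ("talk is costly", costly_talk_credibility)]:
    print(f"{label}: {'Ben treks' if ben_treks(p) else 'Ben stays home'}")
```

Note that the emotional cost never appears in Ben's payoffs at all; it works entirely by changing how much Ben can read into Abe's words, which is the sense in which attachment to being right functions as a commitment device.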
Background: Commitment Problems
Suppose two robbers, Abe and Ben, rob a jewellery store. After the robbery they need to split up to reduce their risk of being caught, and it is best if only one of them looks after the loot. Abe has all the contacts needed to sell it.
However, Ben is worried that if Abe has the loot he may just run off with it himself. What can they do?
The issue here is what game theorists like to call a time-inconsistency problem: at the moment just after the robbery, the best thing is for them to split up with only one of them holding the loot. But once they have done that, whoever has the loot has an incentive to make off with it all themselves. If they cannot solve this issue they may be forced to do something less optimal: split the proceeds right now, even though that exposes them to more risk and they won’t get as good a price, or stay together to sell the proceeds, which is even riskier.
What is really wanted here is a way for Abe to make a credible commitment to Ben that he will not run off. This is good for both Abe and Ben — they will both end up with less risk and more money.
A credible commitment for Abe needs to involve something with a greater cost than the benefit of ripping Ben off and taking all the loot for himself. One option might be that Abe’s daughter is married to Ben’s son. In that case, ripping Ben off will cause huge harm to his family. Another option might be that Abe knows Ben is a crazy guy who will stop at nothing if he is taken advantage of, that Ben would track him down and kill him — even if that meant risking huge jail time — just to get even if Abe ripped him off. Though it might sound bad, Ben being crazy in this sense would actually be good for Abe as it would provide a way for Abe to make a credible promise to Ben — Ben would know that Abe knew that he is a crazy guy and would trust Abe not to rip him off.
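The robbers' time-inconsistency problem can be written down as a one-line payoff comparison. Again the numbers below are invented; the sketch only shows that once Abe holds the loot, running off beats splitting unless some commitment device attaches a cost to betrayal that outweighs the extra gain.

```python
# A toy model of the robbers' commitment problem. All payoffs are invented.

LOOT = 100.0        # value of the jewellery once Abe sells it through his contacts
SPLIT_SHARE = 0.5   # Abe's share if he honours the deal with Ben

def abe_runs_off(betrayal_cost: float) -> bool:
    """Abe compares keeping everything (minus the cost of betrayal) with his agreed split."""
    return LOOT - betrayal_cost > LOOT * SPLIT_SHARE

# Without any commitment device, betrayal is free, so Abe runs off and the
# pair are stuck with the riskier, less profitable options described above.
print("no commitment device ->", "Abe runs off" if abe_runs_off(0.0) else "Abe honours the deal")

# Family ties, or a 'crazy' Ben who will retaliate at any price, attach a cost
# to betrayal larger than the extra loot, so Abe's promise becomes credible.
print("credible commitment  ->", "Abe runs off" if abe_runs_off(80.0) else "Abe honours the deal")
```

Family ties and a vengeful Ben are simply two different ways of making the betrayal cost large enough; in the moose story, the emotional pain of being publicly wrong plays the same role.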
The other classic example is nuclear weapons: remember Dr Strangelove and the Russians’ Doomsday Machine.
A classic issue in game theory is how to solve commitment problems.
Footnotes
1. http://plumvillage.org/mindfulness-practice/the-5-mindfulness-trainings/
2. Taken from the definition of “ontological addiction” in Shonin, E., Van Gordon, W., & Griffiths, M. D. (2013a). Buddhist philosophy for the treatment of problem gambling. Journal of Behavioral Addictions, 2, 63–71.
3. See e.g. http://www.buddhaweb.org/, http://gretachristina.typepad.com/greta_christinas_weblog/2010/03/secular-buddhism.html
4. Lewandowsky et al 2012, TODO: add the others. For numbers see e.g. http://www.jennymccarthybodycount.com/. “For example, following the unsubstantiated claims of a vaccination-autism link, many parents decided not to immunize their children, which has had dire consequences for both individuals and societies, including a marked increase in vaccine-preventable disease and hence preventable hospitalizations, deaths, and the unnecessary expenditure of large amounts of money for follow-up research and public-information campaigns aimed at rectifying the situation (Larson et al., 2011; Poland & Spier, 2010; Ratzan, 2010).” (Lewandowsky et al 2012, p. 107)
5. A summary of the evidence that the 1998 paper was defective and fraudulent can be found at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3136032, which links to most of the other papers on the topic.