
Helpless, helpless, helpless

http://motherjones.com/politics/2011/03/denial-science-chris-mooney
The Science of Why We Don't Believe Science
How our brains fool us on climate, creationism, and the vaccine-autism link.
— By Chris Mooney

http://www.nytimes.com/2011/04/21/opinion/21bazerman.html
Stumbling Into Bad Behavior
Op-Ed Contributor
By MAX H. BAZERMAN and ANN E. TENBRUNSEL

I see a lot of articles lately about how we can't control our thoughts and beliefs as much as we would like to think we can.

I tend to believe this, and I think it's a good countermeasure against the kind of thing that Barbara Ehrenreich describes in Bright-sided: the notion that positive thinking is guaranteed to bring you everything you want, if you only work at it hard enough.

But I also note that it can be used to excuse behavior that causes harm.

Bazerman and Tenbrunsel:
we have found that much unethical conduct that goes on, whether in social life or work life, happens because people are unconsciously fooling themselves. They overlook transgressions — bending a rule to help a colleague, overlooking information that might damage the reputation of a client — because it is in their interest to do so.
...
When we fail to notice that a decision has an ethical component, we are able to behave unethically while maintaining a positive self-image. No wonder, then, that our research shows that people consistently believe themselves to be more ethical than they are.
...
In the run-up to the financial crisis, corporate boards, auditing firms, credit-rating agencies and other parties had easy access to damning data that they should have noticed and reported. Yet they didn’t do so, at least in part because of “motivated blindness” — the tendency to overlook information that works against one’s best interest. Ample research shows that people who have a vested self-interest, even the most honest among us, have difficulty being objective. Worse yet, they fail to recognize their lack of objectivity.
...
A solution often advocated for this lack of objectivity is to increase transparency through disclosure of conflicts of interest. But a 2005 study by Daylian M. Cain, George Loewenstein and Don A. Moore found that disclosure can exacerbate such conflicts by causing people to feel absolved of their duty to be objective. Moreover, such disclosure causes its “victims” to be even more trusting, to their detriment.

The Chris Mooney article addresses a lot of current research on why different political groups believe different things. I haven't quoted much of that per se because I'm not interested in the political angle so much as the general notion of how bias unconsciously affects people.
an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal....It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
...
In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers.
...
people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place
...
A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information....Indeed, there's a sense in which science denial could be considered keenly "rational." In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change,' that's going to mark them out as a certain kind of person, and their life is going to go less well."
...
political sophisticates are prone to be more biased than those who know less about the issues. "People who have a dislike of some policy—for example, abortion—if they're unsophisticated they can just reject it out of hand," says Lodge. "But if they're sophisticated, they can go one step further and start coming up with counterarguments." These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right—and so their minds become harder to change.

I wonder whether there are ways people can learn to see their biases better and compensate for them. Or is this all leading to "We're all hopelessly biased and rationalizing animals, so we might as well not even try to get closer to an agreement on what's happening, what's causing what, and what causes harm?"

This entry was originally posted at http://firecat.dreamwidth.org/716666.html.

Comments

( 9 comments — Leave a comment )
chaos_by_design
Apr. 21st, 2011 11:49 pm (UTC)
Or is this all leading to "We're all hopelessly biased and rationalizing animals, so we might as well not even try to get closer to an agreement on what's happening, what's causing what, and what causes harm?"

I do think that honesty and a willingness to take a cold, hard look at yourself and acknowledge your own biases as much as you can is a good first step. If you're aware of your biases and emotional thinking, that at least gives you some tools to deal with them, even if you can't get rid of them completely.
opalmirror
Apr. 22nd, 2011 02:13 am (UTC)
Democracy: the tyranny of truthiness.
beaq
Apr. 22nd, 2011 07:18 am (UTC)
Hrum. Institutionally, isn't science a way of compensating for biases? It's not going to decide what's important and what constitutes "harm", but it's at least a way to record what's happening. Over the long haul. Individually, I think maybe the only thing to do is uh. Yeah. Teach people early on about how interpretive bias and rationalization work?
firecat
Apr. 22nd, 2011 07:20 am (UTC)
Institutionally, isn't science a way of compensating for biases?

Yes. But the article goes into the extent to which biases come back into play as soon as any scientific research is reported outside the immediate group studying that field. (The article says that people decide whose qualifications to trust based more on whether they agree with the results than on whether the qualifications are sound.)
beaq
Apr. 22nd, 2011 07:27 am (UTC)
Right. I guess I wasn't sure what the conditions were. Do you mean, how do we, individual people, decide what science is good science?
beaq
Apr. 22nd, 2011 07:28 am (UTC)
If so, then, uh. Good science education?
beaq
Apr. 22nd, 2011 07:43 am (UTC)
Which I couldn't really define except insofar as it should make a person skeptical.
firecat
Apr. 22nd, 2011 09:35 am (UTC)
I think teaching people how to be skeptical would be a darned good idea. The problem is that then they might turn around and be skeptical of authority figures who want to control them.
pingback_bot
Apr. 24th, 2011 03:12 am (UTC)
Linketies and life
User moominmuppet referenced to your post from Linketies and life saying: [...] recent articles; "The Science of Why We Don't Believe Science" and "Stumbling Into Bad Behavior" [...]