“Do as I say, not as I do”

“Do as I say, not as I do” by Sharon Begley.

Finding examples of moral hypocrisy is just too easy, what with Eliot Spitzer (former New York governor; moralizing proponent of laws against sex tourism and prostitution; named in an “Escort Service” sting in March) coming along just as memories of Mark Foley (ex-congressman; crusader …; sent sexually explicit messages to congressional pages) and Sen. Larry (“family values,” “wide stance”) Craig were mercifully fading. But while the ubiquity of hypocrisy can sour you on human nature, there’s a bright side: scientists have lots of examples to study as they look for ways to make hypocrisy a little less common than breathing.

Scientists have long bickered over whether hypocrisy is driven by emotion or by reason; that is, by our gut instinct to cast a halo over ourselves, or by our efforts to rationalize and justify our own transgressions. In other moral judgments, brain imaging shows that the regions involved in feeling, not thinking, rule. In “the train dilemma,” for instance, people are asked whether they would throw a switch to send an out-of-control train off a track where it would kill 10 people and onto one where it would kill one person. Most of us say we would. But would we heave a large man onto the track to derail the train and save the 10? The calculus is the same, but the emotional component (heaving someone to his death rather than throwing an impersonal switch) is repugnant, and the brain’s emotion regions scream, Don’t!

The role of emotion in moral judgments has upended the Enlightenment notion that our ethical sense is based on high-minded philosophy and cognition. That brings us to hypocrisy, which is almost ridiculously easy to bring out in people. In a new study that will not exactly restore your faith in human nature, psychologists David DeSteno and Piercarlo Valdesolo of Northeastern University instructed 94 people to assign themselves and a stranger one of two tasks: an easy one, looking for hidden images in a photo, or a hard one, solving math and logic problems. The participants could make the assignments themselves or have a computer do it randomly. Then everyone was asked how fairly they had acted, from “extremely unfairly” (1) to “extremely fairly” (7). Next they watched someone else make the assignments and judged that person’s ethics. Selflessness was a virtual no-show: 87 of the 94 people opted for the easy task and gave the next guy the onerous one. Hypocrisy, however, showed up with bells on: every single person who made the selfish choice judged his own behavior more leniently (on average, 4.5 vs. 3.1) than that of someone else who grabbed the easy task for himself.

The gap might not be on a par with delivering homophobic sermons while having a gay affair, but it suggests how that kind of hypocrisy is possible. For one thing, people’s emotions might have gotten the better of them, just as emotions drive the runaway-train dilemma. When we judge our own transgressions less harshly than we judge the same transgressions in others, DeSteno said, it may be because “we have this automatic, gut-level instinct to preserve our self-image. In our heart, maybe we’re not as sensitive to our own transgressions.” As Dan Batson of the University of Kansas, a pioneer in hypocrisy studies, puts it, “people have learned that it pays to seem moral, since it lets you avoid censure and guilt. But even better is appearing moral without having to pay the cost of actually being moral,” such as assigning yourself the tough job.

To test the role of cognition in hypocrisy, DeSteno had volunteers again assign themselves an easy task and a stranger an onerous one, but before judging the fairness of their actions, they had to memorize seven numbers. This kept the brain’s thinking regions too tied up to think about much else, and it worked: hypocrisy vanished. People judged their own selfish behavior as harshly as they judged others’, strong evidence that moral hypocrisy requires a higher-order cognitive process. When the thinking part of the brain is otherwise engaged, we’re left with gut-level reactions, and we intuitively and equally condemn bad behavior in ourselves as well as in others.

If our gut knows when we have erred and judges our transgressions harshly, then moral hypocrisy might not be as inevitable as it would be if it were the child of emotions and instincts, which are tougher to change than thinking.

Since it’s a cognitive process, we have “volitional control over it,” argues DeSteno. That matters because of another nasty aspect of hypocrisy: we apply the same moral relativism when judging the actions of people like ourselves. When “people like us” torture, it’s justified; when people unlike us do, it’s an atrocity. When we make that judgment, the brain’s cognitive regions are the hypocrites; the emotional regions make honest judgments and see the heinous behavior for what it is. As with other forms of judgment, the way to change hearts and minds is to focus on the latter: appeal to our better angels in the brain’s emotion areas, and tell the circuits that are going through cognitive contortions to excuse in ourselves what we condemn in others to just shut up.
