Who Are You Calling a Hypocrite? A Guest Post

David DeSteno, a professor of psychology at Northeastern University, has done some interesting research on hypocrisy and morals, which has appeared recently in Newsweek and The Times. DeSteno has agreed to blog about his findings here.

With the election season fast approaching, epithets of “hypocrite” are flying. You know the implications — hypocrites cannot be trusted, they’re morally compromised. Whether pointing to Obama or McCain (and to whom you point may depend on your inclinations), there are lots of stones being hurled from glass houses.

Now we can all readily point out the high profile cases of hypocrisy we see on the evening news, and just as readily condemn them. These “hypocrites,” however, seem to be a corrupt few (well, few if you treat politicians as a single unit). Yet my collaborator, Piercarlo Valdesolo, and I thought that this seemingly distant phenomenon might not be as removed from everyday life as one might think. After all, decades of work in social psychology have shown that humans possess a very powerful motive to hold positive views of themselves.

David Dunning has repeatedly shown that most of us think we’re above average (explain that to your mathematician friends). Yet we certainly know that people often act in questionable, self-serving ways. These findings raise interesting questions. Are people typically hypocritical, and if so, why?

To examine this issue, we designed a simple experiment in which we confronted people with one of two scenarios. In the first, they were told that there were two tasks — one short and fun, the other long and onerous. Their job was to assign the tasks to the next participant and themselves. We told them that most people thought flipping a coin was the fairest way to do this (and we provided them with a computer program that would do a random coin flip), but they could arrive at the decision however they wished.

What we found will not increase your faith in your fellow human. Only 8 percent of participants decided to “flip the coin.” The rest assigned themselves the easy task.

Lest you think they didn’t see this as wrong, a separate sample of similar individuals unanimously indicated that simply giving oneself the preferable task was morally incorrect. We then asked the remaining 92 percent how fairly they acted. As you can see (in the control condition), they believed themselves untarnished, scoring above the midpoint on our 7-point fairness scale.

In the second scenario, we had a different set of participants watch another person commit the exact same transgression — he simply assigned himself the fun task without using the randomizer. When these participants judged the fairness of his actions, he got slammed. It’s not the act that counts, but rather who does it.

Now, this raises a problem. If we’re always ready to be hypocritical, what does this imply about our ability to be trustworthy partners? If we always cut ourselves slack without any pangs of guilt, why would we ever trust each other?

Our hunch was that somewhere lurking in the mind is a sensitivity to even our own fairness transgressions. Hypocrisy, then, might just be a post hoc justification to protect our egos.

To examine this issue we repeated the experiment, but had participants make judgments of fairness while completing a secondary task (referred to as a “cognitive load”) that prevented them from “reasoning away” their actions by keeping them somewhat distracted. These conditions would reveal the mind’s spontaneous response to breaking a fairness norm.


As you can see, hypocrisy disappeared. People were just as sensitive to their own transgressions as they were to those of others.

Thus, at heart, I would argue we’re designed to be fair, but left to the luxury of time and our own devices, hypocrisy readily emerges.

If this is true, what does that say about how we engage in relationships with others? Most people clearly possess a deep sensitivity to violations of fairness, yet most also readily act hypocritically when it serves them. Are we better off with the “honest veneer” that makes us seem moral and thereby worthy partners? Or, as one person asked me recently, “Might we be better off interacting with people with ADD as, given your distraction data, Prof. DeSteno, one might suppose them to be less hypocritical?”

I’ll post my ideas on these questions on Wednesday, along with examining how the processes underlying hypocrisy may interact with the dynamics of social affiliation and intergroup conflict. As a teaser, let’s just say that our “self-halo” may be a bit flexible by design. Until then, I look forward to your thoughts.

M Todd

My sister sent me a T-shirt with "I demand justice" in large letters and, underneath in smaller letters, "But if there must be injustice, let it be in my favor." That is the human condition in a nutshell.

The founders of this country understood this human nature all too well. That is why we have checks and balances, a bill of rights, and a justice system that depends on a jury of your peers. Each time we suspend these rights or ignore the responsibilities that come with them, self-interest and corruption soon follow.

A simple moral compass that works for me is the golden rule: "Do unto others as you would have them do unto you." Maybe this would serve as the kind of cognitive load the author is talking about. I know that when I "rationalize" my actions, my actions are anything but pure.

Dave DeSteno

Salty (#41),

Good question about behavior. We deliberately asked after they made their choice so that we could ascertain responses to acts in which they had already engaged. However, the impact on moral behavior is equally interesting. My sense is that if the load came first, you would see more "fair" or prosocial behavior, as at the moment of the decision to act, the intuitive response against acting selfishly would be more salient.

As for your other points about rights of refusal, etc., who knows. It could go many ways, as then we're getting into a more elaborate calculus.

Dave DeSteno

Dennis (#44),

Hold on till tomorrow, when I'm going to discuss how things change as a function of different types of targets/people. You're right in that group affiliation (i.e., mine vs. yours) will definitely come into play.

Chris S.

@43: Reread the post: "Only 8 percent of participants decided to “flip the coin.” The rest assigned themselves the easy task."

I read that as 100% taking the easy task themselves.


Hypocrisy and bias are two human traits that we'll never get rid of; in my opinion, they have their roots somewhere in our murky, distant evolutionary past and are a direct result of the development of the human forebrain.

Give a dog, cat, or chimp a choice between an easily obtainable treat and one that is difficult to obtain, and it will always go for the one most easily obtained. Of course, our forebrain is considerably larger than theirs, as is its ability to protect itself from negative thoughts about itself.

Our intrinsic bias, I feel, is related to survival of the fittest. My bias toward something subconsciously leads me to believe it is good for me, our country, my family, my political party, etc.

Our intrinsic hypocrisy is also related to the survival instinct. We rationalize our decisions so as to avoid guilt. Guilt has a long-term deleterious effect on one's psyche, and may lead to depression and decreased physical and social performance, both of which can affect success/survival.

A next step in the bias experiment may be to have the test subject assign the tasks to two different individuals to ascertain why he/she gives one task to one individual and not another.



I'm curious how many people gave themselves the onerous task and delegated the fun/short one to the next participant. Seems to me this is as fair as the coin toss, yet you haven't given us the percentage.

I would like to imagine that I would pick the long/onerous task because of, as somebody said, "mental brownie points". However, this is probably just me holding a positive view of myself :)

denis bider

I think the root cause of hypocrisy is that people feel like they need to aspire to some higher moral standards.

If everyone thought it was okay to do the selfish thing, as long as you don't break any laws; and if everyone thought that it's okay to do the altruistic thing only if you really feel like it; then I think the world would be a much less hypocritical, much more honest, much more reasonable place.

I, for one, see nothing wrong with assigning yourself the fun task, and sticking the next person with the tedious task. The net amount of tedium is going to be the same in any case. It isn't evil to choose to be the person who gets to experience the fun task. Makes no difference overall.

What would be more interesting is an experiment in which the tedious task would be somewhat _less_ tedious (T-X) if the subject chose to do it themselves, and somewhat _more_ tedious (T+X) if they left it to the other person to do it. In this case, choosing the fun task is actually 'evil', in the sense that it increases the net amount of tedium experienced. It would be interesting to see at what threshold (what ratio of X to T) people are willing to sacrifice themselves and choose the tedious task instead of the fun one.
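The arithmetic behind this proposed variant can be sketched in a few lines. This is purely illustrative; the function name and the values of T and X are my own assumptions, not part of any actual study:

```python
def net_tedium(T, X, chooser_takes_tedious):
    """Total tedium experienced under the proposed rules.

    The tedious task costs T - X if the chooser does it,
    and T + X if it is passed to the other person.
    """
    return (T - X) if chooser_takes_tedious else (T + X)

# Hypothetical values: baseline tedium T = 10, adjustment X = 2.
selfish = net_tedium(10, 2, chooser_takes_tedious=False)
altruistic = net_tedium(10, 2, chooser_takes_tedious=True)

# Keeping the fun task raises total tedium by exactly 2 * X.
print(selfish - altruistic)  # 4
```

So the "evil" of keeping the fun task scales with X, which is what would make the threshold question testable.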



A great post & interesting conclusions:
1. "It's not the act that counts, but rather who does it."
2. But ".....hypocrisy disappeared while under a cognitive-load"

I have two questions for David.

1. Does fairness of judgment, under load, change subsequent behaviour when confronted with the same task?

2. What if, during the experiment, participants were told that if they self-allocate the tasks, the other person will have one right of refusal? In the scenario where he refuses his allocated task, the job-allocator would have to do both jobs himself. The right of refusal, however, will not exist if they flip a coin at the beginning.
Will that lead to fairer judgment and more coin-flips?

If so, doesn't morally fair behaviour assume the pre-existence of "rights" (in this case, a right of refusal)?


Does this topic remind anyone else of C.S. Lewis?

Chris S.

It seems that hypocrisy requires a bit of mental effort.

So perhaps we should remain focused while we work, but multitask when we make judgements, especially of ourselves?


I'm not sure I see how this experiment works. Were the participants told "choose fairly" or just "choose?" Were they asked whether they thought it was important to act fairly in a situation like this?

I think most did the right thing - sure, it's more fair to flip a coin, but if I were watching people choose one activity for themselves (even knowing I'd have to do the other activity), I'd expect them to choose the easy one every time. They have no reason to try to be fair with me; they don't know me and have probably never even seen me.

It's an interesting line of research though. I'll look forward to seeing where it goes.

Dave DeSteno

Thanks, Adam. I guess comments are like press: there are no bad ones. I look forward to pummeling or praise (or anything in between)!


I'd take the fact that I was randomly chosen to assign the tasks as being fair enough already to get me the easy and fun one!


Highly interesting! Thanks for the post.
It may be a little far-fetched, but how can you ensure that those who didn't use the computer program to randomize didn't use another randomization device (like a real coin or a mental trick or...)?


Seems like another case of: It is not the crime, but the cover up. I have much more respect for people who at least admit to why they really make the choice vs. using false justifications.


I just wanted to comment on your story because no one seems to comment on anyone's but Levitt's. Keep up the good work!


What's the population size?

Dave DeSteno

Interesting point, Thomas. I suspect that phrasing it as a moral question may have inflated judgments a bit, but I agree that we would get the same (if slightly attenuated) results if we could ask it another way.

One related point is that research by Benoit Monin at Stanford has shown that people who engage in extreme examples of morality (i.e., levels far beyond those of the rest of us) often risk a backlash from their peers. So, there is pressure against being too much of a goody-two-shoes. Of course, that only holds when opinions/behaviors are public.

Dave DeSteno

G. T. Fangirl,

Interesting point. But it is only judgments of self that move (i.e., ratings of fairness in all other conditions are the same). Only under load does one's evaluation of her- or himself change to mirror those of others. To me, this suggests that there is a disconnect between what people say and what they feel (i.e., what their spontaneous evaluation is).


This post made me think of a good example of hypocrisy: When there is a line of people, practically everyone would agree it is unfair to cut in line. However, when there is a line of traffic, cars cut into lines all the time. And I am willing to bet when a person cuts in traffic, they feel their actions are justified. Perhaps this phenomenon is due to the anonymity of being in a car, but also I think for some reason we feel entitled to get as far as we can no matter who was there first.