How Big Is Your Halo? A Guest Post
In Monday’s post, I discussed the phenomenon of moral hypocrisy. Simply put, most of us judge moral transgressions committed by others more harshly than the same transgressions committed by ourselves, even though at an intuitive level we are equally disturbed by our own wrongdoing. Given this, the question becomes: Which response is optimal?
Are we better off trying to fool ourselves and others by engaging in some “rational” justification for our own transgressions, or would it be better to heed our intuitive (i.e., gut) responses? The answer, at least to me, is: It depends.
I know, you’re thinking “cop-out!” But before you start posting on my apparent equivocation, consider the following.
Humans must successfully manage many different social pressures. Two of the more important ones are to be viewed as a fair and honest partner and to be a loyal supporter of those comprising your group (who, historically, usually constitute your most frequent trading partners). Taken together, these motives suggest a competition between the impulse to be fair across the board and the impulse to appear more “moral” to, and supportive of, those with whom we most frequently interact.
Put simply, if we never feel any guilt at cheating the other guy, no one will ever see us as fair. However, sometimes it may be beneficial to excuse our own actions (or those of close associates) in order to maintain the aura of moral superiority while still taking advantage of an opportunity to gain resources, whether doing so directly through our own actions or indirectly through the actions of members of our ingroup.
This view of competing processes fits well with the data that I presented on Monday, but we’re missing one important part. If Valdesolo and I are right about the social nature of hypocrisy, then we should see some flexibility in the size of “moral halos.”
Now, I’m sure it wouldn’t come as a surprise to anyone to suggest that groups characterized by long-standing conflict might disagree about what constitutes moral behavior. But we felt the social nature of hypocrisy might go much deeper — that is, it might reflect a fundamental bias in the mind.
To test this, we replicated the experiments I presented on Monday with one important change. In addition to conditions where people judged transgressions committed by themselves and anonymous others, we also included conditions in which participants and confederates wore colored wristbands denoting whether they tended to “overestimate” or “underestimate” occurrences of mundane events. In reality, the “estimation test” used to classify people was bogus and wristbands were randomly assigned. Nonetheless, this simple assignment into novel, virtually meaningless groups was enough to stretch the bounds of moral leniency.
As you’ll see in the figure, hypocrisy readily emerged; individuals judged their own transgressions to be fairer than those of others — unless that other just happened to be wearing the same color wristband. In that case, he or she was just as angelic as the self. Thus, moral halos stretch to include those “like us” and thereby reveal morality’s sensitivity to social needs. We want to be seen as good partners and to assist others close to us in being seen as the same, but if we cheat or condemn “those other guys” now and again, it can serve both our economic and reputational interests — provided we can excuse it away.
So, my answer of “it depends” may not be that much of a cop-out after all. Remember, the mind wasn’t designed to be perfect, just to have systems that often work well enough to balance competing pressures. Still, it appears that our intuitions, or “feelings,” may represent our “better angels.” On Friday, I’ll be back with a discussion focusing on moral sentiments, how they impact cooperation, and how they may even provide a mechanism for the evolutionarily intriguing notion of upstream reciprocity (i.e., “paying it forward”).