We recently published a guest post on the ethics of the decision-making that led to the 1986 Challenger shuttle disaster. That post was adapted from the new book Blind Spots: Why We Fail to Do What’s Right and What to Do about It by Max Bazerman, a professor at Harvard Business School, and Ann Tenbrunsel, a professor of business ethics at Notre Dame. We then solicited your questions for Bazerman and Tenbrunsel, who now return with their answers. Thanks to all for participating.
Q. When thinking about one’s own “moral code” (i.e. standards we set for ourselves to try to make the right choices and live a more moral life), which approach makes more sense: setting the bar very high while acknowledging that you will never actually achieve it, or setting much lower expectations such that you might be able to achieve them? Example: “It is always morally wrong to lie, but I know that I will tell many lies in my lifetime, though I will try not to” versus “It is sometimes OK to lie in certain circumstances.” –Clancy
A. Great question. The answer is yes and yes – you need to set both a floor and a ceiling. Take the example of negotiations. It is always advised that you have a “walk-away” point (your reservation price) – the lowest price you are willing to accept if you are selling, or the highest price you are willing to pay if you are buying – as well as a “target,” or goal, toward which you can aspire. If you set only a reservation price, it turns out that you become anchored on that number and do only incrementally better than it. If you set only a target, you end up walking away from deals that are better than your reservation price. Setting both allows you to overcome the pitfalls of having one of those points but not the other.
The same is true in ethical decision-making. You need to draw a line in the sand and understand what values you will not compromise, but you also need to set a target so that you continually strive for improvement. We know from work on moral compensation (the tendency to behave ethically if you are focused on your unethical actions and unethically if you are focused on your ethical actions) that our behavior vacillates around a standard. So if the only standard you have is a low expectation of your morality, you won’t do any better than that. Setting a target helps you get off that uninspiring teeter-totter. However, if you focus solely on high expectations, the inability to achieve them will create an internal tension. That tension will be resolved either by discounting the importance of behaving ethically or by propagating illusions of ethicality about your own actions, neither of which will help you achieve your goal of behaving ethically. Setting both a floor and a ceiling lets you take advantage of the benefits of each while avoiding the negative effects that each alone can have on your ethical behavior.
Q. I hope that Chapter 7 on fixing corrupted institutions addresses the excessive influence special interests have on public policy. Many public representatives are essentially paid by unions and commercial interests to sabotage fair and reasonable policies. One possible remedy is publicly financed campaigns. The big question for me is why so few wealthy supporters of fair government are willing to put up the money to help this movement offset the special interests. –Paul Silver
A. Incentives – political bribes in this case – are prevalent because they are so effective. Take the example in Chapter 5 of Blind Spots. Max and his colleagues found that when asked to estimate the value of a hypothetical company, auditors representing the seller provided estimates that were 30% higher than those of auditors representing the buyer. Worse yet, this pattern of results held even when those providing the estimates were auditors at “Big Four” accounting firms. Given that this was a hypothetical situation and a hypothetical relationship, it is not a far leap to say that these effects are multiplied in real situations involving real relationships.
As you aptly point out, in the political arena public representatives are biased toward, and essentially owned by, those who write the biggest campaign checks. Publicly financed campaigns, or other forms of campaign reform, would help to eliminate these embedded conflicts of interest. Why then hasn’t this idea progressed despite its apparently sound logic? Because those who benefit from the current situation (the big check writers) are biased toward a system that preserves that benefit. While some wealthy supporters may favor “fair government” as you suggest, we know that their idea of “fair” is biased toward what benefits them. So when push comes to shove, when the pen must meet the check, what is fair is the system that benefits them.
Q. It seems to me that failing to do what’s right thrives primarily in environments where decision-making is vested in one individual (or in large organizations, very few individuals) who deems it unnecessary to seek input from those who implement or are the objects of those decisions. Some of us seem prepared to gamble with the ethics of a decision and ignore/overlook doing what’s right in the belief that somehow the reward will be worth it. We can do that more easily alone or in very small groups, but not in more open and collaborative environments.
Isn’t that evident in the behavior of women, among whom sharing information, group discussion and decision-making, and consensus are more the norms and which seem to lead to fewer ethical blowups? Are men somehow more hardwired to achieving positions where leading from the front and alone can seem to be a greater accomplishment than doing what’s right? Does all this mean that women are morally superior to men or simply that they act and behave in ways that make it more difficult to avoid doing what’s wrong and easier to do what’s right? –Ray
A. As mentioned in Blind Spots, isolation can be dangerous if it insulates us from the ethical norms of our larger environment. Individuals can be isolated as you suggest, but isolation is not limited to individuals alone. Groups can also be isolated. Like individuals, groups that are powerful within an organization are often “untouchable” and thus can develop their own set of values. While sharing information among group members has the potential to make ethical norms salient, we know from work on groupthink that very cohesive groups can also develop a feeling of invulnerability and support a distorted view of what is right. An experiment conducted by Francesca Gino, Shahar Ayal and Dan Ariely found that when individuals were exposed to the cheating behavior of another person, their cheating increased. That was particularly true when the initial cheater was a member of the same group as the decision-maker. So being exposed to others and having an open conversation is not a panacea for ethical behavior. It depends on who those others are and what the conversation is about.
Q. Who decides what’s right, and why should I go along with their decision even when my ideas of what’s right are completely different?
For instance, I have absolutely no ethical objections to using steroids. If I were in a field where marginal improvements in athletic performance would make the difference between success and mediocrity, I might well choose to use them. I wouldn’t see anything unethical about doing so. Indeed, if there’s any unethical behavior at all, it’s that of the people paying for performance, while at the same time hypocritically punishing those who try to attain that performance. –James
A. In Blind Spots, we make the point that we do not define what is right. Individuals may rely on many sources to make that determination—normative philosophy, their religion, a respected friend. Though it is an important question, it is not our focus. We examine how individuals violate their own standard of what is right without knowing that they’re doing so. The use of steroids may not pose an ethical dilemma for you (though we predict that most would agree that lying about such use is unethical). What’s important is that you are aware of the ethical tricks your mind may play on you in situations that you do believe have ethical ramifications.