Episode Transcript
Steven LEVITT: My guest today, Tim Harford, has an uncanny gift for making economics interesting. His first book, The Undercover Economist, came out over 15 years ago, and it’s still the first book I recommend to anyone who asks me for advice about what to read. He’s wildly popular in the United Kingdom, where he has a weekly column in the Financial Times, and he hosts a popular BBC radio show that investigates the accuracy of statistical claims in the news. Outside the U.K., well, he’s not as well known as he should be, but that might change with his latest book, The Data Detective: 10 Easy Rules to Make Sense of Statistics.
Welcome to People I (Mostly) Admire with Steve Levitt.
LEVITT: I first met Tim Harford right before Freakonomics was published. I was literally the first person he ever interviewed. He wasn’t even a journalist. He was working at the World Bank hoping to start a second career. Usually, I would turn down interview requests like that without a second thought, but something made me think it would be worthwhile. And indeed, he turned out to be the second most interesting, clever journalist I’ve ever interacted with. After Stephen Dubner, of course.
* * *
Steven LEVITT: Tim Harford. In the past, it’s always been you interviewing me. Today, I exact my revenge. And I hope you’re ready.
Tim HARFORD: Ready as I’ll ever be, Steve. It’s great to join you on the show.
LEVITT: We’ve all heard of Florence Nightingale, “the Lady with the Lamp,” who became famous during the Crimean War and was a pioneer of nursing. But in your new book, The Data Detective, you tell a side of Florence Nightingale I had never heard of. Would you mind telling it to the listeners?
HARFORD: Florence Nightingale — she’s famous as a nurse. She was the first female member of the Royal Statistical Society. She’s celebrated as a statistician in nerd land. And Steve, I thought you were a citizen of nerd land, so I thought you would have known this, but perhaps not.
LEVITT: I guess I’m not an official member yet in nerd land.
HARFORD: So, as well as being this remarkable nurse and remarkable pioneer in nursing and the education of nursing, she was big into pie charts. And she was big into data. And the story that I tell is of the most famous data visualization she produced, something called the rose diagram. She’s charting the soldiers dying in the Crimean War, which is this grim and pointless war in Crimea that the British got involved with.
LEVITT: Where does she even get the data?
HARFORD: She visited Crimea, but she was in Istanbul, which is where the hospitals were. Part of the data she gathered herself. Most of it she put together very carefully afterwards. But the message that she wanted to bring out in the data was pretty simple, which is: a whole lot of soldiers died. And they died of communicable diseases such as dysentery and cholera. And that particularly happened in the first half of the war.
And then halfway through the war, the British medical system really cleaned up its act, literally, because they found a dead horse in the drinking water. They found that the latrines were leaking into the drinking water supply. It was just appalling. She got a commission to come from the U.K. to help her clean up these hospitals. And then, after that, the soldiers stopped dying of communicable diseases.
The rose diagram is basically two blue spirals. So, these two spirals are showing what has happened over the course of two years. And one of them is large and growing and the other is much smaller and shrinking. And they are showing deaths because of communicable diseases.
The story she was telling was: it turns out it’s a really good idea to clean your hospitals and to wash your hands and to take really good care of hygiene. What makes it such a powerful story is you’ve got this sense of before and after. Look at the catastrophe in the first year. And then, we changed everything. And, then look at what happened and how we fixed the problem and people stopped dying.
LEVITT: It’s a remarkably modern way of thinking. I always thought of her as being saintly. But now, I will elevate her to whatever is right above saint.
HARFORD: You could criticize her on the grounds that this data visualization is — not misleading. All the data’s there; it’s all accurate — but the way that she framed the chart made it look like there’s absolutely no possible doubt that that was what was going on. There is something that makes me slightly uncomfortable when I see any argument presented in a particular way that really slants our view as to the truth.
There were just more casualties in the first half of the war. So, there were more people arriving in the hospitals. So, maybe it’s not a surprise that more people died. She was completely scrupulous in the way she used the data. And I think ex post she was completely right: hygiene standards really matter. But you could argue that a more modern, neutral presentation of the same data would have left open questions as to what was really going on. You’ve got to remember all this is happening before germ theory, before Louis Pasteur, before we really knew what was causing these diseases.
LEVITT: Well, it’s also true — I think most people don’t understand how recent statistics is. I mean, math goes back forever, before the Greeks. But our understanding of statistics is remarkably modern. And in the 1850s, statistics was just a mess. And what is so interesting about the history of it is just how bad everyone was at it and how the first people who tried it got it all wrong. And it’s not the way that science is usually written where a genius like Einstein or Darwin comes in and just tells you the right answer and everything changes. It’s hard.
HARFORD: It is properly hard. And Nightingale, I should say, she wasn’t a professional statistician. She was very smart. And she had a lot of good quality mathematical training. She worked very carefully with some of the leading statisticians of the time. But she was in this very interesting position of being very influential in the sense that she knew a lot of powerful people. She was literally the most famous person in Victorian Britain, except for Queen Victoria.
Yet, at the same time, she’s a woman in a man’s world. She’s trying to challenge the British medical establishment and the British military establishment. She’s telling them they’re doing it all wrong. She was basically saying, “We need massive public health reform because the hospitals are breeding grounds for disease.” And the chief medical officer at the time, a guy called John Simon, was saying, “Well yeah, disease is bad, but there’s nothing we can do about it.”
And she was saying, “No, we can do something about it. We just need to clean everything up. We need much better sanitation, much better public health measures.” She basically won the argument with her graphs. And the British parliament changed the laws, medical practice changed, public health standards improved, and life expectancy jumped. She really changed standards of health in the U.K. It was remarkable.
LEVITT: I think there’s a deep problem with modern nonfiction, and that is the idea that the book needs to have a theme, that it can’t just be a bunch of good stories, that it has to be something more. I would say your new book, The Data Detective is one of the most wonderful collections of stories that I have read in a long time. I read it in two days. It is fascinating.
But I think if I asked you what your book was about, you’d say, “Oh, this is a book that’s been written to help laypeople better evaluate the truth or falsehood of data-based arguments through the use of 10 rules of thumb.” That’s probably how you’d describe the book.
HARFORD: Yeah. I think that’s about right.
LEVITT: But I think that’s a total ruse. I think you use that as an excuse to tell your wonderful stories, because you can with a podcast just tell great stories, or a column. But if you write a book, you have to do something more. Do you think I’m being unfair?
HARFORD: Yeah. Well, I’m immediately trying to remember exactly what that really nice thing was that you said before you said “but.” I need to write it down and stick it on the cover of the paperback. Steve Levitt says this is the greatest collection of stories ever.
Look, I’m a fan of stories. I like reading good storytelling. I love the challenge of telling a good story, because it’s really hard to tell a good story while also being truthful and rigorous and telling people everything they need to know. But stories always simplify. They always leave certain things out. And that I think poses a challenge: what do we do about that with nonfiction?
I’m not too worried, myself. I try to be like Florence Nightingale. Make sure the facts are unimpeachable, make sure everything is absolutely correct, and then tell your cool story. But even if I live up to that standard, people might not remember the cautionary details. They might not remember that little wrinkle or that complexity. What you write as a writer and what people take away, what they recall as a reader, they’re not always the same thing.
LEVITT: That’s for sure. And I will say, when I read books that are popularizing social science, there’s typically some material in there I know well enough to evaluate whether people are being truthful. I will be honest with you that most of the popularizers of social science, I think they’re fundamentally dishonest in the way that they take the ideas of academics and warp them to tell stories that they would like to tell. I’ve never seen you do that. I was going to ask you actually whether that was something that is really a goal of yours. And it sounds like the answer is absolutely, yes. You hold truth to be very, very important.
HARFORD: Yeah, I like to tell people the truth. And I’m conscious that, of course, you always make mistakes. I get things wrong. And when I get things wrong, I publish corrections where I can. But I like to get it right. Actually, just before we began this interview, I was contacted by the world’s leading scholar of Florence Nightingale because she had read something I wrote, and she really liked it. And she wanted to see the whole book. I sent the chapter off to her.
But of course, I’m now terrified because I know she’s going to find something that’s wrong. But at the same time, I’m like, “Well, I tried my best. I hope it’s right. And if there’s a mistake, well, I’m going to find out, aren’t I? She’ll tell me.” So, yeah, it’s important to me. What is the point of being interested in the world, what is the point of being interested in the data, in evidence, in ideas, if you then go out and misrepresent them?
But I often find that actually mistakes are instructive. I think admitting that you’ve made a mistake and discussing why you made the mistake is a public duty, but it’s also fun. This is completely trivial but on the radio program I present for the BBC called More or Less, which is about math and statistics, we had a piece about a pop song by Kate Bush, which has her singing nearly 100 digits of the decimal expansion of Pi, because she’s a geeky person. And it’s quite beautiful.
So, she’s just singing these numbers. And there’s a mistake in that, probably because a producer just spliced together some tape. There’s a number missing. We had this fun item about this. And then, I said, “Well, the decimal expansion of Pi is infinite. So, I guess eventually whatever string of numbers she sang does in fact appear in the decimal expansion of Pi.” And it turns out that’s true if a certain fact about Pi is true, and that certain fact about Pi, no one’s ever been able to prove.
So, we were able to get a mathematician on to get really deep into number theory and we were just in there having fun with numbers. And never would have happened if I hadn’t made this mistake. Of course, zero stakes there. Nobody cares about that mistake. But I think it’s always better to own up and to try to learn and to try to teach others rather than just sweep it under the carpet.
LEVITT: From a social perspective, of course, that’s true. From a private perspective, I have found the mistakes I’ve made to be incredibly painful, embarrassing, time-consuming. So, lots of people don’t like me. When I make mistakes, it gives people remarkable fodder to come and attack me. So, what I really have tried to do is to stop making mistakes.
So, when I hire a research assistant, I say, “The only thing that matters to me is that if you ever find a mistake, you tell me that you found the mistake. And I don’t care if it’s the last day of your job and every single thing that you worked on will be a waste.” For me, it’s a cost-benefit thing now, that I avoid mistakes with every ounce of my body.
HARFORD: Yeah. I think it’s important, but it is tricky. And that conversation that you have with your research assistants, I wonder how many people have that conversation. I think a lot of organizations, whether it’s just a small relationship between an academic and a research assistant or whether it’s a big hierarchy, there are incentives to just bury bad news.
LEVITT: So, in your new book, you have these rules of thumb for being a lay-consumer of statistics. And I think they all make a ton of sense. Things like, “Don’t let your wishful thinking blind you,” or, “Try to put any specific set of arguments into a broader context,” or, “Consider your personal experience with things as you evaluate whether a data-based claim makes sense.” But there were two things I wished you had done in the book that you didn’t do.
The first was to acknowledge just how difficult it really is for a lay person or someone who’s very expert about something, but not expert about the exact topic at hand, to be very insightful about whether it’s true. I’m really struck by how difficult it is for me to evaluate the truth in economic debates that are just a few steps away from what I know.
And I think it’s a helpful starting point to begin with the view that, look, I probably can’t figure out whether something I’m reading in the paper is right or wrong. And it’s a different stance because I think somehow we’re trained, especially as social scientists, to believe that we can get to the truth. But if everyone could just start by saying, “Look, how in the world do I know whether some argument made by a physicist is right or wrong? I don’t. I can’t. I should use a lot more caution as I approach the world.”
HARFORD: Well, before you tell me the other thing that you wished I’d done, let me respond to that. You’re right. But I really wanted to encourage people to have a little bit more confidence in their critical judgment and their ability to ask smart questions in evaluating the statistical arguments that get made. The reason I wanted to do that was partly because I think there’s a lot of negative messaging around. There’s a lot of people saying, “You can’t understand. It’s super complicated. There’s no way you’ll ever be able to know, so just give up and do whatever it is that the newspaper columnist that you follow tells you to think or the political leader that you like — whatever he or she tells you to think, just follow that.”
I wanted to push back against that. And say, “No, you can ask smart questions, and you can derive some insight.” And the reason that I believe that that is possible is because a lot of these questions are actually not that hard. And you’re right, Steve, I can’t critically evaluate your abortion and crime work. It’s too complicated for me. I’ve heard you describe it. I’ve heard you explain why you think it makes sense and that makes sense to me.
But, if someone were to ask me for an independent evaluation of whether I think you’ve made a mistake or whether everything’s solid, I would have to say, “I don’t know. I don’t have the technical expertise.” So, some of this stuff’s too hard. But a lot of it’s not that hard.
Let me give you a specific example. The health secretary of the U.K. said over the summer, “If everybody in the country who was overweight lost five pounds, then the British National Health Service would save £150 million over five years.” That’s about $250 million. And loads of people emailed me and said, “Hey, you’re the data guy, Tim. How does he know this? What’s the evidence base for this?”
My answer was, hang on, we don’t need to go into an evaluation of the evidence base for this claim. We just need to understand what the claim is. The population of the U.K. is about 70 million. So, he’s basically said if people lost weight, we’d save a couple of pounds per person. Actually, if I remember right, it was £100 million, so it was £1.50 per person. And it was over five years. So, now, he’s talking about 30 pence per person. It’s about 50 cents.
So, what he’s saying is if everybody who’s overweight lost weight, then the U.K.’s healthcare system would save 50 cents per person per year. At which point you go, “Well, it doesn’t really matter. It’s just a distraction. It means nothing.” My nine-year-old son could do the mathematics required to solve that problem. You don’t even need a calculator.
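Harford’s back-of-the-envelope arithmetic can be written out in a few lines of Python. The figures below are the rounded numbers quoted in the conversation, not official statistics:

```python
# Fermi check of the quoted claim: £100 million saved over five years,
# spread across a U.K. population of roughly 70 million people.
total_saving_gbp = 100_000_000
years = 5
uk_population = 70_000_000

per_person = total_saving_gbp / uk_population    # roughly £1.43, "about £1.50"
per_person_per_year = per_person / years         # roughly £0.29, "about 30 pence"

print(f"£{per_person:.2f} per person over five years")
print(f"£{per_person_per_year:.2f} per person per year")
```

The point of the exercise is that no evidence base needs checking: dividing the headline number by the population and the time span is enough to see the claim amounts to pennies per person per year.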
And there’s a lot of claims that get made that you don’t need to go very far before you can say, “Yeah, that makes sense. That really helps me understand.” Or, “Oh, this is complete nonsense. It’s obviously wrong by a factor of 1,000. And I can see that clearly.”
LEVITT: What I hear you saying is that there are two ways in which arguments can go wrong. One is that the facts can be off. And the other is that the interpretation made, given a set of facts, can be wrong. I think that if we divide problems into those two pieces, everything becomes much simpler.
If people don’t agree on facts, then we should go and evaluate the facts and figure out what the facts are. If they don’t agree on the interpretation, then I think that is a much easier problem for the human brain to tackle than the problem which is: how do I take storytelling and facts and everything all mixed together and try to parse out the importance? And I think that’s actually in the background in your book, that’s really lurking in your book and your own thinking.
HARFORD: Yeah, I think that’s right. It comes to the foreground in the conclusion where I talk about the illusion of explanatory depth, which I love. The illusion of explanatory depth is basically: if you ask people, “How well do you understand how a zipper works on a scale of, say, zero to seven?” Most people will say, “Yeah, six, I understand it pretty well.” And then, you say, “Oh, great, here’s a pen, here’s some paper. Use diagrams, bullet points, whatever. Just explain to me exactly how it does work.”
And then, they realize, actually, I don’t really know how it works. The illusion of explanatory depth says just asking people to lay out the facts may help them to understand that maybe they don’t actually know the facts. Maybe they don’t understand the thing that they’re arguing about. It turns out that if you use a similar tactic for say policy choices — so, you say, “Just explain to me how a cap-and-trade system would work.” People who are willing to die in a ditch over whether cap and trade is a good response to climate change or not, it turns out they don’t really know how it works.
When you ask them to explain it, they start to realize, “Oh, I don’t completely understand this. Maybe I should moderate my political views. Maybe I shouldn’t be so critical of people who disagree.” This process of laying out the facts, which I think is worthwhile in and of itself, there’s this bonus which it actually gets people to reflect and be a bit more humble about the limits to their own knowledge.
LEVITT: The other piece that I think is really important for lay people understanding data that I didn’t see you cover in the book that I want to mention is thinking hard about the incentives of the people who are putting forth the argument. And being suspicious of any argument in which the incentives are such that the creators of the argument could benefit in any way.
I’ll give you an example. So, in the academic literature on guns, I have never seen an academic paper where, from the name of the author and what the author has written before, I didn’t already know the answer the author would find. There just aren’t cases where people who are pro-gun suddenly look at a data set and say, “Oh my God. This particular question I just asked leads me to believe that guns could be bad in this setting.”
I’m very skeptical of that literature. And my rule of thumb is the more ignorant I am of a particular topic, the bigger the weight I put on simply looking at the incentives of the providers of the information and judging the veracity based on that.
HARFORD: I think there’s a lot of wisdom in that. The reason that I didn’t do that — it’s not because I don’t agree. Because I do agree. It’s because I feel that people have received that message over and over again. I think people are constantly being told to be suspicious of the motives of the people who are telling them things. And I think we may have gone too far because although it’s true, I think it’s bred a lot of cynicism. A lot of people worry that we’ll believe anything.
And actually, what I worry about is that we believe nothing at all, that we’re completely skeptical of everything and we just think, “Well, they’re all lying to us. It’s all fake news.” So, that’s what was very much on my mind in writing the book. And actually, it’s interesting because Freakonomics is a book that doesn’t make that mistake. So, Freakonomics right from the start, is a book that says, “Hey, let me tell you something really interesting about the world using this data.” But most books about data actually don’t do that.
Most of the books about data that I’ve got on my shelf are written by eminent economists, statisticians, explaining all the different ways in which data can be used to lie to you. And of course, it’s a really engaging way to talk about data, but there is this worry that I have that people hear that message over and over and over again. And in the end, it becomes an excuse to just go, “I can’t believe any of these people. I don’t trust any of the experts. I’ll just believe whatever my gut tells me, whatever I feel should be true. And I’m not going to look at any evidence because you can’t believe any of it.”
You’re listening to People I (Mostly) Admire with Steve Levitt and his conversation with economist and author Tim Harford. After this short break, they’ll return to talk about gun data and Tim’s honorary title bestowed by the British Monarchy.
* * *
LEVITT: I love the exchange we had where I said to Tim, “Your book is full of wonderful stories,” and then I went on to say all the things I didn’t like about it. And his response wasn’t to be defensive, but instead to say, “I need to write down all the nice things you said before the word ‘but’ and put it on the cover of my paperback.”
What a great reaction. And I suspect my partial quote really will end up on the cover of his book. In the second half of the interview, I’ll try to get him to retell one of my all-time favorite stories of his and also get him to reveal his secrets for telling great stories about data and economics.
HARFORD: Steve, I wanted to ask you about gun data, actually. So, one of the points that I make in the book is: we shouldn’t take the data for granted, because it’s easy to have this mental model, certainly as an outsider, that data is just something that exists in spreadsheets. You can just download it from the Internet and you crunch it. And then, once you’ve crunched it, then out come insights. And actually, data has to be gathered. It doesn’t just accumulate by accident.
We should be aware that there are certain bits of data that could be gathered and just aren’t. That’s what I wanted to ask about guns, because — forgive me because I’m a Brit. So, I do not understand the American debate about guns — my understanding is that there’s a lot of political interest in the U.S. as to what data on guns can and cannot be gathered. There are certain questions that can’t even really be asked because it’s against the law to even collect the data. Have I got the wrong end of the stick there, or have I understood?
LEVITT: I think you’re exactly right that the National Rifle Association has been extremely successful at limiting the collection of data around guns, and that has really hamstrung the academic research into it. In fact, one of the most clever papers ever done on guns was done by my good friend, Mark Duggan. He was simply trying to figure out how he could determine how many guns were in different places.
He had the incredibly clever idea to go to a different data source, which is magazine data. So, there’s enormously carefully collected data on magazine circulation because that’s how advertising payments are done. He used purchases of handgun magazines (the periodicals) as a proxy for purchases of handguns. And what was very difficult and clever about the paper is he actually showed that over time, the changes in the number of guns correlated very, very highly with his measures of magazine subscriptions. He used that as a proxy and actually was able to say interesting things about guns.
Things that can’t be measured, it’s very difficult to regulate or control them. I think the N.R.A. has understood that for a long time. And they’ve been very, very effective at making sure that guns can’t be measured. If you think about the economy, imagine that we couldn’t measure incomes or we couldn’t measure G.D.P. That’s the equivalent when it comes to guns. We just don’t know how many guns there are in different places and how that changes over time. And so, it’s really hard to study the problem and certainly extremely hard to get at causality.
HARFORD: But of course, there was a time where we couldn’t measure incomes and G.D.P. wasn’t even defined. So, no one was gathering G.D.P. data. There was a point where we didn’t have data on inflation and prices. And one of the points I’m making in the book is: at a certain point, people said, “We need the data on this. We actually need to devote some time, and attention, expertise, and money to getting these numbers because they help us see things about the world.”
To anybody who starts talking about “Lies, damned lies, and statistics” and demeaning statistics by saying, “Oh, they’re always used in a misleading way,” I’d point out that the N.R.A. understand how important the statistics are because they really, really don’t want them to exist. And they’re quite effective at ensuring that they don’t exist. I think that proves as much as looking at all the data that has been gathered. Look at the data that people are trying to prevent being gathered.
LEVITT: One of my favorite stories is from your book, Messy, about creativity. It’s about the jazz pianist Keith Jarrett. Could you tell it?
HARFORD: The story begins in 1975 when this German teenager called Vera Brandes walks out on the stage of the Cologne Opera House bursting with excitement because, a few hours later, Keith Jarrett is going to be on that stage improvising. He’s a great jazz musician. He’s going to be sitting at this piano. And he’s going to be just playing whatever comes into his head.
And all this has come about because Vera is the youngest jazz promoter in Germany. She’s 17 years old. She just loves jazz. And she’s managed to score this amazing coup of getting Jarrett into the Opera House to play this late-night concert. When Jarrett actually comes on the stage to check out the piano, immediately, it becomes clear that something has gone wrong and there’s been a mix-up. They’ve brought out a rehearsal model. The keys are sticky. The pedals don’t work. It’s too small. It sounds tinny. It’s just a bad piano.
And Jarrett says, “Well, I’m not going to play.” But it turns out there’s no way of getting a replacement piano on the stage in time; it’s not possible. The tickets can’t be refunded because of the way the concert’s been set up. This teenage kid is about to be ripped apart by 1,400 people who show up for a concert and there’s no concert. And so, Jarrett takes pity on her. And although he’s a real perfectionist, although he likes things exactly the way he likes them, although he feels the piano is completely unplayable, he just thinks, I’ve got to do it because I’ve got to help this girl out.
He, a few hours later, walks out on stage, sits down at this piano that he knows is unplayable, and begins to play. And instead of the musical catastrophe that he expects, it’s a masterpiece. The concert was recorded supposedly to provide documentary evidence of what a musical catastrophe sounds like. But in fact, once it was remixed, it sounded great. Many people think it’s his best work. It’s easily his most successful work.
So, the concert was released as The Köln Concert, the best-selling jazz piano album in history. And it only got played because Jarrett felt he’d been backed into a corner and he couldn’t let this girl down. He thought, “This is terrible. It’s a bad piano. It’s going to be a bad concert.” But he was, of course, forced to play in a different way and to improvise in a different way. So, he stuck to the middle of the keyboard, which made it sound very soothing and ambient because the upper register sounded terrible.
Because it was such a small instrument, it was quiet. So, he was pounding down on the keys to try to create more volume. So, there’s this weird tension. He was playing this nice ambient-y music, but he was really hammering it hard and playing with a lot of energy. And there’s just something about that that worked really well. It’s how I begin my book, Messy. The book that’s really all about how disruption and challenges and weird stuff that’s ambiguous and messes around with us can actually lead to a problem-solving response.
LEVITT: I love that story. But what makes it reverberate in my head is the fact that I don’t know what conclusions to take away from it. One is, well, it’s really good to be a nice guy because Keith Jarrett did this person a favor and it ended up paying off for him. Or maybe the idea is: real geniuses are able to overcome adversity. Or that if you put up artificial obstacles to success, then that leads to unexpectedly good outcomes, so we should be in the business of putting a lot of obstacles in our way. I don’t know what your takeaway is.
HARFORD: I think when you’re doing the work of the nonfiction writer, you’re trying to make a particular argument for a particular view of the world. So, if I was writing a book that was all about the power of altruism, that would be a cool story about someone who did someone else a favor. But actually, I think it’s a better example of how disruption produces this creative response.
So, how I back the story up is to say, O.K., let me tell you a completely different story — it’s actually more a piece of research — about a strike that shut down half of the London Underground for 48 hours. And when researchers looked at the data, they found that tens of thousands of commuters had changed their route because of the closures. And then, at the end of the 48 hours, they never changed back. They discovered a better way to get to work and all it took was this perturbation to the system.
LEVITT: I would tell you, the single best example of that that I’ve observed in my entire life is Covid-19. Covid-19 disrupted all sorts of things — the idea that you could work from home effectively. And I think that we will never go back in many dimensions to things we were doing before. But no one would have ever had the nerve to experiment to the degree that Covid-19 forced us to change our behavior and to learn about what worked and didn’t work.
HARFORD: Yeah. I think that’s absolutely right. And I think there are things that we won’t go back to, not because we can’t go back, but simply because we learned we could have done it that way the whole time. And why didn’t we?
LEVITT: I’ve been calling you Tim, but I understand the royal family has bestowed an honorary title on you.
HARFORD: I have what is called an O.B.E., which is a sort of mini knighthood. Of course, Britain being Britain, it was all tied up with the royal family. And I have a letter from the Queen. I went to the palace and all of these exciting things. And I met Prince Charles. But fundamentally, the British government has decided to say thank you.
LEVITT: What is Prince Charles like?
HARFORD: Well, I met him for about 30 seconds. It’s quite an interesting operation because there are like 100 people in the room and some of them are very famous. But most people, he doesn’t know who you are. As you approach, your name has been called out. And as you walk, somebody is just whispering in his ear and presumably saying, “This guy has a radio show.”
He was very friendly. I mean, he was on his feet for an hour and a half, just shaking someone’s hand and then the next person and then the next person. He said, “Well, we’re giving you this award to encourage you to keep going.” And that was oddly moving. It’s a very British thing to say. It was like, “Oh, just keep it up, carry on.” And I was like, yes, this is the British spirit. Just do some more. Don’t stop.
LEVITT: So, you have this O.B.E., but it seems like your honors started piling up early. I read someplace that you were the world champion of school persuasive speaking. Is that true? And what does that even mean?
HARFORD: Oh, I don’t know where you found that out from. Yes, I was the world’s schools’ persuasive speaking champion. There were teams from Cyprus. The Australians didn’t come, and the Australians are supposed to be really good, but the Americans came, and the Canadians came. The Canadians are really good.
LEVITT: So, I’m really interested in the subject of persuasion. I suspect that you not only were the world’s champion at school persuasive speaking, but that you also have insights for everyday people about how to make an argument persuasive.
HARFORD: Well, 1992 was a long time ago, but O.K. So, here’s how I think about it. First of all, I want to get my own head straight. I am worried enough about making a mistake myself before I get remotely interested in persuading anybody else. A lot of people have responded to The Data Detective by saying, “Oh, I’ve got this friend who’s a total idiot. And I get into these arguments on Twitter with these idiots. And how can I persuade these idiots not to be idiots?”
And I always say, “Just start with yourself.” If you can make sure that you’re not an idiot, you’ve done so well. It’s such a difficult thing. Don’t worry about anybody else. So, I’ve turned away from persuasion in recent years. But O.K., if my own head’s straight, what would I go for? I would go for a memorable story. Stories are not so threatening. You’re not attacking anybody. You’re giving them something they’ll immediately find interesting. And they’ll follow the story along and they’ll be curious, and it starts to open their minds.
So, if you’re talking to people in terms of stories, you’re lowering their instinctive psychological defenses that basically say, “This guy is challenging my sacred beliefs. And I’m champion of all that’s right.” Stories get people into a more open-minded frame of mind.
I guess that’s what I do in my books. But I don’t think of it as persuasion. I think of it as hopefully giving people something that they find interesting and engaging. But if people are interested and engaged, you at least have a chance of persuading them. If they’re not interested and they’re not engaged, you’re going to get nowhere.
LEVITT: I agree 100 percent that roughly the only thing that ever persuades anyone is a good story. And the other thing I’ve come to believe — I think you probably would agree with me — is that almost all good stories share a lot of commonalities. So, good stories are almost always about people. There are heroes and anti-heroes in the stories. There’s some kind of a conflict. There’s some kind of a rising tension, which is then resolved in some unexpected way. Would you agree with that assertion?
HARFORD: Yeah, there are various theories about how stories work. But yes, I think that’s right. And not to be confused with anecdotes. So, very often people talk about stories, but actually what’s going on is — I gave you a little example. And that’s fine. A nice example is fine, but it’s not really a story.
LEVITT: Yeah, because stories have beginnings and middles and ends. So, what’s interesting is I used to teach a lecture in my course on data to the undergrads where I talked about storytelling with data. And it always left me feeling a little bit strange because it wasn’t very powerful. And then, one year, I just sat back, and I thought about it.
And I thought, “Wait a second, if a good story has all of the elements we just talked about, analysis of data never leads to a good story. There’s almost never a person involved that you can identify. There’s almost never any intrigue or uncertainty. There’s not a twist at the end.” And I completely redid the lecture and now, the way I start the lecture is by telling a great story. And everyone laughs. They think it’s a great story. And I talk about what made it a great story.
And then, I say, “Let’s take some examples with data and how we would turn them into stories.” And it becomes really clear to everyone that you cannot tell great stories with data. And I really came to the conclusion that when it comes to data, you just should completely abandon the idea of telling stories, that you should use data and just explain the truth.
HARFORD: Yeah, you’re weaving together these two things. You’ve got the truth as evidenced by the data. And then, you’ve got some story that people are going to remember. And the people who are really good at this will just weave the two together so you can hardly tell them apart. But they are different. And Florence Nightingale, it turns out, is great at this. But the story and the data aren’t the same. You’re absolutely right.
LEVITT: It really strikes me, given that you’re such an amazing storyteller, that the right domain for you to be producing your ideas is in a podcast, not in books.
HARFORD: Well, the same thought had occurred to Pushkin, who are a podcasting company set up by Malcolm Gladwell and Jacob Weisberg. And so, they asked me about a year and a half ago: did I want to tell stories in a podcast? And I thought, “Yeah. I guess, I do.”
The podcast is called Cautionary Tales. They’re stories about things going wrong and the social science behind each particular fiasco, tragedy, or hilarious mishap. Some of them are funny; some of them are really not funny. But what’s the lesson? What do the statistics tell us? Or what does the economics tell us, or the psychology?
The new season, Cautionary Tales season two, is coming soon. I’m very excited about it because Helena Bonham Carter is playing Florence Nightingale and Jeffrey Wright is playing Martin Luther King. I have written a script for Jeffrey Wright and for Helena Bonham Carter. It’s very exciting.
LEVITT: Do you have advice for people who maybe aren’t natural storytellers?
HARFORD: I suppose the fundamental thing is: the ideal story has a protagonist. It’s got somebody who is moving through the story, who is taking actions, and things are happening to them, and they’re doing things in response. And that’s the heart of a good story. If you don’t know who the protagonist is in your story, then maybe keep thinking about the story.
I have to say, having given this advice, there are lots of stories I tell that don’t have a clear protagonist because I’m so interested in a particular academic idea or a particular course of events. And it’s not always easy to follow this advice, but get yourself a protagonist, get yourself a central actor.
I guess the other key piece of advice is that it’s really nice if what you do at the beginning of telling the story foreshadows the end in a way that’s not obvious, so that when you get to the end, you go, “Oh, wow, I see how all of that fits together.” What was going to happen only seems obvious with hindsight, rather than being a foregone conclusion.
Of course, you can cheat when you’re writing a story because you know how it’s going to end. You can go back, and you can tweak the beginning. So, you can make it seem like a certain amount of magic. So, those are the two pieces of advice I’d give. And the third one, of course, is read good stories and think about why they’re good.
LEVITT: It’s interesting to me that Tim Harford simultaneously writes books and does podcasts. For Dubner and me, it became crystal clear very early on in the life of the Freakonomics Radio podcast that podcasts were just a much more effective way for us to communicate our ideas. You probably noticed that we stopped writing books, and we don’t have any plans to go back.
Admittedly, though, one advantage books have is a long shelf life. People return to books over and over in a way I suspect they won’t with podcasts. Perhaps for that reason, Dubner and I will one day regret our decision. But right now we’re having way too much fun podcasting to worry about it.
* * *
People I (Mostly) Admire is part of the Freakonomics Radio Network, and is produced by Freakonomics Radio and Stitcher. Morgan Levey is our producer and Dan Dzula is the engineer; our staff also includes Alison Craiglow, Mark McClusky, Greg Rippin, and Emma Tyrrell. All of the music you heard on this show was composed by Luis Guerra. To listen ad-free, subscribe to Stitcher Premium. We can be reached at pima@freakonomics.com. That’s P-I-M-A at Freakonomics.com. Thanks for listening.
LEVITT: Can you imagine Mother Teresa knee-deep in data and statistics?
HARFORD: Yeah, Mother Teresa with a calculator.