Episode Transcript

I want to tell you a story about a friend and colleague of mine. His name is Noah Feldman. One afternoon back in 2018, Noah was taking a bike ride in the hills around Palo Alto, California. He was visiting from the East Coast. The bike ride was a little break from seeing friends and taking meetings.

But, Noah’s not your average mountain biker. He’s a constitutional law professor at Harvard. And one of the country’s best. He even helped to draft Iraq’s interim constitution.

So as Noah was cranking around the hills of Old La Honda Road, he was thinking about a company with its headquarters nearby. Maybe you can guess? He was thinking about Facebook.

Noah FELDMAN: I happened to be staying with my friend Sheryl Sandberg, who — I went to college with her, and she’s the C.O.O. of Facebook. But not because of anything to do with Facebook. I was just out there and it was nice to see her.

Noah was thinking a lot about the relationship between his field, constitutional law, and the struggle that platforms have keeping people safe online.

Think about the social media companies. The more they grow, the more content their users post. Not all the content is going to be so nice and friendly. When you get more content, you’re bound to get more hateful speech as well. It’s just really hard to keep all the bad stuff off the platforms.

It’s always an uphill battle for these social media companies. And not unlike the uphill battle that Noah found himself on during his bike ride.

FELDMAN: It was much too hard for me. And I had that, you know, oxygen deprivation feeling that you get when you’re trying to climb hills that are too hard for you. And I was sort of in my mind. And one part of that was trying to figure out how the social media companies were themselves dealing with the challenges of free speech. Not the question of how governments were dealing with them and free speech, but how they were thinking about it internally.

Noah was hyperventilating; his legs were about to give out.

FELDMAN: And sure enough, it worked. And I had an idea. And the idea that came into my mind was that what Facebook would need in content moderation was a Supreme Court.

I wrote it up in a 1,200- or 1,300-word document. I thought maybe I could publish it as an op-ed. And I showed it to Sheryl because she was my host. And she said to me, “Actually, you know what? Before you go and publish this, let me send it to Mark and see what he thinks about it.”

As it turned out, Facebook C.E.O. Mark Zuckerberg loved Noah’s idea. Others were not so sure. Could a for-profit company build a court? Many people — in and outside Facebook — thought the whole idea was more than a little crazy. One of those people was me.

*      *      * 

After the insurrection at the U.S. Capitol, Facebook banned then-president Donald Trump’s account. In the moment, it was really Mark Zuckerberg’s decision. He wrote about it: 

“We believe the risks of allowing the President to continue to use our service during this period are simply too great.” 

That’s what he wrote on January 7th. Then, this happened last week:

ANCHOR: The Facebook Oversight Board has ruled to uphold the social media giant’s ban of former president Trump from the platform. 

Thomas HUGHES: I mean the decision is very clear insofar as the board has found that the suspension of former president Trump was necessary to keep people safe.

When you hear the word board, I bet you’re thinking of a corporate board. A board of directors, or shareholders. But this board is something very, very different.

So what is the Oversight Board? Okay. It’s got 20 members: They’re academics, lawyers, journalists, and human rights activists. And they come from around the world. Now, the Oversight Board is pretty new. And we’re not going to dig into the how — like how it’s doing so far, how things are going. But, there are a few things worth mentioning, and we’ll talk to Noah about this. 

The 20 people on the board — they’re chosen by Facebook. And, if you’re wondering how independent they can really be, well, you’re not the only one. The board is not supposed to be paid by Facebook. So it’s funded through an independent trust. But Facebook does play a role in selecting the members. And Facebook wrote the rules for how the board will work. So, at the end of the day, the board is really not completely outside of the company. 

When the board decides on cases, Facebook has to follow its ruling. And so far, the cases that the board has looked at, they’ve mostly come from people who were upset that their posts were taken down off the site. A case starts when someone submits an appeal. And anyone can submit an appeal.

In Trump’s case, Facebook had banned his accounts on two of its platforms: Facebook and Instagram. The court looked at whether the ban was done fairly, and it ended up giving a kind of wishy-washy ruling. It upheld Facebook’s ban, but it didn’t like the fact that the ban was indefinite, or that other accounts on the platform weren’t being treated the same way as Trump’s.

So, they ended up sending the case back to Facebook, and the company now has six months to make a final decision about Trump’s accounts. And they have to explain why. And, in case you’re wondering, there’s no way for any of us in the public to know how each of the members of the board actually ended up voting. We just get the majority ruling.

I want to focus on a more basic question. I mean a really basic question: Why does Facebook even need a court? And what makes anyone think this whole crazy idea could even work? I mean for a nation, sure. Why not? But a for-profit company?

Courts like to be slow and careful. Not social media companies. They are known for — well, the opposite. Remember the old motto at Facebook: Move fast and break things. And a Supreme Court? One that could overrule even Zuckerberg’s decisions? I mean, why would anyone willingly give up that kind of power, the power to be the ultimate decision-maker in the company they founded?

Those of us inside the company? Well, we thought Zuckerberg just wasn’t feeling too well. In 2018, when Zuck sent around Noah Feldman’s note about the Supreme Court, I was working at Facebook, managing a team responsible for handling all the negative things that people would do on the platform. Like distributing spam, or bullying, or hate speech, or, even worse, terrorism. It’s a pretty long list.

I can still recall Zuck asking us, in a pretty excited way, “What do you think? Can we use Noah’s idea for a Supreme Court? Do you think it’ll help us deal with all the content that makes people unsafe?” Noah understood Zuck’s concern about moderating content on the platform. But his court idea was about getting to a deeper problem that we were all facing in the company. 

FELDMAN: Now, I thought it was pretty clear at the time that Facebook was going to have major challenges to the legitimacy of their content decision-making. But this was in January of 2018. And the key issues that the public was thinking about with respect to Facebook were not content issues at the time. This was in the aftermath of Cambridge Analytica and the difficulties that Facebook had run into in the context of the 2016 election. So, there was a lot of public concern about, you know, bots and Russian hacking and the effect that that had had on the 2016 election. 

Let’s pause for a second. It’s helpful to remember what Facebook and the world were like in 2018, and why this issue of legitimacy, the one that Noah talks about, is so important. Well, back then, the public really didn’t like Facebook at all. Even more so than now, if you can believe it. Things were so bad that social scientists had a term for it. They called it a crisis of legitimacy for the company.

Facebook was getting tons of bad press, ever since the 2016 elections. It felt like every day the news cycle was hitting the company with something pretty serious. There were leaks of private data, there was hacking, there were people in foreign countries using the platforms to sway the U.S. elections. 

People didn’t trust the company to keep their data safe. They didn’t think Facebook kept them safe. They didn’t think the company enforced the rules fairly. On top of that, nearly every political group thought that Facebook was biased against them. Just go down the list: liberals, conservatives, the L.G.B.T. community, the black progressive community. You name it, they thought Facebook had it out for them.

Zuckerberg, Sheryl Sandberg, the whole leadership, they were searching — desperately! — for ways to get the public to trust Facebook. That was the mood in the company. Trust. Legitimacy. That’s the heart of any court. And any ruler — whether a person, or a government, or a company — they all have to do two things really well to get people to follow their rules.

First, they have to make sure those rules are clear. That they’re understandable. And second, they have to enforce them in a way that’s fair, that’s trustworthy. When people don’t trust an authority, then they just go around and break the rules. Or they make their own rules. It can turn into chaos pretty quickly.

So, the idea of a court was about Facebook trying to win back user trust. It really needed to be seen as legitimate. But would it work? Noah ended up working closely with Facebook for years helping to design the court. And from the start, he knew that transparency was going to be really important.

FELDMAN: I had discovered that they were actually trying pretty hard within the organization, not only to have written rules, policies, which they called their community standards, and that they had thousands and thousands of content moderators around the world trying to apply those, and that they had an internal group of, depending on how you count, a couple of hundred people who focused on the hardest problems that arose and also on how to evolve their community standards.

And yet, no one in the public, essentially, really understood that this was happening. And the decisions that they were making, therefore, appeared to come out of a vacuum. The public would never know what that conversation was, would not hear in any systematic way the reasons that had driven Facebook to make the decision that it did. And last but not least, no matter what Facebook decided, half of the people would be unhappy with the decision. And so, I asked myself, “Well, in governments and constitutions,” because that’s what I do for a living, “what happens when you have similar problems?”

The idea of a constitutional court basically addresses these concerns that I was talking about. It accepts that on the hardest constitutional questions, reasonable people can differ and will differ. So, constitutional courts vote on their outcomes. We all understand that in the end, it’s a question of who voted. But the court doesn’t just say, “We voted and this is the conclusion.” No, they say, “Here is the reasoning we followed.”

High courts are very rarely elected, but they enjoy very high rates of legitimacy measured by polling. You ask yourself, “Why? Why do these courts enjoy such legitimacy?” And I think the answer is: they’re transparent about their reasoning. And they explain why they’ve done what they’ve done. And they’re open to new arguments. And they develop their ideas slowly and carefully.

The biggest pushback that I got — there were many kinds of pushback — but the biggest form of pushback was, “Well, wait a minute. No company does anything like this. It won’t be independent and nobody will believe that it’s independent.” That was one line of criticism. Or, the scary part, let’s imagine it is independent. Why would a company want to agree to hand over so much responsibility and authority to an entity that might make decisions that are bad for business, even if they are morally or ethically correct?

So, those were the two strongest forms of pushback. And they came from two different sides. When I was asked these things, I’d say, “Well, these are both great criticisms. I take them both very seriously. They can’t both be true.” If it turns out that the Oversight Board is not really independent, then it won’t challenge the company and it won’t be dangerous for the company. It might be useless, but it won’t be dangerous. And, if it becomes extremely independent and makes decisions that aren’t good for the company, then it will not be seen as being just a tool of the company. 

Facebook is a company that’s really divided in two. On one side are the policy teams. They’re the ones that make the rules. They also handle all the concerns that come from people outside the company — like investors, the press, or Congress. On the other side are the product teams. That’s where I worked. We were the teams that built the things most people actually see — like your News Feed or Facebook groups.

We knew there was a lot of bad content that was slipping through the cracks. And that when we tried to fix the problem, we sometimes were over-correcting. Meaning, we sometimes removed content that we shouldn’t have. But we were really struggling to figure out how a court was going to help us at all. I mean, how could just a few people on the court make a dent in the millions of pieces of content that we had to review every day?

And the policy teams? Well, they didn’t really want the court either, but some of them thought it could be helpful to them. Maybe a court could help convince the public that Facebook was actually doing something. Maybe it could get people off our backs a little bit.

And I started to wonder: Could this be the real reason why Zuck wanted a court? To help us solve Facebook’s image problem? I asked Noah about the resistance he heard from inside Facebook. Did anyone say to him, “We get that trust is an issue. But isn’t that just a P.R. problem?”

FELDMAN: Yes, it was a response. And the people who said it actually did it in a way that I would call even more admirable than the way you just described it. They didn’t say, “It’s just a P.R. problem.” What they said is, “We can do a better job of reasoning our decisions and making it more transparent why we’ve decided what we’ve decided, and that’ll actually make us perhaps do better decision making as a result. And once we have done that, we capture the benefits of having an external third-party review. And we won’t have to undertake the cost of doing that.”

And so, what I said in response to that was, “First of all, great point. I think you should do this regardless. Looking at the analogy to a government, many times the government will explain why it’s doing what it’s doing. It will say why. It will go through a deliberative process. It will explain its reasons. And yet the challenge that they face is that the skeptical public often refuses to accept the justifications and explanations being offered by a government because they say, “Well, the government has its own interests.”

And that problem is doubly hard for a for-profit company, because even if Facebook internally expresses why it’s made the content decisions that it’s made and how it’s balanced its values, there is a significant part of the public that will be rightly skeptical that the real driver of the decision is anything other than the bottom line.

But if an independent body, a genuinely independent body that is endowed separately and whose employees aren’t answerable to Facebook and aren’t hirable and fireable by Facebook, if they say, “We’ve looked at this carefully, we’ve gone through the reasoning, we think this reasoning is sound, we think it’s logical,” that should actually, I argued, really help the people in the company who are doing their level best to come up with good decisions. It should enhance the legitimacy of their decision-making.

VENKATESH: This question of legitimacy seems absolutely key. If the public does not feel as though what you’re creating is legitimate, it can be the most elegant piece of architecture, it can be crafted beautifully, and yet it will not really resonate or achieve its purpose. I could imagine, to this day, people might feel like, “I wonder who’s on the court. I wonder if they represent our interests. I wonder if they are primarily elites. Do we have a capacity to shape the court?” How do you build into a court like this the capacity to determine on an ongoing basis whether people feel as though it’s legitimate? Is there a way to hear and to measure trust or to measure legitimacy? And was that part of the discussion?

FELDMAN: So, you’re raising many fascinating issues there. So, the first is this idea of legitimacy. And this is one of those things where the people in my day job, law professors, and the people in your day job, sociology professors, use the term a tiny bit differently. So, lawyers like to say, “Well, there are two kinds of legitimacy. There’s the sociological legitimacy. Do people think something is legitimate? Then there’s what law professors call normative legitimacy, which is the question of, well, should it be legitimate? Is it good enough?”

And it struck me in thinking about this that you couldn’t really separate those two things out. If you’re going to be making hard and controversial decisions, you’re not going to be able to say, “Well, pay no attention to the little man behind the curtain.” If you need social conservatives, they have to be social conservatives who will be credible to social conservatives and also who will be known to progressives as social conservatives who they think are sufficiently thoughtful, reasonable, and capable of having a reasoned conversation for it to be okay.

And the other way around, too. Your progressives also have to be people whom social conservatives can recognize, even in our very intensely divided era, as people whom they understand where they’re coming from and whom they can have a conversation with. 

The teams that I was working on at Facebook, we played around with Noah’s idea for a few months. We ended up feeling that it was going to be a gamble. That it wouldn’t really help us very much. In fact, we were thinking about another question. We really wanted to know how to make sure the court didn’t get in our way! Here’s what I mean:

Facebook is just like every big tech company. They’ve got to use automation to deal with all that content that violates the rules and, well, that creates really bad experiences for people. The way it works is that a computer quickly looks at the millions of pieces of content posted every day and decides whether it should stay or whether it has to go. There’s a small army of humans around. They spot-check the work. But it’s really the computers that are doing all that labor.

They call this proactive review. And sometimes the stakes can be really high. We couldn’t let people use our platform to incite violence, for example. So, our teams were busy working to make sure these computers could get the job done. And if members of the court were allowed to poke around in what we were doing, and take their time, and mull things over? Well, they could really jam up our engine. That’s why we wanted to isolate the court from our daily work. 
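
To make that picture concrete, here is a minimal, hypothetical sketch of what a proactive-review pipeline like the one described above could look like. Everything in it is an assumption for illustration only: the scoring function, the removal threshold, and the spot-check rate are invented stand-ins, not Facebook’s actual classifier or policy.

```python
import random

# Hypothetical sketch of "proactive review": an automated classifier scores
# each post, posts above a removal threshold are taken down automatically,
# and a small random sample of decisions is routed to human reviewers who
# spot-check the machine's work. All names and numbers here are assumptions.

REMOVAL_THRESHOLD = 0.9   # assumed score above which content is auto-removed
SPOT_CHECK_RATE = 0.01    # assumed fraction of decisions audited by humans


def violation_score(post: str) -> float:
    """Stand-in for a trained model that estimates how likely a post is to
    violate the community standards. Real systems use ML classifiers."""
    banned_terms = {"spam", "incite violence"}  # toy signal, illustration only
    hits = sum(term in post.lower() for term in banned_terms)
    return min(1.0, hits / len(banned_terms))


def proactive_review(posts):
    """Return (kept, removed, spot_check_queue) for a batch of posts."""
    kept, removed, spot_check_queue = [], [], []
    for post in posts:
        remove = violation_score(post) >= REMOVAL_THRESHOLD
        (removed if remove else kept).append(post)
        if random.random() < SPOT_CHECK_RATE:
            # A human reviewer audits this automated decision.
            spot_check_queue.append((post, remove))
    return kept, removed, spot_check_queue


if __name__ == "__main__":
    batch = ["hello friends", "spam spam incite violence"]
    kept, removed, audits = proactive_review(batch)
    print(f"{len(kept)} kept, {len(removed)} removed, {len(audits)} queued for humans")
```

The point of the sketch is the division of labor the episode describes: machines make the bulk of the calls at scale, while humans, and now the Oversight Board, only ever see a tiny sampled or appealed slice of those decisions.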

Yes, we might have needed the court to build the public trust. But we also had to work on our own to keep the public safe. Could Facebook do both? That’s after the break.

*      *      * 

FELDMAN: Facebook is pushing two billion users. Just by numbers, the overwhelming majority of those users are outside the United States. So, it was obvious from the start that if you were going to have a court-like body, it would have to have members, the majority of whom would not be North Americans and also not Europeans.

That’s Noah Feldman again. He’s the architect of Facebook’s Supreme Court — what’s now called the Facebook Oversight Board. Noah and I are talking about some of the challenges of creating an independent court to make decisions for a global company. Even the practical steps are not that easy to figure out. For example, who would be on the court? Who would be seen as trusted and legitimate?

FELDMAN: They would have to be recognizably global because the legitimacy isn’t just are you legitimate in Kansas and are you legitimate in L.A.? It’s are you legitimate in Singapore, right? That’s an extremely tricky balance to reach. And I don’t know whether the oversight board reached it or not. That remains to be seen in terms of how the public conceptualizes it.

But my core idea is that the proof will be in the pudding. We’ll have to see whether the decisions are treated as legitimate. If the oversight board makes good decisions and explains itself well, even when people disagree with it, they will think, “Okay, well, maybe they’ll do better the next time. We’ll live for another day.” If they make decisions that don’t go well, the oversight board may quickly lose legitimacy.  

When you work inside tech companies, you hear one thing from users over and over. People do not want companies relying on outsiders to make decisions for them. And users felt this way about the court. My teams spoke to users all around the world about the court. They told us firmly, “Hey, Facebook, that’s your job. Don’t outsource your responsibility to keep us safe.” 

I heard this at Twitter, where I worked later. “You guys are making billions of dollars. Spend that money and figure out how to deal with your problems yourself!”

Back in 2018, it was pretty clear that Facebook needed two solutions. We needed to clean up our platform and do a better job of keeping people safe. But we also had to deal with this crisis of legitimacy. It was clear that the executives wanted us to build the court. Even if it just helped with our P.R. mess, that was probably a good enough reason to try to make it work.

We knew that there would be lots of challenges. But there was a real big elephant in the room that we were all staring at. Our teams were wading through millions of pieces of content every day. So, we all wondered — when would the court need to review all those decisions? And how much could they realistically look at? 

There was no way they would be able to look at them all! So, the first thing we needed to figure out was a system to decide which cases the court would take on. And remember, clear and understandable rules. That’s the key to earning trust. So what the public really needed to know was how the court would choose a case.

FELDMAN: We talk about a constitution as essentially the collective agreement of the people who are going to be the participants in the government ruled by it and participating in ruling through it. 

Sounds like we not only needed a court, but also a constitution.

FELDMAN: And the constitution lays out two things. First, it lays out what the powers of that government are and how they will interact with each other, so what the government can do. In a famous analogy, Stephen Holmes, a brilliant professor at N.Y.U. Law School, compared this to the rules of the game of chess. It tells you what the pieces can do. And then the other half of the constitution is usually the part that limits the government by saying what it can’t do. That’s what people in English-speaking countries think of as a bill of rights. 

So things are getting complicated. This would be hard even if it was just for one country. But for a global company? I mean how do you create a constitution for a platform that, back in 2018, already had more than two billion users around the world? 

FELDMAN: Some people, both inside and outside the company, thought that maybe the right thing to do would be to have different oversight boards or different courts for different regions or to apply different standards in different places. I mean, as a cultural matter, Saint Tropez is not Saudi Arabia. And what is considered socially acceptable to show when you go to the beach in one place is pretty different than it is from when you go to the beach in the other place.

The reason, I think, that ultimately the decision taken within Facebook was to have a single unified set of standards still and a court that would decide issues on a unified basis, is that, at its core, Facebook still aspires to be a global community. And that is how Facebook describes itself. Now, you and I both know and everybody knows that it’s totally experimental to imagine what a community can be like with two billion users.

We could say the community of global Christians, and we’d be in that ballpark of numbers. We could say the community of global Muslims, and we’d be pushing that ballpark of numbers. And those phrases would mean something despite the incredible variation, as among Christians and Muslims, that exists in the world. So, I don’t want to say it’s totally implausible to imagine a global community, just that it’s challenging.  

VENKATESH: Can you give me an example of the kind of work this court is doing? Are there some paradigmatic cases that you were looking at, kinds of speech, kinds of content? 

FELDMAN: So, the everyday questions, for example, would have to do with the nudity policy. So, one issue that was very much in my mind, because Facebook had struggled with it for years, was Facebook’s nudity policy excluded depictions of the uncovered or unclothed female breast. And then there was a large activist group of pro-breastfeeding people who said, “Well, but that can’t be right. Breastfeeding is a fundamental human activity and one of the most beautiful things that it’s possible for us to imagine. And we think we should be able to show pictures of ourselves breastfeeding. It’s shaming otherwise. This is not the kind of nudity you’re trying to exclude.”

So every listener probably has some instinct of, “Oh, come on. It must be this answer. It must be that answer.” But I bet among listeners, not everybody agrees right away about what the right answer is to that question. The U.S. Supreme Court struggled for 35 years with what nudity ought to be protected by the First Amendment.

The Supreme Court justices actually would get together on Wednesday nights in a basement room of the Supreme Court and watch pornographic films to determine whether they counted as obscene or not, which, in retrospect, just seems creepy and horrible. And some of the justices really hated it even at the time. But they did it because it was just hard to have a bright-line rule.

Famously, Justice Potter Stewart said, “I can’t define obscenity, but I know it when I see it.” And so, they were looking at it to see if they could figure out if they knew it. If you don’t do it that way, if you have written rules, you have to have very detailed, embarrassingly detailed, rules that say, “Well, you can’t show this body part when it’s in this stance and it’s in this relation to this other body part.” And you have to list them all.

And indeed, if you go into the community standards that Facebook has, that’s in there. And that seems kind of absurd also. So, I have that question in my mind: the kind of question where reasonable people could actually reach different conclusions. And also, where enforcement is really hard.

VENKATESH: After this experience, do you walk away with a different feeling than you had about the industry and the people that worked there? Do you have more empathy for them? Do you have advice that you might be able to give now that you’ve gone through and are still going through this incredible journey?

FELDMAN: My experience with tech in this instance was one of incredible, almost unimaginable openness to new ideas that then, over the course of two and a half years, actually got built at very, very great financial cost. Because I was working closely with Facebook for two and a half years on this project, of course I started to feel some identification with the people who were working on the project. So, I’m super not objective on this question. That’s my preface.

Despite the schematic idea that people outside of these companies have, that it’s all about the bottom line, in many instances, they’re human beings who have already made so much money that they’re not motivated by making money at an individual level. Of course, they want their company to do well, but their incentive structure is actually kind of different. They want to think well of the project that they’re engaged in. They want to be seen to make the world better, not to make it worse.

And, again, I might have said from the outside when someone said that to me, “What are you talking about? People are driven by their economic interests.” And my point is, well, not only by their economic interests. The second thing I would say is that it’s a really complicated thing when you have a company that grew to tremendous size relatively quickly by being innovative and doing things, but is now a major dominant market player. Because, on the one hand, there is an ethos of creativity and experimentation. And on the other hand, there’s the inevitable other impulse of, “We’re a very big company. We have many different stakeholders. Anything we do, we should do very slowly.” And I think what you have at a big social media company is that you have both of those impulses present simultaneously because the whole timescale was so compressed.

As a general matter, do I think it’s really, really hard to take outside input? Yeah, I think it’s really, really hard. The people that I interacted with in the last two and a half years were amazingly open to hearing different perspectives, amazingly willing to try new things. And you could say, “Well, that’s because the company was in crisis.” Okay, that’s fine. I mean, that seems like a good thing that the people in a crisis would be open to taking outside ideas.

And if you had asked me at the beginning, “Is it possible that a corporation would do this?” the cynic in me would have said, “Of course, it’s not possible. They’ll never do it. It’s too risky. It’s too expensive. Companies don’t try out things like this. It’ll never happen.” So, that said, my experience is a complete outlier. It’s like asking the person who was lucky enough to win the lottery, “What’s it like to enter the lottery?” And the person’s like, “The lottery is awesome.”

It’s hard to believe that just a couple of years after Noah had that eureka moment on that bike ride, we’re now living in a world where the Facebook Oversight Board exists. And that board just upheld the decision to ban a former U.S. president — with some caveats.

There are other things to look out for in the coming weeks and months as the court goes on with its work. What’s the impact going to be on the company? Is this really going to change the way people use the platform, or what’s allowed on it? And how independent will the court be? The people who are on it now, they’re mostly well-trained elites. Like the former prime minister of Denmark. And one of my colleagues from Columbia Law School.

Some of the members haven’t been shy about criticizing Facebook publicly, or about sharing their opinions about how the company should be operating. Which is pretty interesting for members of a court.

We’ll have to see: Can these people be truly independent, in the way that Noah talked to us about? And, even if they are, can a group of elites really understand the lives of the billions of users on the platform? Will they win the public trust? Will you and I feel that they are legitimate? It’s going to be fascinating to watch this experiment roll out.

The conversation with Noah? It reminded me of a New York Times article about how Facebook’s leadership responded to crisis. It was written around the time Noah was helping build the Oversight Board. I loved the title of the piece: “Delay, Deny, Deflect.” It was the perfect title, not just for Facebook, but for all social media companies.

If your business is about getting hundreds of millions of people to create content for you, whether it’s tweets or posts or videos, those users are going to have to trust you. Winning their trust while policing their content? Well, that’s a delicate dance. I mean after all, if you keep removing their content, they’ll just go somewhere else. And, if you don’t do anything, well, you’ll be overrun by really awful content — and lots of people will probably be scared and leave.

The Facebook Oversight Board was a solution that helped Mark Zuckerberg and Facebook leaders walk this fine line. The board looks at a very, very, very small percent of the content on Facebook. I mean, just a few cases at a time. And, like the ruling on Trump’s account, their decisions are going to make headlines. But, for the most part, the public won’t pay much attention to what they’re doing.

This isn’t necessarily a bad thing. Courts should go about their business quietly, consistently. I mean, that’s part of how they build trust with people, how they gain legitimacy. It’ll take time. But, we also shouldn’t view the Oversight Board as Facebook having solved all its content moderation problems. No. There’s a lot more work to be done. 

It does mean that Zuckerberg figured out how to delay and deflect the problem: the business can move forward without a lot of outside interference.

Facebook was pretty clever to push the Oversight Board toward the policy side of the company and away from the core business, the part that brings in all the money — the product side. Those product teams can breathe a huge sigh of relief. They can go about the hard work of training their computers to get rid of the millions of pieces of content that violate the rules every day. That’s a challenge the Oversight Board could never keep up with.

If you are not a fan of the company, you might not be too happy right now. Maybe you’re thinking, “Hey! Facebook just pulled the wool over our eyes again! They throw out the shiny object to distract us, while they keep their main business intact.” If you are a fan, maybe you’re thinking, “Wow, Zuckerberg. He is sure good at solving problems, giving up some control when he has to, so the business can stay successful.” 

Me? I feel like I’m in the middle. I think both are right. Noah Feldman’s contribution is really an achievement. I really hope that other companies take a hard look at what Facebook is doing and learn from their experiment. I also think that Mark Zuckerberg is a pretty sharp C.E.O. He really understands that sometimes you have to give up a little power — to keep a lot of power. 

All of this stuff is so fascinating. I’m sure we’ll dig into the makeup of the court, and how it’s working, in a future episode. In the meantime, maybe I’ll go for a bike ride.

*      *      * 

Sudhir Breaks the Internet is part of the Freakonomics Radio Network, and is produced by Freakonomics Radio and Stitcher. This episode was produced by Matt Frassica, Matt Hickey, and Tricia Bobeda. Our staff also includes Alison Craiglow, Joel Meyer, Mark McClusky, Greg Rippin, Jasmin Klinger, and Emma Tyrrell. We had help on this episode from Jacob Clemente. To listen ad-free, subscribe to Stitcher Premium. We can be reached at sbti@freakonomics.com, that’s S-B-T-I at freakonomics.com. Thanks for listening.

FELDMAN: The Internet actually isn’t a government. And people’s democratic decisions are a little different in that context. You know, the famous example that I like to cite, is the British government ran some years ago a competition, an online competition, to name a new warship. And, you know, people proposed all sorts of names that you would expect, you know, Intrepid, Valor. And then the one that won was Boaty McBoatface. The shorthand for describing the difficulty of running elections online for anything is Boaty McBoatface.
