I have a friend, Tracey Meares; she’s a law professor at Yale University. She has a metaphor for understanding social media that I think is really powerful.
Tracey MEARES: A car is an incredibly useful piece of technology. It’s also incredibly dangerous.
Around the world, more than 1.3 million people die in traffic crashes each year. Despite that, most of us will get into a car and hardly give a thought to the dangers. But in the early days of automobiles, those dangers were at the forefront of people’s minds.
MEARES: When cars were new technology, before we figured out ways to make them safe, people would say things like, “We just can’t have cars. They’re too dangerous. Let’s keep living in the world with horses and buggies.” That’s one way to think about it. There’s a point at which masses of dangerous cars can lead someone to conclude, “Well, let’s just get rid of the most dangerous car or let’s get rid of all cars. Let’s do another thing.”
Society decided that the upsides of cars outweigh the downsides. So we didn’t get rid of cars. We worked to reduce the dangers.
MEARES: One answer was making the car itself safer. Another answer was making sure the roads that the car traveled on were well-lit, just so you could see where you were going. Another answer, later on, that engineers came up with was, “Wait, there are actual things we can do to the roads themselves that make travel safer.”
We figured out that we could put rumble strips on the road. We figured out that we could have four-lane highways instead of two-lane highways. We figured out that if the curve was too severe, people would go off the curve. So, let’s figure out ways to take some of the curves out of the road.
And in doing that, you’re not banning the car or even banning people’s use of the car. You’re actually helping them to use this very useful and helpful technology — but also very dangerous — in a much safer way. And that’s what I’m talking about, just designing the space differently for different goals so that we can all interact in ways that are useful, healthful, and beneficial to us.
And of course, we did other important things to make driving safer. We set up rules for drivers and we created traffic school. Maybe you can see where I’m going with this. Like cars, like any technology, really, social media has upsides and downsides. In the first two episodes of this series, we talked a lot about the downsides. But there are upsides, too — really amazing ones.
Think of how quickly you might find news on almost any topic or connect with people. And on a much bigger scale, like say, Black Lives Matter or the Arab Spring movement, the platforms played a role in helping people to create real social change.
Now I want to look at how we can move forward — maybe we can bring these upsides and downsides into a more sustainable balance, kind of like we did with cars. And the fact that we didn’t get rid of cars because they crashed all the time, well, that helps us as a starting point. It leads us to ask, what can we do to improve the — forgive me — the information superhighway?
Tarleton GILLESPIE: I think that for a long time, these platforms faced a first-order problem, which was: you set some rules that people largely agree with, like “Facebook’s not going to have nudity,” and then you get stuck in disputes about where you draw that line.
Tarleton Gillespie is a sociologist who started in academia before making his way into a tech company.
GILLESPIE: I was teaching at Cornell University and then moved over to Microsoft Research. I’m not sure it’s the same thing as working at a tech company where I’m actually trying to make product decisions. I get the privilege of continuing to be an academic while also getting a really nice office and not having to teach classes.
Most of the large platforms started to get pretty good at dealing with the first-order problems, like nudity or spam. But, as Tarleton explains, they soon had other, thornier issues to deal with.
GILLESPIE: And then we started to see that platforms get large enough that they become targets for problematic behavior. There were coordinated and deliberate attempts to attack people. That’s a second-order problem because it’s the very enormity of the site — it’s the very richness of the environment, it’s the availability of people — that makes it worthwhile.
Tarleton’s concern is, well, that the platforms have become so massive that they are as much of a problem as the people doing the bad things on them. So if we want things to get better, what do you fix: us, the platforms, or both?
Eli PARISER: So when you think about societies as having centrifugal and centripetal forces — things that push us away from each other or drag us apart and things that pull us together — part of the challenge is getting the right balance of those things.
That’s Eli Pariser. Eli was the executive director of MoveOn.org. Many of us know him as the author of The Filter Bubble, a really influential book on how algorithms personalize what we see online. Eli highlights the first challenge that we face when we try to fix social media: we can’t really go forward unless we get the companies to agree that they have a responsibility to change.
PARISER: So if you were a company, like Facebook or Twitter, that had a really extraordinarily heavily-used newsfeed, let’s imagine that you weren’t just optimizing for ad clicks or for eyeballs or for engagement, and, instead, you decided, “I’m going to make the world a more democratic place.” What would you be trying to optimize for?
What that’s really about is finding these points of group equilibrium. Points where we’re growing enough that there’s vitality and new energy and new life, but not so much that we lose our identity, or points where everybody is getting to speak enough that they feel heard and seen, but not so much that everybody’s drowning each other out.
Ethan ZUCKERMAN: I think the novelty of these platforms actually gives a chance for us to look at them as institutions and through some actual creative thinking about what are these institutions and what do we want… we have the possibility that these institutions can be reimagined, that we can think of them as working in entirely different ways than they work right now.
That’s Ethan Zuckerman. He teaches public policy at the University of Massachusetts at Amherst, and he’s the director of the Institute for Digital Public Infrastructure.
ZUCKERMAN: Some of the time we end up feeling like social media is so new, it’s so fresh. How can it be all that difficult to fix if, frankly, we barely had these systems 10 years ago?
We can look at our current situation as a kind of social media 1.0. I mean, we’re getting a deeper understanding of what’s working and what’s not with our platforms. But, as we ask the questions that we need to, will the companies listen? Can they become part of the solution?
Eli PARISER: I do think there’s a limit to how much we can expect firms that are set up to deliver returns to investors to foreground social good. It’s not even a criticism, really. I think there’s some dissonance between the way that sometimes we talk about these for-profit businesses and then the things that we’re expecting them to hold up in society.
I think the challenge is, on the one hand, because these big platforms are a de facto public square, I want them to serve people well. But I also feel like, at the same time, we need to start to build a different kind of public imagination about what exists beyond them. Because we are expecting an awful lot of these companies that in the scheme of the world are not that large. Twitter is — I don’t know what the latest number is — a few thousand people. And Facebook is maybe an order of magnitude bigger in terms of the staff.
It’s still just a tiny, tiny fraction of people in the world. How could we expect these systems to serve everyone well and to not produce a lot of negative externalities? It’s like we’re asking the bookstore to serve all the different communities that are better served in a library. Let’s just build the library and let the bookstore be a bookstore.
Maybe you can see the large elephant in the room: these companies are here to make money, but observers like Eli and Ethan want them to build something that serves all of our needs, not just the interests of their shareholders.
Maybe this means the public needs to take more control over the large companies, or maybe it means we build a new, publicly-owned social media network. But not everyone thinks we need to wait until we build something new. After all, if it’s money the platforms are currently after, why not start hitting them where it hurts: the bottom line?
Jade MAGNUS: I think in general, when we interact with these social media platforms, we’re not thinking about who’s in charge.
That’s Jade Magnus, the senior campaign director at Color of Change.
MAGNUS: Which is the nation’s largest racial justice organization.
Color of Change wanted to find a way to get rid of the hate speech that shows up on Facebook.
MAGNUS: Facebook would approve ads that said, like, “We need to create a world for white children.” Total Aryan Nation propaganda. And that ad would be literally right next to Procter and Gamble. It just spoke to how Facebook has truly become the wild, wild West. And beyond that, whose money they’ll take.
The Color of Change program, which was called Stop Hate for Profit, asked major corporate advertisers, companies like Procter & Gamble, to pull their advertising dollars from Facebook.
MAGNUS: We had hundreds of corporations pull out. I think what it did more than anything was shine a light to corporations and also consumers about the ways that Facebook makes their money.
The work of any activist is slow going. Yes, organizing consumers may bring victory sooner than building a brand new, publicly-owned Internet. But do these campaigns have any impact on the companies? Well, yes and no. They don’t like the bad P.R. But for Facebook, a relatively small portion of their ad revenue — actually less than 20 percent in 2019 — came from the large, mostly U.S. corporations that were involved in Jade’s boycott.
Okay, say you get people involved. Well, as Ethan Zuckerman says, you still need to educate consumers — inform them that their voice matters and that they can bring about real improvements to their lives.
ZUCKERMAN: We need to look at the intersection of technology and the humans who use it, and understand that very serious consequences can come from that intersection. And when the response is, “Oh, well, they’re just using it wrong. If they would only use it the right way, everything would be okay,” that’s not actually an acceptable outcome.
Sudhir VENKATESH: Where does training and education and — they sound quaint, when I even say these words — what is it, a class? A driver’s license? I don’t know what it is. I’m curious to know what you think about that impact.
ZUCKERMAN: I think training people to understand the downside of these technologies is a really good first step. Your question of, “Is it a course? Is it a certification?” That’s a healthy question. I think it’s also worth asking the question: is it ever going to be enough? And if you look at it and say, “Well, that’s not my problem, I built it to be neutral,” you’re avoiding responsibility in two very important ways.
The first is that you didn’t tell people how to use the system in the first place; you’re leaving them to their own devices and you’re leaving a situation where people are simply playing different games. Some people are in it for the lols. Some people are there to provide support for friends. Some people are in it for information. Some people are in it for converting people to their politics. Everyone’s at cross-purposes, and it can be a complex and very dangerous space.
Second of all, once you realize some of the patterns that are coming out of this, some of the harmful patterns where it turns out perhaps that strong emotion is amplified and that may be leading discussions to extremes, you absolutely have a responsibility to step in and say, “Let’s try to figure out how to mediate that. Let’s try to figure out how to move it in a different direction.” The one thing you cannot do is say, “We’re not going to do anything.”
So the platforms have to change. And we have to change as individuals. And our society has to change. That’s a lot of changes at once. How do you bring that about? And is there any reason to believe that this is a realistic possibility? Well, the good news is that there are some signs of a different path forward that are already sprouting up. We’ll take a look at a few of them after the break.
* * *
Mark WEINSTEIN: In recent months we’ve had about eight million new members.
That’s Mark Weinstein.
WEINSTEIN: I am the founder and chief executive officer of MeWe.
MeWe is a social network that Weinstein founded in 2012 to compete with Facebook. And like Facebook, you can read news and you can reach out to friends. But you don’t see as many ads, and you control the content you want to see. Weinstein’s passion is privacy — not sharing your personal information with advertisers. But that’s not the main reason why several million new members recently joined his site.
WEINSTEIN: For sure, we’ve seen recently an inflow of conservatives.
After the recent presidential election and the events of January 6th, conservatives left platforms like Facebook and Twitter. Many were upset at the banning of President Trump. MeWe is happy to have those new users, but Weinstein wants to make sure that growth doesn’t come at the expense of healthy, civil conversation.
WEINSTEIN: We’re not going to amplify outrageous content into your newsfeed to get you aggravated and irritated.
MeWe isn’t the only smaller social media site welcoming some of these users who left the big platforms.
Jeff BRAIN: It was November that we announced that we’re going live, and in a matter of months, we’re two and a half million people.
That’s Jeff Brain, the C.E.O. of CloutHub, another new social networking platform. Like MeWe, CloutHub has also decided not to show you tons of ads and to keep your data out of the hands of advertisers.
MeWe and CloutHub are examples of an important trend: the growing popularity of alternatives to Facebook and Twitter. Maybe you remember how Facebook and Twitter overtook MySpace and Friendster? Well, we may just be seeing a new marketplace of options forming.
BRAIN: I think the reason that CloutHub leans conservative right now is because of the times. Right? The conservative community feels unwelcome on Twitter and Facebook, and they’re looking for alternatives.
Brain’s idea of an alternative is to offer people a different way of engaging in politics. He draws on his own experience as a civic activist in Los Angeles. Brain led a conservative movement by San Fernando Valley residents to split off from the city of Los Angeles, and recently he spoke at the Conservative Political Action Conference. So it’s not a big surprise that so many Trump supporters see him and CloutHub as a potential ally.
BRAIN: Any platform — Twitter, Facebook — they all go through growth stages where they don’t look like what they’re intended to ultimately look like. I want people to engage in their school board meetings, their city council meetings. I’m filling a category. There’s LinkedIn and it’s business networking. There’s Facebook and Twitter and some others that are social networking. I’m filling a category of civic, social, and political engagement.
In the past few months, many newer social media platforms have experienced a boom in user growth. Maybe you’ve heard of Gab, which has about four million users, or Parler, which claims to have around 15 million. These platforms have received a lot more attention than CloutHub and MeWe recently.
Prominent conservatives, including many in Congress, as well as members of QAnon groups, have applauded Parler and Gab because they do little to no content moderation. They proudly allow all kinds of speech on their sites.
MeWe and CloutHub’s approach is different. Neither of the two founders, Weinstein or Brain, wants their site to be just for one political group or another. They’ve taken a more active role in moderating the content on their platform. But the rising amount of heated political content from conservatives will force them to devote more of their resources to keeping their sites free of hate.
WEINSTEIN: I don’t like “anything goes” platforms. I think they’re disgusting. I believe five years from now that there will likely be more “anything goes” sites and then more sites like MeWe that are very fair and civil about moderation, without a political agenda, without influence by any particular constituency.
Like Weinstein, Jeff Brain has also invested lots of resources to ensure CloutHub does not get overrun by overheated, uncivil conversation.
BRAIN: Moderation is the most important thing on a social media platform. We’re not like some of the other platforms that took this “free speech, anything goes” approach.
VENKATESH: I’m curious to know how you think about your relationship to Silicon Valley, and to leaders like Jack Dorsey or Mark Zuckerberg?
BRAIN: Well, I have a huge amount of respect for both of them. They built huge organizations and platforms. They made decisions without any prior ability to witness what works and what doesn’t work. I would think that maybe if they were doing a platform today, they might do it differently. Facebook and Twitter were never designed to be the center of political debate. There are certain things you need in order to be able to have successful discussion, and I don’t think they were designed for that. I’ve studied Facebook and Twitter and the others. And we have the advantage of seeing what they did and saying, “Oh, I don’t like that. I want to do it differently.”
MeWe and CloutHub are not just part of a growing marketplace. They also embody a second trend. That’s the tendency of people to want to be online with others who are like-minded and who share some of their interests, whether that’s religion or politics or sports. It’s what social scientists refer to as the formation of micro-communities online.
GILLESPIE: Facebook is the gargantuan in its field and Twitter is gargantuan in its field. And each of these have a network effect such that it feels like you have to be there.
That, again, is the sociologist Tarleton Gillespie.
GILLESPIE: And if you’re finding it working against you, then you’ve got to go form a counter site.
VENKATESH: It reminds me of some of the writings of Joseph Turow on the rise of cable T.V. And one of the reactions to that was, “Oh, my gosh, the social contract in America is being torn apart. We’re all going to go into our little caves and we’re not going to want to be with each other. And this is the end of society.” I mean, I’m exaggerating to make a point, but it feels like we’ve been there before. Now when we see conservatives going to chart their own social network sites, well, maybe the solution is just to let everyone do that for a while. Because at the moment it sounds really terrible — it sounds like “this is polarization, etc.” — but when you allow a hundred different sites to flourish, then the threat is reduced.
GILLESPIE: I think that would be part of a really good answer, because part of the challenges that we’re having with the major, major social platforms have to do with both of their immense power, but also the sense that they are where you have to be. So this is a question about market dominance. Right? So having to go from a place because you’ve been removed from what feels like the largest, the only venue in which to be heard — which it isn’t, but it can feel that way — then splintering makes sense. Now, you wouldn’t need to splinter if there were dozens of options and they all had different flavors. This wouldn’t be as big an issue if there were 30 medium-sized social networks.
The two trends we’ve been talking about so far — this marketplace of options and smaller platforms filled with like-minded people — that may all be a good thing, but that doesn’t get us away from the fact that the platforms are all for-profit businesses. Can you really create a healthy, welcoming online ecosystem if everyone is trying to make a buck?
PARISER: In a town, in a community, business is an absolutely important part of what goes on and a lot of business spaces are really important social and community spaces.
That’s Eli Pariser again.
PARISER: But we also recognize that there need to be libraries; there need to be parks; and that these things are best done as enterprises that are accountable to everybody. We imagine that the Facebooks and the Twitters of the world are going to take those roles on and maybe they do, too. But I just think there’s a mismatch in structure. Ultimately, if you have a structure for one particular goal, let’s not imagine that it’s going to serve billions of people well for all the different facets of their life.
VENKATESH: Are there off-line, in-real-life spaces that, for you, serve as the models for what a healthy online ecosystem should look like?
PARISER: One I spend a lot of time thinking about is parks. The reason I like parks is parks are places where we encounter people who are very different from us, but we do so in this very gentle way that’s not about, “Hey, you. I want to argue with you about abortion.” You see someone in the park a few times. They’re with their kids. They’re smiling. And maybe the third time you see them, you wave, and you start to develop this sense of familiarity and comfort with someone who might have a life that’s not like yours.
So, let’s say you build a smaller social media platform, and maybe you take Eli’s advice and you run it as a park — you know, with public ownership so it’s not all guided by the profit motive. Well, that’s just the first challenge. You’ve got another problem: How do you make people on the platform feel like they belong there? Like they’re invested in keeping things healthy and civil? Like it’s their park?
Well, we’ve seen the large platforms have trouble doing this. They put so much emphasis on using machines to automatically detect and remove bad content that there’s no sense of belonging. Most people aren’t really thinking, “Wow, I love being there. It’s my community.” To find that feeling, we have to turn elsewhere.
GILLESPIE: It’s easy to forget that there was another tradition that wasn’t just about, “build a platform and let people fill it.” It was the tradition of community management.
Tarleton Gillespie is talking about the practice of a platform turning over to the users themselves some of the job of managing content and setting expectations for proper behavior.
GILLESPIE: There were people who had been building websites and running online communities for a long time that understood that communities didn’t just happen. They had to be nurtured and mediated and protected. There had to be guardrails to a healthy community.
We don’t think about that much now because Facebook’s position went the other way. And I think you can probably see the fingerprints of venture capital, where software design was rewarded: investing in someone who had designed clever software probably looked more appealing than investing in someone who thought they could manage a community well.
Community management is a growing trend at both the big and the small platforms. At Twitter, for example, my former team just rolled out a product called Birdwatch. Birdwatch asks users to label tweets that might be misleading, and then other users get to rate those labels and decide whether the tweets should be removed. The idea is to draw on this tradition of community management and get people involved — get them invested in making Twitter healthy and safe.
As someone who worked on the Birdwatch team, I’m hopeful. But I’m also skeptical. Users are reluctant to volunteer their time to help a multibillion-dollar company, which raises the question: where does community management work well? In one corner of the U.S., we may be finding that answer. Ethan Zuckerman explains.
ZUCKERMAN: Front Porch Forum is a hyper-local social network that covers Vermont, one town in Massachusetts, and two counties in upstate New York. It has a very small footprint, but it’s worth paying attention to because it’s cracked the code of healthy community dialogue. It’s professionally moderated. And the moderators will pull your content if they think you’re being racist or sexist or a jerk, and will gently write back to you and say, “Maybe you could put that a different way. We’ll run it again tomorrow.”
This kind of content moderation takes considerable time and energy.
ZUCKERMAN: Michael Wood-Lewis is the co-founder of Front Porch Forum. People always ask Michael, “Well, maybe this can only work in Vermont.” And he points out that people thought it could only work in his particular neighborhood in Burlington, and then they believed it could only work in Burlington. And then they believed it could only work in the state. His argument is that it can work, but it has to work at very small scale. And they require cultivation. They require really careful understanding of community norms. They require buy-in from people in the community. They’re really hard to build at scale.
VENKATESH: This idea of scale, I think, is key.
ZUCKERMAN: It may turn out that ambition of being global and powerful is incompatible with wanting to build a really healthy and resilient community. That’s not the logic that you get from a Facebook or a Twitter. My hope is that we’re going to see less of this talk about, do we break Facebook up into Facebook and Instagram and WhatsApp? I don’t know that those things really change things. I think funding a wave of new social networks that are focused on healthy, resilient communities, networks that are designed to make us better neighbors and better citizens, and working to change the environment so that those networks have a much better chance of success. I think that’s a much wiser way to go.
Let’s stop and put some of the pieces together. MeWe, CloutHub, Front Porch Forum — you might not have heard of these platforms before, but I think they’re worth knowing about. But before we give up entirely on larger communities, it’s worth looking at one more example. Ethan Zuckerman explains.
ZUCKERMAN: The way to understand Reddit is that it is a single site that hosts thousands of semi-autonomous communities.
Wait, Reddit? Many of us probably think of Reddit as a toxic place — an “anything goes” site with a reputation as a fire hose for racist, misogynist, and violent content. But it has evolved.
ZUCKERMAN: Some of those communities are quite poorly run. Some are incredibly well run. R/Science, for instance, is a Reddit that only permits discussions of peer-reviewed scientific literature. And your post will get pulled if you’re posting something that isn’t peer reviewed; your post will get pulled if you are making unsupportable claims; your post can get pulled for all sorts of different reasons.
And the ways in which R/Science is able to do this incredible moderation work is by having more than a thousand moderators who work together to police the community and to keep that conversation healthy.
Places like Reddit, and other big sites like Wikipedia — they use something called “decentralized moderation,” meaning you don’t have just one small team, but you have a lot of different groups of people reviewing the content and trying their best to foster healthy conversations.
Reddit also allows people to modify the design of their area so they can set their own rules and create their own norms. And that can be a good thing. It gets people involved. It creates that sense of belonging.
If these examples that we’ve been talking about don’t look exactly like Twitter or Facebook, that’s a good thing. It’s healthy to have different types of sites that offer different types of experiences and different outlooks. Eli Pariser explains:
PARISER: We’re living inside these very simple and, in a way, homogenous structures, whether it’s Facebook or Twitter — it’s as if you were living in a city where every house was the same as every other house and every building was structured in the same way. I think this is one of the challenges: it’s just we’re trying to cram a huge amount of global discourse through a system which is not very differentiated, and, historically, that differentiation has been key.
ZUCKERMAN: Five or six years ago, I saw a lot of people go into companies like Facebook with a real conviction that that might be the way that they could have the biggest impact on the world.
I know exactly what Ethan Zuckerman is talking about. Five years ago, I went to work at Facebook with exactly that thought: that I’d be working on important problems and having a real impact on people’s lives. And I’m proud of the work I did there. But as you’ve heard in previous episodes, it didn’t always turn out the way I thought it would.
ZUCKERMAN: And we’ve gone from this notion of, “Perhaps I will go to Silicon Valley and I will help save the world” to “someone needs to save the world from Silicon Valley.”
There are two ways that sociologists think about society. Some like to focus on why things fall apart, on the problems: Why is there so much crime? Why do our democracies fail so often? And why do so many marriages end in divorce?
The other kind wonders why anything ever holds together at all. They want to understand our common bonds, the glue of society. And they study questions like: What makes us altruistic and willing to help others? How do we overcome our differences to do things together? And what makes us trust our sources of news?
When it comes to social media, most of our energy — in and outside of companies — has been on the first approach. Silicon Valley spends a ton of time and energy trying to get rid of bad people who cause bad behavior so that the good people can stay online and get along.
There’s a problem with this one-sided approach. Going back to our car analogy, it’s as if all you do is punish the bad drivers, forgetting that someone has to make sure the roads are built the right way. Otherwise, with one hand you’re turning good drivers into bad ones, while with the other you’re trying to get them off the road.
We need to spend more time figuring out how to build a better road — or, in this case, a more durable, healthy, and welcoming type of social media. But the platforms aren’t incentivized to build those healthier environments, at least not by themselves. They’re making too much money, and Silicon Valley is, well, too self-absorbed to change.
The never-ending cycle actually works just fine for them — and for us, too, if we’re happy with the way things are. But if we’re not, we’ll need to figure out how to get the platforms to work for us, and we’ll need to figure out what we want from our social media.
The problems of social media can’t be solved by Mark Zuckerberg or Jack Dorsey or for that matter, by Congress or activists or users or me. They can’t be solved by any one person or group.
Someone does need to save social media. The thing is, that someone is all of us.
Coming soon on Sudhir Breaks the Internet, we’ll have in-depth conversations with many of the voices that you’ve heard in this series, and others from the world of technology, including Tristan Harris, the star of the Netflix documentary The Social Dilemma; Harvard law professor and Facebook Oversight Board architect Noah Feldman; and Jade Magnus from Color of Change.
We’ll keep looking at the complicated world of social media, especially some of the rising personalities like CloutHub founder Jeff Brain, as well as those helping platforms work better, like my friend Tracey Meares at Yale Law School. And we’re always interested in exploring new topics beyond the world of social media. So send us your ideas. Here’s how: tweet at S-B-T-I-underscore-Show, or email us at SBTI@freakonomics.com. That’s S-B-T-I at Freakonomics dot com.
* * *
Sudhir Breaks the Internet is part of the Freakonomics Radio Network, and is produced by Freakonomics Radio and Stitcher. This episode was produced by Matt Hickey. Our staff also includes Alison Craiglow, Mark McClusky, Greg Rippin, Emma Tyrrell, and Lyric Bowditch. We had help on this episode from Jasmin Klinger. To listen ad-free, subscribe to Stitcher Premium. Thanks for listening.
MEARES: If you’ve ever been to Athens, you know, you can even imagine the people in the agora standing on a stone just yelling. And possibly people will hear them or possibly they won’t, right?