Content Control

An interview with Jillian C. York, the author of Silicon Values: The Future of Free Speech under Surveillance Capitalism.

Staff at the Facebook Loeschzentrum ("deletion center") in Berlin (Soeren Stache/picture alliance via Getty Images)

Booked is a series of interviews about new books. For this edition, Nicole Froio spoke to Jillian C. York, the author of Silicon Values: The Future of Free Speech under Surveillance Capitalism (Verso).


Nicole Froio: Silicon Values is full of observations about the current state of social media and content moderation. You identify a class structure in the way that speech is controlled. Could you tell me a little bit about the corporate monopoly on speech, and why that endangers social movements?

Jillian C. York: Historically, the state and religion have been the institutions that control what people can and cannot say, punishing people for their thoughts, for their actions, and for their words. Silicon Valley companies have emerged as a third institution. They have put forth a new set of parameters. On their platforms, the people who make the decisions about our speech are not people that we elected, or that we trust for their faith, but people like Mark Zuckerberg. They surround themselves with “yes” people.

In the early days when Facebook was taking up this role, the rooms where decisions were made about what could and couldn’t be said were more diverse than I thought they would be, at least in terms of gender. But there was a lot less diversity in other ways. Most policymakers come from middle-to-upper-class backgrounds and many are Ivy League graduates. Today, a number of Facebook’s executive-level policymakers have law enforcement or government backgrounds.

As these companies have gotten bigger, they have automated a lot of these content moderation processes. Sometimes it’s a human moderator, sometimes it’s a machine, but nevertheless it’s an automated process that’s quickly making a binary decision. On/off. Delete/keep.

All the different ways social movements use social media leave the door open for a social media company to take the content down. You’ve got people posting events. You’ve got people capturing footage of what they’re seeing. You’ve got people reporting things that are happening in the moment. If you livestream from a protest and there’s violence, it’s very likely that your video will be taken down. The context doesn’t matter. You see somebody’s shot; the screen goes blank. Or let’s say you’re filming a protest for Palestine, and somebody walks by who happens to have a Hezbollah flag. That goes down, because an automated tool detects that flag as belonging to a terrorist organization. It doesn’t matter if the entire protest is about something else; because that one element is there, it goes down. These tools often affect demonstrations by accident. But I don’t like to just call it an accident, because I think it’s carelessness, caused by having people make these decisions without attention to detail or a deeper understanding of context.

Froio: Part of this has to do with how U.S. imperialism is embedded within these companies. Decisions about what stays on social media and what gets deleted affect protesters in the Global South more than in America, because the companies don’t really care about the details in other countries.

York: We could argue that these companies shouldn’t exist in these places at all, but I’m a realist; the internet is global, as are these corporations. But they’re not putting nearly enough resources into understanding local contexts. Myanmar is an obvious, well-known example. The country has a population of about 55 million people, but Facebook hadn’t been paying much attention to it. By 2013, however, as the country started opening up, journalists and researchers began informing Facebook that there was a burgeoning genocide against the Rohingya people taking place, and that hate was being spread on the platform. Facebook didn’t do anything about it. They didn’t hire more content moderators. When they eventually did act, it was with silly little initiatives like, “Here’s how you should speak back to hate speech.” Never mind the fact that the people who were engaging in hate speech on Facebook in Myanmar were people in power.

When it comes to content moderation, it’s important to consider the speaker. If it’s your average person, like a far-right marcher in Charlottesville, I think that there are valid arguments to be made for not taking hate speech down. But when you’re talking about a general, or a president? That’s when the speech matters even more. And yet these companies have flipped this, kicking off people left and right but leaving the most powerful mostly untouched.

Sometimes the content moderation doesn’t go far enough. Sometimes it does too much. I got some information a couple weeks ago that Facebook has a set of countries where it rules out certain LGBTQ tools or features. Things like the special pride flag “like” button that was rolled out to users in 2017—that feature simply wasn’t available in a subset of countries. That may not seem important, but to me, it’s actively undermining LGBTQ movements in other countries by creating a system apart. That list includes pretty much every Muslim-majority country, and others, like Russia and India. Who’s making that decision and why? What are they basing it on? This is the bigotry of low expectations—a paternalistic assumption that the company can’t trust people with its product.

Froio: A lot of these rules seem so arbitrary, but they are also rooted in American culture. I thought one of the most interesting chapters in the book was the one about sex work and what U.S. companies view as deviant bodies or deviant parts of our bodies.

York: I’m from the United States, but I grew up with hippie parents who were pretty open about their bodies, and I did theater, where people just strip down because you have to change your clothes quickly. So I saw a lot of naked bodies by the time I was a teenager. But I also grew up in the mid-1990s, when there were scandals over Britney Spears and her schoolgirl costume. It was a lot of mixed messages for me, which is kind of what got me interested in this topic in the first place. As an adult, I moved to Morocco, where everybody’s much more conservative and yet women freely strip down completely naked in the baths together, and now I’m in Germany, where I can just take my top off and nobody would give a shit. I’ve experienced a lot of different cultural conceptions of nudity.

There’s a tendency among a lot of activists concerned with free expression to separate nudity from sex work, but they’re deeply connected by the expectation that we should feel shame. When it comes to sex work in the United States, strip clubs have been accepted, but on Instagram, if you’re an athletic pole dancer, you get censored. The rules on social media are more prudish than the real-life rules in the United States in a lot of ways. They blame it on their global community. Literally, it says in the rules, “We don’t show this because of our global community.” But I also talked to one person at Facebook who said it was because “there would be boobs all the time.” But they have the technology to filter them out. Why not just have a boob switch? “Boobs off.”

So much of it comes down to patriarchy: what we allow women to do. I don’t want to dismiss the realities of men involved in sex work, but the bulk of the problem for these companies is women’s nudity online, and women’s work online, including trans women. It’s no surprise to me that these companies are all run by men.

Froio: There’s a real double standard. On TikTok, I see a lot of men play out their violent fantasies against women, and yet sex workers I know aren’t allowed to post certain things because they will get shadow banned—not banning accounts outright, but making their content inaccessible to other users.

York: The standards for sex and violence are so different. Violent video games are allowed on Twitch, but you can’t show a hint of a nipple. Most of my friends who do sex work in Germany are not reliant on the internet. When it comes down to it, they actually do have other options, unlike many in the United States. But when the pandemic hit, everybody got stuck online. A couple of my friends are incredible burlesque dancers, and they started a Twitch show, but they got kicked off after week three. They were even wearing pasties, but it was too much breast. Twitch came out with a policy that was supposed to be trans-inclusive, but it made the company look worse because of these rules: they now have a ban on something like “female-presenting nipples.”

Hopefully, social media has helped push this issue forward, not only because we can talk about it on social media, but because these companies have laid bare just how ridiculous it is that certain things are still criminal.

Froio: As you pointed out, there are parts of social media that are really useful to movements. Donna Haraway used to say that on the internet, you can be anybody that you want to be. And you start one of your chapters with a quote by Wael Ghonim: “I once said, if you want to liberate a society, all you need is the internet. I was wrong.” So, what went wrong?

York: I think that everyone should have a voice. But I don’t think that we as humans were prepared for all of this—to be part of a global conversation. When I think about the cultural lines that are most complicated to cross, they’re often not country to country, they’re generational. As an older millennial, I sometimes find it harder to talk to some of the younger Zoomers than I do to talk to somebody my age from Ethiopia. It’s about what we grew up with and how we were taught. Things might have a chance to even out at some point—if we could slow the “progress” a little bit.

Froio: The sheer volume of content has become really difficult to deal with.

York: I would love a Twitter that only let me post five times a day.

Froio: You mention twice in your book that you have always believed these companies shouldn’t control speech at all. And then you changed your mind. What led to that?

York: On a fundamental level, I still don’t think that these companies should control our speech. But as long as they do, we need to figure out a way for them to do it better. After seeing how Gamergate, the rise of white supremacists, and even ISIS unfolded in online spaces, I realized that these companies do have a responsibility. They can’t just let everybody do their thing, because that’s not working, and people are getting hurt.

Around 2017, I started working with a coalition of other groups concerned about content moderation to try to figure out some common ground. Because there is a lot of disagreement. Many U.S. civil rights groups, for example, want to see more censorship of hate speech that isn’t incitement, and I’m not on board with that. I understand where they’re coming from, and I’m not going to be the one to stand in their way, necessarily, but I think that right now people are thinking short-term; that’s completely understandable, but it doesn’t make good policy. But we could come together on trying to make these processes better.

One important tool is the right to appeal. If something you post is taken down and you think it shouldn’t have been, a human needs to look at that. That’s a core component of any justice system. Another essential part is transparency. We need to know, for example, what definition of “terrorist” these companies are using—and if it’s based on politicized or problematic U.S. definitions, to be able to try to change that. Facebook has a choice here: it could do more work to determine who’s genuinely dangerous, what groups we don’t allow. Instead, it keeps this very secretive list. And any time a list like that is kept under the table, it’s going to be prone to disaster. As we’ve seen from reporting in the Guardian, you’ve got content moderators who are literally trained to memorize faces. They’re required, for instance, to know certain figures from designated terrorist organizations (such as Hezbollah leader Hassan Nasrallah) in order to be able to identify them and remove content that speaks about them in positive terms. I just don’t see how that’s a policy that you can put into action in a consistent manner.

Froio: There are bad things about AI moderation, and there are bad things about human moderation. Is there a better way to bring them together?

York: A former Facebook employee told me that there are two different kinds of content moderation. There’s the stuff that’s binary: is it boobs or not? And then there’s stuff that’s contextual. The binary stuff can be dealt with by automation fairly well at this point. But I still think that for the majority of it, that choice should be in the hands of the users, not in the hands of the centralized content moderation system. So, if we’re talking about boobs, let the user turn off the boob switch. If necessary, Facebook itself can do this to comply with the local law.
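To make that distinction concrete, here is a minimal sketch of what user-controlled filtering might look like, assuming an upstream classifier has already attached labels to each post: visibility is decided per viewer, with a platform-level override only where local law requires one. All the names and categories below are hypothetical illustrations, not any company's actual system.

```python
from dataclasses import dataclass, field

# Hypothetical labels an automated, centralized classifier might attach to a post.
NUDITY = "nudity"
GRAPHIC_VIOLENCE = "graphic_violence"

@dataclass
class UserPreferences:
    # Each viewer decides which labeled categories to hide for themselves.
    hidden_categories: set = field(default_factory=set)

@dataclass
class Post:
    text: str
    labels: set  # assigned upstream by the "binary" classifier

def is_visible(post: Post, prefs: UserPreferences, legal_blocks: set = frozenset()) -> bool:
    """Show a post unless the viewer opted out of one of its labels,
    or a jurisdiction-level rule (local law) requires blocking it."""
    if post.labels & legal_blocks:
        return False  # platform-level override to comply with local law
    return not (post.labels & prefs.hidden_categories)

# Usage: the "boob switch" becomes a per-user preference, not a global rule.
viewer = Post  # noqa placeholder removed below
viewer = UserPreferences(hidden_categories={NUDITY})
post = Post(text="pole-dancing workout clip", labels={NUDITY})
print(is_visible(post, viewer))             # False: this viewer chose to hide nudity
print(is_visible(post, UserPreferences()))  # True: a viewer with default settings sees it
```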

Then there’s the really horrendous stuff. Child sexual abuse imagery is abhorrent. There’s no space for it in this world. The way it is dealt with is that law enforcement classifies and identifies the imagery and submits it to a database run by the National Center for Missing and Exploited Children, where each image is tagged, or “hashed,” in a way that can create a match. The companies employ artificial intelligence to identify images that match those in the database and automatically remove them before they’re displayed. That system works, and should continue to work. But when it comes to terrorism, we’re dealing with a much trickier topic. There are some things we need to see to understand, and others that nobody ever really needs to see. So you do still need a human touch. You can’t just throw everything to automation. But you can use automation to do the first checks. To sort things for the human. To make predictions. You can have a human in the loop, but not as the first response.
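Under similarly simplified assumptions, the split York describes might look something like this in code: known imagery is removed automatically by matching hashes against a shared database, while the automated first pass on everything else only scores and queues items for human review rather than deleting them outright. The hash function, threshold, and names below are placeholders; real systems rely on perceptual hashing (such as PhotoDNA), which survives resizing and re-encoding, rather than a cryptographic hash.

```python
import hashlib
from collections import deque

def image_hash(image_bytes: bytes) -> str:
    # Placeholder: SHA-256 stands in for a perceptual hash purely for illustration.
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes contributed by law enforcement via an NCMEC-style clearinghouse.
known_abuse_hashes: set[str] = set()

# Uncertain cases go to a person for a contextual decision, not straight to deletion.
human_review_queue: deque[bytes] = deque()

def moderate(image_bytes: bytes, predicted_risk: float) -> str:
    """First-pass automation: exact-match removal for known material,
    triage for everything else, with a human in the loop for context."""
    if image_hash(image_bytes) in known_abuse_hashes:
        return "removed"             # matched the shared database: never displayed
    if predicted_risk > 0.8:         # hypothetical classifier score
        human_review_queue.append(image_bytes)
        return "held_for_review"     # sorted and prioritized for a human reviewer
    return "published"
```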

Froio: Many of us are citizens of the internet. We’ve built communities on these very flawed social media platforms, but they are still our communities. And I don’t want to see harassment there. What does it look like to work on this while giving power back to people?

York: One of the biggest problems that we have right now is the lack of competition among these platforms. Twitter is actually the small guy compared to the two big players: Facebook and Google. They own the fiber, and they’re acquiring other companies all the time. They’re the ones that prevent us from having alternative options. As problematic as Twitter can be, I think that it is better at a lot of this stuff than a lot of the other companies are. But it’s also a simpler platform.

What would a user-powered movement look like? I want to see more users brought into the policymaking process. You can literally hire a diverse selection of your users, from around the world, and bring them to work for the companies—people who understand your product, and who understand the issues. There are other ideas, like the oversight board at Facebook (a body that is funded by, but operates independently from, the company and is tasked with adjudicating difficult content moderation decisions submitted either by Facebook or the public). We need more ideas like that. But why is this board 25 percent American when Facebook is not 25 percent American? More idealistically, I think we’re talking about creating a new platform, or platforms, from the ground up. I don’t think I’m going to be the one to do that, but I leave that page open for the next person to come along.


Nicole Froio is a Colombian-Brazilian journalist, researcher and writer. She writes about pop culture, violence against women, inequality, movements for justice, digital cultures, and books.

Jillian C. York is a writer and activist whose work examines the impact of technology on our societal and cultural values. Based in Berlin, she is the Director for International Freedom of Expression at the Electronic Frontier Foundation, a fellow at the Center for Internet & Human Rights at the European University Viadrina, and a visiting professor at the College of Europe Natolin.

