Facebook’s Oversight Board has upheld Trump’s ban — what’s next?

Today we’re focused on one of the most complicated problems of all: content moderation. If you’ve kept up with Decoder, you know that content moderation seems to come up almost every week — the question of how platform companies decide what to leave up and what to take down is messy, controversial, and extremely political.

And if something on the internet is messy, controversial, and political, you know that Facebook will be at the bleeding edge of it. Last year, the company announced that it would send difficult moderation problems to a new entity it calls the Oversight Board — a committee made up of lawyers, politicians, and speech experts that would rule on whether specific content takedowns on Facebook were appropriate.

That board just got its first big test last week, as it issued a decision about whether former President Trump’s indefinite ban from Facebook platforms would stay in place. And the decision was to kick the issue back to Facebook — the board said Facebook didn’t have an actual policy in place for it to review, and that Facebook should make one and send it back to the board in six months. In the meantime, Trump remains banned from the platform.

So what does all that mean? What is the Facebook Oversight Board, what are its powers, and how is it even independent from Facebook itself? You’ve probably heard people call it the Supreme Court of Facebook — is that the right way to think about it? Will every platform require a moderation court like this in the future, or is this just another way for Facebook to exert influence over the internet?

This is a big new experiment, and the Trump decision is a big moment for that experiment. To help figure it all out, I asked Kate Klonick, a law professor at St. John’s University Law School, to join the show. Kate has been researching and studying the Oversight Board from the start — she embedded with the board as it was forming to write a definitive piece for The New Yorker called “Inside the Making of Facebook’s Supreme Court.”

Kate and I talked about what the board is — and isn’t — what its powers are, and what this decision means for the board’s authority in the future. And we talked a lot about what it means for private companies to have things that look and feel like legal systems — if you step back, it is bonkers to think that any company needs to fund anything that looks like a Supreme Court. But Facebook is that big and that globally powerful. So here we are.

One note, I mention a Supreme Court case called Marbury v. Madison — that’s the very famous case from early in the country’s history where the Supreme Court basically gave itself the power to invalidate laws passed by Congress. Oh, and you might pick up that I’m nervous here and there — that’s because I always, always get nervous talking to law professors. I feel like I’m back in my 1L days every time. Bear with me.

Okay, Kate Klonick, from St. John’s University Law School. Here we go.

This transcript has been lightly edited for clarity.

Kate Klonick, you’re a law professor at St. John’s University Law School. You are also one of the foremost chroniclers of Facebook’s moderation efforts. Welcome to Decoder.

Thank you so much for having me.

We’re talking the day after the Facebook Oversight Board released a big decision about whether Facebook was correct to indefinitely ban Donald Trump from its platform. There’s a lot of concepts in that sentence alone. Let’s start with the Facebook Oversight Board itself.

In February, you published a long piece in The New Yorker called “Inside the Making of Facebook’s Supreme Court,” which detailed the process by which Facebook conceived of having an oversight board, literally the meetings and the software they use to create this board virtually during a pandemic. This decision feels like the first big moment for that board. What is the Facebook Oversight Board?

We still don’t know exactly. I know that’s the worst answer ever to start out, but I think it’s the right one. So we can talk about how it’s been talked about, and I think that that’s going to lead us to what we saw yesterday and what we can make of this opinion.

So in November of 2018, Mark Zuckerberg announced that he was going to set up this, what had been colloquially joked about and called the “Supreme Court” of Facebook. The idea was that certain types of content moderation decisions, once they had finished being appealed internally at Facebook, could then be appealed to an outside, independent oversight board.

And the question was, how the hell do you set that up? You have a pretty fundamental principal-agent problem right off the bat. And then how do you set that up? What does it make you do? And how do you make this work when you’re talking about public rights of freedom of expression and international human rights law? This is where the conversation was and still is, to some degree. And you’re talking about a private corporation. How do you make something have teeth? How do you make it legitimate? How do you make it independent? All of these questions were very open and on the table.

And so about six months after that announcement from Zuckerberg, I started following, inside Facebook, the governance and strategic initiatives team, which was the team that had basically been tasked, under Nick Clegg, with coming up with a solution to what exactly this board was going to be and how they were going to solve all of these institution-building problems.

And so I started following that, and I ended up doing that for 18 months, watching as they wrote their documents and figured out what they were going to do. And there’s a lot of stuff to unpack just in how they made decisions. So I’m happy to go over the basic framework of how they solved the problem of independence. Because I think one of the biggest things that we don’t know about the board is what it is, what it is indebted to Facebook for, and what makes it plausibly independent.

Let’s step back for one second. So Facebook is a huge company. They operate a massive platform, several massive platforms, around the world. Their content moderation decisions have an enormous impact on people, on culture, on democracy. They wander into speech issues at a global scale that a team of people in the United States cannot plausibly understand — speech issues in other countries that Facebook runs into at scale. You brought up human rights law; some of those things directly lead to horrible outcomes, literally genocides, involving Facebook.

Mark Zuckerberg is unaccountable to the shareholders of Facebook because of the corporate structure. So it’s a very unusual company. He controls a majority of the voting shares. He can’t be removed as CEO, fundamentally. And so his solution is, “I’m going to set up a different thing to hold Facebook accountable and to review our content moderation decisions.” And it sounds like what you’re saying is that the first problem is: how do you create that thing to be independent? And how do you pay for it in a way that maintains its independence?

Yes, exactly.

How did they solve that problem?

Well, that problem was interesting. So they decided to set up a Delaware trust. They set up that trust in October of 2019, and then the next day they arranged for the trust to form the Oversight Board LLC, which is a limited liability company.

The entire purpose of the trustees, and the entire purpose of the trust, is to administer a $130 million irrevocable grant that Facebook gave to the trust and then snipped the purse strings from. It’s not an endowment. This is an important distinction, because the trustees actually can’t invest the money. There’s no investment committee; it’s specifically not allowed. It’s not enough money to endow the board. It is enough money to run it for probably five to six years, and then it is contemplated that there will be an endowment.

So there is the question of well, if they really do go at Facebook and really do hold Facebook accountable in some way, they are running the risk of having their funding cut off. But in the short term, six years feels like a long time, and this is what they were charged to do. So we’re seeing what the board is doing. But this is basically how it works. All of the board members and the administration are basically employees of the LLC, which is itself controlled by the trustees. So that’s how this breaks down from a business standpoint.

It was a fairly elegant solution. The trust documents are pretty interesting to read if you’re into that kind of thing, which I wasn’t, but I had to do it anyway. So I think there actually is, for right now, for the next five to six years, a fair amount of financial independence. We’ll see what happens four years down the road.

Who’s on the Oversight Board? We keep talking about it like it’s a court. I don’t know if that’s correct. I want to get there. But they have some people who are making decisions and writing opinions and disagreeing. Who are those people?

Right now the board is four co-chairs and a total of 20 members, who all basically hear cases related to user appeals that come out of Facebook on content moderation, or things that Facebook itself kicks to the board to review, like the Trump suspension.

So those 20 individuals, it’s a pretty illustrious group of people. There’s a Nobel Peace Prize laureate, a former prime minister of Denmark, a former editor-in-chief of The Guardian, a former circuit court judge, a former Supreme Court clerk and Columbia law professor. Not to mention a ton of other people who are experts in human rights law and freedom of expression in their own right. So it’s a pretty well-staffed group that has a lot of experience with institution-building, freedom of expression, and international human rights.

So for the Trump decision, did all of them hear it and vote, or was it a small group? I mean, in normal courts, you go to the appeals court, they have a lot of judges, but only three of them hear your case, right?

Yeah. Usually. And that’s precisely what happens here. So the process is that a five-person, randomly selected, anonymous panel hears the case. And then right now, because some of these things were not laid out in the bylaws and it’s up to the board to decide them on its own, the panel collects facts, asks people for more opinions, reads all the briefs and everything else, and then writes up a draft of whatever it determines is the right solution to the problem. Then they start circulating that draft, and a majority of the full board has to approve the final decision. And that was basically exactly how it worked for the Trump case.

One thing that struck me reading the Trump decision is that it is anonymous. There’s frequent reference to the minority, and how the minority would have judged things differently. We don’t know who’s in the minority. We don’t know how big the minority is.

We don’t know how big but it can’t be bigger than nine people.

Sure. Yeah. But that’s a pretty big range from one to nine. And that’s all fine. So it’s pretty anonymous. And then the members of the Oversight Board are out in the world. They’re doing interviews with Axios. They’re doing interviews at the Aspen Institute. They’re publishing their own blog posts about this decision. I can’t quite understand what part of it is supposed to be anonymous and why, and what part of this is really public.

Yeah. No one knows. That’s a great question. All day yesterday I was a little shocked at all of the different types of thoughts and collective pronouns that were being used when people were talking. One of the main reasons to have anonymity on a panel like this is that anonymity gives you a certain amount of intellectual privacy. The idea that you’re not going to be publicly shamed for being in the minority is pretty key.

The other thing is that these people that are on the board are all over the world. And a lot of the cases that they are touching on possibly pose really very real security risks to them, should their names be attached to the outcomes. And so that was another consideration for when all of this was being discussed about whether they should be anonymous or not.

What we didn’t know is whether there were going to be dissents. And what’s super interesting here is that there are no dissents. There is one decision, and they’ve decided to fold the idea of a dissent or a concurrence into these mentions of a minority of the panel, a paragraph that says, “The minority of the panel felt differently about the reasoning for this.” So that was fascinating.

So that would be one thing, if it was just getting read from the bench and that’s all you heard about it, and you let the reasoning of the opinion or the decision stand on its own. But you didn’t. You then had everyone off tweeting all of their thoughts about everything and talking. And I think that that’s one of the things. I even asked one of the people who came on my show that day, I was like, “What’s going on?” If this was a court you wouldn’t be doing a TV hit to talk about what the conversation was like in chambers. That would be totally verboten. So it is like, “Well, are you a court? Because this sounds a lot like a court but you’re not exactly comporting yourself as a court.”

I don’t know; the Supreme Court doesn’t say, “Our big decision in this massive policy issue is coming out tomorrow at 9AM. Everybody get ready,” and then queue up its press hits. I keep coming back to that question. Is it a court? We’ve all called it this Facebook Supreme Court. It seems right, conceptually.

Facebook has difficult decisions to make. It doesn’t seem to want to make them or be accountable for them so it’s going to kick them over to this other place. They will take the hit and Facebook will presumably do what they say. But if you’re going to be that kind of institution, there’s just a part of it where that’s not how courts actually act. But just conceptually, I can’t tell if this is a court or not.

Yeah. I think they’re there with you. I think they’re still deciding what they want to do. So for example, just to put it right on the nose: a number of the board members talked to me, on condition of anonymity, for The New Yorker piece, explaining some of their first decisions and how they had reasoned through them and what they had thought. And that was a level of access that a court would not usually grant to a reporter or anyone else. And after that, they decided not to do that anymore.

So there are changes that they’re making to their policy. It’s not set in stone. I’ve heard a lot of people wondering, “Well, this is an incredibly well-reasoned and rigorous legal opinion, so this is most definitely seeming like a court.” But the way that it’s being talked about by the people who are making this decision is not quite the same as what we expect to see out of courts that, traditionally, we’ve seen in the United States.

What is the board’s power over Facebook? What can the board make Facebook do, specifically, if it’s unhappy with how Facebook is acting?

So the power that the board has over Facebook is incredibly narrow, but it is a small devolution of power that I’ve always argued is actually a lot bigger than it seems. For right now, for content that is removed or kept up on Facebook, after a user has appealed it internally with Facebook, they can take that appeal and give it to the board. The board hears their case and makes the determination whether to overturn Facebook’s decision, keeping the content down or putting it back up. And it’s only content, single-object content. It’s not even pages. That was a special thing that the board considered for Trump. Users can’t appeal their page takedowns right now.

Facebook agreed to adhere to whatever the board’s decision is on that specific piece of content. So if my specific piece of content, my picture of my dog, gets accidentally removed from Facebook and the Oversight Board says it has to go back up, it has to go back up. But if you had also posted a picture of my dog and got it taken down, they have no obligation to restore your content. They have said that they will make efforts to restore similar content, but there’s no promise. That’s it.

That’s very narrow.

It’s super narrow.

But they’re not actually saying whether the policies are right or wrong. They’re saying whether the enforcement of policies is right or wrong.

Yes. But there is one other thing that they obligated themselves to do, which is that when the Oversight Board makes public policy recommendations, Facebook has obligated itself to respond within 30 days, saying whether those recommendations have been implemented or not, how they have been implemented, or, if they weren’t implemented, why not.

And this form of weak-form review is something that Harvard law professor Mark Tushnet describes this way: it’s the court calling on the executive or the legislative body to come back in and fix their problem, and then report back as to how they ended up doing that. And so that is actually also a pretty important reputational pressure that the board can put on Facebook.

So this leads right into this decision. The simplest way of understanding this decision is: The Capitol riots happened. Trump posted a bunch of videos and posts that ostensibly encouraged people to stop acting badly at the Capitol but kind of supported it, too. It was pretty messy, actually, in terms of just the straight interpretation of what Trump meant to have happen because of those videos and posts.

Facebook tries to take them down. And he tries again. They take them down and say, “You’re indefinitely banned from Facebook.” This is the 6th and the 7th of January, a very chaotic time in America. All the other platforms are doing the same thing. Facebook then says, “Well, we got this board. We’re going to kick the indefinite ban of Trump to the board to see if that was the right decision.”

And the board comes back and they say, “Fine. It’s fine for you to have indefinitely banned him. We’re not going to write a policy for you on indefinite bans. Also, you have no policy on indefinite bans.” And they seem very unhappy with Facebook. There’s actually a tone to me in this decision that’s like, “Don’t put this on us.”

Yeah.

And I think you could read that [as], “Come back to us in six months with a real decision. And we’ll tell you what to do.” And that reads to me as asserting its authority in a way, but it’s also really, really narrow and kind of punts the issue.

I think it punts almost not at all, actually.

Okay.

I think that that’s been one of the worst takes that’s come out of this. It’s not punting the issue at all, because the issue’s coming back. What they’re saying is: we’re not going to carry water for you, Facebook. How I think about it is this. It goes back to what we were talking about: what is this board? Is it a court? What is it?

One of the questions is what type, or level, of court is it going to be? If it’s a trial court, then it’s a fact-finding body and it does all of this work. And maybe it makes statutory interpretations of rules.

Or is it more like a Supreme Court or a court of appeals, which is going to actually review the law and whether the law matched the facts of the case? And what we see out of this decision is that they are doing all three. I want to just hit on this really quickly. You talk about the fact that they lay out the events of the 6th and the 7th of January and that those were really hectic, crazy times.

Do you know what a gift it is to have a cogent, rigorous, well-researched record of everything that we know happened on the front end, matched with Facebook going on the record on the back end about what it was doing and how it was doing it? For the last 15 or 20 years, we’ve only had people leak stuff out of the companies to tell us this type of thing. There has been no process for this. It was like a breath of fresh air to have that resource going forward.

But that’s kind of a trial court type of hat. And so then they kind of get into this appellate court type of hat. And they’re like, “No, we’re not a legislature for you. And our mandate is to see what your rules are and whether or not they were consistent with your values and international human rights standards. And if you don’t have a rule, we can’t do that. And we’re not going to make one for you because that’s not our job. And we know what you’d like us to do. You’d like us to basically make this somebody else’s problem for you. But we’re not going to do that. And we resent that you would even ask us.”

And I just thought it was a powerful, powerful response, really rooted in the rule of law and procedure, instead of getting sucked into the vortex of intractable online speech problems and the definition of newsworthiness and public figures and things like that, [which] is never going to get you anywhere.

Let me push back on that. I find myself really wondering what the limit of authority for this Oversight Board is.

I think they punted specifically because they said, “You need to come up with a proportionate response to Trump’s actions and come back to us in six months with that response based on any rule,” which sounds like, “Someone’s got to reinstate Trump to Facebook and it’s not going to be us. We’re telling you that an indefinite ban is not acceptable. There’s no rule that says indefinite bans exist. But we’re not going to tell you the term of an indefinite ban.” The temptation to say that the board claimed for itself a greater authority by not making a decision is very high.

I see a lot of lawyers making that comparison to Marbury v. Madison in the Supreme Court. There’s this big historical parallel that seems very tempting. But here, narrowly, it’s just like, they didn’t want to be the ones to put Trump back on Facebook.

Yeah, I think that that’s probably also right. So here’s what I’ll say about what the term proportionality means in an international human rights context. The idea of proportionality is basically that you have some ability to atone for your punishment, or that the response is proportionate to the underlying problematic act that you’ve committed.

And it’s not clear to me that it is possible to have a permanent suspension of someone’s freedom of expression on a platform, or ability to be on a platform, and have it ever be consistent with international human rights standards. A permanent suspension is basically de facto disproportionate.

But I think that it’s a good argument. Because I do think that may be right. But they want that finality of that decision. They want Facebook to have said that that’s what’s at stake. And they don’t want to have to say it for them. And maybe it’s like punting the issue because they don’t want to deal with it.

But they’re going to have to deal with it one way or the other when this comes back.

But this is where I think that the wishy-washiness really gets me. Shouldn’t they have said, “You’ve got to un-ban him until you come up with a rule that would properly ban him,” as opposed to endorsing this reflexive reaction of, “Something bad’s happening and he’s making it worse, and we’re going to turn him off”?

Yes. But I think that it’s so messy. This is almost an interesting question of administrative law, which is that they defer to the decision of the underlying body. Right? And so they agreed that Facebook made the right decision at the time to take him down. And it’s not clear that enough time has passed, that that problem of imminence, or dangerous organization affiliation, or lauding dangerous organizations, has passed.

And so that was the other part; “Well, we can’t make that decision. You have to do that.” That part was the puntiest of all of it, I think. You’re right. They could have basically reached some type of determination that was the opposite, around the ban. And that part, they absolutely did put it back on Facebook to do something one way or the other.

One of the things that really came up in this decision a lot was the board asked Facebook, “Have you ever applied this exception called the newsworthiness exception to Trump, where he’s doing something that breaks your rules, but because he’s the president and he’s newsworthy, you’re giving him a pass?” Everyone assumed that this was Facebook’s justification. Facebook said, “No, we’ve never applied the newsworthiness exception,” which, A, I know you have some strong feelings about newsworthiness as a concept, but, B, that is a big surprise.

It’s not a surprise, because I don’t think that they’re lying. I think that they didn’t technically apply their newsworthiness exception to Trump. Trump is instead on a special list of newsworthy people, so it’s a different standard. They didn’t lie. They have different rules for different types of people, for people who are high-profile, people who have certain numbers of followers. We’ve known this for a long time. We’ve never had access to this list. We don’t know how it’s administered. We don’t necessarily know what goes into decisions about the content those individuals post. So this is one of the best parts of this decision; it’s kind of, “You have to tell us how this all works, because this doesn’t make sense to us, and it seems like you made it up as you went along. So this isn’t a standard at all, and we can’t even review it.”

I think that this is great, because, I mean, I wrote a paper in 2018 called Facebook v. Sullivan, which kind of was supposed to be a little play on New York Times v. Sullivan, which established the public figure and newsworthiness kind of considerations in First Amendment law. It toyed with this idea of, “Well, how are they possibly defining newsworthiness and public figures?” I talked to a number of former policy people that had left the company in 2013 specifically around this newsworthiness question.

Because they thought that this was an intractable standard that could never be consistently applied and was always going to be a question of, “Newsworthy to whom?” It was always going to be a group of people in Silicon Valley deciding, and that was bullshit. And the group of people that ended up being the winners just wanted to use it as a way to make ad hoc determinations on a case-by-case basis. I love that it’s coming up and that this is something that the board raised, because I think it’s just absolutely fascinating.

That special list that Trump was on, we don’t know how big it is, right? But Trump, his relationship to Facebook and to Twitter, he’s always sort of gotten his own space, right? It’s always been nuclear to moderate Trump in any way, shape, or form until it crossed the threshold of January 6th.

Is this decision from the board saying, “You can’t have those kinds of soft exceptions anymore. You have to treat everybody the same,” or is it, “If you’re going to have exceptions, you have to be clear about them”?

I would say the latter, but I don’t know. They might say that it does not comport with international human rights law and principles of law to have two different sets of speech rules for people.

Even if you’re the President of the United States?

Even if you’re the President of the United States. Or they might say that you are allowed to have different classes and types of things, but you have to be consistent about what classes people are in, and you have to tell us what it means to be put into this class. One of the things that I wrote about in some of my research is that at some point, they used to define public figures by the number of Google hits you had or the number of times you showed up in Google News.

They would use Google to determine whether or not someone was a public figure, or a combination of that and how many people followed you on the platform. But these are standards that…they changed all the time. I have no idea where that standard is now, because we’ve had no transparency into how it changed or where it’s gotten to. So I don’t know. I’m excited for this conversation to have gotten to such a sophisticated place finally, after the last 10 years of nonsense, and yeah, I’m really looking forward to it.

There’s just a part of this where Facebook gets to pretend it’s the most important thing in the world all the time. And it’s created this board. And here we are talking about it as though it’s a supreme court.

And right next door is Twitter. And they’re like, “Yeah, we’re just banning the guy. He’s gone. We’re not telling you if he’s ever coming back. Maybe he’ll never come back. You’re just never going to know.” And there’s no process by which Trump or any of his team can appeal that decision because Twitter is a private company, it’s their platform. They can do what they want.

Facebook is trying to create this other thing that provides moral, legal, spiritual justification for a snap decision. I just can’t tell if that is correct, or whether it’s fundamentally distracting, or whether everyone should have a giant oversight board.

Yeah. I mean, I think that all of those things are the things to be thinking about. I think that that’s all I’ve been thinking about for the last two years. But I’ve been by myself.

That’s why I wanted to talk to you.

So this is so much nicer than just sitting in my apartment, staring at my dog being like, “Why won’t you tell me what to do about the Facebook Oversight Board?”

So I think that that’s completely correct. The way that I think about this is that Facebook has basically chosen a path of governance. They are in — and so is Twitter — an intractable kind of situation in which they are forced to make these terrible content moderation decisions that govern speech, that have huge public ramifications, and that compromise human rights like freedom of expression and safety.

And at the same time, they’re private companies. And they’re private companies that operate transnationally and in the exchange of information. So they are a pretty big deal. In strength and power, they’re pretty much outside the ability of any one country to shut them down everywhere. Not even the United States could shut down Facebook or Twitter everywhere. They could shut it down in the United States briefly if they really wanted to. But that’s about it.

And so I think that you have to galaxy-brain yourself a little bit and make yourself go to a new place of: what is the world going to look like if entities like this exist, and they’re governing public rights? We need to figure out a way to democratize or hold accountable private companies governing public rights, especially rights like freedom of expression, rights that you have traditionally excepted from government control. Because governments are traditionally bad at telling us what to do with our speech, and dangerous when they tell us what to do with our speech. So it’s hard to make government the solution to this either.

And so I think that Facebook chose a governance path with the Oversight Board. And I think that they hoped that they would be shoving off these substantive decisions. What we saw yesterday was that that didn’t work out quite the way that they thought, or at least it isn’t so far.

I mean, there’s a couple of things that could happen out of this. The Oversight Board could be a total distraction. We never get anywhere with it. It gets disbanded in five years or whatever. But at least it was a pretty noble experiment and gave us a new valence or way of thinking about how to solve some of these problems.

The other thing is that it could take hold in the idea of people being entitled to boards like this, or [that] being able to avail themselves of boards like this becomes something that is either mandated and regulated by governments — right now Canada’s contemplating mandating the creation of oversight boards within their country for these speech platforms — or it’s something that the public simply demands of these speech platforms. And they have to put them in place themselves, and Twitter gets forced into it because it becomes the next wave of how we deal with platforms.

I have no idea. I think about this all the time. But I think that, at least for now, it looks like the Oversight Board is pretty serious. It really could have been people taking their checks and rubber-stamping Facebook’s decisions. It could have been nothing. But instead we got a 40-page decision that cites international human rights law and outs Facebook for not answering questions that were posed to it by the board it created to begin with. Right now, it looks like a pretty serious group and a pretty serious decision.

So the board submits a bunch of questions to Facebook as part of its Trump decision-making process. Facebook just says, “We’re not going to answer seven of your questions.” Is Facebook allowed to just not answer the board’s questions?

See, I love this. This is like, people and my students are asking, “Is this legal? Can they do this?” And I’m like, “Anything is legal or not legal until someone tells you not to do it.” Can they do this? Well, they just did. And I don’t know what we do to stop them. I mean, what we do to stop them is the board tells them they can’t.

And the board did that as much as they could, and they made it public, and it’s not been well-received that Facebook didn’t answer those questions. And I think that there are definitely a number of people at Facebook right now who are panicked over how to move forward with more requests from the board in the future. But think about it from this perspective: if a government was called before a court to answer questions and the government said, “I’m really sorry, court, we can’t tell you,” it just wouldn’t fly. You’d be in contempt of court. That’s the end of that.

I think that the interesting thing here is that it ends up being a real question of legal realism and public pressure and reputation for the company, like how bad it’s going to look if they spent $130 million in two years and 20 brilliant people’s time to do this, and then they don’t pay any attention to it.

One of the things that strikes me as you describe that process is, the United States Supreme Court likes to say that it makes very narrow decisions, but it actually has sweeping authority over American public life. Should our schools be segregated or not? The Supreme Court said not. And they kind of make these decisions that have sweeping impact over our lives. And they actually kind of restructure society.

The board’s power here is limited to, “Well, you took something down, but you should put it back up.” And it seemed like in this decision, they want the additional power to say, “How does your algorithm work? What do you see your algorithm promoting or disincentivizing or otherwise modifying in the conversations had on Facebook? And how does your business model plug into that algorithm and how do you make decisions about it?” And they just don’t have that power, and it seems like they really want it.

Completely. I also got that out of the opinion. I was in the House of Lords testifying about the Oversight Board with Alan Rusbridger, who’s on the board, former editor-in-chief of The Guardian. What he basically said in his House of Lords testimony was that they were coming after the algorithm. And I was like, “That’s interesting. That wasn’t in the charter.” So it was kind of foreshadowed, but I saw that in the opinion as well. I think that it’s really interesting, and I think that it is absolutely the right question, because as Jack Balkin says, the algorithm is the crown jewels of how all of this works and the real power source of Facebook.

I know for a fact, because I’ve read the founding documents so many times, that it is not contemplated at all that they’ll have any type of visibility into that, but I’ve always argued that that doesn’t mean anything. Just for the same reason you’re talking about it not being legal, there’s no reason that they can’t use their public pressure and authority and sway, which is really all that it is, to start asking these hard questions. And I think this is pretty soon to do it, but I think that it’s great that they’re going in that direction.

That leads me into the next idea: maybe five years from now every company will want an oversight board, or governments will demand that you have such a contraption connected to your company.

But if your expertise and your precedent is all about Facebook’s algorithm, then how on Earth can you connect that to TikTok or connect that to YouTube, which have wildly different business models, wildly different algorithmic inputs and outputs? These are different products. They have different formats. They have different business pressures.

Oh, you don’t. And I would actually say that the worst possible outcome would be to have one oversight board.

Well, I think this one kind of wants to be the one oversight board. They don’t even call themselves the Facebook Oversight Board. They’re just the Oversight Board.

That’s true. I thought that that was weird. And honestly, you want to know why I think that they did that? I think that they all are so…the name Facebook is so toxic that I think they don’t want to be associated with it by name. And this is something I went into in my Yale Law Journal article: the different ways that this could play out. One of the ways is basically that Twitter dumps $130 million into the trust corporation that I mentioned before and forms its own oversight board, to apply Twitter’s terms of service and Twitter’s community standards at whatever level Twitter wants to set those.

Or, Twitter makes its own, which is just as easy, I think. Twitter makes its own trust and its own oversight board and endows that and does their own thing.

And finally, the last thing is that I think it would be terrible for freedom of expression globally if we started to have one set of merged, industry-wide standards, so that you couldn’t have nudity on one platform and no nudity on another platform, or something like that, if that’s what you decided. I think the differentiation is key to preserving freedom of expression. But I think that you’re exactly right. That’s one of the things that was specifically contemplated by Zuckerberg when he started to create these documents.

Right. I think he said, “Well, maybe someday, other companies will use our board.”

He literally made the documents so that you could control-F, replace all, Facebook for Twitter. They’re really meant to be usable breakouts for other companies.

I mostly agree with you that it would be bad to have one weird public, quasi-private entity controlling all speech in America. It just seems bad on its face. On the flip side, there’s just an enormous amount of instability in people’s expectations of what they can do on services. The First Amendment means that it’s a free-for-all, which is good. The government can’t make speech regulations.

But the idea that Twitter will take something down and Facebook won’t, and YouTube will demonetize a creator for doing pranks that are too dangerous, and TikTok — people accuse TikTok, literally the algorithm of TikTok, of being racist all the time. I hear from the audience all the time, “I don’t know what’s going to happen. Where are the rules these platforms have to abide by, from a baseline?” And I think that’s how you get to this very popular political posturing that we should just impose the First Amendment, and they’ve got to do whatever the First Amendment says, which is somewhat nonsensical.

It’s not somewhat. It is literally nonsense, gobbledygook. It doesn’t make sense.

But I’m very sympathetic to where that comes from, that you’re going to go seek out some other authority that has this spiritual place in American life, and then everyone has to just do that.

Okay. I understand the impulse, but I don’t know how to square it with the fact that these are different companies with different roles, different algorithms. And yeah, if Twitter wants to be a little looser with its nudity standards than Facebook, that is actually a good thing for speech in America.

I understand the impulse. I’m going to be really mean about people for a second. I understand the impulse to stop, to turn off your brain and take the easiest possible solution that gives you absolutely meaningless results and won’t have any type of procedural fairness over time. Sure. That seems great. We did that with newsworthiness. Newsworthiness is a circularly defined concept that people rely on all the time and it actually means nothing. And it’s time we finally started talking about that. And just because the Supreme Court uses the language and circularly defines it still doesn’t mean that it’s not a problem philosophically to rely on that for a standard to police people’s speech on.

I really do understand the notion that people want there to be some answer out there that is going to solve this problem, but here’s the thing. You were just talking about this in terms of the US and the First Amendment. Only about 7 percent of Facebook’s global user base is in the US; mostly it’s everywhere else. The First Amendment isn’t even most users’ standard. Facebook talks about itself as a community. It’s not a community. It is a couple billion communities all overlapping on top of each other, that have almost nothing that necessarily binds them together.

A community is defined by a group of individuals that have a shared sense of norms and values and responsibilities. And there’s no global community that can even agree on whether we should allow female breasts online, let alone, when a Mexican cartel carries out a beheading, whether or not to let someone put that on a platform, or whether it’s too violent or it’s gore, or whether we should do something in between.

This is one of the other things that I’m excited about with this decision, because I think it starts to go to a place that is so much more useful and rigorous than how we’ve been having this conversation for the last 10, 15 years. It is time to stop letting people make these, “Oh, he’s a public figure. Oh, he’s a political figure. Oh, he’s newsworthy,” types of arguments and stop there. It’s time to get to the next level and dig deeper and figure out what it is we value and mean by that. And to your point about all this stuff about TikTok being racist, Twitter making arbitrary decisions, all of this stuff: those complaints have grounded, intuitive roots in procedural justice and the rule of law that we can start to tie some of these things back to. And if we could start to have procedures around some of this stuff, then once we apply the substantive rules, they won’t seem so arbitrary and capricious, and these companies won’t seem so unaccountable.

What’s interesting about that is we keep coming back to the laws and courts, which are fundamentally governmental functions and powers. At least in this country, there’s just not a way to have a speech court like that. So these all have to be corporate powers and enforcement mechanisms.

At least here in America in 2021, I cannot see everyone coming together and agreeing that this corporation, this LLC, has the power over speech on one of the largest platforms, and that its decisions are going to carry the psychic weight of a Supreme Court decision.

Well, to your point, the US courts have pretty much passed the buck on this all the time. They don’t make decisions on the substantive nature of viewpoint discrimination. They just say, “There can or cannot be viewpoint discrimination, and this is how we’re going to determine this.” I think that what you’re going to see is that Facebook’s going to still get to substantively decide what its policies are, but they’re just going to have to be fair and consistent and proportionate in how they enforce them. And right now, that is the biggest hurdle, the thing that people who have worked around content moderation are the most upset about. It’s not that Trump goes up or comes down when he incites violence or lauds a dangerous org. It’s that there is a different rule for you and for me, and for Trump and for Alex Jones, and that we don’t know what any of those decisions are, what any of those rules are.

People are constantly having unfair outcomes. I think it was an 80 percent error rate on content moderation decisions. I’m just like, “That’s nuts.” Can we just work on getting that lower for a while, never mind keeping Trump down? That’s a lot of people being censored. At the end of the day, this is really about just establishing some procedures. The substantive decisions, you’re right, everyone’s going to fight about them. No one’s going to be happy about them. There are lots of laws people don’t agree with now, too, but they feel protected by the fact that there’s transparency and accountability in how they’re enforced. Well, kind of, depending on plenty of other systemic things, but that’s something that I think we can get into once we have this baseline.

You mentioned the error rate of moderation decisions. Connect this spectrum for me. We’ve done a lot of coverage of individual moderators at Facebook and their working conditions and how they feel, and the fact that they have fundamentally bad jobs and often get PTSD afterwards.

How should a contractor working in a Facebook moderation shop feel about the Oversight Board? What is the relationship that they should have? And what is the relationship they have now?

Oh, I think they should be very excited about this. In fact, some of the most interesting calls and texts I got yesterday from people inside the company, inside Facebook, were from people, as I would call it, on the factory floor or in the policy shop, who were very happy, because they had agitated for these types of changes for a long time and felt like this gave them the clout and authority that they needed to put forward a rigorous new agenda, instead of trying to rework the same terrible, ad hoc rules.

And so I think that that’s going to filter down to content moderators and make their jobs easier. I will say that the fact that we are outsourcing this labor in this way, and using individuals to cleanse things, continues to be something we need to talk about. We still talk about it like there is this…you talk about the algorithm — everything from the data that people generate just by being on the site and where their eyeballs are going, to people making the content moderation decisions — and I think that the algorithm is probably less sophisticated than we think.

Inside of Facebook, one thing that we’ve heard a lot about is the content moderation shop is connected to their political operation shop; that the lobbyists of Facebook are the people who write the rules for speech on Facebook, and that is deeply problematic.

That’s gestured at in this opinion, but they’ve now kicked it back to that same shop, which has faced any number of controversies over the years. Is that something that Facebook needs to change to make all of this more credible?

I have heard that they’re starting to. I heard this through other people who have heard it, so this is getting deep into hearsay and the rumor mill inside Silicon Valley. But I think that they definitely want to separate product and policy more distinctly. I think that that’s been happening for a while, and I think the Oversight Board is a huge part of that.

But I think that also, they are trying to get away from the idea that Joel Kaplan is in charge of so much and doing so much, and trying to put more onto Nick Clegg. I think that the New York Times article that came out today is kind of part of that. I think that there is a desire to set Nick up more as a policy head. Maybe it’s too little, too late, but I have no idea.

So we’re looking at this whole sweep of the Oversight Board being created, this big decision being referred to it, it asserting itself in saying, “No, you actually have to make a policy,” kicking it back to Facebook. In six months, they’ll send it back to the Oversight Board. What should regular Facebook users be looking for next from this process?

One of the things that’s been lost in all of the nonsense around Trump, and honestly, I know that people are like, “The Oversight Board is a distraction.” I feel like Trump is a distraction. I mean, for always and for all time, he has been a distraction from so much, but also from [Indian Prime Minister Narendra] Modi and from [Brazilian President Jair] Bolsonaro and all of the other leaders that are still in power that are threatening, and still on Facebook. I think that this is just one moment. So I think the next thing that users can expect is that whatever happens coming out of this next six months is going to have a huge impact on other types of world leaders.

The other thing I was going to say is that one of the things that’s been lost in all of the emphasis on the Trump suspension is that in the last couple of weeks, Facebook actually had implemented something that they had promised to do eventually, but we never knew how soon or when, which was to start putting their decisions to keep up content that had been flagged by other users into the jurisdiction of the board on appeal. So this means that if I find something that you said offensive and I flag it to Facebook as being lauding dangerous organizations or inciting violence, and they say, “No, it’s fine. We decided to leave it up,” I can now appeal that decision.

I think that that’s a huge deal, because it puts the board into both the role of being watchers of the censors and now being the censors themselves, basically being like, “No, that speech is too harmful. It has to come down.” I think that that’s actually going to be a really weird thing for a lot of these people to do, because I think a lot of these people are used to being like, “No, that has to go back up for the sake of freedom of expression.” I think it’s going to be a lot harder to take down certain people’s speech when it’s harmful.

Yeah. I think you actually see that in the Trump decision where, as they keep referencing the minority, they keep saying, “The minority would have gone farther.” That balance to me seems incredibly fascinating.

Kate, I suspect we’re going to have you back on the show a lot as the Facebook Oversight Board continues its metamorphosis into something credible. Thank you so much for being on the show.

Thank you so much for having me. Well, it was really fun.
