Nick Clegg doesn’t think Facebook is polarizing

There is no shortage of criticisms that get leveled at Facebook: it’s spreading misinformation and hate speech, it’s too polarizing, it’s responsible for fraying the very fabric of society. The list goes on.

This morning, Facebook’s VP of global affairs, Nick Clegg, published a lengthy Medium post addressing some of these criticisms and unveiled some changes the company is making to give users more control over their experience. Specifically, the company is going to allow Facebook users to customize their feeds and how the algorithm presents content from other Facebook users to them. “People should be able to better understand how the ranking algorithms work and why they make particular decisions, and they should have more control over the content that is shown to them,” Clegg writes. “You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes—to alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform.”

There’s a lot to discuss there. And to help us unpack the post, Clegg sat down with Platformer editor and Verge contributing editor Casey Newton yesterday for a special episode of Decoder.

In particular, Clegg does not think Facebook is designed to reward provocative content, which is a new rebuttal to the company’s critics (and likely a surprise to anyone who’s paid attention to their Facebook feeds). “The reality is, it’s not in Facebook’s interest – financially or reputationally – to continually turn up the temperature and push users towards ever more extreme content,” Clegg writes in his post. “Bear in mind, the vast majority of Facebook’s revenue is from advertising. Advertisers don’t want their brands and products displayed next to extreme or hateful content – a point that many made explicitly last summer during a high-profile boycott by a number of household-name brands.”

Fundamentally, Clegg’s argument is that the Facebook backlash isn’t rooted in fact or science, and that if it gets carried away, we’ll never get to the better version of the internet that a lot of us want. In making that argument, he’s trying to reset the debate on Facebook’s terms.

We’ll leave it to you to decide how successful he is.

Okay, Casey Newton with Nick Clegg, VP of global affairs at Facebook. Here we go.

Below is a lightly edited excerpt from their conversation. This post will be updated with a full transcript of the interview on Friday, April 2nd.

Most of us, we’re not math majors or computer science majors. And so there is some sort of fear and uncertainty about what’s going on in the background. One of the things that Facebook is now doing is giving us some new ways to change up what we see in the News Feed. So what are some of these new controls?

So some of the controls are old. We’ve had them for a while, but we’re just going to make them a lot more prominent. So for instance, you could always switch to a chronological feed. But candidly, it wasn’t easy for people to find. So we’re now going to have a feed filter bar. When you scroll to the top of your feed, it’ll be there. It’ll always be there, and you can toggle between the feed as it currently exists, a chronologically ordered feed, or, crucially, and this is new, your own feed of favorites — of favorite groups, friends, posts, and so on. And you’ll be able to curate that, if you like, for yourself and toggle between those three — the feed as it is, the chronological feed, and your new favorites feed — in a much, much more effortless way.

It’ll be much more visible. It’ll be visible there when you scroll to the top of your feed. There are other new controls as well, which I’m announcing this week. You’ll be able to curate with much greater granularity than before who can comment on your posts. And that is something which wasn’t available before. And we’re also going to extend something which has existed for ads, for instance, and for connected content. Namely, why am I seeing this? So you can go to the three dots and you can see, “Why am I seeing this ad?” We’re now going to extend that to suggested content. So when something’s suggested to you, that cooking video, you can go on the three dots, and you can see why you’re seeing that.

So I think, collectively, it’s a start. I’m not going to pretend that those changes in and of themselves will lift all the questions that people have about how social media operates and how they interact with Facebook. But I do feel that they are significant steps in a better direction, putting users more in charge, being more open and transparent about things, and we will follow up with a number of additional steps, greater transparency, greater controls in the months to come.

Is that also a suggestion that maybe this is the beginning of this feed filter bar that you’re introducing, that that might have more filters that come to it over time? Is the idea that users will have more and more control over how the stuff they see is ranked?

Yeah. Look, in an ideal world, you just want to push ever more forcefully in the direction where people can personalize their feeds. And if people want to see more or see less of particular forms of content, from particular pages or groups, there is also, conceptually, at least the possibility of exploring whether people can or can’t, if you like, turn the dial up or down on particular classes of content. That’s exactly the kind of work that we want to do. Now, exactly how granular, exactly which dials apply to which kinds of content, all of that still needs to be filled in. But that is very much the direction we’re going.

So the conventional wisdom about how the feed works now, I think for a lot of folks, and certainly of the folks who are most critical of Facebook, is that it rewards the most polarizing and outrageous content. And this is something that you really take on in this piece and push back against. I suspect if there’s one sentence in your piece that most people will take issue with, it’s when you write, “Facebook’s systems are not designed to reward provocative content.” At the same time, when we look at lists of pages that get the most engagement, it does tend to be pages that seem to be pushing really polarizing content. So how do you reconcile this at Facebook?

Well, firstly, I of course accept that we need to provide more and more data and evidence about what specific content is popular on News Feed. And then, although Facebook’s critics often talk about sensational content dominating News Feed, of course we want to show, as I think we can, that many of the most popular posts on News Feed are lighthearted. They’re feel-good stories. We want to show people that the overwhelming majority of the posts people see on News Feed are about pets, babies, vacations, and similar. Not incendiary topics. In fact, I think on Monday, one of the most popular posts in the US was a mother bear with three or four baby cubs crossing a road. I saw it myself. It’s lovely. I strongly recommend that you look at it. And I think we can, and we will, do more to substantiate that.

But beyond that, I do think, and I do try to grapple with this as thoroughly as is possible in a 5,000-word piece. Firstly, the signals that are used in the ranking process are far more complex, far more sophisticated, and have far more checks and balances in them than are implied by this cardboard-cutout caricature that somehow we’re just spoon-feeding people incendiary, sensational stuff. And I’m happy to go into the details if you like, but thousands of signals are used, literally from the device that you use to the groups that you’re a member of and so on. We’re using more and more survey evidence, and we’ll be doing more of that in the future as well, to ask people what they find most meaningful. There’s been a big shift in recent years anyway to reward content that is more meaningful, your connections with your families and friends, rather than stuff that is just crudely engaging — pages from politicians and personalities and celebrities and sports pages and so on.

So that shift has already been underway. But in terms of incentives, this is the bit that maybe we have not been articulate enough about. Firstly, the people who pay for our lunch don’t like their content appearing next to incendiary, unpleasant material. And if you needed any further proof of that, this last summer, a number of major advertisers boycotted Facebook because they felt we weren’t doing enough on hate speech. We’ve been getting much better at reducing the prevalence of hate speech. The prevalence of hate speech is now down to, what, 0.07, 0.08 percent of content on Facebook. So for every 10,000 pieces of content you see, seven or eight might be bad. I wish it were down to zero. I don’t think we’ll ever get it down to zero. But we have a massive incentive to keep reducing it.
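As a quick sanity check of the arithmetic in that answer, a prevalence of 0.07–0.08 percent does indeed work out to roughly seven or eight pieces of violating content per 10,000 views. The sketch below is illustrative only; the function name is ours, and the underlying figures are Facebook’s own.

```python
# Convert a prevalence percentage (Facebook's reported metric) into
# the expected number of violating views per 10,000 content views.

def bad_views_per_10k(prevalence_percent: float) -> float:
    """Expected violating views per 10,000, given a prevalence percentage."""
    return prevalence_percent / 100 * 10_000

for p in (0.07, 0.08):
    print(f"{p}% prevalence -> ~{bad_views_per_10k(p):.0f} per 10,000 views")
```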

But also, if you think about it, if you’re building a product which you want to survive for the long term, where you want people in 10 years, in 15 years, in 20 years to still be using these products, there’s really no incentive for the company to give people the kind of sugar rush of artificially polarizing content, which might keep them on board for 10 or 20 minutes extra. We want to solve for 10 or 20 years, not for 10 or 20 extra minutes. And so I don’t think our incentives are pointed in the direction that many people assume.

That all being said, it is of course true, as any sub-editor of a newspaper will tell you. It’s why tabloids have used striking imagery and cage-rattling language on their front pages since time immemorial. There are emotions of fear, of anger, of jealousy, of rage, which of course provoke emotional responses. They’ve done so in all media, at all times. And so of course emotive content provokes an emotive reaction amongst people. We can’t reprogram human nature, and we don’t want to deny that, which is why our CrowdTangle tool actually elaborates on that and shows how things have been engaged with. But as you know, there is a world of difference between that which is most engaged with — in other words, where comments and shares are most common — and the content that most people actually see. Those are quite, quite different. If you look at eyeballs rather than comments and shares, you get a quite different picture.
