YouTube’s recommendations pushed election denial content to election deniers

YouTube’s recommendation algorithm pushed more videos about election fraud to people who were already skeptical about the 2020 election’s legitimacy, according to a new study. Relatively few of the videos in the study concerned election fraud, but the most skeptical YouTube users saw three times as many of them as the least skeptical users.

“The more susceptible you are to these types of narratives about the election…the more you would be recommended content about that narrative,” says study author James Bisbee, who’s now a political scientist at Vanderbilt University. The research was done through the Center for Social Media and Politics at New York University.

In the wake of his 2020 election loss, former President Donald Trump has promoted the false claim that the election was stolen, calling for a repeat election as recently as this week. While claims of voter fraud have been broadly debunked, promoting them continues to be a lucrative tactic for conservative media figures, whether in podcasts, films, or online videos.

Bisbee and his research team were studying how often harmful content in general was recommended to users, and their study happened to run during the 2020 election period. “We were overlapping with the US presidential election and then the subsequent spread of misinformation about the outcome,” he says. So they took advantage of the timing to look specifically at how the algorithm recommended content around election fraud.

The research team surveyed over 300 people with questions about the 2020 election, asking, for example, how concerned they were about fraudulent ballots and about interference by foreign governments. People were surveyed between October 29th and December 8th, and those surveyed after Election Day were also asked whether the outcome of the election was legitimate. The research team also tracked participants’ experiences on YouTube. Each person was assigned a video to start on and was then given a path to follow through the site, such as clicking on the second recommended video each time.
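
That traversal rule is simple enough to express in a few lines of code. Below is a minimal sketch of the kind of path-following procedure the study describes; `get_recommendations` is a hypothetical helper standing in for however the researchers actually captured each video’s ordered recommendation list, and the step count is illustrative, since neither detail is spelled out here.

```python
def follow_recommendation_path(start_video_id, get_recommendations,
                               rank=2, depth=20):
    """Follow a fixed rule through a recommendation graph.

    Starting from an assigned seed video, repeatedly "click" the
    recommendation at position `rank` (rank=2 means the second
    recommended video each time), recording every video visited.
    `get_recommendations` is a hypothetical callable returning the
    ordered list of recommended video IDs for a given video; the
    default `depth` of 20 steps is an assumption, not a figure
    from the study.
    """
    path = [start_video_id]
    current = start_video_id
    for _ in range(depth):
        recs = get_recommendations(current)
        if len(recs) < rank:  # too few recommendations to continue
            break
        current = recs[rank - 1]  # 1-indexed: rank=2 -> second video
        path.append(current)
    return path
```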

The team went through all the videos shown to participants and identified the ones that were about election fraud. They also classified each video’s stance: whether it was neutral about claims of election fraud or endorsed election misinformation. The top videos associated with promoting claims of election fraud were videos of press briefings from the White House channel and videos from NewsNow, a Fox News affiliate.

The analysis found that the people who were most skeptical of the election were recommended an average of eight more videos about election fraud than the people who were least skeptical: skeptics saw an average of 12 such videos, and non-skeptics saw an average of four. The types of videos differed as well; the videos seen by skeptics were more likely to endorse election fraud claims.
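
As a toy illustration of that comparison, the sketch below groups participants by a skepticism score and takes the difference in average fraud-video counts between the groups. The records, the score scale, and the 0.5 grouping threshold are all invented for illustration; the per-person counts are chosen only to echo the averages the study reports.

```python
from statistics import mean

# Hypothetical records: (skepticism_score, fraud_videos_recommended).
participants = [
    (0.9, 12), (0.8, 11), (0.7, 13),  # more skeptical participants
    (0.3, 5), (0.2, 4), (0.1, 3),     # less skeptical participants
]

skeptics = [n for score, n in participants if score >= 0.5]
non_skeptics = [n for score, n in participants if score < 0.5]

# Gap in the average number of fraud-related recommendations.
print(mean(skeptics) - mean(non_skeptics))  # 8.0 with these toy numbers
```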

The people who participated in the study were more liberal, better educated, and more likely to identify as Democrats than the United States population overall. So their media diets and digital information environments might already skew left, which could mean the number of election fraud videos shown to the skeptics in this group is lower than it would have been for skeptics in a more conservative group, Bisbee says.

But the number of fraud-related videos in the study was low overall: people saw around 400 videos in total, so even 12 videos about election fraud made up a small share (roughly 3 percent) of their overall YouTube diet. People weren’t inundated with the misinformation, Bisbee says. And the number of videos about election fraud on YouTube dropped off even further in early December, after the platform announced it would remove videos claiming that there was voter fraud in the 2020 election.

YouTube has instituted a number of features to fight misinformation, both removing videos that violate its rules and promoting authoritative sources on the homepage. In particular, YouTube spokesperson Elena Hernandez reiterated in an email to The Verge that platform policy doesn’t allow videos that falsely claim there was fraud in the 2020 election. However, YouTube has more permissive policies around misinformation than other platforms, according to a report on misinformation and the 2020 election, and took longer than other platforms to implement those policies.

Broadly, YouTube disputed the idea that its algorithm was systematically promoting misinformation. “While we welcome more research, this report doesn’t accurately represent how our systems work,” Hernandez said in a statement. “We’ve found that the most viewed and recommended videos and channels related to elections are from authoritative sources, like news channels.”

Crucially, Bisbee sees YouTube’s algorithm as neither good nor bad: it simply recommends content to the people most likely to respond to it. “If I’m a country music fan, and I want to find new country music, an algorithm that suggests content to me that it thinks I’ll be interested in is a good thing,” he says. But when the content is extremist misinformation instead of country music, the same system can create obvious problems.

In the email to The Verge, Hernandez pointed to other research finding that YouTube does not steer people toward extremist content, such as a study from 2020 that concluded recommendations don’t drive engagement with far-right content. But the new study’s results do contradict some earlier work, Bisbee says, particularly the consensus among researchers that people self-select into misinformation bubbles rather than being driven there by algorithms.

In particular, Bisbee’s team did see a small but significant push from the algorithm toward misinformation for the people most inclined to believe it. That nudge might be specific to information about election fraud; the study can’t say whether the same holds for other types of misinformation. It does mean, though, that there’s still more to learn about the role algorithms play.

Update September 1st, 1:58PM ET: Updated to include the institution where the research was conducted.
