
YouTube's algorithm pushes right-wing, explicit videos regardless of user interest or age, study finds

You may think you're too smart to fall for a conspiracy theory. Your social media is dedicated to cat videos, Trader Joe's hauls and Saturday Night Live sketches. You think you're safe in this self-created online bubble.

But according to a recent study published by the Institute for Strategic Dialogue (ISD), a nonprofit that studies online extremism, it doesn't matter what your interests are. If you use YouTube, the video-sharing platform's algorithm will eventually start serving you misinformation and problematic content.

YouTube's recommendation algorithm is a massive traffic driver for the platform. Recommended videos account for 70% of all video views, Neal Mohan, then YouTube's chief product officer and now its CEO, said in 2018, compared with videos found through a search or clicked on from outside sources.

For years, there have been questions swirling around YouTube's algorithm. In 2019, YouTube vowed in a blog post to "improve recommendations" after an investigation by the Wall Street Journal found that the platform was presenting "divisive, misleading or false content" in its recommendations, leading "users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content."

But it’s not clear how much, if at all, the algorithm has improved, according to ISD’s findings published five years after YouTube’s blog post.

ISD researchers spent a week building YouTube profiles, each tailored to a specific age and gender and assigned one of four general interests: gaming, male lifestyle gurus, mommy vloggers or Spanish-language news.

“What we wanted to do was actually start not from the point of extreme ideas, but start from kind of more mainstream ideas and mainstream subject matters to see what the algorithm would then produce,” Aoife Gallagher, a senior analyst on the project, told Yahoo News.

Gallagher explained how the categories were selected: gaming and male lifestyle content have earned a reputation as a pipeline to more extremist ideas, and researchers observed an uptick in female influencers embracing wellness- and health-related conspiracy theories during the pandemic. And because most research on online conspiracies focuses on English-language videos and influencers, Gallagher said, Spanish-speaking internet users are an under-researched group.

Initially, the researchers built the personas by intentionally searching for and watching videos that fit each profile's interests. Then the group let the accounts run on autoplay for about a month to analyze how YouTube's recommendations evolved.

Gallagher said the team didn't have set expectations for what the algorithm would surface, although some findings didn't surprise her.

“We had suspected that someone like Andrew Tate might appear in the [male lifestyle] recommendations,” she said.

YouTube's algorithm doesn't distinguish content by quality; it gravitates toward videos with high traffic and engagement. A significant number of those videos are sensationalist or controversial, and Tate, a self-proclaimed misogynist, fits the bill. (Tate has been charged with human trafficking and rape, allegations he has denied, and is banned from YouTube, among other social platforms.)

But what surprised researchers was that the profile set up as a 13-year-old boy interested in male lifestyle content was recommended more Tate videos than the one set up as a 30-year-old man with the same interest.

“Videos of Andrew Tate were also recommended to both the child and adult accounts despite neither account showing an interest in him,” the ISD report says. “YouTube did not place any age restrictions or content warnings on these videos.”

Elena Hernandez, a YouTube spokesperson, told Yahoo News in response to the ISD study that while the company “welcomes research” on its algorithm, “it’s difficult to draw conclusions based on test accounts created by the researchers, which may not be consistent with the behavior of real people.”

“We continue to invest significantly in the policies, products and practices to protect people from harmful content, especially younger viewers,” Hernandez said.

For gaming videos, the study found that gender didn’t play a big role in what was recommended. But Gallagher, who focused on the gaming investigation, still called some findings “shocking,” such as the number of recommended videos about Minecraft, a popular video game, that featured sexually explicit or violent content.

Gallagher couldn’t explain why these videos were being recommended to profiles set up for a 14-year-old boy and a 14-year-old girl.

“The thing about this project and this kind of analysis is that it allows us to understand what has been recommended to users, but it doesn't allow us to understand why,” she said. “Those answers lie with YouTube.”

In a June 17 op-ed for the New York Times, Surgeon General Vivek Murthy called on Congress to require social media platforms to carry warning labels for young users. Gallagher compared the idea to putting a Band-Aid on a gushing wound.

“Labels, they may help, but I don’t think they’re going to tackle the fundamental issues,” Gallagher said. “Always on my list is more data transparency, more transparency from the platforms — give researchers and journalists and academics, give them access to the data around the algorithm so we can actually understand how they work.”
