Even when users tell YouTube that they are not interested in certain types of videos, similar recommendations keep coming, a new study from Mozilla has found.
Using video recommendations from more than 20,000 YouTube users, Mozilla researchers found that buttons like “not interested,” “dislike,” “don’t recommend channel,” and “remove from watch history” are largely ineffective at preventing similar content from being recommended. Even at their best, these buttons still let through more than half of the recommendations similar to videos a user said they weren’t interested in, the report found. At worst, the buttons barely made a dent in blocking similar videos.
To collect data from real videos and users, Mozilla researchers enlisted volunteers who used the foundation’s RegretsReporter, a browser extension that overlays a generic “Stop Recommending” button on YouTube videos viewed by participants. On the back end, users were randomly assigned to a group, so every time they clicked the button Mozilla placed, a different signal was sent to YouTube: dislike, not interested, don’t recommend channel, or remove from history. A control group sent no feedback to the platform at all.
Using data collected from more than 500 million recommended videos, research assistants created more than 44,000 pairs of videos: one “rejected” video, plus a video subsequently recommended by YouTube. Researchers then assessed the pairs themselves or used machine learning to decide whether the recommendation was too similar to the video a user had rejected.
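A minimal sketch of that pair-labeling step, assuming a similarity scorer (standing in for the human raters or the ML classifier) that returns a value between 0 and 1; the names and the threshold here are hypothetical, not taken from the report.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class VideoPair:
    rejected_id: str     # video the user told YouTube they didn't want
    recommended_id: str  # video YouTube recommended afterwards

def is_bad_recommendation(pair: VideoPair,
                          similarity: Callable[[str, str], float],
                          threshold: float = 0.5) -> bool:
    """Flag a recommendation as 'bad' when it scores above a
    similarity threshold against the rejected video."""
    return similarity(pair.rejected_id, pair.recommended_id) >= threshold
```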
Compared to the baseline control group, sending the “dislike” and “not interested” signals was only “marginally effective,” preventing 12 percent and 11 percent of bad recommendations, respectively. The “don’t recommend channel” and “remove from history” buttons were slightly more effective, preventing 43 percent and 29 percent of bad recommendations, but researchers say the tools offered by the platform are still inadequate for steering away unwanted content.
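The “percent prevented” figures above are relative to the control group. As a hypothetical illustration of that arithmetic (the rates below are made up for the example, not Mozilla’s numbers):

```python
def prevention_rate(bad_rate_treatment: float, bad_rate_control: float) -> float:
    """Fraction of bad recommendations prevented relative to the control group."""
    if not 0 < bad_rate_control <= 1:
        raise ValueError("control rate must be in (0, 1]")
    return 1.0 - bad_rate_treatment / bad_rate_control

# If 30% of control-group recommendations were bad but only 17% were bad for
# users sending "don't recommend channel", the button prevented ~43%.
print(round(prevention_rate(0.17, 0.30), 2))  # 0.43
```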
“YouTube should respect the feedback users share about their experience and treat them as meaningful signals about how people want to spend their time on the platform,” researchers write.
YouTube spokesperson Elena Hernandez says this behavior is intentional because the platform doesn’t try to block all content related to a topic. Hernandez also criticized the report, saying it doesn’t take into account how YouTube’s controls are designed.
“Importantly, our controls don’t filter out entire subjects or points of view, as this could have negative effects for viewers, such as creating echo chambers,” Hernandez told The Verge. “We welcome academic research on our platform, which is why we recently expanded access to the Data API through our YouTube Researcher Program. Mozilla’s report doesn’t take into account how our systems actually work, so it’s difficult for us to glean many insights.”
Hernandez says Mozilla’s definition of “similar” doesn’t take into account how YouTube’s recommendation system works. The “not interested” option removes a specific video, and the “don’t recommend channel” button prevents the channel from being recommended in the future, Hernandez says. The company says it does not aim to stop recommending all content related to a topic, opinion, or speaker.
In addition to YouTube, other platforms such as TikTok and Instagram have increasingly introduced feedback tools that supposedly let users train the algorithm to show them relevant content. But users often complain that even when they indicate they don’t want to see something, similar recommendations persist. It’s not always clear what different controls actually do, says Mozilla researcher Becca Ricks, and platforms aren’t transparent about how feedback is taken into account.
“I think in the case of YouTube, the platform strikes a balance between user engagement and user satisfaction, which is ultimately a tradeoff between recommending content that leads people to spend more time on the site and content the algorithm thinks people will like,” Ricks told The Verge via email. “The platform has the power to tweak which of these signals is given the most weight in its algorithm, but our study suggests that user feedback may not always be the most important signal.”