Apparently YouTube's algorithm doesn't care if you 'thumbs down' videos

A photo of a screen on YouTube with the mouse hovering over the dislike button.

YouTube already stopped displaying the number of dislikes a video has received, but apparently giving a video a thumbs down doesn’t change the number of similar videos the platform recommends to you.
Photo: A Chiwi (Shutterstock)

My YouTube recommendations are full of old reruns of Gordon Ramsay’s Kitchen Nightmares. It’s partly my own fault, maybe, that I got drunk one night and watched a full episode. Let me tell you, if there’s one thing I don’t want on my feed anymore, it’s the famously ruthless Brit tearing down another chef while the world’s most obnoxious sound effects (braaa-reeeee) play in the background. I disliked plenty of those videos, but now I have Hell’s Kitchen popping up on my page, and I’m feeling more and more like one of the “raw” steaks Ramsay prods and scolds.

But apparently, I’m not alone with my YouTube recommendation issues. A report by the Mozilla Foundation released Monday asserts, based on survey and crowdsourced data, that YouTube’s “Dislike” and “Don’t recommend channel” buttons don’t actually do much to change video recommendations.

There are really two points here. One is that users consistently feel the controls provided by Google-owned YouTube don’t actually make a difference. Second, based on the data collected from users, the controls have a “negligible” effect on recommendations, meaning “most unwanted videos still slip through.”

The foundation relied on data from its own RegretsReporter browser plug-in, a tool that lets users block certain YouTube videos from appearing in their feed. The report says it based its analysis on 2,757 survey respondents and 22,722 people who granted Mozilla access to more than 567 million video recommendations collected from the end of 2021 through June 2022.

Although the researchers admit the survey respondents aren’t a representative sample of YouTube’s large and diverse audience, a third of respondents said that using YouTube’s controls didn’t seem to change their video recommendations at all. One user told Mozilla they would flag videos as misleading or spam, only to see them come back to their feed later. Respondents often said that blocking a channel would only lead to recommendations from similar channels.

YouTube’s algorithm recommends videos that users don’t want to see, and the problem is often worse than old Ramsay cable reruns. A 2021 report from Mozilla, also based on crowdsourced user data, claimed that people browsing the video platform were regularly recommended violent content, hate speech, and political misinformation.

In this latest report, Mozilla researchers found that video pairs including one a user had rejected, such as a Tucker Carlson screed, would just result in another video from Fox News’ YouTube channel being recommended. Based on a review of 40,000 video pairs, the researchers found that when a channel is blocked, the algorithm often simply recommends very similar videos from similar channels. Using the “Dislike” or “Not interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared to a control group. The “Don’t recommend channel” and “Remove from watch history” buttons were more effective at correcting users’ feeds, but only prevented 43% and 29% of unwanted recommendations, respectively.
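To make the “prevented compared to a control group” framing concrete, here is a minimal sketch of how such a rate can be computed, assuming a simple comparison of unwanted-recommendation rates between users who pressed a control and a control group who didn’t. The function name and example numbers below are invented for illustration; Mozilla’s actual methodology is more involved.

```python
# Minimal sketch: compare the rate of unwanted ("bad") recommendations
# seen by users who pressed a control (e.g. "Dislike") against a control
# group who did not. All numbers are made up for illustration.

def prevention_rate(bad_treatment: int, total_treatment: int,
                    bad_control: int, total_control: int) -> float:
    """Fraction of unwanted recommendations avoided relative to control."""
    treatment_rate = bad_treatment / total_treatment
    control_rate = bad_control / total_control
    return 1.0 - treatment_rate / control_rate

# A "Dislike"-sized effect: 88 bad recs per 1,000 vs. 100 per 1,000
print(f"{prevention_rate(88, 1000, 100, 1000):.0%}")  # -> 12%
```

On this reading, a 12% figure means pressing the button removed only about one in eight of the unwanted recommendations a user would otherwise have seen.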

“In our analysis of the data, we determined that YouTube’s user control mechanisms are inadequate as tools to prevent unwanted recommendations,” the Mozilla researchers wrote in their study.

YouTube spokeswoman Elena Hernandez told Gizmodo in an email that “our controls do not filter entire topics or views, as this may have negative effects for viewers, such as creating echo chambers.” The company said it doesn’t block all content on related topics from being recommended, but it also claims to push “authoritative” content while suppressing “borderline” videos that come close to violating its content moderation policies.

In a 2021 blog post, Cristos Goodrow, vice president of engineering at YouTube, wrote that the system is “constantly evolving,” but that providing transparency into the algorithm “isn’t as simple as listing a formula” for recommendations, since the systems take into account clicks, watch time, survey responses, sharing, likes, and dislikes.
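As a purely hypothetical sketch of what blending those signals could look like, here is a toy scoring function. Every weight and field name below is invented; YouTube’s real ranking system is a learned model whose details aren’t public, not a hand-tuned formula like this.

```python
# Toy illustration only: NOT YouTube's actual formula. A recommender
# that blends the signals Goodrow lists into a single ranking score.
# All weights and field names are invented for this example.

from dataclasses import dataclass

@dataclass
class VideoSignals:
    clicks: float        # click-through rate
    watch_time: float    # normalized watch time
    survey_score: float  # satisfaction survey responses
    shares: float        # share rate
    likes: float         # like rate
    dislikes: float      # dislike rate

def rank_score(s: VideoSignals) -> float:
    """Hypothetical linear blend; weights are arbitrary."""
    return (0.2 * s.clicks + 0.4 * s.watch_time + 0.2 * s.survey_score
            + 0.1 * s.shares + 0.1 * s.likes - 0.1 * s.dislikes)

print(rank_score(VideoSignals(0.5, 0.8, 0.7, 0.1, 0.3, 0.05)))
```

A blend like this also hints at Mozilla’s complaint: if the dislike signal carries a small weight relative to something like watch time, a thumbs down barely moves a video’s score.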

Of course, like all social media platforms, YouTube has struggled to create systems that can combat the full breadth of bad or even predatory content being uploaded to the site. A forthcoming book shared exclusively with Gizmodo said YouTube had billions of dollars in ad revenue at stake as it dealt with the weird and disturbing videos being recommended to kids.

While Hernandez claimed the company has expanded its Data API, the spokesperson added that “Mozilla’s report does not take into account how our systems actually work, and therefore it is difficult for us to gather much information.”

But that’s a criticism Mozilla lays at Google’s feet as well, saying the company doesn’t give researchers enough access to assess what affects YouTube’s secret sauce, a.k.a. its algorithms.
