Proteams Information Tech

YouTube Regrets: the dark downside of the YouTube algorithm


Many people enjoy watching cooking videos, yoga tutorials, or simply listening to music on YouTube, and when the algorithm gets it right, the experience is pleasant: a continuous stream of similar content, with no time wasted flicking through what you do and don’t want to see. But some particularly harmful videos make their way into recommendations, and this is where the algorithm goes wrong quickly. Conspiracy theories, sexualised content, and even the mistreatment of animals are among the things often promoted on the media giant’s platform, largely because these are the kinds of videos that go viral. Mozilla carried out a crowdsourced investigation into YouTube’s recommendation algorithm, called “YouTubeRegrets”, giving participants a browser extension called RegretsReporter so they could report instances where the algorithm messes up and bring them to the attention of those working at YouTube.

The study began in July 2020 and ran for ten months with around 37,000 volunteers. The browser extension let participants flag a video as problematic and recorded whether the video had been recommended to them or chosen by them directly. Around 71% of the videos flagged by volunteers had been recommended by the algorithm. Some of the reported videos were clearly harmful yet could be viewed by anyone, including content promoting 9/11 conspiracies and encouraging white supremacy. One parent involved in the study said their ten-year-old daughter had been looking for dance videos and instead ended up on a reel of extreme dieting videos, which led her to restrict her eating and become concerned with her looks to an extreme degree. Between July 2020 and May 2021, volunteers flagged around 3,300 “regrettable” videos from 91 countries around the world.
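To make the reporting mechanism concrete, below is a minimal TypeScript sketch of what a single report from a RegretsReporter-style extension might capture. The interface, field names, and endpoint are illustrative assumptions for this post, not Mozilla’s actual schema.

```typescript
// Hypothetical shape of one report from a RegretsReporter-style extension.
// Illustrative sketch only; the real extension's data model differs.
interface RegretReport {
  videoId: string;            // ID of the video being flagged
  flaggedAt: string;          // ISO timestamp of the report
  viaRecommendation: boolean; // true if reached via YouTube's recommendations,
                              // false if the viewer searched for it directly
  country: string;            // reporter's country, for per-region comparisons
  note?: string;              // optional free-text description of the problem
}

// Send a report to the study's collection service (placeholder endpoint).
async function submitReport(report: RegretReport): Promise<void> {
  await fetch("https://example.org/regret-reports", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
}
```

The key field here is the recommendation flag: recording whether a flagged video was recommended or deliberately searched for is what allowed the study to compare the two.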

As the second-most visited website in the world, YouTube is a major platform accessible to almost anyone, with an enormous range of videos. Its recommendation algorithm drives around 70% of total watch time on the platform, roughly 700 million of the more than one billion hours watched every day. The crowdsourced research report highlights three main findings. The first is that YouTube Regrets (the name given to the flagged videos) are often highly disturbing. As previously mentioned, several categories simply should not be in public circulation, let alone be recommended to anyone: fear-mongering around Covid-19, political misinformation, inappropriate “children’s” cartoons, hate speech, violent or graphic content, and scams or spam. The second major finding is that the algorithm itself is the problem. Most of the disturbing content came from the automatic recommendation system; recommended videos were 40% more likely to be reported by volunteers than videos they searched for themselves, much of the recommended content violated YouTube’s own community guidelines, and some of it was in no way related to videos the volunteers had previously watched. The third major finding is that non-English speakers get the worst of it: the rate of YouTube Regrets is 60% higher in countries where English is not the primary language, such as Brazil, Germany and France. Pandemic-related scaremongering Regrets were particularly widespread in non-English languages.

Videos with the potential to go viral are more likely to be recommended. Shock content can rack up millions of views and is then circulated and recommended ahead of personal-interest videos that may have only tens of views. YouTube did remove around 200 of the videos flagged during the study, and a spokesperson said the company has reduced harmful recommended content to below 1% of videos viewed. Over the past year it has made 30 changes to address the issue, and its automated system reportedly detects and removes 94% of videos that violate the community guidelines before they reach ten views.

The findings of Mozilla’s study are of course concerning, but YouTube is working on changes to detect and remove harmful content. Videos containing outright racism or violence are relatively easy to remove, but conspiracy theorists often invoke an element of “free speech” that enables them to keep posting, as do radicalisation groups who feel they have the right to express their opinions. In time, YouTube should iron out the flaws in its algorithm, but this study has certainly brought to light some of the more disturbing things that are shared and recommended on the platform.

Keep up to date with the latest tech industry insights and trends, as well as information technology, app development, and small business content, with the Proteams Blog.


Follow us on LinkedIn for updates on the latest tech news here
