Mozilla's RegretsReporter: YouTube recommends dangerous videos


The phrase "Down the Rabbit Hole" is probably not only known to YouTube users. It actually comes from the famous children's book Alice in Wonderland by Lewis Carroll and describes a branching, seemingly infinite tunnel in which one has long since moved away from the actual starting point.

In the context of YouTube, this means: you want to relax with a Let's Play of Ratchet & Clank: Rift Apart, only to find yourself two hours later watching conspiracy theories or explainer videos about the Great Emu War. How you made the leap from a video game to this historical conflict is usually completely incomprehensible. Mozilla has investigated exactly this phenomenon with its add-on RegretsReporter - and has come to troubling results.


The dark side of YouTube

The add-on was released in September 2020 and has since collected data from 37,380 users who installed it. Of these, 3,362 users also sent reports to RegretsReporter sharing their own experiences.

Mozilla chose the term "regret" to describe the phenomenon of users coming across videos they never wanted to see - not because the videos fall outside their interests, but because they contain misinformation, graphic depictions of violence, or other harmful content.

In a detailed report, Mozilla summarizes the collected data and describes cases in which YouTube achieved exactly the opposite of what users wanted, or even recommended harmful videos to them. A 10-year-old girl searched for dance videos and ended up with content promoting extreme diets. Another person looked for affirming videos on the topic of LGBTQ+ and instead found hateful tirades against queer people.

What RegretsReporter found out about the YouTube algorithm

Mozilla reports that around 12.2% of the reported videos should not be on the platform at all, or at least should not be recommended, according to YouTube's Community Guidelines. Most of the reported videos contain misinformation, closely followed by graphic and violent depictions.

In addition to the examples already mentioned, Mozilla lists a whole host of other harmful videos in its report. Some of them show that when it comes to spreading false information, YouTube is in no way inferior to other social networks such as Facebook or Twitter. There is also sexual content that, disguised as child-friendly videos, finds its way into children's rooms all over the world.

Statements from Mozilla and YouTube

Brandi Geurkink, Senior Director of Advocacy at Mozilla, places the blame on YouTube: "YouTube has to admit that their algorithm is designed in such a way that it harms people and misinforms them." Mozilla also reports that "in 43.3% of the cases where we have data about what videos a user has previously watched, the recommendation had nothing at all to do with the videos previously watched."

The problem is particularly noticeable in countries where English is not the first language: the rate of harmful recommended videos there is 60% higher. It seems that such countries receive little attention when it comes to meaningful safeguards.

The news platform NBC News reported a statement from YouTube: "We are constantly working on improving the experience on YouTube, and in the past year alone we made over 30 changes that reduce the recommendations of dangerous content. Thanks to these changes, the consumption of borderline content that takes place thanks to our recommendations is well below one percent."
As always, YouTube has remained silent about what these changes are. The platform also did not reveal whether and how it would act against such dangerous content beyond its recommendations.

Source: Mozilla's RegretsReporter





