
New research from the University of Exeter and Mozilla investigates ‘YouTube Regrets’

2 December 2022



A recent report from the University of Exeter and the Mozilla Foundation has investigated the potentially harmful effects of videos suggested to people on YouTube.


YouTube, the world’s second-most popular website and home to 2 billion active users each month, is facing criticism for the opacity of its recommendation algorithm, which suggests the videos that users should watch next. The algorithm – which serves to sustain YouTube’s ad revenue by keeping eyeballs on content – has been widely criticised for indiscriminately feeding users with videos that misinform, offend, and even violate the platform’s Community Guidelines.

One of a pioneering set of projects designed to scrutinise and gather evidence on the negative consequences of big data for users across the world, the YouTube Regrets project analysed 3,362 videos that viewers regretted watching – the majority of them recommended by YouTube’s algorithm.

The crowdsourced investigation made use of Mozilla’s RegretsReporter, a browser extension for Chrome and the foundation’s own Firefox browser which allowed volunteers to report their ‘regrets’. Dr Chico Camargo – a researcher at the University of Exeter’s Institute for Data Science and Artificial Intelligence who studies the evolution of information and worked with Mozilla on the project – coordinated a team of 41 student research assistants to analyse these regrets for the report.
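To give a flavour of how this kind of crowdsourced reporting can work, the sketch below shows the sort of report a browser extension might assemble on a YouTube watch page and submit to researchers. It is written in TypeScript, and everything in it – the field names, the reporting endpoint, the helper functions – is an illustrative assumption for this article, not Mozilla’s actual RegretsReporter code.

// Hypothetical sketch only: not Mozilla's RegretsReporter implementation.
// Shows the kind of payload a content script could assemble when a
// volunteer flags the video they are currently watching as a "regret".

interface RegretReport {
  videoUrl: string;                 // the regretted video
  videoTitle: string;
  reachedVia: "recommendation" | "search" | "other";
  recommendationTrail: string[];    // videos that led the viewer here, if any
  reportedAt: string;               // ISO 8601 timestamp
}

// Build a report from the current watch page. The trail of prior
// recommendations would be tracked by the extension as the user browses.
function buildReport(trail: string[]): RegretReport {
  return {
    videoUrl: window.location.href,
    videoTitle: document.title,
    reachedVia: trail.length > 0 ? "recommendation" : "other",
    recommendationTrail: trail,
    reportedAt: new Date().toISOString(),
  };
}

// Submit the report to a research endpoint (the URL here is a placeholder).
async function sendRegret(trail: string[]): Promise<void> {
  const report = buildReport(trail);
  await fetch("https://example.org/api/regrets", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
}

The real extension similarly submits both the video and the chain of recommendations that led to it (see the note on RegretsReporter at the end of this article), which is what makes it possible to distinguish recommended Regrets from searched-for ones.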

Key Findings: YouTube Regrets are harmful, misinformative and, in many cases, recommended.

Although some of the self-reported Regrets were benign, such as videos that didn’t correspond to what users were watching at the time, the report uncovered a mass of Regrets that were misinformative, offensive and violently graphic, from which some concerning themes emerged.

1. YouTube’s recommendations are misinformative, especially with regard to Covid-19. 20% of users’ Regrets involved misinformation, such as conspiracy theories and political disinformation. Add to this Covid fearmongering (12%) – which researchers bracketed into a separate category due to its relevance amid a global pandemic – and misinformation accounted for roughly a third of all YouTube Regrets.

2. YouTube’s recommendations are indiscriminate, and sometimes harmfully so. Also ranking highly in the Regrets were videos that involved violent or graphic content (14%) and hate speech (12%), with examples including racist hate speech (“Blacks in Power Don’t Empower Blacks”), distressingly graphic content (“7 Jokes Ending in Tragedy”) and overtly sexualised videos (“Woody’s Got Wood”).

3. Non-English-speaking countries are hit the hardest. Alarmingly, and with political implications similar to those of YouTube’s wealth of Covid misinformation, the YouTube Regrets report found that the rate of Regrets is 60% higher in countries where English is not a primary language. This raises questions about YouTube’s ability to enforce its Community Guidelines in practice. “With content moderation,” Chico said, “more attention is given to the United States and to richer countries, for example. These issues might arise simply because YouTube doesn’t employ enough people to moderate content in poorer countries where most of the content is not in English.”

4. The algorithm is not fit for purpose. 71% of all users’ Regrets came from videos that were recommended to them, as compared with videos they’d searched for themselves. This makes for a stark contrast with YouTube’s promise to “make sure we’re suggesting videos that people actually want to watch,” and indicates problems with the mechanics of the algorithm itself.

Since the YouTube Regrets report was published, 9% of the recommended Regrets have been taken down by YouTube for violating the platform’s own Community Guidelines. Before being removed, these videos had received a collective 160 million views.

Impact: Creating a fairer, safer internet

In a world where entire populations are reliant on the digital platforms of a few Silicon Valley giants for their information and connectivity, companies like Google (YouTube’s parent) have a responsibility to ensure their platforms – and the algorithms driving engagement with them – protect people from the most harmful content.

The YouTube Regrets project is a crucial stepping-stone in holding these platforms to account and, where algorithms are found to be unfit for purpose, developing an evidence base to convince public and policy communities of the need for better regulation.

“Needless to say, what this research reveals is only the tip of the iceberg,” says Chico Camargo. “But without intervention to enable greater scrutiny of YouTube’s algorithms, these problems will continue to go unchecked and their consequences will only grow worse. The YouTube Regrets project is a call for transparency: to enable independent audits of recommendation systems, to keep them from harming users all over the world, and ultimately, to give people more control over how their data is used.”

“The YouTube Regrets project is already driving impact,” says Brandi Geurkink, Senior Manager of Advocacy at the Mozilla Foundation. “The campaign that inspired the development of RegretsReporter was cited in the European Commission’s impact assessment for its proposal of a Digital Services Act. The collaborative research inspired the Washington Post’s Editorial Board to write an opinion piece calling on social media platforms to make their algorithms more transparent. And in September 2021, just two months after the research made headlines in more than 30 countries, YouTube released more information to the public about how their algorithm works and made changes to reduce harmful vaccine misinformation on the platform.”

“This shows that our efforts are moving the needle in the right direction – but there is still work to be done. That’s why on 2 December, Mozilla released an updated version of the RegretsReporter extension which aims to study the extent to which YouTube’s algorithm listens to user feedback.”



More information:

The Mozilla Foundation is a global non-profit dedicated to keeping the Internet a global public resource that is open and accessible to all.

The University of Exeter’s Institute for Data Science and Artificial Intelligence brings together leading interdisciplinary researchers to develop new approaches to the use of data and artificial intelligence in modern society.

Chico Camargo is a lecturer in Computer Science at the University of Exeter. His research focuses on the evolution of information and developing new tools to study the media we produce and consume. He is a research associate at the Oxford Internet Institute, a member of the Institute for Data Science and Artificial Intelligence, and Director of the CC Lab.

Brandi Geurkink is Senior Manager of Advocacy at the Mozilla Foundation. Her work focuses on building public advocacy campaigns that mobilise millions of people to pressure decision-makers in governments and tech companies to protect the values of an open, safe and private web.

The RegretsReporter is a browser extension which allows users to report YouTube Regrets and can be downloaded here. When you send a YouTube Regret, the video and recommendations that led you to it are submitted to Mozilla researchers privately.



Researchers

Dr Chico Camargo