Friend,

From conspiracy theories to propaganda, YouTube’s algorithm often promotes a minefield of controversial videos. Have you ever wondered why it keeps recommending the same clickbait content even after you click “dislike” or “do not recommend”? So have we, and we looked into it. What we found is alarming: YouTube’s user controls do not always work and largely fail to stop unwanted recommendations.

Read the report now & call on YouTube to give users real control over their video recommendations!

For our report “Does This Button Work? Investigating YouTube's ineffective user controls,” we studied YouTube’s feedback mechanisms with the help of 20,000 Mozilla supporters who donated their data through Mozilla’s RegretsReporter browser extension. Our main findings are:

- Users are not in control of their experience on the video platform
- Even after using the feedback mechanisms, YouTube’s algorithm often recommends unwanted videos
- YouTube can fix this problem

Our findings call into question two of YouTube’s favorite marketing claims: that people can shape their own experience and that the algorithm optimizes for user satisfaction. It’s time for YouTube to fix its feedback tools and put people in the driver’s seat.

Read the full “Does This Button Work?” report & sign our petition asking YouTube to fix its feedback tools!

Our current findings are even more worrying in the context of our previous YouTube research: in our 2021 “YouTube Regrets” study, we found that YouTube’s algorithm promotes videos containing misinformation, hate speech, and violence. A video recommender system that regularly recommends dangerous content and does not consistently listen to user feedback desperately needs to be fixed. We have clear recommendations on what YouTube should do - please help us put pressure on the company by sharing our report and signing the petition.

Learn more about our findings & tell YouTube to fix its feedback tools!
Thank you,

Christian Bock
Advocacy Lead Germany
Mozilla