Hello,

We have some good news. After sustained pressure from the Mozilla community, WhatsApp has responded to our collective concerns about election-related disinformation and other harmful content on its platform.

Over the past few months, Mozilla has been calling on WhatsApp to do more to protect upcoming elections from political disinformation, misinformation and hate speech – an urgent call made during the biggest election year in recorded history. WhatsApp has now responded with a detailed account of its elections response, sharing new information with us on key product features like forwarding restrictions and labeling, as well as on how it plans to tackle the rise of AI-generated content.

We welcome WhatsApp’s response about its elections strategy, but there are key areas where it has remained silent. The company has provided little information on how it’s working to stop political actors from abusing its platform, or on how it measures the success and failure of its different efforts. That matters because WhatsApp is a major player in the social media ecosystem, and because the platform’s sharing and broadcasting features have allowed disinformation and other harmful content to reach large audiences during elections.

Right now, it is up to all of us in the Mozilla community to keep holding irresponsible tech companies accountable and to protect election integrity worldwide. Together we can bring about positive changes to major platforms like WhatsApp – but only if we are in this together.

Will you add a $10 USD donation today to power Mozilla’s ongoing research and campaign work? Together we can protect election integrity worldwide and combat disinformation.

Here’s more detail on what we learned:
Limiting Virality of Messaging: WhatsApp is still refusing to share how it measures the success and failure of its limits on the virality of content. WhatsApp says one of the key ways it addresses disinformation and other harmful content is by limiting the virality of messaging: the company told us it is “one of the few technology companies to intentionally constrain sharing”, and pointed to the limits on message forwarding in chats and channels [1]. This is a useful insight into WhatsApp’s elections strategy, but it doesn’t answer our question: how does the company judge whether the changes it has put in place to stop the spread of harmful content have gone far enough?
Forwarding Labels: We had asked WhatsApp to add verification prompts to its “Forwarded many times” label to emphasize its role in addressing mis- and disinformation. WhatsApp says it has tested different styles of forwarding labels: one of its key measures to address disinformation is the “Forwarded many times” label, which the company says it uses to slow down “rumors, viral messages, and fake news”. WhatsApp told us it has tested a number of different versions of this labeling across different countries, including in low-literacy contexts, and said it had found that “...using more complex words may actually reduce their effectiveness”. While this sheds new light on how WhatsApp arrived at this wording, we wish it had provided more information on how it tested the label.
Banning Political Actors: We asked WhatsApp about banning political actors for abusing its platform, but the company did not provide specifics. WhatsApp says it uses “advanced technology” to identify accounts engaging in abnormal behavior, such as bulk or automated messaging, but it hasn’t provided additional detail. The company told us it bans over 8 million accounts a month for such practices. Even so, it is clear that political actors have, in many cases, succeeded in evading WhatsApp’s safeguards: Mozilla’s own research has documented a pattern of political actors abusing the platform’s broadcasting features to spread disinformation and other harmful content [2]. This is another area where we would have welcomed more information.
AI-Generated Disinformation: WhatsApp says its fact-checking partnerships can help address AI-generated disinformation on its platform. The company says it is developing “new partnerships specifically focused on AI”, for example with fact-checking organizations. Its strategy for AI-generated content, then, relies on similar tactics to its wider elections response.
You can read the full statement from WhatsApp, alongside Mozilla’s responses, on our blog [3]. We welcome WhatsApp’s detailed response and the new information it has provided, particularly around the specific changes the company has made to its product to address these issues.

We hope that the information provided by WhatsApp will be useful to researchers, journalists, non-profits and election integrity experts, many of whom have continued to document disinformation and hate speech on WhatsApp and other social media platforms during this year’s elections. For example, during India's 2024 elections, they documented [4] how political parties [5] actively spread disinformation through WhatsApp, including the use of deepfake videos and misleading content to manipulate voter perceptions.

WhatsApp has yet to commit to the specific product changes we called for – but engaging with our community and answering our questions is a big step in the right direction. It’s a testament to the power of the Mozilla movement, and a result of the tens of thousands of us across the world who joined this campaign and chipped in.

Moments like these show that, together, we can build an internet where irresponsible tech companies are held to account. A future where the integrity of elections is protected from online disinformation and where we can reclaim the internet. But that is only possible if we do it together.

Will you add a $10 USD donation to Mozilla today if you are ready to reclaim the internet from irresponsible tech companies? Grassroots donations are how we fund our movement to take on Big Tech.

Thank you for being part of this powerful community – and for everything you do for the internet.

Nicholas Piachaud
Director, Campaigns
Mozilla
More Information:
1. WhatsApp Help Center. About Forwarding Limits.
2. Mozilla. Mozilla’s Elections Casebook. 27 February 2024.
3. Mozilla. WhatsApp Responds to our Elections Questions. 6 August 2024.
4. Rest of World. Inside the BJP’s WhatsApp machine. 15 May 2024.
5. TIME. How Modi’s Supporters Used Social Media to Spread Disinformation During the Elections.