By Kanda Honourine & Donald Tchiengue
On Tuesday 7 January 2025, Mark Zuckerberg announced, under the caption “More speech and fewer mistakes”, that Meta was ending its fact-checking programme with third-party fact-checkers on Facebook, Threads and Instagram, and that they would be replaced by “community notes”, starting in the United States of America. According to the founder of Facebook, the third-party fact-checking programme had become too politically biased and it was time to get back to freedom of speech. In practice, however, since 2016 (when the programme was launched), as mentioned in this article, the programme has helped organizations around the world to prevent and mitigate the spread of false information and the escalation of tensions and violence between communities. While this decision is limited to the USA for the time being, there is concern that it could extend to the rest of the world, and to Africa in particular.
This is especially true in Cameroon, where misinformation and hate speech have in the past amplified tensions and tribal hatred, helping to plunge the country into a socio-political crisis since 2017. The situation is likely to worsen in 2025, a year of major elections, for which malicious misinformation, disinformation, fake news, hate speech and incitement to violence on Meta’s platforms have already been observed, posing a serious threat to democracy, human rights and digital rights.
Civic Watch Cameroon, through its fact-checking components (237 Check and the Africa Fact-checking Fellowship), is deeply concerned by this decision, which undermines the primacy of credible information and truth on the internet, and implicitly the promotion of social cohesion, because without truth there can be no peace amongst men. As the UN Secretary-General António Guterres once said, “the denial of historical or scientific facts creates a vacuum of truth that is all too easily exploited by the voices of intolerance and hate”. Two questions therefore arise: on the one hand, how could this policy affect social media users in Cameroon, who are in an electoral year, should it be implemented in Africa? On the other hand, what can be done to move beyond Mark Zuckerberg’s decision?
Potential effects of Meta’s policy to drop the third-party fact-checking programme in Africa
This decision to drop fact-checkers could have several effects in Africa, and in Cameroon in particular.
- Increased dis/misinformation and incitement to hatred and violence: It is important to point out that the practice of fact-checking fundamentally thrives in an environment where freedom of expression is valued and encouraged, as it is a tool for accountability, transparency and the monitoring of public speech and action. It is imperative to note that fact-checkers are not against freedom of expression and free speech. Rather, their work enables the right to freedom of expression to be respected and promoted, along with the right to credible information and respect for human life. However, while the right to freedom of expression is an inalienable right for all, it should not be used as a pretext to produce and disseminate erroneous discourse aimed at stigmatizing or smearing individuals or communities. The work of fact-checking organizations through this programme has thus helped to prevent vulnerable and/or minority communities from being exposed to hatred and violence fuelled by untrue information. Ending the programme today would be tantamount to giving malicious actors a green light to spread messages that are biased or fabricated to cause harm.
- Undermining Trust in the Media and Digital Information: For some time now, social media has played an important role in shaping public life, and people sometimes rely on it to make decisions that are crucial to their health and well-being. Many turn to social media as an alternative to mainstream news, but this shift comes with challenges. Removing this programme means exposing billions of Internet users to the hoaxes and conspiracy theories that circulate online with the aim of influencing perceptions and ideas. Reliable, verified facts are the basis for effective decision-making by individuals, companies and governments. They also contribute to healthy, objective and constructive public debate, enabling citizens to make informed decisions on important issues, whether political, health-related or social. The absence of such a fact-checking programme could therefore erode trust in digital information, particularly among users who are not well-versed in identifying unreliable sources.
- Transparency and accountability for content are challenged: While the democratization of ICTs and the Internet appears to be an important revolution in the history of humanity, it also presents a threat to the social contract as defined by Jean-Jacques Rousseau. In an environment where anyone can create and disseminate information without prior verification, the trust gap between citizens and the State and its institutions will widen in the absence of moral, ethical and regulatory accountability. To avoid this, fact-checkers, and by extension journalists, have a responsibility to answer for the information they disseminate. In their day-to-day work, fact-checkers are governed by a code of ethics that sets out their working methods. At the international level, for example, there is a code of principles to which fact-checking organizations must subscribe, which assesses the impartiality, transparency and objectivity of their verification content. When information is verified by non-partisan actors, it reinforces people’s trust in news sources and the media, which in turn contributes to the openness and credibility of institutions.
- The Risk of Amplifying Harmful Narratives (Hate Speech, Incitement to Violence): In Cameroon, Meta’s platforms (Facebook, Instagram and WhatsApp) have been used to spread both constructive discussions and hate speech. In times of crisis, such as the ongoing conflict in the Anglophone regions, false information about government actions, human rights abuses and international interventions is often amplified through social media.
The removal of third-party fact-checking programmes means that these harmful narratives, which have the potential to escalate violence or spark unrest, will be harder to mitigate. This is particularly dangerous during periods of social unrest, when misinformation can lead to further violence, human rights violations and destabilization.
Alternatives to continue the fight against dis/misinformation
Needless to say, Mark Zuckerberg’s decision will not stop the work of fact-checkers around the world, even if it sets back the fight against misinformation on Meta’s platforms. This unexpected decision is a timely reminder of the urgent need to develop alternatives to support the work of fact-checkers and journalists. The aim is to increase the number of fact-checkers who can help mitigate misinformation within their own communities, based on a bottom-up approach. This will not only increase the amount of content available for verification, but also multiply the communication channels that help rebalance the virality of fake news. The greater the number of credible, community-based fact-checkers using local techniques, languages, contextual realities and communication channels (such as community radio, WhatsApp platforms and town criers), the easier it will be to stem the tide of false information, especially in public health and during electoral periods. This is the challenge we have set ourselves with the #AFFCameroon programme over the past five years, through the organization of physical meetups, especially in conflict-affected areas of Cameroon.
Creating awareness of the consequences of public health misinformation on WhatsApp in a rural community
On the other hand, it is now a matter of continuing to pool the efforts and resources of different actors to tackle the spread of dis/misinformation. This means setting up forums for collaboration and resource mobilization between journalists, civil society, state institutions, multinationals and digital platform owners, in order to find protocols and mechanisms for regulating and sanctioning digital offences, while promoting human rights, democracy, the safety of Internet users and the consolidation of peace. The recent examples of coalitions in Nigeria and Senegal illustrate how those involved in countering misinformation can be brought together to flatten the curve of fake news during crucial periods such as elections.
Another alternative, especially in an African context where digital literacy and Internet penetration rates are still low, is Media and Information Literacy (MIL), which is essential for developing the critical skills needed to analyse, evaluate and understand the information to which individuals are exposed. Given the removal of this programme on Meta, it is necessary to step up awareness-raising and training campaigns on MIL, so that people can distinguish between reliable and dubious sources, spot bias and manipulation, and adopt responsible media consumption habits.
To sum up, if the fact-checking programme is gone, why can’t Meta set up autonomous regional fact-checking teams, totally independent of political influences and ideologies, to work alongside its moderation teams? In other words, as proposed by the International Fact-Checking Network (IFCN), in addition to community notes, it would be worth considering teams of professional fact-checkers working with Meta’s content moderation teams. In Africa in particular, user experience on X (formerly Twitter) shows that very little information about most sub-Saharan African countries is debunked via community notes. It would therefore be advisable for such decentralized teams to be based not in the USA but in Africa, constantly immersed in local realities and in the linguistic, cultural, socio-political, even anthropological and economic dynamics of the continent, in order to reduce bias in content moderation.