By Faizel Patel

Senior Journalist


WATCH: Meta’s decision to end fact-checking is a ‘reckless and dangerous gamble’, says SA ethics group

Meta said it would end its third-party fact-checking programme and replace it with Community Notes.


A South African non-profit organisation has slammed Meta’s decision to end its third-party fact-checking programme as a “reckless and dangerous gamble” which could have major implications.

This comes after Meta said this week that it would end its third-party fact-checking programme and replace it with a Community Notes system, which lets users flag other people's posts that may contain misleading or false information.

The Community Notes system will allow users on Meta-owned platforms, like Facebook and Instagram, to attach an information box adding context to posts they deem potentially misleading.

‘A tool to censor’

In a video titled, “More speech and fewer mistakes” posted to social media platforms earlier this week, Meta CEO Mark Zuckerberg explained that fact-checking organisations had proved to be “biased” when it came to selecting content to moderate and added that he wanted to ensure free speech on all platforms.

“It’s time to get back to our roots around free expression. We’re replacing fact checkers with Community Notes, simplifying our policies and focusing on reducing mistakes. Looking forward to this next chapter,” he wrote in the post with the five-minute video.

“Our system attached real consequences in the form of intrusive labels and reduced distribution. A programme intended to inform too often became a tool to censor.”

‘Poor content moderation’

Kavisha Pillay, executive director of the Campaign on Digital Ethics (CODE), a Johannesburg-based digital rights research, policy advocacy and education group, said poor content moderation has already fuelled real-world harm, including election interference in the United States in 2016 and Brazil in 2022, violence and hate speech such as the Rohingya genocide in 2016, and the amplification of conspiracy theories during the Covid-19 pandemic.

“This new move risks triggering an even bigger explosion of misinformation, amplifying societal divisions, and harming vulnerable communities on a global scale.”

Relaxing enforcement of content policies

Pillay argued that, despite Zuckerberg's claims, unmoderated platforms do not foster a diversity of voices and often "amplify the loudest and the most harmful perspectives".

“While Meta argues that its content moderation systems have led to overreach, dismantling safeguards entirely is not the solution. Mistakes in moderation call for improvement, not abandonment.

“Relaxing enforcement of content policies and relying on user reports instead of professional oversight risks creating an environment where harassment, hate speech, and disinformation can flourish unchecked. Free speech should not mean having free rein to spread falsehoods, harm vulnerable communities, or destabilise democracies,” Pillay said.

Calls to SA government

CODE has called on the South African government to develop robust and progressive regulatory frameworks, akin to the European Union's Digital Services Act, to hold tech companies accountable for the content they host and amplify on their platforms.

It also called for a nationwide digital literacy campaign to empower the public to navigate the digital landscape responsibly.

Meta's changes will take effect across its three major social media platforms, Facebook, Instagram and Threads, which are used by more than 3 billion people worldwide.
