LONDON, Jan 17 — Ahead of Donald Trump’s presidential inauguration, Meta chief executive Mark Zuckerberg ended the company’s US fact-checking programmes meant to label viral, misleading content.
Meta will review posts only in response to user reports, Zuckerberg said on January 7.
Automated systems will flag “high-severity violations” around terrorism, child exploitation, scams and drugs. The changes apply only to Meta’s US operations, not to other regions.
Fact-checking organisations said the move could encourage hate speech online and fuel violence offline.
“Mr Zuckerberg doesn’t want to be in the business of arbitrating truth, but he is,” said Sarah Shugars, assistant professor of communication at Rutgers University, in New Jersey.
“The removal of fact-checkers and loosening of policies will only serve to discourage free speech and exacerbate bias on the platform. Claiming otherwise would be laughable if the repercussions were not so serious.”
Here’s what you need to know about the new rules and their impact.
What are Meta’s new rules?
Instead of relying on trusted media organisations to fact-check content, Meta will use “community notes” similar to those on X, formerly Twitter.
On X, users apply to contribute community notes. When enough users from “different perspectives” rate a note as “helpful”, it is publicly shown on a post.
X works out a user’s perspective based on the topics they choose to fact-check.
Meta will not write the community notes itself. A published note will require the backing of users “with a range of perspectives to help prevent biased ratings.”
With the changes, Meta’s algorithm will no longer reduce the visibility of content rated as false (a judgment now made by users rather than professional fact-checkers), and the company will make the labels on any fact-checked content “less obtrusive.”
Who are Meta’s fact-checking partners?
Meta’s decision will have ramifications for global media and fact-checking websites.
In the United States, Meta partners on fact-checking with various news organisations including Agence France Presse, USA Today and Thomson Reuters’s Reuters Fact Check unit, among others. The Thomson Reuters Foundation, the charitable arm of Thomson Reuters, runs the Context media platform.
Around the world, Meta works with 90 fact-checking organisations, covering more than 60 languages.
Fact-checking organisations rely heavily on Meta’s funding for their revenue, according to a survey by the International Fact-Checking Network (IFCN).
How did fact-checkers respond to Meta’s move?
The IFCN said the decision threatened to “undo nearly a decade of progress”.
It rejected Zuckerberg’s claim that the fact-checking programme had become a “tool to censor” users, noting that Meta makes the final decision on how to deal with content flagged as false by fact-checkers.
“The freedom to say why something is not true is also free speech,” the network said.
Meta works in several countries that are “vulnerable to misinformation that spurs political instability”, and any plan to end fact-checking worldwide “is almost certain to result in real-world harm in many places,” it added.
Milijana Rogač, executive editor of Serbian fact-checking outlet Istinomer, said Meta’s decision would hurt the media landscape at large.
“By removing fact-checkers and their analyses from social media — platforms that many citizens use as their primary source of information — Meta further hinders access to accurate information and news,” Rogač told the Thomson Reuters Foundation in an email.
What do digital rights experts say?
The UN High Commissioner for Human Rights, Volker Türk, criticised Meta’s decision.
“Allowing hate speech and harmful content online has real world consequences. Regulating this content is not censorship,” said Türk.
Digital rights organisation the Electronic Frontier Foundation said that while community notes can be useful, work by experts was also vital.
“We hope that Meta will continue to look to fact-checking entities as an available tool,” it said.
“Meta does not have to, and should not, choose one system to the exclusion of the other.”
Other experts warned that misinformation about science and health could increase on Meta’s platforms.
Will Meta face legal challenges to the changes?
Meta could face legal challenges abroad should it disband fact-checkers internationally. Meta says it has “no immediate plans” to do so.
The European Union’s Digital Services Act requires that platforms cooperate with researchers and fact-checkers to mitigate risk from online disinformation. Meta could not stop its programme in Europe without submitting a risk assessment to the EU Commission.
Ofcom, the media regulator for Britain’s Online Safety Act, said it would assess Meta’s compliance when the Act is enforced in March 2025.
Jacob Mchangama, executive director of The Future of Free Speech, a think tank at Vanderbilt University, told the Thomson Reuters Foundation via email that much will depend on how Meta implements the changes, as each European state can have different legal thresholds for free speech.
Brazil’s Solicitor General Jorge Messias said the government had “enormous” concerns over Meta’s decision.
Are community notes effective?
Research from the University of California and Johns Hopkins University found in 2024 that community notes on X for Covid-19 misinformation were accurate, cited moderate and high credibility sources, and were attached to widely-read posts.
However, the study’s sample size was small and the effects on users’ perceptions and behaviour are unknown.
A 2023 study published in the Journal of Online Trust and Safety found it was harder for users to reach consensus when assessing posts about political issues.
Traditional fact-checking is likely to lead a person to reject misinformation even if that person is politically aligned with its source, according to a 2024 study from European universities, led by the University of Amsterdam.
“It is the content of the fact-check that matters for its effectiveness, and this content can be persuasive even for those who support the source of the misinformation,” the report said. — Reuters