In northern India, a 20-year-old Muslim man was beaten to death by a mob after being accused of transporting cows, which Hindus consider sacred, for slaughter. He was the victim of "cow protection" vigilantes, who have posted at least 166 other videos on Instagram documenting attacks carried out in the name of "protecting cows."
Experts say such incidents deepen concerns about the scaling back of fact-checking on social media. "Unverified hate speech doesn't just stay online, it spills over into the real world with deadly consequences," said Raqib Hameed Naik, head of the Center for the Study of Organized Hate (CSOH).
Earlier this month, Meta announced it would stop using independent third-party fact-checkers to review content on its platforms, Facebook, Instagram, and Threads. CEO Mark Zuckerberg stated that fact-checkers were "too politically biased" and that his new plan would help ensure freedom of speech. However, in major markets like India, where Meta has over 500 million users, Mr. Naik warns that unverified narratives pose a serious threat not only to online discourse but also to public safety.
Media experts say that third-party fact-checking helps reduce the spread of false and misleading information in users' news feeds. Meta has relied on subcontractors to flag and debunk posts. But the third-party fact-checking program, which began in 2016, is being terminated in the US, and experts expect the rollback to extend to other parts of the world in the coming months. The Australian Broadcasting Corporation (ABC) asked Meta about its future plans, but the company declined to comment publicly. In the US, Meta is switching to a feature called "Community Notes," which relies on platform users to flag potentially misleading content. Notes are displayed once enough users agree on their accuracy.
Critics warn that if Meta's changes extend to fragile democracies like India, Myanmar, and the Philippines, the consequences will include more polarization, violence, and social unrest. The International Fact-Checking Network (IFCN) stated, "Some of these countries are highly vulnerable to disinformation that can trigger political instability, election interference, mob violence, and even genocide." In the Philippines, Meta's fact-checking partners, including Agence France-Presse and Vera Files, have played a crucial role in combating election-related disinformation.
Nobel laureate and Filipino journalist Maria Ressa warned that journalism and democracy are facing "an extremely dangerous period." Celine Samson, a fact-checker at Vera Files, said that roles like hers were particularly important during the last election. Vera Files documented an increase in disinformation posts using a tactic that is particularly dangerous in the Philippines: portraying opposition leaders as communists. While the word "communist" might seem relatively harmless elsewhere, in the Philippines it can be life-threatening. When fact-checkers flag false content, Meta can limit its reach or remove it if it violates the platform's standards.
"If someone repeatedly posts disinformation, it gets flagged, its visibility is reduced, and they lose the ability to make money on the platform, and I think that's where the biggest hit is," Ms. Samson said. "Removing this program means removing a layer of protection against disinformation." In Myanmar, Facebook is widely regarded as "the internet." During the 2017 Rohingya crisis, Facebook was found to have played a key role in the spread of hate speech that fueled violence against the Muslim minority. Despite promises to address its failures, Facebook remains a hotbed of disinformation there.
Nawa Wah Wah Poe, the head of Red Flag, an organization focused on research and social media monitoring, said the military and other actors have become adept at evading detection while inciting violence. "We are facing a rise in military propaganda and disinformation campaigns," she said. "We have tracked posts using words like 'laying eggs' to mean bombing or 'putting on makeup' to mean beating someone up. Without fact-checkers, platforms like Facebook could become even more chaotic." She said that Meta's fact-checking partners, who must meet strict non-partisan standards, are crucial for understanding local languages and contexts.
Meta has framed its decision to reduce fact-checking as a commitment to "free speech," but critics say this hands-off approach is dangerous. "Mark Zuckerberg's claim that fact-checkers are 'biased' completely ignores how important they are in areas where disinformation is a tool of oppression," said Jonathan Ong, a professor of global digital media at the University of Massachusetts. "Zuckerberg's statement signals to the world that they are no longer apologizing for the harms of social media or placating legacy media." Professor Ong said he was worried about regions that were already vulnerable.
"He's positioning Meta as a defender of free speech, at the expense of marginalized communities globally," Professor Ong said. "Those countries that are most harmed by unverified social media... will now bear the brunt of Meta's retreat from accountability." Adi Marsiela, from Cek Fakta, Indonesia's largest fact-checking organization, said the changes were worrying for countries like his with lower digital literacy. Indonesia had over 119 million active Facebook users and at least 100 million Instagram users last year. Mr. Marsiela said the Covid pandemic demonstrated the importance of checking social media content.
"During the pandemic, rumors spread quickly, and fact-checkers were crucial in countering false beliefs," he said. Mr. Marsiela pointed to a surge in fake job scams and sensationalist environmental disaster content designed to drive engagement and profit. "Algorithms promote stories that are more interesting than relevant, which is why independent fact-checkers remain vital," he said.