January 07, 2025
Facebook parent company Meta revealed sweeping changes on Tuesday that will alter the way the social media giant moderates content and combats the spread of misinformation on its platforms. The policy shift marks an end to Meta's partnerships with independent fact-checking organizations, which started just before President-elect Donald Trump took office for his first term in 2017.
Meta CEO Mark Zuckerberg called the decision a response to the outcome of November's elections, which he described as a "cultural tipping point" toward prioritizing freedom of speech.
"We're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms," Zuckerberg said in a video, arguing that fact-checking posts too often resulted in biased scrutiny, censorship and mistakes.
Moving forward, Facebook and Instagram will adopt a "community notes" model similar to the system used by X, formerly Twitter. Instead of relying on third-party sources to vet the legitimacy of content, the community notes model draws on the viewpoints of users to add context to posts that may contain misinformation.
Philadelphia-based FactCheck.org, a nonprofit tied to the University of Pennsylvania's Annenberg Public Policy Center, was among dozens of nonpartisan organizations that had partnered with Meta and received funding from the company over the last eight years. The program filtered viral content that Facebook users had flagged as suspicious and sent it to the fact checkers, who would then provide Facebook with referral links used to label and debunk false claims.
In some cases, Meta would go a step further by reducing the visibility of fact-checked posts to prevent misinformation from spreading on Facebook users' news feeds. The company said Tuesday this enforcement led to censorship and a reduction in "civic content" that addresses political issues.
FactCheck.org's director, Lori Robertson, said the nonprofit's work had nothing to do with Facebook's own motivations to promote or suppress certain types of content.
"Our work isn't about censorship. We weren't advocating for posts to be taken down and we could not do that," Robertson said Tuesday. "We didn't do that. What Meta decided to do about removing things or demoting things — that was their decision. Facebook had no control over what we wrote or what we wrote about, but we would select claims that we wanted to write about."
FactCheck.org was founded in 2003 with a mission of reducing deception and confusion in U.S. politics. Journalists at the nonpartisan organization monitor political discourse in campaign ads, speeches and interviews, and on social media sites that facilitate the viral spread of misinformation. Other organizations that have partnered with Meta include PolitiFact, ABC News and Snopes.com.
Robertson said there had been rumblings that Meta might abandon its fact-checking program, but she didn't learn of the company's decision until Tuesday. The funding Meta provided helped expand FactCheck.org's existing work monitoring social media, but the end of the program won't change the organization's mission.
"It's going to reduce the reach of our work. We get asked by people all the time, 'Hey, I saw this on Facebook. Is this true?' And the labels — based on our work and that of other fact checkers — were helpful to people to give them more information on whether or not what they're seeing is correct," Robertson said. "It is going to be more incumbent on individual social media users to do some of their own fact checking and to stop when they see something suspicious or questionable before sharing."
At Penn's Annenberg School for Communication, sociologist Sandra González-Bailón has studied the ways misinformation spreads on Facebook and how it can impact elections. She was part of a team that analyzed more than 1 billion Facebook posts in the months before and after the 2020 election. The researchers found that misinformation could most often be traced to about 1% of users, primarily older conservatives, and that it tended to be shared directly between users rather than through pages and groups, which are watched more closely by Meta's content moderation systems.
"My reaction to these changes is that they have less to do with a real concern for free speech and more with an attempt to avoid being targeted by Trump's government," González-Bailón said. "Community notes may help — although the evidence of their effectiveness is scarce — but they would definitely be a more effective measure if combined with the work fact-checking organizations do."
González-Bailón pointed to multiple studies that underscore how misinformation is more abundant among conservative audiences. She said Meta's claims of political bias in fact checking do not address this imbalance or offer clarity on the company's own role in censoring content.
"Crucially, this change does nothing to address the real problem, which is the lack of transparency preventing the public from understanding how Meta makes decisions on which content to demote or render invisible, once labeled as problematic," González-Bailón said.
FactCheck.org reassured readers that its work will go on regardless of Facebook's policies.
"Our journalists will continue to provide nonpartisan coverage of false and misleading political claims, helping you to sort fact from fiction, just as we have done for more than 20 years," the organization said in a statement.
Robertson said she was surprised to see Meta frame its new strategy as a response to bias. Every organization in the program was required to be a verified signatory of the International Fact-Checking Network's code of principles, which demands nonpartisanship and transparency.
"These are journalistic organizations that adhere to ethical standards," she said.
González-Bailón suggested Meta is taking a risk with real consequences for the public. By removing fact checkers from its process and leaning on community notes, the company is turning away from people qualified to assess the legitimacy of social media content.
"Even after these changes, the power that companies like Meta have to shape the information landscape remains unchecked, given the lack of access to data for external researchers and journalists," she said. "Assuming that community notes will be able to solve the trade-off of free speech (and) healthy conversations without any data substantiating it is just that, an assumption."