Social media giant Meta announced on Tuesday that it is discontinuing its U.S. fact-checking program, replacing it with a community-driven system similar to that of X (formerly Twitter). The change marks a significant reversal in Meta’s content moderation approach: CEO Mark Zuckerberg had long defended active moderation, even as some conservatives accused the company of censorship.
New Appointments Align with Free Expression Push
The move follows recent high-profile appointments at Meta, including Joel Kaplan, a Republican policy executive, as the company’s global affairs head, and Dana White, CEO of Ultimate Fighting Championship, to its board. Both appointments align with Zuckerberg’s new emphasis on encouraging free expression across the company’s platforms, including Facebook, Instagram, and Threads, which collectively serve over 3 billion users worldwide.
“We’ve reached a point where it’s just too many mistakes and too much censorship. It’s time to get back to our roots around free expression,” Zuckerberg said in a video. “We’re going to focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms.”
Under the new system, Meta will phase out the independent fact-checking program it has run since 2016. The decision surprised some partner organizations, including Check Your Fact, whose managing editor, Jesse Stiller, expressed concern about the decision’s impact. Other partners, such as Reuters, AFP, and USA Today, have yet to comment on the shift.
The change will introduce Meta’s “Community Notes” feature in the U.S., which lets users add context to posts they believe are misleading, rather than relying on external fact-checkers. Meta emphasized that it will not control which Community Notes are applied to posts, leaving the community to self-regulate content accuracy.
The effectiveness of this new model remains uncertain, especially in light of the European Commission’s ongoing investigation of X over concerns about illegal content and information manipulation. Meta’s independent Oversight Board welcomed the changes, but the broader implications for content moderation remain to be seen.
In addition to these policy shifts, Meta plans to relocate its trust and safety teams to Texas and other U.S. locations, further decentralizing the company’s content oversight operations. The company stated that automated systems will now focus primarily on illegal content and high-severity violations, such as terrorism and drug-related posts.
Meta’s shift towards a community-based approach has sparked interest in the tech industry, with X CEO Linda Yaccarino calling it “a smart move” that she expects other platforms to follow.