Meta's decision to phase out its US-based fact-checkers comes at a difficult moment. Just days after CEO Mark Zuckerberg made the announcement, wildfires devastated Los Angeles, unleashing a flood of conspiracy theories online.
Meta's fact-checking partners, including organizations such as Lead Stories and PolitiFact, pushed back against the misinformation, debunking viral claims about looting and AI-generated images purporting to show the Hollywood sign engulfed in flames.
The falsehoods were not confined to Meta's services, as well-known figures amplified baseless stories. CNN reported that US President-elect Donald Trump and others used the disaster as a platform for highly politicized accusations: Elon Musk attributed the fires to "diversity policies," while Alex Jones peddled a "globalist plot." Like the conspiracy theories that circulated during the 2023 Maui fires, these claims eroded public confidence in emergency agencies.
Meta is replacing professional fact-checkers with a community-based system called Community Notes, in which users collaboratively flag posts and add context. Although the approach has seen some success, critics note that it lacks the strict ethical standards professional fact-checkers follow. Misleading posts often go unchallenged, allowing false narratives to spread.
Meta Copies Musk's 'Community Notes'
Experts fear that the absence of professional fact-checkers will worsen the spread of conspiracy theories across Meta's apps, especially Facebook.
Fact-checkers such as Lead Stories argue that countering factual errors with accurate information helps prevent public distrust during emergencies. Without them, platforms may struggle to address complex misinformation, leaving them vulnerable to manipulation.
The shift to Community Notes marks a significant change in Meta's content moderation approach, mirroring the system Musk adopted at X (formerly Twitter). X's Community Notes program has proven unable to effectively counter the US election misinformation rampant on the platform. A report by the Center for Countering Digital Hate, cited by AP News, found that 74% of misleading posts, including false claims that the 2020 election was stolen, were never corrected by the Community Notes feature, despite fact-checks being available. Worse, even when posts did carry notes, the incorrect posts received 13 times more views than their corresponding corrections.
While X's leadership claims the program maintains high standards, the report suggests that Community Notes only scratches the surface of disinformation on the platform — a problem Meta may soon face as well.