
Facebook Shifts Content Moderation to Its Users. Are You Ready?

Meta would like to introduce its next fact-checker – one who will spot falsehoods, pen convincing corrections and warn others about misleading content.

It’s you.

Mark Zuckerberg, Meta’s chief executive, announced on Tuesday that he was ending many of the company’s moderation efforts, such as third-party fact-checking and content restrictions. Instead, he said, the company will delegate fact-checking duties to everyday users under a model called Community Notes, which was popularized by X and allows users to append a fact-check or correction to social media posts.

The announcement marks the end of an era in content moderation and an embrace of looser guidelines that even Mr. Zuckerberg acknowledged would increase the amount of false and misleading content on the world’s largest social network.

“I think it will be a spectacular failure,” said Alex Mahadevan, director of MediaWise, a media literacy program at the Poynter Institute, who has studied Community Notes on X. “The platforms are now not responsible for anything that is said. They can put the burden on the users themselves.”

Such a turnaround would have been unthinkable after the 2016 or 2020 presidential elections, when social media companies saw themselves as unlikely heroes on the front lines of the disinformation war. The lies that spread during the 2016 presidential election caused a public backlash and an internal debate at social media companies about their role in spreading so-called fake news.

Companies have responded by pouring millions into content moderation efforts, paying third-party fact-checkers, creating sophisticated algorithms to limit toxic content and issuing dozens of warning labels to reduce the spread of falsehoods — steps seen as necessary to restore public trust.

The efforts worked, up to a point – researchers found that fact-checking labels reduced belief in falsehoods, although they were less effective for conservative Americans. But the efforts also made the platforms – and Mr. Zuckerberg in particular – political targets of President-elect Donald J. Trump and his allies, who argued that content moderation amounted to censorship.

Now, the political situation has changed. As Mr. Trump prepares to take control of the White House and the regulatory bodies that oversee Meta, Mr. Zuckerberg has tried to mend his relationship with Mr. Trump – dining at Mar-a-Lago, adding a Trump supporter to Meta’s board of directors and donating $1 million to Mr. Trump’s inaugural fund.

“The recent election also feels like a cultural moment to re-prioritize speech,” Mr. Zuckerberg said in a video announcing the moderation changes.

Mr. Zuckerberg’s bet on Community Notes to replace professional fact-checkers was inspired by a similar experiment on X, where Elon Musk, its billionaire owner, turned the company’s fact-checking over to its users.

X now asks everyday users to spot falsehoods and write corrections or add context to social media posts. The exact details of Meta’s program are unknown, but on X, notes are at first visible only to users who have signed up for the Community Notes program. Once a note receives enough votes rating it as helpful, it is appended to the social media post for everyone to see.

“The social media dream is a moderation system that they, one, don’t have to take responsibility for and, two, don’t have to pay anyone for,” said Mr. Mahadevan, the director of MediaWise. “So Community Notes is an absolute dream for these people – they’ve essentially tried to build a system that automates fact-checking.”

Mr. Musk, another Trump supporter, was an early champion of Community Notes. He quickly expanded the program after firing most of the company’s trust and safety team.

Research has shown that Community Notes can be effective at dispelling certain viral falsehoods. The approach works best on topics where there is broad consensus, researchers found, such as misinformation about Covid vaccines.

On such topics, the notes “emerge as a novel solution” for steering users back to accurate and reliable health information, said John W. Ayers, vice chief of innovation in the division of infectious diseases and global public health at the University of California, San Diego, School of Medicine, who wrote a report on the topic in April.

But users with differing political views must agree on a fact-check before it is publicly appended to a post, meaning that misleading posts about politically divisive topics often go unchecked. MediaWise found that fewer than 10 percent of the Community Notes drafted by users ended up being published on the offending posts. The numbers are even lower for sensitive topics like immigration and abortion.
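The cross-viewpoint agreement requirement described above can be sketched as a toy rule. This is an illustrative simplification only, not X’s actual scoring algorithm (which scores ratings with a more sophisticated statistical model); the function name, viewpoint labels and thresholds here are all invented for illustration:

```python
# Illustrative sketch (NOT X's real implementation) of the "bridging" idea
# behind Community Notes: a note is published only when raters who usually
# disagree with one another both find it helpful.

def note_is_published(ratings, min_ratings=5, threshold=0.66):
    """ratings: list of (viewpoint, helpful) pairs, e.g. ('left', True).
    The note goes public only if raters from at least two viewpoints
    each rate it helpful at a sufficient rate (thresholds are invented)."""
    if len(ratings) < min_ratings:
        return False  # not enough votes yet
    by_view = {}
    for viewpoint, helpful in ratings:
        by_view.setdefault(viewpoint, []).append(helpful)
    if len(by_view) < 2:
        return False  # no cross-viewpoint agreement possible
    # Every viewpoint group must independently clear the helpfulness bar.
    return all(sum(votes) / len(votes) >= threshold
               for votes in by_view.values())

# A note rated helpful across the divide is published...
print(note_is_published([("left", True), ("left", True), ("right", True),
                         ("right", True), ("right", False)]))   # True
# ...but one rated helpful by only one side is not.
print(note_is_published([("left", True), ("left", True), ("left", True),
                         ("left", True), ("right", False)]))    # False
```

This also illustrates why divisive topics often go unchecked, as the article notes: when one side never rates a note helpful, the note never crosses the publication bar, no matter how many ratings it gets from the other side.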

Researchers have found that most posts on X get the bulk of their traffic in the first few hours, but it can take days for a Community Note to be approved for everyone to see.

Since its launch in 2021, the program has attracted interest beyond X. YouTube announced last year that it was starting a pilot program allowing users to submit notes that would appear below misleading videos. The helpfulness of those notes is evaluated by third-party raters, YouTube said in a blog post.

Meta’s existing moderation tools had seemed overwhelmed by the flood of false and misleading content, but researchers regarded the interventions as reasonably effective. A study published last year in the journal Nature Human Behavior showed that warning labels, like those Facebook used to flag false information, reduced belief in falsehoods by 28 percent and cut how often the content was shared by 25 percent. Researchers found that right-leaning users were far more distrustful of fact-checks, but that the interventions were still effective at reducing their belief in false content.

“All the research shows that the more friction there is on a platform, the less low-quality information spreads,” said Claire Wardle, associate professor of communication at Cornell University.

Researchers believe that community fact-checking works best when paired with in-house content moderation efforts. Meta’s hands-off approach, by contrast, could prove risky.

“The community-based approach is part of the puzzle,” said Valerie Wirtschafter, a fellow at the Brookings Institution who has studied Community Notes. “But it can’t be the only thing, and it can’t just be thrown out as a crude, blanket solution.”

