
Meta Released New Feature to Curb Misinformation Within Facebook's Groups

Wesam Abo Marq
Technology
24th October 2022
Misinformation occasionally originates from Facebook groups (Getty)

On October 20, 2022, Meta released a statement announcing that the company had added new tools to Facebook to stop the spread of false information and encourage communication within groups.

A New Method for Dealing with False Information

According to Meta's data, most Facebook users participate in at least 15 active groups, and more than 100 million people join groups every day. Several changes were unveiled during the sixth annual Facebook Communities Summit to make it simpler for admins to manage their groups.

To make content more trustworthy for the public, Meta announced a new method to curb the spread of misinformation in Facebook groups.

Curbing misinformation is a constantly evolving challenge that Meta cannot solve on its own. The company relies on independent fact-checkers to verify and evaluate the veracity of news based on original reporting, which can involve primary-source interviews, publicly available data, and analysis of media, including images and videos.

Meta considerably slows the distribution of content that fact-checkers flag as false on the platform. The company also labels the content as such, notifies those who attempt to share it, and reduces the number of people who see it.

Group admins can now have posts containing information rated false by independent fact-checkers automatically moved to pending posts, so they can review them before deciding whether to delete them. This applies to posts identified as containing false information both before and after they are posted in a group.


Meta’s Effort to Promote Conversation

There are also initiatives to facilitate conversation. Facebook is testing a feature that lets admins approve posts that would otherwise be flagged for bullying, harassment, or hate speech. Through this test, a group admin could approve a flagged comment if it was not meant to be offensive. Only actively involved admins who have neither led a group that was removed nor broken a major policy rule will have access to this feature.


Facebook’s Failure to Handle Misinformation

Facebook was blamed for its major failure to restrict the dissemination of false information during the 2016 presidential election campaign. Furthermore, fact-checkers alerted Facebook in April 2020 about 59 accounts that disseminated false information about COVID-19; 31 of them were still in operation as of November 2021. More recently, Chinese state-run Facebook profiles have been disseminating false information in English about the conflict in Ukraine.

The modifications aim to increase positive interaction while acknowledging that Facebook's largest misinformation problems have occasionally originated from groups.

 

Misbar’s Sources

Meta

Meta

engadget.com

Scientific American