
Facebook Groups and Their Role in Promoting Fake News

Misbar's Editorial Team
Technology
5th December 2021
Facebook groups are branded as reliable spaces (Getty).

“Disinformation runs on emotion,” noted researcher Nina Jankowicz, adding that groups on Facebook (now Meta) are “highly emotional spaces.”

Moreover, the sheer variety of Facebook groups has turned them into specialist hubs offering members information that is not always reliable, gradually trapping users in narrow echo chambers. In these echo chambers, or “rabbit holes,” users gather in one place and share content that reinforces their existing beliefs and biases without scrutiny, creating the perfect atmosphere for the spread of misleading news and information.

Once individuals create a Facebook account, their initial interactions on the platform shape the content they are likely to see later. To build this trajectory, Facebook recommends content similar to what users have already viewed. If someone frequently likes posts skeptical of the novel coronavirus (COVID-19), the friends, pages, and groups suggested to them will reflect that. The end result is that like-minded users join groups that support and reinforce their views, pushing them toward news and information that feeds their confirmation bias.
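The feedback loop described above can be sketched as a toy content-based recommender. This is purely illustrative: the group names, topic tags, and scoring rule are assumptions for the sake of the example, not Facebook's actual system.

```python
from collections import Counter

def recommend(history, candidates, k=2):
    """Suggest the k groups that best match a user's past engagement.

    Toy sketch of a content-based recommender (illustrative only).
    """
    # Count how often the user engaged with each topic tag.
    interests = Counter(tag for post in history for tag in post)

    # Score each candidate group by overlap with those interests.
    def score(group):
        return sum(interests[tag] for tag in group["tags"])

    # Recommend the highest-scoring groups: more of the same.
    return sorted(candidates, key=score, reverse=True)[:k]

# A user who mostly liked COVID-skeptic posts...
history = [{"covid-skeptic", "health"}, {"covid-skeptic"}, {"sports"}]
candidates = [
    {"name": "Vaccine Questions", "tags": {"covid-skeptic", "health"}},
    {"name": "Local Football", "tags": {"sports"}},
    {"name": "Gardening", "tags": {"plants"}},
]
# ...is steered first toward the group matching those interests.
top = recommend(history, candidates)
print([g["name"] for g in top])  # → ['Vaccine Questions', 'Local Football']
```

Because each recommendation is scored purely on similarity to past engagement, every click makes the next suggestion more homogeneous, which is the echo-chamber dynamic in miniature.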

The Private Space Trap

Private Facebook groups have recently been particularly active in spreading fake news and conspiracy theories, mobilizing thousands of people receptive to the circulated news and information.

Following the 2016 U.S. elections and the Cambridge Analytica data scandal, Facebook tried to make amends by prioritizing content from family and friends while limiting the reach of companies and marketing campaigns. Facebook groups were accordingly promoted as trusted spaces where people with similar interests could meet and chat. As Mark Zuckerberg explained in 2019, “many people prefer the intimacy of communicating one-on-one or with just a few friends.”

These developments further expanded the presence of Facebook groups as part of the company's campaign to create communities built on privacy and trust.

But this “privacy first” approach to groups is exactly why Facebook is in crisis now. Its policy of nominally creating privacy-oriented, trusted communities has had the unfortunate effect of letting a very small number of people shape the opinions of large audiences confined in private spaces, directing them to share misleading or fake content.

The same applies to group chats and one-on-one conversations on messaging apps, where those sharing misleading content can connect with one another and build a relationship of blind trust, especially when the shared content reinforces existing personal beliefs, fueling confirmation bias. Facebook groups eventually become polarized, and sometimes extremist, spaces.

The Pandemic of Misinformation

The COVID-19 pandemic has become one of the most critical stages in the history of misinformation, spawning what has been dubbed the “infodemic.” After the pandemic broke out at the beginning of 2020, a flood of information swamped the internet until it became difficult to distinguish reliable information from misinformation, or to trace where the misinformation originated.

The tech watchdog organization Media Matters for America found “284 active private and public Facebook Groups” with over half a million users “currently distributing vaccine misinformation.” The Media Matters study revealed that more than half of these groups were closed, i.e., they cannot be joined without an invitation, which limits specialists’ ability to refute the content shared there. This dilemma prompted U.S. President Joe Biden to declare in July 2021 that social networking sites are “killing people.”

The diversity of Facebook’s groups contributes to the social media giant’s growth. At the same time, it brings a set of challenges, especially in monitoring private and closed groups. These spaces, after all, can be hotbeds of misinformation, illegal weapons sales, or even human and sex trafficking.

What Is Facebook Doing?

Following Joe Biden’s comment that social media is killing people, referring primarily to how misinformation discourages people from following expert and government advice on COVID-19, Guy Rosen, Facebook’s vice president of integrity, said in a blog post that Facebook had removed “more than 18 million pieces of Covid-19 misinformation since the pandemic started and limited the spread of 167 million pieces its fact checkers judged as inaccurate.”

Rosen added that Facebook has given group admins some tools to help them get rid of false information. But what if group admins are themselves promoters of misleading and false claims?

Even if Facebook deletes some groups promoting false information, the platform will not be able to remove such information entirely, because detection largely depends on artificial intelligence algorithms. Nor can Facebook prevent people from joining such groups, or stop the spread and propagation of misleading content, especially since its algorithmic filters bring like-minded people together and let them exchange information and ideas. This affects members’ lives on multiple levels, from politics to the health of those individuals and everyone around them.

The answer does not lie in removing certain groups or posts. Instead, Facebook should improve transparency around private and closed groups based on multiple factors, including their type (public, health, or political), and after reviewing the groups’ descriptions as well as the profiles of those posting information.

Facebook should also inform users when groups and pages are operated by the same accounts. This would let users know whether coordinated efforts are underway to influence their political, religious, health, or other views.

Is Facebook Still a Public Space?

Ever since its onset as a public information network, the internet has been seen as a revolutionary tool against monopolized control of the public space, especially since official media outlets are now owned by financiers who dictate their publishing policies and standards. Editing and fact-checking information remain possible, but so does manipulating content through fake accounts and other, more complex methods. Such efforts have turned the internet into a limited but powerful space influencing decisions about countries’ economies, health, and other areas vital to the survival of the state as the ultimate sovereign. Individuals, decision makers, and corporations need to better understand how social media platforms operate to avoid falling prey to algorithmic traps and filters.

Misbar’s Resources

NPR

NBC News

WIRED

WIRED

Forbes

McAfee Institute

Politico

NPR

Medscape


Translated by Ahmed N. A. Almassri