
A Study Examines the Spread of Misinformation in the 2020 U.S. Presidential Elections

Khadija Boufous
Politics
15th December 2024
Content moderation reduced misinformation during the U.S. elections

As exposure to legacy media declined, social media gained a larger audience, particularly during critical periods like the COVID-19 lockdown, when the internet and social media platforms were the primary means for people worldwide to stay connected. Over time, social media became a key public space for global social and political debates, especially during election periods, when these platforms were used not only to discuss, share, and promote political views but also to spread deception and propaganda.

The Spread of Misinformation on Facebook During the 2020 U.S. Elections

Amid growing concerns over social media's role in spreading misinformation during the 2020 U.S. presidential elections, particularly regarding Facebook and X (formerly Twitter), a new study explores how misinformation spread on Facebook during this period, focusing on the interaction between sharing features and changing content moderation policies.

A recent study from Northeastern University, published in the journal Sociological Science, found that misinformation spread differently from other content on Facebook during the 2020 U.S. elections, driven by successive peer-to-peer resharing from a small number of users who encountered it through misinformation-spreading pages and groups.


To explain how misinformation labeled "false" by third-party fact-checkers spread on Facebook, researchers examined the diffusion trees of approximately 1 billion posts that were reshared at least once by U.S.-based adults on the platform between July 1, 2020, and February 1, 2021, and mapped out their distribution networks.

The Key Findings

The paper found that most content on Facebook is shared through pages and, to a lesser extent, groups. The resulting distribution network resembles a "tree": initially broad but relatively short. The paper also highlights that misinformation spreads more slowly than non-misinformation, relying on a small group of active users who propagate it through long chains of peer-to-peer sharing, ultimately reaching millions. In contrast, non-misinformation spreads primarily through one-to-many features, especially pages.
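To illustrate what this difference in tree shape means in practice, below is a minimal, hypothetical sketch (not the study's code or data) of how the depth and breadth of a reshare diffusion tree might be measured: a shallow, broad tree points to broadcast-style spread from a page, while a deep tree points to long peer-to-peer resharing chains. The node names and example tree are invented for illustration.

```python
from collections import deque

# Hypothetical diffusion tree: parent -> list of direct resharers.
# The root is the original post; "u1"..."u6" stand in for individual users.
example_tree = {
    "page_post": ["u1", "u2", "u3", "u4"],  # broad first level (one-to-many spread)
    "u1": ["u5"],
    "u5": ["u6"],                           # a longer peer-to-peer chain
}

def tree_metrics(tree, root):
    """Return (depth, max_breadth) of a diffusion tree via level-by-level traversal."""
    depth, max_breadth = 0, 1
    level = [root]
    while level:
        # Collect all reshares made directly from posts at the current level.
        next_level = [child for node in level for child in tree.get(node, [])]
        if next_level:
            depth += 1
            max_breadth = max(max_breadth, len(next_level))
        level = next_level
    return depth, max_breadth

print(tree_metrics(example_tree, "page_post"))  # -> (3, 4): broad at the top, with one deep chain
```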

A key finding of this study is that the significant role of peer-to-peer misinformation spread can be attributed to a gap in enforcement within content moderation policies, which were primarily focused on pages and groups rather than individual users.

The researchers also discovered that periods of intense content moderation close to the election were associated with a significant decrease in the spread and reach of misinformation and political content.

The findings support two key points: First, Facebook primarily facilitates broadcast-style exposure, with pages driving the reach of content. Second, misinformation deviates from this trend, spreading more virally through a small group of older, more conservative users. Contrary to previous research, this paper shows that while misinformation spreads more slowly and garners fewer overall views, it still reaches millions.

The study’s results highlight the significant impact of content moderation on diffusion and virality patterns, an aspect the paper considers “overlooked” in previous research analyzing the full range of posts. The researchers provide clear evidence that diffusion dynamics are shaped by platform features, such as pages, and by shifting content moderation policies: because enforcement targeted misinformation mainly on pages and groups, peer-to-peer sharing was able to bypass moderation.

In addition, the study shows that trends observed on average do not hold during specific periods of increased vulnerability. While the observational data does not prove direct causality, the paper suggests that platform features and weak moderation at key moments played an important role in shaping the spread of misinformation.

The 2024 U.S. Elections

During the 2024 U.S. elections, as part of its efforts to monitor and address election-related misinformation and disinformation, Misbar highlighted a concerning trend: right-wing groups played an increasingly significant role in spreading misinformation. This rise was fueled both by technological advancements and by the growing influence of populist and far-right movements globally.

In the midst of these developments, moderate political movements grappled with significant challenges, balancing the need to maintain transparency with the urgent task of combating the rising tide of misinformation in the digital realm.

Media reports noted that while misinformation circulates on both sides of the political spectrum, it is more widespread and harmful when it comes from right-wing sources, and that individuals on the political right are more likely to share false news than those on the left. Additionally, research from Northeastern University indicated that Democrats are generally better than Republicans at distinguishing true news from false.

Misbar focused its efforts on monitoring mis- and disinformation related to the 2024 U.S. elections, with the team working to debunk false claims, including unrelated videos presented as election footage.

Video of Trump Peeking at His Wife’s Ballot Is Unrelated to the 2024 U.S. Election

Read More

Study Shows That Most Americans Are Concerned With Fake News in the 2024 Elections

Truth, Lies, and Democratic Discourse
