In recent days, a screenshot has circulated on social media showing a warning label attached to a video by the Palestinian activist and photographer Saleh al-Jaafarawi, stating that its content is false. Those sharing the image claimed that France 24 had filed a complaint about the video, while others commented that it was a case of "international agencies colluding to serve Zionism."
After investigating, Misbar found that the message displayed below Saleh al-Jaafarawi's video belongs to Meta's program for combating false information, run in partnership with independent fact-checkers accredited by the International Fact-Checking Network. Under the program, partner organizations review and rate claims, and Meta then takes action on the associated content, whether by removing it, restricting its distribution, or sending a warning notice to the publisher.
In a blog post published on its website, Meta explains: "Every time a fact-checking entity rates content on our platforms as false, we significantly limit its distribution so that fewer people see it. We label it accordingly and notify people attempting to share it."
Meta adds, "Fact-checking entities do not remove content, accounts, or pages from our applications. We remove content when it violates our community standards, which are separate from the fact-checking program."
Al-Jaafarawi’s Video Was Published on October 23
Examining the original video cited by the accusers, Misbar found that Saleh al-Jaafarawi shared it on his Instagram account on October 23. In the video, he appears holding a newborn baby who had been found after the bombing of the baby's grandfather's house; at the time, none of the baby's family members had been identified.
Red stains resembling blood appear on Al-Jaafarawi's shirt, and there are also bloodstains on the face of the newborn he is holding.
At the time of its review, Misbar found no warning label on the video. It also found that many of the clips Al-Jaafarawi posted as Instagram reels carry restrictions because the platform's policy classifies their content as violent.
A France 24 Investigation Uncovers an Indian Network Leading a Campaign Against Palestinians
Upon examining the label attached to the claim, Misbar noted that it refers to a report by the Observers unit at France 24 about a disinformation campaign launched by users in India to target and discredit Palestinians, not to a complaint against Al-Jaafarawi's content.
The France 24 report debunked the campaign, in which Indian activists falsely claimed that Palestinians were using cosmetics to fake injuries from Israeli army shelling; in reality, the injuries were either old or came from a film set.
However, the report did not debunk Al-Jaafarawi's video showing bloodstains on his shirt as he held the baby.
One example highlighted in the France 24 report, and featured on its cover, was a picture used by the deceptive Indian campaign showing a young man in a hospital, falsely claimed to be Saleh Al-Jaafarawi faking his injuries. An earlier Misbar investigation revealed that the image actually shows Mohamed Zandeeq, who was injured during the raid on the Nour Al-Shams camp in Tulkarm, in the West Bank, on July 24 of the previous year, meaning the image is old and comes from a different context.
Meta Uses Artificial Intelligence To Detect Misinformation
Meta relies on artificial intelligence to detect claims and match them against previously debunked ones. It developed SimSearchNet++, an artificial intelligence model trained with self-supervised learning to recognize near-duplicate variations of images circulating on Facebook and Instagram. Such automated matching may explain how Al-Jaafarawi's video was confused with the claim debunked by France 24.
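To illustrate the general idea of near-duplicate image matching, not Meta's proprietary system: such systems typically map each image to a feature vector and flag pairs whose vectors are very close. The minimal Python sketch below uses a pretrained ResNet-50 from torchvision as a stand-in encoder, since SimSearchNet++ itself is not public; the model choice, the 0.9 similarity threshold, and the file names are illustrative assumptions.

```python
# Illustrative sketch of embedding-based near-duplicate image matching.
# NOTE: SimSearchNet++ is not public; a pretrained ResNet-50 stands in
# as the encoder, and the 0.9 threshold is an arbitrary assumption.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load a pretrained encoder and drop its final classification layer,
# keeping only the feature-extraction backbone.
resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
encoder = torch.nn.Sequential(*list(resnet.children())[:-1]).eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path: str) -> torch.Tensor:
    """Map an image file to a unit-length feature vector."""
    img = Image.open(path).convert("RGB")
    with torch.no_grad():
        feats = encoder(preprocess(img).unsqueeze(0)).flatten()
    return feats / feats.norm()

def is_near_duplicate(path_a: str, path_b: str, threshold: float = 0.9) -> bool:
    """Flag two images as near-duplicates if the cosine similarity of
    their embeddings exceeds the (assumed) threshold."""
    return torch.dot(embed(path_a), embed(path_b)).item() >= threshold

# Hypothetical usage: compare a frame from a newly posted video
# against an image from an already fact-checked claim.
# print(is_near_duplicate("new_frame.jpg", "debunked_claim.jpg"))
```

A system built this way can mislabel content: two visually similar but unrelated images, say, different people with bloodstained clothing in a hospital setting, can land close together in embedding space and trigger a false match, which is consistent with the kind of confusion described above.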
It is worth noting that Meta has previously sparked controversy over its adoption of fact-checkers' ratings for content deemed false or inaccurate. One notable example is the label applied to an investigation by The BMJ, the British medical journal, into questionable research practices in clinical trials run by the company Ventavia. The investigation was flagged as inaccurate despite the journal being one of the most respected peer-reviewed medical journals.
Facebook posts sharing the journal's investigation were labeled "inaccurate" or "missing context" on the basis of a fact-check by the verification platform Lead Stories. In that fact-check, Lead Stories stated that The BMJ's investigation had flaws and promoted conspiracy theories, without providing evidence to support those claims.