
This past Wednesday, Facebook announced that between April and June it removed 20 million posts containing COVID-19 misinformation. The platform also confirmed that warning labels had been added to more than 190 million COVID-19-related posts. The data was released as part of the platform's Community Standards Enforcement Report, which, beginning with this release, will be accompanied each quarter by a new Widely Viewed Content Report.

Also on Wednesday, according to First Draft, the misinformation-tracking organization to which Ark Valley Voice belongs, Facebook outlined in a blog post the steps it is taking against vaccine misinformation "superspreaders."

"They included removing over three dozen Pages, Groups and Facebook or Instagram accounts linked to the 'Disinformation Dozen,' even as it shot back against the report from the Center for Countering Digital Hate that said 12 individuals were responsible for 65 percent of vaccine misinformation on Facebook. (First Draft has also pointed out some limitations of the CCDH report, which had been cited by the White House and the U.S. Surgeon General amid the government's push against COVID-19-related misinformation.)"

A new study from the Pew Research Center finds that almost half of the country (48 percent) thinks the government should be taking steps to curb false information, up from 39 percent in 2018. Correspondingly, the percentage of Americans who think "the freedom to publish and access information" must be protected, even if that information is false, has dropped from 58 percent in 2019 to 50 percent. (Pew Research Center)

Facebook CEO Mark Zuckerberg. Image courtesy of WIBW.

Facebook's efforts may look like big steps, but the disinformation problem is mounting faster than the posts are being removed, and Facebook isn't necessarily transparent about its progress. Earlier this spring, First Draft informed us of Facebook's decision to suspend the accounts of researchers from New York University's Ad Observatory. Then last month we learned that in April, Facebook had gutted CrowdTangle, a tool used by misinformation experts.

"That's in large part because Facebook isn't giving researchers enough of the real-time data they need to figure out exactly how much COVID-19 misinformation is on the platform, who's seeing it, and how it's impacting their willingness to get vaccinated," wrote Shirin Ghaffary for Vox's Recode.

If experts can't quantify the full extent of COVID-19 misinformation on the platform, then its real-world impact can't be measured either. According to First Draft, "a July report from the COVID States Project — a collaborative effort by researchers from several universities — found that Facebook news consumers were less likely to get vaccinated than Fox News audiences. With cases, hospitalization, and deaths surging in many parts of the US, the benefits of vaccination — and the harm from anti-vaccine content — are hard to overstate."