
Facebook is under fire once again over the proliferation of vaccine misinformation on its platform, after Joe Biden said tech giants such as Facebook are “killing people” for failing to tackle the problem.
The White House has also zeroed in on the “disinformation dozen”: accounts that have been shown to be responsible for the bulk of anti-vaccine misinformation on social media platforms.
And while Facebook has defended itself, saying it has removed more than 18m pieces of Covid misinformation, experts who study online misinformation say it has still largely failed to address the issue and that falsehoods about the vaccine are still reaching millions of people.
“Facebook has repeatedly said it is going to take action, but in reality we have seen a piecemeal enforcement of its own community standards where some accounts are taken off Instagram but not Facebook and vice versa,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate (CCDH), the organization behind the “disinformation dozen” study cited by the White House. “There has been a systemic failure to address this.”
That report, published in March, identified 12 "superspreader" accounts. A Facebook spokesman said the company permanently bans pages, groups, and accounts that "repeatedly break our rules on Covid misinformation", including "more than a dozen pages, groups, and accounts from these individuals".
In the months since the study was released, the CCDH has confirmed that social platforms have taken action against members of the "dozen", removing 35 of their accounts across social media. The group has lost 41% of its followers – 5.8 million – but still retains 8.4 million followers in total across 62 active accounts.
