Facebook announced on Monday that it was stepping up efforts in favor of vaccination against the coronavirus, with more information on national campaigns and strengthened measures against false rumors spread by anti-vaccine groups.
The American social media giant wants to do more to “remove false claims on Facebook and Instagram about Covid-19, Covid-19 vaccines and vaccines in general during the pandemic,” the company said in a press release.
Facebook has therefore expanded its list of false claims that will no longer be tolerated in posts; they were already prohibited in advertisements.
The list now includes claims that Covid-19 is man-made, that vaccines are not effective, that it is less dangerous to catch the disease than to be vaccinated, and that vaccines are toxic or cause autism.
People who share this kind of disinformation could be banned, the California group warned.
Group admins have been told that posts from members with a history of spreading misinformation will require their approval before being shared.
And on Instagram, accounts that seek to discourage their followers from getting vaccinated will be made harder to find.
The company's platforms have for months been collaborating with major health organizations to highlight authoritative information on the health crisis, in particular through its “Covid-19 Information Center.”
“More than 2 billion people from 189 countries have been connected to reliable information” via this feature, the company says.
But Facebook critics are not convinced.
“Facebook has repeatedly promised to crack down on Covid and anti-vaccine disinformation for a year,” tweeted the Center for Countering Digital Hate, an NGO that combats online hate.
“Each time, they fail to meet their goals.”
The social network is also due to publish the results of a large survey on the pandemic, which collected 50 million responses from people sharing their opinions or experiences on subjects such as Covid-19 symptoms, mask-wearing and access to care.