Social media firms make $1bn a year from anti-vax followers, report says
(Image: Conspiracy theorists at Hyde Park Corner, London, on 16 May 2020. Credit: Getty)
Social media platforms are making up to $1bn a year from people following anti-vaccine misinformation that could cause “tens of thousands” of coronavirus deaths, researchers say.
The Centre for Countering Digital Hate (CCDH) said the number of people viewing pages and posts claiming that a Covid-19 vaccine is unnecessary or would pose a health risk had risen dramatically during the pandemic.
Despite pledges by Facebook and others to crack down on harmful posts, a report found that at least 57 million users now follow anti-vaxxers on mainstream platforms across the UK and US – up 7.7 million since the start of the outbreak.
A YouGov poll suggested that almost one in five British adults say they would refuse the injection if it becomes available, and a further 15 per cent are unsure.
The research suggested that people who use social media more than mainstream news outlets as a source of information are more likely to reject a coronavirus vaccine.
“If 31 per cent of people do not take the vaccine we will not achieve herd immunity,” Imran Ahmed, chief executive of the CCDH, told The Independent.
“If we don’t reach that we will not be able to contain the virus. We will have waves and waves of years and years and tens of thousands of people will die.”
Estimates of the proportion of people who must be immune to coronavirus, either through prior infection or vaccination, for herd immunity to take effect vary between 55 per cent and 82 per cent.
High vaccine coverage has largely eliminated many potentially fatal diseases from the UK, but some – like measles – have seen a resurgence when uptake has fallen.
Studies have linked the trend to activism by online anti-vaxxers, who oppose immunisation because they believe it is either unnecessary, risky or harmful.
Mr Ahmed said social media firms had “both powered and profited from” the spread of anti-vax theories, by amplifying them through algorithms and making money from advertising.
“It’s an ideological dirty bomb that will spread toxic misinformation throughout a population,” he added.
“If social media companies do not react in the right way, the government needs to take action.”
The report called for firms to be fined if they fail to combat anti-vaxxers, but a new regulatory system proposed under the UK’s Online Harms Bill has not yet been considered by parliament.
The CCDH said that changing medical advice during the coronavirus pandemic had given anti-vaxxers an “opportunity to exploit” by seeking to discredit official health bodies.
Its analysis of more than 400 anti-vax Facebook groups and pages, YouTube channels and Twitter and Instagram accounts found that some were selling fake cures for coronavirus and telling people not to seek medical treatment for symptoms.
Private Facebook groups were found to be “radicalising sceptics into determined anti-vaxxers” by spreading emotional posts about supposedly ill or cured children without challenge.
The CCDH documented one case where a mother said she feared her two-year-old daughter, with a fever and cough, had coronavirus but was told by members to put lemon and onions on the child’s body rather than going to a doctor.
Other pages were selling substances they claimed would cure or prevent coronavirus, including silver, hormones and plant extracts.
Some anti-vaxxers have moved into wider conspiracy theories, including those claiming that Microsoft founder Bill Gates created the pandemic, that vaccines cause Covid-19, and that tests for a coronavirus jab had made women infertile.
Mr Ahmed said that the coronavirus outbreak had created a “perfect storm” for people to be drawn into anti-vax movements, as fear and confusion combined with increased isolation and time spent online.
“There are some new groups springing up and there are more people coming into those spaces,” he added.
“We also know that existing groups are very opportunistic and will pick up on any opportunity to draw people into their radical worldview.”
The report found that both anti-vax campaigners and social media companies were profiting from increased interest during coronavirus.
It said that through all their online activity, the followers of anti-vax accounts could be worth up to $1bn (£800m) in annual revenue for technology giants.
Researchers estimated that Facebook and Instagram could earn up to $989m (£792m) in revenue from these followers, Twitter $5.6m (£4.5m) and YouTube $797,000 (£634,000).
The report said that Facebook’s advertising library also showed that at least 28 anti-vax accounts had paid to place adverts on the platform, despite the firm pledging in March 2019 that “when we find ads that include misinformation about vaccinations, we will reject them”.
YouTube has removed advertising from some anti-vax channels but not all, the report said, after it vowed to demonetise them in February last year.
Researchers found that Twitter’s advertising platform made it possible to target users with anti-vaccine conspiracy theories.
The report accused social media giants of adopting “lenient” policies by making it harder to find anti-vax content rather than removing it, allowing activists to “successfully navigate weak policies to exploit the new opportunities coronavirus has presented”.
Mr Ahmed said that long-standing anti-vax campaigners had “instrumentalised” the prospect of a Covid-19 vaccine after trying to claim that track and trace schemes were a cover for secretive monitoring.
“There are lots of decent, well-meaning people who fall for conspiracy theories because they’re scared and confused, just as so many of us are,” he added.
“But I would go beyond calling anti-vaxxers conspiracy theorists to say they are an extremist group that pose a national security risk.”
Mr Ahmed said the movement “transcended left and right-wing boundaries” but warned: “Once someone has been exposed to one type of conspiracy it’s easy to lead them down a path where they embrace more radical world views that can lead to violent extremism.”
A Facebook spokesperson said: “We are working to stop harmful misinformation from spreading on our platforms and have removed hundreds of thousands of pieces of Covid-19-related misinformation.
“We reduce vaccine misinformation in News Feed, we don’t show it in search results or recommend it on Facebook or Instagram, we don’t allow it in ads, and we connect people with authoritative information from recognised health experts.”
A spokesperson for Twitter said anyone searching for vaccine-related information is directed to the NHS, and that its advertising policy forbids misleading medical claims.
It added: “Twitter’s top priority is protecting the health of the public conversation and surfacing authoritative public health information.”
YouTube said all six channels featured in the report had now been demonetised.
“We’ve taken a number of steps to address misinformation including surfacing more authoritative content across our site for people searching for vaccination-related topics, beginning to reduce recommendations of certain anti-vaccination videos and showing information panels with more sources where they can fact check information for themselves,” a statement added.
A government spokesperson said: “Since the start of the pandemic, specialist government units have been working at pace to identify and rebut false information about coronavirus. We are also working closely with social media platforms to help them remove incorrect claims about the virus that could endanger people’s health.
“We are developing our world-leading plans to put a duty of care on online platforms towards their users and will introduce legislation as soon as possible.”