In a policy shift, Facebook is now removing groups and pages that discourage people from getting vaccines. As social media giants grapple with the anti-vaccine movement, specialist disinformation reporter Marianna Spring meets the everyday citizens battling conspiracy theories in their spare time.
The banner image shows a Photoshopped picture of Bill Gates with a crazed expression holding a needle decorated with a skull and crossbones. It looks like a Facebook group promoting anti-vaccine conspiracy theories.
But this group actually has very different intentions.
Richard is a builder, a trainee psychologist – and one half of the duo behind the decoy group. He says he aims not to spread bad information but rather to help people attracted to conspiracy theories.
His friend Dave – a pseudonym we have agreed to use because he fears abuse from anti-vaccine activists – believed in conspiracy theories for the best part of 20 years.
“If I was to actually create a group saying, ‘I’m going to re-educate you’… then I’m not going to get any takers,” he says.
“So I have to do it in a stealth way, which is a bit underhanded, I suppose. But the intentions are good.”
The group’s name references Bill Gates and completely unfounded conspiracy theories that the Microsoft founder is plotting to kill millions of people and control them with implanted microchips. And once people are drawn in, the two moderators try to reason with them, to bring them back to reality.
Richard and Dave are just two of the dozens of volunteers the BBC has spoken to who are combating online misinformation about vaccines. But are they doing work that should be Facebook’s responsibility?
The problem
Original research from BBC Monitoring has revealed how Facebook pages and groups promoting misleading and false claims about vaccines saw a significant rise in followers in several countries across the globe in the past year.
In Ukraine, pages sharing anti-vaccine content grew by 157% in 2020, reaching nearly 26,000 page likes, double the rate for the previous year. In Mexico, Brazil and India, similar pages grew by around 50% each in the past year – faster than in the two previous years.
It’s further proof of the spread of anti-vaccine content throughout the pandemic.
Previous research found a huge spike in followers of English-language social media accounts promoting anti-vaccine material during the pandemic, especially on Instagram and Facebook.
Although there is some overlap online, our research focused on extreme content – accounts and groups spreading false “genocide” and “implanted microchip” claims – rather than legitimate questions people have about safety and efficacy, and stories about rare cases of blood clots.
Volunteers fight back
It was the pandemic’s wave of anti-vaccine content that prompted Dave and Richard to embark on their plan.
“I was out of work,” Dave says. “So I wanted to do something constructive.”
Although the duo have only met in real life once, they now run multiple “honeypot” Facebook groups that have thousands of members from all over the world.
Inside the groups, the moderators allow people who believe in vaccine and Covid-19 conspiracy theories to post false and misleading articles.
Richard admits he’s conflicted about the deception.
“It was horrible having to lie to begin with,” he says.
After members initially joined the group, he says, the pair would observe what they shared, sometimes for weeks.
“And then it’d stop,” Richard says, “and we’d start questioning their narrative.”
Dave and Richard debunk myths and challenge people in comments under posts and via private message.
Dave uses his own personal experience of conspiracy theories to strike up a rapport with those in the group. He began to question his previous worldviews after he realised that the people promoting conspiracies were conning him. Their nightmare scenarios, he says, never seemed to come true.
“I just got tired of it,” he says, “I got tired of finding out about the next conspiracy, the next conspiracy and then looking back and thinking, well, this didn’t happen and that didn’t happen.”
Richard says some of his friends and family have been affected by online misinformation. He tries to engage with members of the groups to understand how they have fallen for falsehoods.
‘I might not be here’
One of those people was Brian. He was scared off vaccines by misleading posts on social media sites – including graphic videos promoting false claims about foetuses being used in jabs.
Brian’s encounters with the underbelly of social media coincided with an incredibly difficult personal time. Towards the end of 2019, he lost his job. He has multiple sclerosis, and around the same time, his condition started getting worse. Then the pandemic hit.
“I wasn’t in a good place,” he says, sitting on his leather sofa at home. He explains how he spent hours watching YouTube videos made by anti-vaccine activists.
But he also joined Dave and Richard’s Facebook group, thinking it was an anti-vaccine community. And that’s when things started to change.
“They sort of swung me round,” Brian says, “by sending me actual factual information.”
Richard talked with him about the personal difficulties that had left him vulnerable to the easy explanations of online pseudoscience. And he also explained how the algorithms of social media sites work to reel people in – with emotion, and by serving up content similar to that which the user has seen before.
Brian even credits Dave and Richard with saving his life. If he hadn’t encountered their group, Brian says, “I might not be here. I went to some dark places.”
But now, he says, things are looking up.
“I’m in a better place,” he says. “I’m in a proper home environment now, I’ve got rugrats running around my feet again.” His face lights up as he speaks of his grandchildren.
He’s also had a vaccine against Covid-19, having been completely opposed to it just a few months before.
‘Vaccine discouragement’
Richard blames social media sites – particularly Facebook – for failing to protect users like Brian.
“It needs to be policed a lot better,” he says. “And until they do, conspiracies are going to keep growing.”
In an interview, Facebook’s vice president for Northern Europe Steve Hatch acknowledged that the company has “a big responsibility to ensure people are seeing accurate information.”
Mr Hatch told the BBC that the company is now removing groups, pages and accounts that deliberately discourage people from taking vaccines, regardless of whether the information they share can be verified as false.
It’s a shift in policy. Previously Facebook would only delete groups – and accounts on Instagram, which it owns – filled with outright false vaccine information. Some of the groups that fall under the new policy may include material that is true or unverifiable, rather than outright false.
This includes Facebook groups with tens of thousands of members that have sprung up in recent months, dedicated to stories of people allegedly injured by Covid vaccines. We’ve seen several of these groups rapidly become popular. They’re often filled with scary stories, but just as frequently lacking in details and hard evidence.
Facebook says it has been combating misinformation, labelling more than 160 million pieces of misleading content since the start of the pandemic and connecting two billion people to information from trusted health authorities.
Policy dilemmas
Meanwhile, Richard and Dave’s honeypot group has been suspended – because while they say their intentions are good, the group does contain posts pushing falsehoods.
It’s a situation that highlights the moderation challenges faced by Facebook – where a group like this, which they say is dedicated to helping individuals, technically breaches the company’s rules.
Richard and Dave are appealing against the decision. And they plan to keep using their methods to help people like Brian – and others they have yet to reach.
Source: BBC