Disturbed by the number of unvaccinated COVID-19 patients who showed up at his hospital, a French doctor logged on to Facebook and uploaded a video urging people to get vaccinated.
It was soon flooded with dozens, then hundreds, then more than 1,000 hateful messages from an anti-vaccine extremist group known as V_V. The group, active in France and Italy, harassed doctors and public health officials, vandalized government offices and tried to shut down vaccination clinics.
Alarmed by the abuse of its platform, Facebook removed several accounts linked to the group last December. But that hasn’t stopped V_V, which continues to use Facebook and other platforms and, like many anti-vaccine groups around the world, has expanded its portfolio to include climate change denial and anti-democratic messaging.
“Let’s go and get them at home, they don’t have to sleep anymore,” reads one post by the group. “Fight with us!” reads another.
The largely unchecked nature of the attacks on the well-established health benefits of vaccination highlights the limits of the social media industry’s ability to counter even the most destructive kinds of disinformation, particularly without sustained, aggressive effort.
Researchers at Reset, a UK-based nonprofit, have identified more than 15,000 abusive or misinformation-laden Facebook posts from V_V, activity that peaked in spring 2022, months after the platform announced its action against the organization. In a report on V_V’s activities, Reset researchers concluded that the group’s continued presence on Facebook raises “questions about the effectiveness and consistency of Meta’s self-reported intervention.”
Meta, the parent company of Facebook, responded that its 2021 action was never intended to delete all V_V content, but rather to remove accounts that engaged in coordinated harassment. After the Associated Press notified Facebook of the group’s continued activity on its platform, the company said it removed 100 more accounts this week.
Meta said it is trying to strike a balance: removing content from groups like V_V that clearly violates its rules against harassment or dangerous misinformation, without silencing innocent users. That balance can be particularly difficult to find when it comes to the contentious issue of vaccines.
“This is a highly adversarial space and our efforts are ongoing: since our initial removal, we have taken numerous actions against this network’s attempts to return,” a Meta spokesperson told the AP.
V_V is also active on Twitter, where Reset researchers found hundreds of accounts and thousands of posts from the group. Many of the accounts were created shortly after Facebook took action against the group last winter, Reset discovered.
In response to Reset’s report, Twitter said it took enforcement actions against several accounts linked to V_V, but did not detail those actions.
V_V has proved particularly resistant to efforts to stop it. Named for the movie “V for Vendetta,” in which a lone masked man seeks revenge on an authoritarian government, the group uses fake accounts to evade detection and often coordinates its messaging and activities on platforms such as Telegram, which lack Facebook’s more aggressive moderation policies.
That adaptability is one reason the group has been difficult to stop, according to Jack Stubbs, a researcher at Graphika, a data analytics firm that has monitored V_V’s activities.
“They understand how the Internet works,” Stubbs said.
Graphika estimated the group had 20,000 members at the end of 2021, with a smaller core involved in its online harassment efforts. Beyond Italy and France, Graphika’s team has found evidence that V_V is trying to establish chapters in Spain, the UK, Ireland, Brazil and Germany, where a similar anti-government movement known as Querdenken is active.
Groups and movements such as V_V and Querdenken have increasingly alarmed law enforcement and extremism researchers, who say there is evidence that far-right groups are exploiting skepticism about COVID-19 and vaccines to expand their reach.
Increasingly, such groups are shifting from online harassment to real-world action.
In April, for example, V_V used Telegram to announce a €10,000 bounty for vandals who sprayed the group’s symbol (two red Vs in a circle) on public buildings or vaccination clinics. The group then used Telegram to spread photos of the vandalism.
A month before Facebook took action against V_V, Italian police raided the homes of 17 anti-vaccine activists who had used Telegram to threaten government, medical and media figures over their perceived support for COVID-19 restrictions.
Social media companies have struggled to respond to a wave of vaccine disinformation since the start of the COVID-19 pandemic. Earlier this week, Facebook and Instagram suspended Children’s Health Defense, an influential anti-vaccine organization led by Robert Kennedy Jr.
One reason is the difficult balance between moderating harmful content and protecting freedom of expression, according to Joshua Tucker of New York University, who co-directs NYU’s Center for Social Media and Politics and is a senior adviser to Kroll, a technology, government and economics consulting firm.
Finding the right balance is especially important as social media has emerged as a key source of news and information around the world. Leave up too much malicious content, and users may be misinformed. Delete too much, and users will start to distrust the platform.
“It is dangerous for society if we move in a direction where no one feels they can trust information,” Tucker said.