Facebook and YouTube are losing the Covid-19 vaccine misinformation fight


Social media companies like Facebook and YouTube have updated their policies against false coronavirus information to ban false claims about Covid-19 vaccines. But as the vaccine begins to roll out, online accounts are taking advantage of gaps in these new policies and successfully sharing false claims that seek to sow doubt about vaccination.

Throughout the pandemic, platforms have established and updated rules meant to stop fraudulent Covid-19-related claims. Between March and October, Facebook removed 12 million pieces of content on Facebook and Instagram, and added fact-checking labels to a further 167 million posts. But the authorized release of Covid-19 vaccines has forced social media companies to adapt once again, changing their approach to both Covid-19 misinformation and long-standing anti-vaccine content.

There are already plenty of examples of online content that casts doubt on Covid-19 vaccines. There are posts suggesting the vaccines are part of a government scheme, and memes implying that the vaccine comes with severe side effects, which either aren't caught by the platforms or don't appear to break their rules.

The platforms aren't just up against anti-vaccination communities. Conspiracy theorists, some conservative groups, fringe outlets, and others are actively stoking fears about the vaccines, according to Yonder, a company that advises firms involved in vaccine development. Although recent surveys show that the number of Americans willing to get vaccinated has grown, to about 70 percent according to the Kaiser Family Foundation, there are still millions of Americans unwilling to take the vaccine, and many others may not accept it right away.

Facebook has vowed to remove false Covid-19 vaccine claims that could cause physical harm, and YouTube has said it will take down videos about Covid-19 vaccines that contradict health authorities such as the World Health Organization. Twitter is taking a two-pronged approach: removing the Covid-19 misinformation it considers most harmful, and labeling claims that are simply misleading.

But overall, these approaches so far appear to focus on removing outright misinformation rather than tackling the wider range of vaccine fears and suspicions, a murkier problem that can be much more complicated to deal with.

While platforms have designed new policies to prevent misinformation, they do not always find and remove all content that violates their rules. In a search of Facebook, YouTube, and Twitter, Recode found plenty of vaccine misinformation that had not been removed or labeled as such.

On Facebook, Recode identified a number of posts that were only taken down after we flagged them. Some of those removed claimed the pandemic was planned or that the vaccine would contain a microchip, a claim that is specifically prohibited under Facebook's rules. Facebook also took down a meme that jokingly implied the vaccine comes with severe side effects. The image had already been shared more than 100,000 times before Facebook removed it.

This meme, which implied the vaccine has serious side effects, is no longer available on Facebook.
Screenshot from Facebook

Other posts identified by Recode that appeared to violate company rules include one Facebook post stating that the Covid-19 vaccine will "change your DNA" and "attack the uterus." It linked to a YouTube video discussing the "Plandemic" conspiracy theory and Bill Gates. The post was shared in a Facebook group with more than 12,000 members, and the video was viewed more than 15,000 times on YouTube. Similarly, in a public Facebook group with 50,000 members, a post claimed that the Covid-19 vaccines were part of an effort to keep people from "ascending into the spiritual beings we were supposed to be."

While YouTube has vowed to root out Covid-19 vaccine misinformation, Recode found a range of easily discovered content on the platform that appeared to violate these policies, including videos suggesting that the Covid-19 vaccine alters human DNA or that the vaccine is a ploy to kill elderly people in nursing homes. YouTube took down one video identified by Recode that suggested the vaccine could be a "mark of the Beast" linked to the end times in the Book of Revelation.

Media Matters has found that, despite YouTube's policies, videos suggesting the Covid-19 vaccine contains a microchip had received more than 400,000 views, and some even had ads running on them. Meanwhile, Sam Clark of the YouTube watchdog Transparency Tube says there are plenty of channels known for pushing conspiracy theories that are now posting about vaccines.

Twitter will begin enforcing its new policies against Covid-19 vaccine misinformation starting December 21, and research shows the problem is already growing. November saw the biggest rise this year in retweets of vaccine misinformation on Twitter, according to VineSight, a company that tracks misinformation.

Individual posts on these platforms don't usually get much engagement, but in aggregate they can gain significant traction and can even spread to other platforms. According to data from Zignal Labs, between December 8 and 14 there were nearly 30,000 mentions of alleged Chinese Communist Party involvement in the vaccines and nearly 90,000 mentions of Bell's palsy, an often temporary condition that causes part of a person's face to droop. After four participants in a Moderna vaccine trial developed the condition, the FDA said people should watch for signs of Bell's palsy, but noted there is not enough information to link the condition to the vaccine.

At the same time, much of the content that casts doubt on Covid-19 vaccines avoids making factual claims and is not removed. In an Instagram post, for example, conservative commentator Candace Owens called people who will get the vaccine "sheep." The video was labeled by Facebook but was still viewed more than 2 million times.

Also of concern are posts making false claims about compulsory vaccination, which the US government is not considering. Research from Zignal Labs found that between December 8 and 14, there were more than 40,000 mentions of compulsory vaccination on the platforms it tracks.

“Really, they’re fighting a ghost. They are fighting a boogeyman,” notes David Broniatowski, who studies behavioral epidemiology at George Washington University. “There is no one out there who says we are going to make a law mandating the Covid vaccine.”

These posts are not straightforward misinformation, and often they make no claims about the vaccine itself. Still, they weaken confidence in vaccination by stoking fears of government control, politicizing the vaccine, or raising doubts about the science behind it.

“Someone says, ‘Do you know what’s in the Covid vaccine?’ And they just leave it at that. That’s not misinformation,” said Broniatowski. “But it certainly erodes trust in the vaccine.”

This ambiguity makes the task of moderating what is allowed on sites like Facebook and YouTube very difficult. These platforms do not want to be accused of amplifying anti-vaccine content, but drawing lines around content that mixes debate, humor, opinion, and fact about Covid-19 vaccines is tricky, especially since we are still learning more about the vaccines themselves. At the same time, public health experts have stressed that people should have a place to ask questions about vaccines.

Notably, these platforms also use strategies beyond removal, such as attaching labels and promoting accurate information from health authorities. But the broader concern is that Facebook, Twitter, and YouTube policies could contribute to the problem of vaccine hesitancy not only through how they police misinformation but also through how they address these gray areas. So while the public may pressure platforms to take down egregious content, what they leave up matters just as much.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
