Small number of Facebook users responsible for most Covid vaccine doubts – report

A small subset of Facebook users is responsible for the majority of content expressing or encouraging doubts about Covid-19 vaccines, according to early results from an internal Facebook study.

The study, first reported by the Washington Post, confirms what researchers have long argued: that echo-chamber effects can amplify certain beliefs within social media communities. It also shows how content that stops short of outright vaccine misinformation, which is banned on Facebook, can still contribute to vaccine hesitancy.

A document outlining the study, which has not been made public, was obtained by the Washington Post. Facebook researchers divided users, groups and pages into 638 “population segments” and surveyed them for “vaccine hesitant beliefs”, according to the Post. Such beliefs could include language like “I am worried about getting the vaccine because it is so new” or “I don’t know if a vaccine is safe”, rather than outright misinformation.

Each segment could contain up to 3m people, meaning the study could monitor the activity of more than 1bn people – less than half of Facebook’s 2.8bn monthly active users, the Post said. The scale of the study also underscores how much information Facebook holds on its user base, and how the company can use that data to analyze public health outcomes.

The Post said the study found that in the population segment with the highest incidence of vaccine hesitancy, just 111 users were responsible for half of the vaccine-hesitant content identified within that segment. It also found that just 10 of the 638 population segments contained 50% of the vaccine-hesitant content on the platform.

Facebook’s research into vaccine hesitancy is part of an ongoing effort to support public health campaigns during the pandemic, spokeswoman Dani Lever said, and is one of several studies the company is conducting.

“We regularly study things like voting, bias, hate speech, nudity and Covid – to understand emerging trends so we can build, refine, and measure our products,” Lever said.

Over the past year, Facebook has partnered with more than 60 global health experts to provide accurate information about Covid-19 and vaccines. In December 2020 it announced that it would ban vaccine misinformation, warning users who break the rules and ultimately banning them if they continue to violate its policies.

The study is just the latest to reveal the outsized impact that a small number of actors can have on the online information ecosystem. It comes on the heels of another study, from the Election Integrity Partnership, which found that a handful of rightwing “super-spreaders” on social media were responsible for the majority of election misinformation in the run-up to the Capitol attack. In that report, experts laid out a number of recommendations, including the removal of super-spreader accounts entirely.

The Facebook study also found that there may be substantial overlap between users exhibiting anti-vaccine behavior on Facebook and supporters of QAnon, the unfounded conspiracy theory centered on a “deep state” cabal of Democrats and Hollywood celebrities supposedly engaged in pedophilia and sex trafficking.

The crossover underscores another long-term effect of the rise of QAnon, which has also been linked to the January insurrection at the Capitol. Many far-right actors, including QAnon adherents, understand how to manipulate social media algorithms to reach a wider audience, said Sophie Bjork-James, a professor of anthropology at Vanderbilt University who studies the white nationalist movement in the US.

“QAnon is now a public health threat,” said Bjork-James. “Over the past year, QAnon has spread widely within the online anti-vaccination community, and through it into other health communities. The Facebook study shows that we are likely to be dealing with this for some time to come.”
