Coronavirus: False claims viewed by millions on YouTube
More than a quarter of the most-viewed coronavirus videos on YouTube contain "misleading or inaccurate information," a study suggests.
In total, the misleading videos have been viewed more than 62 million times.
Among the false claims was the idea that pharmaceutical companies already have a coronavirus vaccine but refuse to sell it.
YouTube said it was committed to reducing the spread of harmful misinformation on the platform.
The researchers noted that "accurate and good quality information" had been uploaded to YouTube by government agencies and health experts.
But they said such videos were often hard to understand and lacked the popular appeal of YouTube stars and vloggers.
The study, published online by BMJ Global Health, examined the most-viewed English-language coronavirus videos as of 21 March.
After excluding duplicates, videos longer than an hour and videos without relevant audio or visual material, the researchers were left with 69 to analyze.
The videos were assessed on whether they presented factual information about how the virus spreads, its symptoms, prevention and potential treatments.
Government agency videos scored significantly better than other sources, but were viewed less widely.
Of the 19 videos found to include disinformation:
- about a third came from entertainment news sources
- national news reports accounted for about a quarter
- internet news sources accounted for another quarter
- 13% were uploaded by independent video producers
The report recommends that governments and health authorities work with entertainment news sources and social media influencers to create engaging, factual content that reaches a wider audience.
YouTube said in a statement: "We are committed to providing timely and helpful information at this critical time, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, using NHS and World Health Organization (WHO) data, to help combat misinformation.
"We have clear policies that prohibit videos promoting medically unproven methods of preventing coronavirus instead of seeking medical treatment, and we quickly remove videos that violate these policies when they are flagged to us. Any content that disputes the existence or transmission of Covid-19, as described by the WHO and the NHS, is now in violation of YouTube policies. For borderline content that could misinform users in harmful ways, we reduce recommendations.
“We will continue to evaluate the impact of these videos on communities around the world.”
by Marianna Spring, disinformation specialist and social media reporter
In recent weeks, there has been a surge in slickly produced videos promoting conspiracy theories, shared on YouTube and proving very popular.
So these results – although concerning – are not surprising.
Accurate information shared on YouTube by trusted health organizations tends to be more complex, and can lack the popular appeal of conspiracy videos, which offer misleading explanations to worried people looking for quick answers or someone to blame.
This includes videos like Plandemic, which was widely shared online last week.
High-quality production values and interviews with alleged experts can make these videos very convincing. Often the facts are presented out of context and used to draw false conclusions.
And dealing with this type of content is a cat and mouse game for social media sites.
Once videos get traction, even if they are removed, they continue to be uploaded repeatedly by other users.
And it is not just alternative outlets uploading misinformation. Whether chasing views or clicks, the study suggests that some mainstream media outlets are also guilty of spreading misleading information.