Coronavirus: Facebook alters virus action after damning misinformation report

Facebook is changing how it handles Covid-19 misinformation after a damning report on its management of falsehoods about the virus.

Users who have read, watched or shared fake coronavirus content will receive a pop-up notice inviting them to visit the World Health Organization website.

A study indicated that Facebook often failed to crack down on fake posts, particularly when they were in languages other than English.

Facebook said the research did not reflect its more recent work.

The California-based tech company says it will begin showing the messages at the top of users’ news feeds “in the coming weeks”.

A Facebook spokesman would not characterize the warnings as a policy change, instead describing them to the BBC as “operational changes to the platform”.

Redirecting users to the truth

The messages will direct people to the World Health Organization website, where myths about the virus are dispelled.

A Facebook spokesman said the move “will connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources, in case they see or hear these claims again on Facebook”.

The changes were prompted by a major study of misinformation on the platform in six languages, carried out by Avaaz, a crowdfunded activist group.

The researchers say millions of Facebook users continue to be exposed to coronavirus disinformation without any warnings from the platform.

The team found that some of the most dangerous falsehoods had received hundreds of thousands of views, including claims such as “black people are resistant to coronavirus” and “coronavirus is destroyed by chlorine dioxide”.

Avaaz researchers analyzed a sample of more than 100 pieces of coronavirus misinformation on Facebook, across the English, Spanish, Portuguese, Arabic, Italian and French versions of the site.

The research found that:

  • The company can take up to 22 days to issue warning labels for coronavirus misinformation, with delays occurring even when Facebook’s partners have reported the harmful content to the platform
  • 29% of fake content in the sample was not labeled at all on the English version of the site
  • The figures are worse in some other languages, with 68% of Italian-language content, 70% of Spanish-language content and 50% of Portuguese-language content not labeled as false
  • Facebook’s Arabic-language efforts are more successful, with only 22% of the sample of misleading posts left unlabeled

Facebook says it is continuing to expand its multilingual network of fact-checkers, awarding grants and collaborating with trusted organizations working in more than 50 languages.

Fadi Quran, campaign director at Avaaz, said: “Facebook is at the epicenter of the misinformation crisis.

“But the company is turning a critical corner today to clean up this toxic information ecosystem, becoming the first social media platform to alert all users who have been exposed to coronavirus misinformation and directing them to life-saving facts.”

One of the falsehoods the researchers tracked was the claim that people could rid their bodies of the virus by drinking lots of water and gargling with salt or vinegar. The post was shared more than 31,000 times before it was removed, after Avaaz flagged it to Facebook.

However, more than 2,600 copies of the post remain on the platform, with nearly 100,000 interactions, and most of these cloned posts carry no warning labels from Facebook.

Mark Zuckerberg, Facebook’s founder and chief executive, defended his company’s work in an online post, saying: “On Facebook and Instagram, we have now directed more than two billion people to authoritative health resources through our Covid-19 Information Center and educational pop-ups, with over 350 million people clicking through to learn more.

“If content contains harmful misinformation that could lead to imminent physical harm, we will remove it. We have removed hundreds of thousands of pieces of Covid-19-related misinformation, including theories such as drinking bleach cures the virus or that physical distancing is ineffective in preventing the disease from spreading. For other misinformation, once it is rated false by fact-checkers, we reduce its distribution, apply warning labels with more context and find duplicates.”

Mr Zuckerberg insists that the warning labels work, with 95% of users choosing not to view the flagged content when presented with a label.

“I think this latest step is a good move by Facebook, and we have seen a much more proactive stance towards disinformation in this pandemic than in other situations, such as the US elections,” said Emily Taylor, an associate at Chatham House and an expert on social media disinformation.

“We don’t know if it will make a big difference, but it’s worth trying, because with disinformation in a health crisis, unlike elections, lives are literally at stake,” she said.
