Facebook has announced it will now post COVID-19 advice in the news feeds of users who have interacted with posts containing misinformation.
The new features announced yesterday (Thursday 16th April) aim to crack down on misinformation on the social media platform and promote factual news and information on the pandemic.
Users who have engaged with misleading or factually incorrect posts will be encouraged to visit the World Health Organisation’s (WHO) official website for advice.
Additionally, the social media giant intends to launch a new section called “Get the Facts” featuring verified news sources for users to access.
In a blog post, Guy Rosen, VP of Integrity at Facebook, explained: “We’re going to start showing messages in News Feed to people who have liked, reacted or commented on harmful misinformation about COVID-19 that we have since removed.
“These messages will connect people to COVID-19 myths debunked by the WHO including ones we’ve removed from our platform for leading to imminent physical harm.”
An exact timeline for when Facebook users can expect to see these new features hasn’t been detailed. However, Rosen added that people will “start seeing these messages in the coming weeks”.
This latest announcement from Facebook follows the release of a scathing report into coronavirus-related misinformation on the platform.
A study published by the non-profit organisation Avaaz claims the social media platform is “rife with bogus cures and conspiracy theories”, many of which remain on the platform for extended periods.
Avaaz’s report describes Facebook as an “epicentre of coronavirus misinformation”, and the organisation insists the company must do more to protect users from harmful content.
The 104 coronavirus-related examples of misinformation analysed in the report reached a potentially huge global audience, accruing more than 117 million views. This, the report adds, represents “only the tip of the iceberg” with regard to the scale of the problem.
The report asserts that the social media giant’s attempts to combat misinformation aren’t strong enough, and that its fact-checking procedures do little to prevent the spread of false information.
“Facebook’s current anti-misinformation efforts remain slow and insufficient in limiting the spread of coronavirus misinformation, even when the content is flagged by the platform’s very own fact-checking partners,” the report states.
In addition, the report reveals that a significant portion of the misinformation remaining on the platform without warning labels has already been debunked by partners in Facebook’s own fact-checking initiative.
Commenting further on the issue of misinformation, Rosen outlined some of the work Facebook has been carrying out to tackle the problem, including support for fact-checking organisations around the world.
He also insisted the company is gaining ground in the race to curb misinformation, stating: “During the month of March, we displayed warnings on about 40 million posts related to COVID-19 on Facebook, based on around 4,000 articles by our independent fact-checking partners.
“When people saw those warning labels, 95% of the time they did not go on to view the original content. To date, we’ve also removed hundreds of thousands of pieces of misinformation that could lead to imminent physical harm.”
Examples of misinformation that could lead to “imminent physical harm”, Rosen noted, included claims that drinking bleach cures the virus, along with theories suggesting that social distancing does little to prevent the spread of the disease.
Rosen added: “As this pandemic evolves, we’ll continue focusing on the most effective ways to keep misinformation and dangerous hoaxes about COVID-19 off our apps and ensure people have credible information from health experts to stay safe and informed.”