Facebook will soon let you know if you saw or interacted with dangerous coronavirus misinformation on the site. The new notice will be sent to users who have liked, reacted to, or commented on posts featuring harmful or false claims about COVID-19 that have since been removed by moderators, the AP reports. The alert, which will start appearing on Facebook in the coming weeks, will direct users to a site where the World Health Organization lists and debunks virus myths and rumors. The latest move is part of an unprecedented effort by Facebook, Google, and Twitter that includes stricter rules, altered algorithms, and thousands of fact checks to contain an outbreak of bad information online that's spreading as quickly as the virus itself.
Challenges remain. Tech platforms have sent home the human moderators who police their sites, forcing them to rely on automated systems to take down harmful content. They are also up against people's mistrust of authoritative sources of information, such as the WHO. Facebook disclosed Thursday that in March it applied more than 40 million warning labels to videos, posts, and articles about the coronavirus that fact-checking organizations have determined are false or misleading. That number includes duplicate claims; the labels were based on 4,000 fact checks. Facebook says that when users saw those warning labels, 95% did not click through to the false information. "It's a big indicator that people are trusting the fact checkers," said Baybars Orsek, director of the International Fact-Checking Network. "The label has an impact on people's information consumption."