• Facebook funnelling readers towards Covid misinformation - study | Technology | The Guardian
    https://www.theguardian.com/technology/2020/aug/19/facebook-funnelling-readers-towards-covid-misinformation-study

    Facebook had promised to crack down on conspiracy theories and inaccurate news early in the pandemic. But as its executives promised accountability, its algorithm appears to have fuelled traffic to a network of sites sharing dangerous false news, campaign group Avaaz has found.

    False medical information can be deadly; researchers led by Bangladesh’s International Centre for Diarrhoeal Disease Research, writing in The American Journal of Tropical Medicine and Hygiene, have directly linked a single piece of coronavirus misinformation to 800 deaths.

    Pages from the top 10 sites peddling inaccurate information and conspiracy theories about health received almost four times as many views on Facebook as the top 10 reputable sites for health information, Avaaz warned in a report.

    “This suggests that just when citizens needed credible health information the most, and while Facebook was trying to proactively raise the profile of authoritative health institutions on the platform, its algorithm was potentially undermining these efforts,” the report said.

    A relatively small but influential network is responsible for driving huge amounts of traffic to health misinformation sites. Avaaz identified 42 “super-spreader” sites that had 28m followers generating an estimated 800m views.

    A single article, which falsely claimed that the American Medical Association was encouraging doctors and hospitals to over-estimate deaths from Covid-19, was seen 160m times.

    This vast collective reach suggests that Facebook’s own internal systems are not capable of protecting users from health misinformation, even at a critical time when the company has promised to keep users “safe and informed”.

    “Avaaz’s latest research is yet another damning indictment of Facebook’s capacity to amplify false or misleading health information during the pandemic,” said British MP Damian Collins, who led a parliamentary investigation into disinformation.

    “The majority of this dangerous content is still on Facebook with no warning or context whatsoever … The time for [Facebook CEO, Mark] Zuckerberg to act is now. He must clean up his platform and help stop this harmful infodemic.”

    Some of the false claims were directly harmful: one, suggesting that pure alcohol could kill the virus, has been linked to 800 deaths, as well as 60 people going blind after drinking methanol as a cure. “In India, 12 people, including five children, became sick after drinking liquor made from toxic seed Datura (ummetta plant in local parlance) as a cure to coronavirus disease,” the paper says. “The victims reportedly watched a video on social media that Datura seeds give immunity against Covid-19.”

    Beyond the specifically dangerous falsehoods, much misinformation is merely useless in itself, but can still contribute to the spread of coronavirus, as with one South Korean church whose members came to believe that spraying salt water could combat the virus.

    “They put the nozzle of the spray bottle inside the mouth of a follower who was later confirmed as a patient before they did likewise for other followers as well, without disinfecting the sprayer,” an official later said. More than 100 followers were infected as a result.

    Among Facebook’s tactics for fighting disinformation on the platform has been giving independent fact-checkers the ability to put warning labels on items they consider untrue.

    Zuckerberg has said fake news would be marginalised by the algorithm, which determines what content viewers see. “Posts that are rated as false are demoted and lose on average 80% of their future views,” he wrote in 2018.

    But Avaaz found that huge amounts of disinformation slip through Facebook’s verification system, even after being flagged by fact-checking organisations.

    They analysed nearly 200 pieces of health misinformation that were shared on the site after being identified as problematic. Fewer than one in five carried a warning label; the vast majority – 84% – slipped through controls after being translated into other languages or republished in whole or in part.

    “These findings point to a gap in Facebook’s ability to detect clones and variations of fact-checked content – especially across multiple languages – and to apply warning labels to them,” the report said.

    Avaaz argues that two simple steps could hugely reduce the reach of misinformation. The first would be proactively correcting the record for users who saw misinformation before it was labelled as false, by putting prominent corrections in their feeds.

    Recent research has found that corrections like these can halve belief in incorrect reporting, Avaaz said. The other step would be to improve the detection and monitoring of translated and cloned material, so that Zuckerberg’s promise to starve the sites of their audiences is actually made good.

    A Facebook spokesperson said: “We share Avaaz’s goal of limiting misinformation, but their findings don’t reflect the steps we’ve taken to keep it from spreading on our services. Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98m pieces of Covid-19 misinformation and removed 7m pieces of content that could lead to imminent harm. We’ve directed over 2bn people to resources from health authorities and when someone tries to share a link about Covid-19, we show them a pop-up to connect them with credible health information.”

    #Facebook #Fake_news #Disinformation #Infodemics #Promises #Excuse_culture #Social_media