• Facebook copyright notice hoax: How algorithms could stop misinformation.
    http://www.slate.com/articles/technology/technology/2014/12/facebook_copyright_notice_hoax_how_algorithms_could_stop_misinformation.html?

    A prescient article from 2014

    True, Facebook didn’t invent the viral hoax. It just happens to be the perfect 21st-century venue for scams and urban legends that would have spread by word of mouth, tabloid, or chain letter in earlier eras. But Facebook amplifies misinformation at a speed and on a scale that exceed what was possible before. Its news feed is the product of cutting-edge software that the company has finely calibrated to maximize user engagement—that is, to prioritize the posts that catch people’s eyes and compel them to click, like, or share. One problem: Hoaxes, scams, and conspiracy theories are specifically optimized to do just that. Truth may be stranger than fiction, but on Facebook, fiction is often more viral.

    Facebook will tell you that distinguishing truth from lies is none of its business. The purpose of the news feed, the company explains, is not to sift right from wrong or good from bad according to some objective standard. It’s to sift what’s interesting to each Facebook user from what isn’t—that is, to give its users what they want. And what they want, Facebook has learned, is to see what their friends, family, and acquaintances are talking about. Whether that’s a cute baby photo, a serious current event, a clever lifehack, or a 9/11 conspiracy theory is not Facebook’s concern.

    “Our goal is to connect people with the content they’re most interested in and not to prioritize one point of view over another,” spokeswoman Jessie Baker told me.

    Similarly, Facebook chief Mark Zuckerberg was reportedly moved to tweak the algorithms in a different direction when he saw a co-worker’s birthday ranking ahead of the birth of his niece. The problem: The news feed’s emphasis on relationships was leading it to prioritize trivial posts from his closest connections over the major life events of other friends and family. Facebook’s clever solution: When a post is prompting the word “congratulations” in the comments, show it to a wider audience.

    A similarly simple approach could work wonders to counter the inherent virality of popular hoaxes, as Slate science editor Laura Helmuth recently pointed out to me. Let’s say a given post is repeatedly triggering comments that contain links to Snopes.com, for instance, or that include words like “hoax” or “debunked.” Facebook’s algorithms could take that as a subtle sign that the post’s widespread engagement might be ill-gotten. It wouldn’t have to censor the post—just treat it less like a viral sensation worthy of topping everyone’s feeds. The poster’s friends and family might still see it and have a chance to disabuse him of the false information. But it would be far less likely to spread like fungus across the entire platform.
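
    Neither the “congratulations” boost nor the debunk-based demotion is spelled out as code in the article, but both ideas reduce to a small ranking adjustment driven by comment signals. The sketch below is a hypothetical illustration in Python: the Post structure, the keyword lists, and the weights are assumptions made for the example, not Facebook’s actual system.

```python
# Hypothetical sketch of comment-signal ranking: boost posts whose comments
# say "congratulations", demote posts whose comments link to Snopes or use
# words like "hoax" or "debunked". The data model and weights are invented.
from dataclasses import dataclass, field

CELEBRATION_WORDS = {"congratulations", "congrats"}
DEBUNK_WORDS = {"hoax", "debunked", "fake"}
FACT_CHECK_DOMAINS = ("snopes.com",)


@dataclass
class Post:
    text: str
    base_score: float  # engagement score from the existing ranking model
    comments: list[str] = field(default_factory=list)


def comment_signal(post: Post) -> float:
    """Return a ranking multiplier derived from what commenters are saying."""
    if not post.comments:
        return 1.0
    celebration = debunk = 0
    for comment in post.comments:
        lowered = comment.lower()
        if any(word in lowered for word in CELEBRATION_WORDS):
            celebration += 1
        if any(word in lowered for word in DEBUNK_WORDS) or any(
            domain in lowered for domain in FACT_CHECK_DOMAINS
        ):
            debunk += 1
    # Fractions of comments carrying each signal; the 0.5 / 0.8 weights are arbitrary.
    celebrate_frac = celebration / len(post.comments)
    debunk_frac = debunk / len(post.comments)
    return max(0.1, 1.0 + 0.5 * celebrate_frac - 0.8 * debunk_frac)


def rank(posts: list[Post]) -> list[Post]:
    """Order posts by engagement adjusted by the comment signal.
    A flagged post is never removed, just shown less prominently."""
    return sorted(posts, key=lambda p: p.base_score * comment_signal(p), reverse=True)
```

    In this toy version, a post whose comments are mostly Snopes links sinks toward the bottom of the feed even if its raw engagement is high, while a birth announcement full of “congratulations” gets surfaced more widely, without any post ever being removed.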

    #fake_news #Facebook