Facebook will try out new ways to report and flag fake news this week, setting up a partnership with fact-checking organisations to try to address the “worst of the worst” hoaxes spread by spammers.
The world’s largest social network is testing several ways to limit the rapid proliferation of fake news stories, a problem highlighted by posts that went viral during the US presidential election campaign, such as a report that the Pope had endorsed Donald Trump or the “Pizzagate” story that claimed Democrats were involved in a paedophile ring.
Facebook will make it easier to report a fake news story by clicking in the upper right-hand corner of each post. Once a story is reported by Facebook users or identified by “other signals”, such as whether people share a story after they read it, as potentially being fake, it will be sent to third-party fact-checking organisations.
If the members of Poynter’s International Fact Checking Network discover it is fake, it will be flagged as “disputed”, with a link to the fact-checking organisation’s article explaining why. Disputed stories will appear lower in the Facebook news feed, where posts appear in an order governed by a complex algorithm, and people will receive a warning that they are disputed if they decide to share them.
Adam Mosseri, vice-president of product management at Facebook, said the company was committed to doing its part to address the issue of fake news.
“We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully,” he said.
Facebook has long insisted it is a technology company, not a media organisation, and been cautious about getting involved in editorial decisions. When the problem of fake news hit the headlines after the US election, the social network was initially reluctant to accept responsibility, with founder and chief executive Mark Zuckerberg saying it was “pretty crazy” to think fake news affected the election result.
However, within days, Facebook said it was experimenting with ways to stop the spread of fake news. Many in the tech and media industries have already begun building, or discussing, products to address the problem.
“We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain,” Mr Mosseri said.
But rightwing commentators complained that Facebook had partnered with fact-checking organisations they deemed to be on the left, with Republican Evan Siegfried tweeting that the move was “not good for conservatives”. Ben Shapiro, editor-in-chief of the Daily Wire, wrote that the change was a “disaster for news” and accused the fact-checkers involved, Politifact, Factcheck.org, Snopes, ABC News and the Washington Post, of all skewing to the left.
The US public is convinced fake news is a real problem, according to a survey released by the Pew Research Center on Thursday. The majority of Americans believe the spread of fake news has confused people about basic facts and a third say they frequently see fake political news online.
Some 71 per cent believe social networking sites and search engines have a responsibility to stop the spread of fake news, though respondents assign similar responsibility to the public and to politicians.
Nearly a quarter claim to have shared fake news on social networks themselves, with about 14 per cent admitting they shared it despite knowing the story was fake.
Facebook and Google have already tried to limit the financial gains that can be made by spreading fake news, by ensuring that known fake news sites do not receive revenue from their advertising networks. Now, Facebook has also decided that any link flagged as disputed cannot be included in an advert, so people cannot pay for such stories to go viral. Sites that disguise their URLs to pass themselves off as reputable news outlets will also be barred from buying adverts from the company.