Inside the hate factory: how Facebook fuels far-right profit
A Guardian investigation can reveal those messages were part of a covert plot to control some of Facebook’s largest far-right pages, including one linked to a rightwing terror group, and create a commercial enterprise that harvests Islamophobic hate for profit.
This group is now using its 21-page network to churn out more than 1,000 coordinated fake news posts per week to more than 1 million followers, funnelling audiences to a cluster of 10 ad-heavy websites and milking the traffic for profit.
The posts stoke deep hatred of Islam across the western world and influence politics in Australia, Canada, the UK and the US by amplifying far-right parties such as Australia’s One Nation and vilifying Muslim politicians such as the London mayor, Sadiq Khan, and the US congresswoman Ilhan Omar.
The network has also targeted leftwing politicians at critical points in national election campaigns. It posted false stories claiming the UK Labour leader, Jeremy Corbyn, said Jews were “the source of global terrorism” and accused the Canadian prime minister, Justin Trudeau, of allowing “Isis to invade Canada”.
Australia’s first female Muslim senator, Mehreen Faruqi, felt the full force of the network in August last year.
The revelations show Facebook has failed to stop clandestine actors from using its platform to run coordinated disinformation and hate campaigns. The network has operated with relative impunity even since Mark Zuckerberg’s apology to the US Senate following the Cambridge Analytica and Russian interference scandals.
Political influence and Facebook’s failures
In April last year, Zuckerberg sat before an army of cameras and offered a mea culpa to the world.
Facebook, still reeling from the Cambridge Analytica scandal, had failed its users, Zuckerberg said. The company had struggled to stop its platform being used for coordinated political interference and the spread of disinformation and hate.
“It’s clear now that we didn’t do enough to prevent these tools from being used for harm,” Zuckerberg said. “We didn’t take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake.”
Two months later, the Israel-based network gained access to its 13th far-right Facebook page, expanding the already sizeable audience for its disinformation.
The Facebook chief executive, Mark Zuckerberg, testifies before the House of Representatives energy and commerce committee in April 2018. Photograph: Anadolu Agency/Getty Images
The network has operated with relative impunity for almost two years.
“Believe it or not she hasn’t done anything to get the page in trouble,” Devito said of his Israeli administrator. “I haven’t gotten anything from Facebook that ‘you’ve been posting inappropriate content that’s violated our community standards’ or anything of the sort. I’ve been very fortunate in that regard.”
As the network grew, so did its ability to influence the thinking of voters. By the time the Australian election came around in May, the pages were providing a significant platform for far-right candidates, including One Nation and Fraser Anning, a senator widely condemned for calling for a “final solution” to immigration.
The network boosted Anning and One Nation with 401 posts in the lead-up to the election, which attracted 82,025 likes, 18,748 comments and 33,730 shares.
A One Nation spokesman, James Ashby, said the network would not benefit the party, and engagement on leader Pauline Hanson’s personal page was far greater. “I would suggest the 401 posts you refer to has attracted a nanoscopic number of likes, comments and shares in comparison,” he said.
A spokesman for Anning said he was previously unaware of the network and did not believe it had helped his campaign.
It was a similar story in Canada. In the lead-up to the October election, the network pushed out 80 coordinated posts critical of Trudeau that were liked, shared or commented on 30,000 times.
In the UK, the network has savaged Corbyn. More than 510 coordinated posts have attacked the Labour leader since mid-2016, attracting 15,384 likes, 17,148 comments and 16,406 shares.
Facebook’s own definition of “coordinated inauthentic activity” reads like a blueprint for the network the Guardian has uncovered.
“Coordinated inauthentic behaviour is when groups of pages or people work together to mislead others about who they are or what they’re doing,” Facebook’s head of security policy, Nathaniel Gleicher, explained last year. “We might take a network down for making it look like it’s being run from one part of the world, when in fact it’s being run from another.
“This could be done for ideological purposes or it could be financially motivated. For example, spammers might seek to convince people to click on a link to visit their page or to read their posts.”
But Villereal said he had not heard from Facebook since the Israel-based administrator began distributing content from his page.
“I haven’t had no notifications from Facebook or anything like that about the content they’re posting: like spam risk or fake accounts or community violations or anything like that.”