• A TikTok Twist on ‘PizzaGate’ - The New York Times

    One of social media’s early conspiracy theories is back, but remade in creatively horrible ways.

    “PizzaGate,” the baseless claim that a Washington pizza parlor was the center of a child sex abuse ring, a notion that led to a shooting there in 2016, is catching on again with younger people on TikTok and other online hangouts, my colleagues Cecilia Kang and Sheera Frenkel wrote.

    I talked to Sheera about how young people have tweaked this conspiracy and how internet sites help spread false ideas. (And, yes, our names are pronounced the same but spelled differently.)

    Shira: How has this false conspiracy changed in four years?

    Sheera: Younger people on TikTok have made PizzaGate more relatable for them. So a conspiracy that centered on Hillary Clinton and other politicians a few years ago now instead ropes in celebrities like Justin Bieber. Everyone is at home, bored and online more than usual. When I talked to teens who were spreading these conspiracy videos, many of them said it seemed like fun.

    If it’s for “fun,” is this version of the PizzaGate conspiracy harmless?

    It’s not. We’ve seen over and over that some people can get so far into conspiracies that they take them seriously and commit real-world harm. And for people who are survivors of sexual abuse, it can be painful to see people talking about it all over social media.

    Have the internet companies gotten better at stopping false conspiracies like this?

    They have, but people who want to spread conspiracies are figuring out workarounds. Facebook banned the PizzaGate hashtag, for example, but the hashtag is not banned on Instagram, even though it’s owned by Facebook. People also migrated to private groups where Facebook has less visibility into what’s going on.

    Tech companies’ automated recommendation systems also can suck people further into false ideas. I recently tried to join Facebook QAnon conspiracy groups, and Facebook immediately recommended I join PizzaGate groups, too. On TikTok, what you see is largely decided by computer recommendations. So I watched one video about PizzaGate, and the next videos I saw in the app were all about PizzaGate.

    TikTok is a relatively new place where conspiracies can spread. What is it doing to address this?

    TikTok is not proactively going out and looking for videos with potentially false and dangerous ideas and removing them. There were more than 80 million views of TikTok videos with PizzaGate-related hashtags.

    The New York Times reached out to TikTok about the videos, pointing out the spike in their viewership. After we sent our email, TikTok removed many of the videos and seemed to limit their spread. Facebook and Twitter often do this, too — they frequently remove content only after journalists reach out and point it out.

    Do you worry that writing about baseless conspiracies gives them more oxygen?

    We worry about that all the time, and spend as much time debating whether to write about false conspiracies and misinformation as we do writing about them.

    We watch for ones that reach a critical mass; we don’t want to be the place where people first find out about conspiracies. When a major news organization writes about a conspiracy — even to debunk it — people who want to believe it will twist it to appear to validate their views.

    But to ignore them completely could also be dangerous.

    #Pizzagate #complotisme #fake_news #TikTok

  • How YouTube Serves As The Content Engine Of The Internet’s Dark Side - BuzzFeed News

    David Seaman is the Pizzagate King of the Internet.

    On Twitter, Seaman posts dozens of messages a day to his 66,000 followers, often about the secret cabal — including Rothschilds, Satanists, and the other nabobs of the New World Order — behind the nation’s best-known, super-duper-secret child sex ring under a DC pizza parlor.

    But it’s on YouTube where he really goes to work. Since Nov. 4, four days before the election, Seaman has uploaded 136 videos, more than one a day. Of those, at least 42 are about Pizzagate. The videos, which tend to run about eight to fifteen minutes, typically consist of Seaman, a young, brown-haired man with glasses and a short beard, speaking directly into a camera in front of a white wall. He doesn’t equivocate: Recent videos are titled “Pizzagate Will Dominate 2017, Because It Is Real” and “#PizzaGate New Info 12/6/16: Link To Pagan God of Pedophilia/Rape.”

    Seaman has more than 150,000 subscribers. His videos, usually preceded by preroll ads for major brands like Quaker Oats and Uber, have been watched almost 18 million times, which is roughly the number of people who tuned in to last year’s season finale of NCIS, the most popular show on television.

    And yet there is a mammoth social platform, a cornerstone of the modern internet with more than a billion active users every month, which hosts and even pays for a fathomless stock of bad information, including viral fake news, conspiracy theories, and hate speech of every kind — and it’s been held up to virtually no scrutiny: YouTube.

    Frequently, the videos consist of little more than screenshots of a Reddit “investigation” laid out chronologically, set to ominous music. Other times, they’re very simple, featuring a man in a sparse room speaking directly into his webcam, or a very fast monotone narration over a series of photographs with effects straight out of iMovie. There’s a financial incentive for vloggers to make as many videos as cheaply as they can; the more videos you make, the more likely one is to go viral. David Seaman’s videos typically garner more than 50,000 views and often exceed 100,000. Many of Seaman’s videos run alongside ads for major brands.

    So what responsibility, if any, does YouTube bear for the universe of often conspiratorial, sometimes bigoted, frequently incorrect information that it pays its creators to host, and that is now being filtered up to the most powerful person in the world? Legally, per the Digital Millennium Copyright Act, which absolves service providers of liability for content they host, none. But morally and ethically, shouldn’t YouTube be asking itself the same hard questions as Facebook and Twitter about the role it plays in a representative democracy? How do those questions change because YouTube is literally paying people to upload bad information?

    #fake_news #post-truth #YouTube

  • SICKENING: the mass media treat #PizzaGate as a “rumor” that is “poisoning the web”

    Prosecutions? There will never be any, or no one will go to jail... Note that the media are attempting a fake hoax-busting because people have been threatened with death (who?); basically, this is the kind of revelation that reaches the ears of a public stunned to learn of it,...