YouTube Executives Ignored Warnings, Let Toxic Videos Run Rampant - Bloomberg
▻https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant
Wojcicki’s media behemoth, bent on overtaking television, is estimated to rake in sales of more than $16 billion a year. But on that day, Wojcicki compared her video site to a different kind of institution. “We’re really more like a library,” she said, staking out a familiar position as a defender of free speech. “There have always been controversies, if you look back at libraries.”
Since Wojcicki took the stage, prominent conspiracy theories on the platform—including one on child vaccinations; another tying Hillary Clinton to a Satanic cult—have drawn the ire of lawmakers eager to regulate technology companies. And YouTube is, a year later, even more associated with the darker parts of the web.
The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.
But that is precisely the point: NO! It cannot be a “library”, because a library only keeps documents that have been published, and which have therefore already passed a first instance of validation (or at any rate of editorial responsibility... someone can be taken to court if it comes to that).
YouTube is... YouTube, something specific to the internet, which fulfills a major function... and is also a danger to thought, because of the “attention economy”.
The company spent years chasing one business goal above others: “Engagement,” a measure of the views, time spent and interactions with online videos. Conversations with over twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on internal alarms about toxic content for fear of throttling engagement.
In response to criticism about prioritizing growth over safety, Facebook Inc. has proposed a dramatic shift in its core product. YouTube, meanwhile, has struggled to explain any new corporate vision to the public and investors – and sometimes, to its own staff. Five senior personnel who left YouTube and Google in the last two years privately cited the platform’s inability to tame extreme, disturbing videos as the reason for their departure. Within Google, YouTube’s inability to fix its problems has remained a major gripe. Google shares slipped in late morning trading in New York on Tuesday, leaving them up 15 percent so far this year; Facebook stock has jumped more than 30 percent in 2019, after getting hammered last year.
YouTube’s inertia was illuminated again several weeks ago, when a deadly measles outbreak drew public attention to vaccination conspiracies on social media. New data from Moonshot CVE, a London-based firm that studies extremism, found that fewer than twenty YouTube channels that have spread these lies reached over 170 million viewers, many of whom were then recommended other videos laden with conspiracy theories.
So YouTube, then run by Google veteran Salar Kamangar, set a company-wide objective to reach one billion hours of viewing a day, and rewrote its recommendation engine to maximize for that goal. When Wojcicki took over, in 2014, YouTube was a third of the way to the goal, she recalled in investor John Doerr’s 2018 book Measure What Matters.
“They thought it would break the internet! But it seemed to me that such a clear and measurable objective would energize people, and I cheered them on,” Wojcicki told Doerr. “The billion hours of daily watch time gave our tech people a North Star.” By October 2016, YouTube had hit its goal.
YouTube doesn’t give an exact recipe for virality. But in the race to one billion hours, a formula emerged: Outrage equals attention. It’s one that people on the political fringes have easily exploited, said Brittan Heller, a fellow at Harvard University’s Carr Center. “They don’t know how the algorithm works,” she said. “But they do know that the more outrageous the content is, the more views.”
People inside YouTube knew about this dynamic. Over the years, there were many tortured debates about what to do with troublesome videos—those that don’t violate its content policies and so remain on the site. Some software engineers have nicknamed the problem “bad virality.”
Yonatan Zunger, a privacy engineer at Google, recalled a suggestion he made to YouTube staff before he left the company in 2016. He proposed a third tier: Videos that were allowed to stay on YouTube, but, because they were “close to the line” of the takedown policy, would be removed from recommendations. “Bad actors quickly get very good at understanding where the bright lines are and skating as close to those lines as possible,” Zunger said.
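For illustration only, here is a minimal sketch of how such a third tier might plug into a recommender’s candidate filter. The article describes only the idea, not an implementation; the tier names, the `Video` fields, and the `recommendation_pool` function are all invented here.

```python
from dataclasses import dataclass
from enum import Enum, auto

class PolicyTier(Enum):
    REMOVE = auto()       # violates the takedown policy: not hosted at all
    BORDERLINE = auto()   # "close to the line": stays up, but never recommended
    ELIGIBLE = auto()     # hosted and eligible for recommendation

@dataclass
class Video:
    video_id: str
    tier: PolicyTier

def recommendation_pool(videos: list[Video]) -> list[Video]:
    # Only fully eligible videos feed the recommendation engine; borderline
    # videos remain reachable via direct links and search, just not promoted.
    return [v for v in videos if v.tier is PolicyTier.ELIGIBLE]

catalog = [Video("a1", PolicyTier.ELIGIBLE), Video("b2", PolicyTier.BORDERLINE)]
assert [v.video_id for v in recommendation_pool(catalog)] == ["a1"]
```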
His proposal, which went to the head of YouTube policy, was turned down. “I can say with a lot of confidence that they were deeply wrong,” he said.
Rather than revamp its recommendation engine, YouTube doubled down. The neural network described in a 2016 Google research paper went into effect in YouTube recommendations starting in 2015. By the measures available, it has achieved its goal of keeping people on YouTube.
“It’s an addiction engine,” said Francis Irving, a computer scientist who has written critically about YouTube’s AI system.
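The article doesn’t publish the objective function, but the shift from optimizing clicks to optimizing watch time can be shown with a toy ranker. Everything below (the field names, the scoring formula, the numbers) is an invented sketch, not YouTube’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    p_click: float           # predicted probability the viewer clicks
    expected_minutes: float  # predicted watch time if the viewer clicks

def rank_for_watch_time(candidates: list[Candidate]) -> list[Candidate]:
    # A click-optimizing ranker would sort on p_click alone; a watch-time
    # objective weights each click by how long the viewer is expected to
    # stay, so a video that keeps people glued outranks a merely
    # clickable one.
    return sorted(candidates, key=lambda c: c.p_click * c.expected_minutes,
                  reverse=True)

ranked = rank_for_watch_time([
    Candidate("calm_explainer", p_click=0.30, expected_minutes=4.0),  # score 1.2
    Candidate("outrage_bait", p_click=0.25, expected_minutes=9.0),    # score 2.25
])
assert ranked[0].video_id == "outrage_bait"
```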
Wojcicki and her lieutenants drew up a plan. YouTube called it Project Bean or, at times, “Boil The Ocean,” to indicate the enormity of the task. (Sometimes they called it BTO3 – a third dramatic overhaul for YouTube, after initiatives to boost mobile viewing and subscriptions.) The plan was to rewrite YouTube’s entire business model, according to three former senior staffers who worked on it.
It centered on a new way to pay creators, one not based on the ads their videos hosted. Instead, YouTube would pay on engagement: how many viewers watched a video and how long they watched. A special algorithm would pool incoming cash, then divvy it out to creators, even if no ads ran on their videos. The idea was to reward video stars shorted by the system, such as those making sex education and music videos, which marquee advertisers found too risqué to endorse.
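The actual payout algorithms were, per the article, tightly guarded, so the following is only a minimal pro-rata sketch of the mechanism described: pool the incoming cash, then divide it in proportion to engagement. All names and figures are made up.

```python
def divide_pool(pool_cents: int, watch_minutes: dict[str, float]) -> dict[str, int]:
    """Split a pooled revenue pot pro rata by engagement (total watch time)."""
    total = sum(watch_minutes.values())
    if total == 0:
        return {creator: 0 for creator in watch_minutes}
    return {creator: int(pool_cents * minutes / total)
            for creator, minutes in watch_minutes.items()}

# Creators whose videos carry no ads still get paid for the engagement
# they generate, which was the point of the project.
payouts = divide_pool(1_000_000, {"sex_ed_channel": 600.0, "music_channel": 400.0})
assert payouts == {"sex_ed_channel": 600_000, "music_channel": 400_000}
```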
Coders at YouTube labored for at least a year to make the project workable. But company managers failed to appreciate how the project could backfire: paying based on engagement risked making its “bad virality” problem worse, since it could have rewarded videos that achieved popularity through outrage. One person involved said that the algorithms for doling out payments were tightly guarded. If it had gone into effect, this person said, it’s likely that someone like Alex Jones, the Infowars creator and conspiracy theorist with a huge following on the site before YouTube booted him last August, would have suddenly become one of the highest-paid YouTube stars.
In February of 2018, a video calling the Parkland shooting victims “crisis actors” went viral on YouTube’s trending page. Policy staff soon after suggested limiting recommendations on the page to vetted news sources. YouTube management rejected the proposal, according to a person with knowledge of the event. The person didn’t know the reasoning behind the rejection, but noted that YouTube was then intent on accelerating viewing time for news-related videos.
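The rejected proposal amounts to an allowlist over trending-page candidates. A minimal sketch, with hypothetical channel IDs (the article doesn’t say which outlets would have qualified):

```python
# Hypothetical allowlist of vetted news channels (IDs invented for illustration).
VETTED_NEWS_CHANNELS = {"UC_vetted_news_1", "UC_vetted_news_2"}

def trending_recommendations(candidates: list[dict]) -> list[dict]:
    """Keep only trending-page candidates uploaded by vetted news channels."""
    return [v for v in candidates if v["channel_id"] in VETTED_NEWS_CHANNELS]

candidates = [
    {"video_id": "x", "channel_id": "UC_vetted_news_1"},
    {"video_id": "y", "channel_id": "UC_random_uploader"},
]
assert [v["video_id"] for v in trending_recommendations(candidates)] == ["x"]
```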
#YouTube #Economie_attention #Engagement #Viralité