person:alex jones

  • #Facebook bans Milo Yiannopoulos, Alex Jones, other ’dangerous’ figures | TheHill
    https://thehill.com/policy/technology/441854-facebook-bans-dangerous-figures-including-milo-yiannopoulos-and-alex

    Facebook announced Thursday that it has permanently banned a host of prominent figures it described as “dangerous” from its platform, including right-wing commentator and former Breitbart News editor Milo Yiannopoulos, conspiracy theorist Alex Jones and Nation of Islam leader Louis Farrakhan.

    The platform said it determined that those figures are “dangerous” and removed them under its policy barring individuals and groups that promote hateful and violent messages.

  • YouTube Executives Ignored Warnings, Let Toxic Videos Run Rampant - Bloomberg
    https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant

    Wojcicki’s media behemoth, bent on overtaking television, is estimated to rake in sales of more than $16 billion a year. But on that day, Wojcicki compared her video site to a different kind of institution. “We’re really more like a library,” she said, staking out a familiar position as a defender of free speech. “There have always been controversies, if you look back at libraries.”

    Since Wojcicki took the stage, prominent conspiracy theories on the platform—including one on child vaccinations; another tying Hillary Clinton to a Satanic cult—have drawn the ire of lawmakers eager to regulate technology companies. And YouTube is, a year later, even more associated with the darker parts of the web.

    The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.

    But precisely NO! It cannot be a “library,” because a library keeps only documents that have been published, and thus have already passed a first instance of validation (or at any rate of editorial responsibility... someone will answer for it in court if need be).

    YouTube is... YouTube, something peculiar to the internet, which fulfills a major function... and is also a danger to thought because of the “attention economy.”

    The company spent years chasing one business goal above others: “Engagement,” a measure of the views, time spent and interactions with online videos. Conversations with over twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.

    In response to criticism about prioritizing growth over safety, Facebook Inc. has proposed a dramatic shift in its core product. YouTube still has struggled to explain any new corporate vision to the public and investors – and sometimes, to its own staff. Five senior personnel who left YouTube and Google in the last two years privately cited the platform’s inability to tame extreme, disturbing videos as the reason for their departure. Within Google, YouTube’s inability to fix its problems has remained a major gripe. Google shares slipped in late morning trading in New York on Tuesday, leaving them up 15 percent so far this year. Facebook stock has jumped more than 30 percent in 2019, after getting hammered last year.

    YouTube’s inertia was illuminated again after a deadly measles outbreak drew public attention to vaccination conspiracies on social media several weeks ago. New data from Moonshot CVE, a London-based firm that studies extremism, found that fewer than twenty YouTube channels that have spread these lies reached over 170 million viewers, many of whom were then recommended other videos laden with conspiracy theories.

    So YouTube, then run by Google veteran Salar Kamangar, set a company-wide objective to reach one billion hours of viewing a day, and rewrote its recommendation engine to maximize for that goal. When Wojcicki took over, in 2014, YouTube was a third of the way to the goal, she recalled in investor John Doerr’s 2018 book Measure What Matters.

    “They thought it would break the internet! But it seemed to me that such a clear and measurable objective would energize people, and I cheered them on,” Wojcicki told Doerr. “The billion hours of daily watch time gave our tech people a North Star.” By October, 2016, YouTube hit its goal.

    YouTube doesn’t give an exact recipe for virality. But in the race to one billion hours, a formula emerged: Outrage equals attention. It’s one that people on the political fringes have easily exploited, said Brittan Heller, a fellow at Harvard University’s Carr Center. “They don’t know how the algorithm works,” she said. “But they do know that the more outrageous the content is, the more views.”

    People inside YouTube knew about this dynamic. Over the years, there were many tortured debates about what to do with troublesome videos—those that don’t violate its content policies and so remain on the site. Some software engineers have nicknamed the problem “bad virality.”

    Yonatan Zunger, a privacy engineer at Google, recalled a suggestion he made to YouTube staff before he left the company in 2016. He proposed a third tier: Videos that were allowed to stay on YouTube, but, because they were “close to the line” of the takedown policy, would be removed from recommendations. “Bad actors quickly get very good at understanding where the bright lines are and skating as close to those lines as possible,” Zunger said.
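    Zunger’s proposal amounts to adding a third tier between “take down” and “fully recommendable.” A minimal sketch of that idea, assuming a hypothetical per-video policy-risk score and illustrative thresholds (none of these names or numbers are YouTube’s actual system):

```python
# Hypothetical sketch of a three-tier triage: remove clear violations,
# keep "close to the line" videos up but out of recommendations, and
# leave the rest fully eligible. Thresholds are illustrative assumptions.
from dataclasses import dataclass

REMOVE_THRESHOLD = 0.9      # clear policy violation: taken down
BORDERLINE_THRESHOLD = 0.6  # "close to the line": hosted, not recommended

@dataclass
class Video:
    video_id: str
    policy_risk_score: float  # 0.0 (benign) .. 1.0 (clear violation)

def triage(video: Video) -> str:
    """Return 'remove', 'no_recommend', or 'eligible'."""
    if video.policy_risk_score >= REMOVE_THRESHOLD:
        return "remove"
    if video.policy_risk_score >= BORDERLINE_THRESHOLD:
        return "no_recommend"
    return "eligible"

def recommendation_pool(videos):
    # Only fully eligible videos may be amplified by the recommender.
    return [v for v in videos if triage(v) == "eligible"]
```

    The point of the middle tier is exactly what Zunger describes: bad actors who skate up to the bright line would stay hosted but lose the recommendation engine’s amplification.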

    His proposal, which went to the head of YouTube policy, was turned down. “I can say with a lot of confidence that they were deeply wrong,” he said.

    Rather than revamp its recommendation engine, YouTube doubled down. The neural network described in the 2016 research went into effect in YouTube recommendations starting in 2015. By the measures available, it has achieved its goal of keeping people on YouTube.

    “It’s an addiction engine,” said Francis Irving, a computer scientist who has written critically about YouTube’s AI system.

    Wojcicki and her lieutenants drew up a plan. YouTube called it Project Bean or, at times, “Boil The Ocean,” to indicate the enormity of the task. (Sometimes they called it BTO3 – a third dramatic overhaul for YouTube, after initiatives to boost mobile viewing and subscriptions.) The plan was to rewrite YouTube’s entire business model, according to three former senior staffers who worked on it.

    It centered on a way to pay creators that isn’t based on the ads their videos hosted. Instead, YouTube would pay on engagement—how many viewers watched a video and how long they watched. A special algorithm would pool incoming cash, then divvy it out to creators, even if no ads ran on their videos. The idea was to reward video stars shorted by the system, such as those making sex education and music videos, which marquee advertisers found too risqué to endorse.
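    As described, the Project Bean payout is a pooled pro-rata split by engagement rather than by each video’s own ad revenue. A hedged sketch of that arithmetic, with all names and figures hypothetical:

```python
# Illustrative sketch of the pooled, engagement-proportional payout
# described for Project Bean: pool incoming cash, then divvy it out by
# watch time, even for creators whose videos carried no ads.
def divvy_pool(pool_dollars: float, watch_hours_by_creator: dict) -> dict:
    """Split a revenue pool among creators in proportion to watch hours."""
    total_hours = sum(watch_hours_by_creator.values())
    if total_hours == 0:
        return {creator: 0.0 for creator in watch_hours_by_creator}
    return {
        creator: pool_dollars * hours / total_hours
        for creator, hours in watch_hours_by_creator.items()
    }
```

    The sketch also makes the backfire risk concrete: a channel that dominates watch time through outrage captures a proportionally dominant share of the pool, with or without advertisers.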

    Coders at YouTube labored for at least a year to make the project workable. But company managers failed to appreciate how the project could backfire: paying based on engagement risked making its “bad virality” problem worse, since it could have rewarded videos that achieved popularity through outrage. One person involved said that the algorithms for doling out payments were tightly guarded. If it went into effect then, this person said, it’s likely that someone like Alex Jones—the Infowars creator and conspiracy theorist with a huge following on the site, before YouTube booted him last August—would have suddenly become one of the highest paid YouTube stars.

    In February of 2018, the video calling the Parkland shooting victims “crisis actors” went viral on YouTube’s trending page. Policy staff suggested soon after limiting recommendations on the page to vetted news sources. YouTube management rejected the proposal, according to a person with knowledge of the event. The person didn’t know the reasoning behind the rejection, but noted that YouTube was then intent on accelerating its viewing time for videos related to news.

    #YouTube #Economie_attention #Engagement #Viralité

  • Balade au bord du désordre
    http://www.dedefensa.org/article/balade-au-bord-du-desordre

    15 February 2019 – My attention was caught by one of the lead headlines in the daily edition of Infowars.com, the site of the thundering Alex Jones, because it concerns remarks by Éric Zemmour, two days after the site had already run a headline on the same Zemmour:

    • 13 February 2019: “French intellectual Zemmour: the elites are orchestrating the migrant ‘invasion’ to create their own servant class – Europeans must be replaced because they now refuse to do their menial jobs for low wages.”

    • 15 February 2019: “Zemmour: the elites have revolted and stopped guiding the people they used to inspire – The backlash will lead to a ‘terrible confrontation.’”

    We know Infowars.com and we know Zemmour. If one can (...)

  • « Le #Communisme expliqué aux enfants » : le livre de Bini Adamczak qui scandalise l’alt-right
    https://www.lautrequotidien.fr/articles/2018/11/15/le-communisme-expliqu-aux-enfants-all-les-dputs-lrm

    The book’s publication in the United States provoked an outcry from the entire conservative movement and the alt-right, and as a result did much to make it one of the great emblems of opposition to Trump. “Much to the dismay of parents, MIT Press recently published a little book titled Communism for Kids. It supposedly teaches children how to run a gulag, imitate genocidal dictators, worship Satan, and generally contribute to the destruction of Western civilization ‒ all for the modest sum of $12.95 […] Its author is said to be surreptitiously spreading ‘anti-American,’ feminist, queer values, and so on. That, at any rate, is the story told by a plethora of conservative figures and outlets such as Breitbart, National Review, American Conservative, The Daily Beast, The Daily Signal, Alex Jones, Rush Limbaugh, Milo Yiannopoulos, Steven Crowder, The Blaze, Pamela Geller, The Christian Truther, The Washington Free Beacon and Fox News.” — Jacob Blumenfeld, The New York Times

    #livres #édition

  • Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy? | The New Yorker
    https://www.newyorker.com/magazine/2018/09/17/can-mark-zuckerberg-fix-facebook-before-it-breaks-democracy

    Since 2011, Zuckerberg has lived in a century-old white clapboard Craftsman in the Crescent Park neighborhood, an enclave of giant oaks and historic homes not far from Stanford University. The house, which cost seven million dollars, affords him a sense of sanctuary. It’s set back from the road, shielded by hedges, a wall, and mature trees. Guests enter through an arched wooden gate and follow a long gravel path to a front lawn with a saltwater pool in the center. The year after Zuckerberg bought the house, he and his longtime girlfriend, Priscilla Chan, held their wedding in the back yard, which encompasses gardens, a pond, and a shaded pavilion. Since then, they have had two children, and acquired a seven-hundred-acre estate in Hawaii, a ski retreat in Montana, and a four-story town house on Liberty Hill, in San Francisco. But the family’s full-time residence is here, a ten-minute drive from Facebook’s headquarters.

    Occasionally, Zuckerberg records a Facebook video from the back yard or the dinner table, as is expected of a man who built his fortune exhorting employees to keep “pushing the world in the direction of making it a more open and transparent place.” But his appetite for personal openness is limited. Although Zuckerberg is the most famous entrepreneur of his generation, he remains elusive to everyone but a small circle of family and friends, and his efforts to protect his privacy inevitably attract attention. The local press has chronicled his feud with a developer who announced plans to build a mansion that would look into Zuckerberg’s master bedroom. After a legal fight, the developer gave up, and Zuckerberg spent forty-four million dollars to buy the houses surrounding his. Over the years, he has come to believe that he will always be the subject of criticism. “We’re not—pick your noncontroversial business—selling dog food, although I think that people who do that probably say there is controversy in that, too, but this is an inherently cultural thing,” he told me, of his business. “It’s at the intersection of technology and psychology, and it’s very personal.”

    At the same time, former Facebook executives, echoing a growing body of research, began to voice misgivings about the company’s role in exacerbating isolation, outrage, and addictive behaviors. One of the largest studies, published last year in the American Journal of Epidemiology, followed the Facebook use of more than five thousand people over three years and found that higher use correlated with self-reported declines in physical health, mental health, and life satisfaction. At an event in November, 2017, Sean Parker, Facebook’s first president, called himself a “conscientious objector” to social media, saying, “God only knows what it’s doing to our children’s brains.” A few days later, Chamath Palihapitiya, the former vice-president of user growth, told an audience at Stanford, “The short-term, dopamine-driven feedback loops that we have created are destroying how society works—no civil discourse, no coöperation, misinformation, mistruth.” Palihapitiya, a prominent Silicon Valley figure who worked at Facebook from 2007 to 2011, said, “I feel tremendous guilt. I think we all knew in the back of our minds.” Of his children, he added, “They’re not allowed to use this shit.” (Facebook replied to the remarks in a statement, noting that Palihapitiya had left six years earlier, and adding, “Facebook was a very different company back then.”)

    In March, Facebook was confronted with an even larger scandal: the Times and the British newspaper the Observer reported that a researcher had gained access to the personal information of Facebook users and sold it to Cambridge Analytica, a consultancy hired by Trump and other Republicans which advertised using “psychographic” techniques to manipulate voter behavior. In all, the personal data of eighty-seven million people had been harvested. Moreover, Facebook had known of the problem since December of 2015 but had said nothing to users or regulators. The company acknowledged the breach only after the press discovered it.

    We spoke at his home, at his office, and by phone. I also interviewed four dozen people inside and outside the company about its culture, his performance, and his decision-making. I found Zuckerberg straining, not always coherently, to grasp problems for which he was plainly unprepared. These are not technical puzzles to be cracked in the middle of the night but some of the subtlest aspects of human affairs, including the meaning of truth, the limits of free speech, and the origins of violence.

    Zuckerberg is now at the center of a full-fledged debate about the moral character of Silicon Valley and the conscience of its leaders. Leslie Berlin, a historian of technology at Stanford, told me, “For a long time, Silicon Valley enjoyed an unencumbered embrace in America. And now everyone says, Is this a trick? And the question Mark Zuckerberg is dealing with is: Should my company be the arbiter of truth and decency for two billion people? Nobody in the history of technology has dealt with that.”

    In 2002, Zuckerberg went to Harvard, where he embraced the hacker mystique, which celebrates brilliance in pursuit of disruption. “The ‘fuck you’ to those in power was very strong,” the longtime friend said. In 2004, as a sophomore, he embarked on the project whose origin story is now well known: the founding of Thefacebook.com with four fellow-students (“the” was dropped the following year); the legal battles over ownership, including a suit filed by twin brothers, Cameron and Tyler Winklevoss, accusing Zuckerberg of stealing their idea; the disclosure of embarrassing messages in which Zuckerberg mocked users for giving him so much data (“they ‘trust me.’ dumb fucks,” he wrote); his regrets about those remarks, and his efforts, in the years afterward, to convince the world that he has left that mind-set behind.

    New hires learned that a crucial measure of the company’s performance was how many people had logged in to Facebook on six of the previous seven days, a measurement known as L6/7. “You could say it’s how many people love this service so much they use it six out of seven days,” Parakilas, who left the company in 2012, said. “But, if your job is to get that number up, at some point you run out of good, purely positive ways. You start thinking about ‘Well, what are the dark patterns that I can use to get people to log back in?’ ”
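    The L6/7 metric as described is straightforwardly computable: the share of users who logged in on at least six of the previous seven days. A minimal sketch, assuming a hypothetical mapping from user to login dates:

```python
# Sketch of the L6/7 engagement metric described above: the fraction of
# users active on >= 6 of the 7 days preceding a reference date.
# The data shape (user -> list of login dates) is an assumption.
from datetime import date, timedelta

def l6_7(login_days_by_user: dict, today: date) -> float:
    """Fraction of users who logged in on at least 6 of the prior 7 days."""
    window = {today - timedelta(days=i) for i in range(1, 8)}
    qualifying = sum(
        1 for days in login_days_by_user.values()
        if len(window & set(days)) >= 6
    )
    return qualifying / len(login_days_by_user) if login_days_by_user else 0.0
```

    A metric like this is easy to read off a dashboard, which is precisely Parakilas’s point: once a team’s job is to push that one number up, the remaining levers tend to be the dark patterns.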

    Facebook engineers became a new breed of behaviorists, tweaking levers of vanity and passion and susceptibility. The real-world effects were striking. In 2012, when Chan was in medical school, she and Zuckerberg discussed a critical shortage of organs for transplant, inspiring Zuckerberg to add a small, powerful nudge on Facebook: if people indicated that they were organ donors, it triggered a notification to friends, and, in turn, a cascade of social pressure. Researchers later found that, on the first day the feature appeared, it increased official organ-donor enrollment more than twentyfold nationwide.

    Sean Parker later described the company’s expertise as “exploiting a vulnerability in human psychology.” The goal: “How do we consume as much of your time and conscious attention as possible?” Facebook engineers discovered that people find it nearly impossible not to log in after receiving an e-mail saying that someone has uploaded a picture of them. Facebook also discovered its power to affect people’s political behavior. Researchers found that, during the 2010 midterm elections, Facebook was able to prod users to vote simply by feeding them pictures of friends who had already voted, and by giving them the option to click on an “I Voted” button. The technique boosted turnout by three hundred and forty thousand people—more than four times the number of votes separating Trump and Clinton in key states in the 2016 race. It became a running joke among employees that Facebook could tilt an election just by choosing where to deploy its “I Voted” button.

    These powers of social engineering could be put to dubious purposes. In 2012, Facebook data scientists used nearly seven hundred thousand people as guinea pigs, feeding them happy or sad posts to test whether emotion is contagious on social media. (They concluded that it is.) When the findings were published, in the Proceedings of the National Academy of Sciences, they caused an uproar among users, many of whom were horrified that their emotions may have been surreptitiously manipulated. In an apology, one of the scientists wrote, “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

    Facebook was, in the words of Tristan Harris, a former design ethicist at Google, becoming a pioneer in “persuasive technology.”

    Facebook had adopted a buccaneering motto, “Move fast and break things,” which celebrated the idea that it was better to be flawed and first than careful and perfect. Andrew Bosworth, a former Harvard teaching assistant who is now one of Zuckerberg’s longest-serving lieutenants and a member of his inner circle, explained, “A failure can be a form of success. It’s not the form you want, but it can be a useful thing to how you learn.” In Zuckerberg’s view, skeptics were often just fogies and scolds. “There’s always someone who wants to slow you down,” he said in a commencement address at Harvard last year. “In our society, we often don’t do big things because we’re so afraid of making mistakes that we ignore all the things wrong today if we do nothing. The reality is, anything we do will have issues in the future. But that can’t keep us from starting.”

    In contrast to a traditional foundation, an L.L.C. can lobby and give money to politicians, without as strict a legal requirement to disclose activities. In other words, rather than trying to win over politicians and citizens in places like Newark, Zuckerberg and Chan could help elect politicians who agree with them, and rally the public directly by running ads and supporting advocacy groups. (A spokesperson for C.Z.I. said that it has given no money to candidates; it has supported ballot initiatives through a 501(c)(4) social-welfare organization.) “The whole point of the L.L.C. structure is to allow a coördinated attack,” Rob Reich, a co-director of Stanford’s Center on Philanthropy and Civil Society, told me. The structure has gained popularity in Silicon Valley but has been criticized for allowing wealthy individuals to orchestrate large-scale social agendas behind closed doors. Reich said, “There should be much greater transparency, so that it’s not dark. That’s not a criticism of Mark Zuckerberg. It’s a criticism of the law.”

    The question of languages is fundamental when it comes to social networks

    Beginning in 2013, a series of experts on Myanmar met with Facebook officials to warn them that it was fuelling attacks on the Rohingya. David Madden, an entrepreneur based in Myanmar, delivered a presentation to officials at the Menlo Park headquarters, pointing out that the company was playing a role akin to that of the radio broadcasts that spread hatred during the Rwandan genocide. In 2016, C4ADS, a Washington-based nonprofit, published a detailed analysis of Facebook usage in Myanmar, and described a “campaign of hate speech that actively dehumanizes Muslims.” Facebook officials said that they were hiring more Burmese-language reviewers to take down dangerous content, but the company repeatedly declined to say how many had actually been hired. By last March, the situation had become dire: almost a million Rohingya had fled the country, and more than a hundred thousand were confined to internal camps. The United Nations investigator in charge of examining the crisis, which the U.N. has deemed a genocide, said, “I’m afraid that Facebook has now turned into a beast, and not what it was originally intended.” Afterward, when pressed, Zuckerberg repeated the claim that Facebook was “hiring dozens” of additional Burmese-language content reviewers.

    More than three months later, I asked Jes Kaliebe Petersen, the C.E.O. of Phandeeyar, a tech hub in Myanmar, if there had been any progress. “We haven’t seen any tangible change from Facebook,” he told me. “We don’t know how much content is being reported. We don’t know how many people at Facebook speak Burmese. The situation is getting worse and worse here.”

    I saw Zuckerberg the following morning, and asked him what was taking so long. He replied, “I think, fundamentally, we’ve been slow at the same thing in a number of areas, because it’s actually the same problem. But, yeah, I think the situation in Myanmar is terrible.” It was a frustrating and evasive reply. I asked him to specify the problem. He said, “Across the board, the solution to this is we need to move from what is fundamentally a reactive model to a model where we are using technical systems to flag things to a much larger number of people who speak all the native languages around the world and who can just capture much more of the content.”

    Reading newspapers, or aggregators?

    I once asked Zuckerberg what he reads to get the news. “I probably mostly read aggregators,” he said. “I definitely follow Techmeme”—a roundup of headlines about his industry—“and the media and political equivalents of that, just for awareness.” He went on, “There’s really no newspaper that I pick up and read front to back. Well, that might be true of most people these days—most people don’t read the physical paper—but there aren’t many news Web sites where I go to browse.”

    A couple of days later, he called me and asked to revisit the subject. “I felt like my answers were kind of vague, because I didn’t necessarily feel like it was appropriate for me to get into which specific organizations or reporters I read and follow,” he said. “I guess what I tried to convey, although I’m not sure if this came across clearly, is that the job of uncovering new facts and doing it in a trusted way is just an absolutely critical function for society.”

    Zuckerberg and Sandberg have attributed their mistakes to excessive optimism, a blindness to the darker applications of their service. But that explanation ignores their fixation on growth, and their unwillingness to heed warnings. Zuckerberg resisted calls to reorganize the company around a new understanding of privacy, or to reconsider the depth of data it collects for advertisers.

    Antitrust

    In barely two years, the mood in Washington had shifted. Internet companies and entrepreneurs, formerly valorized as the vanguard of American ingenuity and the astronauts of our time, were being compared to Standard Oil and other monopolists of the Gilded Age. This spring, the Wall Street Journal published an article that began, “Imagine a not-too-distant future in which trustbusters force Facebook to sell off Instagram and WhatsApp.” It was accompanied by a sepia-toned illustration in which portraits of Zuckerberg, Tim Cook, and other tech C.E.O.s had been grafted onto overstuffed torsos meant to evoke the robber barons. In 1915, Louis Brandeis, the reformer and future Supreme Court Justice, testified before a congressional committee about the dangers of corporations large enough that they could achieve a level of near-sovereignty “so powerful that the ordinary social and industrial forces existing are insufficient to cope with it.” He called this the “curse of bigness.” Tim Wu, a Columbia law-school professor and the author of a forthcoming book inspired by Brandeis’s phrase, told me, “Today, no sector exemplifies more clearly the threat of bigness to democracy than Big Tech.” He added, “When a concentrated private power has such control over what we see and hear, it has a power that rivals or exceeds that of elected government.”

    When I asked Zuckerberg whether policymakers might try to break up Facebook, he replied, adamantly, that such a move would be a mistake. The field is “extremely competitive,” he told me. “I think sometimes people get into this mode of ‘Well, there’s not, like, an exact replacement for Facebook.’ Well, actually, that makes it more competitive, because what we really are is a system of different things: we compete with Twitter as a broadcast medium; we compete with Snapchat as a broadcast medium; we do messaging, and iMessage is default-installed on every iPhone.” He acknowledged the deeper concern. “There’s this other question, which is just, laws aside, how do we feel about these tech companies being big?” he said. But he argued that efforts to “curtail” the growth of Facebook or other Silicon Valley heavyweights would cede the field to China. “I think that anything that we’re doing to constrain them will, first, have an impact on how successful we can be in other places,” he said. “I wouldn’t worry in the near term about Chinese companies or anyone else winning in the U.S., for the most part. But there are all these places where there are day-to-day more competitive situations—in Southeast Asia, across Europe, Latin America, lots of different places.”

    The rough consensus in Washington is that regulators are unlikely to try to break up Facebook. The F.T.C. will almost certainly fine the company for violations, and may consider blocking it from buying big potential competitors, but, as a former F.T.C. commissioner told me, “in the United States you’re allowed to have a monopoly position, as long as you achieve it and maintain it without doing illegal things.”

    Facebook is encountering tougher treatment in Europe, where antitrust laws are stronger and the history of fascism makes people especially wary of intrusions on privacy. One of the most formidable critics of Silicon Valley is the European Union’s top antitrust regulator, Margrethe Vestager.

    In Vestager’s view, a healthy market should produce competitors to Facebook that position themselves as ethical alternatives, collecting less data and seeking a smaller share of user attention. “We need social media that will allow us to have a nonaddictive, advertising-free space,” she said. “You’re more than welcome to be successful and to dramatically outgrow your competitors if customers like your product. But, if you grow to be dominant, you have a special responsibility not to misuse your dominant position to make it very difficult for others to compete against you and to attract potential customers. Of course, we keep an eye on it. If we get worried, we will start looking.”

    Moderation

    As hard as it is to curb election propaganda, Zuckerberg’s most intractable problem may lie elsewhere—in the struggle over which opinions can appear on Facebook, which cannot, and who gets to decide. As an engineer, Zuckerberg never wanted to wade into the realm of content. Initially, Facebook tried blocking certain kinds of material, such as posts featuring nudity, but it was forced to create long lists of exceptions, including images of breast-feeding, “acts of protest,” and works of art. Once Facebook became a venue for political debate, the problem exploded. In April, in a call with investment analysts, Zuckerberg said glumly that it was proving “easier to build an A.I. system to detect a nipple than what is hate speech.”

    The cult of growth leads to the curse of bigness: every day, a billion things were being posted to Facebook. At any given moment, a Facebook “content moderator” was deciding whether a post in, say, Sri Lanka met the standard of hate speech or whether a dispute over Korean politics had crossed the line into bullying. Zuckerberg sought to avoid banning users, preferring to be a “platform for all ideas.” But he needed to prevent Facebook from becoming a swamp of hoaxes and abuse. His solution was to ban “hate speech” and impose lesser punishments for “misinformation,” a broad category that ranged from crude deceptions to simple mistakes. Facebook tried to develop rules about how the punishments would be applied, but each idiosyncratic scenario prompted more rules, and over time they became byzantine. According to Facebook training slides published by the Guardian last year, moderators were told that it was permissible to say “You are such a Jew” but not permissible to say “Irish are the best, but really French sucks,” because the latter was defining another people as “inferiors.” Users could not write “Migrants are scum,” because it is dehumanizing, but they could write “Keep the horny migrant teen-agers away from our daughters.” The distinctions were explained to trainees in arcane formulas such as “Not Protected + Quasi protected = not protected.”

    It will hardly be the last quandary of this sort. Facebook’s free-speech dilemmas have no simple answers—you don’t have to be a fan of Alex Jones to be unnerved by the company’s extraordinary power to silence a voice when it chooses, or, for that matter, to amplify others, to pull the levers of what we see, hear, and experience. Zuckerberg is hoping to erect a scalable system, an orderly decision tree that accounts for every eventuality and exception, but the boundaries of speech are a bedevilling problem that defies mechanistic fixes. The Supreme Court, defining obscenity, landed on “I know it when I see it.” For now, Facebook is making do with a Rube Goldberg machine of policies and improvisations, and opportunists are relishing it. Senator Ted Cruz, Republican of Texas, seized on the ban of Jones as a fascist assault on conservatives. In a moment that was rich even by Cruz’s standards, he quoted Martin Niemöller’s famous lines about the Holocaust, saying, “As the poem goes, you know, ‘First they came for Alex Jones.’ ”

    #Facebook #Histoire_numérique

  • Twitter has finally banned Alex Jones, showing it cannot remain neutral
    https://www.numerama.com/tech/415452-twitter-sest-finalement-decide-a-bannir-alex-jones-et-montre-quil-n

    By deciding to ban Alex Jones, Twitter joins the other tech giants and concedes that it cannot remain entirely neutral toward the content posted on its platform. After weeks of hesitation, Twitter finally decided to suspend the accounts of the conspiracy theorist Alex Jones and of his news site, Infowars, “based on new information about tweets and videos posted yesterday [Wednesday] that violate our abusive behavior policy.” Little known in (...)

    #Twitter #manipulation #censure

  • Censorship of Alex Jones
    http://www.dedefensa.org/article/censure-dalex-jones

    Censorship of Alex Jones

    Something happened recently that made me feel like an endangered species. A group of transnational internet companies, including Google, Facebook, Apple, and several others, all simultaneously removed content belonging to infowars.com, the website run by Alex Jones. Such synchronicity is a sure sign of conspiracy, a subject Alex Jones himself trades in heavily.

    I once appeared on a radio show hosted by Alex Jones, and he managed to summarize what I had said as “the United States is going to collapse like the USSR,” which was not bad at all, given how poorly we managed to understand each other, having so little in common. He is a conservative and a libertarian, whereas I think that the (...)

  • Deplatforming Works - Motherboard
    https://motherboard.vice.com/en_us/article/bjbp9d/do-social-media-bans-work

    The dust is still settling after Alex Jones’s InfoWars was more-or-less simultaneously banned by YouTube, Spotify, Apple, and Facebook. The move has spawned thousands of takes about whether deplatforming Jones was the right move or a slippery slope toward more censorship. But just as important to consider: Will it work?

    This is called “deplatforming,” or “no platform”: social media companies (sans Twitter, which says Jones hasn’t broken its rules) have decided to stop being complicit in spreading Jones’s conspiracy theories and hate. And we’ve seen no indication that Jones will stop. But will his business remain viable, and will his influence wane?

    “The good that comes with deplatforming is, their main goal was to redpill or get people within mainstream communities more in line with their beliefs, so we need to get them off those platforms,” Robyn Caplan, a PhD student at Rutgers University and Data and Society affiliate, told me on the phone. “But now we’ve put them down into their holes where they were before, and they could strengthen their beliefs and become more extreme.”

    The question is whether it’s more harmful to society to have many millions of people exposed to kinda hateful content or to have a much smaller number of ultra-radicalized true believers.

    Joan Donovan, of Data & Society, believes that, ultimately, it’s important to deplatform people when their rhetoric is resulting in negative, real-world consequences: “The way Jones activates his audiences has implications for people who have already been victimized,” she said. “We have always had groups of white supremacists, misogynists, and violent insurrectionists joining message boards. But social media has made these tools much more powerful. So yes, we must take away the kinds of coordinative power they’re able to gain on platforms.”

    #Deplateformisation #Fake_News

  • Alex Jones, figurehead of American conspiracy theorism
    https://www.lemonde.fr/ameriques/article/2018/08/10/alex-jones-figure-de-proue-du-conspirationnisme-americain_5341325_3222.html

    The most controversial radio host in the United States was banned from several social networks on Monday, but he remains one of the most …

  • Did they make Alex Jones the hero of the internet?
    http://www.dedefensa.org/article/ont-ils-fait-dalex-jones-le-heros-dinternet

    Did they make Alex Jones the hero of the internet?

    PhG, a notorious agitator, has just delivered the verdict, in his Journal-dde.crisis, that the plotters of GAFA & Co., with the help of CNN and the Democratic Party, are “clumsy assholes” with whom the Devil is not very satisfied. The psychologist Jordan B. Peterson, famous in Canada and often criticized by the progressive-societal left, tweeted about the supposedly fatal censorship of Alex Jones: “Never persecute a paranoiac, or you will justify his paranoia. It’s a serious mistake.”

    Alex Jones, whom the “clumsy assholes” of GAFA & Co. removed from their platforms within 24 hours, has just told the Daily Mail that he gained 5.6 million additional subscribers in two days. (He far outstrips CNN, the NYT, and all the (...)

  • GAFA-la-gaffe?
    http://www.dedefensa.org/article/gafa-la-gaffe

    GAFA-la-gaffe?

    August 9, 2018 – Sure, they struck, and it will go on, because once the moneyed cretins start producing their infamous stupidities, it is like diarrhea: nothing stops them anymore... (That is even how you recognize them.) The affair is confirming for us, for the nth time, that the transformation of ordinary sapiens into System-zombies works by the equation Americanism + money. Indeed, I take as obvious and absolutely convincing the judgment that the censorship campaign launched by GAFA & the rest, on instructions from the Deep State, represents a tactical action whose illegal aims are extremely visible, and whose strategic effect will quickly prove catastrophic.

    The gigantic censorship operation now under way is unfolding amid the most complete legal hypocrisy (...)

  • I think something important is happening here:
    https://twitter.com/jack/status/1026984242893357056
    Not only because the head of Twitter explains why #twitter is not going to shut down the accounts of #Alex_Jones or #Infowars, unlike most other social networks, but because he reaffirms the need to confront opinions and, above all, to counter false information visibly, something that Twitter, where replies are read far more widely than elsewhere, can afford to do...

    If we succumb and simply react to outside pressure, rather than straightforward principles we enforce (and evolve) impartially regardless of political viewpoints, we become a service that’s constructed by our personal views that can swing in any direction. That’s not us.
    Accounts like Jones’ can often sensationalize issues and spread unsubstantiated rumors, so it’s critical journalists document, validate, and refute such information directly so people can form their own opinions. This is what serves the public conversation best.

    I came across this thanks to a tweet from #Olivier_Tesquet, who wrote a very thorough article for Télérama on Alex Jones’s descent into hell at the hands of the #GAFAM:

    “Big Tech” put to the test by the king of the conspiracy theorists

    By stripping Alex Jones, conspiracist-in-chief of the American far right, of his Facebook, Spotify, and YouTube accounts, the internet giants are taking the risk of opening a debate on the privatization of freedom of expression.

    https://www.telerama.fr/medias/la-big-tech-a-lepreuve-du-roi-des-conspirationnistes,n5756062.php

    #liberte_d_expression #conspirationnisme #complotisme #extreme_droite ...

  • Silicon Valley tech giants team up to destroy Infowars and silence Alex Jones (Video)
    http://theduran.com/silicon-valley-tech-giants-team-up-to-destroy-infowars-and-silence-alex-jo

    UPDATE: YouTube is now joining in on the censorship action, permanently shutting down the popular Alex Jones – Infowars channel. This is a despicable display of collusion to shut down opposing viewpoints.

    It feels a bit as though the First Amendment isn’t so important after all... A few more days and we’ll find out that guns are to be banned in the US?...

    #hypocrisie #censure #gafam

  • Salut, Anastasia GAFA
    http://www.dedefensa.org/article/salutanastasiagafa

    Salut, Anastasia GAFA

    … Besides, the acronym GAFA is largely insufficient and in any case inadequate. The concerted attack against Infowars.com and Alex Jones comes from the companies Google, Facebook, Apple, Spotify, Stitcher, and YouTube (GFASSY would do, if we stick to that particular lineup); but GAFA serves us as a symbol to designate the first politically motivated censorship attack of this magnitude, coming from private groups thereby claiming to set themselves up as political censors, in the name of a solid democratic morality, and incidentally in the name of equally democratic freedom of expression too, why not? (Besides, one loses count of the aggressors since, in a second wave, Twitter takes over by suspending the accounts of several libertarian and antiwar commentators, including the (...)

  • The internet giants censor conspiracy theorist Alex Jones after years of impunity
    http://www.levif.be/actualite/international/les-geants-d-internet-censurent-le-conspirationniste-alex-jones-apres-des-annees-d-impunite/article-normal-874567.html

    These measures come after months of criticism aimed at YouTube, Facebook, and Twitter, accused of not doing enough to combat the …

  • Alex Jones
    https://en.wikipedia.org/wiki/Alex_Jones

    On July 27, 2018, Facebook suspended Jones’s profile for 30 days, and also removed four videos he posted, saying they violated Facebook’s standards against hate speech and bullying. YouTube also removed the same videos and issued a “strike” against the Infowars channel.[151][152] On August 5, 2018, Apple removed five Infowars podcasts from its podcast app.[153] On August 6, 2018, Facebook removed four pages of social media accounts related to Alex Jones, stating “More content from the same pages has been reported to us – upon review, we have taken it down for glorifying violence, which violates our graphic violence policy, and using dehumanising language to describe people who are transgender, Muslims and immigrants, which violates our hate speech policies.”[154] YouTube later removed the InfoWars channel the same day.[155]

    #infowars (ou https://seenthis.net/recherche?recherche=infowars)
    #nazis #censurés on #youtube and #facebook at the same time, that’s fishy :)

  • The Messy Fourth Estate – Trust Issues – Medium
    https://medium.com/s/trustissues/the-messy-fourth-estate-a42c1586b657

    by danah boyd

    I want to believe in journalism. I want to believe in the idealized mandate of the fourth estate. I want to trust that editors and journalists are doing their best to responsibly inform the public and help create a more perfect union. But my faith is waning.
    Many Americans — especially conservative Americans — do not trust contemporary news organizations. This “crisis” is well-trod territory, but the focus on fact-checking, media literacy, and business models tends to obscure three features of the contemporary information landscape that I think are poorly understood:
    Differences in worldview are being weaponized to polarize society.
    We cannot trust organizations, institutions, or professions when they’re abstracted away from us.
    Economic structures built on value extraction cannot enable healthy information ecosystems.

    Contemporary propaganda isn’t about convincing someone to believe something, but convincing them to doubt what they think they know.

    Countless organizations and movements exist to pick you up during your personal tornado and provide structure and a framework. Take a look at how Alcoholics Anonymous works. Other institutions and social bodies know how to trigger that instability and then help you find ground. Check out the dynamics underpinning military basic training. Organizations, movements, and institutions that can manipulate psychological tendencies toward a sociological end have significant power. Religious organizations, social movements, and educational institutions all play this role, whether or not they want to understand themselves as doing so.
    Because there is power in defining a framework for people, there is good reason to be wary of any body that pulls people in when they are most vulnerable. Of course, that power is not inherently malevolent. There is fundamental goodness in providing structures to help those who are hurting make sense of the world around them. Where there be dragons is when these processes are weaponized, when these processes are designed to produce societal hatred alongside personal stability. After all, one of the fastest ways to bond people and help them find purpose is to offer up an enemy.

    School doesn’t seem like a safe place, so teenagers look around and whisper among friends about who they believe to be the most likely shooter in their community. As Stephanie Georgopulos notes, the idea that any institution can offer security seems like a farce.
    When I look around at who’s “holding” these youth, I can’t help but notice the presence of people with a hateful agenda. And they terrify me, in no small part because I remember an earlier incarnation.
    In 1995, when I was trying to make sense of my sexuality, I turned to various online forums and asked a lot of idiotic questions. I was adopted by the aforementioned transgender woman and numerous other folks who heard me out, gave me pointers, and helped me think through what I felt. In 2001, when I tried to figure out what the next generation did, I realized that struggling youth were more likely to encounter a Christian gay “conversion therapy” group than a supportive queer peer. Queer folks were sick of being attacked by anti-LGBT groups, and so they had created safe spaces on private mailing lists that were hard for lost queer youth to find. And so it was that in their darkest hours, these youth were getting picked up by those with a hurtful agenda.

    Teens who are trying to make sense of social issues aren’t finding progressive activists. They’re finding the so-called alt-right.

    Fast-forward 15 years, and teens who are trying to make sense of social issues aren’t finding progressive activists willing to pick them up. They’re finding the so-called alt-right. I can’t tell you how many youth we’ve seen asking questions like I asked being rejected by people identifying with progressive social movements, only to find camaraderie among hate groups. What’s most striking is how many people with extreme ideas are willing to spend time engaging with folks who are in the tornado.
    Spend time reading the comments below the YouTube videos of youth struggling to make sense of the world around them. You’ll quickly find comments by people who spend time in the manosphere or subscribe to white supremacist thinking. They are diving in and talking to these youth, offering a framework to make sense of the world, one rooted in deeply hateful ideas. These self-fashioned self-help actors are grooming people to see that their pain and confusion isn’t their fault, but the fault of feminists, immigrants, people of color. They’re helping them believe that the institutions they already distrust — the news media, Hollywood, government, school, even the church — are actually working to oppress them.
    Most people who encounter these ideas won’t embrace them, but some will. Still, even those who don’t will never let go of the doubt that has been instilled in the institutions around them. It just takes a spark.
    So how do we collectively make sense of the world around us? There isn’t one universal way of thinking, but even the act of constructing knowledge is becoming polarized. Responding to the uproar in the news media over “alternative facts,” Cory Doctorow noted:
    We’re not living through a crisis about what is true, we’re living through a crisis about how we know whether something is true. We’re not disagreeing about facts, we’re disagreeing about epistemology. The “establishment” version of epistemology is, “We use evidence to arrive at the truth, vetted by independent verification (but trust us when we tell you that it’s all been independently verified by people who were properly skeptical and not the bosom buddies of the people they were supposed to be fact-checking).”
    The “alternative facts” epistemological method goes like this: “The ‘independent’ experts who were supposed to be verifying the ‘evidence-based’ truth were actually in bed with the people they were supposed to be fact-checking. In the end, it’s all a matter of faith, then: you either have faith that ‘their’ experts are being truthful, or you have faith that we are. Ask your gut, what version feels more truthful?”
    Doctorow creates these oppositional positions to make a point and to highlight that there is a war over epistemology, or the way in which we produce knowledge.
    The reality is much messier, because what’s at stake isn’t simply about resolving two competing worldviews. Rather, what’s at stake is how there is no universal way of knowing, and we have reached a stage in our political climate where there is more power in seeding doubt, destabilizing knowledge, and encouraging others to distrust other systems of knowledge production.
    Contemporary propaganda isn’t about convincing someone to believe something, but convincing them to doubt what they think they know. And once people’s assumptions have come undone, who is going to pick them up and help them create a coherent worldview?

    Meanwhile, local journalism has nearly died. The success of local journalism didn’t just matter because those media outlets reported the news, but because it meant that many more people were likely to know journalists. It’s easier to trust an institution when it has a human face that you know and respect. And as fewer and fewer people know journalists, they trust the institution less and less. Meanwhile, the rise of social media, blogging, and new forms of talk radio has meant that countless individuals have stepped in to cover issues not being covered by mainstream news, often using a style and voice that is quite unlike that deployed by mainstream news media.
    We’ve also seen the rise of celebrity news hosts. These hosts help push the boundaries of parasocial interactions, allowing the audience to feel deep affinity toward these individuals, as though they are true friends. Tabloid papers have long capitalized on people’s desire to feel close to celebrities by helping people feel like they know the royal family or the Kardashians. Talking heads capitalize on this, in no small part by how they communicate with their audiences. So, when people watch Rachel Maddow or listen to Alex Jones, they feel more connected to the message than they would when reading a news article. They begin to trust these people as though they are neighbors. They feel real.

    Building a sustainable news business was hard enough when the news had a wealthy patron who valued the goals of the enterprise. But the finance industry doesn’t care about sustaining the news business; it wants a return on investment. And the extractive financiers who targeted the news business weren’t looking to keep the news alive. They wanted to extract as much value from those businesses as possible. Taking a page out of McDonald’s playbook, they forced the newsrooms to sell their real estate. Often, news organizations had to rent from new landlords who demanded obscene sums, which forced many to move out of their buildings. News outlets were forced to reduce staff, produce more junk content, sell more ads, and find countless ways to cut costs. Of course the news suffered — the goal was to push news outlets into bankruptcy or sell, especially if the companies had pensions or other costs that couldn’t be excised.
    Yes, the fragmentation of the advertising industry due to the internet hastened this process. And let’s also be clear that business models in the news business have never been clean. But no amount of innovative new business models will make up for the fact that you can’t sustain responsible journalism within a business structure that requires newsrooms to make more money quarter over quarter to appease investors. This does not mean that you can’t build a sustainable news business, but if the news is beholden to investors trying to extract value, it’s going to be impossible. And if news companies have no assets to rely on (such as their now-sold real estate), they are fundamentally unstable and likely to engage in unhealthy business practices out of economic desperation.

    Fundamentally, both the New York Times and Facebook are public companies, beholden to investors and desperate to increase their market cap. Employees in both organizations believe themselves to be doing something important for society.
    Of course, journalists don’t get paid well, while Facebook’s employees can easily threaten to walk out if the stock doesn’t keep rising, since they’re also investors. But we also need to recognize that the vast majority of Americans have a stake in the stock market. Pension plans, endowments, and retirement plans all depend on stocks going up — and those public companies depend on big investors investing in them. Financial managers don’t invest in news organizations that are happy to be stable break-even businesses. Heck, even Facebook is in deep trouble if it can’t continue to increase ROI, whether through attracting new customers (advertisers and users), increasing revenue per user, or diversifying its businesses. At some point, it too will get desperate, because no business can increase ROI forever.

    At the end of the day, if journalistic ethics means anything, newsrooms cannot justify creating spectacle out of their reporting on suicide or other topics just because they feel pressure to create clicks. They have the privilege of choosing what to amplify, and they should focus on what is beneficial. If they can’t operate by those values, they don’t deserve our trust. While I strongly believe that technology companies have a lot of important work to do to be socially beneficial, I hold news organizations to a higher standard because of their own articulated commitments and expectations that they serve as the fourth estate. And if they can’t operationalize ethical practices, I fear the society that must be knitted together to self-govern is bound to fragment even further.
    Trust cannot be demanded. It’s only earned by being there at critical junctures when people are in crisis and need help. You don’t earn trust when things are going well; you earn trust by being a rock during a tornado. The winds are blowing really hard right now. Look around. Who is helping us find solid ground?

    #danah_boyd #Médias #Journalisme #Post_truth

  • So you follow Cory Doctorow on Twitter and you find yourself reading this: Gwyneth Paltrow wants you to squirt coffee up your asshole using this $135 glass jar. Obviously, my weekend is now ruined.
    https://boingboing.net/2018/01/06/rectum-damn-near-kilt-im.html

    Both Goop and Alex Jones are big on “detoxing,” an imaginary remedy that poses a very real health-risk, especially when it involves filling your asshole with coffee.

    Coffee enemas are, of course, bullshit, whose history and present are rife with hucksters whose smooth patter is only matched by their depraved indifference for human life.

    But as stupid as coffee enemas are, they’re even stupider when accomplished by means of Goop’s $135 “Implant O’Rama,” manufactured by Implant O’Rama LLC. It’s a $135 glass jar with a couple of silicone hoses attached to it.

  • How the White Helmets, Syria’s Volunteer First Responders, Became a Conspiracy Theory | WIRED
    https://www.wired.com/2017/04/white-helmets-conspiracy-theory

    Various White Helmet “truthers”—who range from Assad and his supporters to Russian embassies, and even to Alex Jones—accuse the group of staging rescue photos, belonging to al Qaeda, and being pawns of liberal bogeyman George Soros. The story of how that conspiracy grew is a perfect distillation of how disinformation can spread unchecked, supplanting fact with frenzy where no support exists.

  • It’s no surprise that the far right are mobilising against George Soros – he’s the biggest threat to their global domination | The Independent
    http://www.independent.co.uk/voices/george-soros-caused-refugee-crisis-breitbart-muslim-takeover-biggest-

    Influential financial analysts Zerohedge claim George Soros “singlehandedly created the European refugee crisis”; xenophobic rag Breitbart says Soros’s funding of Black Lives Matter was part of an agenda to swing the US presidential election; and Donald Trump’s favourite crank Alex Jones says “Soros is behind the Muslim takeover of the West”. In August, hackers thought to be linked to the Russian government stole thousands of documents from Soros’s foundation’s servers and put them online, placing at risk many of the brave individuals the foundation funds.

    #extrême-droite #george_soros

  • Dugin, drainer of the Swamp
    http://www.dedefensa.org/article/douguine-assecheur-du-marais

    Dugin, drainer of the Swamp

    In our November 16 text, we presented a piece by Dugin from November 11 on Trump’s victory — essentially its “operational” political aspect, with a very significant emphasis on the role of the anti-System press (Alex Jones, of Infowars.com). We announced another Ouverture Libre on another Dugin text, dated November 14, 2016, also on Katehon.com. It is a text, “The Swamp and the Fire,” of a very different register: the mystical geopolitician and philosopher, a metahistorian, interprets Donald Trump’s victory.

    Dugin opens his argument from the hypothesis that the event is fundamentally historic, to the point where one can speak, in his view, of a complete reversal of the American situation (and no longer (...)

    • Yeah, well, things aren’t really getting any better over at @dedefensa

      “The Swamp” is the new name for the globalist sect, the adepts of the open society, the LGBT perverts, Soros’s army, the post-humanists, and so on. It is absolutely imperative to drain the Swamp, and not only for the United States: it is a global challenge for us all. These days, every people is a prisoner of its own Swamp. All of us together must begin the fight against the Russian Swamp, the French Swamp, the German Swamp, etc. We need to purge our societies of the Swamp’s influence. Instead of fighting among ourselves, let us drain it together. Swamp-drainers of the world, unite!

      In short, for the brilliant geopolitician-spiritualist and mystic, we urgently need to eliminate the queers, the Jews, the human-rights crowd, and so on…