  • The power of words...
    A relentless activist (Laila Micklewait, of the organization TraffickingHub) manages to get a leading war reporter (Nicholas Kristof, Pulitzer Prize winner) to publish a devastating exposé in Sunday’s New York Times on the exploitation of children by a video site, and the Montreal pornocrats’ empire is discredited within 48 hours!
    https://tradfem.wordpress.com/.../08/les-enfants-de-pornhub

  • Facebook needs 30,000 of its own content moderators, says a new report | MIT Technology Review
    https://www.technologyreview.com/2020/06/08/1002894/facebook-needs-30000-of-its-own-content-moderators-says-a-new-repo

    Imagine if Facebook stopped moderating its site right now. Anyone could post anything they wanted. Experience seems to suggest that it would quite quickly become a hellish environment overrun with spam, bullying, crime, terrorist beheadings, neo-Nazi texts, and images of child sexual abuse. In that scenario, vast swaths of its user base would probably leave, followed by the lucrative advertisers.

    But for something so important, moderation isn’t treated as such. The overwhelming majority of the 15,000 people who spend all day deciding what can and can’t be on Facebook don’t even work for Facebook. The whole function of content moderation is farmed out to third-party vendors, who employ temporary workers on precarious contracts at over 20 sites worldwide. They have to review hundreds of posts a day, many of which are deeply traumatizing.

    Errors are rife, despite the company’s adoption of AI tools to triage posts by how urgently they need attention. Facebook has itself admitted to a 10% error rate, whether that’s incorrectly flagging posts to be taken down that should be kept up, or vice versa. Given that reviewers have to wade through three million posts per day, that equates to 300,000 mistakes daily.

    Some errors can have deadly effects. For example, members of Myanmar’s military used Facebook to incite genocide against the mostly Muslim Rohingya minority in 2016 and 2017. The company later admitted it failed to enforce its own policies banning hate speech and the incitement of violence.
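
    A quick back-of-envelope check, using only the figures quoted above, shows where the 300,000-mistakes-a-day estimate comes from, and roughly how many posts each reviewer faces if the three million daily posts are split evenly across the 15,000 moderators (that even split is my own simplifying assumption, not a figure from the report):

    # Back-of-envelope check of the figures quoted above; nothing here is insider data.
    POSTS_REVIEWED_PER_DAY = 3_000_000   # posts reviewers wade through each day
    ERROR_RATE = 0.10                    # the 10% error rate Facebook has admitted to
    MODERATORS = 15_000                  # people deciding what can and can't stay up

    daily_mistakes = POSTS_REVIEWED_PER_DAY * ERROR_RATE        # 300,000
    posts_per_moderator = POSTS_REVIEWED_PER_DAY / MODERATORS   # 200

    print(f"Estimated mistakes per day: {daily_mistakes:,.0f}")
    print(f"Posts per reviewer per day: {posts_per_moderator:,.0f}")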

    If we want to improve how moderation is carried out, Facebook needs to bring content moderators in-house, make them full employees, and double their numbers, argues a new report from New York University’s Stern Center for Business and Human Rights.

    “Content moderation is not like other outsourced functions, like cooking or cleaning,” says report author Paul M. Barrett, deputy director of the center. “It is a central function of the business of social media, and that makes it somewhat strange that it’s treated as if it’s peripheral or someone else’s problem.”

    Why is content moderation treated this way by Facebook’s leaders? It comes at least partly down to cost, Barrett says. His recommendations would be very costly for the company to enact—most likely in the tens of millions of dollars (though to put this into perspective, it makes billions of dollars of profit every year). But there’s a second, more complex, reason. “The activity of content moderation just doesn’t fit into Silicon Valley’s self-image. Certain types of activities are very highly valued and glamorized—product innovation, clever marketing, engineering … the nitty-gritty world of content moderation doesn’t fit into that,” he says.

    He thinks it’s time for Facebook to treat moderation as a central part of its business. He says that elevating its status in this way would help avoid the sorts of catastrophic errors made in Myanmar, increase accountability, and better protect employees from harm to their mental health.

    It seems an unavoidable reality that content moderation will always involve being exposed to some horrific material, even if the work is brought in-house. However, there is so much more the company could do to make it easier: screening moderators better to make sure they are truly aware of the risks of the job, for example, and ensuring they have first-rate care and counseling available. Barrett thinks that content moderation could be something all Facebook employees are required to do for at least a year as a sort of “tour of duty” to help them understand the impact of their decisions.

    The report makes eight recommendations for Facebook:

    Stop outsourcing content moderation and raise moderators’ station in the workplace.
    Double the number of moderators to improve the quality of content review.
    Hire someone to oversee content and fact-checking who reports directly to the CEO or COO.
    Further expand moderation in at-risk countries in Asia, Africa, and elsewhere.
    Provide all moderators with top-quality, on-site medical care, including access to psychiatrists.
    Sponsor research into the health risks of content moderation, in particular PTSD.
    Explore narrowly tailored government regulation of harmful content.
    Significantly expand fact-checking to debunk false information.

    The proposals are ambitious, to say the least. When contacted for comment, Facebook would not discuss whether it would consider enacting them. However, a spokesperson said its current approach means “we can quickly adjust the focus of our workforce as needed,” adding that “it gives us the ability to make sure we have the right language expertise—and can quickly hire in different time zones—as new needs arise or when a situation around the world warrants it.”

    But Barrett thinks a recent experiment conducted in response to the coronavirus crisis shows change is possible. Facebook announced that because many of its content moderators were unable to go into company offices, it would shift responsibility to in-house employees for checking certain sensitive categories of content.

    “I find it very telling that in a moment of crisis, Zuckerberg relied on the people he trusts: his full-time employees,” he says. “Maybe that could be seen as the basis for a conversation within Facebook about adjusting the way it views content moderation.”

    #Facebook #Moderation #Travail #Digital_labour #Modérateurs

  • #Google can identify the "problematic phrases" in an email - Numerama
    http://www.numerama.com/magazine/25889-google-sait-identifier-les-34phrases-problematiques34-dans-un-e-mail

    Our emails betray us, as the Goldman Sachs, Enron and MegaUpload affairs reminded us... Our sensitive conversations now leave traces... But Google has the solution. It has just filed a patent for identifying problematic phrases in an electronic document, such as an email, taking a context into account (namely the document’s recipient) according to policy and law. The goal: to prevent leaks of confidential data, whether accidental or not. (...)
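
    The patent’s actual claims are not detailed here, but the general idea described above (scan an outgoing document for policy-listed phrases and weigh who the recipient is) can be illustrated with a deliberately naive sketch. Everything below, including the phrase list, the trusted domain and the function name, is invented for the example and is not Google’s method:

    # Illustrative sketch only: flag "problematic phrases" in an outgoing email,
    # taking the recipient into account. NOT the method claimed in the patent;
    # the phrase list and domains below are invented for the example.
    from dataclasses import dataclass

    CONFIDENTIAL_PHRASES = [          # hypothetical policy list
        "do not forward",
        "internal use only",
        "quarterly results (draft)",
    ]
    TRUSTED_DOMAINS = {"example-corp.com"}  # hypothetical internal domain

    @dataclass
    class Email:
        sender: str
        recipient: str
        body: str

    def flag_problematic_phrases(email: Email) -> list[str]:
        """Return the listed phrases that look risky given who the recipient is."""
        body = email.body.lower()
        hits = [p for p in CONFIDENTIAL_PHRASES if p in body]
        recipient_domain = email.recipient.rsplit("@", 1)[-1]
        # Only escalate when the mail is about to leave the organisation.
        if recipient_domain in TRUSTED_DOMAINS:
            return []
        return hits

    if __name__ == "__main__":
        msg = Email(
            sender="alice@example-corp.com",
            recipient="bob@gmail.com",
            body="Attached: quarterly results (draft). Internal use only.",
        )
        print(flag_problematic_phrases(msg))
        # ['internal use only', 'quarterly results (draft)']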

  • “Bruce #Bartlett of The Fiscal Times sees #Obama as a #moderate #conservative
    http://www.thefiscaltimes.com/Columns/2011/07/22/Barack-Obama-The-Democrats-Richard-Nixon.aspx#page1

    Democrat Franklin D. #Roosevelt was a transformative president, partly because of his policies but mainly because he presided over the two most disruptive events of the 20th century: the Great Depression and World War II.

    By the time Dwight #Eisenhower took office, people craved stability and he was determined to give it to them. This angered his fellow #Republicans, who wanted nothing more than to repeal Roosevelt’s New Deal, root and branch. And with Republican control of both the House and Senate in 1953 and 1954, he could have undone a lot of it had he wanted to.

    But Eisenhower not only refused to repeal the New Deal, he wouldn’t even let Republicans in Congress cut taxes, even though the high World War II and Korean War tax rates were still in effect. He thought a balanced budget should take priority. Eisenhower also helped to destroy right-wing hero Joe McCarthy and worked closely with liberals on civil rights.

    Eisenhower’s effective liberalism was deeply frustrating to conservatives. Robert Welch of the John Birch Society even accused him of being a communist. But after Republicans lost control of Congress in 1954, he was the only game in town for them.”