• The Underworld of Online Content Moderation | The New Yorker
    https://www.newyorker.com/news/q-and-a/the-underworld-of-online-content-moderation

    More than one hundred thousand people work as online content moderators, viewing and evaluating the most violent, disturbing, and exploitative content on social media. In a new book, “Behind the Screen,” Sarah T. Roberts, a professor of information studies at U.C.L.A., describes how this work shapes their professional and personal lives. Roberts, who conducted interviews with current and former content moderators, found that many work in Silicon Valley, but she also travelled as far as the Philippines, where some of the work has been outsourced. From her research, we learn about the emotional toll, low wages, and poor working conditions of most content moderation. Roberts never disputes that the work is crucial, but raises the question of how highly companies like Facebook and Google actually value it.

    I recently spoke by phone with Roberts. During our conversation, which has been edited for length and clarity, we discussed why finding and deleting offensive content is so tricky, why the job is so psychologically taxing, and the fixes that could help these workers and make them better at their jobs.

    The example I’ll give there is blackface. One person I talked with said that, time and again, he would see videos filled with blackface, and he would go and argue with his supervisor, saying, “This is racist, we supposedly don’t allow racist or hate speech on our platform,” and he could get no traction. So the policies that were in place almost parodied themselves. They were so specific on the one hand, and so totally missing the forest for the trees on the other, that you really had to embed yourself in the logic of the particular platform, and of course every platform has its own set of policies that it makes up.

    I think they cared enough that they had an entire apparatus devoted to creating, designing, and thinking through their policies, but what became clear to me through the course of this work was that the primary function of people doing commercial content moderation at these platforms was brand management for the social-media platform itself. There would be a great side benefit of keeping some bad stuff out of people’s way, or “cleaning up” the platform. But ultimately this was in the service of the brand, so that the brand could continue to function as a site where advertisers might want to come. And so I feel that this whole practice really laid that bare for me.

    What could be done to make the lives of these workers better, given that this is a job that needs to be done? And it needs to be done well, by smart people who are well trained.

    This is a question that I’ve often posed to the workers themselves, because I certainly am not possessed of the answers on my own. They want better pay. And I think we can read that in a lot of ways: they want better pay, and they want to be respected. The work has been designed to be secret. In many cases, their N.D.A. precludes them from even talking about the work. And the industry itself, in that sense, treated the job as a source of shame. Companies were not eager to tout the efforts of these people, and so instead they hid them in the shadows. And, if nothing else, that was a business decision and a value judgment that could have gone another way. I think there’s still a chance that we could understand the work of these people in a different way and value it differently, collectively. And we could ask that the companies do that as well.

    There’s a rich history of labor organizing and worker-led, or worker-informed, movements, and in this case it might have to happen region by region, or be specific to particular parts of the world. Or it could be something that crosses geographic and cultural boundaries, where workers learn to identify with one another regardless of where they’re located.

    We talk a lot about automation. I think that’s what you’re saying about the tech companies: their solution is always automation, or that’s what gets foregrounded. But I think if you talk to anyone in the industry who’s in the know, the likelihood of human moderators going away anytime soon is pretty much nil. We also need to provide these workers with mental-health support. And there are things we can do technologically to make it less difficult to look at some of the content.

    Facebook, about ten days ago, announced a major initiative to raise the base pay of all of its content moderators. I was thrilled about that. On the other hand, we could read between the lines of such an announcement to learn that, until now, these people were probably making minimum wage or close to it. And we could also read, in the deafening silence from other firms, that they haven’t done the same and aren’t really willing to yet. Because, if they were, they’d be issuing press releases, too. We’ve got a ways to go on that.

    #Content_moderation #Modération #Médias_sociaux #Travail

  • Douek, speculating as to why a “Supreme Court of Facebook” might be appealing to the company, argues, “Content-moderation decisions on Facebook are hard, and any call is likely to upset a proportion of Facebook users. By outsourcing the decision and blame, Facebook can try to wash its hands of controversial decisions.” If that’s part of the motivation, it doesn’t make the underlying idea better or worse.

    But consumers should be aware that Facebook may prefer to manipulate distribution rather than impose an outright ban. A Supreme Court of Facebook with no control over the algorithm, in a context where Facebook isn’t transparent about what content it penalizes and why, wouldn’t necessarily remove Facebook’s control over free expression, or over the most important censorship decisions, after all.

    https://www.theatlantic.com/ideas/archive/2018/12/facebook-punish-censorship/577654

    #facebook #content_moderation #open_internet #politics_of_social_media