Katie Notopoulos (katienotopoulos) on BuzzFeed


  • Newly Leaked Facebook Documents Show How The Company Sets Up Its Moderators To Fail
    https://www.buzzfeednews.com/article/katienotopoulos/facebook-moderators-are-set-up-to-fail

    For moderators at Cognizant and Accenture, this was yet another instance of the confusing, often poorly explained rules and convoluted communication between them and Facebook that make their job nearly impossible. In interviews with BuzzFeed News, former and current moderators described policing content for Facebook as a grueling job made worse by poor management, lousy communication, and whipsaw policy changes that made them feel set up to fail. BuzzFeed News obtained hundreds of pages of leaked documents about Facebook’s moderation policy. The documents — which contain often ambiguous language, Byzantine charts, and an indigestible barrage of instructions — reveal just how daunting a moderator’s job can be.

    Moderating a platform of over a billion people is a daunting task — particularly at a global level. It’s unreasonable to expect Facebook to seamlessly police all the content that people post on it.

    But Facebook is also a $533 billion company with nearly 40,000 employees. It’s had years to craft these policies. It has the incentives, the resources, and the time to get it right. And it’s trying. On Sept. 12, the company released an updated set of values that it said would guide the company’s community standards — including authenticity, safety, privacy, and dignity. In a statement to BuzzFeed News, a Facebook spokesperson said: “Content review on a global scale is complex and the rules governing our platform have to evolve as society does. With billions of pieces of content posted to our platforms every day, it’s critical these rules are communicated efficiently. That’s a constant effort and one we’re committed to getting right.”

    But no matter the lofty principles, the moderators who spoke to BuzzFeed News say the outsourcing system by which Facebook governs the content on its platform is fundamentally broken.

    For moderators struggling to do good work under conditions described as PTSD-inducing, Facebook’s mercurial and ever-evolving policies added another stressor. Many who requested anonymity for fear of retribution described to BuzzFeed News a systemic breakdown of communication between Facebook’s policymakers, the Facebook employees charged with communicating that policy, and the senior staff at the outsourced firms charged with explaining it to the moderation teams. Sometimes the outsourcing firms misinterpreted an update in policy. Sometimes Facebook’s guidance came far too late. Sometimes the company issued policy updates it subsequently changed or retracted entirely.

    “From an outside perspective, it doesn’t make sense,” a Facebook moderator at Accenture told BuzzFeed News. “But having worked the system for years, there’s some logic.”

    The real problem, moderators say, lies in the cases that don’t fall squarely under written policy or that test the rules in unexpected ways. And there are many of them.

    “These edge cases are what cause problems with [moderators],” another moderator explained. “There’s no mechanism built in for edge cases; Facebook writes the policy for things that are going to capture the majority of the content, and rely on these clarifications to address the edge cases. But that requires quick communication.”

    But that’s only one facet of the issue. “There’s a second problem,” Gillespie explained. “Facebook pretends like there are clear policies and decisions at the top, that the challenge is simply how to convey them down the line. That’s not how content moderation works. The policies aren’t clear at the top, and moderators are constantly being tested and surprised by what users come up with.”

    Cognizant moderators receive four weeks of training on Facebook’s policies before they are put to work. Once training is over, they begin work by reporting to a different layer of Cognizant leadership called the policy team, which is responsible for interpreting Facebook’s rules. The company did not respond to questions about the training members of this team are given. Moderators who spoke to BuzzFeed News said the policy teams they work with typically consist of about six to seven people, including those who oversee the Spanish-language moderators.

    To be fair, there is a lot of nuance for Facebook to communicate when it comes to content moderation. That’s painfully apparent in the bimonthly policy updates the company distributes to outsourcers like Cognizant. These typically arrive as 10- to 20-page slideshow presentations and cover small tweaks in policy; flag new kinds of harassment, bullying, hate speech, and hate groups; and provide updates on content that should be preserved or removed. On their own, the updates seem to make sense. But taken together, they reveal the endless game of whack-a-mole Facebook plays with hate speech, bullying, crime, and exploitation — and the rat’s nest of policy tweaks and reversals that moderators, who are under extreme pressure to get high accuracy scores, are expected to navigate.

    “It does seem all over the place and impossible to align these guideline changes,” said a current moderator. “Say you’re reviewing a post where the user is clearly selling marijuana, but the post doesn’t explicitly mention it’s for sale. That doesn’t meet the guidelines to be removed. ... But then later an update will come up saying the ‘spirit of the policy’ should guide us to delete the post for the sale of drugs. Which is why no one can ever align what the spirit of the policy is versus what the policy actually states.”

    BuzzFeed News reviewed 18 such policy updates issued between 2018 and 2019; topics they addressed included sexual solicitation of minors, animal abuse, hate speech, and violence, among a host of other things. For example, an update about the sale of animals is tucked into another update that outlines broad changes to how Facebook moderates posts about eating disorders and self-harm. “It’s just not something $15/hour employees are equipped for,” said a former Cognizant moderator. “Understanding and applying this policy is far harder than any class I took in college.”

    On June 25, a heartbreaking photo of a Salvadoran father and his toddler daughter who drowned while trying to enter the United States went viral. Within hours, it was the top news story of the day; the next night, it was discussed in the Democratic presidential debate. But Facebook’s third-party moderators weren’t sure how to handle it.

    Facebook policy says that images depicting violent death should be “marked as disturbing,” which would put a blurred overlay on the image that a viewer would have to click through to see.

    The question was whether their tragic drowning would be considered a “violent” death, per Facebook’s rules. “I overheard someone saying that day, ‘Well, you know, it’s drowning, but someone didn’t drown them, so why would it be a violent death?’” a moderator told BuzzFeed News.

    At 4 p.m. Arizona time, over 24 hours after the photo was first published, leaders at Cognizant posted their guidance: The photo should be considered a violent death and marked as disturbing. The next morning, leaders issued another update, reversing a newsworthy exception for a famous photo of a Syrian boy who had drowned in 2015. That photo should now also be marked as disturbing and receive the blurred overlay.

    “This is the outcome of a system that is fractured and stratified organizationally, geographically, structurally, etc.,” said Sarah T. Roberts, assistant professor of information studies at UCLA and author of Behind the Screen, a book about social media content moderators. “It’s not as if social media is the first industry to outsource. The tech industry gets treated as if their cases are sui generis, but in reality they’re just corporate entities doing the same things other corporate entities have done. When a firm like H&M or Walmart has a global production chain, and a factory collapses in Bangladesh, they can reasonably claim that ‘Gosh, we didn’t even know our clothes were made there’ because their supply chain is so fractured and convoluted. I’m not alleging that Facebook and other social media companies have deliberately set up that kind of structure. I don’t have any evidence to support that. But what I do know is that there are pitfalls in such a structure that one could reasonably foresee.”

    #Facebook #Moderation