Facebook Should Dial Down the Toxicity Much More Often - The Atlantic
The strategy works! Facebook has recently touted reductions in the amount of hate speech and graphic content that users see on its platform. How did it make these improvements? Not by changing its rules on hate speech. Not by hiring more human content moderators. Not by refining the artificial-intelligence tools that seek out rule-breaking content to take down. According to the company, the progress was “mainly due to changes we made to reduce problematic content in News Feed.” Facebook used dials, not on-off switches.
The Chauvin trial may be a unique event, but racial tension and violence are clearly not. Social-media content that leads to offline harm is not confined to Minneapolis or the U.S.; it is a global problem. Toxic online content is not an aberration but a permanent feature of the internet. Platforms shouldn’t wait until the house is burning down to do something about it.