• The Problem With Diversity in Computing
    https://www.theatlantic.com/technology/archive/2019/06/tech-computers-are-bigger-problem-diversity/592456

    Tech’s discriminatory culture might never change, no matter how many women and people of color are invited into the room. When Amy Webb broke her ankle, she was forced to hobble around in a walking boot. That inconvenience spawned others: among them, she could no longer pass through the metal detector in airport TSA PreCheck lines. Instead, she had to use the backscatter machines that produce X-ray images of passengers. Webb, who is a professor at New York University and the author of (...)

    #Google #algorithme #discrimination

    • “… those people are really committed,” Bobb told me. But their motivation is largely driven by providing access to the existing state of affairs. “They’re compelled by the argument that it just isn’t fair that more people don’t have access to the Google life—the free food and the power and the money,” Bobb said. Their goal is to get more people in the game, not necessarily to change the rules of that game. In this line of thinking, inclusion is first a problem of economic equity; any resulting social or moral benefits would just be gravy.

    • Google’s focus on the “next billion users” entails a better understanding of people of color, he said, but only because the company finally understands that they represent an untapped market for advertising.

    • For Webb, the underrepresentation of women, black people, and others is a real problem, but it’s not the fundamental one. “We’re all discriminated against by computing,” she insisted. Computing professionals constitute a “tribe,” separated from the general public not primarily by virtue of their race, gender, or nationality, but by the exclusive culture of computing education and industry. That culture replaces all knowledge and interests with the pursuit of technological solutions at maximum speed. “Anyone who falls outside of that core group of interests are not being represented,” Webb said. If she’s right, then the problem with computing isn’t just that it doesn’t represent a diverse public’s needs. Instead, the problem with computing is computing.

  • The Strange Politics of Facial Recognition - The Atlantic
    https://www.theatlantic.com/technology/archive/2019/06/democrats-and-republicans-passing-soft-regulations/592558

    Your face is no longer just your face—it’s been augmented. At a football game, your face is currency, used to buy food at the stadium. At the mall, it is a ledger, used to alert salespeople to your past purchases, both online and offline, and to your shopping preferences. At a protest, it is your arrest history. At the morgue, it is how authorities will identify your body.

    Facial-recognition technology stands to transform social life, tracking our every move for companies, law enforcement, and anyone else with the right tools. Lawmakers are weighing the risks versus rewards, with a recent wave of proposed regulation in Washington State, Massachusetts, Oakland, and the U.S. legislature. In May, Republicans and Democrats in the House Committee on Oversight and Reform heard hours of testimony about how unregulated facial recognition already tracks protesters, impacts the criminal-justice system, and exacerbates racial biases. Surprisingly, they agreed to work together to regulate it.

    The Microsoft president Brad Smith called for governments “to start adopting laws to regulate this technology” last year, while the Amazon Web Services CEO Andy Jassy echoed those comments in June, likening the technology to a knife. It’s a less dramatic image than the plutonium and nuclear-waste metaphors critics employ, but his message—coming from an executive at one of the world’s most powerful facial-recognition technology outfits—is clear: This stuff is dangerous.

    But crucially, Jassy and Smith seem to argue, it’s also inevitable. In calling for regulation, Microsoft and Amazon have pulled a neat trick: Instead of making the debate about whether facial recognition should be widely adopted, they’ve made it about how such adoption would work.

    Without regulation, the potential for misuse of facial-recognition technology is high, particularly for people of color. In 2016 the MIT researcher Joy Buolamwini published research showing that the technology performs better on lighter-skinned men than on darker-skinned men, and performs worst on darker-skinned women. When the ACLU matched members of Congress against a criminal database, Amazon’s Rekognition software misidentified black members more often than white ones, despite there being far fewer black members.

    This includes House Oversight Committee Chairman Elijah Cummings, a Baltimore native whose face was also scanned when he attended a 2015 rally in memory of Freddie Gray, the unarmed black man who died of a spinal-cord injury while in police custody. The Baltimore Police Department used facial recognition to identify protesters and target any with outstanding warrants. Most of the protesters were black, meaning the software used on them might have been less accurate, increasing the likelihood of misidentification. Expert witnesses at the committee hearing in May warned of a chilling effect: Protesters, wary of being identified via facial recognition and matched against criminal databases, could choose to stay home rather than exercise their freedom of assembly.

    Microsoft and Amazon both claim to have lessened the racial disparity in accuracy since the original MIT study and the ACLU’s report. But fine-tuning the technology to better recognize black faces is only part of the process: Perfectly accurate technology could still be used to support harmful policing, which disproportionately affects people of color. The racial-accuracy problem is a distraction; how the technology is used matters, and that’s where policy could prevent abuse. And the solution Microsoft and Amazon propose would require auditing facial-recognition systems for racial and gender biases only after they’re already in use—which might be too late.

    In early May, The Washington Post reported that police were feeding forensic sketches to their facial-recognition software. A witness described a suspect to a sketch artist; police then uploaded the sketch to Amazon’s Rekognition, looking for hits, and eventually arrested someone. Experts at the congressional hearing in May were shocked that a sketch submitted to a database could qualify as reasonable suspicion to arrest someone.

    Read: Half of American adults are in police facial-recognition databases

    But Jassy, the Amazon Web Services CEO, claimed that Amazon has never received a report of police misuse. In May, Amazon shareholders voted down a proposal that would have halted sales of Rekognition to law enforcement and government agencies, including ICE. Jassy said that police should rely on Rekognition results only when the system is 99 percent confident in the accuracy of a match. This is a potentially critical safeguard against misidentification, but it’s just a suggestion: Amazon doesn’t require police to adhere to this threshold, or even ask. In January, Gizmodo quoted an Oregon sheriff’s official saying his department ignores thresholds completely. (“There has never been a single reported complaint from the public and no issues with the local constituency around their use of Rekognition,” a representative from Amazon said, in part, in a statement to Gizmodo.)

    #Reconnaissance_faciale #Libertés #Espace_public #Etat_policier

  • The Strange Politics of Facial Recognition
    https://www.theatlantic.com/technology/archive/2019/06/democrats-and-republicans-passing-soft-regulations/592558

    Everyone seems to have found common ground on the emerging technology. That’s exactly what its makers want. (...)

    #Axon #Microsoft #Amazon #AWS #algorithme #CCTV #Rekognition #biométrie #facial #vidéo-surveillance #discrimination #surveillance #ACLU (...)

    #FaceAPI

    • In calling for regulation, Microsoft and Amazon have pulled a neat trick: Instead of making the debate about whether facial recognition should be widely adopted, they’ve made it about how such adoption would work.

      And as discussions about the specifics of implementation swirl, critics argue that these debates divert attention from larger, worthier ones.

      So far, technology companies have succeeded in setting the terms of the debate over facial recognition. But these might be the last days of privately owning our own faces. Common ground itself is not a victory. Narrowing the discussion isn’t compromise; it’s a rhetorical trick. It turns public governance into a terms-of-service agreement: One party sets the terms while the other, uninterested and resigned to the inevitable, simply says, “I agree.”