Study finds a potential risk with self-driving cars: failure to detect dark-skinned pedestrians


  • Study finds a potential risk with self-driving cars: failure to detect dark-skinned pedestrians - Vox
    https://www.vox.com/future-perfect/2019/3/5/18251924/self-driving-car-racial-bias-study-autonomous-vehicle-dark-skin

    If you’re a person with dark skin, you may be more likely than your white friends to get hit by a self-driving car, according to a new study out of the Georgia Institute of Technology. That’s because automated vehicles may be better at detecting pedestrians with lighter skin tones.

    The authors of the study started out with a simple question: How accurately do state-of-the-art object-detection models, like those used by self-driving cars, detect people from different demographic groups? To find out, they looked at a large dataset of images that contain pedestrians. They divided up the people using the Fitzpatrick scale, a system for classifying human skin tones from light to dark.

    The researchers then analyzed how often the models correctly detected the presence of people in the light-skinned group versus how often they got it right with people in the dark-skinned group.
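The comparison described above can be sketched in a few lines. This is a minimal illustration with hypothetical data and field names (not the study's actual code or dataset): the detection rate for each group is simply the fraction of pedestrians in that group that the model correctly detected.

```python
# Hypothetical (fitzpatrick_type, detected) pairs -- for illustration only.
detections = [
    (1, True), (2, True), (3, False), (2, True),
    (5, False), (6, True), (4, False), (6, False),
]

def detection_rate(samples, group):
    """Fraction of pedestrians in `group` that the model detected."""
    hits = [detected for tone, detected in samples if tone in group]
    return sum(hits) / len(hits)

# Fitzpatrick types 1-3 form the lighter-skin group, 4-6 the darker-skin group.
light_rate = detection_rate(detections, {1, 2, 3})
dark_rate = detection_rate(detections, {4, 5, 6})
print(f"light-skin detection rate: {light_rate:.2f}")
print(f"dark-skin detection rate:  {dark_rate:.2f}")
```

A gap between the two rates on a large, representative dataset is the kind of disparity the study reports.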

    The study’s insights add to a growing body of evidence about how human bias seeps into our automated decision-making systems, a phenomenon known as algorithmic bias.

    The most famous example came to light in 2015, when Google’s image-recognition system labeled African Americans as “gorillas.” Three years later, Amazon’s Rekognition system drew criticism for matching 28 members of Congress to criminal mugshots. Another study found that three facial-recognition systems — IBM, Microsoft, and China’s Megvii — were more likely to misidentify the gender of dark-skinned people (especially women) than of light-skinned people.