Study finds gender and skin-type bias in commercial artificial-intelligence systems
http://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212
Examination of facial-analysis software shows an error rate of 0.8 percent for light-skinned men and 34.7 percent for dark-skinned women.

Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper that researchers from MIT and Stanford University will present later this month at the Conference on Fairness, Accountability, and Transparency. In the researchers’ experiments, the three programs’ error rates in (...)
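The headline numbers come from disaggregated evaluation: measuring a classifier's error rate separately for each demographic subgroup rather than overall. A minimal sketch of that bookkeeping, with entirely hypothetical field names and data (this is not the study's code or dataset):

```python
# Illustrative sketch: disaggregated error-rate evaluation.
# Each record pairs a subgroup label with the classifier's prediction
# and the ground-truth label. Data below is made up for demonstration.

def error_rate_by_group(records):
    """Return {group: fraction of that group's records misclassified}."""
    totals, errors = {}, {}
    for r in records:
        g = r["group"]
        totals[g] = totals.get(g, 0) + 1
        if r["predicted"] != r["actual"]:
            errors[g] = errors.get(g, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

records = [
    {"group": "lighter-skinned men", "actual": "male", "predicted": "male"},
    {"group": "darker-skinned women", "actual": "female", "predicted": "male"},
    {"group": "darker-skinned women", "actual": "female", "predicted": "female"},
]
print(error_rate_by_group(records))
```

An aggregate error rate over all records would hide exactly the disparity the study reports; splitting by subgroup is what surfaces it.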