Millions of black people affected by racial bias in health-care algorithms
Study reveals rampant racism in decision-making software used by US hospitals — and highlights ways to correct it.
Ghani says that his team has carried out unpublished analyses comparing algorithms used in public health, criminal justice and education with human decision-making. They found that the machine-learning systems were biased, but less so than the people were.
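A comparison of this kind is often framed as measuring how much a decision-maker's favourable-outcome rate differs between demographic groups, for algorithmic and human decisions alike. The sketch below is purely illustrative, with invented toy data and a simple demographic-parity gap as the disparity measure; it is not Ghani's actual methodology, which is unpublished.

```python
# Hypothetical illustration: comparing group disparity in human vs
# algorithmic decisions via a demographic-parity gap. All data invented.

def parity_gap(decisions, groups):
    """Absolute difference in favourable-outcome rates between two groups."""
    rates = {}
    for g in set(groups):
        picks = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(picks) / len(picks)
    a, b = rates.values()
    return abs(a - b)

# Toy decisions (1 = favourable outcome) for the same ten people.
groups          = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
human_decisions = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]  # rates 0.8 vs 0.2
model_decisions = [1, 1, 1, 0, 0, 1, 1, 0, 0, 0]  # rates 0.6 vs 0.4

print(parity_gap(human_decisions, groups))  # larger gap: more disparate
print(parity_gap(model_decisions, groups))
```

On this toy data the human decisions show a parity gap three times that of the model's, mirroring the pattern Ghani describes; real audits would use richer metrics and real outcome data.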
“We are still using these algorithms called humans that are really biased,” says Ghani. “We’ve tested them and known that they’re horrible, but we still use them to make really important decisions every day.”