
Can a Healthcare Algorithm Be Racially Biased?


An algorithm in a program widely used at hospitals is allegedly racially biased because it leads to black patients being passed over for special care in favor of white patients.

Associated Press:

The software predicts costs rather than sickness. It is used by U.S. insurers and hospitals to direct higher-cost patients into health care programs designed to help them stay on medications or out of the hospital.

Whites tend to be higher-cost patients even when they’re not as sick as blacks. The study found the software regularly suggested letting healthier white patients into health care risk management programs ahead of blacks who were less healthy because those white patients were more costly.

It may be misleading to ascribe “racial bias” to a program that predicts costs. That there is a difference in outcomes between black and white patients is certain. But why?

The study was based on patient data from one large hospital where blacks cost $1,800 less per year than whites with the same number of chronic illnesses. That’s a pattern seen across the U.S.
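The mechanism the AP describes can be sketched with hypothetical numbers: if a care program admits the highest-cost patients, and one group incurs roughly $1,800 less per year than the other at the same number of chronic illnesses, then ranking by cost will pass over equally sick patients from the lower-cost group. All figures and names below are illustrative, not taken from the study:

```python
# Illustrative sketch of cost-as-proxy selection, assuming the
# reported ~$1,800 annual cost gap at equal chronic-illness counts.
# Group labels "A" and "B" and the cost function are hypothetical.

def annual_cost(illnesses, group):
    """Toy cost model: cost rises with illnesses; group B spends less."""
    base = 3000 * illnesses            # cost grows with chronic illnesses
    gap = 1800 if group == "B" else 0  # group B costs less at same sickness
    return base - gap

def admit_by_cost(patients, slots):
    """Select the `slots` highest-cost patients for the care program."""
    return sorted(patients, key=lambda p: p["cost"], reverse=True)[:slots]

patients = [
    {"group": g, "illnesses": n, "cost": annual_cost(n, g)}
    for g, n in [("A", 3), ("A", 4), ("A", 5), ("B", 3), ("B", 4), ("B", 5)]
]

admitted = admit_by_cost(patients, slots=3)
# A group-A patient with 4 illnesses makes the cut, while the group-B
# patient with the same 4 illnesses does not, purely because of cost.
```

Nothing in the selection rule refers to group membership; the disparity emerges because cost, not sickness, is the ranking key.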

The researchers had no financial relationship with the health data company and did not name the company in their paper, but they did share their findings with the company, Obermeyer said.

The company, Optum, acknowledged that its software was the subject of the study and responded Friday, calling the findings “misleading” because hospitals can and should supplement the company’s cost algorithm with their own socio-economic data.

“The cost model is just one of many data elements intended to be used to select patients for clinical engagement programs, including, most importantly, the doctor’s expertise and knowledge of his or her patient’s individual needs,” said Optum spokesman Tyler Mason.

The results may be racially biased, but again, the disparity is not based on skin color but on cost.

That doesn’t stop the racialists from decrying the hidden racial discrimination hiding behind a “veneer of technical neutrality.”

Other research has tied racial disparities in health care to a cluster of factors including doctors’ unconscious attitudes, blacks’ distrust in the health care system, lack of transportation and poverty.

As big data drives more health care decisions, some experts worry that bias will be further baked into the system. In an accompanying editorial, Ruha Benjamin of Princeton University wrote that older Jim Crow forms of discrimination are feeding into a “New Jim Code” source of bias in “automated systems that hide, speed and deepen racial discrimination behind a veneer of technical neutrality.”

Oh, c’mon. Really? This is results-oriented bias, not the racial discrimination of a “New Jim Code.” It’s clever, but ludicrous. By its very definition, an automated system is incapable of consciously discriminating. It may unintentionally lead to biased results, but that’s hardly evidence of racism. And how can an automated system “hide” racial discrimination? Ms. Benjamin couldn’t resist the opportunity to point out discrimination — especially where none exists.

I think the company is right: hospitals and doctors should be using other criteria besides a computer program to determine care. Perhaps our reliance on technological aids like this is the problem, not “racist” algorithms.