People with Dark Skin at Higher Risk of Being Hit by Self-Driving Cars, Study Finds


AI has a problem. Apparently, it’s racist. Specifically, self-driving cars are racist because people of color have a greater chance of being run over by one than white people do, according to a new study from researchers at the Georgia Institute of Technology.


The study concludes that the existing recognition technology used in self-driving cars has a harder time recognizing humans the darker their skin is. The authors write:

We give evidence that standard models for the task of object detection, trained on standard datasets, appear to exhibit higher precision on lower Fitzpatrick skin types than higher skin types. This behavior appears on large images of pedestrians, and even grows when we remove occluded pedestrians. Both of these cases (small pedestrians and occluded pedestrians) are known difficult cases for object detectors, so even on the relatively “easy” subset of pedestrian examples, we observe this predictive inequity.

The study is long and complicated, filled with details and big words I had to look up. It’s also important. Whether we like it or not, our society is moving rather quickly toward automating more and more things, and self-driving cars will eventually be ubiquitous. Because of that, it’s comforting to know that people much smarter than I am are attempting to uncover potential problems with the technology in order to find solutions. But none of that is why I pitched this article to my editor.


It’s an article published by Relevant magazine that caused me to sit up and take notice of the study. Tyler Huckabee opens his article by lamenting the racism of self-driving cars:

It’s easy to start fretting about the coming robo-takeover and all its attendant Terminator and I, Robot-esque predictions about how artificial intelligence is coming for humanity. But for the moment, the real concerns about new technology aren’t so different from the old concerns: systemic bias against people of color.

Systemic bias against people of color in self-driving cars? What?

To be fair, Huckabee does toss in the caveat that the designers of self-driving cars don’t mean to be racist, but that “their own implicit biases work their way into the algorithms they create.”

Look, the fact that researchers are working hard to find solutions to the problems with current recognition technology undercuts claims of “systemic bias against people of color” as it relates to self-driving cars. The researchers found that the detection systems they tested were roughly five percentage points less accurate at recognizing pedestrians with darker skin, but the problem lies in the technology, not in any latent racism among the developers. Variables unrelated to the developers’ motives are creating the issue. As the Georgia Tech study states, “Common challenges of Pedestrian Detection include: occlusion by other objects or people, changes in clothing, and diverse lighting conditions.”
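For readers curious what a “predictive inequity” comparison actually measures, here is a rough, hypothetical sketch in Python. It is not drawn from the study’s code or data; the group labels and numbers below are made up purely to show the idea: detections are matched against ground-truth pedestrians, and precision is tallied separately for lighter-skin and darker-skin annotations.

```python
# Illustrative sketch only -- not the Georgia Tech study's actual code or data.
# It shows, in miniature, what a per-group precision comparison looks like:
# each record notes which (hypothetical) skin-type group a detection belongs to
# and whether the detection was correct, and precision is computed per group.

from collections import defaultdict

# Each record: (skin_type_group, detection_was_correct)
# "LS" = lighter skin types, "DS" = darker skin types (illustrative labels).
detections = [
    ("LS", True), ("LS", True), ("LS", True), ("LS", False),
    ("DS", True), ("DS", True), ("DS", False), ("DS", False),
]

true_positives = defaultdict(int)
false_positives = defaultdict(int)

for group, correct in detections:
    if correct:
        true_positives[group] += 1
    else:
        false_positives[group] += 1

for group in ("LS", "DS"):
    tp, fp = true_positives[group], false_positives[group]
    precision = tp / (tp + fp) if (tp + fp) else float("nan")
    print(f"{group}: precision = {precision:.2f}")
```

The point of a comparison like this is simply to ask whether a detector’s hit rate differs between groups; invented numbers like these say nothing about any real system.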


A teacher once told me that if you look hard enough, you can find a bogeyman behind every tree. The parable of the boy who cried wolf also comes to mind. As a society, we need to be careful about seeing racism everywhere; doing so makes it harder to combat actual racism. Self-driving cars aren’t racist; the technology simply has some bugs that need to be worked out.
