Across the automotive and transportation industry, executives are hoping to render drivers a thing of the past. Not only are autonomous or self-driving taxi cabs, private cars and commercial trucks considered to be cheaper in terms of labor costs, but they are also expected to be far safer than vehicles that require actual human operators.
Yet what if driverless cars also carry a racial bias that makes them more likely to crash into black and darker-skinned people?
At a glance, the idea may sound strange: these are machines, after all, tools that should, by definition, be blind to skin color and neutral in matters of race.
But a new study from the Georgia Institute of Technology found that the high-tech visualization and detection systems in these vehicles rely on the same sensors and cameras that have previously made skin-tone-related mistakes when applied to other automated technologies.
In practical terms, this means that such cars are less likely to detect black pedestrians, and to hit the brakes before crashing into them, than they are to detect people with lighter skin tones.
The study’s authors asked how accurately current state-of-the-art object detection algorithms detect people from different demographic groups. The group categorized images of pedestrians using the Fitzpatrick scale, a standard system for classifying human skin tones.
The object detection models were then put to the test, and the results showed that pedestrians with darker skin were detected at a rate roughly five percent lower than those with lighter skin, a disparity that held even when factors such as time of day and obstructed views of pedestrians were taken into account.
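To make that kind of measurement concrete, here is a minimal, hypothetical sketch in Python of how a per-group detection rate could be computed, assuming each annotated pedestrian carries a Fitzpatrick skin-type label and a flag for whether a detector found them. None of the names, fields, or numbers below come from the study itself; this is only an illustration of the arithmetic behind such a comparison.

```python
# Illustrative sketch (not the study's code): comparing pedestrian-detection
# rates across Fitzpatrick skin-tone groups. All data below are hypothetical.
from collections import defaultdict

# Each record: the pedestrian's Fitzpatrick type (1-6) and whether the
# detector produced a matching bounding box for that pedestrian.
annotations = [
    {"fitzpatrick": 2, "detected": True},
    {"fitzpatrick": 5, "detected": False},
    {"fitzpatrick": 6, "detected": True},
    {"fitzpatrick": 1, "detected": True},
]

counts = defaultdict(lambda: {"hits": 0, "total": 0})
for ann in annotations:
    # Group types I-III as "lighter" and IV-VI as "darker", as in the study's framing.
    group = "lighter (I-III)" if ann["fitzpatrick"] <= 3 else "darker (IV-VI)"
    counts[group]["total"] += 1
    counts[group]["hits"] += int(ann["detected"])

for group, c in counts.items():
    rate = c["hits"] / c["total"]
    print(f"{group}: detection rate {rate:.1%} ({c['hits']}/{c['total']})")
```

The reported disparity is simply the gap between these two rates, which the researchers found persisted after controlling for conditions such as lighting and occlusion.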
The study concluded:
“We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models.”
AI researcher Kate Crawford, who wasn’t involved with the study, commented that these issues have long been known by critics. “Guess what? Study shows that self-driving cars are better at detecting pedestrians with lighter skin tones. Translation: Pedestrian deaths by self-driving cars are already here – but they’re not evenly distributed,” she tweeted.
Technologists have long warned that the current crop of automated machine-learning and facial-recognition algorithms reflects the systemic racial and skin-color biases that prevail in society.
In past tests, facial-recognition software such as Amazon’s notorious “Rekognition” algorithm has mistaken darker-skinned members of Congress for criminal suspects and has shown high error rates when determining the gender of women and darker-skinned people.