By: Grace Liu
During a recent scientific experiment, researchers asked specially programmed robots to scan blocks with people’s faces on them and put the “criminal” in the box. The robots repeatedly chose a block with the face of a person who might identify as Black.
Additionally, when the robots were asked to identify blocks with faces of people who might be characterized as “homemakers” and “janitors,” they repeatedly chose blocks with the faces of women and people of color. Research from institutions like Johns Hopkins University and the Georgia Institute of Technology has shown how stereotypes baked into AI systems carry over into the robots built on them.
Recently, companies have been pouring billions of dollars into robots that stock shelves, deliver goods, and even care for hospital patients. With the pandemic driving labor shortages, robots and AI have become something of a gold rush. But experts warn that there may be unforeseen consequences as the technology advances.
In recent years, studies have documented bias in AI algorithms. According to The Washington Post, “That includes crime prediction algorithms unfairly targeting Black and Latino people for crimes they did not commit, as well as facial recognition systems having a hard time accurately identifying people of color.”
Scientists at Princeton University set out to determine whether the AI itself was biased or whether the machine learning system learned these perceptions from its training data. They found that “The AI studies the words, how they’re used, and what words they’re used in association with, in order to provide natural language responses and answers in a way that we can understand. It also, as it turns out, learns some of our more unfortunate quirks.”
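To illustrate the idea of learning by word association, here is a minimal, purely hypothetical sketch. The word vectors below are hand-made toy numbers, not the output of any real system or of the Princeton study; the point is only that when a model learns vectors from text where certain words co-occur, those co-occurrences show up as measurable similarities, stereotypes included.

```python
import numpy as np

# Toy, hand-made word vectors (hypothetical values for illustration only).
# Real systems learn vectors like these from billions of words of text,
# so the vectors end up reflecting which words appear together.
vectors = {
    "doctor": np.array([0.9, 0.2, 0.1]),
    "nurse":  np.array([0.2, 0.9, 0.1]),
    "he":     np.array([0.8, 0.1, 0.3]),
    "she":    np.array([0.1, 0.8, 0.3]),
}

def cosine(a, b):
    """Cosine similarity: a rough measure of how strongly two words are associated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# If the training text pairs "nurse" with female pronouns more often than male ones,
# the learned vectors encode that association -- the model has absorbed a
# stereotype that was already present in its data.
print("nurse  ~ she:", round(cosine(vectors["nurse"], vectors["she"]), 2))
print("nurse  ~ he: ", round(cosine(vectors["nurse"], vectors["he"]), 2))
print("doctor ~ he: ", round(cosine(vectors["doctor"], vectors["he"]), 2))
print("doctor ~ she:", round(cosine(vectors["doctor"], vectors["she"]), 2))
```

In this toy setup, “nurse” scores closer to “she” and “doctor” closer to “he,” mirroring how biased associations in training text can surface as numbers a downstream system then acts on.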
To prevent future robots from adopting and reenacting these stereotypes, systematic changes in research and business practices are needed.
Link to Article: https://s3.amazonaws.com/appforest_uf/f1658069860888x666689010042064400/Robots%20trained%20on%20AI%20exhibited%20racist%20and%20sexist%20behavior%20-%20The%20Washington%20Post.pdf