Members in the Media
From: Scientific American

Humans Absorb Bias From AI—and Keep It After They Stop Using the Algorithm

Artificial intelligence programs, like the humans who develop and train them, are far from perfect. Whether it’s machine-learning software that analyzes medical images or a generative chatbot, such as ChatGPT, that holds a seemingly organic conversation, algorithm-based technology can make errors and even “hallucinate,” or provide inaccurate information. Perhaps more insidiously, AI can also display biases that get introduced through the massive data troves that these programs are trained on—and that are undetectable to many users. Now new research suggests human users may unconsciously absorb these automated biases.

Past studies have demonstrated that biased AI can harm people in already marginalized groups. Some impacts are subtle, such as speech recognition software’s inability to understand non-American accents, which might inconvenience people using smartphones or voice-operated home assistants. Then there are scarier examples—including health care algorithms that make errors because they’re only trained on a subset of people (such as white people, those of a specific age range or even people with a certain stage of a disease), as well as racially biased police facial recognition software that could increase wrongful arrests of Black people.

Read the whole story (subscription may be required): Scientific American


