Members in the Media
From: NPR's Science Friday

Why AI Is A Growing Part Of The Criminal Justice System

Facial recognition technology is all around us—it’s at concerts, airports, and apartment buildings. But its use by law enforcement agencies and courtrooms raises particular concerns about privacy, fairness, and bias, according to Jennifer Lynch, the Surveillance Litigation Director at the Electronic Frontier Foundation. Studies have shown that several major facial recognition systems are inaccurate: in one test, Amazon’s software misidentified 28 members of Congress, matching them with criminal mugshots. These inaccuracies tend to be far worse for people of color and women.

Meanwhile, companies like Amazon, Microsoft, and IBM also develop and sell “emotion recognition” algorithms, which claim to identify a person’s emotions from facial expressions and movements. But experts on facial expression, like Lisa Feldman Barrett, a professor of psychology at Northeastern University, warn that it is extremely unlikely these algorithms can detect emotions from facial expressions and movements alone.

Read the whole story (subscription may be required): NPR's Science Friday

