Members in the Media
From: National Institutes of Health (NIH) Office of Behavioral and Social Sciences Research

Seeing is feeling – How Artificial Intelligence is helping us understand emotions

Recently published research supported by NIMH and NIDA combines machine-learning innovations with human brain imaging to shed light on how our brains process visual information with emotional features. The researchers started with an existing neural network, AlexNet, which enables computers to recognize objects, and adapted it using prior research that identified stereotypical emotional responses to images. The resulting network, EmoNet, was then asked to sort 25,000 images into 20 categories such as craving, sexual desire, horror, awe, and surprise. EmoNet could accurately and consistently categorize 11 of the emotion types and reliably rate the emotional intensity of the images.

Read the whole story (subscription may be required): National Institutes of Health (NIH) Office of Behavioral and Social Sciences Research



