From: National Institutes of Health (NIH) Office of Behavioral and Social Sciences Research

Seeing is feeling – How Artificial Intelligence is helping us understand emotions

Recently published research supported by NIMH and NIDA combines machine-learning innovations with human brain imaging to shed light on how our brains process emotionally evocative visual information. The researchers started with AlexNet, an existing neural network that enables computers to recognize objects, and adapted it using prior research that identified stereotypical emotional responses to images. The resulting network, EmoNet, was then asked to sort 25,000 images into 20 emotion categories such as craving, sexual desire, horror, awe, and surprise. EmoNet accurately and consistently identified 11 of the emotion categories and reliably rated the emotional intensity of the images.
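
For readers curious what "adapting" an object-recognition network looks like in practice, the sketch below shows the general transfer-learning technique in PyTorch: keep AlexNet's pretrained visual features and replace its final object classifier with a new emotion classifier. This is an illustration of the technique, not the study's actual code; the pretrained weights, the choice to freeze the feature layers, and the optimizer settings are all assumptions.

    # A minimal transfer-learning sketch, assuming a PyTorch setup.
    # The training data, loss, and hyperparameters here are illustrative,
    # not those used in the EmoNet study.
    import torch
    import torch.nn as nn
    from torchvision import models

    NUM_EMOTIONS = 20  # e.g., craving, sexual desire, horror, awe, surprise, ...

    # Start from AlexNet pretrained for object recognition.
    model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

    # Freeze the convolutional feature extractor so only the new head learns.
    for param in model.features.parameters():
        param.requires_grad = False

    # Swap the final 1000-way object classifier for a 20-way emotion classifier.
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_EMOTIONS)

    # Train only the new layer on images labeled with emotion categories.
    optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

In this setup, the network's existing knowledge of visual features (edges, textures, objects) is reused, and only the final mapping from those features to emotion categories is learned, which mirrors the idea of building EmoNet on top of an object-recognition backbone.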

Read the whole story: National Institutes of Health (NIH) Office of Behavioral and Social Sciences Research
