Recently published research supported by NIMH and NIDA sheds light on how our brains process visual information with emotional features, combining machine-learning innovations with human brain imaging. The researchers started with an existing neural network, AlexNet, which enables computers to recognize objects, and adapted it using prior research that identified stereotypical emotional responses to images. The resulting network, EmoNet, was then asked to categorize 25,000 images into 20 categories such as craving, sexual desire, horror, awe, and surprise. EmoNet accurately and consistently categorized 11 of the emotion types and reliably rated the emotional intensity of the images.
Read the whole story: National Institutes of Health (NIH) Office of Behavioral and Social Sciences Research