Cattell Fund Projects Explore Music, Mental Imagery, and Visual Meaning

The 2019–2020 James McKeen Cattell Fund Fellowships have been awarded to APS Fellow Brad Wyble, Andrea Halpern, and Clayton Curtis. Presented in partnership with APS, the fellowships allow recipients to extend their sabbatical periods from one semester to a full year. During that time, the researchers plan to pursue the research projects outlined below.


Brad Wyble

Pennsylvania State University

My lab explores how the human mind extracts meaningful events from a continuous stream of sensory information. For example, we have developed models of target selection that explain how the attentional blink (Raymond, Shapiro, & Arnell, 1992) could play a role in dividing a continuous stream of visual input into a series of discrete episodes that encapsulate different stimuli or events (Wyble et al., 2011).

Recent innovations in computer vision have provided biologically inspired models called convolutional neural networks (CNNs) that mimic some aspects of human visual function (Krizhevsky, Sutskever, & Hinton, 2012). These models are mostly focused on single images, whereas biological visual systems integrate information over time to provide a rich comprehension of events and interactions. The focus of my sabbatical will be to develop new models of visual processing that extend CNNs to better understand how the brain extracts meaningful information from video input.
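
As a minimal illustration of this difference (a generic PyTorch sketch, not a model from my lab), a standard convolutional layer processes one image at a time, while a spatiotemporal convolution also slides its kernel across video frames, so its features can reflect change over time:

    # Generic illustration: a 2-D convolution sees one image at a time,
    # while a 3-D convolution also slides its kernel across video frames,
    # letting features reflect change over time.
    import torch
    import torch.nn as nn

    image = torch.randn(1, 3, 64, 64)            # batch x channels x height x width
    conv2d = nn.Conv2d(3, 16, kernel_size=3, padding=1)
    print(conv2d(image).shape)                   # torch.Size([1, 16, 64, 64])

    clip = torch.randn(1, 3, 8, 64, 64)          # batch x channels x frames x h x w
    conv3d = nn.Conv3d(3, 16, kernel_size=3, padding=1)
    print(conv3d(clip).shape)                    # torch.Size([1, 16, 8, 64, 64])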

This idea is inspired by the notion that attention is not just useful for in-the-moment perception; it also helps us learn more efficiently by selecting which information about the world becomes part of our remembered experience. By providing a structured and sparse set of memories to guide learning, attention may increase the rate at which our perceptual systems learn to make sense of the regularities in incoming sensory data. Such theories can be tested with computational models that learn to perceive those regularities (e.g., CNNs). By imbuing these models with biologically and cognitively inspired attentional systems, we can test how attention affects learning. My sabbatical will give me time to learn how to apply such models to video data and, more importantly, how to incorporate analogs of human attention into them. I am indebted to the James McKeen Cattell Fund for allowing me to spend an entire sabbatical year developing new directions in my research program.
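
A toy version of the attentional-gating idea described above (with an arbitrary "surprise" rule and threshold chosen purely for illustration) shows how a gate on a continuous stream can yield a sparse memory for a learner to train on:

    # Toy sketch: attention as a gate on what enters memory for learning.
    # A simple "surprise" rule keeps only inputs far from the running
    # expectation; a model would then learn from this sparse subset.
    import numpy as np

    rng = np.random.default_rng(0)
    stream = rng.normal(size=(1000, 10))         # continuous sensory stream

    expectation = np.zeros(10)
    memory = []
    for x in stream:
        surprise = np.linalg.norm(x - expectation)
        if surprise > 4.2:                       # attentional gate (threshold assumed)
            memory.append(x)                     # sparse episodic memory
        expectation = 0.99 * expectation + 0.01 * x

    print(f"stored {len(memory)} of {len(stream)} inputs")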

References

Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2012). ImageNet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems (pp. 1097–1105).

Raymond, J. E., Shapiro, K. L., & Arnell, K. M. (1992). Temporary suppression of visual processing in an RSVP task: An attentional blink? Journal of Experimental Psychology: Human Perception and Performance, 18(3), 849–860. https://doi.org/10.1037/0096-1523.18.3.849

Wyble, B., Potter, M. C., Bowman, H., & Nieuwenstein, M. (2011). Attentional episodes in visual perception. Journal of Experimental Psychology: General, 140, 488–505. https://doi.org/10.1037/a0023612


Andrea Halpern

Bucknell University

I’ve always been interested in understanding how the mind and brain perceive and remember nonverbal information, especially music. Because most people respond to the structure and function of music, it is a natural domain in which to study how these responses relate both to musical training and to developmental trajectories. A special interest of mine is understanding auditory imagery as a mode of music retrieval.

My sabbatical year will be divided between the University of Maryland and the University of Durham (UK). With two collaborators in Maryland, I will combine machine-learning methods with EEG data to ask whether the neural activity associated with hearing environmental sounds can predict which environmental sound a person later imagines. Another project will follow up on prior work showing that musical conductors are better than pianists at monitoring two musical streams. We’ll add a language task to examine whether this advantage generalizes to other auditory domains, and record EEG to see whether neural responses to targets are stronger among the conductors. A third project will explore auditory mental imagery in healthy older adults.
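
The core decoding logic of such work can be sketched in a few lines: train a classifier to predict which sound category a trial's EEG features came from, scored with cross-validation. This is a generic sketch with synthetic stand-in data, not the project's actual pipeline:

    # Generic EEG-decoding sketch: predict the sound category of each
    # trial from its feature pattern, with cross-validated accuracy.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_trials, n_features, n_sounds = 200, 64, 4  # e.g., channel-by-time features
    y = rng.integers(0, n_sounds, n_trials)      # which sound was heard
    X = rng.normal(size=(n_trials, n_features))
    X[np.arange(n_trials), y] += 1.5             # inject a decodable signal

    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"decoding accuracy: {scores.mean():.2f} (chance {1 / n_sounds:.2f})")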

In Durham, I will be working in the Music Department on two music-emotion projects. One will extend my prior work showing that musicians display a larger EEG signal than nonmusicians when they hear the first minor interval in a melody (an interval that conveys negative affect). Will nonmusicians show this response if the melody is played on a musical instrument that sounds “happy” rather than “sad”? Which cue will “win” in conveying the emotion of a melody when the two are in conflict?

Another project deals with aesthetic reactions to music. A prior study showed that when people are asked to rate music for pleasantness in real time, their ratings move unidirectionally toward the positive or negative end of the scale. We are asking whether decision time differs between older and younger adults. Older adults are typically slower, but they also have many decades of experience with knowing their own tastes.


Clayton Curtis

New York University

For decades, scientists have argued about the importance and function of mental imagery (Pylyshyn, 2002; Kosslyn et al., 2006) and how the brain might support it. We now know that mental imagery evokes complex patterns of neural activity in the early visual cortex (Klein et al., 2000) that resemble the patterns evoked when one actually sees the image or holds it in working memory (Albers et al., 2013). This suggests that the neural mechanisms that allow us to see things outside our bodies also allow us to see imagined or recalled things in our minds. Moreover, there appear to be large individual differences in imagery ability, including so-called aphantasia, a loss of visual mental imagery (Zeman et al., 2010). Yet the causes of this variation remain unknown.

With the support of the Cattell Fund Award, I will use recent computational neuroimaging developments in my lab that allow us to map almost two dozen retinotopic areas in occipital, parietal, and frontal cortex (Mackey et al., 2017) and to model the contents of both perceived and imagined images (Rahmati et al., 2018). The main idea is that variation in these topographic maps underlies the variation in mental imagery ability. Leveraging these developments through collaborations across New York University’s Global Campuses, we will test a variety of hypotheses about the cognitive and neural mechanisms that enhance and limit mental imagery.
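
The logic of such encoding models can be sketched schematically: voxel responses are modeled as weighted sums of spatially tuned channels, the weights are estimated from perception trials, and the model is then inverted to estimate what a new activity pattern represents. The sketch below uses synthetic data and a basic least-squares fit; see Rahmati et al. (2018) for the actual method:

    # Schematic of the encoding-model logic: estimate voxel tuning weights
    # from perception trials, then invert the model to recover what a new
    # activity pattern (e.g., an imagery trial) represents. Synthetic data.
    import numpy as np

    rng = np.random.default_rng(2)
    n_voxels, n_channels, n_trials = 100, 8, 120

    # Idealized channels tiling visual space (one-hot per stimulus position).
    pos = rng.integers(0, n_channels, n_trials)
    C = np.zeros((n_trials, n_channels))
    C[np.arange(n_trials), pos] = 1.0

    W = rng.normal(size=(n_channels, n_voxels))                   # true voxel tuning
    B = C @ W + rng.normal(scale=0.5, size=(n_trials, n_voxels))  # measured activity

    # 1) Fit voxel weights from "perception" trials: B ~ C @ W.
    W_hat = np.linalg.lstsq(C, B, rcond=None)[0]

    # 2) Invert for a new activity pattern to recover channel responses.
    b_new = W[3] + rng.normal(scale=0.5, size=n_voxels)           # pattern for position 3
    c_hat = np.linalg.lstsq(W_hat.T, b_new, rcond=None)[0]
    print("decoded position:", int(np.argmax(c_hat)))             # should print 3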

References

Albers, A. M., Kok, P., Toni, I., Dijkerman, H. C., & de Lange, F. P. (2013). Shared representations for working memory and mental imagery in early visual cortex. Current Biology, 23, 1427–1431.

Klein, I., Paradis, A.-L., Poline, J.-B., Kosslyn, S. M., & Le Bihan, D. (2000). Transient activity in the human calcarine cortex during visual-mental imagery: An event-related fMRI study. Journal of Cognitive Neuroscience, 12(Suppl. 2), 15–23.

Kosslyn, S. M., Thompson, W. L., & Ganis, G. (2006). The case for mental imagery. New York, NY: Oxford University Press.

Mackey, W. E., Winawer, J., & Curtis, C. E. (2017). Visual field map clusters in human frontoparietal cortex. eLife, 6, Article e22974. https://doi.org/10.7554/eLife.22974

Pylyshyn, Z. W. (2002). Mental imagery: In search of a theory [Target article and commentaries]. Behavioral and Brain Sciences, 25, 157–237.

Rahmati, M., Saber, G. T., & Curtis, C. E. (2018). Population dynamics of early visual cortex during working memory. Journal of Cognitive Neuroscience, 30, 219–233.

Zeman, A. Z., Della Sala, S., Torrens, L. A., Gountouna, V.-E., McGonigle, D. J., & Logie, R. H. (2010). Loss of imagery phenomenology with intact visuo-spatial task performance: A case of ‘blind imagination’. Neuropsychologia, 48, 145–155.

