New Research From Psychological Science

Read about the latest research published in Psychological Science:

Where Does Time Go When You Blink?
Shany Grossman, Chen Gueta, Slav Pesin, Rafael Malach, and Ayelet N. Landau

When humans blink, they lose brief moments of vision, yet they rarely notice these gaps. But could blinks change time perception? This research suggests that when humans spontaneously blink, they underestimate the passage of time. While their eye movements were recorded by an eye tracker, participants either saw a white disc or heard white noise for an interval between 0.6 and 2.8 s, and they estimated whether the duration had been closer to the short interval (0.6 s) or to the long interval (2.8 s). To increase the probability of blinks occurring during the task, the researchers first asked participants to perform a visual task in which they saw colored squares and had to decide how many red squares they had seen. In the main task, when a blink occurred during the estimated time interval, participants’ time estimates were reduced when the interval was filled by visual information (the white disc) but not when it was filled by auditory information (the white noise). Moreover, the size of their underestimate depended on the blink duration. These results suggest that (a) unconscious loss of visual input, via spontaneous blinks, may be related to a compression of subjective time and (b) one’s subjective sense of time might be informed by the ongoing processing of sensory information.

Electrophysiological Evidence for Top-Down Lexical Influences on Early Speech Perception
Laura M. Getz and Joseph C. Toscano

How does information about the meaning of words influence speech perception? Getz and Toscano investigated whether feedback from lexical activation affects listeners’ initial representation of the sound of a word. Participants saw a written word, followed by an auditory target word, and they had to decide which sound the auditory target started with (e.g., /p/, /b/). During this task, participants’ electroencephalographic (EEG) data were collected. When the auditory target (e.g., “potatoes”) was associated with the written word (e.g., “MASHED”), participants were faster at identifying the sound than when the written word had a neutral association (e.g., “FACE”) or was a nonword (e.g., “XXXX”). EEG data revealed that the amplitude of the N1, a negative potential in the waveform that indexes early acoustic-cue encoding, was smaller when the written word was associated with the target than when it was neutral or a nonword. In another experiment, the word presented before the target changed how ambiguous targets were perceived (e.g., in “park,” the ambiguous first sound /p/ or /b/ was processed more like /p/ when preceded by “AMUSEMENT” than when preceded by “TEDDY”), as indicated by the N1 amplitude. These results provide evidence for an interactive model of adults’ spoken-word recognition, in which semantic and lexical activation play a role in the early processing of word sounds.

Intentional Binding Without Intentional Action
Keisuke Suzuki, Peter Lush, Anil K. Seth, and Warrick Roseboom

Open Data and Open Materials badges

Experiencing agency over one’s actions and their consequences has been measured by intentional binding, the perceived compression of the time interval between an intentional action (e.g., pressing a button) and an outcome (e.g., an auditory tone). However, a person can also perceive time compression when observing an unintentional causal relationship between events (causal binding). To investigate whether intentional action or causal binding contributes to these time-binding effects, Suzuki et al. used a virtual-reality task in which participants pressed a button, observed it being pressed by a virtual hand, or saw it press down on its own. When the button was pressed, it lit up, participants felt a vibration, and then a sound played. Participants were asked to estimate the time between the button being pressed and hearing the sound. The time estimates were shorter both when participants pressed the button themselves and when they saw another hand press it. However, participants reported higher agency when they actively pressed the button than when they observed the hand doing it, indicating that the perception of time compression may not depend on agency but rather reflect causal binding. Therefore, Suzuki et al. suggest, future studies that relate binding effects to agency should provide evidence for effects beyond causal binding.