Jason P. Gallivan, Craig S. Chapman, Daniel K. Wood, Jennifer L. Milne, Daniel Ansari, Jody C. Culham, and Melvyn A. Goodale
The capacity to process sensory information about multiple objects at once is limited, but does a similar limit apply when someone is planning an action involving those objects? Participants were asked to touch targets on a computer screen within a limited time window. The number of targets that appeared, as well as their distance from the center of the screen, varied. Analysis of the participants’ reach trajectories revealed that only three to four objects could be encoded simultaneously for a given action. Thus, conscious perceptual processing and visuomotor planning may be limited by a shared mechanism.
Stefan L. Frank and Rens Bod
Words in a sentence depend on other words, and those dependencies can be organized into a hierarchical phrase structure. To better understand how such structures affect sentence comprehension, the researchers evaluated several probabilistic language models, some based on hierarchical structure and others on sequential structure. Based on these evaluations, they concluded that human sentence processing may rely more on sequential than on hierarchical structure.
Roland W. Fleming, Frank Jäkel, and Laurence T. Maloney
Viewing transparent objects is complex because light passing through them is refracted and reflected several times before reaching the eye. To test whether the distorted shapes of objects seen through a transparent substance provide cues about that substance, the researchers asked observers to view computer-generated images of an object placed behind a transparent material. They varied the level of distortion by changing the refractive properties of the material, and observers were asked to pick out the stimulus pair that differed most. The researchers found that variations in the distortion changed how observers perceived the transparent material.