Most Findings Obtained With Untimed Visual Illusions Are Confounded
Paola Bressan and Peter Kramer
Differences in how people or groups perceive visual illusions might be due to differences in how long they inspect the illusions rather than to differences in their mental processes. Bressan and Kramer show that when presentation time is unrestricted, susceptibility to visual illusions depends on how long participants look at them; depending on the illusion, this susceptibility can decrease, increase, or remain unchanged. These findings suggest that it is important to consider inspection time when investigating visual illusions or using them as tools.
People Are Less Susceptible to Illusion When They Use Their Hands to Communicate Rather Than Estimate
Amanda R. Brown, Wim Pouw, Diane Brentari, and Susan Goldin-Meadow
English speakers and American Sign Language signers used their hands to act on, estimate the length of, and describe sticks eliciting the Müller-Lyer illusion (an optical illusion in which two lines or sticks of the same length appear to be of different lengths). It had been known that using hands to estimate the length of a stick maintains the illusion whereas using hands to act on the stick diminishes the illusion. Here, when both groups of participants described the stick, their hands moved as though they were preparing to grasp it, and this reduced the illusion. Thus, although gesture and sign are tied to language, their roots may lie in action.
Global Variation in Subjective Well-Being Predicts Seven Forms of Altruism
Shawn A. Rhoads, Devon Gunter, Rebecca M. Ryan, and Abigail A. Marsh
Nations with higher objective and subjective (self-reported) well-being and more individualist values (i.e., prioritizing autonomy and self-expression over collective interests) appear to have a higher prevalence of altruistic behaviors than nations with lower well-being and more collectivist cultural values. Rhoads and colleagues examined seven types of altruistic behaviors—blood donation, bone-marrow donation, living-kidney donation, humane treatment of animals, charitable donations, volunteering, and everyday helping—across 48 to 152 nations (the number depending on the behavior). Altruistic behaviors were more frequent when resources and cultural values provided means for pursuing personally meaningful goals. Thus, societal changes that promote well-being might foster altruistic behaviors.
Response Bias Reflects Individual Differences in Sensory Encoding
Individual differences in response bias appear to stem partly from differences in sensory encoding and from the use of normative decision strategies given that encoding. To test whether decision-making biases reflect how individuals encode two stimulus categories, Rahnev used computational modeling to analyze published data from three perceptual decision tasks (e.g., selecting which of two black boxes had more white dots). Results indicated that each individual's response bias moved in the direction of the optimal criterion determined by that individual's idiosyncratic internal-evidence distributions for the two categories. Thus, individuals appear to encode sensory stimuli differently, giving rise to different decision criteria that ultimately influence their decisions.
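Under standard signal detection assumptions (equal-variance Gaussian internal-evidence distributions), the "optimal criterion" this summary refers to has a simple closed form. The following is a minimal illustrative sketch, not the paper's actual model; the function names are hypothetical:

```python
import math

def optimal_criterion(mu_a, mu_b, sigma, p_a=0.5):
    """Optimal decision criterion for two equal-variance Gaussian
    internal-evidence distributions (means mu_a < mu_b, common
    standard deviation sigma) with prior probability p_a for
    category A. With equal priors this is the midpoint of the two
    means; unequal priors shift the criterion toward the less
    probable category."""
    return (mu_a + mu_b) / 2 + sigma**2 / (mu_b - mu_a) * math.log(p_a / (1 - p_a))

def choose(evidence, criterion):
    """Classify a single sample of internal evidence."""
    return "B" if evidence > criterion else "A"

# With equal priors, the criterion sits halfway between the means:
print(optimal_criterion(0.0, 1.0, 1.0))  # 0.5
# A different encoding (narrower separation of the means) yields a
# different optimal criterion, and hence a different predicted bias:
print(optimal_criterion(0.0, 0.5, 1.0, p_a=0.7))
```

On this reading, observers whose encoding yields different means and variances should settle near the criterion that is optimal for their own distributions, which is the pattern the modeling results describe.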
The Experience of Empathy in Everyday Life
Gregory John Depow, Zoë Francis, and Michael Inzlicht
Depow and colleagues examined empathy in the everyday lives of a group of U.S. adults. Participants reported their experiences of empathy in daily experience-sampling surveys for 7 days. They reported an average of about nine opportunities to empathize per day, and these opportunities were associated with prosocial behavior. Participants most often empathized with very close others, and they were about 3 times more likely to share those others' positive emotions than their negative emotions. Emotion sharing, perspective taking, and compassion typically co-occurred in everyday empathy. In contrast to findings on trait empathy, empathy in daily life was associated with increased well-being.
Looking for Semantic Similarity: What a Vector-Space Model of Semantics Can Tell Us About Attention in Real-World Scenes
Taylor R. Hayes and John M. Henderson
How does semantic knowledge about objects impact attention in real-world scenes? Participants viewed scene images (e.g., a kitchen) while the researchers recorded their eye movements to measure their attention. Using an index of the spatial distribution of semantic similarity of objects in a given scene, the researchers found that participants were more likely to direct their attention to regions with objects more semantically similar to the other objects in the scene and the scene category (e.g., a stove in a kitchen). These findings suggest that individuals use their previous knowledge to guide their attention and selectively process complex visual scenes.
Charged With a Crime: The Neuronal Signature of Processing Negatively Evaluated Faces Under Different Attentional Conditions
Sebastian Schindler, Maximilian Bruchmann, Claudia Krasowski, Robert Moeck, and Thomas Straube
Schindler and colleagues examined the neuronal processing of faces associated with crimes. Participants identified the orientation of lines overlaid onto a face, the age of the face, or its emotional expression while the researchers recorded electroencephalograms to measure event-related potentials (ERPs). Early in the presentation of the faces of supposed criminals, brain responses were potentiated regardless of attention. When participants' attention was directed to the face's emotional expression, late neural activity increased. These findings suggest that negative biographical information can affect the neuronal processing of faces.
Mutual Information and Categorical Perception
Feldman examined which dimensions are treated as informative, and why, when individuals distinguish between categories (e.g., fruits and trees) on the basis of their features (e.g., colors and shapes). In a series of experiments, participants learned novel categories differentiated by subtle shape features. After category learning, participants became better at discriminating each feature in proportion to how informative it was about the categories they had learned (i.e., the mutual information between the feature and the category), suggesting that perceptual discrimination may be tuned to the environment's statistical structure.
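Mutual information quantifies how much observing a feature value reduces uncertainty about the category. A minimal sketch of the quantity the summary invokes, assuming discrete feature values and estimating probabilities from co-occurrence counts (the function name is hypothetical, not from the paper):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(feature; category) in bits from a list of
    (feature_value, category_label) observations."""
    n = len(pairs)
    joint = Counter(pairs)                  # counts for p(f, c)
    p_f = Counter(f for f, _ in pairs)      # counts for p(f)
    p_c = Counter(c for _, c in pairs)      # counts for p(c)
    mi = 0.0
    for (f, c), count in joint.items():
        p_fc = count / n
        mi += p_fc * math.log2(p_fc / ((p_f[f] / n) * (p_c[c] / n)))
    return mi

# A perfectly diagnostic shape feature carries 1 bit of information
# about two equally frequent categories:
print(mutual_information([("round", "A")] * 5 + [("spiky", "B")] * 5))  # 1.0
# A feature unrelated to category membership carries none:
print(mutual_information([("round", "A"), ("round", "B"),
                          ("spiky", "A"), ("spiky", "B")]))             # 0.0
```

On the account summarized above, improvements in perceptual discrimination after category learning should track this quantity feature by feature.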
Nonsymbolic-Magnitude Deficit in Adults With Developmental Dyscalculia: Evidence of Impaired Size Discrimination but Intact Size Constancy
Nirit Fooks, Bat-Sheva Hadad, and Orly Rubinsten
Adults with developmental dyscalculia (DD; a learning disability affecting the acquisition of arithmetic skills) show impaired size discrimination but intact size constancy even when visual depth cues alter the perceived distances to objects, this research suggests. Adults with DD and typically developing adults chose which of two spheres, accompanied by visual depth cues, was larger. Compared with typically developing adults, adults with DD were less sensitive to subtle differences in sphere size but showed stable size representations despite variations in perceived distance, indicating that a core deficit in the mental representation of nonsymbolic magnitude may underlie DD.