Psychological Science Meets Sensory Technology
APS Award Address
Think of sensory technology — wearables, virtual assistants, virtual reality, and more — and engineers, programmers, and computer scientists often come to mind. But the rise of these devices also means a new horizon of possibilities for psychological scientists, according to APS James McKeen Cattell Fellow and longtime APS Treasurer Roberta L. Klatzky.
Sensory technology not only can help us to appreciate the world by stimulating our sensory systems in new ways, but also can be used to advance psychological research, said Klatzky, a professor of psychology and human-computer interaction at Carnegie Mellon University, during her May 2019 award address at the APS Annual Convention in Washington, DC.
“We can look at behavior in environments that we could never actually create in reality, through virtual reality,” she said. “We can measure behavior with great precision — kinematics, dynamics, gaze behavior . . . even physiological reactions.”
Through collaborations with engineers, medical doctors, and other experts, Klatzky has developed and improved devices to enhance cognitive function, compensate for sensory-motor loss, teach, and entertain. These include a robotic rehabilitation game for stroke patients and a reading device for children that simulates activity in a story, such as the feeling of rain.
Psychological scientists bring a unique understanding of the human user to these partnerships, Klatzky said, helping engineers to pinpoint how their technology can be most effective while uncovering new perspectives.
“Glitches in Perception” Can Improve Stroke Rehabilitation
In many situations, our senses do not transfer information as smoothly as expected, Klatzky said. These “errors” can be capitalized on to design sensory technologies that benefit a range of individuals, including patients recovering from strokes.
Klatzky, Susan Lederman (Queen’s University, Canada), and Jack Loomis (University of California, Santa Barbara) studied one such error in perception by asking participants to identify raised-line graphics of objects, such as a key or a hammer, by touch. To the researchers’ surprise, participants largely could not.
In another study, participants had trouble identifying an object when looking at only a part of it through a narrow aperture. This inability to recognize what one is touching or only partially seeing suggests that sensory perception “doesn’t feed into a mind’s eye where objects are easily recognized,” said Klatzky.
A related perceptual limit is the “just noticeable difference,” the smallest change a person can detect, such as the shift of multiple decibels it takes for humans to notice a change in audio volume. In PhD research with Bambi Brewer at Carnegie Mellon University, Klatzky applied this limit in a robotic rehabilitation hangman game. Throughout the game, stroke patients chose letters by moving their fingers to control a robot. As the game went on, patients with extension problems had to move their fingers farther apart, and those with flexion problems had to move their fingers closer together. Because each increase stayed below the just noticeable difference, patients did not realize they were exerting more effort. After 6 weeks of training, they showed measurable functional improvements.
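For readers who want the progression principle made concrete, the idea of raising the required movement by increments too small to notice can be sketched in a few lines of Python. This is an illustrative toy, not Klatzky and Brewer’s actual software; the Weber fraction and starting excursion below are hypothetical values chosen only for the example.

```python
# Illustrative sketch: escalate a rehab movement target by steps
# just below the just-noticeable difference (JND), so the added
# effort goes unnoticed by the patient.

WEBER_FRACTION = 0.10  # hypothetical JND for movement amplitude (10%)

def progress_targets(start_cm, sessions):
    """Yield one target finger excursion (in cm) per session, each
    raised by slightly less than one JND over the previous one."""
    target = start_cm
    for _ in range(sessions):
        yield round(target, 2)
        target *= 1 + 0.9 * WEBER_FRACTION  # stay under the JND

targets = list(progress_targets(start_cm=4.0, sessions=6))
```

The 0.9 factor keeps every step safely inside the detection threshold; over six sessions the target still grows by roughly half, which is how sub-threshold increments can accumulate into effort well beyond what a patient would knowingly accept.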
“People were willing to work way beyond what they originally defined as their functional limitations,” said Klatzky. “This is a great way to exploit the inadequacies or the failures or the glitches in perception.”
Perceiving Your Next Move
Many researchers have long assumed that actions fall into the domain of either perception or cognition. But some cognitive actions — which are typically demanding on mental resources — can become part of perception, Klatzky said. Expert chess players, for example, can perceive their next move in a game rather than thinking or planning for it, according to research by APS William James Fellow Herbert Simon and William Chase (Carnegie Mellon University).
“Wherever possible, move a task to the perceptual system, because cognition is much more taxing to our systems than perceiving,” said Klatzky. This implies that sensory technology could be beneficially engineered to “off-load” a task from cognition to perception.
A device for vision-guided surgery that projects ultrasound images onto patients’ bodies to guide surgeons’ incisions puts this principle into practice.
“[I]t becomes a perceptually guided action rather than a cognitively guided action,” Klatzky said. “That worked very well to improve accuracy, cognitive load, and a bunch of other factors in the task.”
“Do Not Touch”
The sensory system is critical in guiding decision-making and action, Klatzky explained.
“Perception is just the doorway, but if you control the doorway, you control what gets through it,” she said.
For example, people’s first instinct upon seeing a novel object is often to touch it. This is why museums need signs that say “please do not touch” or why pregnant women sometimes complain of total strangers approaching them to touch their bellies, said Klatzky.
Perception leads to representation, and that “is the doorway to everything else . . . action, and further upstream, aesthetic appreciation, higher meaning, inference, extrapolation,” Klatzky said.
Research by Klatzky and Joann Peck (University of Wisconsin, Madison) found that the appearance of perfume bottles induced study participants to want to touch them, and that this tendency was stronger among participants who scored higher on a scale measuring intrinsic desire to touch objects. The findings suggest that an appearance that visually invites touch could be deliberately built into product design.
Of Snowball Fights and Robot Hugs
Klatzky’s research has contributed to a variety of sensory technologies. One such device is an electronic reader designed by Disney that uses tactile sensations to improve children’s reading comprehension and memory. Children put their hands in the Mickey Mouse gloves included with the device, allowing them to feel simulations of actions in the book such as the sensation of rain drops on their hands when they read about a storm in a jungle.
A “Force Jacket” vest, also designed by Disney, uses air compartments that inflate to exert pressure across the chest, allowing people to feel the impact of a snowball in a snowball fight, the slither of a snake across their body, or a transformation into a muscular hero.
Klatzky’s undergraduate students at Carnegie Mellon have also built a soft robot to collect data on whether a robot hug could ever replicate the real thing.
“I hope that more widgets come my way, because it’s endlessly fascinating to do this,” Klatzky said.
— Elaine Meyer is a freelance writer based in Chicago.