After driving more than 1 million miles, Google’s self-driving cars have been involved in 17 reported crashes. Sixteen of these accidents were blamed on errors by other, human drivers, but the Google car’s first at-fault crash appears to stem from the car’s lack of nuanced social cognition.
According to the accident report, the Google car was driving in autonomous mode in the far right lane when it encountered a pile of sandbags blocking the street. To get around the obstacle, the self-driving car tried to merge into the center lane, where a public bus was approaching from behind. The self-driving car and its test driver assumed the bus would let them in; the bus driver assumed the car would wait to merge.
“Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day,” Google explained in a report.
This incident illustrates just how tricky it is even for human drivers, let alone robots, to correctly interpret each other’s intentions. Eye contact is enormously important in human communication, particularly in driving. It’s possible that this kind of accident could have been avoided if the operators of the two vehicles had been able to communicate through eye contact, as researchers have found that a quick glimpse of another person’s eyes jump-starts an elaborate cascade of neurological events.
“It has long been known that predictive eye movements play an important role in guiding our own actions,” Gustaf Gredebäck (Uppsala University) and Terje Falck-Ytter (Karolinska Institutet) write in Perspectives on Psychological Science. “A large body of work has shown that predictive eye movements are functionally integrated into most everyday actions that we perform, including reaching, walking, driving, playing sports, and cooking.”
“The idea was that when you see someone else act, you activate your own motor plans for similar actions, and these motor plans include instructions for the oculomotor system to implement goal-directed, predictive saccades,” Gredebäck and Falck-Ytter explain.
So, when a driver trying to merge is able to quickly lock eyes with the driver in the next lane, both drivers are primed to anticipate each other’s actions.
In a recent set of experiments, psychological scientists Anne Böckler (Max Planck Institute for Human Cognitive and Brain Sciences), Robrecht P. R. D. van der Wel (Rutgers University), and Timothy N. Welsh (University of Toronto) confirm previous findings that eye contact can improve information processing speed.
“Humans’ sensitivity to gaze cues is striking,” the researchers write in Psychological Science. “Gaze cues modulate subsequent attentional and cognitive processing of social information and thereby foster communication and successful social interaction.”
In one of the experiments, 16 college students were asked to identify a target letter presented on one of four faces by pushing the corresponding key on a keyboard. The display contained four images of the same woman’s face. Two of the faces gazed directly at the participant, while the gaze of the other two faces was averted. After 1,500 ms, some of the faces and the target letters shifted positions on the screen.
The results showed that locking eyes captures attention and influences the speed of information processing: reaction times were significantly faster when a target was paired with a direct gaze than when it was paired with an averted gaze.
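The core comparison in such a gaze-cueing study is a per-condition average of reaction times. As a minimal sketch, the snippet below groups hypothetical trials by gaze condition and computes the direct-gaze advantage; all numbers are made up for illustration and are not data from the study.

```python
# Illustrative analysis of a gaze-cueing experiment: compare mean reaction
# times (RTs) for targets paired with a direct vs. an averted gaze.
# NOTE: all RT values here are invented, not data from Böckler et al. (2014).
from statistics import mean

# Each trial: (gaze condition of the face carrying the target, RT in ms)
trials = [
    ("direct", 512), ("direct", 498), ("direct", 530), ("direct", 505),
    ("averted", 561), ("averted", 549), ("averted", 572), ("averted", 558),
]

def mean_rt(condition):
    """Average reaction time (ms) over all trials in one gaze condition."""
    return mean(rt for cond, rt in trials if cond == condition)

direct_rt = mean_rt("direct")
averted_rt = mean_rt("averted")
print(f"Direct gaze:  {direct_rt:.1f} ms")
print(f"Averted gaze: {averted_rt:.1f} ms")
print(f"Direct-gaze advantage: {averted_rt - direct_rt:.1f} ms")
```

A faster mean RT in the direct-gaze condition is the pattern the researchers report: targets on faces that "lock eyes" with the viewer are processed more quickly.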
“The influence of direct gaze appears to take effect immediately on presentation and wanes as soon as gaze is averted or direct gaze is established by another face,” the researchers write.
Böckler, A., van der Wel, R. P., & Welsh, T. N. (2014). Catching eyes: Effects of social and nonsocial cues on attention capture. Psychological Science. doi: 10.1177/0956797613516147
Gredebäck, G., & Falck-Ytter, T. (2015). Eye movements during action observation. Perspectives on Psychological Science, 10(5), 591-598. doi: 10.1177/1745691615589103