The New York Times:
It looked like a child’s playroom: toys in cubbies, a little desk for doing homework, a whimsical painting of a tree on the wall. A woman and a girl entered and sat down in plump papasan chairs, facing a low table that was partly covered by a pink tarp. The wall opposite them was mirrored from floor to ceiling, and behind it, unseen in a darkened room, a half-dozen employees of the toy company Mattel sat watching through one-way glass. The girl, who looked about 7, wore a turquoise sweatshirt and had her dark hair pulled back in a ponytail. The woman, a Mattel child-testing specialist named Lindsey Lawson, had sleek dark hair and the singsong voice of a kindergarten teacher. Microphones hidden in the room transmitted what Lawson said next. “You are going to have a chance to play with a brand-new toy,” she told the girl, who leaned forward with her hands on her knees. Removing the pink tarp, Lawson revealed Hello Barbie.
“Yay, you’re here!” Barbie said eagerly. “This is so exciting. What’s your name?”
For psychologists who study the imaginative play of children, the primary concern with A.I. toys is not that they encourage kids to fantasize too wildly. Instead, researchers worry that a conversational doll might prevent children, who have long personified toys without technology, from imagining wildly enough. “Imaginary companions aren’t constrained,” says Tracy Gleason, a professor of psychology at Wellesley College who studies children’s imaginative play. “They often do all kinds of things like switching age, gender, priorities and interests.” With a toy like Hello Barbie, the personality is limited by programming — and public-relations concerns. Mattel, rather than kids, ultimately controls what she can say. “She is who she is,” Gleason says. “That might be a lot of fun, but it is definitely less imaginative, child-generated and truly interactive than someone with whom you can imagine whatever you want.”
Read the whole story: The New York Times