Research Presentation: Jessie Abell

“What’s the most alive thing a toy can do?”: How Robotic Toys Change Children's Understanding of an Authentic Relationship

Key words: alive, robotic toy, authenticity, relationship, children

The personal computer became a staple in the American home in the 1990s, and since then, new technologies have flooded the market every year and pervaded nearly every aspect of our lives. The inundation of technology now begins at remarkably young ages: children receive robotic toys as soon as they are old enough to interact with them.

Rarely do we consider technology "bad" or a negative influence; rather, we see our computers, smartphones, and robotic toys as conveniences, progressions, innovations. And while technologies may enhance our lives in countless ways, they also change us. Researchers like Sherry Turkle have recently begun to ask how technologies change us; Turkle says that through them, "we create, navigate, and carry out our emotional lives."

While we may not often consider how the technology we use affects us emotionally or psychologically, perhaps we should. Sherry Turkle begins Alone Together: Why We Expect More from Technology and Less from Each Other by discussing her studies of children’s interactions with robotic toys. I, too, believe that examining how children interact with robotic toys can offer profound insight into how technology is quietly changing the way people interact with one another and redefining innumerable aspects of the human experience, especially the authenticity of relationships.

The new Furby by Hasbro has a tagline that reads simply “a mind of its own.” Adults easily understand that the toy doesn’t actually have a mind; as a robot, it has a computer that simulates a mind. Children, however, have a difficult time making that distinction. When children play with robotic toys that blur the lines of aliveness and encourage serious emotional attachment, does it negatively affect them psychologically or emotionally? Based on my research, I will argue that when children play with robotic toys, they learn to relate to those toys in ways that damage their understanding of and ability to have truly authentic human relationships.

From a Toy to a Companion

As with most technologies on the market today, robotic toys become more sophisticated each year. Toy companies have recently struggled to "keep up" with the rest of the technology market and have faced lagging sales—a reflection of "an industry that's being transformed by the surging popularity of mobile devices. As more kids of all ages turn to tablets such as Apple Inc.'s iPad for play, the more Hasbro and competitors such as Mattel Inc. need to invest in innovation" (Townsend).

That robotic toys mimic human interaction ever more convincingly is no accident. In an interview discussing why Hasbro decided to redesign and relaunch the Furby (the company's biggest hit of 1998) in 2012, Hasbro's Chief Executive Officer, Brian Goldner, said that designers, engineers, and marketers simply asked, "What's the most alive thing a toy can do?" (Townsend). Interestingly, Goldner didn't say, "What's the most technologically advanced thing a toy can do?" but used the word alive. His rhetorical choice reflects the fact that toy companies want children to think of robotic toys as alive and, even more, as companions.

Richard Yanofsky, president of WowWee, a toy company based in Hong Kong that sells an astonishing array of sophisticated robotic toys, says that he envisions "'a seamless interaction with machines in a manner that will be very close to human experience'" (Marriott). WowWee certainly markets its robotic toys as alive: a few descriptions on the toys' website read "your fun, friendly robotic buddy," "personality-packed," and "interactive talking companion." WowWee even has a robotic toy called "Femisapien," which "speaks her own language called 'emotish,' which consists of gentle sounds and gesture…she responds to your hand gestures, touch, and sound."

As toy companies market robotic toys as alive, full of personality, companionable, and even gendered, children will surely struggle to differentiate between a relationship with a robotic toy and a relationship with another person. And while robotic toys may come "very close to human experience," do we want to teach our children that such toys can indeed substitute for a human relationship, or even that a relationship with a robot, a technology, a toy should be treated like a relationship with another human being?

How Interacting with Something “Alive Enough” Changes Children

In the following video, Dennis Hong, an Associate Professor of Mechanical Engineering at Virginia Tech and the Director of the VT Robotics & Mechanisms Laboratory (RoMeLa), records his son interacting with “DARwIn-OP,” one of RoMeLa’s robot creations.

The video shows two-year-old Ethan scooting toward the robot, which moves its hands around and says a few words, then sitting in front of it and extending his forefinger in an investigatory manner. The robot says "Oops," which Ethan mimics as he touches it, and then he squeals and laughs with excitement. Ethan's body language and excitement show that he feels almost instantly comfortable with the robot. The video then cuts to Ethan sitting next to DARwIn-OP as it walks up to a small yellow ball and kicks it, which again causes Ethan to shriek with joy, clap his hands, and shout, "Yay!" Ethan picks up the ball and brings it back to the robot, which does not respond, so he says, "No?" and fetches a large blue ball instead (which the robot cannot recognize, so Ethan's dad tells him to bring back the yellow ball). The robot kicks the ball a second time, and Ethan again reacts with enthusiasm, but then DARwIn-OP abruptly falls backwards and stops moving.

When the robot falls down and stops moving, Ethan instantly freezes. His dad says, "Uh oh!" which Ethan mimics, but he looks uneasy and remains completely still (at which point the video ends).

On the surface, Ethan's interaction with DARwIn-OP seems innocent enough—hardly an example of the potentially damaging emotional effects of playing with robotic toys. However, I find Ethan's interactions with DARwIn-OP fascinating for three reasons: 1) how quickly Ethan becomes comfortable with the robot and interacts with it as though it were alive, talking to it and asking it questions as though it had preferences ("No?" when it ignores the yellow ball); 2) the absolute jubilation with which Ethan reacts when the robot "plays with him"; and 3) the fear Ethan shows when the robot falls down and stops moving.

What does it mean when a child like Ethan interacts with a robotic toy as though it were alive? What does it mean when a child like Ethan quickly becomes attached to a robotic toy and feels sadness or fear when the toy "gets hurt" (breaks or stops working)? In Alone Together, Sherry Turkle describes her studies of many children's interactions with robotic toys. She found that most children, like Ethan, quickly became extremely attached to their robotic toys and grew distraught when the toys malfunctioned. Turkle explains that, more than inviting play, robotic toys ask children to care for them, and that "when we are asked to care for an object, when an object thrives under our care, we experience that object as intelligent, but, more importantly, we feel ourselves to be in a relationship with it" (20).

Some might argue that a child developing a relationship with a robotic toy is much like a child developing a relationship with a pet, which teaches responsibility and commitment. In a relationship with a robotic toy, however, children "are learning a way of feeling connected in which they have permission to think only of themselves" (Turkle 60). Children quickly learn that they can "give enough to feel attached, but then they can turn away" because they know that a robotic toy doesn't really feel pain, hunger, or sadness (60). The authors of a study on people's relationships with AIBO, a robotic pet, echo Turkle when they write that "children in particular may fall prey to accepting robotic pets without the moral responsibilities (and moral developmental outcomes) that real, reciprocal companionship and cooperation involves" (Friedman et al. 273).

While children have some understanding that a robotic toy's aliveness differs from a human's, these toys still confuse their understanding of exactly what "alive" means. Their confusion blurs the line between how to act in a relationship with something alive "like a Furby" and how to act in a relationship with another human being (Hafner). Children "deal with the robot as they would deal with a pet or person" and, conversely, might learn to deal with humans as they would their robotic toys: with a certain sense of detachment and an expectation of control (Turkle 38).

Robotic toys unquestionably entertain and even educate children in very different ways than "regular" toys do. When toy companies create and market robotic toys purely as companions (and not as useful tools), we should stop to consider whether we want children to become dependent on robotic toys for companionship. As Turkle writes, "dependence on a robot presents itself as risk free. But when one becomes accustomed to 'companionship' without demands, life with people may seem overwhelming. Dependence on a person is risky—it makes us subject to rejection—but it also opens us to deeply knowing another" (66). Robotic toys seem less innocent when we consider that they could take away a child's capacity to "deeply know" another human being.

Asking the Right Questions

Psychologists have long studied how children develop and come to understand the world around them. After all, as children, we learn how to walk, talk, and interact with the people and world around us. Jean Piaget, a developmental psychologist, conducted a famous study in the 1930s in which "he quizzed children to find out if they could tell the difference between living creatures and inanimate objects [and] concluded that they defined life by figuring out which objects could move by themselves" (Hafner). In the past twenty years or so, however, Piaget's theory "has been almost completely overturned by research showing that young children are not fooled by things like garage doors that move by remote control" (Hafner).

A garage door may not "fool" a child, but today's robotic toys can and do fool children into thinking they are alive. Ultimately, robotic toys psychologically and emotionally damage children in two ways: by providing children with a relationship to an object that they will use as a basis for understanding relationships with people, and by encouraging the idea that a robotic toy's companionship (and perhaps, later, a computer's or a phone's) could substitute for human companionship. Our technologies "present opportunities to reflect on our values and direction" (Turkle 19). So, of robotic toys, we must ask, "Do they provide children with the right kind of values?" and "Can we change the direction in which they are sending us?"

Works Cited

Friedman, Batya, et al. "Hardware Companions? What Online AIBO Discussion Forums Reveal about the Human-Robotic Relationship." CHI Letters 5.1 (2003): 273-280. Print.

Hafner, Katie. "What Do You Mean, 'It's Just Like a Real Dog?'; As Robot Pets and Dolls Multiply, Children React in New Ways to Things That Are 'Almost Alive'." The New York Times, 25 May 2000. Web. 10 Oct. 2012.

Marriott, Michel. "The Shape of Robots to Come." The New York Times, 16 Mar. 2006. Web. 10 Oct. 2012.

RoMeLaVT. "Beta Testing DARwIn-OP." YouTube. VT Robotics & Mechanisms Laboratory, 13 Nov. 2011. Web. 11 Oct. 2012.

Townsend, Matt. "Hasbro Reboots Furby Toy for an IPad-Obsessed Generation." Bloomberg Businessweek. Bloomberg News, 10 Oct. 2012. Web. 16 Oct. 2012.

Turkle, Sherry. Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books, 2011. Print.