A Japanese roboticist is building androids to understand humans—starting with himself

Photo: Makoto Ishida

Hiroshi Ishiguro, a roboticist at Osaka University, in Japan, has, as you might expect, built many robots. But his latest aren’t run-of-the-mill automatons. Ishiguro’s recent creations look like normal people. One is an android version of a middle-aged family man—himself.

Ishiguro constructed his mechanical doppelgänger using silicone rubber, pneumatic actuators, powerful electronics, and hair from his own scalp. The robot, like the original, has a thin frame, a large head, furrowed brows, and piercing eyes that, as one observer put it, “seem on the verge of emitting laser beams.” The android is fixed in a sitting posture, so it can’t walk out of the lab and go fetch groceries. But it does a fine job of what it’s intended to do: mimic a person.

Ishiguro controls this robot remotely, through his computer, using a microphone to capture his voice and a camera to track his face and head movements. When Ishiguro speaks, the android reproduces his intonations; when Ishiguro tilts his head, the android follows suit. The mechanical Ishiguro also blinks, twitches, and appears to be breathing, although some human behaviors are deliberately suppressed. In particular, when Ishiguro lights up a cigarette, the android abstains.

It’s the perfect tool for Ishiguro’s field of research: human-robot interaction, which is as much a study of people as it is of robots. “My research question is to know what is a human,” he tells me between spoonfuls of black sesame ice cream at an Osaka diner. “I use very humanlike robots as test beds for my hypotheses”—hypotheses about human nature, intelligence, and behavior.

Robots, Ishiguro and others say, are poised to move from factories into daily life. The hope is that robots will one day help people with a multitude of tasks—they’ll do household chores, care for the elderly, assist with physical therapy, monitor the sick at hospitals, teach classes, serve cappuccinos at Starbucks, you name it. But to be accepted in these roles, robots may have to behave less like machines and more like us.

Researchers have, of course, long been interested in making robots look and act more like human beings. Among the most notable efforts in this regard are Waseda University’s Wabot, MIT’s Cog, NASA’s Robonaut, Sarcos’s Sarcoman, the Toyota partner robots, Japan’s METI HRP series, Sony’s Qrio, and perhaps most famous of all, Honda’s Asimo.

These robots are all mechanical looking, Ishiguro says, but our brains are wired to relate to other humans—we’re optimized for human-human, not human-Asimo, interaction. That’s why he builds robots that look like people, as part of his work at the Advanced Telecommunications Research Institute International, known as ATR, where he’s a visiting group leader. To describe an android copy of a particular individual, he coined the term “geminoid,” after geminus, which is Latin for twin. He named his mechanical brother Geminoid HI-1.

By building humanlike robots Ishiguro hopes to decipher what the Japanese call sonzaikan—the feeling of being in the presence of a human being. Where does the sense of humanness come from? And can you convey those qualities with a robot?

The idea of connecting a person’s brain so intimately with a remotely controlled body seems straight out of science fiction. In The Matrix, humans control virtual selves. In Avatar, the controlled bodies are alien-human hybrids. In the recent Bruce Willis movie Surrogates, people control robot proxies sent into the world in their places. Attentive viewers will notice that Ishiguro and the Geminoid have cameo roles, appearing in a TV news report on the rapid progress of “robotic surrogacy.”

Ishiguro’s surrogate doesn’t have sensing and actuation capabilities as sophisticated as those in the movie. But even this relatively simple android is giving Ishiguro great insight into how our brains work when we come face to face with a machine that looks like a person. He’s also investigating, with assistance from cognitive scientists, how the operator’s brain behaves. Teleoperating the android can be so immersive that strange things happen. Simply touching the android is enough to trigger a physical sensation in him, Ishiguro says, almost as though he were inhabiting the robot’s body.

Read the full article on IEEE Spectrum.

 

Text by Erico Guizzo for https://spectrum.ieee.org/
