Inside, the work of Mark Sagar and the University of Auckland's Laboratory for Animate Technologies is being demonstrated. As you stand in front of a screen, a baby's face reacts to what you are doing and to your expressions.
Leave her on her own, and she will cry; smile enough and she will laugh. She can also control the lights and sounds in the room.
It’s a simple demonstration, but there is a world of work in the background.
Baby X, billed as “a glimpse into the future of human interaction”, is modelled on Sagar’s daughter Francesca.
But to claim the birth of artificial intelligence (AI) is a bit rich, isn't it? Perhaps, Sagar agrees.
The classic test of AI is that of mathematician and code-breaker Alan Turing, who in 1950 suggested a computer could be said to "think" if a human conversing with it could not tell it apart from another human.
Sagar, who as part of the team at director Peter Jackson's Weta Digital was behind the computer-generated faces in films such as King Kong and Avatar, says Baby X is an exploration of emotional interaction through an interactive avatar.
“Turing was more about language,” he told ZDNet. “This is looking at the interaction side, exploring how natural you can make the computer. It’s expressive and emotional.”
In short, an intelligent and emotive computer will be a natural computer.
Baby X is powered by an artificial brain with inputs layered in through an artificial nervous system. Sagar says it is built to be plugged in to other AI systems that may deliver higher level thought.
“It’s modelled on how we tick,” he says. “We try to make what’s driving the model up to date with modern neuroscience.”
Sagar describes Baby X as a kitset allowing biological components to be put together as applications.
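Baby X's code is not public, so the following is a purely illustrative sketch of the architecture Sagar describes: sensory stimuli enter through a nervous-system-style input layer, nudge an internal emotional state, and a higher-level module reads that state to pick an expression. All class and attribute names here (`Brain`, `EmotionalState`, `arousal`, `valence`) are invented for the example and are not part of any real Baby X API.

```python
# Illustrative only -- a toy "kitset" in the spirit of Sagar's description,
# not real Baby X code. Stimuli feed a simple "brain" whose emotional
# state drives the avatar's expression.
from dataclasses import dataclass

@dataclass
class EmotionalState:
    arousal: float = 0.0   # hypothetical: how stimulated the avatar is
    valence: float = 0.0   # hypothetical: negative (cry) .. positive (laugh)

class Brain:
    """Toy stand-in for the 'artificial brain' layer."""

    def __init__(self) -> None:
        self.state = EmotionalState()

    def perceive(self, stimulus: str) -> None:
        # Nervous-system-style input layer: each stimulus nudges the state.
        if stimulus == "smile":
            self.state.valence += 0.5
        elif stimulus == "ignore":
            self.state.valence -= 0.5
        self.state.arousal = min(1.0, self.state.arousal + 0.2)

    def express(self) -> str:
        # A higher-level module reads the state and picks an expression,
        # mirroring how other AI systems could plug in above this layer.
        if self.state.valence > 0.4:
            return "laugh"
        if self.state.valence < -0.4:
            return "cry"
        return "neutral"

brain = Brain()
for stimulus in ["ignore", "ignore"]:
    brain.perceive(stimulus)
print(brain.express())  # leaving her alone drifts the state toward crying
```

The point of the sketch is the layering: the input layer knows nothing about expressions, and the expression logic knows nothing about raw stimuli, so either "biological component" could be swapped out independently.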
Applications are as varied as the imagination, from gaming to teaching and from elder care to machine-human interaction.
By Rob O'Neill for ZDNet.com