Having any semblance of a relationship with a robot seems unimaginable to most people. Ironically, that’s what most technology users already have with their devices: day in and day out, we rely on artificial intelligence (AI) on our smartphones. From driving directions to choosing what to eat, machines influence our decisions more than ever; after all, their algorithms have profiled us to a tee. The AI on your phone probably knows more about you than your friends or family do.
Elsewhere, a growing number of people think highly of their AI or robots, treating them as an extension of themselves. Soldiers, for instance, grow attached to the autonomous robots that support their training. Then there are lonely men and women whose companion robots keep them company.
Humans are, by nature, social animals; we thrive in a “pack” and will bond with almost anything, given the right conditions. Against this backdrop, is it possible to replicate human relationships with robots? How close are engineers to creating an android that looks, functions, thinks, and talks like a human being? Can we form a relationship with a machine, much like the bond between good friends, or at least between owner and pet? In short, what is the future of human-robot interaction?
The Age of Social Robots
The Turing test posits that if a computer can pass itself off as a human being in conversation, it can be considered intelligent. As of this writing, a handful of AIs have passed the Turing test to some degree. One of them is Eugene Goostman, a chatbot that made headlines in 2014 when it was credited with passing the test. The program posed as a 13-year-old Ukrainian boy and answered convincingly enough that some judges took it for a teenager.
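To make the idea concrete, here is a minimal, ELIZA-style sketch of how a persona-driven chatbot can keep up the illusion: it matches a few patterns in the user's question and deflects everything else. This is only a toy illustration of the general pattern-matching approach many early chatbots used; it is not how Eugene Goostman is actually implemented, and the persona lines are invented for the example.

```python
import random
import re

# Toy persona chatbot: a handful of pattern-matched answers plus deflections.
# Purely illustrative; not the actual Eugene Goostman implementation.
PERSONA_RULES = [
    (r"\bhow old\b", ["I am thirteen years old."]),
    (r"\bwhere (are you|do you live)\b", ["I live in Odessa, in Ukraine."]),
    (r"\bwhat do you (like|enjoy)\b", ["I like computer games. Doesn't everybody?"]),
    (r"\b(are you|you are|you're) a (robot|computer|machine)\b",
     ["Ha! My friends say worse things about me."]),
]

FALLBACKS = [
    "Why do you ask?",
    "That is a boring question. Ask me something else.",
    "Sorry, I didn't catch that. My English is not so good.",
]

def reply(user_input: str) -> str:
    """Return a persona-consistent reply, deflecting anything unrecognized."""
    text = user_input.lower()
    for pattern, answers in PERSONA_RULES:
        if re.search(pattern, text):
            return random.choice(answers)
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(reply("How old are you?"))
    print(reply("Are you a robot?"))
```

The trick is less about intelligence than about persona design: a teenager with imperfect English gives the program a plausible excuse for odd or evasive answers.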
Currently, several lifelike robots exhibit smooth dexterity and can carry on simple conversations. Notable examples include:
- Geminoid F, a robot that featured in the Japanese film Sayonara
- Geminoid DK, a robotic clone of a Danish professor
- Nanyang Technological University’s Nadine, a social humanoid that can operate independently and recall conversations
- Hanson Robotics’ Sophia, which has appeared in numerous television shows, including a widely shared on-camera conversation with Will Smith
These robots can recognize and replicate human speech and actions through sensors and computer vision.
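Underneath, the pattern is a perceive-then-respond loop: sense the environment, detect a person or an utterance, and trigger a behaviour. The sketch below shows a stripped-down version of that loop using OpenCV's stock face detector and a webcam; real platforms such as Sophia or Nadine use far richer perception and dialogue stacks, so treat this only as an illustration of the basic structure (it assumes a camera at index 0 and OpenCV's bundled Haar cascades).

```python
import cv2

# Minimal perceive-then-respond loop: grab one camera frame, detect a face
# with a classical computer-vision detector, and trigger a canned response.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def greet_if_person_present(camera_index: int = 0) -> None:
    camera = cv2.VideoCapture(camera_index)
    try:
        ok, frame = camera.read()
        if not ok:
            print("No frame from camera.")
            return
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            # A robot would route this to its speech synthesizer and motors.
            print(f"Detected {len(faces)} face(s). Hello there!")
        else:
            print("Nobody in view.")
    finally:
        camera.release()

if __name__ == "__main__":
    greet_if_person_present()
```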
However, such robots are no match for the next example. In terms of realism, the Avatar Shaman developed by Walt Disney Imagineering is perhaps the most advanced, standing out for a graceful, fluid motion not seen in earlier robots. A likely runner-up is Alfred, Garner Holt’s animatronic figure, dubbed the “world’s most expressive robot” for its startling ability to produce a wide array of facial expressions.
Human-Robot Interaction: Challenges and Limitations
At this point, it will probably take decades for androids to develop near-human capabilities. Neural networks and similar AI models let robots appear socially aware, even self-aware, and allow them to move, respond, and assist human beings. However, the behaviours they can manage remain rudimentary at best, and come nowhere near the full demands of the Turing test.
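A small example makes that "rudimentary" point tangible. The sketch below, written with scikit-learn, trains a tiny neural network to map a user's utterance to one of a few intents and then plays back a canned reply; the utterances, intents, and replies are invented for illustration, and production dialogue systems are vastly larger. What it shows is the scripted, shallow character of much of today's "social" behaviour.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy intent classifier: a small neural network over TF-IDF features maps an
# utterance to an intent, and the "robot" answers with a canned reply.
utterances = [
    "hello there", "hi robot", "good morning",
    "what is the weather like", "will it rain today",
    "tell me a joke", "say something funny",
]
intents = ["greet", "greet", "greet", "weather", "weather", "joke", "joke"]

model = make_pipeline(
    TfidfVectorizer(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(utterances, intents)

CANNED_REPLIES = {
    "greet": "Hello! Nice to meet you.",
    "weather": "I'm afraid I can't see outside.",
    "joke": "Why did the robot go back to school? Its skills were getting rusty.",
}

if __name__ == "__main__":
    intent = model.predict(["hi there, robot"])[0]
    print(CANNED_REPLIES[intent])
```

However charming the replies sound, the system has no model of the person it is talking to, which is exactly the gap the next paragraph describes.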
While humans can empathize with machines, most of the emotional, physical, and psychological investment in the relationship will likely come from the person’s end. Unfortunately, that is unlikely to be enough, because humans crave meaningful, reciprocal interaction; as the saying goes, relationships are a two-way street. For now, humans seem more likely to lead or supervise robots than to treat them as equals in a relationship.
