Robots and other seemingly inanimate objects may be just that, but we nonetheless wish to ascribe certain human characteristics to them, often in a bid to make the technology seem more lifelike, and therefore more trustworthy.
“Making technology more human is a common approach to make the technology more familiar to us and thus make people more comfortable using it,” Sridhar Iyengar, Head of Europe, Zoho Corporation, told me. “For instance, with chatbots you don’t want answers to be too scripted as you want it to appear natural to help people bond with it.”
One of the more interesting developments in this area is the presentation of robots as either male or female. For instance, research from Washington State University suggests that the perceived gender of a robot affects how, or even whether, we want to engage with it.
The study argues that people may actually be happier conversing with a robot in hospitality settings if the robot appears to be female rather than male. This was especially so when the robot was humanoid in appearance.
“People have a tendency to feel more comfort in being cared for by females because of existing gender stereotyping about service roles,” the authors explain. “That gender stereotype appears to transfer to robot interactions, and it is more amplified when the robots are more human-like.”
Research from Stanford Graduate School of Business suggests that this might extend to the “origins” of a robot. The study found that when we think about the people who create robots (and other technologies), we seem to regard the work performed by the robot as more authentic.
Traditionally we tend to view AI as less authentic than humans, but the researchers wanted to understand whether assigning a human origin story to technology could help to reduce that authenticity gap.
“If you look at what drives purchases of consumers in advanced economies, it’s often not objective characteristics of products or services,” the authors explain. “It’s our interpretation of them, the meaning we derive. It matters a lot if we think something is authentic.”
This can be hugely powerful for companies, as authenticity is believed to be so persuasive that we're willing to pay more for goods and services we regard as authentic.
The researchers tested the authenticity of AI technology in a range of scenarios, from recruitment to therapy. The work in each scenario was performed by a hypothetical AI agent, called Cyrill. In each scenario, Cyrill was given a backstory related to the work “he” did.
Gaining trust between robots and humans has been an ongoing source of research for some time now. For instance, research from the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory suggests that human facial expressions could be crucial in establishing that trust, at least on the battlefield.
“We wanted to characterize and quantify factors that impact the emotional experience that humans have with trust in automated driving,” the researchers explain. “With this information, we want to develop a robust way to predict decision errors in automation use to eventually enable active, online mitigation strategies and effective calibration techniques when humans and agents are teaming in real-time.”
Suffice it to say, however, that giving robots a human origin story is rather more straightforward than giving them human facial characteristics. It also appeared to have a stronger impact on the perceived authenticity of the robot. Indeed, this boost was found even when the origin story was deliberately tailored to be less humanlike.
The issue of developing trust with robots is becoming more pressing as our interactions with them become more frequent. For instance, research from the Nara Institute of Science and Technology explored how robots can build trust both by touching humans and by engaging in a degree of small talk while doing so.
The researchers tested the impact of robotic touch, both on its own and combined with speech, on a pool of Japanese volunteers. In some conditions the volunteers received a gentle stroke on the back from the robot's arm, while in others they also heard remarks such as "Hello, how are you doing?" alongside the stroke.
The volunteers reported better mood in the conditions where the robot both touched them and talked to them, with mood most positively affected when the speech and touch happened simultaneously. The results also showed considerably more facial activity in muscles associated with smiling when the robot touched and spoke to participants. People in this condition were also more inclined to think of their robot companion as human-like.
While we may assume that the way in which we build such trusting relationships will inevitably differ from the approach taken with fellow humans, that may not be the case. Research from the University of Montreal suggests that the way we build trust with robots is very similar to the way we do so with humans.
The researchers conducted a trust game experiment, in which human volunteers were asked to entrust a $10 endowment to a partner, who was either a human, a robot, or a robot acting on behalf of a human. It was in many ways a classic game theory setup: the human volunteer knew that gains were to be made, but that trust would be key. The robots in the experiment were programmed to mimic reciprocation behaviors from previous human players.
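The mechanics of a classic trust game of this kind can be sketched in a few lines. Note that this is a generic illustration of the standard game-theory setup, not the Montreal study's actual protocol: the tripling multiplier and the reciprocation fractions here are conventions from the wider literature, assumed for illustration.

```python
# Minimal sketch of one round of a classic trust game.
# Assumptions (not from the study): the transfer is tripled by the
# experimenter, and the trustee returns a chosen fraction of it.

def trust_game_round(endowment, sent_fraction, return_fraction):
    """One investor/trustee round of the trust game."""
    sent = endowment * sent_fraction        # investor entrusts part of the $10
    received = sent * 3                     # experimenter triples the transfer
    returned = received * return_fraction   # trustee reciprocates a share
    investor_payoff = endowment - sent + returned
    trustee_payoff = received - returned
    return investor_payoff, trustee_payoff

# Full trust with an even split leaves both parties better off
# than if nothing had been sent at all.
investor, trustee = trust_game_round(10, 1.0, 0.5)
print(investor, trustee)  # 15.0 15.0
```

The structure makes clear why trust matters: the investor can only gain if they risk sending money, and the trustee's programmed reciprocation (here, mimicking past human players) determines whether that risk pays off.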
It’s common in these kinds of games for decisions to quickly converge around outcomes that are mutually beneficial to both parties. In this experiment, a key factor was the emotional reaction of people following their interactions with robots versus humans.
The results suggest that people develop trust in robots much as they do in humans. Traditionally, people trust other humans both for monetary gain and to learn something about the other party, and a similar pattern emerged in interactions with the robots.
This is positive, especially as interactions between humans and machines are becoming more frequent and taking place in more sensitive domains. Nonetheless, if we want to encourage trusting relationships to form, giving technology both a face and a backstory might not do any harm.