The Role of Emotions in Human-Robot Interaction (HRI)

Why do we personify things?

Have you ever had an emotional connection to an inanimate object? Our minds seek to categorize many aspects of the world, and one thing that is hard to understand is why people tend to attribute life to inanimate objects, such as robots. When I was a kid, I had a Tamagotchi, a handheld digital pet. It was a tiny machine you could carry around in a pocket: it ate, slept, and even pooped, so you had to keep an eye on its state. If you failed to take care of your virtual pet, it could die. And then it would be sent up to heaven, or wherever pixels go to die… 😟 Surprisingly, I was very attached to keeping my virtual pet alive. Looking back, I think that was because I perceived the Tamagotchi's behaviour as an emotional response to my actions. And it is not just kids who are fooled by this phenomenon; as adults, we also anthropomorphize inanimate objects and feel affection for them.

The best example of this is HitchBOT. Do you remember it? HitchBOT was a Canadian robot designed as a travelling companion. It hitchhiked across Canada, Germany, and the Netherlands with human help. According to social media, it even went to a football game and grabbed a cold beer through the kindness of strangers who later became friends. But in 2015, a tweet showed that the robot had been stripped and decapitated in Philadelphia. In response, its nearly 46,000 followers tweeted their reactions:

‘I’m sorry your trip was cut short in Philly’


‘Poor innocent HitchBot, I hope you can heal and hitch on westward’

A lot of people were quite disturbed by this news. Interestingly, HitchBOT somehow managed to create a social relationship with humans. After this incident, can we say that emotions play a role in the human-robot relationship? Well, before diving into this topic, let's first understand what can be considered a robot.

I, Robot

How can we define a robot? One might say that robots are humanoids, that is, physical devices with legs, arms, and so on. But what about vending machines, smartphones, or self-driving cars? Can we categorize them as robots? Well, not really. Although there is no universally accepted definition of the word 'robot', one common definition is "a physical machine that can sense its environment, think or make a decision about what is sensed, and then act on its environment."

For example, a simple coffee machine wouldn't be called a robot, right? You choose a coffee, and the machine dispenses that specific coffee based on your choice. But what if the coffee machine made some decisions on its own? Imagine you enter the room, and the coffee machine reads your mood and then makes a coffee to match it.

Essentially, an automated machine that we can now refer to as a robot does the following:

  • Environment Sensing: It senses its environment. It detects your movements and reads your mood by analyzing your facial expression or heart rate, depending on which approach it uses.
  • Decision Making: It decides which predefined mood best fits your actual mood and which beverage should be served based on that mood.
  • Action Taking: It acts, serving the coffee automatically.
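The sense-decide-act loop above can be sketched in a few lines of code. This is a minimal illustration, not a real product: the mood labels, beverage menu, and sensor thresholds are all invented for the hypothetical mood-reading coffee machine described here.

```python
# Hypothetical mood-reading coffee machine: sense -> decide -> act.

# Decision table mapping an inferred mood to a predefined beverage.
MOOD_TO_BEVERAGE = {
    "tired": "double espresso",
    "stressed": "chamomile latte",
    "happy": "cappuccino",
}

def sense(heart_rate_bpm, smile_detected):
    """Environment sensing: turn raw sensor readings into a mood label.
    The thresholds here are made up for illustration."""
    if heart_rate_bpm > 100:
        return "stressed"
    if smile_detected:
        return "happy"
    return "tired"

def decide(mood):
    """Decision making: pick the beverage that best fits the sensed mood."""
    return MOOD_TO_BEVERAGE.get(mood, "cappuccino")

def act(beverage):
    """Action taking: serve the chosen drink (here, just report it)."""
    return f"Serving a {beverage}."

# One pass through the loop: a calm, smiling customer walks in.
mood = sense(heart_rate_bpm=72, smile_detected=True)
print(act(decide(mood)))  # Serving a cappuccino.
```

The point is not the coffee logic but the structure: sensing, deciding, and acting are separate stages, and it is the decision stage in the middle that separates a robot from a plain vending machine.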

The application of emotions to HRI

We can say that emotions play an important role in enhancing the relationship between humans and robots, and there are many examples supporting this statement. I already mentioned HitchBOT; now, let's take a look at a few more robots that display seemingly emotional behaviour.


Paro (PersonAl RObot) is a robot baby seal that interacts with patients and reduces their stress levels.

It has been used therapeutically in nursing homes and has five different kinds of sensors:

  • Tactile
  • Light
  • Audition
  • Temperature
  • Posture sensors

From earthquake victims in Japan to dementia patients across the world, Paro has been used to calm distressed people. A robot seal is soothing and helps patients feel comfortable in their surroundings. It also inspires conversation amongst nursing home residents when placed in a common area. A study of PARO found that it improved elders' quality of life in nursing homes and supported children's rehabilitation.


Cozmo and Vector, these cute little robotic guys, are inspired by Pixar's WALL-E and EVE and can communicate naturally by responding to our actions.

The robots are controlled through a smartphone application with different modes, such as letting the robot roam freely, teaching it a person's name and face, playing games, or creating simple programs. Cozmo and Vector express a broad range of emotions.

They also explore their surroundings automatically and avoid obstacles when they encounter them. Both express excitement when they recognize our faces or win a game. All of their emotions depend on external inputs, especially how we treat them and how much attention we give them.


AIBO is one of the most popular entertainment robots, reminiscent to some extent of a puppy.

  • It is equipped with touch sensors.
  • It can hear and recognize its name and up to 50 verbal commands.
  • It also has a set of predetermined action patterns such as walking, paw shaking, and ball chasing.

Furthermore, AIBO seems to have a positive effect on its owner's emotional state. Observations show that people categorize AIBO as a non-living object; however, they also consider the robot a companion and view it as a family member.

All these examples illustrate that people may be more willing to integrate new technology into their lives by anthropomorphizing it. Now, to understand the relationship between humans and robots, let's ask why emotions are so important for robots themselves. Emotions might support survival strategies for robots, as in the case of the travelling HitchBOT, which could move only with the help of humans. Another role of emotions might be to create a more efficient and stronger relationship by making the interaction more natural. To make interaction between humans and robots more natural, robots need to mimic the emotions that we experience while interacting with each other. So far, so good… The question that arises is: is it necessary for a robot to actually experience these emotions? Ronald C. Arkin and Lilla Moshkina, the authors of "Affect in HRI", are convinced that it is sufficient "to convey the perception" that the robot experiences emotions.

The emotional interaction between humans and robots

So, we know that in reality robots do not experience these emotions, but we still keep projecting human emotions onto machines.

A simple answer to the question of why we keep anthropomorphizing robots, or referring to them as experiencing the world in a lifelike way, is the popularity of sci-fi books.

Science fiction and pop culture prime us to personify robots.

But wait…

The whole article cannot be concluded with pop science, right?

First, there is no definitive explanation of why people attribute life to robots. We can assume that we interpret behaviours such as the physical movements of a robot as emotional responses. For example, when you leave Cozmo on a table without paying attention to it, he will keep himself busy by moving his cubes around. Or, while playing with Cozmo, an emotional response such as a particular facial expression may prompt you to stop or continue the game. In short, embodied systems such as robots have the advantage of sending paralinguistic communication signals to us, such as gestures, facial expressions, intonation, gaze direction, and body posture.

Also, a robot's background story may engender an emotional relationship with it. This is especially noticeable when the story resembles scenarios that could happen to people, as in the case of the travelling HitchBOT: people who like to travel can easily relate to the adventures that HitchBOT went through.

By the way, have you noticed that none of these robots has a long, generic name? Imagine if your Cozmo, or your personal vacuum cleaner (aka Roomba), were called "TL model 7123498". Would you still anthropomorphize it? Robots that are given personified names, in combination with back-stories, may therefore make people even more emotionally attached.

As robots enter our social space, we will inherently project human emotions onto machines, similar to the way we rationalize a pet's behaviour. This propensity to anthropomorphize is a mechanism worth examining and employing in human-robot interactions, and it is exciting to see how emotions will play a central role in the progression of these relationships.

Thanks for reading! See you in the next posts! 👋

