Influence of Emotional Robot Design on Humans

30 April 2012 | Author: Filip Tóth | Electrical Engineering, Student Papers | Volume 5, Issue 4

This paper describes the abilities of social robots and human-robot social interaction, with particular attention to how the design of emotional robots influences humans. Robots will increasingly be integrated into our everyday lives, yet we know little about how they affect people. The following sections discuss the issues involved in creating meaningful social interactions between robots and humans through the qualities of a robot's physical design and behaviour.

The paper is divided into seven interconnected parts covering current and future robotics, emotional robotics, robotic design, expressive characteristics, interaction between human and robot, the qualities of a social robot, and finally autistic children and robotics.

1. Introduction

Social robots will play an important part in our future social world. Socially intelligent robots provide a natural human-machine interface (HMI) and new mechanisms for more complex robotic behaviour in human-robot interaction. However, a robot's social skills often require complex perceptual systems and social and cognitive abilities. The current challenge is that we want robots that people with no programming knowledge can communicate with. Researchers hypothesize that face-to-face interaction is the best model for social robots: the robot should present information and give feedback to its user in this way.

The most promising research approach in this area is to imitate human interaction, because people are incredibly skilled at interpreting the behaviour of other humans. Robots need to take advantage of as many of these human interaction modalities as possible in order to make robotic communication richer and more effective, and researchers are studying which modalities are most significant and useful for human-robot interaction. Most daily human behaviour is highly predictable because it conforms to social norms that people follow. To be useful in society, robots will therefore need to follow those norms as well and behave in ways that people consider socially correct.

A social robot usually has a humanoid character and uses multimodal communication: it mimics body language and the nonverbal cues that people use in face-to-face conversation, and it expresses emotions. This area of social robotics is less developed than similar work with software virtual agents, but it is becoming more and more common.

2. Current and future robotics

Today, robots are used mostly in industry, where they perform complex repetitive tasks; this is currently the main way robots help people. Professional service robots work in domains inaccessible to people, for example navigating abandoned mines or cleaning up radiation-contaminated areas. Personal service robots, however, currently have the highest expected growth rate. Robots already build our cars, vacuum our floors and even mow our lawns. Mobile robots in particular are becoming more involved in our daily affairs; most of them are robotic vacuum cleaners or lawn mowers, but these days we can also see robotic toys with a higher level of intelligence.

These toys are able, to a limited extent, to interact emotionally with children, their users. Similar robotic toys have started to be used as an alternative to animals in supplementary therapy for child patients: they try to interact socially with the patient and create an overall positive emotional effect. Researchers in the area of emotional robotics, such as David Hanson (Hanson Robotics), are able to create near-perfect replicas of human faces, with which it is very hard to distinguish whether one is looking at a human or a robot (Fig. 1).


Fig. 1. Jules, developed by Hanson Robotics, is the first humanoid robot that can realistically mimic a real person's expressions

Eventually it will be possible to create a robot similar to a human and emulate human emotional behaviour, but present-day technology is not there yet. We will have to wait until scientists and engineers create new computational technologies capable of simulating the processes of the brain in real time, and of reasonable size, so that a humanoid robot could make use of them.

The future of social robots raises more questions than answers. Will they become our companions one day? Engineers and scientists are designing robots with enough social skills to interpret and understand human feelings, learn from human teachers, play chess, make conversation and even tell jokes. Is a future full of emotional robotic companions a delightful dream or a dreadful nightmare? It is generally expected that future humanoid robots will have an external appearance very similar to that of a human being and will be able to express their feelings and emotions convincingly.

The goal of this is to have a positive influence on humans. “Research suggests that robot behaviour can influence emotional state of human. Emotions play a vital role in human-human interactions, and are likely to be a significant influence in human-robot interaction. It will be important for robots to respond to human emotions, and for robots to display appropriate emotion to humans. Emotions are important to many cognitive tasks and hence robots with some level of emotional intelligence will be essential for effective influence to human.”

3. Emotional robotics

First we must actually define what emotional robotics is and what an emotional robot is. “An emotional robot would be a cognitively and physiologically biomimetic machine. The body of the robot, including all sensors and actuators, must be included in the design of the emotion system.” This presents the designers of emotional robots with yet another problem: the term “emotion” itself. “The term “emotion” is so ambiguous that to use it technically requires specific definition. Emotions are presented in all cultures in a variety of vague, inconsistent, and fantastical ways. The fact that the word is often associated or interchanged with the word “feelings” might indicate that emotions are very directly related to an organism’s internal state (what it feels inside).”

Researchers consider emotions a sub-system of the nervous system. They are often inspired by the functionality emotions provide to organisms, which has evolved through evolution; the beneficial aspects can then be mimicked in emotional robotics [13]. Samuel H. Kenyon (2003), in “The Need for Emotional Architectures in Practical Robots”, wrote about emotional robots: “Just as designers of biomimetic robots use natural body parts as inspiration for arms, legs, hands, sight, hearing, etc., designers of emotional robots will use natural internal survival systems as inspiration for artificial systems of similar intention.”

4. Robotic design

Another problem arises from actually designing the robot. If the robot is to positively influence the user's emotional state, its outward appearance must be aesthetically pleasing to the user. An important work in this area is “The Buddha in the Robot” by Masahiro Mori. Mori (1970) presented a very important graph, known as the uncanny valley (Fig. 2): “The graph Uncanny valley means the relationship between how humanlike a robot appears and a subject’s perception of familiarity. A robot's familiarity increases with its similarity until a certain point is reached at which slight “nonhuman” imperfections cause the robot to appear repulsive.”


Fig. 2. Graph of the nonlinear relation, intensified by movement, between a character’s degree of human likeness and the human perceiver’s emotional response [9]

The uncanny valley represents the relationship between how humanlike a robot looks and a subject’s perception of familiarity towards it. Put another way, the term “uncanny valley” refers to the drop in likeability of an artificial system, such as a robot, when it becomes too humanlike [14]. Familiarity with a robot increases up to a certain point, at which slight “nonhuman” imperfections cause the robot to look repulsive. The practical outcome is that when designing a robot, two options are available. The first, simpler option is to create a robot that is essentially a cute toy; designers going down this path can follow the guidelines laid out in [8, 14]. The other, much more complex option is to design a robot indistinguishable from a human being.
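To make the shape of this relation concrete, the following minimal Python sketch models a hypothetical affinity curve: affinity rises with human likeness, dips sharply just before full human likeness, and the dip deepens with movement. The curve shape and every parameter are illustrative assumptions only, not values taken from Mori or from [8, 9, 14].

    import math

    def affinity(human_likeness: float, moving: bool = False) -> float:
        """Return a hypothetical affinity score for a robot.

        human_likeness: 0.0 (clearly mechanical) to 1.0 (indistinguishable from a human).
        moving: movement intensifies the effect, per Mori's observation.
        """
        base = human_likeness                   # affinity grows with likeness overall
        # A Gaussian "valley" centred just below full human likeness models the
        # repulsion caused by slight non-human imperfections.
        centre, width, depth = 0.85, 0.05, 1.2
        dip = depth * math.exp(-((human_likeness - centre) ** 2) / (2 * width ** 2))
        scale = 1.5 if moving else 1.0          # movement deepens the valley
        return base - scale * dip

    if __name__ == "__main__":
        for likeness in (0.2, 0.5, 0.8, 0.85, 0.95, 1.0):
            print(f"{likeness:.2f}  still={affinity(likeness):+.2f}  moving={affinity(likeness, True):+.2f}")

Printing the values shows the qualitative shape of Fig. 2: a rising curve, a deep dip around high (but imperfect) human likeness, and recovery at full human likeness.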

Research on the uncanny valley has been reviewed many times. Current studies often do not speak about the uncanny valley directly; instead, researchers measure the likeability of a robot and its design. Chin-Chang Ho and Karl F. MacDorman (2010) wrote in their article “Revisiting the uncanny valley theory: Developing and validating an alternative to the Godspeed indices”: ”Likeability is virtually synonymous with interpersonal warmth, which is also strongly correlated with other important measures, such as comfortability, communality, sociability, and positive (vs. negative) affect.” In general, then, robotic design can be discussed in terms of two important characteristics: anthropomorphism and likeability [9].

An international team of scientists led by the University of California, San Diego carried out an in-depth study of humans' emotional reactions to robots that resemble humans [10]. Human likeness is not seen as a hindrance to social robot development, but as a useful mechanism in social robot research. Experiments showed that as the human similarity of robots increases, the curve describing people's affinity towards them also rises. As human-like robots are made even more human, however, this increase stops and affinity drops rapidly, until a point is reached beyond which robots are completely repulsive to human beings.


Fig. 3. Repliee Q2 is an upgraded version of Repliee Q1. The face of Repliee Q2 has become more humanlike. Furthermore, it has 13 DoFs in the head so that it can make some facial expressions and mouth shapes.

The subjects were shown twelve videos of the same actions performed by the human on whom the android was modelled, and also by a stripped version of the android, with the skin removed down to its underlying metal joints and electric wiring, revealing its mechanics so that it could no longer be mistaken for a human. The scientists thus prepared three conditions: first, a human with biological appearance and movement; second, a robot with mechanical appearance and mechanical motion; and third, a human-like android with the same mechanical movement as the robot. The “robot” condition was in fact the same android with no human-like skin attached to its skeleton. When the researchers examined the functional magnetic resonance imaging (fMRI) results (Fig. 4), they saw, in essence, evidence of a mismatch.


Fig. 4. Functional magnetic resonance imaging of human brain responses to the robot, the android and the human

Some parts of the brain that are normally active did not respond in the same way when the human-like appearance of the android was combined with robotic motion. In other words, if we look at a robot that looks human and moves like a human, our brain shows normal activity; if we look at a robot that looks like a robot and acts like a robot, our brain also shows normal activity. In both cases our brains have no difficulty processing the information. The problem arises when we look at a human-like robot whose motion is at odds with its appearance, the researchers said. The conclusion of the experiment was that the tested subjects showed different brain activity when the robot with a human-like appearance but mechanical movement was shown [10].
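This mismatch idea can be illustrated with a toy example in the spirit of predictive coding: the brain predicts the kind of motion it expects from an agent's appearance, and a large prediction error arises when a human-like appearance is paired with mechanical motion. The category labels and the error measure below are illustrative assumptions, not taken from the study in [10].

    # Toy prediction-error check for appearance/motion pairs.
    EXPECTED_MOTION = {"human": "biological", "mechanical": "mechanical"}

    def prediction_error(appearance: str, observed_motion: str) -> int:
        """Return 0 if the observed motion matches the appearance-based prediction, else 1."""
        return 0 if EXPECTED_MOTION.get(appearance) == observed_motion else 1

    if __name__ == "__main__":
        for appearance, motion in [("human", "biological"),       # the human condition
                                   ("mechanical", "mechanical"),  # the robot condition
                                   ("human", "mechanical")]:      # the android condition
            print(appearance, motion, "-> error:", prediction_error(appearance, motion))

Only the android condition, human appearance with mechanical motion, produces a non-zero error in this sketch, mirroring the mismatch described above.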

5. Expressive characteristics

The expressive characteristics of emotional robots, such as voice, facial expression, gesture and posture, play an important role in communicating an emotional state to humans (Fig. 5). This benefits people in two ways: by communicating feelings to them and by influencing their behaviour. As emotional robots are incorporated into service or entertainment robots, there is growing interest in understanding how humans react to and interact with them. Humans' feelings about robots are expected to be a significant factor in the success of emotional robots, and people can experience a wide range of emotions when interacting with them. Broadly speaking, research in emotional robotics can be divided into two distinct approaches [6].


Fig. 5. The robot Nexi, designed by the MIT Media Lab’s Personal Robotics Group, is capable of expressing facial emotions in much the same way as humans

Some researchers concentrate on the practical task of giving robots the ability to interact with humans in emotional ways. Others set themselves the more ambitious task of endowing robots with an artificial emotional system analogous to the one humans have.
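As an illustration of the expressive channels discussed in this section, the following minimal sketch maps a hypothetical internal emotion onto voice, facial expression, gesture and posture parameters. All labels and values are assumptions for illustration and are not taken from Nexi or any other robot cited here.

    from dataclasses import dataclass

    @dataclass
    class Expression:
        voice_pitch: float      # relative pitch shift, 1.0 = neutral
        voice_rate: float       # relative speech rate, 1.0 = neutral
        facial_expression: str  # label sent to the face actuators
        gesture: str            # label for an arm/hand gesture
        posture: str            # label for a whole-body posture

    # Hypothetical lookup table: internal emotion -> multimodal expression.
    EXPRESSION_MAP = {
        "joy":      Expression(1.3, 1.1, "smile",        "open_arms", "upright"),
        "sadness":  Expression(0.8, 0.8, "frown",        "arms_down", "slumped"),
        "surprise": Expression(1.4, 1.2, "raised_brows", "hands_up",  "lean_back"),
        "neutral":  Expression(1.0, 1.0, "relaxed",      "none",      "upright"),
    }

    def express(emotion: str) -> Expression:
        """Return the expression parameters for an emotion, falling back to neutral."""
        return EXPRESSION_MAP.get(emotion, EXPRESSION_MAP["neutral"])

    if __name__ == "__main__":
        print(express("joy"))

A real robot would drive its actuators and speech synthesizer from such parameters; here the table only makes the idea of multimodal emotional expression explicit.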

6. Interaction between human and robot

If social robots are to influence humans appropriately, they need to perceive the emotional states of the humans interacting with them. Here another question emerges: how can robots understand the user's mood? Intelligent social robots often use high-level software that acts as their perceptual system. Robots evaluate external events: visual, auditory and haptic stimuli are sensed by the robot's inputs and filtered by a number of feature extractors (e.g., colour, motion, pitch, etc.) [2]. In the high-level perceptual system, features are bound by releaser processes that encode the robot’s current set of beliefs about its own state and its relation to the world [2].
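A minimal sketch of such a perceptual pipeline is shown below, loosely inspired by the architecture described in [2]: low-level feature extractors (colour, motion, pitch) feed a releaser process that encodes a simple belief about the world. All function names, features and thresholds are assumptions added for illustration.

    from typing import Dict, List

    def colour_saliency(frame: List[int]) -> float:
        """Toy colour feature: mean brightness of the frame, scaled to 0..1."""
        return sum(frame) / (255 * max(len(frame), 1))

    def motion_energy(prev: List[int], curr: List[int]) -> float:
        """Toy motion feature: mean absolute pixel difference between frames, scaled to 0..1."""
        return sum(abs(a - b) for a, b in zip(prev, curr)) / (255 * max(len(curr), 1))

    def pitch_level(audio: List[float]) -> float:
        """Toy auditory feature: peak amplitude of the audio buffer."""
        return max(audio, default=0.0)

    def person_present_releaser(features: Dict[str, float]) -> bool:
        """A releaser binds features into a belief, e.g. 'a person is nearby and speaking'."""
        return features["motion"] > 0.1 and features["pitch"] > 0.3

    if __name__ == "__main__":
        prev_frame, curr_frame = [10, 20, 30], [60, 90, 120]
        audio = [0.1, 0.5, 0.4]
        features = {
            "colour": colour_saliency(curr_frame),
            "motion": motion_energy(prev_frame, curr_frame),
            "pitch": pitch_level(audio),
        }
        beliefs = {"person_present": person_present_releaser(features)}
        print(features, beliefs)

In a full architecture such beliefs would then drive the robot's emotional and behavioural systems; here the example only shows the extract-then-bind structure of the perceptual stage.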

Social-emotional robots should be able to affect the emotional state of people, which requires that the robot's communication be done in an emotional way. How can robots express their emotions? Since humanoid robots share the human morphology, they can communicate using natural modalities such as facial expression, body posture, gestures, gaze direction and voice [2]. Researchers are still investigating whether humans are willing to report their emotions to social robots when asked, and how people react to different social robots and behaviours.

Researchers are interested in these situations and analyse, for example, body movements, facial expressions, physiological responses and voice patterns. Some studies have established that humans experience many different emotional reactions to social robots and that these reactions matter for research, because they reflect how people rate the quality of human-robot interactions. One conclusion is that robots should not only make people feel good in their company, but also make people feel good about themselves [15].

7. Qualities of a social robot

Christoph Bartneck and Jodi Forlizzi (2004), in “A Design-Centred Framework for Social Human-Robot Interaction”, give the following definition of a social robot: “A social robot is an autonomous or semi-autonomous robot that interacts and communicates with humans by following the behavioral norms expected by the people with whom the robot is intended to interact.” Their definition implies that social robots have a physical embodiment.

They consider autonomy an important requirement for a social robot; a semi-autonomous robot also counts as social if it communicates according to accepted social norms. In their definition, a social robot should be able to mimic human activity, human society and culture. Interaction between human and robot can be started by an encounter, such as eye contact and a statement made in a particular tone of voice. When the robot responds, the interaction moves into the next phase, characterized by mutual gaze and by the robot speaking to the human and using facial gestures. Social robotic behaviour can thus be derived from the social abilities found in human social interaction.
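The interaction phases just described can be sketched as a tiny state machine: an encounter (eye contact or an utterance) opens the interaction, and the robot's response moves it into a mutual-engagement phase. The state names and events below are illustrative assumptions, not part of Bartneck and Forlizzi's framework.

    from enum import Enum, auto

    class Phase(Enum):
        IDLE = auto()        # no interaction yet
        ENCOUNTER = auto()   # human made eye contact / spoke to the robot
        ENGAGED = auto()     # robot responded; mutual gaze, speech and gestures

    def step(phase: Phase, event: str) -> Phase:
        """Advance the interaction phase on a single event."""
        if phase is Phase.IDLE and event in ("eye_contact", "utterance"):
            return Phase.ENCOUNTER
        if phase is Phase.ENCOUNTER and event == "robot_responds":
            return Phase.ENGAGED
        if event == "human_leaves":
            return Phase.IDLE
        return phase

    if __name__ == "__main__":
        phase = Phase.IDLE
        for event in ("eye_contact", "robot_responds", "human_leaves"):
            phase = step(phase, event)
            print(event, "->", phase.name)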

These properties consist of form, modality, social norms, autonomy, and interactivity. Ideally, a social robot uses all of the modalities that people naturally use to communicate with one another. For example, a social robot should be able to use artificial speech correctly, with appropriate tone of voice and intonation, together with verbal and non-verbal cues such as posture, gesture and stance. On the other hand, social robots should also have some kind of awareness of human social norms and rules, and should monitor them at all times.

The main qualities of social robots are therefore form, modality, social norms, autonomy and interactivity; the following paragraphs are devoted to each of them. The first quality is form, which suggests the kind of social behaviour the robot exhibits: a robot can mimic a lifelike object or mimic a human. The second quality is modality, meaning the number of communication channels the robot uses; some robots, for example, use visual, auditory and haptic channels. The third quality is social norms: the norms between human and robot can be interpreted in the same way as the interactions between people.

In other words, for good interaction with a human the robot must exhibit apparent reciprocal social norms between itself and the human (or perhaps, in the future, between itself and another robot). The next quality is autonomy. The key issue in robotic autonomy is the technological capability to act without further inputs or commands setting the robot's properties; the best result in this area is a fully autonomous social robotic system. The last important quality is interactivity, which means that a social robot should be able to respond to interaction with people or with other social robots.
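These five qualities can be summarized, purely as an illustrative assumption about how a designer might record them, in a simple configuration structure; the field names and example values below are hypothetical.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class SocialRobotProfile:
        form: str                   # e.g. "lifelike object" or "humanoid"
        modalities: List[str]       # communication channels the robot uses
        follows_social_norms: bool  # obeys and monitors human social rules
        autonomy: str               # "full", "semi", or "teleoperated"
        interactive: bool           # responds to humans / other social robots

    # Hypothetical example profile for a humanoid social robot.
    example = SocialRobotProfile(
        form="humanoid",
        modalities=["visual", "auditory", "haptic"],
        follows_social_norms=True,
        autonomy="semi",
        interactive=True,
    )

    if __name__ == "__main__":
        print(example)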

8. Autistic children and robotics

Researchers at the Yale Child Study Center have begun to study interaction with a simple emotional robot that generates a small set of facial expressions [5] (Fig. 6). Some of their experiments examined how to measure autistic children's social skills with an emotional robot: “Autistic children were universally engaged with the robot, and often spent the majority of the session touching the robot, vocalizing at the robot, and smiling at the robot.”


Fig. 6. ESRA, an extremely simple commercial robot that generates a small set of facial expressions

Autism was first identified in 1943 by Kanner [5]. Childhood autism is an interesting, even fascinating disorder. It is a disorder of neural development characterized by impaired social interaction and communication, by restricted and repetitive behaviour, and by learning difficulties. Autism belongs to the category of mental disorders called pervasive developmental disorders. Behavioural abnormalities are noticeable in almost every aspect of the child's life, and their severity can vary widely. Parents usually notice the first signs within the first five years of their child's life. The disorder is three to four times more common in males than in females.

There are more signs than those mentioned above: for example, children show unrealistic evaluation of social-emotional situations, insufficient response to the emotions of others, lack of creativity and imagination, and a specific interest in non-functional aspects of objects, such as their odour or surface. The disorder is also associated with intellectual disability, which is present in about three quarters of those affected.

“Current research suggests that 1 in every 300 children will be diagnosed with the broadly-defined autism spectrum disorder (ASD), but studies have found prevalence rates that vary between 1 in every 500 to 1 in every 166. For comparison, 1 in every 800 children is born with Down syndrome, 1 in every 450 will have juvenile diabetes, and 1 in every 333 will develop cancer by the age of 20” [5, 12]. Regarding the causes of childhood autism, it has long been presumed that genetic factors play a common role: there is a high chance, almost 95.7 %, that identical twins, who develop from a single zygote, will share the disorder [16].

Other proposed causes of childhood autism are brain damage, a lack of connections among different brain centres, the so-called extreme male brain theory (a high level of testosterone in the mother's body during pregnancy has been linked to the disorder) and the monotropism hypothesis (a strong focus on only one subject, with no multi-tasking) [16]. Autism attracts many professionals from different fields of science. For a long time it was believed that autism could have a psychogenic background, and therapy was also based on these principles [17].

This hypothesis still has many supporters around the world. Other professionals (psychiatrists, psychologists, geneticists, paediatricians) have tried to explain autism from the perspective of their own observations and knowledge, and each has contributed to the overall picture of autism. The cause of the disorder remains unclear; autism is a syndrome defined by its observed symptoms [5]. The world of feelings is largely uncharted territory for autistic people: it is believed that they cannot understand, let alone express, some of these feelings, and that they cannot imagine what the thoughts of others are [18].


Fig. 7. KASPAR, a child-sized humanoid robot developed by the Adaptive Systems Research Group at the University of Hertfordshire

Michael A. Goodrich and his team (2011), in “A Case for Low-Dose Robotics in Autism Therapy”, used social interaction between child and robot (Fig. 7). The approach is summarized in their own words [19]: “Robots appear to be engaging to many children with autism, and evidence suggests that engagement can facilitate social interaction not only between child and robot but also between child and another human. We report on a therapy model that uses a robot in no more than 20% of available therapy time, and describe how a humanoid robot can be used during that limited time to promote generalizable child-human interactions. Preliminary evidence indicates that such low-dose robotics can promote positive child-human interactions.”

9. Conclusion

This paper has described the development of social robots, focusing on emotional expression, the uncanny valley, human-robot interaction, and how social robots can help us diagnose, treat and understand autism. The future development of social robots holds great potential for social interaction with humans. A social robot architecture offers a behavioural architecture that integrates separable emotional layers into a coherent whole. Emotional (social) robotics still faces many open questions about the role emotion-like processes might play in social interactions with humans and in the social development of robots that co-exist with people in the human environment.

Part 7 gave a definition of social robots and described the qualities of a social robot that classify its important properties: form, modality, social norms, autonomy and interactivity. The end of the paper described robots as therapeutic and diagnostic devices for autistic children; this work on therapeutic and diagnostic applications is very important, as it has the potential to enhance our understanding of autistic disorders. People often respond to artificial life forms such as sophisticated social robots with natural sympathy.

Their artificial brains are not troubled by doubts about whether these emotions are real or not. The gap between science fact and science fiction will slowly close. Today's computer science and social robotics research still has a long way to go before robots acquire the full human range of emotions, but we can say that researchers have already made some progress. To make further progress, robotics engineers and computer scientists will have to join forces with other research areas such as psychology and neuroscience.

References

  1. Bartneck, C., Reichenbach, J., & Breemen, A. v. (2004). In your face, robot! The influence of a character’s embodiment on how users perceive its emotional expressions. Proceedings of the Design and Emotion 2004 Conference, Ankara
  2. C. Breazeal (2003), “Emotion and sociable humanoid robots,” E. Hudlika (ed), International Journal of Human Computer Interaction, 59, pp. 119-155.
  3. Geoffrey A. Hollinger, Yavor Georgiev, Anthony Manfredi, Bruce A. Maxwell, Zachary A. Pezzementi, Benjamin Mitchell. Design of a Social Mobile Robot Using Emotion-Based Decision Mechanisms. In Proceedings of IROS 2006, pp. 3093-3098
  4. Hsuan-Kuan Huang, Hung-Hsiu Yu, Yao-Jiunn Chen, Yaw-Nan Lee. (2008): Development of an Interactive Robot with Emotional Expressions and Face Detection. ROMAN 2008. The 17th IEEE International Symposium on Robot and Human Interactive Communication (2008)
  5. B. Scassellati. How social robots will help us to diagnose, treat, and understand autism. 12th International Symposium of Robotics Research (ISRR). San Francisco, CA. Oct. 2005.
  6. Evans D.: Can robots have emotions? Psychology Review Vol. 11, No.1 (September 2004) pp.2-5.
  7. Adolphs, R. Could a Robot Have Emotions? Theoretical Perspectives from Social Cognitive Neuroscience. In Who Needs Emotions? The Brain Meets the Robot; Fellous, J.-M., Arbib, M.A., Eds.; Oxford: New York, 2005; pp 9-25
  8. DiSalvo, C., Gemperle, F., Forlizzi, J., and Kiesler, S. (2002). All Robots are Not Created Equal: The Design and Perception of Humanoid Robot Heads. Designing Interactive Systems 2002 Conference Proceedings, London, England, June, 2002, 321-326.
  9. Ho, C.-C., MacDorman, K. F. (2010). Revisiting the uncanny valley theory: Developing and validating an alternative to the Godspeed indices. Computers in Human Behavior, 26(6), 1508–1518.
  10. Saygin, A. P., Chaminade, T., Ishiguro, H., Driver, J., & Frith, C., The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions, Social Cognitive and Affective Neuroscience, 2011;
  11. Bartneck, C., Forlizzi, J. (2004). A Design-Centred Framework for Social Human-Robot Interaction. Proceedings of Ro-Man2004, Kurashiki. pp. 591-594
  12. Centers for Disease Control and Prevention, (online): http://www.cdc.gov/ncbddd/dd/ddautism.htm last visited January 20, 2012.
  13. Kenyon, S. H. (2003). The Need for Emotional Architectures in Practical Robots. Intelligence, 1-13.
  14. Seyama, J., & Nagayama, R. S. (2007). The Uncanny Valley: Effect of Realism on the Impression of Artificial Human Faces. Presence Teleoperators Virtual Environments, 16(4), 337-351. MIT Press.
  15. Broadbent, E., MacDonald, B., Jago, L., Juergens, M., & Mazharullah, O. (2007). Human reactions to good and bad robots. 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, 3703-3708. IEEE.
  16. Szatmari, P. (2003). The causes of autism spectrum disorders. BMJ British Medical Journal,326(7382), 173-174. BMJ Publishing Group Ltd.
  17. Bashina, V. M. (2001). Current approaches to the problem of autism in childhood. Vestnik Rossiiskoi akademii meditsinskikh nauk Rossiiskaia akademiia meditsinskikh nauk, (7), 7-13.
  18. Jones, A. P., Happe, F., Gilbert, F., Burnett, S., & Viding, E. (2010). Feeling, caring, knowing: different types of empathy deficit in boys with psychopathic tendencies and autism spectrum disorder. The Journal of Child Psychology and Psychiatry and Allied Disciplines, 51(11), 1188-1197. WILEY-BLACKWELL PUBLISHING, INC.
  19. Goodrich, M. A., Colton, M., Brinton, B., & Fujiki, M. (2011). A Case for Low-Dose Robotics in Autism Therapy. Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2011), 143-144. ACM.
  20. Matsui, D., Minato, T., MacDorman, K. F., & Ishiguro, H. (2005). Generating natural motion in an android by mapping human motion. 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Alberta, Canada, 3301-3308. IEEE.

Figure references


Models of Personality and Emotions, Erasmus 2011/12, University of Vienna, January 2012
