Notes:
Body language refers to the nonverbal cues that people use to communicate. It includes gestures, posture, facial expressions, and other movements of the body. These cues can convey emotions, intentions, and other thoughts and feelings. Body language is an important aspect of communication and can often convey more information than the words being spoken.
Other terms for body language include nonverbal communication, kinesics, and nonverbal behavior. It encompasses the physical gestures, movements, and expressions that convey meaning without words, including facial expressions, eye movements, and posture. Body language can be conscious or unconscious, and it is often used in conjunction with verbal communication to reinforce meaning and express emotion.
Body language can be expressed in virtual humans through a variety of channels, such as facial expressions, gestures, posture, and changes to the virtual body itself. For example, a virtual human may raise its eyebrows to show surprise or skepticism, or fold its arms to signal that it is feeling closed off or defensive. Virtual humans may also use changes in posture, such as leaning forward or standing up straight, to convey different emotions or attitudes. To express body language convincingly, developers may use motion capture technology to track the movements and expressions of real people, or computer-generated animation to author realistic body language for the virtual humans.
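As a rough illustration of this mapping from emotion to nonverbal behavior, the sketch below (Python) turns an emotion label into a set of behaviors and renders them as a simplified BML-style fragment of the kind a behavior realizer such as SmartBody or EMBR is driven by. The behavior names, element names, and attributes here are illustrative assumptions, not the exact BML specification or any cited system's vocabulary.

```python
# Illustrative sketch only: a hypothetical emotion-to-behavior mapping and a
# simplified BML-style renderer. Element and attribute names are assumptions,
# not the exact BML specification used by realizers such as SmartBody or EMBR.

EMOTION_BEHAVIORS = {
    "surprise":  [("face", "raise_eyebrows"), ("posture", "lean_back")],
    "defensive": [("gesture", "fold_arms"), ("posture", "closed_stance")],
    "interest":  [("posture", "lean_forward"), ("gaze", "focus_speaker")],
}

def to_markup(emotion: str, character_id: str = "vh1") -> str:
    """Render the behaviors planned for an emotion as a BML-like fragment."""
    lines = [f'<bml character="{character_id}">']
    for channel, action in EMOTION_BEHAVIORS.get(emotion, []):
        lines.append(f'  <{channel} lexeme="{action}" start="0.0"/>')
    lines.append("</bml>")
    return "\n".join(lines)

if __name__ == "__main__":
    print(to_markup("surprise"))
```

A table-driven mapping like this keeps the emotion model separate from the animation layer, so the same behavior planner can drive different realizers.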
Several technologies can be used to implement body language in virtual humans:
- Motion capture: Sensors track the movement of a person’s body and translate it into a virtual environment; this can be used to create realistic body language for virtual humans (a minimal processing sketch follows this list).
- 3D modeling: 3D modeling software can be used to create detailed models of the human body and its various poses and expressions. This can be used to create virtual humans with expressive body language.
- Artificial intelligence: AI algorithms can be used to analyze and interpret body language, allowing virtual humans to respond appropriately to the body language of others.
- Virtual reality: Virtual reality systems can be used to create immersive environments in which users can interact with virtual humans and experience their body language in a realistic way.
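As one concrete example of the motion-capture path above, the following sketch applies simple exponential smoothing to streamed joint rotations before they are retargeted onto a virtual human's skeleton. The joint names and data layout are assumptions for illustration, not a specific capture SDK's API.

```python
# Illustrative sketch (assumed joint names and data layout, not a specific
# mocap SDK): exponential smoothing of streamed joint rotations, a common
# cleanup step before retargeting captured motion onto a virtual human.

from dataclasses import dataclass, field
from typing import Dict, Tuple

Rotation = Tuple[float, float, float]  # Euler angles in degrees

@dataclass
class JointSmoother:
    alpha: float = 0.3                        # 0 = frozen, 1 = raw capture data
    state: Dict[str, Rotation] = field(default_factory=dict)

    def update(self, frame: Dict[str, Rotation]) -> Dict[str, Rotation]:
        """Blend each incoming joint rotation toward the previous estimate."""
        smoothed = {}
        for joint, target in frame.items():
            previous = self.state.get(joint, target)
            smoothed[joint] = tuple(
                p + self.alpha * (t - p) for p, t in zip(previous, target)
            )
        self.state = smoothed
        return smoothed

if __name__ == "__main__":
    smoother = JointSmoother(alpha=0.5)
    print(smoother.update({"right_elbow": (0.0, 10.0, 0.0)}))
    print(smoother.update({"right_elbow": (0.0, 30.0, 0.0)}))  # moves partway toward 30
```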
One way that neural behavior systems and neural dialog systems can be coordinated in virtual humans is through a central coordination module that aligns the actions of both systems. This can be achieved with a variety of techniques, such as machine learning models trained to recognize patterns in the virtual human’s behavior and dialog and adjust its actions accordingly. Other methods use sensor data and other inputs to guide the behavior and dialog of the virtual human, or pre-programmed scripts and decision trees to dictate its actions. Ultimately, the specific approach depends on the goals and requirements of the virtual human application.
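A minimal sketch of such a coordinator follows, under the assumption of a rule-based dialog system and behavior planner. All class and method names are illustrative, not an existing framework's API; real systems could swap in learned models for either stub.

```python
# Hypothetical sketch of the central coordinator described above. The class
# and method names are illustrative assumptions, not an existing framework's
# API; real systems may use learned models instead of these rule-based stubs.

class RuleDialogSystem:
    """Stand-in dialog system: returns the next utterance and its dialog act."""
    def next_turn(self, user_utterance: str):
        if user_utterance.strip().endswith("?"):
            return "Let me think about that.", "hold_turn"
        return "I see.", "acknowledge"

class RuleBehaviorSystem:
    """Stand-in behavior planner: maps dialog acts and user affect to gestures."""
    def plan(self, dialog_act: str, user_affect: str):
        behaviors = {"hold_turn": ["gaze_away", "beat_gesture"],
                     "acknowledge": ["head_nod"]}.get(dialog_act, [])
        if user_affect == "distressed":
            behaviors = behaviors + ["lean_forward"]
        return behaviors

class VirtualHumanCoordinator:
    """Aligns the dialog system's output with matching nonverbal behavior."""
    def __init__(self, dialog_system, behavior_system):
        self.dialog_system = dialog_system
        self.behavior_system = behavior_system

    def respond(self, user_utterance: str, user_affect: str) -> dict:
        utterance, dialog_act = self.dialog_system.next_turn(user_utterance)
        behaviors = self.behavior_system.plan(dialog_act, user_affect)
        return {"speech": utterance, "nonverbal": behaviors}

if __name__ == "__main__":
    coordinator = VirtualHumanCoordinator(RuleDialogSystem(), RuleBehaviorSystem())
    print(coordinator.respond("How does this work?", "distressed"))
```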
Wikipedia:
References:
See also:
Behavior Realizers | BML (Behavior Markup Language) & Dialog Systems | Embodiment Meta Guide | EMBR (Embodied Agents Behavior Realizer) | SmartBody
Emotion recognition based preference modelling in argumentative dialogue systems
N Rach, K Weber, A Aicher… – 2019 IEEE …, 2019 – ieeexplore.ieee.org
… PerDial’19 – The 1st International Workshop on Pervasive Computing and Spoken Dialogue Systems Technology … To enrich our observations of emotional responses, we chose to also include body language in our emotion recognition system …
Coastal at semeval-2019 task 3: Affect classification in dialogue using attentive bilstms
AV González, VPB Hansen, J Bingel… – Proceedings of the 13th …, 2019 – aclweb.org
… In Tutorial and Research Workshop on Affective Dialogue Systems, pages 89–100. Springer. Laurence Devillers, Ioana Vasilescu, and Lori Lamel. 2002 … 2010. Multimodal semi-automated affect detection from conversational cues, gross body language, and facial features …
Augmentative and Alternative Communication System Using Information Priority and Retrieval
Y Heo, S Kang – 2019 IEEE International Conference on Big …, 2019 – ieeexplore.ieee.org
… I. INTRODUCTION An Augmentative and Alternative Communication (AAC) system is defined as an assisted dialogue system for those who suffer from … unaided communication systems that rely on the user’s body to convey messages, such as gestures, body language, and/or …
Observing dialogue in therapy: Categorizing and forecasting behavioral codes
J Cao, M Tanana, ZE Imel, E Poitras, DC Atkins… – arXiv preprint arXiv …, 2019 – arxiv.org
Observing Dialogue in Therapy: Categorizing and Forecasting Behavioral Codes. Jie Cao†, Michael Tanana‡, Zac E. Imel‡, Eric Poitras‡, David C. Atkins, Vivek Srikumar† †School of Computing, University of Utah …
Take the Initiative: Mixed Initiative Dialogue Policies for Pedagogical Agents in Game-Based Learning Environments
JB Wiggins, M Kulkarni, W Min, KE Boyer… – … Conference on Artificial …, 2019 – Springer
… conducting multi-party dialogues [2], generating and understanding emotion [4, 5], and producing and interpreting body language [1]. However … a conversation (through actions such as seeking information or changing the topic) has the initiative [7]. Dialogue systems typically use …
Socially-Aware Dialogue System
R Zhao – 2019 – lti.cs.cmu.edu
… sessions that fit their interests (Matsuyama et al., 2016). To incorporate socially-aware intelligence to a traditional dialogue system, we argue … what we call social reasoning to determine how to converse with the user, including spoken language and body language, to best …
Learn to Gesture: Let Your Body Speak
T Gan, Z Ma, Y Lu, X Song, L Nie – Proceedings of the ACM Multimedia …, 2019 – dl.acm.org
… We introduced a novel application that help people learn the exemplary body language at low … Adequate head movements and manual gestures have shown positive impacts for multimodal dialogue systems and social robotics [2]. Social robotics incorporate both speech and …
Unobtrusive Vital Data Recognition by Robots to Enhance Natural Human–Robot Communication
G Bieber, M Haescher, N Antony, F Hoepfner… – … , Societal and Ethical …, 2019 – Springer
… Furthermore, the communicating counterparts’ movements while speaking or performing natural body language have to be considered … In Tutorial and Research Workshop on Affective Dialogue Systems (pp. 36–48). Berlin, Heidelberg: Springer. He, CY (2017) …
Computational linguistics: Introduction to the thematic issue
A Gelbukh – Computación y Sistemas, 2019 – cys.cic.ipn.mx
… Spanish, typically in the form of text or speech, as well as, in multimodal setting, associated facial expressions and body language … They write: Retrieval-based dialogue systems converse with humans by ranking candidate responses according to their relevance to the history of …
Conversation is multimodal: thus conversational user interfaces should be as well
S Schaffer, N Reithinger – … of the 1st International Conference on …, 2019 – dl.acm.org
… An important part of human communication is nonverbal communication, which includes body language, visual language, symbolism, touch, music and various … So called multimodal dialog systems have been developed with a wide range of research foci (eg [5, 16]) and for a …
Multimodal open-domain conversations with robotic platforms
K Jokinen, G Wilcock – Multimodal Behavior Analysis in the Wild, 2019 – Elsevier
… As there are millions of articles in Wikipedia, this can reasonably be called an open-domain dialog system … the interest level correctly, the external interface should not be limited to verbal feedback, but should include intonation, eye-gaze, gestures, body language and other …
Visualizing natural language interaction for conversational in-vehicle information systems to minimize driver distraction
M Braun, N Broy, B Pfleging, F Alt – Journal on Multimodal User Interfaces, 2019 – Springer
… following section. 3.2.2 Dimensions of the design space. User Input In conversations between humans, we mostly rely on speech to transfer messages. However gaze behavior, gestures, and body language also play a role. In …
Project r-castle: Robotic-cognitive adaptive system for teaching and learning
D Tozadore, AHM Pinto, J Valentini… – … on Cognitive and …, 2019 – ieeexplore.ieee.org
… Regarding the specific algorithms to collect the objective measures, those that depend on users’ verbal communication (nW, RWa, Tta) will be provided by the Dialogue System, whereas the others, are analyzed with Adaptive Modules algorithms themselves …
Shoehorning in the name of science
J Edlund – Proceedings of the 1st International Conference on …, 2019 – dl.acm.org
… 2007. Body Language: Lessons from the Near-Human. In Genesis Redux: Essays on the history and philosophy of artificial life, Jessica Riskin (Ed.). Chicago, Chapter 17, 346–374 … 1999. A Responsive Dialog System. In Machine Conversations …
Towards More Realistic Human-Robot Conversation: A Seq2Seq-based Body Gesture Interaction System
M Hua, F Shi, Y Nan, K Wang, H Chen… – arXiv preprint arXiv …, 2019 – arxiv.org
Towards More Realistic Human-Robot Conversation: A Seq2Seq-based Body Gesture Interaction System. Minjie Hua, Fuyuan Shi, Yibing Nan, Kai Wang, Hao Chen, and Shiguo Lian. Abstract—This paper presents …
Online processing for speech-driven gesture motion generation in android robots
CT Ishi, R Mikata, T Minato… – 2019 IEEE-RAS 19th …, 2019 – ieeexplore.ieee.org
Abstract— Hand gestures commonly occur in daily dialogue interactions, and have important functions in communication. In this study, we proposed and implemented an online processing for a speech-driven gesture motion generation in an android robot dialogue system …
Designing a Personality-Driven Robot for a Human-Robot Interaction Scenario
HB Mohammadi, N Xirakia, F Abawi… – … on Robotics and …, 2019 – ieeexplore.ieee.org
… C. Motion module The Motion module controls NICO’s physical interaction and also projects body language cues … D. Dialog module The Spoken Dialog System (SDS) handles the direct verbal interaction with the user via multiple language-related submodules …
Towards robot-assisted children speech audiometry
S Ondáš, D Hládek, M Pleva, J Juhár… – 2019 10th IEEE …, 2019 – ieeexplore.ieee.org
… NAO can be considered as a great tool to prepare a multimodal dialogue system due to its support of vision, hearing, gesture production, and body language. Moreover, an autonomous mode of the robot supports human-like behavior …
Comparison and efficacy of synergistic intelligent tutoring systems with human physiological response
F Alqahtani, N Ramzan – Sensors, 2019 – mdpi.com
The analysis of physiological signals is ubiquitous in health and medical diagnosis as a primary tool for investigation and inquiry. Physiological signals are now being widely used for psychological and social fields. They have found promising application in the field of computer …
PRIMER: An Emotionally Aware Virtual Agent.
C Gordon, A Leuski, G Benn, E Klassen, E Fast… – IUI …, 2019 – research.ibm.com
… KEYWORDS Virtual Reality, Virtual Agents, Spoken Dialogue Systems, Mixed-Initiative Dialogue … Ellie is capable of displaying a range of different reactive emotions, gestures, body language, and linguistic behavior, depending on the user’s current perceived emotional state …
Designing for dialogue: The challenges of facilitating online civil dialogue for the urban planning process
I PERSSON, T HENRIK – 2019 – odr.chalmers.se
Designing for dialogue: The challenges of facilitating online civil dialogue for the urban planning process. Master’s thesis in Computer science and engineering. ISAK PERSSON, HENRIK TRAN. Department of Computer …
The role of speech technology in biometrics, forensics and man-machine interface.
S Singh – International Journal of Electrical & Computer …, 2019 – search.ebscohost.com
… speech, but the speech generated by the machine lacks individuality, expression and the communicative intent and the dialogue systems of the … Speech recognition systems that teach body language and facial expression can also be used to evaluate the danger, for example …
VICA, a visual counseling agent for emotional distress
Y Sakurai, Y Ikegami, M Sakai, H Fujikawa… – Journal of Ambient …, 2019 – Springer
… Statistical significance of difference in length of interactions involving the same users shows that modern dialogue systems are a substantial, though not dramatic, improvement on their predecessor … Ellie’s body language is designed to mirror that of an actual therapist …
Virtual human standardized patients for clinical training
T Talbot – Virtual Reality for Psychological and Neurocognitive …, 2019 – Springer
… 17.4) has the ability to understand the spoken dialog and responds to the student in a lifelike, natural manner with realistic voice, body language, gestures and facial expressions. As the single student progresses through the scenario, a branching dialog system can lead to …
Human-Robot Scaffolding, an Architecture to Support the Learning Process
E González, J Páez, F Luis-Ferreira, J Sarraipa… – Iberian Robotics …, 2019 – Springer
… The first step to create the model is to recognize the task state and the body language … In addition, support during decision-making requires consideration of aspects such as the dialogue system for the understanding of robot interventions [33, 34], motor interaction requirements …
Do Conversational Partners Entrain on Articulatory Precision?
N Lubold, SA Borrie, TS Barrett, MM Willi… – …, 2019 – humaninteractionlab.com
… Index Terms: entrainment, alignment, articulatory precision, human-computer interaction, dialog systems … device that facilitates success by enhancing both social connection and mutual understanding [1]–[3]. Entrainment has been observed in body language, lexical content …
Design of conversational humanoid robot based on hardware independent gesture generation
D Baumert, S Kudoh, M Takizawa – arXiv preprint arXiv:1905.08702, 2019 – arxiv.org
… T. Makino, and Y. Matsuo, “Syntactic filtering and content-based retrieval of twitter sentences for the generation of system utterances in dialogue systems,” in Situated … [10] S. Levine, C. Theobalt, and V. Koltun, “Real-time prosody-driven synthesis of body language,” in ACM …
Modeling Mentor-Mentee Dialogues in Film
A Dobrosovestnova, M Skowron, S Payr… – Cybernetics and …, 2019 – Taylor & Francis
… Bearing in mind that the objective we are pursuing is to design a dialogue system that relies on recognition and generation of written text, the model we propose does not incorporate other communication modalities (eg body language, prosody, pauses) that otherwise play an …
More general evaluation of a client-centered counseling agent
T Horii, Y Sakurai, E Sakurai, S Tsuruta… – 2019 IEEE World …, 2019 – ieeexplore.ieee.org
… Such nodding called “Unazuki” in Japanese is a kind of body language … On the other hand, it is necessary for the dialogue system to have knowledge about the assumed client and the content to be talked in advance, and the construction therefore requires a large cost …
Toward a self-adapting resource-restricted voice-based Classification of Naturalistic Interaction Stages
N Weißkirchen – duepublico2.uni-due.de
… are capable of interpreting the syntax of its user, but also on the non-syntactical information conveyed through voice affect or body language … The user learns to interact by a dialogue system, in which the user and the machine interact by voicing their questions and responding …
Nonverbal Behavior in Multimodal Performances
A Cafaro, C Pelachaud… – The Handbook of …, 2019 – books.google.com
… 2014] for behavior planning and generation. Flipper is a library for specifying dialogue rules for dialogue systems, that uses XML-templates to describe the preconditions, effects and BML behaviors of these rules. A simple example of a greeting template is shown in Figure 6.6 …
There is no general AI: Why Turing machines cannot pass the Turing test
J Landgrebe, B Smith – arXiv preprint arXiv:1906.05833, 2019 – academia.edu
… For we also review the current state of the art in dialogue system building, and conclude by identifying what we see as the potential for dialogue systems that would still be useful even though they fall … 1. non-verbal level: including facial expression, gestures and body language …
An overview of machine learning techniques applicable for summarisation of characters in videos
G Nair, KE Johns, A Shyna… – … Conference on Intelligent …, 2019 – ieeexplore.ieee.org
… C. Emotion Detection Emotion detection is the process of recognising emotions from the facial expressions, body language, as well as … Real-Time Speech Emotion and Sentiment Recognition for Interactive Dialogue Systems [25] aims at improving the user experience while …
Stepped Warm-Up–The Progressive Interaction Approach for Human-Robot Interaction in Public
M Zhao, D Li, Z Wu, S Li, X Zhang, L Ye, G Zhou… – … Conference on Human …, 2019 – Springer
… For instance, Kendon [28] suggests that friends usually exchange greetings twice, first using body language at a far distance and again by … 1). Xiaodu is benefited from the AI techniques (eg, NLP, dialogue system, speech recognition) of Baidu, and is able to communicate …
There is no Artificial General Intelligence
J Landgrebe, B Smith – arXiv preprint arXiv:1906.05833, 2019 – arxiv.org
… Efforts are directed mainly towards what are called dialogue systems, or in other words systems able to engage in two-party conversations, which are optimistically projected to be widely used in commercial agent-based applications in areas such as travel booking or service …
Multi-modal motion-capture-based biometric systems for emergency response and patient rehabilitation
ML Gavrilova, F Ahmed, ASMH Bari, R Liu… – … and Implementation of …, 2019 – igi-global.com
… One of the first publicly available body language databases was created to recognize 10 emotions by using RGB camera … Intelligent avatars and other interactive dialog systems can improve user’s experience while accessing smart communication devices (Faundez-Zanuy et al …
SocialNLP EmotionX 2019 Challenge Overview: Predicting Emotions in Spoken Dialogues and Chats
B Shmueli, LW Ku – arXiv preprint arXiv:1909.07734, 2019 – arxiv.org
… Emotions involve a complicated interplay of mind, body, language, and culture [Bazzanella, 2004] … In particular, dialogue systems such as those available on social media or instant messaging services are rich sources of textual data and have become the focus of much attention …
13 Older adults’ experiences with Pepper humanoid robot
A Poberznik, S Merilampi – Tutkimusfoorumi – theseus.fi
… Pepper–humanoid social robot Pepper is a humanoid robot capable of demonstrating body language and perceiving and interacting with … Metrics of Evaluation of Pepper Robot as a Social Companion for the Elderly: 8th International Workshop on Spoken Dialog Systems …
Teaching Robots Behaviors Using Spoken Language in Rich and Open Scenarios
V Paléologue – 2019 – hal.archives-ouvertes.fr
HAL Id: tel-02566784 https://hal.archives-ouvertes.fr/tel-02566784 Submitted on 15 May 2020. HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not …
A taxonomy of social cues for conversational agents
J Feine, U Gnewuch, S Morana, A Maedche – International Journal of …, 2019 – Elsevier
Trust and Decision Making in Turing’s Imitation Game
H Shah, K Warwick – Advanced Methodologies and Technologies in …, 2019 – igi-global.com
… of each other in the former, thus “cannot depend on such confidence-engendering measures as physical proximity, handshakes, body language” or “a com … It is a matter of time before we see a critical mass of smarter dialogue systems representing brands to characterise trust in e …
Nonverbal behavior in multimodal performances
A Cafaro, C Pelachaud, SC Marsella – The Handbook of Multimodal …, 2019 – dl.acm.org
6 Nonverbal Behavior in Multimodal Performances. Angelo Cafaro, Catherine Pelachaud, Stacy C. Marsella. 6.1 Introduction. The physical, nonverbal behaviors that accompany face-to-face interaction convey a wide variety …
Bridging the Gap between Robotic Applications and Computational Intelligence in Domestic Robotics
J Zhong, T Han, A Lotfi, A Cangelosi… – 2019 IEEE Symposium …, 2019 – ieeexplore.ieee.org
… D. Communication: Speech Recognition and dialogue system. Natural language is perhaps the most natural way to communicate in our daily life … Some commercial products have been exploring methods to use common sense knowledge to build better dialogue systems …
Annotation-efficient approaches towards real-time emotion recognition
IP Lajos – 2019 – ritsumei.repo.nii.ac.jp
… even on small sets of labeled data, thus being applicable for the pre-training of commercial games, dialogue systems, and other … of conversations, the only directly perceivable stimulus is usually the communication of one’s thoughts through verbalization and body language …
Personality In Conversational UI design.
BA Kumar – engrxiv.org
Personality in Conversational UI Design, mathematical models. Dr Bheemaiah, Anil Kumar, AB, Seattle WA 98125, USA, anilkumarbheemaiah@gmail.com. Abstract: This paper describes mathematical representations …
Technologies for automated analysis of co-located, real-life, physical learning spaces: Where are we now?
YHV Chua, J Dauwels, SC Tan – … of the 9th International Conference on …, 2019 – dl.acm.org
Technologies for automated analysis of co-located, real-life, physical learning spaces: Where are we now? Yi Han Victoria Chua, Nanyang Technological University, 50 Nanyang Avenue, Singapore, vicchuayh@gmail.com …
Exploring perceived emotional intelligence of personality-driven virtual agents in handling user challenges
X Ma, E Yang, P Fung – The World Wide Web Conference, 2019 – dl.acm.org
… KEYWORDS virtual agents, emotional intelligence, perceived emotional intelligence, personality, emotions, affective computing, dialog system. ACM Reference … the most common affective features, including facial expression (eg, smiles and frowns), body language (eg, shrug …
Computational Sarcasm for Different Languages: A Survey
A Dubey, A Joshi, P Bhattacharyya – cfilt.iitb.ac.in
… Sarcasm on the internet is hard to interpret because of the following reasons: 1. Speaker’s body language is unknown which is a major part of how people communicate with each other. 2. Tone of voice makes a huge difference …
Use of non-verbal vocalizations for continuous emotion recognition from speech and head motion
SN Fatima, E Erzin – 2019 14th IEEE Conference on Industrial …, 2019 – ieeexplore.ieee.org
… that are based on continuous emotion recognition (CER) include human-robot communication, call center dialog systems, smart gaming … S. Narayanan, “Tracking continuous emotional trends of participants during affective dyadic interactions using body language and speech …
Hmm, Did You Hear What I Just Said?: Development of a Re-Engagement System for Socially Interactive Robots
HL Cao, PC Torrico Moron, PG Esteban, A De Beir… – Robotics, 2019 – mdpi.com
… environment and expressing verbal and nonverbal behaviors using speech, facial expressions, paralanguage, and body language [1,2 … of linguistic hesitation actions (eg, “uhm”, “hmm”) to manage conversational engagement in open-world, physically situated dialog systems …
Human-Robot Interaction
H Ayanoğlu, JS Sequeira – Emotional Design in Human-Robot Interaction, 2019 – Springer
… Alonso-Martín F, Castro-González A, Javier F, De Gorostiza F, Salichs MÁ (2015) Augmented robotics dialog system for enhancing … Eindhoven. Beck A, Cañamero L, Bard KA (2010) Towards an affect space for robots to display emotional body language …
Interaction with an Embodied Conversational Agent in Virtual Reality
T Hariri – 2019 – search.proquest.com
… better. If the agent had a natural gaze behavior and body language based on the user’s behavior … Recording Answer Figure 2: Design Dialog System I downloaded the character from Turbosquid and created the fourteen blend-shapes for lip-syncing using Autodesk Maya2 …
A need for trust in conversational interface research
J Edwards, E Sanoubari – … of the 1st International Conference on …, 2019 – dl.acm.org
… example of these interfaces is conversational user inter- faces (CUIs) which can include text-based dialogue systems, voice- based … Some of these methods might include behavioural measures, subjective questionnaires, analysis of body language and linguistic choices, and …
Comparing the utility of different classification schemes for emotive language analysis
L Williams, M Arribas-Ayllon, A Artemiou… – Journal of Classification, 2019 – Springer
… Written online communication has led to the emergence of informal, sometimes ungrammatical, textual conventions (Purver and Battersby 2012) used to compensate for the absence of body language and intonation, which otherwise account for 93% of non-verbal …
Empathic Response Generation in Chatbots.
T Spring, J Casas, K Daher, E Mugellini… – SwissText, 2019 – ceur-ws.org
… In normal face-to-face conversations, emotions are also expressed over the tonality of the speaker, body language, gestures and facial expressions … A. Bartl and G. Spanakis. 2017. A retrieval- based dialogue system utilizing utterance and context embeddings.
Deep models for converting sarcastic utterances into their non sarcastic interpretation
A Dubey, A Joshi, P Bhattacharyya – Proceedings of the ACM India Joint …, 2019 – dl.acm.org
… Sarcasm on the internet is hard to interpret because of the following reasons: (1) Speaker’s body language is unknown which is a major part of how people communicate with each other … 2006. “yeah right”: sarcasm recognition for spoken dialogue systems. In INTERSPEECH …
Exploring the Uncanny Valley Theory in the Constructs of a Virtual Assistant Personality
MP Garcia, SS Lopez – Proceedings of SAI Intelligent Systems Conference, 2019 – Springer
… Conversational Artificial Intelligence, also referred as Virtual Assistants, are spoken dialogue systems that have the purpose of helping users complete a … way we say what we say (tone of voice, pauses, etc.) and the remaining 55% accounts for body language (gestures, posture …
Prototyping relational things that talk: a discursive design strategy for conversational AI systems
B Aga – 2019 – pearl.plymouth.ac.uk
PROTOTYPING RELATIONAL THINGS THAT TALK: A DISCURSIVE DESIGN STRATEGY FOR CONVERSATIONAL AI SYSTEMS by BIRGITTE AGA. A thesis submitted to the University of Plymouth in partial fulfilment for the degree of DOCTOR OF PHILOSOPHY …
PHD THESIS SUMMARY
EAM FLOREA, IA AWADA – upb.ro
… The user’s emotions can be recognized by the perception of the user’s facial expressions, the user’s body language or the user’s voice. The works that recognized the emotion of the user … the user’s facial expression or the user’s body language or the user’s voice …
User-adaptive interaction in social robots: A survey focusing on non-physical interaction
GS Martins, L Santos, J Dias – International Journal of Social Robotics, 2019 – Springer
… User’s pose. Robot commands. Rule-based. Gaze position, facial emotion, body language, perceived bond, satisfaction, amusement, anxiety, enjoyment, observed leadership and expectancy. Manual classification, Questionnaires. (c) Social Robots with dynamic user models …
Online counselling services for Youth@ risk.
R de la Harpe, C Settley, R Cilliers – CONF-IRM, 2019 – researchgate.net
… people seeking help hides people who are not sincere and also limits the interpretation of the extend of the issue without the ability to also read body language, tone of … Application of Synchronous Text-Based Dialogue Systems in Mental Health Interventions: Systematic Review …
An Investigation of the Accuracy of Real Time Speech Emotion Recognition
JS Deusi, EI Popa – … on Innovative Techniques and Applications of …, 2019 – Springer
… Humans express emotions in several ways: heart rate, perspiration, facial expressions, body language, voice tone, crying and laughing [1]. These … Schuller [1] questions “Alexa, Cortana, Siri, and many other dialogue systems have hit the consumer home on a larger scale than …
This is the author’s version of a work that was published in the following source
J Feine, U Gnewuch, S Morana, A Maedche – 2019 – researchgate.net
… In the 1980s, this was followed by the appearance of voice-based dialog systems, voice user interfaces … Despite the large number of different terms used to describe this technology (eg, CA, ECA, chatbot, dialog systems, companions, virtual …
Design and implementation of embodied conversational agents
F Geraci – 2019 – rucore.libraries.rutgers.edu
… mands, but true conversation is achieved by also capturing information from facial expressions and body language, to finally assign a semantic meaning, in context of it all … For instance, facial expressions, the emotional load of the speech and body language, etc …
Improv with Robots: Creativity, Inspiration, Co-Performance
J Rond, A Sanchez, J Berger… – 2019 28th IEEE …, 2019 – ieeexplore.ieee.org
… progressing a scene. Improvisers can provide offers through dialogue (“Pass me that hammer, Ed”), the way words are said (tone of voice), motions (throwing an imaginary ball), and body language (looking sad). Insertion of …
Rehabilitation, the great absentee of virtual coaching in medical care: Scoping review
P Tropea, H Schlieter, I Sterpi, E Judica, K Gand… – Journal of Medical …, 2019 – jmir.org
Communicating Bad News: Insights for the Design of Consumer Health Technologies
EK Choe, ME Duarte, H Suh, W Pratt… – JMIR human …, 2019 – humanfactors.jmir.org
Towards gameworld studies
S Conway, B Elphinstone – Journal of Gaming & Virtual Worlds, 2019 – ingentaconnect.com
… the ‘lived-body’ ([1962] 2005). Such phrases, amongst other things, point towards the embodied nature of lived experience: meaning is always filtered through one’s world, body, language, history and so on. On this basis, one …
User Interface Design
R Heimgärtner – Intercultural User Interface Design, 2019 – Springer
… The aim of the first stage is to facilitate the understanding of information and to generate trust through the conscious use of culture-specific signs. Culture-specific characters include: language, body language, way of dressing, design of living, working and public spaces …
Dominant and submissive nonverbal behavior of virtual agents and its effects on evaluation and negotiation outcome in different age groups
AM Rosenthal-von der Pütten, C Straßmann… – Computers in Human …, 2019 – Elsevier
Beyond dyadic interactions: considering chatbots as community members
J Seering, M Luria, G Kaufman, J Hammer – Proceedings of the 2019 …, 2019 – dl.acm.org
Beyond Dyadic Interactions: Considering Chatbots as Community Members. Joseph Seering, Carnegie Mellon University, Pittsburgh, Pennsylvania, USA, jseering@cs.cmu.edu. Michal Luria, Carnegie Mellon University, Pittsburgh, Pennsylvania, USA, mluria@cs.cmu.edu …
Eyetracking-based assessment of affect-related decay of human performance in visual tasks
J Przybyło, E Kańtoch, P Augustyniak – Future Generation Computer …, 2019 – Elsevier
… others. The non-verbal manner in which emotions are expressed by the human has been studied by social sciences and is commonly known as body language. Human emotions play a crucial role in human–computer interfaces …
Contextual language understanding Thoughts on Machine Learning in Natural Language Processing
B Favre – 2019 – hal-amu.archives-ouvertes.fr
… The ELIZA chatbot (Weizenbaum 1976) or contestants to the Loebner Prize competition (Stephens 2004) are dialog systems which rely on conversational tricks in order to evade difficult questions (such as invoking boredom, switching topics, etc.) Machine Translation is …
Designing a social robot to support children’s inquiry learning: A contextual analysis of children working together at school
DP Davison, FM Wijnen, J van der Meij… – International journal of …, 2019 – Springer
International Journal of Social Robotics https://doi.org/10.1007/s12369-019-00555-6 Designing a Social Robot to Support Children’s Inquiry Learning: A Contextual Analysis of Children Working Together at School. Daniel …
The facilitator is a Bot: towards a conversational agent for facilitating idea elaboration on idea platforms
EAC Bittner, GC Küstermann, C Tratzky – 2019 – aisel.aisnet.org
… Different terms have been used for different CAs, such as machine conversation system, virtual or computer agent, dialogue system, and chatbot … conversing with the learner(s) via text messages, speech or other modalities such as facial expressions, gestures or body language …
Negotiating meanings online: Disagreements about word meaning in discussion forum communication
J Myrendal – Discourse Studies, 2019 – journals.sagepub.com
This article describes word meaning negotiation (WMN) in online discussion forum communication, a form of computer-mediated communication (CMC). WMN occurs when participants who are engaged in a di…
Virtual humans: Today and tomorrow
D Burden, M Savin-Baden – 2019 – books.google.com
… The key mind processes of a virtual human; Kismet robot; Research on dialogue systems for language …
Using Event Representations to Generate Robot Semantics
P Gärdenfors – ACM Transactions on Human-Robot Interaction (THRI), 2019 – dl.acm.org
Using Event Representations to Generate Robot Semantics. PETER GÄRDENFORS, Lund University and University of Technology Sydney. Most semantic models employed in human-robot interactions concern how …
Examining the effects of robotic service on brand experience: the moderating role of hotel segment
APH Chan, VWS Tung – Journal of Travel & Tourism Marketing, 2019 – Taylor & Francis
… Hotels across all three segments – budget, midscale, and luxury – typically train their staff on impression management, covering non-verbal aspects such as body language, posture, and behaviour, to make a strong sensory impression (Manzur & Jogaratnam, 2007) …
Usability Engineering
R Heimgärtner – Intercultural User Interface Design, 2019 – Springer
… of standards is “the ergonomic design of computer workstations taking into account the individual needs of users” (ISO 9241-110, p. 9). Part 110 specifies seven principles of dialogue design, which are to be applied as guidelines in the design and evaluation of dialogue systems …
The Development of Telepresence Robot
P Duan – 2019 – search.proquest.com
… However, in order to introduce customers to the “temperature” service such as dish awareness, historical evaluation, and jokes, a voice recognition dialogue system based on the cloud AI server is usually configured to improve the customer experience. Bookstore …
DIL-A Conversational Agent for Heart Failure Patients
S Moulik – 2019 – search.proquest.com
… verbal and nonverbal human-computer interaction [58]. A conversational agent (CA) can also be considered a dialogue system. It is in essence a computer system intended to converse with a human with a coherent structure. Dialogue systems have employed …
First periodic report
J van Loon, H op den Akker, T Beinema, M Broekhuis… – 2019 – council-of-coaches.eu
… initial designs of all the important elements of the framework: the virtual appearance of the coaches, the body language and the … Agent Platform – an open source platform that will allow researchers and developers to create their own multi-agent dialogue systems for various …
Hybrid framework for speaker-independent emotion conversion using i-vector PLDA and neural network
S Vekkot, D Gupta, M Zakariah, YA Alotaibi – IEEE Access, 2019 – ieeexplore.ieee.org
… INDEX TERMS: CV-GMM, Speech Emotion, Feed-forward ANN, i-vector, MFCC, PLDA. I. INTRODUCTION. Emotions form a prominent para-linguistic element of human communication, consisting of speech, facial expressions, gestures, body language, etc …
Spoken conversational search: audio-only interactive information retrieval
J Trippas – 2019 – researchbank.rmit.edu.au
… 2.5.1 Spoken Dialogue Systems … SCoSAS: Spoken Conversational Search Annotation Schema; SCS: Spoken Conversational Search; SDS: Spoken Dialogue System; TREC: Text REtrieval Conference; TTS: Text-To-Speech; WOZ: Wizard of Oz …
Using Socially Expressive Mixed Reality Arms for Enhancing Low-Expressivity Robots
T Groechel, Z Shi, R Pakkar… – 2019 28th IEEE …, 2019 – ieeexplore.ieee.org
… Fig. 2: Keyframes for Kuri’s clapping animation. Since physical robots are limited by cost, physical safety, and mechanical constraints, socially interactive robots often explore additional communication channels, ranging from lights [28] to dialogue systems [29] …
International Journal of Transmedia Literacy (IJTL). Vol 4 (2018): Expanding Universes. Exploring Games and Transmedial Ways of World-building
R Koskimaa, K Maj, K Olkusz – 2019 – books.google.com
Expanding Universes. Exploring Games and Transmedial Ways of World-building. Edited by Raine Koskimaa, Krzysztof Maj and Ksenia Olkusz. Guest Editors’ Profiles. Introduction …
Chatbots, will they ever be ready? Pragmatic shortcomings in communication with chatbots
S TONTS – 2019 – politesi.polimi.it
… Spoken words; writing, sign language; paralanguage (pitch, volume, speaking rate, etc.); body language (gestures, facial expressions, eye contact, etc.) … communication is often essential when conveying information and making judgments about others (Hargie, Dickson 2004) …
An Active Learning Paradigm for Online Audio-Visual Emotion Recognition
I Kansizoglou, L Bampis… – IEEE Transactions on …, 2019 – ieeexplore.ieee.org
… I. INTRODUCTION. Emotional state, expressed through non-verbal cues, such as facial expressions, body language, voice tone and tempo in speech, owns a decisive role in communication since it can define the actual meaning of an utterance …
Automated analysis of non-verbal behaviour of schizophrenia patients
D Chakraborty – 2019 – dr.ntu.edu.sg
… lexical analysis of interviews with schizophrenic patients,” in Proceedings of the 9th International Workshop on Spoken Dialogue Systems (IWSDS), 2018 … When humans communicate with each other, their speaking mannerisms and body language play …
Context-aware speech synthesis: A human-inspired model for monitoring and adapting synthetic speech
M Nicolao – 2019 – etheses.whiterose.ac.uk
Context-aware speech synthesis: A human-inspired model for monitoring and adapting synthetic speech. Mauro Nicolao, Department of Computer Science, University of Sheffield, January 2019. Dissertation submitted to …
Universal Access in Human-Computer Interaction. Multimodality and Assistive Environments: 13th International Conference, UAHCI 2019, Held as Part of the …
M Antona, C Stephanidis – 2019 – books.google.com
Margherita Antona, Constantine Stephanidis (Eds.) Universal Access in Human-Computer Interaction: Multimodality and Assistive Environments. 13th International Conference, UAHCI 2019, Held as Part of the 21st HCI …
Responsible innovation in online therapy
KV Vold, D Peters, R Calvo, D Robinson – 2019 – repository.cam.ac.uk
Responsible innovation in online therapy: A report on technical opportunities, ethical issues, and recommendations for design. Report Date: 22.07.2019. Imperial Consultants, Ltd. This report is the independent expert opinion of the authors …
Emotion Recognition from Speech using Machine Learning algorithms
N Chaiko, R Zamparelli – 2019 – ixa.si.ehu.es
Emotion Recognition from Speech using Machine Learning algorithms. Author: Natallia Chaiko. Advisors: Prof. Eva Navas (University of the Basque Country), Prof. Roberto Zamparelli (University of Trento). European …
A comparative study of social bot classification techniques
F Örnbratt, J Isaksson, M Willing – 2019 – diva-portal.org
… A topology suggesting six different types of bots (Gorwa & Guilbeault, 2018) have been constructed: • Web Robots (crawlers) • Chatbots (natural language based dialog system) • Spambots (bots that advertise and post spam on online messaging platforms) …
Aurora-a study on the guided meditation using immersive media
J Dantas Silva – 2019 – ls00012.mah.se
Aurora – a study on the guided meditation using immersive media. Author: Juliana Dantas Silva. Media Technology: Strategic Media Development (ME 620A). Master Thesis, 15 credits, Advanced Level. Supervisor: Erik …
Restoration Work: Responding to Everyday Challenges of HIV Outreach
N Kumar, A Ismail, S Sherugar… – Proceedings of the ACM …, 2019 – dl.acm.org
Restoration Work: Responding to Everyday Challenges of HIV Outreach. NEHA KUMAR, Georgia Institute of Technology, USA. AZRA ISMAIL, Georgia Institute of Technology, USA. SAMYUKTA SHERUGAR, Google …
Investigating the Role of Social Media in Supporting Parents and Teachers of Students with Down’s Syndrome: Focus on Early Intervention Services in the Kingdom of …
AH ALShamare – 2019 – discovery.ucl.ac.uk
Investigating the Role of Social Media in Supporting Parents and Teachers of Students with Down’s Syndrome: Focus on Early Intervention Services in the Kingdom of Saudi Arabia. By Awatif Habeeb ALShamare. A …