Notes:
A behavior planner is the component of an artificial intelligence (AI) system responsible for deciding which actions the system should take in a given situation. Behavior planners are widely used in robotics and autonomous systems, where they coordinate the system's actions to achieve a specific goal or objective.
Behavior planners typically operate by examining the current state of the system and its environment, and determining the best course of action based on a set of rules or objectives. In some cases, behavior planners may use machine learning algorithms to learn from past experience and adapt their actions over time.
Behavior planners can work together with dialog systems, which are AI systems that are designed to generate and interpret natural language input and output. For example, a behavior planner in a home automation system might receive input from a dialog system indicating that a user has asked the system to turn on the lights. The behavior planner would then decide how to fulfill this request, based on the current state of the system and the available options.
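As a rough illustration of the home-automation example above, here is a minimal rule-based behavior-planner sketch in Python: it receives an intent produced by a dialog system, inspects the current world state, and returns an action for a downstream realizer or device layer. All names (Intent, BehaviorPlanner, the turn_on_lights intent, the world-state keys) are hypothetical placeholders rather than an existing API, and a real planner could swap the rule table for a learned policy.

from dataclasses import dataclass, field


@dataclass
class Intent:
    """A communicative intention produced by the dialog system (e.g. NLU output)."""
    name: str
    slots: dict = field(default_factory=dict)


class BehaviorPlanner:
    """Selects an action from the current world state and an incoming intent (illustrative sketch)."""

    def __init__(self, world_state: dict):
        self.world_state = world_state
        # Each rule maps an intent name to a handler that inspects the state
        # and returns the action to execute.
        self.rules = {
            "turn_on_lights": self._turn_on_lights,
        }

    def plan(self, intent: Intent) -> dict:
        handler = self.rules.get(intent.name)
        if handler is None:
            # No rule matches: hand the turn back to the dialog system.
            return {"action": "clarify_request"}
        return handler(intent)

    def _turn_on_lights(self, intent: Intent) -> dict:
        room = intent.slots.get("room", "living_room")
        if self.world_state.get(f"{room}_lights") == "on":
            return {"action": "inform", "text": f"The {room} lights are already on."}
        return {"action": "set_light", "room": room, "value": "on"}


if __name__ == "__main__":
    planner = BehaviorPlanner(world_state={"living_room_lights": "off"})
    # The dialog system has interpreted "please turn on the lights" as this intent:
    print(planner.plan(Intent("turn_on_lights", {"room": "living_room"})))
    # -> {'action': 'set_light', 'room': 'living_room', 'value': 'on'}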
See also:
Behavior Realization & Artificial Intelligence 2019 | Behavior Realizer & Artificial Intelligence
Design and development of the Slovak multimodal dialogue system with the BML Realizer Elckerlyc
S Ondas, J Juhar – 2012 IEEE 3rd International Conference on …, 2012 – ieeexplore.ieee.org
… TTS MM Fusion & Analysis Behaviour planner Realizers Distributed Dialogue Manager … One of the considerable reasons was also a large number of spoken dialogue systems based on aforementioned framework, eg Olympus [17], the multimodal dialogue system for map …
Autonomous Robotic Dialogue System with Reinforcement Learning for Elderlies with Dementia
J Magyar, M Kobayashi, S Nishio… – … on Systems, Man …, 2019 – ieeexplore.ieee.org
… Many autonomous dialogue systems have been developed, some even targeted to solve the lack of … the number of possible states, and so better prepare the dialogue system for a … Finally, the robot’s behavior planner selects an appropriate response which is then performed by …
SiAM-dp: an open development platform for massively multimodal dialogue systems in cyber-physical environments.
R Neßelrath – 2016 – pdfs.semanticscholar.org
… Other extensive projects deploy well elaborated platforms for multimodal dialogue systems … Ontology Based SiAM-dp is fully ontology based and uses a single domain adaptable knowledge representation throughout the complete dialogue system …
Believable Virtual Characters in Human-Computer Dialogs.
Y Jung, A Kuijper, DW Fellner, M Kipp… – Eurographics …, 2011 – michaelkipp.de
… 3.1.1. Albeit the architectural design of dialog systems as well as accompanying topics like natural language processing, discourse planning … location, whereas lower-level behaviors like producing a gesture or changing posture are in the responsibility of the behavior planner …
A novel unity-based realizer for the realization of conversational behavior on embodied conversational agents
I Mlakar, Z Kacic, M Borko, M Rojc – International Journal of Computers, 2017 – iaras.org
… The CAs range from chat-bots and 2D, cartoon-like implementations of talking heads [2,3,4 … Thus, the integration of behaviour planners and behaviour generators (eg behaviour specification tools) is only natural … Natural language processing implementation on Romanian ChatBot …
Nonverbal behavior in multimodal performances
A Cafaro, C Pelachaud, SC Marsella – The Handbook of Multimodal …, 2019 – dl.acm.org
6 Nonverbal Behavior in Multimodal Performances Angelo Cafaro, Catherine Pelachaud, Stacy C. Marsella 6.1 Introduction The physical, nonverbal behaviors that accompany face-to-face interaction convey a wide variety …
Towards conversational agents that attend to and adapt to communicative user feedback
H Buschmeier, S Kopp – International Workshop on Intelligent Virtual …, 2011 – Springer
… The behaviour planner delays the generation of the 2 MARY TTS – https://mary.dfki … IJCAI 1997 Workshop on Collaboration, Cooperation and Conflict in Dialogue Systems, Nagoya, Japan … Fujie, S., Miyake, R., Kobayashi, T.: Spoken dialogue system using recognition of user’s …
Adaptive grounding and dialogue management for autonomous conversational assistants for elderly users
R Yaghoubzadeh, K Pitsch, S Kopp – International Conference on …, 2015 – Springer
… that parallels to older adults exist in interactions with spoken dialogue systems, such as a … is that the information grounding process in a multimodal spoken dialogue system for people … microphone, and a touch-screen PC running flexdiam, parser, NLU, behavior planner, a 3D …
A novel realizer of conversational behavior for affective and personalized human machine interaction-EVA U-Realizer
I Mlakar, Z Kačič, M Borko… – WSEAS Trans. Environ …, 2018 – academia.edu
… The CAs range from chat-bots and 2D cartoon-like realizations of talking heads [2,3,4,5], which are used primary as the human-like … The coupling of behaviour planners and generators (eg behaviour specification tools) with the animation components is, therefore, only natural …
MPML3D: Scripting agents for the 3D internet
H Prendinger, S Ullrich, A Nakasone… – IEEE Transactions on …, 2010 – ieeexplore.ieee.org
… of BML is to develop a general platform that allows to combine behavior planners and behavior … be desired by the content creator, are very difficult to implement with chatbot technology … Very recently, a new commercial approach of embodied chatbots for SL has been proposed …
A multimodal system for real-time action instruction in motor skill learning
I de Kok, J Hough, F Hülsmann, M Botsch… – Proceedings of the …, 2015 – dl.acm.org
… The hardware system consists of a CAVE environment and a motion capture system; the software components are composed of a rendering engine, motion analysis and dialogue system– see Figure 3 for an overview … AsapRealizer Behaviour Planner …
Virtual agents for professional social skills training: An overview of the state-of-the-art
K Bosman, T Bosse, D Formolo – International Conference on Intelligent …, 2018 – Springer
… Job interviews. Speech. Multi-modal cues. Speech. Sequential behaviour planner based on stance. Virtual-Suspect. Police interrogations. Free text. Text … Textual feedback. Replay session. Level 1. AIML-Chatbot. Facial animations. C# and .NET. Communicate! Communication skills …
A User Perception–Based Approach to Create Smiling Embodied Conversational Agents
M Ochs, C Pelachaud, G Mckeown – ACM Transactions on Interactive …, 2017 – dl.acm.org
… 2012], like the well-known Eliza chatbot [Weizenbaum 1966], with rich nonverbal behavior but poor linguistic competences … The behavior planner transforms these communicative intentions into a set of multimodal signals (eg, speech, gestures, facial expressions) …
Selecting and expressing communicative functions in a saiba-compliant agent framework
A Cafaro, M Bruijnes, J van Waterschoot… – … on Intelligent Virtual …, 2017 – Springer
… This is typically the joint task of a Dialogue Manager (DM) on the one hand (intent planner) and a non-verbal behaviour generation (NVBG) system on the other (behaviour planner) … OpenDial is a toolkit for developing spoken dialogue systems created by Lison [10] …
A Review of the Development of Embodied Presentation Agents and Their Application
T Rist, E André, S Baldes… – Life-Like Characters: Tools …, 2013 – books.google.com
… The behavior planner has the task of decomposing complex discourse goals into basic acts that can be executed by the character player … The interaction manager, in a certain sense, corresponds to a dialogue manager as found in NL dialogue systems since it is responsible …
Fostering user engagement in face-to-face human-agent interactions: a survey
C Clavel, A Cafaro, S Campano… – Toward Robotic Socially …, 2016 – Springer
… Dybala and colleagues [47] proposed a humor-equipped casual conversational system (chatbot) and demonstrated that it enhances the user’s … that produces the communicative intentions and handles the emotional state of the agent; (2) a Behavior Planner that transforms the …
A Human Robot Interaction Toolkit with Heterogeneous Multilevel Multimodal Mixing
B Vijver – 2016 – essay.utwente.nl
… However, such chatbots can usually only communicate with text, making it hard to create a positive social interaction … This can be a dialog system with AsapRealizer, but the MIDI-controller will also be considered as an external control …
Synthesis of listener vocalizations: towards interactive speech synthesis
SC Pammi – 2012 – scidok.sulb.uni-saarland.de
… Short Summary Spoken and multi-modal dialogue systems start to use listener vocalizations, such as uh-huh and mm-hm, for natural interaction. Generation of listener vocalizations is one of the major objectives of emotionally colored …
Situated interaction
D Bohus, E Horvitz – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
… at dialog between computers and people were text-based dialog systems, such as Eliza [Weizenbaum 1966], a pattern-matching chat-bot that emulated a … Most work in spoken dialog systems has traditionally focused on dyadic settings, where a dialog system interacts with a …
Conversational behavior reflecting interpersonal attitudes in small group interactions
B Ravenet, A Cafaro, B Biancardi, M Ochs… – … on Intelligent Virtual …, 2015 – Springer
… a model to manage the turn-taking between a user and a spoken dialog system using data … behavior corresponding to the interpersonal attitudes expressed thanks to the Behavior Planner from [27 … Raux, A., Eskenazi, M.: A finite-state turn-taking model for spoken dialog systems …
Challenge discussion: advancing multimodal dialogue
J Allen, E André, PR Cohen, D Hakkani-Tür… – The Handbook of …, 2019 – dl.acm.org
… How can dialogue systems be built to be robust to dialogue failures, such as failures of understanding, or even simply to changes in the world (eg, the bus … 5.2.2 Chatbot Dialogues Many groups have been building so-called “chatbots” that mimic conversational engagement …
Enabling robust and fluid spoken dialogue with cognitively impaired users
R Yaghoubzadeh, S Kopp – … of the 18th Annual SIGdial Meeting …, 2017 – pub.uni-bielefeld.de
… For a comparison of the present work to existing approaches and implementations of dialogue systems, we will … Behavior Planners, Behavior Realizer observer … The autonomous dialogue system was overseen by an experimenter, who had three options to aid the system in …
Google, Inc.(search)
ACM Books – dl.acm.org
… systems, including virtual assistant architectures offered by major IT firms, chatbots, and machine … the participants agreed on the limitations of end-to-end trained chatbot systems, they … The first is a clinical application—a multimodal dialogue system for annotation and retrieval …
Automotive multimodal human-machine interface
D Schnelle-Walka, S Radomski – The Handbook of Multimodal …, 2019 – dl.acm.org
… Dialogue planner Presentation manager Multimodal dialogue system Driving context … Driving performance Cognitive load Figure 12.3 Components influencing cognitive load and driving performance of an in-car multi-modal dialog system (after [Neßelrath and Feld 2013]) …
Standardized representations and markup languages for multimodal interaction
R Tumuluri, D Dahl, F Paternò… – The Handbook of …, 2019 – dl.acm.org
9 Standardized Representations and Markup Languages for Multimodal Interaction Raj Tumuluri, Deborah Dahl, Fabio Paternò, Massimo Zancanaro 9.1 Introduction This chapter discusses some standard languages …
Deliverable D5.3
P Gebhard – perso.limsi.fr
… for dialogue control while Scenejo relies on the ALICE chatbot technology [ALICE 12 … and potential benefits of combining elements from computer games, dialogue systems and language … subsequent components are needed, such as the GRETA Behavior Planner and GRETA …
An architecture for fluid real-time conversational agents: integrating incremental output generation and input processing
S Kopp, H van Welbergen, R Yaghoubzadeh… – Journal on Multimodal …, 2014 – Springer
… The Behavior Planner can prime the recognition of feedback behaviors (eg nodding, saying “uh huh”) that are expected to complement an … Since incremental input processing has received a significant amount of attention in the spoken dialogue systems community [1, 38], and …
Architectures and Standards for IVAs at the Social Cognitive Systems Group
H van Welbergen, K Bergmann… – … and Standards for …, 2014 – biecoll2.ub.uni-bielefeld.de
… 3. How to share and combine smaller components: Many interesting components for IVAs that are smaller than, eg, a full Behavior Planner have been … His research is on robust multimodal spoken dialog systems to assist older people and people with cognitive impairments …
Generation of communicative intentions for virtual agents in an intelligent virtual environment: application to virtual learning environment
B Nakhal – 2017 – tel.archives-ouvertes.fr
… 3.4.1 Implementation of Behavior Planner and Behavior Realizer interfaces; 3.4.2 Integrating ECA platforms; 4 Application …
I Probe, Therefore I Am: Designing a Virtual Journalist with Human Emotions
KK Bowden, T Nilsson, CP Spencer, K Cengiz… – arXiv preprint arXiv …, 2017 – arxiv.org
… KK Bowden is with the Natural Language and Dialog Systems Lab, University of California, Santa Cruz, USA … were continuously carried out in parallel to the development of the dialogue system in order to … The workings of the behavior planner are worked out in the next section …
Multimodal conversational interaction with robots
G Skantze, J Gustafson, J Beskow – The Handbook of Multimodal …, 2019 – dl.acm.org
… body. Hal only stares at the interlocutor with his (now emblematic) red eye. For a long time, spoken dialogue systems developed in research labs and employed in the industry also lacked any physical embodiment. One reason …
A computational model for the emergence of turn-taking behaviors in user-agent interactions
M Jégou, P Chevaillier – Journal on Multimodal User Interfaces, 2018 – Springer
… that dialog systems often falsely detect user’s utterances as barge-ins, whereas the latter is only providing feedback to the system by producing a vocal backchannel or is not speaking to the system. Several studies have thus explored how a spoken dialog system could reliably …
Sentiment analysis: from opinion mining to human-agent interaction
C Clavel, Z Callejas – IEEE Transactions on affective computing, 2015 – ieeexplore.ieee.org
… Detection and avoidance of user frustration in driving situations [77] or for tutoring systems [78] or for a child conversational computer game [83]. Detection of various emotions according to the application for dialog systems [25], [81], [83] …
All together now
A Hartholt, D Traum, SC Marsella, A Shapiro… – … Workshop on Intelligent …, 2013 – Springer
… NVBG is a rule-based behavior planner designed to operate flexibly with the information it is provided: it generates behaviors given the … K., Gerten, J., Nazarian, A., Traum, D.: FLoReS: A Forward Looking, Reward Seeking, Dialogue Manager, Spoken Dialog Systems (2012) 46 …
Affective Conversational Interfaces
M McTear, Z Callejas, D Griol – The Conversational Interface, 2016 – Springer
… 15.6. Fig. 15.6 The SAIBA model. The intent planner decides the communicative intention, the behavior planner schedules the communicative signals, and finally the behavior realizer realizes the behaviors scheduled to generate the corresponding animation. As shown in Fig …
Commercialization of multimodal systems
PR Cohen, R Tumuluri – The Handbook of Multimodal-Multisensor …, 2019 – dl.acm.org
15 Commercialization of Multimodal Systems Philip R. Cohen, Raj Tumuluri 15.1 Introduction This chapter surveys the broad and accelerating commercial activity in building products incorporating multimodal-multisensor interfaces …
Software platforms and toolkits for building multimodal systems and applications
M Feld, R Neßelrath, T Schwartz – The Handbook of Multimodal …, 2019 – dl.acm.org
… implemented (eg, pro-active assistant, question answering system, troubleshooting chatbot, social bot … Dialogue Platforms are underlying frameworks that are used to execute a dialogue system. Dialogue Systems are software agents that allow users to converse with a machine …
Topic management for an engaging conversational agent
N Glas, C Pelachaud – International Journal of Human-Computer Studies, 2018 – Elsevier
… Macias-Galindo et al. (2012) use a semantic relatedness mechanism to transition between conversational snippets in an ECA that engages in chatty dialogue, and Stede and Schlangen (2004) use an ontology-like topic structure that makes a dialogue system (non-embodied …
Multimodal databases
M Valstar – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
10 Multimodal Databases Michel Valstar 10.1 Introduction In the preceding chapters, we have seen many examples of Multimodal, Multisensor Interfaces (MMIs). Almost all of these interfaces are implemented as computer …
Early integration for movement modeling in latent spaces
R Hornung, N Chen, P van der Smagt – The Handbook of Multimodal …, 2019 – dl.acm.org
8 Early Integration for Movement Modeling in Latent Spaces Rachel Hornung, Nutan Chen, Patrick van der Smagt 8.1 Introduction In this chapter, we will show how techniques of advanced machine and deep learning …
Privacy concerns of multimodal sensor systems
G Friedland, MC Tschantz – The Handbook of Multimodal-Multisensor …, 2019 – dl.acm.org
16 Privacy Concerns of Multimodal Sensor Systems Gerald Friedland, Michael Carl Tschantz 16.1 Introduction This chapter explains that ignoring the privacy risks introduced by multimodal systems could have severe consequences for society in the long term …
Multimodal dialogue processing for machine translation
A Waibel – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
… Field-adaptable and extendable systems. Languages and vocabularies change, and interpreting dialogue systems must evolve alongside such changing languages and vocabularies and adapt to any given dialogue scenario …
Medical and health systems
D Sonntag – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
… Application domains include serious games, conversational agents, or dialogue systems for healthy behavior promotion; intelligent interactive … 11.4.1 Case Study 1: A Multimodal Dialog System In this case study, we present a dialogue system for the annotation and retrieval of …
Ergonomics for the design of multimodal interfaces
A Heloir, F Nunnari, M Bachynskyi – The Handbook of Multimodal …, 2019 – dl.acm.org
Part II: Multimodal Behavior. 7 Ergonomics for the Design of Multimodal Interfaces Alexis Heloir, Fabrizio Nunnari, Myroslav Bachynskyi 7.1 Introduction There are many ways a machine can infer …
Towards reasoned modality selection in an embodied conversation agent
C Ten-Ventura, R Carlini, S Dasiopoulou, GL Tó… – … on Intelligent Virtual …, 2017 – Springer
… content and addressee profile features, it is different from most of the approaches to modality handling in multimodal dialogue systems, which tend to … Our model can be considered as a proposal for an alternative realization of the Behavior Planner in the SAIBA-framework …
Embedded multimodal interfaces in robotics: applications, future trends, and societal implications
EA Kirchner, SH Fairclough, F Kirchner – The Handbook of Multimodal …, 2019 – dl.acm.org
13 Embedded Multimodal Interfaces in Robotics: Applications, Future Trends, and Societal Implications Elsa A. Kirchner, Stephen H. Fairclough, Frank Kirchner 13.1 Introduction In the past, robots were primarily used …
AsapRealizer 2.0: The next steps in fluent behavior realization for ECAs
H Van Welbergen, R Yaghoubzadeh… – … Conference on Intelligent …, 2014 – Springer
… This is a topic we have given a great deal of attention in the development of ECAs and dialogue systems within our two research … for fluent behavior realization with AsapRealizer (see also Table 1). Adaptations of behavior may be steered: 1) by the behavior planner (top-down …
Multimodal integration for interactive conversational systems
M Johnston – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
Part I: Multimodal Language and Dialogue Processing. 1 Multimodal Integration for Interactive Conversational Systems Michael Johnston 1.1 Introduction This chapter discusses the challenges …
Towards a socially adaptive virtual agent
AB Youssef, M Chollet, H Jones, N Sabouret… – … on Intelligent Virtual …, 2015 – Springer
… Acosta and Ward proposed a spoken dialogue system capable of inferring the user’s emotion from vocal features and to adapt its response according to this … Lastly, the behaviour planner selects the appropriate social signals to express the emotion and the attitude of the agent …
Continuous interaction with a virtual human
D Reidsma, I de Kok, D Neiberg, SC Pammi… – Journal on Multimodal …, 2011 – Springer
… dialog processes [51]. This need for continuous interaction is also reflected in the recent developments combining incremental perception and incremental generation into incremental dialog systems [45]. Incremental perception …
The SEMAINE API: A component integration framework for a naturally interacting and emotionally competent Embodied Conversational Agent
M Schröder – 2011 – publikationen.sulb.uni-saarland.de
… Since an ECA-based interactive dialogue system requires multiple input and output processing functionalities, it is typically built from components. Each com … At the most coarse-grained level, three building blocks of dialogue systems are usually distinguished …
Episodic memory model for embodied conversational agents
M Elvir – 2010 – stars.library.ucf.edu
… aspects of chatbots: 1) focus on the Loebner prize, 2) template-based AIML-type bots … 12 techniques, and 3) slow development of reasoning from natural language in dialog systems … interface realism for chatbot systems distinguishes ECAs from ELIZA- or ALICE-based agents …
Response selection and turn-taking for a sensitive artificial listening agent.
M ter Maat – 2011 – Citeseer
… Dialogue systems such as this make it possible to use natural language to interact with … build a fully automatic version of a SAL: a multi-modal dialogue system which focusses … namely the Listener intent planner, the Listener action selector, the Behaviour planner, the Behaviour …
Speech Technologies in Modern HCI Applications
J Juhar, M Pleva – researchgate.net
… TTS MM Fusion & Analysis Behaviour planner Realizers Distributed Dialogue Manager Other input modality … REFERENCES Juhar et al. (2006) Development of Slovak GALAXY VoiceXML based spoken language dialogue system to retrieve information from the Internet. Proc …
Towards adaptive social behavior generation for assistive robots using reinforcement learning
J Hemminghaus, S Kopp – 2017 12th ACM/IEEE International …, 2017 – ieeexplore.ieee.org
… [Figure residue: Behavior Planner / Behavior Selection flow comparing a Random condition (random selection) and an Adaptive condition (selection according to a Q-function) in a Wizard-of-Oz help scenario] …
Greta: Towards an interactive conversational virtual companion
E Bevacqua, K Prepin, R Niewiadomski… – … : perspectives on the …, 2010 – academia.edu
… The control of the behaviours of the agent and of the robot is done via the BML language. The Behavior Planner outputs FAPs for the agent and Aibo commands for the robot … Gustafson, J., Lindberg, N., Lundeberg, M., The August spoken dialog system …
Agent communication for believable human-like interactions between virtual characters
J Van Oijen, F Dignum – … AAMAS Workshop on Cognitive Agents for …, 2012 – Springer
… in this model. To support flexible interactions, the model was implemented as an information state-based dialogue system, inspired by the theory in [17] … Behavior Planner. Realizes a communicative intent scheduled by the Dialogue Model …
Modeling and simulating empathic behavior in social assistive robots
B De Carolis, S Ferilli, G Palestra… – Proceedings of the 11th …, 2015 – dl.acm.org
… the most appropriate to the emotion felt by the agent, the behavior planner module computes the agent behavior using plans … and research workshop on Perception and Interactive Technologies for Speech-Based Systems: Perception in Multimodal Dialogue Systems (PIT ’08 …
Emotion and attitude modeling for non-player characters
B Ravenet, F Pecune, M Chollet, C Pelachaud – Emotion in Games, 2016 – Springer
… The Sequential Behavior Planner takes as input an utterance to be said by the agent augmented with information on the communicative intentions and its attitude it wants to … In: van Kuppevelt J, Dybkjær L, Bernsen NO (eds) Advances in natural multimodal dialogue systems …
20 Body Movements Generation for Virtual Characters and Social Robots
A Beck, Z Yumak… – Social Signal …, 2017 – books.google.com
… BML has been used in various embodied conversational agent projects as well as in various behaviour planners, behaviour realizers … Spoken and multimodal dialog systems and applications–rigid head motion in expressive speech animation: Analysis and synthesis …
From Annotation to Multimodal Behavior
K Jokinen, C Pelachaud – Coverbal Synchrony in Human …, 2013 – books.google.com
… The next module called Behavior Planner instantiates … [Figure 1: SAIBA framework] …
Virtual Learning Environments and Intelligent Tutoring Systems-Survey of current approaches and design methodologies
K Muñoz, P Mc Kevitt, T Lunney… – Review Paper available …, 2013 – academia.edu
REVIEW PAPER Virtual Learning Environments and Intelligent Tutoring Systems Survey of current approaches and design methodologies Karla Muñoz • Paul Mc Kevitt • Tom Lunney • Intelligent Systems Research Centre …
Nonverbal Behavior in Multimodal Performances
A Cafaro, C Pelachaud… – The Handbook of …, 2019 – books.google.com
… Flipper is a library for specifying dialogue rules for dialogue systems, that uses XML-templates to describe the preconditions, effects and BML … Adaptation can be steered (1) by the behavior planner (top-down), for example when requesting the ECA to speak louder, (2) by the …
Towards an empathic social robot for ambient assisted living.
BN De Carolis, S Ferilli, G Palestra… – ESSEM@ AAMAS, 2015 – researchgate.net
… a goal has been selected as the most appropriate to the emotion felt by the agent, the behavior planner module computes the … and research workshop on Perception and Interactive Technologies for Speech-Based Systems: Perception in Multimodal Dialogue Systems (PIT ’08 …
Deliverable: First periodic report
J van Loon, H op den Akker, T Beinema, M Broekhuis… – 2019 – council-of-coaches.eu
… Open Agent Platform – an open source platform that will allow researchers and developers to create their own multi-agent dialogue systems for various … Not all doctors will have this time available, but fortunately we are developing a dialogue system with which the end user can …
The SEMAINE API: towards a standards-based framework for building emotion-oriented systems
M Schröder – Advances in human-computer interaction, 2010 – hindawi.com
… The project aims to build a multimodal dialogue system with an emphasis on nonverbal skills—detecting and emitting vocal and facial signs related to the interaction, such as backchannel signals, in order to register and express information such as continued presence …
Computational model of listener behavior for embodied conversational agents
E Bevacqua – 2010 – books.google.com
… 6.4 Architecture; 6.4.1 Listener Intent Planner; 6.4.2 Behavior Planner; 6.4.3 Behavior Realizer; 6.4.4 FAP-BAP Player …
Embodied conversational characters: Representation formats for multimodal communicative behaviours
B Krenn, C Pelachaud, H Pirker, C Peters – Emotion-Oriented Systems, 2011 – Springer
… The behaviour planner takes as input a given intention and/or emotional state, for instance to greet the communication partner happily, and outputs representations for the visual and acoustic signals to be generated by the behaviour realisation modules, such as a waving hand …
Modeling nonverbal behaviors for virtual agents
J Lee – 2012 – search.proquest.com
… 2. Chapter 3 describes the literature based approach for modeling nonverbal behaviors for virtual agents and reviews virtual human systems utilizing the Nonverbal Behavior Generator (Lee & Marsella, 2006), the behavior planner framework developed through this work …
Simulating empathic behavior in a social assistive robot
B De Carolis, S Ferilli, G Palestra – Multimedia Tools and Applications, 2017 – Springer
… be triggered. Once a goal has been selected as the most appropriate to the emotion felt by the agent, the behavior planner module computes the robot’s behavior using plans represented as context-adapted recipes. Each plan …
Closing the Modelling Gap: Transfer Learning from a Low-Fidelity Simulator for Autonomous Driving
A Balakrishnan – 2020 – uwspace.uwaterloo.ca
… The behaviour planner needs to take into consideration not only the obstacles in the local environment but also the traffic rules, road features and the AV’s state to make the optimal and, most importantly, safe decision …
D4. 7 1st Expressive Virtual Characters
F Yang, C Peters – 2016 – prosociallearn.eu
… interaction. The SEMAINE project built a Sensitive Artificial Listener (SAL) and a multimodal dialogue system which can react to the user’s verbal and non-verbal behavior and sustain the interaction for a long time. Greta …
Gestures become more informative when communication is unsuccessful
MW Hoetjes – 2016 – repository.ubn.ru.nl
… http://hdl.handle.net/2066/162512 …
Building Effective Robot Tutoring Interactions for Children
A Ramachandran – 2018 – search.proquest.com
Building Effective Robot Tutoring Interactions for Children. Abstract. Socially assistive robots have the potential to assist people in a variety of social and cognitive tasks, often taking the role of a coach or tutor to support users over time …