Behavior Realizers


Notes:

A behavior realizer (also spelled behaviour realiser) is a behavior generation component, typically driven by the Behavior Markup Language (BML). Its task is to realize the behaviors scheduled by the behavior planner: it takes the planner's BML behavior specifications as a blueprint and transforms them into the overt, concrete behavior of an embodied conversational agent. A behavior realizer may also execute communicative actions for realization in a game engine.
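The BML input to a realizer is an XML document listing behaviors with synchronization constraints. As a minimal, illustrative sketch (the element names and the `s1:start` sync-point syntax follow BML 1.0, but the namespace URI and this toy parser are assumptions, not taken from any particular realizer):

```python
import xml.etree.ElementTree as ET

# Illustrative BML 1.0-style fragment: a speech act plus a gesture
# whose start is synchronized to the start of the speech.
BML = """
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="bml1">
  <speech id="s1"><text>Hello there!</text></speech>
  <gesture id="g1" lexeme="BEAT" start="s1:start"/>
</bml>
"""

NS = "{http://www.bml-initiative.org/bml/bml-1.0}"

def parse_behaviors(bml_string):
    """Extract (behavior type, id, start constraint) for each behavior element."""
    root = ET.fromstring(bml_string)
    behaviors = []
    for elem in root:  # direct children of <bml> are the behaviors
        tag = elem.tag.replace(NS, "")  # strip the XML namespace prefix
        behaviors.append((tag, elem.get("id"), elem.get("start")))
    return behaviors

print(parse_behaviors(BML))
```

A real realizer would resolve the `s1:start` constraint into concrete timestamps and drive the animation and speech engines; this sketch only shows the shape of the input.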

The architecture of an agent may include six basic blocks:

  1. Human Behavior Detector
  2. Human Behavior Interpreter
  3. Attention Tracker
  4. Intention Planner
  5. Behavior Planner
  6. Behavior Realizer

  • Behavior Realizer Module
  • Behavior Planner
  • Behaviour Realiser
  • BML (Behavior Markup Language)
  • BMLR (Behavior Markup Language Realizer)
  • FAP (Facial Animation Parameters)
  • SAIBA Framework
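A minimal sketch of how the last three blocks hand off to one another, in the SAIBA style (intent, then plan, then realization). All function names, dictionary fields, and behaviors here are hypothetical, for illustration only:

```python
def intention_planner(user_input):
    """Decide what to communicate (an FML-like communicative intent)."""
    return {"intent": "greet", "addressee": "user"}

def behavior_planner(intent):
    """Map an intent onto scheduled multimodal behaviors (a BML-like plan)."""
    plans = {
        "greet": [
            {"type": "speech", "text": "Hello!", "start": 0.0},
            {"type": "gesture", "lexeme": "WAVE", "start": 0.0},
        ]
    }
    return plans.get(intent["intent"], [])

def behavior_realizer(plan):
    """Turn the scheduled plan into concrete, time-ordered actions
    (here just command strings; a real realizer drives animation/TTS)."""
    return [f"{b['start']:.1f}s: {b['type']}({b.get('text') or b.get('lexeme')})"
            for b in sorted(plan, key=lambda b: b["start"])]

commands = behavior_realizer(behavior_planner(intention_planner("hi")))
print(commands)  # ['0.0s: speech(Hello!)', '0.0s: gesture(WAVE)']
```

The feedback channels that SAIBA realizers send back to the planner (progress, interruption) are omitted for brevity.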

Resources:

See also:

BML (Behavior Markup Language) & Dialog Systems
Linguistic Realizers | Psyclone AIOS (Artificial Intelligence Operating System)
Realizers In Natural Language Processing


Greta: an interactive expressive ECA system R Niewiadomski, E Bevacqua, M Mancini… – Proceedings of The 8th …, 2009 – dl.acm.org … These signals are sent to the Behavior Realizer that generates the MPEG4 FAP-BAP files. Finally, the animation is played in the FAP-BAP Player. … The communicative intentions are written in FML-APML language and sent to the Behavior Realizer module via Psyclone. … Cited by 62 Related articles All 13 versions

An incremental multimodal realizer for behavior co-articulation and coordination H van Welbergen, D Reidsma, S Kopp – Intelligent virtual agents, 2012 – Springer … We present AsapRealizer, a BML 1.0 behavior realizer that achieves these capabilities by building upon, and extending, two state of the art existing realizers, as the result of a collaboration between two research groups. 1 Introduction … Cited by 14 Related articles All 13 versions

EMBR–A realtime animation engine for interactive embodied agents A Heloir, M Kipp – Intelligent Virtual Agents, 2009 – Springer … On the other hand, the concepts of this layer can be used as building blocks to formally describe behaviors on the next higher level (BML). To sum up, the main contributions of this paper are: – Introducing a new, free behavior realizer for embodied agents – Presenting a … Cited by 43 Related articles All 11 versions

Greta: Towards an interactive conversational virtual companion E Bevacqua, K Prepin, R Niewiadomski… – … : perspectives on the …, 2010 – researchgate.net … defined). Finally the task of the third element of the SAIBA framework, Behavior Realizer, is to realize the behaviours scheduled by the Behavior Planner. … time. It corresponds to the Behavior Realizer module of the SAIBA architecture. … Cited by 13 Related articles

Robot behavior toolkit: generating effective social behaviors for robots CM Huang, B Mutlu – Proceedings of the seventh annual ACM/IEEE …, 2012 – dl.acm.org … These decisions are expressed in the behavior realizer, which is not a part of the Toolkit. … Behavior realizer is a ROS node that interprets the behavior output generated by the Toolkit and sends the commands necessary to execute the behavior to the robot. … Cited by 28 Related articles All 5 versions

Continuous interaction within the SAIBA framework J Zwiers, H van Welbergen, D Reidsma – Intelligent virtual agents, 2011 – Springer … based languages BML and FML [5,4,12]. SAIBA Intent Planner SAIBA Behavior Planner Behavior Realizer (Elckerlyc) FML BML feedback feedback Fig. 1. The abstract SAIBA architecture There is a sound theory behind this … Cited by 8 Related articles All 13 versions

Cross-media agent platform R Niewiadomski, M Obaid, E Bevacqua… – Proceedings of the 16th …, 2011 – dl.acm.org … 2007]. Finally the task of the third element of the SAIBA framework, Behavior Realizer, is to realize the behaviors scheduled by the Behavior Planner. … Elckerlyc [van Welbergen et al. 2010] is a modular and extensible Behavior Realizer following the SAIBA framework. … Cited by 9 Related articles All 7 versions

Generating co-speech gestures for the humanoid robot NAO through BML QA Le, C Pelachaud – Gesture and Sign Language in Human-Computer …, 2012 – Springer … And the third module, Behavior Realizer, synchronizes and realizes the planned behaviors. Fig. … The Behavior Realizer module has been developed to create the animation for both agents with different behavior capabilities. Figure 2 presents an overview of our system. … Cited by 8 Related articles All 5 versions

A BML based embodied conversational agent for a personality detection program GS Méndez, D Reidsma – Intelligent Virtual Agents, 2011 – Springer … The project uses the Elckerlyc platform, a Behavior Markup Language (BML) compliant behavior realizer for ECAs [1]. BML allows the design and implementation of verbal and nonverbal behaviors in an abstract way, without reference to the ECA model in question. … Cited by 3 Related articles All 10 versions

Real-time animation of interactive agents: Specification and realization A Heloir, M Kipp – Applied Artificial Intelligence, 2010 – Taylor & Francis … In this article we present a new realizer called EMBR (embodied agents behavior realizer; see http://embots.dfki.de/EMBR) and its control language EMBRScript. … Introducing a new, free behavior realizer for embodied agents. … Cited by 27 Related articles All 4 versions

A demonstration of audiovisual sensitive artificial listeners M Schroder, E Bevacqua, F Eyben… – … , 2009. ACII 2009. …, 2009 – ieeexplore.ieee.org … function. Behavioural representations are then processed by speech synthesis and a visual behaviour realiser, and rendered by a player. Details on the technological setup are available in [4]. 4. Outline of a SAL session After … Cited by 17 Related articles All 40 versions

A demonstration of continuous interaction with elckerlyc H van Welbergen, D Reidsma, J Zwiers – 2010 – eprints.eemcs.utwente.nl … human. We show how such small adjustments can be specified and how we implemented these behaviors in our behavior realizer Elckerlyc. We intend to demonstrate our implementation in the demo session of the workshop. … Cited by 4 Related articles All 15 versions

A multimodal scheduler for synchronized humanoid robot gesture and speech M Salem, S Kopp, I Wachsmuth… – 9th International Gesture …, 2011 – pub.uni-bielefeld.de … In particular, we build on the Articulated Communicator Engine (ACE), which is one of the most sophisticated multimodal schedulers and behavior realizers in that it replaces the use of lexicons of canned behaviors with a real-time production of flexibly planned behavior … Cited by 4 Related articles All 4 versions

The SEMAINE API: towards a standards-based framework for building emotion-oriented systems M Schröder – Advances in human-computer interaction, 2010 – dl.acm.org … Advances in Human-Computer Interaction, Volume 2010, Article ID 319406, 21 pages, doi:10.1155/2010/319406 … Cited by 68 Related articles All 8 versions

Design and implementation of an expressive gesture model for a humanoid robot QA Le, S Hanoune, C Pelachaud – … Robots (Humanoids), 2011 …, 2011 – ieeexplore.ieee.org … of three separated modules: (i) the first module, Intent Planning, defines the communicative intents to be conveyed; (ii) the second, Behavior Planning, plans the corresponding multimodal behaviors to be realized; (iii) and the third module, Behavior Realizer, synchronizes and … Cited by 9 Related articles All 6 versions

Expressive Gestures Displayed by a Humanoid Robot during a Storytelling Application C Pelachaud, R Gelin… – New Frontiers in …, 2010 – perso.telecom-paristech.fr … Behavior Planner decides and schedules which verbal and nonverbal behaviors to use to convey these intentions and emotional states Behavior Realizer computes the animation to be visualized by the agent A gestuary is a repository of nonverbal behaviors. … Cited by 8 Related articles

Elckerlyc goes mobile enabling technology for ECAs in mobile applications R Klaassen, J Hendrix, D Reidsma – UBICOMM 2012, The Sixth …, 2012 – thinkmind.org … The Elckerlyc platform has been described and compared with other BML behaviour realizers (for example EMBR [7] and Greta [8]) in various papers [9], [10], [4]. Dependent on the application and task that the intelligent system has, the virtual human presents for example the … Cited by 7 Related articles All 5 versions

Multimodal backchannels for embodied conversational agents E Bevacqua, S Pammi, SJ Hyniewska… – Intelligent Virtual …, 2010 – Springer … request. Finally, the agent behaviour is realized by the Behaviour Realizer module and rendered by a 3D character player. 5 Evaluation Description We performed an evaluation study to analyze multimodal backchannels. To … Cited by 21 Related articles All 12 versions

Expressive body animation pipeline for virtual agent J Huang, C Pelachaud – Intelligent Virtual Agents, 2012 – Springer … Sequences of time-marked BML-like signals are instantiated within the Behavior Realizer. Fig. … In the Behavior Realizer module, the parts with a star correspond to our work in the animation pipeline. 3.1 Overview of Our Pipeline … Cited by 6 Related articles All 6 versions

Thalamus: Closing the mind-body loop in interactive embodied characters T Ribeiro, M Vala, A Paiva – Intelligent virtual agents, 2012 – Springer … the virtual character. They separate the Behavior Realizer in two sub-layers: Keyframe Generator which is common for both agents, and Animation Generator, which is specific to the embodiment. Kipp et al. have also proposed … Cited by 7 Related articles All 4 versions

Incremental dialogue understanding and feedback for multiparty, multimodal conversation D Traum, D DeVault, J Lee, Z Wang, S Marsella – Intelligent Virtual Agents, 2012 – Springer … topic of conversation. It supports both expressive and evocative feedback for a variety of conversational roles and goals. It has been implemented, and connected to the behavior realizer developed by [35]. Future work includes … Cited by 22 Related articles All 10 versions

A friendly gesture: Investigating the effect of multimodal robot behavior in human-robot interaction M Salem, K Rohlfing, S Kopp, F Joublin – RO-MAN, 2011 IEEE, 2011 – ieeexplore.ieee.org … In particular, we build on the Articulated Communicator Engine (ACE), which is one of the most sophisticated multimodal schedulers and behavior realizers by replacing the use of lexicons of canned behaviors with an on-line production of flexibly planned behavior … Cited by 18 Related articles All 3 versions

Expressive gesture model for humanoid robot C Pelachaud – Affective Computing and Intelligent Interaction, 2011 – Springer … The existing behavior planner module of the GRETA system remains unchanged. On the other hand, a new behavior realizer module and a gestural database have been built to compute and realize the animation of the robot and of the virtual agent respectively. … Cited by 2 Related articles All 3 versions

Providing gender to embodied conversational agents M Vala, G Blanco, A Paiva – Intelligent Virtual Agents, 2011 – Springer … The proposed system was developed around SAIBA Framework using SmartBody as the behavior realizer and tries to address this problem by adding a set of involuntary gender specific movements to the agents behaviour in an automatic manner. … Cited by 2 Related articles All 5 versions

Communicating emotional states with the Greta agent R Niewiadomski, M Mancini, S Hyniewska… – A Blueprint for …, 2010 – books.google.com … It specifies the verbal and non-verbal behaviours of ECAs (Vilhjálmsson et al. 2007). Finally, the task of the third element of the SAIBA framework, Behaviour Realizer, is to realize the behaviours scheduled by the Behaviour Planner. … Cited by 1 Related articles

A computational model of social attitude effects on the nonverbal behavior for a relational agent B Ravenet, M Ochs, C Pelachaud – Proc. of Workshop Affect …, 2012 – magalie.ochs.free.fr … Finally, the Behavior Planner module generates a set of gestures to perform depending on the communicative intentions and the probabilities of the associated signals. Then, this set is transferred to the Social Expressivity Modulator, instead of the Behavior Realizer. … Cited by 3 Related articles All 2 versions

AsapRealizer 2.0: The next steps in fluent behavior realization for ECAs H van Welbergen, R Yaghoubzadeh, S Kopp – Intelligent Virtual Agents, 2014 – Springer … We present the conglomeration of our research efforts in enabling the realization of such fluent interactions for Embodied Conversational Agents in the behavior realizer ‘AsapRealizer 2.0’ and show how it provides fluent realization capabilities that go beyond the state-of-the-art … Cited by 1

A controller-based animation system for synchronizing and realizing human-like conversational behaviors A Čereković, T Pejša, IS Pandžić – Development of Multimodal Interfaces: …, 2010 – Springer … In this paper we present an approach to implement a behavior realizer compatible with BML language. The system’s architecture is based on hierarchical controllers which apply preprocessed behaviors to body modalities. … Cited by 2 Related articles All 10 versions

AsapRealizer in practice–A modular and extensible architecture for a BML Realizer D Reidsma, H van Welbergen – Entertainment Computing, 2013 – Elsevier … platform itself. This last point is crucial, and will be worked out in more detail in the next chapter. AsapRealizer, successor to Elckerlyc and ACE, is a state-of-the-art Behavior Realizer for virtual humans. Elsewhere, we described … Cited by 1 Related articles All 7 versions

Designing Appropriate Feedback for Virtual Agents and Robots. M Lohse, H van Welbergen – Position paper at RO-MAN 2012 …, 2012 – pub.uni-bielefeld.de … Intent Planner Behavior Planner Behavior Realizer FML BML feedback feedback Fig. 2. The SAIBA architecture C. Synchronization within Systems One main challenge that has been addressed with BML is synchronization among behaviors. … Cited by 4 Related articles All 5 versions

Task-oriented conversational behavior of agents for collaboration in human-agent teamwork M Barange, A Kabil, C De Keukelaere… – Advances in Practical …, 2014 – Springer … Decision Making Dialogue Management Belief Revision Behavior Realiser Perception … team members. The behavior realiser module is responsible for the execution of actions and for the turn taking behavior of the agent. 3.1 Knowledge Organisation and Processing … Cited by 1 Related articles All 2 versions

A Common Gesture and Speech Production Framework for Virtual and Physical Agents QA Le, J Huang, C Pelachaud – Proceedings of 14th ACM …, 2012 – robotics.usc.edu … intents that the agent aims to communicate to the users such as emotional states, beliefs or goals; (ii) the second, Behavior Planner, selects and plans the corresponding multi-modal behavior to be realized; (iii) and the third module, Behavior Realizer, synchronizes and … Cited by 1 Related articles All 4 versions

Communicative capabilities of agents for the collaboration in a human-agent team M Barange, A Kabil, C De Keukelaere… – ACHI 2014, The …, 2014 – thinkmind.org … Dialogue Management Belief Revision Behavior Realiser Perception Information State (Context model) Semantic Knowledge … The behavior realiser module is responsible for the execution of actions and the turn taking behavior of the agent. A. Knowledge Organisation … Cited by 1 Related articles

Generating finely synchronized gesture and speech for humanoid robots: a closed-loop approach M Salem, S Kopp, F Joublin – Proceedings of the 8th ACM/IEEE …, 2013 – dl.acm.org … II. CONCEPT OVERVIEW AND IMPLEMENTATION In our previous work we presented a system that employs a virtual agent framework with its original scheduler as an underlying behavior realizer for the generation of multimodal robot behavior [6]. In the work described in this … Cited by 3 Related articles All 3 versions

The C2BDI Agent Architecture for Teamwork Coordination Using Spoken Dialogues between Virtual Agents and Users M Barange, A Kabil, P Chevaillier – Advances in Practical Applications of …, 2014 – Springer … Decision Making Dialogue Management Belief Revision Behavior Realiser Perception Information State (Context model) Semantic Knowledge Perception Memory Knowledge Base Dialogue Semantic Cognitive Perceptual … Cited by 1 Related articles All 2 versions

Incremental, Adaptive and Interruptive Speech Realization for Fluent Conversation with ECAs H van Welbergen, T Baumann, S Kopp… – Intelligent Virtual …, 2013 – Springer … Our architecture serves 1) as a platform for those experimenting with the ‘best’ way to deploy fluent behavior realization strategies or those researching social effects of certain deployment strategies and 2) as a building block (specifically the Behavior Realizer) in an ECA … Related articles All 7 versions

A Real-Time Architecture for Embodied Conversational Agents: Beyond Turn-Taking B Nooraei, C Rich, C Sidner – ACHI 2014, The Seventh International …, 2014 – thinkmind.org … When the resource arbitrator starts a new behavior realizer that requires the focus stack, it pushes the goal associated with the proposing schema onto the stack (unless it is already there). When the behavior realizer finishes, the goal is automatically popped off the stack. … Cited by 1 Related articles All 5 versions

I’m the mayor: a robot tutor in enercities-2 T Ribeiro, A Pereira, A Deshmukh, R Aylett… – Proceedings of the 2014 …, 2014 – dl.acm.org … The BML is sent to a Behaviour Realizer, which performs the behaviour on the specific characters. Within Thalamus, different modules can produce BML code that is then scheduled and executed on the robotic tutor using a robot-specific realizer. 1 http://unity3d.com/ … Cited by 3 Related articles All 4 versions

Multimodal Behaviour Generation Frameworks in Virtual Heritage Applications: A Virtual Museum at Sverresborg MJ Stokes – 2009 – diva-portal.org … Figure 2: A MagiCster virtual agent displays different emotions in response to APML 2.1.4 SmartBody and BML Realiser SmartBody is an open-source character animation system developed to serve the “behaviour realiser” role in the SAIBA framework. … Related articles All 5 versions

Suggestions for Extending SAIBA with the VIB Platform F Pecune, A Cafaro, M Chollet, P Philippe… – perso.telecom-paristech.fr … formats respectively. The Behavior Realizer outputs are keyframes performers used to compute the real-time animation of the agent’s body and face. An external player (ie 3D engine) can be plugged into the system. This supports …

We Never Stop Behaving: The Challenge of Specifying and Integrating Continuous Behavior HH Vilhjálmsson, EI Björgvinsson, HE Helgadóttir… – ru.is … occurs in any detail. Furthermore, complex motion engines can be valuable assets and it seems that the SAIBA community could work on interfaces that support their migration between behavior realizers. Being able to shop …

SIMONA–the Slovak embodied conversational agent S Ondas, J Juhar, M Trnka – Intelligent Decision Technologies, 2014 – IOS Press … It was created as an extension of a multimodal dialogue system with the BML realizer Elckerlyc as an embodied agent. Elckerlyc is a BML compliant behavior realizer developed by the Human Media Interaction group at the University of Twente. …

The Influence of Emotion on Interpersonal Relations during Bad News Conversations B van Straalen, D Heylen – lorentzcenter.nl … This holds for the Graphical User Interface (GUI) component and the Behavior realizer component. … The selected behavior is sent, together with the (possibly) updated affect states to the Behavior realizer module. In our agent system the behavior realizer module is Elckerlyc [12]. … Related articles

Design and development of the Slovak multimodal dialogue system with the BML Realizer Elckerlyc S Ondas, J Juhar – … , 2012 IEEE 3rd International Conference on, 2012 – ieeexplore.ieee.org … Elckerlyc is a BML compliant behavior realizer for generating multimodal verbal and nonverbal behavior for virtual agents [21]. … Related articles

Architectures and Standards for IVAs at the Social Cognitive Systems Group H van Welbergen, K Bergmann… – Proceedings of the …, 2014 – techfak.uni-bielefeld.de … To satisfy our requirements on fluent behavior realization … Behavior Realizer (AsapRealizer) Behavior Planner Intent Planner Sensor Processor Behavior Interpreter (fast) Function Interpreter (slow) BML BML feedback FML FML feedback PML ? functionLayer …

Towards mobile embodied 3d avatar as telepresence vehicle Y Tokuda, A Hiyama, T Miura, T Tanikawa… – Universal Access in …, 2013 – Springer … Behavior planner integrates all necessary information to make a schedule and logically design blue prints to realize the user’s intention. Finally, behavior realizer makes a concrete action based on the blue prints given by behavior planner. … Related articles All 2 versions

An architecture for fluid real-time conversational agents: integrating incremental output generation and input processing S Kopp, H van Welbergen, R Yaghoubzadeh… – Journal on Multimodal …, 2013 – Springer … of propositions. A Behavior Planner translates intentions into surface behaviors specified in BML. The Behavior Realizer takes BML behavior specifications and transforms them into overt behavior of an ECA. As the behavior … Cited by 3 Related articles All 5 versions

How emotions affect relations between interlocutors in bad news conversations B van Straalen, D Heylen – Communications Control and …, 2012 – ieeexplore.ieee.org … This holds for the Graphical User Interface (GUI) component and the Behavior realizer component. … The selected behavior is sent, together with the (possibly) updated affect states to the Behavior realizer module. In our agent system the behavior realizer module is Elckerlyc [13]. … Related articles

Elckerlyc on Android: A Lightweight Embodiment J Hendrix – hmi.ewi.utwente.nl … by a virtual human. It was developed as part of the SAIBA framework [6], and is used in a number of behavior realizers such as SmartBody [5], EMBR [2], Greta [4] and of course the aforementioned Elckerlyc. BML Realizers … Related articles All 3 versions

Elckerlyc goes mobile-Enabling natural interaction in mobile user interfaces R Klaassen, J Hendrix, D Reidsma, R Akker… – … journal on advances …, 2013 – doc.utwente.nl … The Elckerlyc platform is a BML realizer for real-time generation of behaviours of virtual humans (VHs). The Elckerlyc platform has been described and compared with other BML behaviour realizers (for example EMBR [15] and Greta [16]) in various papers [6][17][18]. … Cited by 1 Related articles All 11 versions

[BOOK] Computational model of listener behavior for embodied conversational agents E Bevacqua – 2010 – books.google.com … 72 6.4.1 Listener Intent Planner . . . . . 73 6.4.2 Behavior Planner . . . . . 75 6.4.3 Behavior Realizer . . . . . 77 6.4.4 FAP-BAP Player . . . . . 78 6.5 Conclusion . . . . . … Cited by 9 Related articles All 4 versions

Editorial: New modalities for interactive entertainment G Volpe, D Reidsma, A Camurri, A Nijholt – 2013 – eprints.eemcs.utwente.nl … The first one introduces AsapRealizer, a novel platform for controlling virtual humans. It combines and extends previous Behaviour Realizers, and can be adapted to suit the needs of a particular application in broader scenarios. … Related articles All 3 versions

CIGA: A middleware for intelligent agents in virtual environments J van Oijen, L Vanhée, F Dignum – Agents for Educational Games and …, 2012 – Springer … Agent Acting. Agent behavior is performed using a Behavior Realizer provided for each agent participating within CIGA. … They correspond to the actions implemented in the GE Interface layer described previously and are executed by the Behavior Realizer. … Cited by 17 Related articles All 7 versions

Multimodal plan representation for adaptable BML scheduling H van Welbergen, D Reidsma, J Zwiers – Autonomous agents and multi- …, 2013 – Springer … 1 We use this “constraint problem” view to develop a novel flexible motor plan representation, but its value is actually broader than that. It can also contribute to clarifying standards for multimodal behavior generation, validating behavior realizers and/or scripts, etcetera. … Cited by 2 Related articles All 13 versions

Presentation Differences: Does an ECA Improve Appreciation and Quality of Task Performance? M Vlot-van Kralingen – 2012 – referaat.cs.utwente.nl … The ECA that was used is Armandia (see Figure 1); Armandia is a 3D ECA that runs on the Elckerlyc platform [10]. Elckerlyc is a BML compliant behavior realizer for generating multimodal verbal and nonverbal behavior for Virtual Humans. … Related articles

[BOOK] Behavior generation for interpersonal coordination with virtual humans: on specifying, scheduling and realizing multimodal virtual human behavior H Welbergen – 2011 – doc.utwente.nl … PhD dissertation committee: … Related articles All 5 versions

Agent communication for believable human-like interactions between virtual characters J van Oijen, F Dignum – Cognitive Agents for Virtual Environments, 2013 – Springer … Last, the Behavior Realizer executes communicative actions for realization in the game engine. At each stage, feedback information about the progress of a realization is sent to previous stages allowing an agent to monitor the execution of its intent. … Behavior Realizer. … Cited by 6 Related articles All 11 versions

A computational model of social attitudes for a virtual recruiter Z Callejas, B Ravenet, M Ochs… – Proceedings of the 2014 …, 2014 – dl.acm.org … Finally, the Behavior Realizer and the Text-To-Speech (TTS) engine display the animation of the agent. … Finally, the resulting selection of behaviors and expressivity parameters is sent through BML (Behavior Markup Language) to the Behavior Realizer. … Related articles All 2 versions

Avlaughtercycle J Urbain, R Niewiadomski, E Bevacqua, T Dutoit… – Journal on Multimodal …, 2010 – Springer … Greta is a complex architecture composed of several modules (ie Intent Planner, Behavior Planner, Behavior Realizer, Player; see [16] for details) that uses Fig. 1 Greta, the 3D humanoid agent used in AVLaughterCycle, laughing … Cited by 23 Related articles All 7 versions

Eye, Lip And Crying Expression For Virtual Human MC Prasetyahadi, IR Ali, AH Basori, N Saari – 2013 – ijidm.org … al. [2010] presented an approach to implement a behavior realizer compatible with Behavior Markup Language (BML), the system based on hierarchical controllers which apply preprocessed behaviors to body modalities. Then … Cited by 1 Related articles

Hybrid control for embodied agents applications J Miksatko, M Kipp – KI 2009: Advances in Artificial Intelligence, 2009 – Springer … avatar engines can be connected. Currently, we use both a commercial engine from Charamel1 and our own research prototype engine called EMBR (Embodied Agents Behavior Realizer) [13]. In general, the underlying framework … Cited by 2 Related articles All 8 versions

Expressing social attitudes in virtual agents for social training games N Sabouret, H Jones, M Ochs, M Chollet… – arXiv preprint arXiv: …, 2014 – arxiv.org … Moreover, the Behaviour Planner has been extended to give the capability to the agent to display different social attitudes (Section 6). Finally, the Behaviour Realiser outputs for each of these signals the animation parameters. … Cited by 1 Related articles All 6 versions

A model to generate adaptive multimodal job interviews with a virtual recruiter Z Callejas, B Ravenet, M Ochs, C Pelachaud – lrec-conf.org … This file is used by the Behavior Planner to instantiate the appropriate non-verbal behaviors depending on the attitude and dialog act (communicative intention) of the agent. Finally, the Behavior Realizer and the Text-To-Speech (TTS) engine display the animation of the agent. … Related articles

Evaluating an expressive gesture model for a humanoid robot: Experimental results QA Le, C Pelachaud – Submitted to 8th ACM/IEEE …, 2012 – perso.telecom-paristech.fr … to communicate given intents. This module decides how to render behaviors expressive through expressivity parameters; (iii) the third module, Behavior Realizer, synchronizes and realizes planned behaviors. The results of … Cited by 1 Related articles

Towards meaningful robot gesture M Salem, S Kopp, I Wachsmuth, F Joublin – Human Centered Robot …, 2009 – Springer … In this paper, we first discuss related work, highlighting the fact that not much research has so far focused on the generation of robot gesture (Section 2). In Section 3, we describe our multi-modal behavior realizer, the Articulated Communicator Engine (ACE), which implements … Cited by 5 Related articles All 8 versions

From Annotation to Multimodal Behavior K Jokinen, C Pelachaud – Coverbal Synchrony in Human- …, 2013 – books.google.com … These behaviors are encoded into the Behavior Mark-up Language—BML (Kopp et al., 2006; Vilhjálmsson et al., 2007). The third module, Behavior Realizer, receives this list of signals and computes the corresponding animation. … Related articles

Towards influencing of the conversational agent mental state in the task of active listening S Ondáš, E Bevacqua, J Juhár, P Demeter – Development of Multimodal …, 2010 – Springer … Such a language specifies the verbal and non-verbal behaviors of ECAs [15]. The task of the Behavior Realizer is to realize the behaviors scheduled by the Behavior Planner. Finally, the animation is played in the Greta Player. … Cited by 1 Related articles All 5 versions

Nonverbal behaviour of an embodied storyteller F Jonkman – 2012 – essay.utwente.nl … storyteller. The gaze model and the viewing perspectives were implemented in the verbal and non-verbal behaviour realizer Elckerlyc [WR10]. … environment. This will be done by using Elckerlyc [WR10], a verbal and non-verbal behaviour realizer. … Related articles All 10 versions

MPML3D: scripting agents for the 3D internet H Prendinger, S Ullrich, A Nakasone… – … , IEEE Transactions on, 2011 – ieeexplore.ieee.org … Hence, many projects use BML, including ECA systems, behavior planners and realizers, and tools for ECA creation (see [28], for details). The vision of BML is to develop a general platform that allows to combine behavior planners and behavior realizers easily. … Cited by 28 Related articles All 9 versions

Touching Virtual Agents: Embodiment and Mind G Huisman, M Bruijnes, J Kolkmeier, M Jung… – Innovative and Creative …, 2014 – Springer … The ASAP behavior realizer (a SAIBA compliant BML 1.0 realizer [87]) is capable of incremental behavior generation, meaning that new agent behavior can be integrated seamlessly with ongoing behavior. Page 6. Touching Virtual Agents 119 3.3 Application Areas … Related articles All 3 versions

A crowdsourcing toolbox for a user-perception based design of social virtual actors M Ochs, B Ravenet, C Pelachaud – ceur-ws.org … GretaModular offers several modules, each dedicated to particular functionality. The core modules, based on the SAIBA framework [22], include an Intent Planner, a Behavior Planner and a Behavior Realizer to compute multimodal expressions of communicative intentions. … Related articles

Positive influence of smile backchannels in ECAs E Bevacqua, SJ Hyniewska, C Pelachaud – International Workshop on …, 2010 – cs.huji.ac.il … The Behavior Planner module receives as input the agent’s communicative intentions and generates as output a list of behavioral signals. These signals are sent to the Behavior Realizer that generates the animation. Finally, the animation is played in the Player. … Cited by 12 Related articles All 4 versions

Describing and Animating Complex Communicative Verbal and Nonverbal Behavior Using Eva-Framework I Mlakar, Z Kačič, M Rojc – Applied Artificial Intelligence, 2014 – Taylor & Francis … the animation. The embodied agents behavior realizer (EMBR) animation engine (Heloir and Kipp 2009: “EMBR – A Realtime animation engine for interactive embodied agents,” In Proceedings …) … Related articles All 2 versions

TTS-driven Synthetic Behaviour-generation Model for Artificial Bodies I Mlakar, Z Kačič, M Rojc – 2013 – cdn.intechopen.com … The co-verbal alignment of non-verbal behaviour is implemented by using external behaviour-planning and a behaviour-realizer engine [7]. The different phases of behaviour generation are linked through BML and FML mark-up languages. … Related articles All 3 versions

Enhancing embodied conversational agents with social and emotional capabilities B Van Straalen, D Heylen, M Theune… – Agents for Games and …, 2009 – Springer … selected. Also each category contains a set of labels that are passed on to the behavior realizer to dictate how the behavior should be executed (e.g. volume of speech, specific facial expressions, and characteristics for gestures). … Cited by 3 Related articles All 33 versions

From Emotions to Interpersonal Stances: Multi-level Analysis of Smiling Virtual Characters M Ochs, K Prepin, C Pelachaud – Affective Computing and …, 2013 – ieeexplore.ieee.org … be simulated. To simulate the smile reinforcement, the animation resulting from the Behavior Realizer module (i.e. the Facial Action Parameters – FAPS [38]) is directly modified by the Production space module [32]. Moreover, the … Cited by 3 Related articles All 5 versions

Co-constructing grounded symbols—feedback and incremental adaptation in human–agent dialogue H Buschmeier, S Kopp – KI-Künstliche Intelligenz, 2013 – Springer … In other work, Reidsma and colleagues [21] have focused on mechanisms on the level of realisation. Their behaviour realiser ‘Elckerlyc’ can increase the speech rate and the volume of the speech synthesis flexibly at any point in time. … Cited by 2 Related articles All 6 versions

Compound Gesture Generation: A Model Based on Ideational Units Y Xu, C Pelachaud, S Marsella – Intelligent Virtual Agents, 2014 – Springer … an unnatural result. For the future work, we are hoping to test our sequential gesture model on different behavior realizers such as Greta [19]. Page 14. 490 Y. Xu, C. Pelachaud, and S. Marsella Acknowledgments. We would …

An Intelligent Architecture for Autonomous Virtual Agents Inspired by Onboard Autonomy K Hassani, WS Lee – Intelligent Systems’ 2014, 2015 – Springer … organization and action selection. SAIBA [28] is a popular framework that defines a pipeline for abstract behavior generation. It consists of intent planner, behavior planner and behavior realizer. Thalamus [29] framework adds …
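The SAIBA pipeline mentioned in several entries above (intent planner → behavior planner → behavior realizer) can be illustrated as three chained stages. This is a minimal sketch under the assumption of a toy dispatch; all function names and the command format are hypothetical and belong to no published SAIBA implementation:

```python
# Hypothetical sketch of the SAIBA three-stage pipeline.
# Intent Planner: decides WHAT to communicate (FML-level intent).
# Behavior Planner: decides WHICH multimodal signals express it (BML-level).
# Behavior Realizer: turns behavior specs into concrete animation commands.

def intent_planner(user_input: str) -> dict:
    """Map an input event to a communicative intent."""
    if "hello" in user_input.lower():
        return {"intent": "greet", "emphasis": 0.8}
    return {"intent": "acknowledge", "emphasis": 0.3}

def behavior_planner(intent: dict) -> list:
    """Expand an intent into a list of (channel, form) behavior specs."""
    repertoire = {
        "greet": [("speech", "Hello!"), ("gesture", "wave"), ("face", "smile")],
        "acknowledge": [("head", "nod")],
    }
    return repertoire[intent["intent"]]

def behavior_realizer(behaviors: list) -> list:
    """Translate behavior specs into low-level player commands."""
    return [f"PLAY {channel}:{form}" for channel, form in behaviors]

# Usage: run one input event through the full pipeline.
commands = behavior_realizer(behavior_planner(intent_planner("Hello there")))
```

In the systems cited here, the planner-to-realizer hand-off would carry BML documents rather than tuples, and the realizer's output would be animation parameters (e.g. MPEG-4 FAP/BAP streams) rather than strings.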

Laugh machine J Urbain, R Niewiadomski… – Proceedings …, 2013 – informatik.uni-augsburg.de Page 1. ENTERFACE’12 SUMMER WORKSHOP – FINAL REPORT; PROJECT P2 : LAUGH MACHINE 13 Laugh Machine Jérôme Urbain1, Radoslaw Niewiadomski2, Jennifer Hofmann3, Emeline Bantegnie4, Tobias Baur5, Nadia … Cited by 2 Related articles All 5 versions

A listener model: introducing personality traits E Bevacqua, E De Sevin, SJ Hyniewska… – Journal on Multimodal …, 2012 – Springer … database [26]. Afterwards, the behavioural signals are realised by the Behaviour Realizer module according to the agent’s behavioural characteristics. Finally, the agent’s animation is rendered by a 3D character player. More … Cited by 12 Related articles All 13 versions

Multimodal human machine interactions in virtual and augmented reality G Chollet, A Esposito, A Gentes, P Horain… – … Signals: Cognitive and …, 2009 – Springer … Lately Thiebaux and colleagues [73] have developed a real-time architecture to implement the Behavior Realizer of the SAIBA framework. 4 Multimodal Interaction Communication is multimodal by essence. It involves not only speech but also nonverbal signals. … Cited by 6 Related articles All 15 versions

On designing migrating agents: from autonomous virtual agents to intelligent robotic systems K Hassani, WS Lee – … Asia 2014 Autonomous Virtual Humans and …, 2014 – site.uottawa.ca … action selection. SAIBA [Kopp 2006] is a popular framework that defines a pipeline for abstract behavior generation. It consists of intent planner, behavior planner and behavior realizer. Thalamus [Ribeiro et al. 2012] framework …

Toward incorporating emotions with rationality into a communicative virtual agent A Kiselev, BA Hacker, T Wankerl, N Abdikeev… – AI & society, 2011 – Springer … Architecture of an agent includes six basic blocks: Human Behavior Detector, Human Behavior Interpreter, Attention Tracker, Intention Planner, Behavior Planner and Behavior Realizer. In the following paragraphs, a brief description of each of these blocks is given. … Cited by 3 Related articles All 6 versions

Animating synthetic dyadic conversations with variations based on context and agent attributes L Sun, A Shoulson, P Huang, N Nelson… – … and Virtual Worlds, 2012 – Wiley Online Library … The BML project aims to develop a representation framework for describing both nonverbal and verbal real-time behaviors that are independent of the particular graphical realization. BML is a standard XML-based interface between behavior planners and behavior realizers. … Cited by 4 Related articles All 2 versions
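Several of the snippets above describe BML as an XML-based interface between behavior planners and realizers, with behaviors synchronized through references to each other's sync points. The fragment below is an illustrative sketch modeled on BML 1.0 conventions, not a validated BML document, and the parsing step is a hypothetical stand-in for a realizer's input stage:

```python
import xml.etree.ElementTree as ET

# BML-style fragment (illustrative): a gesture starts with the speech,
# and a head nod starts when the speech ends, via sync-point references.
bml = """
<bml id="bml1">
  <speech id="s1"><text>Nice to meet you.</text></speech>
  <gesture id="g1" lexeme="BEAT" start="s1:start"/>
  <head id="h1" lexeme="NOD" start="s1:end"/>
</bml>
"""

# A realizer's first job: parse the block and extract each behavior
# together with its scheduling constraint ("0" = no explicit constraint).
root = ET.fromstring(bml)
schedule = [(child.tag, child.get("id"), child.get("start", "0"))
            for child in root]
```

A real realizer such as Elckerlyc, AsapRealizer, or Greta's Behavior Realizer would resolve these sync references against predicted behavior timings before driving the animation player.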

Effect personality matching on robot acceptance: effect of robot-user personality matching on the acceptance of domestic assistant robots for elderly M Brandon – 2012 – essay.utwente.nl Page 1. EFFECT PERSONALITY MATCHING ON ROBOT ACCEPTANCE 1 Effect of Robot-User Personality Matching on the Acceptance of Domestic Assistant Robots for Elderly Merel Brandon Human Media Interaction, University of Twente … Related articles All 5 versions

The Influence of Facial Expressions in Embodied Agents on Human Decision-Making B Stapersma – 2011 – referaat.cs.utwente.nl … See Table 3. Table 3. Facial displays (emotions and intensities) of the Mean Agent Facial displays are animated using Elckerlyc; this is a BML compliant behavior realizer for generating multimodal verbal and nonverbal behavior for Virtual Humans [19]. … Related articles

Synthesis of listener vocalizations: towards interactive speech synthesis SC Pammi – 2012 – scidok.sulb.uni-saarland.de Page 1. Synthesis of Listener Vocalizations – Towards Interactive Speech Synthesis. Dissertation submitted for the degree of Doctor of Engineering at the Faculties of Natural Sciences and Technology of Saarland University, by Sathish Pammi … Cited by 1 Related articles All 2 versions

Multimodal Signals: Cognitive and Algorithmic Issues AEA Hussain, MMR Martone – 2009 – Springer … Lately Thiebaux and colleagues [73] have developed a real-time architecture to implement the Behavior Realizer of the SAIBA framework. 4 Multimodal Interaction Communication is multimodal by essence. It involves not only speech but also nonverbal signals. … Related articles

[BOOK] Conceptual Motorics-Generation and Evaluation of Communicative Robot Gesture M Salem – 2013 – books.google.com Page 1. Maha Salem Conceptual Motorics – Generation and Evaluation of Communicative Robot Gesture … Cited by 4 Related articles All 3 versions

The SEMAINE API: A component integration framework for a naturally interacting and emotionally competent Embodied Conversational Agent M Schröder – 2012 – scidok.sulb.uni-saarland.de Page 1. The SEMAINE API — A Component Integration Framework for a Naturally Interacting and Emotionally Competent Embodied Conversational Agent. Dissertation submitted for the degree of Doctor of Engineering … Cited by 6 Related articles All 2 versions

[BOOK] Response selection and turn-taking for a sensitive artificial listening agent M Maat – 2011 – doc.utwente.nl Page 1. Page 2. Response Selection and Turn-taking for a Sensitive Artificial Listening Agent Mark ter Maat Page 3. PhD dissertation committee: Chairman and Secretary: Prof. dr. AJ Mouthaan, University of Twente, NL Promotor: Prof. dr. … Cited by 5 Related articles All 5 versions