Notes:
Behavior Markup Language (BML) is a standardized, XML-based markup language for describing the nonverbal behavior of virtual humans, such as facial expressions, gestures, gaze, and body movements. It is often used in dialog systems to specify how a virtual human should express itself nonverbally while interacting with a user.
In a dialog system, BML can be combined with text- or speech-based user input to create a more natural and engaging interaction. For example, a virtual human might use BML to smile or nod while listening to a user's response, or to furrow its brow in confusion when it does not understand something that was said.
BML is typically used together with other technologies, such as natural language processing (NLP) and animation software (a behavior realizer), to create a realistic and believable virtual human. It is used in a variety of applications, including virtual assistants, training simulations, and interactive entertainment.
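To make the "smile or nod while listening" example concrete, here is a minimal, illustrative Python sketch of how a dialog component might assemble such a BML block. The element and attribute names (bml, head, faceLexeme, lexeme, start, end, amount) follow the general BML 1.0 style, but the exact vocabulary a realizer accepts varies, so treat the namespace URI, the attribute values, and the listening_feedback_bml helper as assumptions for illustration rather than a definitive specification.

# Illustrative sketch only: build a small BML block and print it.
# A real behavior realizer (e.g. SmartBody, AsapRealizer, Greta) may
# expect slightly different elements or attributes.
import xml.etree.ElementTree as ET

BML_NS = "http://www.bml-initiative.org/bml/bml-1.0"  # assumed BML 1.0 namespace

def listening_feedback_bml(block_id: str = "bml1") -> str:
    """Return a BML string that makes the agent nod and smile briefly."""
    bml = ET.Element("bml", {"id": block_id, "xmlns": BML_NS})

    # A short head nod starting immediately.
    ET.SubElement(bml, "head", {
        "id": "nod1", "lexeme": "NOD", "start": "0", "end": "1.0",
    })
    # A smile overlapping the nod, expressed as a face lexeme.
    ET.SubElement(bml, "faceLexeme", {
        "id": "smile1", "lexeme": "smile", "start": "0", "end": "2.0",
        "amount": "0.6",
    })
    return ET.tostring(bml, encoding="unicode")

if __name__ == "__main__":
    print(listening_feedback_bml())

A dialog manager would hand the resulting string to whatever behavior realizer the platform provides; the transport (message bus, socket, or realizer-specific API) is platform-specific.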
Resources:
- hmi-utwente/flipper-2.0 – the Flipper dialogue control system
Wikipedia:
See also:
Behavior Realizer & Artificial Intelligence
Embodied conversational agents.
HH Huang – 2018 – psycnet.apa.org
… where the non-verbal information like facial expressions, body postures, gestures, or voice tone of both the agent and the human user(s) are utilized to compensate dialogue system … Towards a common framework for multimodal generation: The behavior markup language …
The rise of the conversational interface: A new kid on the block?
MF McTear – International Workshop on Future and Emerging …, 2016 – Springer
… Conversational interface Chatbot Spoken dialogue system Voice user interface Embodied … SAIBA (Situation, Agent, Intention, Behaviour, Animation), BML (Behavior Markup Language), FML (Functional … statistical methodology for dialog management in spoken dialog systems …
Linguistic approaches to robotics: from text analysis to the synthesis of behavior
A Kotov, N Arinkin, L Zaidelman, A Zinina – International Workshop on …, 2017 – Springer
… The winning script(s) form output behavioral reactions with utterances in BML (Behavior Markup Language) – which are combined and processed by the … Although this approach is suitable for text classification and even for some dialogue systems, it omits an important aspect …
Say Hi to Eliza
G Llorach, J Blat – International Conference on Intelligent Virtual Agents, 2017 – Springer
… Speech Recognition, Speech Synthesis, Dialogue System The Web Speech API permits to use … 2]. It is important to note that more sophisticated dialogue systems can be … Vilhjálmsson H.: Towards a Common Framework for Multimodal Generation: the Behavior Markup Language …
Socially-aware animated intelligent personal assistant agent
Y Matsuyama, A Bhardwaj, R Zhao, O Romeo… – Proceedings of the 17th …, 2016 – aclweb.org
… including relevant hand gestures, eye gaze, head nods, etc.) and outputs the plan as BML (Behavior Markup Language), which is … Alex: A statistical dialogue systems framework … Automatic recognition of conversational strategies in the service of a socially-aware dialog system …
Flipper 2.0: a pragmatic dialogue engine for embodied conversational agents
J van Waterschoot, M Bruijnes, J Flokstra… – Proceedings of the 18th …, 2018 – dl.acm.org
… in a robust and scalable way the typical situations and technical problems that occur when creating a dialogue system. We will make these available together with the software and highlight some in this paper. In Section 2 we explain our view on dialogue systems and discuss …
Introduction: toward the design, construction, and deployment of multimodal-multisensor interfaces
ACM Books – dl.acm.org
… On the other hand, machine-learned dialogue systems can be robust to variation in their … input to behavior planning, which results in expressions in the Behavior Markup Language (BML) that … The first is a clinical application—a multimodal dialogue system for annotation and …
Engagement with artificial intelligence through natural interaction models
S Feldman, ON Yalcin, S DiPaola – Electronic Visualisation and …, 2017 – scienceopen.com
… that is any text that comes out of our AI dialogue system can be in … effort called SmartBody (Thiebaux 2008) that uses an XML standard language called BML – Behavior markup language … All these systems come together currently controlled by our AI dialog system which takes …
A semi-autonomous system for creating a human-machine interaction corpus in virtual reality: application to the ACORFORMed system for training doctors to …
M Ochs, P Blache, GM de Montcheuil… – Proceedings of the …, 2018 – aclweb.org
… The dialogue system then generates a sequence of instructions, to be sent to a non … Language) as well as the non-verbal signals to express (encoded in BML, Behavior Markup Language) … OpenDial: A Toolkit for Developing Spoken Dialogue Systems with Probabilistic Rules …
Social gaze model for an interactive virtual character
B van den Brink, Z Yumak – International Conference on Intelligent …, 2017 – Springer
… 4. Bohus D., Horvitz, E.: Learning to predict engagement with a spoken dialog system in open-world settings … S., Marshall, AN, Pelachaud, C., Pirker, H., Thorisson, KR, Vilhjalmsson, H.: Towards a common framework for multi-modal generation: The behavior markup language …
Semantic Comprehension System for F-2 Emotional Robot
A Kotov, N Arinkin, A Filatov, L Zaidelman… – First International Early …, 2017 – Springer
… researchers often use the bag-of-n-grams – an unordered set of tuples consisting of n consecutive words [2, 6]. Dialogue systems also often … Each script is assigned to one or several behavioral reactions: utterance pattern and a BML record – Behavior Markup Language [25] …
Field trial analysis of socially aware robot assistant
F Pecune, J Chen, Y Matsuyama… – Proceedings of the 17th …, 2018 – researchgate.net
… database. The templates are filled with content items. A generated sentence plan is sent to BEAT, a nonverbal behavior generator [13], and BEAT generates a behavior plan in the BML (Behavior Markup Language) form [20]. The …
Conceptual Operations with Semantics for a Companion Robot
A Kotov, L Zaidelman, A Zinina, N Arinkin… – … Conference on Speech …, 2020 – Springer
… Chapter Dialogue Systems and Chatbots (Draft of October 16, 2019) (2019). 10. Kopp, S., et al.: Towards a common framework for multimodal generation: the behavior markup language. In: Gratch, J., Young, M., Aylett, R., Ballin, D., Olivier, P. (eds.) IVA 2006 …
Situated interaction
D Bohus, E Horvitz – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
… progresses. Dyadic: a term denoting an interaction that involves two participants. Most work in spoken dialog systems has traditionally focused on dyadic settings, where a dialog system interacts with a single human user. Encoding …
Training doctors’ social skills to break bad news: evaluation of the impact of virtual environment displays on the sense of presence
M Ochs, D Mestre, G De Montcheuil… – Journal on Multimodal …, 2019 – Springer
… by a human: the doctor verbal production is interpreted in real time by the operator who selects the adequate input signal to be transmitted to the dialogue system … Language (FML) and in terms of verbal and non-verbal signals for the Behavior Markup Language (BML) [21] …
Multimodal and multi-party social interactions
Z Yumak, N Magnenat-Thalmann – … aware human-robot and human-agent …, 2016 – Springer
… NVBG takes as input FML and produces in turn Behavior Mark-up Language (BML … Figure 13.9 shows the architecture of the multi-party dialogue system … pp 31–38. 6. Bohus D, Horvitz E (2009) Learning to predict engagement with a spoken dialog system in open …
Simulating listener gaze and evaluating its effect on human speakers
L Frädrich, F Nunnari, M Staudte, A Heloir – International Conference on …, 2017 – Springer
… In: Advances in Natural Multimodal Dialogue Systems, pp. 245–262. Springer (2005) 4. Kopp, S., Krenn, B., Marsella, SS, Marshall, A., Pelachaud, C., Pirker, H., Thorisson, K.: Towards a common framework for multimodal generation in ECAs: The behavior markup language …
Challenge discussion: advancing multimodal dialogue
J Allen, E André, PR Cohen, D Hakkani-Tür… – The Handbook of …, 2019 – dl.acm.org
… Given the huge variation in this multidimensional space, it is no surprise that the most frequently deployed commercial dialogue systems are very simple, including system-initiated Interactive Voice Response systems that take over a conversation and provide a limited set of …
An architecture of virtual patient simulation platform to train doctors to break bad news
M Ochs, G de Montcheuil, JM Pergandi, J Saubesty… – 2017 – hal.archives-ouvertes.fr
… Markup Language) or the non-verbal signals to express (described in BML: Behavior Markup Language) … In the next section, we describe in more details the dialog system integrated in the … is a java-based, domain-independent toolkit for developing spoken dialogue systems …
A multimodal system for real-time action instruction in motor skill learning
I de Kok, J Hough, F Hülsmann, M Botsch… – Proceedings of the …, 2015 – dl.acm.org
… system consists of a CAVE environment and a motion capture system; the software components are composed of a rendering engine, motion analysis and dialogue system – see Figure … Action Patterns produce behaviours described in the Behaviour Markup Language (BML) [19 …
Multimodal human-machine interaction including virtual humans or social robots
NM Thalmann, D Thalmann – SIGGRAPH Asia 2015 Courses, 2015 – dl.acm.org
… The Behavior Markup Language: Recent Developments and Challenges … C. Becker, S. Kopp, and I. Wachsmuth, “Simulating the emotion dynamics of a multimodal conversational agent,” in Proceedings of Affective Dialogue Systems: Tutorial and …
Multimodal conversational interaction with robots
G Skantze, J Gustafson, J Beskow – The Handbook of Multimodal …, 2019 – dl.acm.org
… body. Hal only stares at the interlocutor with his (now emblematic) red eye. For a long time, spoken dialogue systems developed in research labs and employed in the industry also lacked any physical embodiment. One reason …
Deliverable: Dialogue and Argumentation Framework Design
M Snaith, A Pease – 2018 – council-of-coaches.eu
… BML Behaviour Markup Language … (Robertson, 2004) has demonstrated that a language for expressing such dialogue systems (or games, or … DGDL) is a language equipped with everything one might expect to need for the rapid development of a new dialogue system for a new …
20 Body Movements Generation for Virtual Characters and Social Robots
A Beck, Z Yumak… – Social Signal …, 2017 – books.google.com
… function markup language (FML) and produces in turn behaviour markup language (BML) … Spoken and multimodal dialog systems and applications – rigid head motion in expressive … Towards a common framework for multimodal generation: The behavior markup language …
Exploring the alignment space–lexical and gestural alignment with real and virtual humans
K Bergmann, HP Branigan, S Kopp – Frontiers in ICT, 2015 – frontiersin.org
… reported in terms of higher likeability ratings of speech-based dialog systems (Nass and … slow) when interacting with an animated character in a simulated spoken dialog system … in the VH condition with behavior specified in the Behavior Markup Language [BML; Vilhjálmsson et …
Empathy framework for embodied conversational agents
ÖN Yalçın – Cognitive Systems Research, 2020 – Elsevier
… Sensitive Artificial Listener (SAL) (Schroder et al., 2012) is a multimodal dialogue system that is capable of nonverbal interaction based … 2018) to provide a two-way communication between the framework and the behavior realizer with Behavior Markup Language (BML) (Kopp …
A model of social explanations for a conversational movie recommendation system
F Pecune, S Murali, V Tsai, Y Matsuyama… – Proceedings of the 7th …, 2019 – dl.acm.org
… of generated sentences can be found in Table 2. The final module to be triggered is BEAT [5], a nonverbal behavior realizer which adds and synchronizes nonverbal behavior with the utterance to generate a behavior plan in the Behavior Markup Language (BML) form [16] …
Fostering user engagement in face-to-face human-agent interactions: a survey
C Clavel, A Cafaro, S Campano… – Toward Robotic Socially …, 2016 – Springer
… communicative intent. The multimodal behaviors to express a given communicative function to achieve (eg facial expressions, gestures and postures) are described by the Behavior Markup Language (BML) [70, 137]. The Greta …
Towards reasoned modality selection in an embodied conversation agent
C Ten-Ventura, R Carlini, S Dasiopoulou, GL Tó… – … on Intelligent Virtual …, 2017 – Springer
… Behavior Synchronization module, where the instantiations in different modalities are synchronized in terms of the Behavior Markup Language (BML) [21 … profile features, it is different from most of the approaches to modality handling in multimodal dialogue systems, which tend …
A robot commenting texts in an emotional way
L Volkova, A Kotov, E Klyshinsky, N Arinkin – Conference on Creativity in …, 2017 – Springer
… (eds.): Affective Dialogue Systems. Springer, Berlin (2004)Google Scholar. 3. Bell, L., Gustafson, J., Heldner, M.: Prosodic adaptation in human–computer interaction … Vilhjálmsson, H., et al.: The behavior markup language: recent developments and challenges …
Selecting and expressing communicative functions in a saiba-compliant agent framework
A Cafaro, M Bruijnes, J van Waterschoot… – … on Intelligent Virtual …, 2017 – Springer
… B., Marsella, S., Marshall, AN, Pelachaud, C., Pirker, H., Thórisson, KR, Vilhjálmsson, HH: Towards a common framework for multimodal generation: The behavior markup language … ter Maat, M., Heylen, D.: Flipper: An Information State Component for Spoken Dialogue Systems …
BEAT-o-matic: a baseline for learning behavior expressions from utterances
M Gallé, A Arora – europe.naverlabs.com
… As open-domain dialogue systems are still out-of-scope of our current understanding of natural language processing and machine learning research, one of the biggest short-term … Towards a common framework for multimodal generation: The behavior markup language …
Time to Go ONLINE! A Modular Framework for Building Internet-based Socially Interactive Agents
M Polceanu, C Lisetti – Proceedings of the 19th ACM International …, 2019 – dl.acm.org
… 2) via a compatible1 internet browser, while allowing diverse modules like new characters, data-driven behavior models (using for example TensorFlow.js), dialogue systems or even … Towards a common framework for multimodal generation: The behavior markup language …
Exploring the alignment space–lexical and gestural alignment with real and virtual humans
K Bergmann, HP Branigan, S Kopp – 2015 – researchgate.net
… have been reported in terms of higher likeability ratings of speech-based dialogue systems (Nass and … character in a simulated spoken dialogue system … the virtual human ‘Billie’ in the VH condition with behavior specified in the Behavior Markup Language (BML; Vilhjálmsson et …
Towards AmI Systems Capable of Engaging in ‘Intelligent Dialog’and ‘Mingling Socially with Humans’
SE Bibri – The Human Face of Ambient Intelligence, 2015 – Springer
… ATLANTISAPI, volume 9). Abstract. This chapter seeks to address computational intelligence in terms of conversational and dialog systems and computational processes and methods to support complex communicative tasks. In so …
Working with a social robot in school: a long-term real-world unsupervised deployment
DP Davison, FM Wijnen, V Charisi… – Proceedings of the …, 2020 – dl.acm.org
… in the interaction. 3.1.5 Robot Behaviour Realisation. The IM requested spoken utterances, robot movement, and tablet interface updates, specified in the Behaviour Markup Language (BML) [3, 21]. The robot could display …
in Computer and Information Science 754
A Cuzzocrea, O Kara, D Ślęzak, X Yang – researchgate.net
… focus group. The reactions (gestures, mimics, and text) generated in BML (behavior markup language [46]) are sent to the RCS which preprocesses these for further rendering. The … al. (eds.): Affective Dialogue Systems. Springer, Berlin …
Virtual coaches for healthy lifestyle
HJA op den Akker, R Klaassen, A Nijholt – Toward Robotic Socially …, 2016 – Springer
… the ECA is a more or less sophisticated (spoken or multi-modal) dialogue system … of the progress made in the field of natural interfaces, spoken dialogue systems and embodied … the interpretation of embodied agent behavior specified in the Behavior Markup Language (BML) [67 …
Demonstrating and Learning Multimodal Socio-communicative Behaviors for HRI: Building Interactive Models from Immersive Teleoperation Data
G Bailly, F Elisei – 2018 – hal.archives-ouvertes.fr
… 4 Modeling interactive multimodal behaviors Generation of interactive multimodal behaviors of conversational agents often enriches a spoken dialog system that … Towards a common framework for multimodal generation: The behavior markup language. In Int …
Nonverbal behavior in multimodal performances
A Cafaro, C Pelachaud, SC Marsella – The Handbook of Multimodal …, 2019 – dl.acm.org
… They are typically biphasic (two movement components), small, low energy, rapid flicks of the fingers or hand” [McNeill 1992]. Behavior Markup Language bml is an XML-like mark up language specially suited for representing communicative behavior …
Conversational interfaces: devices, wearables, virtual agents, and robots
M McTear, Z Callejas, D Griol – The Conversational Interface, 2016 – Springer
… architectures and languages, such as the Situation, Agent, Intention, Behavior, Animation (SAIBA) framework, the Behavior Markup Language (BML), and the … was developed by human–computer interaction experts at Furhat Robotics with a strong background in dialog systems …
Standardized representations and markup languages for multimodal interaction
R Tumuluri, D Dahl, F Paternò… – The Handbook of …, 2019 – dl.acm.org
… Application Programming Interface (API). Set of procedures made available by a software application to provide services to external programs. Behavior Markup Language (BML). An XML-based language for describing behaviors that should be realized by animated agents …
Timed Petri nets for fluent turn-taking over multimodal interaction resources in human-robot collaboration
C Chao, A Thomaz – The International Journal of Robotics …, 2016 – journals.sagepub.com
The goal of this work is to develop computational models of social intelligence that enable robots to work side by side with humans, solving problems and achieving task goals through dialogue and c…
Designing an API at an appropriate abstraction level for programming social robot applications
J Diprose, B MacDonald, J Hosking… – Journal of Visual …, 2017 – Elsevier
… Artificial Intelligence Markup Language (AIML) is an XML based markup language for authoring the content of natural language dialogue systems [19]; most of its abstractions can be categorised as methods for controlling primitives … 2.2.6. Behaviour Markup Language …
An Investigation on the Effectiveness of Multimodal Fusion and Temporal Feature Extraction in Reactive and Spontaneous Behavior Generative RNN Models for …
HH Huang, M Fukuda, T Nishida – … of the 7th International Conference on …, 2019 – dl.acm.org
… do. The SEMAINE project [16, 18] was launched to build a Sensitive Artificial Listener (SAL). SAL is a multimodal dialogue system with the social interaction skills needed for a sustained conversation with the user. They focused …
Privacy concerns of multimodal sensor systems
G Friedland, MC Tschantz – The Handbook of Multimodal-Multisensor …, 2019 – dl.acm.org
Privacy Concerns of Multimodal Sensor Systems. Gerald Friedland, Michael Carl Tschantz. 16.1 Introduction: This chapter explains that ignoring the privacy risks introduced by multimodal systems could have severe consequences for society in the long term …
Early integration for movement modeling in latent spaces
R Hornung, N Chen, P van der Smagt – The Handbook of Multimodal …, 2019 – dl.acm.org
Early Integration for Movement Modeling in Latent Spaces. Rachel Hornung, Nutan Chen, Patrick van der Smagt. 8.1 Introduction: In this chapter, we will show how techniques of advanced machine and deep learning …
Commercialization of multimodal systems
PR Cohen, R Tumuluri – The Handbook of Multimodal-Multisensor …, 2019 – dl.acm.org
Commercialization of Multimodal Systems. Philip R. Cohen, Raj Tumuluri. 15.1 Introduction: This chapter surveys the broad and accelerating commercial activity in building products incorporating multimodal-multisensor interfaces …
Affective Conversational Interfaces
M McTear, Z Callejas, D Griol – The Conversational Interface, 2016 – Springer
… The Behavior Markup Language (BML) was proposed to provide a general description of the multimodal behavior that can be used to control the agent (Kopp et al. 2006). Alma is a computational model of real-time affect for virtual characters …
SceneMaker: creative technology for digital StoryTelling
M Akser, B Bridges, G Campo, A Cheddad… – … , Game Creation, Design …, 2016 – Springer
… Two standard XML languages, FML-APML (Function Markup Language, Affective Presentation Markup Language) for communicative intentions and BML (Behavior Markup Language) for behaviours … Wahlster, W.: Smartkom: Foundations of Multimodal Dialogue Systems …
Multimodal databases
M Valstar – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
Multimodal Databases. Michel Valstar. 10.1 Introduction: In the preceding chapters, we have seen many examples of Multimodal, Multisensor Interfaces (MMIs). Almost all of these interfaces are implemented as computer …
Medical and health systems
D Sonntag – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
… Application domains include serious games, conversational agents, or dialogue systems for healthy behavior promotion; intelligent interactive … 11.4.1 Case Study 1: A Multimodal Dialog System In this case study, we present a dialogue system for the annotation and retrieval of …
Towards an intelligent system for generating an adapted verbal and nonverbal combined behavior in human–robot interaction
A Aly, A Tapus – Autonomous Robots, 2016 – Springer
… Later on, some research studies discussed how persuasive will be the dialogue systems that are adapted to the user’s model (including the ability to change explicitly and dynamically the aspects of the relationship with the interacting human through the use of social talks in …
Automotive multimodal human-machine interface
D Schnelle-Walka, S Radomski – The Handbook of Multimodal …, 2019 – dl.acm.org
… Figure 12.3: Components influencing cognitive load and driving performance of an in-car multimodal dialog system (after [Neßelrath and Feld 2013]) …
Software platforms and toolkits for building multimodal systems and applications
M Feld, R Neßelrath, T Schwartz – The Handbook of Multimodal …, 2019 – dl.acm.org
… It basically updates the context and determines how to react to dialogue acts. Dialogue Platforms are underlying frameworks that are used to execute a dialogue system. Dialogue Systems are software agents that allow users to converse with a machine in a coherent structure …
Towards a synthetic tutor assistant: the easel project and its architecture
V Vouloutsi, M Blancas, R Zucca, P Omedas… – … on Biomimetic and …, 2016 – Springer
… module [31, 32] is responsible for the choreography of the behavior (verbal and non-verbal) of the STA using the generic robot-independent Behavior Markup Language (BML) … ter Maat, M., Heylen, D.: Flipper: an information state component for spoken dialogue systems …
The conversational interface
MF McTear, Z Callejas, D Griol – 2016 – Springer
… With the evolution of speech recognition and natural language technologies, IVR systems rapidly became more sophisticated and enabled the creation of complex dialog systems that could handle natural language queries and many turns of interaction …
A taxonomy of social cues for conversational agents
J Feine, U Gnewuch, S Morana, A Maedche – International Journal of …, 2019 – Elsevier
Multimodal dialogue processing for machine translation
A Waibel – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
… Field-adaptable and extendable systems. Languages and vocabularies change, and interpreting dialogue systems must evolve alongside such changing languages and vocabularies and adapt to any given dialogue scenario …
Ergonomics for the design of multimodal interfaces
A Heloir, F Nunnari, M Bachynskyi – The Handbook of Multimodal …, 2019 – dl.acm.org
Part II: Multimodal Behavior. Ergonomics for the Design of Multimodal Interfaces. Alexis Heloir, Fabrizio Nunnari, Myroslav Bachynskyi. 7.1 Introduction: There are many ways a machine can infer …
Embedded multimodal interfaces in robotics: applications, future trends, and societal implications
EA Kirchner, SH Fairclough, F Kirchner – The Handbook of Multimodal …, 2019 – dl.acm.org
Embedded Multimodal Interfaces in Robotics: Applications, Future Trends, and Societal Implications. Elsa A. Kirchner, Stephen H. Fairclough, Frank Kirchner. 13.1 Introduction: In the past, robots were primarily used …
Cognitive human–robot interaction
B Mutlu, N Roy, S Šabanović – Springer Handbook of Robotics, 2016 – Springer
… BML behavior mark-up language. fMRI … Research in cognitive HRI has also explored the development of dialog systems that explicitly integrate these mechanisms into dialog modeling and the development of specific models and mechanisms for these requirements …
Transferring Human Tutor’s Style to Pedagogical Agent: A Possible Way by Leveraging Variety of Artificial Intelligence Achievements
X Feng, X Guo, L Qiu, R Shi – 2018 IEEE International …, 2018 – ieeexplore.ieee.org
… It apply machine learning technology for rapid development of agent-based dialogue systems with minor or non-programming [10] … 98–109. [14] “The Behavior Markup Language: Recent Developments and Challenges | SpringerLink.” [Online] …
Developing Embodied Agents for Education Applications with Accurate Synchronization of Gesture and Speech
J Xu, Y Nagai, S Takayama, S Sakazawa – Transactions on Computational …, 2015 – Springer
… Although synchronization description schemes (eg the Behavior Markup Language, BML [15]) have been proposed and widely used in the … Research Centre for Artificial Intelligence (DFKI), which include a single, TV-style presentation agent, dialogue systems, and multiple …
The roles and recognition of haptic-ostensive actions in collaborative multimodal human–human dialogues
L Chen, M Javaid, B Di Eugenio, M Žefran – Computer Speech & Language, 2015 – Elsevier
… 2. Related work. Research on spoken dialogue systems has been progressing for at least forty years, and many systems exist, from prototypes to commercial strength (please see Tur and De Mori, 2011 for a recent overview) …
Multimodal integration for interactive conversational systems
M Johnston – The Handbook of Multimodal-Multisensor Interfaces …, 2019 – dl.acm.org
Part I: Multimodal Language and Dialogue Processing. Multimodal Integration for Interactive Conversational Systems. Michael Johnston. 1.1 Introduction: This chapter discusses the challenges …
Towards adaptive social behavior generation for assistive robots using reinforcement learning
J Hemminghaus, S Kopp – 2017 12th ACM/IEEE International …, 2017 – ieeexplore.ieee.org
Towards Adaptive Social Behavior Generation for Assistive Robots Using Reinforcement Learning. Jacqueline Hemminghaus, CITEC, Bielefeld University, 33619 Bielefeld, Germany, jhemming@techfak.uni-bielefeld.de …
Associating gesture expressivity with affective representations
L Malatesta, S Asteriadis, G Caridakis… – … Applications of Artificial …, 2016 – Elsevier
Realistic natural interaction with virtual statues in x-reality environments.
G Margetis, G Papagiannakis, C Stephanidis – International Archives of …, 2019 – d-nb.info
… Several Neural Network approaches pursue realistic spoken dialogue systems, such as Recurrent Neural Networks, Recursive Neural Networks or Deep Reinforced Models and … Towards a common framework for multimodal generation: The behavior markup language …
A verbal and gestural corpus of story retellings to an expressive embodied virtual character
J Tolins, K Liu, M Neff, M Walker, JEF Tree – Proceedings of the Tenth …, 2016 – aclweb.org
… Towards a common framework for multimodal generation: The behavior markup language. In Intelligent virtual agents, pp … Natural language generation as planning under uncertainty for spoken dialogue systems. In Empirical methods in natural language generation (pp …
Nonverbal Behavior in Multimodal Performances
A Cafaro, C Pelachaud… – The Handbook of …, 2019 – books.google.com
… Behavior Markup Language bml is an XML-like mark up language specially suited for representing communicative behavior … Flipper is a library for specifying dialogue rules for dialogue systems, that uses XML-templates to describe the preconditions, effects and BML behaviors …
Once upon a time… a companion robot that can tell stories
C Adam, L Cavedon – 2015 – hal.archives-ouvertes.fr
… the agent: FML (Function Markup Language) expresses the agent’s communicative intention, while BML (Behaviour Markup Language) expresses the … In: Workshop on Companionable Dialogue Systems @ ACL, Uppsala, Sweden, Association for Computational Lin- guistics (15 …
Non-Verbal Behaviour of a VR Agent Playing a Board Game
GI Baldursdóttir – 2018 – skemman.is
… a type of multimodal interface (speech, facial expressions, hand gestures), software agents and dialogue systems (verbal and … 2.4.2 Behaviour Markup Language The Behavior Markup Language (BML) is the interface between behaviour planning and behaviour realization [16] …
Scenarios in virtual learning environments for one-to-one communication skills training
R Lala, J Jeuring… – International …, 2017 – educationaltechnologyjournal …
A scenario is a description of a series of interactions between a player and a virtual character for one-to-one communication skills training, where at each step the player is faced with a choice between statements. In this paper, we analyse the characteristics of scenarios and provide …
Things that Make Robots Go HMMM: Heterogeneous Multilevel Multimodal Mixing to Realise Fluent, Multiparty, Human-Robot Interaction
NCEDD Reidsma – researchgate.net
… We tackled this challenge by designing a framework which can mix these two types of behaviour, using AsapRealizer, a Behaviour Markup Language realiser. We call this Heterogeneous Multilevel Multimodal Mixing (HMMM) …
Modeling affective system and episodic memory for social companions
J Zhang – 2017 – dr.ntu.edu.sg
Modeling Affective System and Episodic Memory for Social Companions. Thesis by Zhang Juzheng, Nanyang Technological University …
This is the author’s version of a work that was published in the following source
J Feine, U Gnewuch, S Morana, A Maedche – 2019 – researchgate.net
… Research on voice-based CAs, which are often referred to as spoken-dialog systems, voice-user interfaces, or interactive voice response systems, began in the late 1980s (McTear et al., 2016). One of the most prominent spoken-dialog system projects were ATIS (Air Travel …
Learning socio-communicative behaviors of a humanoid robot by demonstration
DC Nguyen – 2018 – hal.archives-ouvertes.fr
HAL Id: tel-01962544, https://hal.archives-ouvertes.fr/tel-01962544v2, submitted on 26 Feb 2019 …
A Human Robot Interaction Toolkit with Heterogeneous Multilevel Multimodal Mixing
B Vijver – 2016 – essay.utwente.nl
… Such requests are typically specified using a high level behavior script language such as the Behavior Markup Language (BML), which is agnostic of the details … This can be a dialog system with AsapRealizer, but the MIDI-controller will also be considered as an external control …
Interactive narration with a child: impact of prosody and facial expressions
O Şerban, M Barange, S Zojaji, A Pauchet… – Proceedings of the 19th …, 2017 – dl.acm.org
Interactive Narration with a Child: Impact of Prosody and Facial Expressions. Ovidiu Şerban, Normandie Univ, INSA Rouen Normandie, LITIS, 76800 Saint-Étienne-du-Rouvray, France; Mukesh Barange, Normandie Univ …
Deliverable: First periodic report
J van Loon, H op den Akker, T Beinema, M Broekhuis… – 2019 – council-of-coaches.eu
… will have this time available, but fortunately we are developing a dialogue system with which … and developers can use to extend and create their own multi-agent dialogue systems … open source dialogue platform, and eg the core integration of Behaviour Markup Language (BML …
Timing multimodal turn-taking in human-robot cooperative activity
C Chao – 2015 – smartech.gatech.edu
… Table of contents excerpt: 2.2 Conversation analysis; 2.3 Spoken dialogue systems; 2.4 Embodied conversational agents … Traditional dialogue systems model only speech …
SiAM-dp: an open development platform for massively multimodal dialogue systems in cyber-physical environments.
R Neßelrath – 2016 – pdfs.semanticscholar.org
… Other extensive projects deploy well elaborated platforms for multimodal dialogue systems … Ontology Based SiAM-dp is fully ontology based and uses a single domain adaptable knowledge representation throughout the complete dialogue system …
An architecture for emotional facial expressions as social signals
R Aylett, C Ritter, MY Lim, F Broz… – IEEE Transactions …, 2019 – ieeexplore.ieee.org
… Thus it combines a cognitive architecture able to model a rich internal state with the focus on interactivity of a dialogue system, albeit a pre-authored one …
Sentiment analysis: from opinion mining to human-agent interaction
C Clavel, Z Callejas – IEEE Transactions on affective computing, 2015 – ieeexplore.ieee.org
… Detection and avoidance of user frustration in driving situations [77] or for tutoring systems [78] or for a child conversational computer game [83]. Detection of various emotions according to the application for dialog systems [25], [81], [83] …
Synthesizing Expressive Behaviors for Humanoid Robots
MI Sunardi – 2020 – pdxscholar.library.pdx.edu
… Table of contents excerpt: 2.4 Behavior Markup Language and its Related Frameworks; 2.5 Synthesis Using Motion Capture Data … 3.2.5 Speech and Dialogue System; 3.2.6 User Interface …
A virtual emotional freedom practitioner to deliver physical and emotional therapy
H Ranjbartabar – 2016 – researchonline.mq.edu.au
… List of Abbreviations and Acronyms: ASR Automatic Speech Recognition; BML Behaviour Mark-up Language; ECA Embodied Conversational Agent; EEG Electroencephalography; EFT Emotional Freedom Technique; GSR Galvanic Skin Response; IVA Intelligent Virtual Agent …
Now we’re talking: Learning by explaining your reasoning to a social robot
FM Wijnen, DP Davison, D Reidsma, JVD Meij… – ACM Transactions on …, 2019 – dl.acm.org
… The robot’s behaviors were specified using Behavior Markup Language (BML) [Kopp et al … In addition, our dialogue system automatically collected log data that contained all input from the Wizard of Oz tablet and from the tablet used by the children and every output action from …
Socially-Aware Dialogue System
R Zhao – 2019 – lti.cs.cmu.edu
… SAPA) This chapter reviews our knowledge-inspired socially-aware dialogue system in a … we lay out the building blocks for socially-aware dialogue systems, including conversational … recognition of conversational strategies in the service of a socially-aware dialog system …
Extending multimedia languages to support multimodal user interactions
ÁLV Guedes, RG de Albuquerque Azevedo… – Multimedia Tools and …, 2017 – Springer
… children- or elderly- oriented MUIs. BML (Behavior Markup Language) is an XML description language for controlling the verbal and nonverbal behavior of embodied conversational agents. Finally, SEDL (Sensory Effect Description …
Designing a social robot to support children’s inquiry learning: A contextual analysis of children working together at school
DP Davison, FM Wijnen, J van der Meij… – International journal of …, 2019 – Springer
International Journal of Social Robotics, https://doi.org/10.1007/s12369-019-00555-6. Designing a Social Robot to Support Children’s Inquiry Learning: A Contextual Analysis of Children Working Together at School. Daniel …
Learning data-driven models of non-verbal behaviors for building rapport using an intelligent virtual agent
R Amini – 2015 – digitalcommons.fiu.edu
… The overall goal is to explore the possibilities of using machine learning techniques to move away from hand-crafted rule-based models employed in most of the current health-related dialogue systems (Discussed in Section 2), toward modeling human’s non-verbal behaviors …
An eye gaze model for controlling the display of social status in believable virtual humans
M Nixon, S DiPaola, U Bernardet – 2018 IEEE Conference on …, 2018 – ieeexplore.ieee.org
… SmartBody procedural animation system. By combining this with a chatbot-driven dialog system, we created a job interview simulator with two different variations on the character’s eye gaze. Participants then practiced interviewing …
Extending multimedia languages to support multimodal user interactions
S Colcher – 2017 – maxwell.vrac.puc-rio.br
… List of Abbreviations: ASR Audio Speech Recognition; BML Behavior Markup Language; DOM Document Object Model; DTMF Dual-Tone Multi-Frequency; EMMA Extensible MultiModal Annotation markup language; GDL Gesture Description Language …
Sound synthesis for communicating nonverbal expressive cues
FA Martín, Á Castro-González, MÁ Salichs – IEEE Access, 2017 – ieeexplore.ieee.org
… In [8], a multimodal dialog manager allows a robot to combine partial information of vision and speech into a coherent message. Many models, such as Behavior Markup Language (BML), face the problem …
First Impressions Count! The Role of the Human’s Emotional State on Rapport Established with an Empathic versus Neutral Virtual Therapist
H Ranjbartabar, D Richards, A Bilgin… – IEEE Transactions on …, 2019 – ieeexplore.ieee.org
… The dialogue engine sends BML (Behavior Markup Language) messages to NVBG (Nonverbal Behavior Generator) … (Fig. 2: Group 1 Procedure; Fig. 3: Group 2 Procedure.)
Deliverable: Initial knowledge base design and coaching strategies
T Beinema, G Huizing, H op den Akker, M Snaith… – 2018 – council-of-coaches.eu
… List of abbreviations: Ad Adaptation; AIF (db) Argument Interchange Format (database); APML Affective Presentation Markup Language; BCTT Behavior Change Technique Taxonomy; BML Behavior Markup Language; BS Behaviour Set; CA Context Awareness; CBT Cognitive Behaviour Therapy …
Adapting a Virtual Advisor’s Verbal Conversation Based on Predicted User Preferences: A Study of Neutral, Empathic and Tailored Dialogue
H Ranjbartabar, D Richards, AA Bilgin, C Kutay… – Multimodal …, 2020 – mdpi.com
Virtual agents that improve the lives of humans need to be more than user-aware and adaptive to the user’s current state and behavior. Additionally, they need to apply expertise gained from experience that drives their adaptive behavior based on deep understanding of the user’s …
Investigating the role of social eye gaze in designing believable virtual characters
M Nixon – 2017 – summit.sfu.ca
Investigating the Role of Social Eye Gaze in Designing Believable Virtual Characters. Thesis by Michael Nixon (M.Sc., Simon Fraser University, 2009; B.Sc., Vancouver Island University, 2004) …
D4.7 1st Expressive Virtual Characters
F Yang, C Peters – 2016 – prosociallearn.eu
… interaction 1 . The SEMAINE project 2 built a Sensitive Artificial Listener (SAL) and a multimodal dialogue system which can react to the user’s verbal and non-verbal behavior and sustain the interaction for a long time. Greta …
Effective directed gaze for character animation
T Pejsa – 2016 – search.proquest.com
Effective Directed Gaze for Character Animation. Ph.D. dissertation (Computer Sciences) by Tomislav Pejsa, University of Wisconsin–Madison, 2016 …
Capturing and Animating Hand and Finger Motion for 3D Communicative Characters
NS Wheatland – 2016 – escholarship.org
Capturing and Animating Hand and Finger Motion for 3D Communicative Characters. UC Riverside Electronic Theses and Dissertations; author: Nkenge Safiya Wheatland; permalink: https://escholarship.org/uc/item/39w9397t …
Deliverable: Final prototype description and evaluations of the virtual coaches
G Huizing, B Donval, M Barange, R Kantharaju… – 2019 – council-of-coaches.eu
… List of abbreviations: AMI Augmented Multiparty Interaction; ASAP Articulated Social Agents Platform; AU Action Unit; BML Behaviour Markup Language; CMC Centre for Monitoring and Coaching; COUCH Council of Coaches; D Deliverable; DBT Danish Board of Technology Foundation …
Two techniques for assessing virtual agent personality
K Liu, J Tolins, JEF Tree, M Neff… – IEEE Transactions on …, 2015 – ieeexplore.ieee.org
Two Techniques for Assessing Virtual Agent Personality. Kris Liu, Jackson Tolins, Jean E. Fox Tree, Michael Neff, and Marilyn A. Walker. Abstract: Personality can be assessed with standardized inventory questions with …
The Design and Evaluation of an Application Programming Interface for Programming Social Robots
J Diprose – 2015 – researchspace.auckland.ac.nz
A User Perception–Based Approach to Create Smiling Embodied Conversational Agents
M Ochs, C Pelachaud, G Mckeown – ACM Transactions on Interactive …, 2017 – dl.acm.org
A User Perception–Based Approach to Create Smiling Embodied Conversational Agents. Magalie Ochs, Aix Marseille Université, CNRS, ENSAM, Université de Toulon, LSIS, Marseille, France; Catherine Pelachaud …
Gaze Mechanisms for Situated Interaction with Embodied Agents
S Andrist – 2016 – search.proquest.com
… Behavior Markup Language (BML) (Vilhjálmsson et al., 2007), developed as one of the three stages of SAIBA, defines multimodal behaviors such as gaze, head, face, body, gesture, and speech in a human-readable XML format …