Notes:
Behavior Markup Language (BML) is an XML-based language, developed within the SAIBA framework as a proposed standard, for specifying and controlling the behavior of virtual humans and other autonomous agents in interactive systems. It defines which behaviors an agent should perform and how they are timed and synchronized, including the dependencies between different behaviors.
SmartBody is a real-time character animation and control system, developed at the USC Institute for Creative Technologies, for building interactive virtual humans and other autonomous agents. It acts as a BML realization engine: it interprets BML behavior descriptions and renders them as animation.
In the SmartBody system, BML specifies and controls the behavior of virtual humans in real time: the timing, duration, and synchronization of behaviors, and the dependencies between them. For example, BML can specify that a virtual human should raise a hand and wave, or turn its head to look at a particular object.
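A minimal sketch of what such a BML block can look like, built here with Python's standard xml.etree module. The element names (speech, gesture, gaze) and the sync-point reference syntax (s1:tm1) follow the BML specification, but the namespace declaration, character id, and realizer-specific attributes that SmartBody expects are omitted, and the lexeme value WAVE is an illustrative assumption rather than a guaranteed SmartBody gesture name.

```python
import xml.etree.ElementTree as ET

# Root <bml> block; real BML adds an xmlns and a character id attribute.
bml = ET.Element("bml", id="bml1")

# Speech behavior with a named time marker inside the utterance.
speech = ET.SubElement(bml, "speech", id="s1")
text = ET.SubElement(speech, "text")
text.text = "Hello there, "
mark = ET.SubElement(text, "mark", name="tm1")
mark.tail = "nice to meet you."

# Wave gesture that starts when the speech reaches the time marker,
# expressing a cross-behavior timing dependency.
ET.SubElement(bml, "gesture", id="g1", lexeme="WAVE", start="s1:tm1")

# Gaze at a target object for the full duration of the speech behavior.
ET.SubElement(bml, "gaze", id="z1", target="object1",
              start="s1:start", end="s1:end")

xml_string = ET.tostring(bml, encoding="unicode")
print(xml_string)
```

The key idea is the sync-point references: "s1:tm1" ties the gesture's start to a point inside another behavior, which is how BML expresses the timing dependencies described above.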
SmartBody also works with related standards and systems, such as H-Anim (a standard for humanoid skeletons and geometry) and Unity's Mecanim (an animation state-machine system). Strictly speaking, these are animation and rigging technologies rather than behavior languages, but they can be used alongside BML to build more complex and realistic virtual characters.
Mecanim is the character animation system of the Unity game engine. It organizes animations as states in an animator controller, can change an animation's playback speed at runtime, and can synchronize animation state over the network (e.g., via Photon's PhotonAnimatorView). It works with character models and animations prepared in tools such as 3ds Max and Maya, and its Humanoid Avatar rig requires a skeleton of at least 15 bones. Mecanim also supports 2D animation and is a common choice for building and scripting third-person characters.
Wikipedia:
- Humanoid animation (HAnim)
See also:
BML (Behavior Markup Language) & Dialog Systems | EMBR (Embodied Agents Behavior Realizer) & BML (Behavior Markup Language) | SmartBody
SmartSim: Improving Visualization on Social Simulation for an Emergency Evacuation Scenario.
AM Diaz, G Zhu, A Carrera, CA Iglesias, O Araque – AmILP@ ECAI, 2016 – ceur-ws.org
… Moreover, the description of such behaviour is simplified by using Behavior Markup Language (BML) [9] because SmartBody is also a BML realization engine that transforms BML behaviour descriptions into real time animations …
Towards adaptive, interactive virtual humans in Sigma
V Ustun, PS Rosenbloom – International Conference on Intelligent Virtual …, 2015 – Springer
… SmartBody [15], a Behavior Markup Language (BML) [4] realization engine, is used as the character animation platform for this study, with communication between the Sigma VH model and SmartBody handled via BML messages …
Nadia: Neural network driven virtual human conversation agents
J Wu, S Ghosh, M Chollet, S Ly, S Mozgai… – Proceedings of the 18th …, 2018 – dl.acm.org
… Android. The behavior generation commands for NADiA are Smartbody scripts that communicate via Behavior Markup Language (BML), a language for describing verbal and non-verbal character animation behaviors. Both …
An asap realizer-unity3d bridge for virtual and mixed reality applications
J Kolkmeier, M Bruijnes, D Reidsma… – … Conference on Intelligent …, 2017 – Springer
… the communicative intent and behavior descriptions of the Embodied Conversational Agent (ECA) with XML based interfaces, Functional Markup Language (FML) and Behavior Markup Language (BML) respectively … The VHTK [2] uses SmartBody [9] as BML realization engine …
Effect of Deictic Gestures on Direction-Giving in Virtual Humans
A Pham – 2016 – digitalworks.union.edu
… developed by the USC Institute for Creative Technologies. SmartBody is a Behavior Markup Language (BML) realization engine that transforms BML behavior descriptions into realtime animations. BML commands allow Rachel to …
Presenting a modular framework for a holistic simulation of manual assembly tasks
F Gaisbauer, P Agethen, M Otto, T Bär, J Sues… – Procedia …, 2018 – researchgate.net
… 23] has been used, whereas Smartbody is integrated within the Unity3D engine. The respective locomotion commands describing the walk behavior, as well as the reach commands being necessary to control the digital avatar are set using the behavior markup language …
Model of personality-based, nonverbal behavior in affective virtual humanoid character
M Saberi, U Bernardet, S DiPaola – Proceedings of the 2015 ACM on …, 2015 – dl.acm.org
… Smartbody an academic 3D character animation toolkit is used as our animation rendering system which provides locomotion, gazing and nonverbal behavior in real time under our scripted control via the Behavior Markup Language (BML) [5][6]. Movement + Meaning (m+m) is …
Salsational Dance Application: Project Proposal
A Baijnath, J Chetty, M Marajh – projects.cs.uct.ac.za
… From the notation, we will break up each move into the different parts of the body. We will be converting the notation into Behavior Markup Language (BML) [11, 14] which is compatible with SmartBody [14], a character animation library …
Providing Physical Appearance and Behaviour to Virtual Characters
M del Puy Carretero, HV Diez, S García… – … on Articulated Motion …, 2016 – Springer
… In order to specify the behaviour, the standard BML (Behaviour Markup Language) is used … The authoring tool presented in this paper is based on Behavior Markup Language (BML) standard … There are several interpreters like SmartBody [9], BMLRealizer [10] or Elckerlyc [11 …
Introducing a Modular Concept for Exchanging Character Animation Approaches.
F Gaisbauer, P Agethen, T Bär, E Rukzio – Eurographics (Posters), 2018 – diglib.eg.org
… Therefore languages such as the behavior markup language [KKM ? 06] can be utilized … has been validated using an exemplary implementation in the Unity3D environment, combining a statistical motion synthesis approach inspired by [MC12] with Smartbody [Sma18] and a …
Directing Virtual Humans Using Play-Scripts and Spatio-Temporal Reasoning
C Talbot – 2018 – researchgate.net
… A newer option includes a Functional Markup Language (FML), Behavior Markup Language (BML), and BML Realizers like SmartBody (Figure 1.3). These also require some lower-level coding, but begin to abstract and parameterize the motion of the …
Agent-Based Dynamic Collaboration Support in a Smart Office Space
Y Wang, RC Murray, H Bao, C Rose – … of the 21th Annual Meeting of the …, 2020 – aclweb.org
… Figure 3: Virtual tutor users using speech, facial expressions, gaze directions, body position, and gestures. Speech is sent to VHT as text while non-verbal behavior is specified using the Behavior Markup Language (BML) realization library, "Smartbody" (Feng et al., 2012) …
A Virtual Emotional Freedom Therapy Practitioner
H Ranjbartabar, D Richards – … of the 2016 International Conference on …, 2016 – ifaamas.org
… The dialogue engine sends BML (Behavior Markup Language) message to NVBG (NonVerbal-Behavior-Generator) module containing the line the … output of NVBG is also BML which are transformed into synchronized sequences of animations by Smartbody character animation …
Simulating listener gaze and evaluating its effect on human speakers
L Frädrich, F Nunnari, M Staudte, A Heloir – International Conference on …, 2017 – Springer
… A., Pelachaud, C., Pirker, H., Thorisson, K.: Towards a common framework for multimodal generation in ECAs: The behavior markup language … com/retrieve/pii/S0010027714001139 8. Thiebaux, M., Marsella, S., Marshall, AN, Kallmann, M.: Smartbody: Behavior realization for …
Modeling warmth and competence in virtual characters
THD Nguyen, E Carstensdottir, N Ngo… – … on Intelligent Virtual …, 2015 – Springer
… of NVBG are formatted in Behavior Markup Language (BML) [28], an XML-based programming language used for authoring and synchronizing animations. These BMLs are then transformed into synchronized sequences of animations using the animation engine Smartbody [26 …
Hand gesture synthesis for conversational characters
M Neff – Handbook of Human Motion, 2016 – cs.ucdavis.edu
… capable of realizing commands in the Behavior Markup Language (50) that is supplied by a higher level in an agent architecture. These systems emphasize control and use a combination of procedural data and motion clips (eg (15; 48; 49; 11; 45)). The SmartBody system, for …
Virtual Environment Positioning Utilizing Play-Script Spatiotemporal Reasoning
C Talbot, GM Youngblood – IEEE Transactions on Games, 2019 – ieeexplore.ieee.org
… Other methods involve the use of Behavior Markup Language (BML), a form of XML which describes character behaviors [15]. BML Realizers, such as SmartBody [16] and Elckerlyc [17], take this BML and performs the requested behavior(s) on the characters …
m+ m: A novel middleware for distributed, movement based interactive multimedia systems
U Bernardet, D Adhia, N Jaffe, J Wang… – Proceedings of the 3rd …, 2016 – dl.acm.org
… and controls the behavior of the 3D character, by sending Behaviour Markup Language (BML) [10 … needs of components such as the tracking system, and the SmartBody 3D rendering … Towards a Common Framework for Multimodal Generation : The Behavior Markup Language …
Empathy framework for embodied conversational agents
ÖN Yalçın – Cognitive Systems Research, 2020 – Elsevier
… Smartbody allows for transforming messages received from the module into locomotion, object manipulation, lip syncing, gazing and nonverbal … to provide a two-way communication between the framework and the behavior realizer with Behavior Markup Language (BML) (Kopp …
Web-based embodied conversational agents and older people
G Llorach, J Agenjo, J Blat, S Sayago – Perspectives on Human-Computer …, 2019 – Springer
… of free online editors, such as Adobe Fuse CC or MakeHuman, and standards, mainly BML (Behaviour Markup Language) … There are current tools, such as SmartBody (Feng et al … 2008), while the Behavior Markup Language (BML) goes from the second to the third as it “is one …
A fast and robust pipeline for populating mobile AR scenes with gamified virtual characters
M Papaefthymiou, A Feng, A Shapiro… – SIGGRAPH Asia 2015 …, 2015 – dl.acm.org
… Furthermore, we have integrated a character animation platform, SmartBody, with the glGA framework. Such integration will allow complex interactions with virtual characters through AR … Towards a common framework for multimodal generation: The behavior markup language …
Socially-aware animated intelligent personal assistant agent
Y Matsuyama, A Bhardwaj, R Zhao, O Romeo… – Proceedings of the 17th …, 2016 – aclweb.org
… 2004), which tailors a behavior plan (including relevant hand gestures, eye gaze, head nods, etc.) and outputs the plan as BML (Behavior Markup Language), which is a part of the Virtual Human Toolkit (Hartholt et al., 2013). This plan is then sent to SmartBody, which renders …
Evaluating levels of emotional contagion with an embodied conversational agent
ON Yalçın, S DiPaola – … of the 41st annual conference of …, 2019 – pdfs.semanticscholar.org
… We use the Smartbody behavior realizer (Thiebaux, Marsella, Marshall, & Kallmann, 2008), that can provide face and body gestures, gaze, and speech output for virtual characters. We use the standard Behavior Markup Language (BML) (Kopp et al., 2006) as the basis for the …
(Simulated) listener gaze in real?time spoken interaction
L Frädrich, F Nunnari, M Staudte… – … Animation and Virtual …, 2018 – Wiley Online Library
… The solution employed back then was comparable to available agent control frameworks [16-19] and offered a Behavior Markup Language [20] interface to … License (mesh, textures, and weight maps of the Brad character from the Institute for Creative Technologies SmartBody [16]) …
Human Behavior towards Virtual Humans
R Hoegen – 2015 – essay.utwente.nl
… expressions. The functionality of Smartbody could be called by using Behavior Markup Language (BML), an XML-based language. Through BML it is really simple to create scripts for expressed behavior of a VH …
You Move, Therefore I Am: The Combinatorial Impact of Kinesthetic Motion Cues on Social Perception
DC Jeong – 2017 – search.proquest.com
… V. Chapter 4: Using SmartBody to Examine the Role of Movement in Social Perception …
The cultural influence model: When accented natural language spoken by virtual characters matters
P Khooshabeh, M Dehghani, A Nazarian, J Gratch – AI & society, 2017 – Springer
… AN, Pelachaud C, Pirker H, Vilhjálmsson H (2006) Towards a common framework for multimodal generation: the behavior markup language … Routledge, New York, pp 119–169 Thiebaux M, Marsella S (2007) Smartbody: Behavior realization for embodied conversational agents …
A platform for building mobile virtual humans
AW Feng, A Leuski, S Marsella, D Casas… – … on Intelligent Virtual …, 2015 – Springer
… The API leverages an animation system SmartBody [18] to construct and configure characters and the environment … AN, Pelachaud, C., Pirker, H., Thórisson, KR, Vilhjálmsson, HH: Towards a common framework for multimodal generation: the behavior markup language …
SimCoach Evaluation
D Meeker, JL Cerully, MD Johnson, N Iyer, JR Kurz… – 2015 – rand.org
… Abbreviations: BML Behavior Markup Language; CI confidence interval; DCoE Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury; DM dialogue management; DoD US Department of Defense …
Transferring Human Tutor’s Style to Pedagogical Agent: A Possible Way by Leveraging Variety of Artificial Intelligence Achievements
X Feng, X Guo, L Qiu, R Shi – 2018 IEEE International …, 2018 – ieeexplore.ieee.org
… SmartBody is a Behavioral Markup Language (BML) realization engine that transforms BML behavior descriptions into real-time animations … 98–109. [14] “The Behavior Markup Language: Recent Developments and Challenges | SpringerLink.” [Online] …
Engagement with artificial intelligence through natural interaction models
S Feldman, ON Yalcin, S DiPaola – Electronic Visualisation and …, 2017 – scienceopen.com
… animated control. We partner with the University of Southern California’s open source effort called SmartBody (Thiebaux 2008) that uses an XML standard language called BML – Behavior markup language. BML allows use …
CHASE: character animation scripting environment.
C Mousas, CN Anagnostopoulos – VRCAI, 2015 – cs.siu.edu
… 2002], the Rich Representation Language [Piwek et al. 2002], the Behavior Markup Language [Vilhjalmsson et al. 2007] and the Player Markup Language [Jung 2008], which developed for controlling the behavior of virtual characters …
20 Body Movements Generation for Virtual Characters and Social Robots
A Beck, Z Yumak… – Social Signal …, 2017 – books.google.com
… NVBG takes as input a functional language markup language (FML) and produces in turn behaviour markup language (BML) … Smartbody: Behavior realization for embodied conversational agents … The behavior markup language: Recent developments and challenges …
Challenges for the animation of expressive virtual characters: The standpoint of sign language and theatrical gestures
S Gibet, P Carreno-Medrano, PF Marteau – Dance Notations and Robot …, 2016 – Springer
… The functional markup language (FML) is used to encode the communicative intent, whereas the behavior markup language (BML) specifies the … The SmartBody [1] is an open source modular framework which hierarchically interconnects controllers to achieve continuous motion …
Creating Interactive Robotic Characters: Through a combination of artificial intelligence and professional animation
TG Ribeiro, A Paiva – Proceedings of the Tenth Annual ACM/IEEE …, 2015 – dl.acm.org
… Towards a common framework for multimodal generation: The behavior markup language. In Intelligent Virtual Agents, pages 205–217, 2006. [9] J. Lasseter … SmartBody : Behavior Realization for Embodied Conversational Agents. Information Sciences, (Aamas):12–16, 2008 …
From embodied metaphors to metaphoric gestures.
M Lhommet, S Marsella – CogSci, 2016 – cogsci.mindmodeling.org
… system proposed by Xu, Pelachaud, and Marsella (2014) converts this formalism into the standard BML format (Kopp et al., 2006) to be rendered by the SmartBody animation system … Towards a common framework for multimodal generation: The behavior markup language …
Field trial analysis of socially aware robot assistant
F Pecune, J Chen, Y Matsuyama… – Proceedings of the 17th …, 2018 – researchgate.net
… A generated sentence plan is sent to BEAT, a nonverbal behavior generator [13], and BEAT generates a behavior plan in the BML (Behavior Markup Language) form [20]. The BML is then sent to SmartBody [36], which renders the required non-verbal behaviours and the …
FACSvatar: An Open Source Modular Framework for Real-Time FACS based Facial Animation
S van der Struijk, HH Huang, MS Mirzaei… – Proceedings of the 18th …, 2018 – dl.acm.org
… time consuming. Instead of relying on pre-made animations, platforms such as SAIBA, MAX and SmartBody take a coding approach to facial animations through the Behavior Markup Language (BML) [14]. A facial animation …
Usability assessment of interaction management support in LOUISE, an ECA-based user interface for elders with cognitive impairment
P Wargnier, S Benveniste, P Jouvelot… – Technology and …, 2018 – content.iospress.com
… The behavior realizer component in LOUISE is able to interpret and execute Behavior Markup Language (BML) [35] commands, and performs all communication functions allowed by this … Our behavior realizer is built on top of SmartBody [43], a state-of-the-art BML realizer …
Using machine learning to generate engaging behaviours in immersive virtual environments
GC Dobre – 2019 8th International Conference on Affective …, 2019 – ieeexplore.ieee.org
… of these are based on components such as: the SmartBody animation system [33 … Toolkit architecture [12], Behaviour Expression Animation Toolkit [6], Behaviour Markup Language [16] or … Towards a common framework for multimodal generation: The behavior markup language …
A review of eye gaze in virtual agents, social robotics and hci: Behaviour generation, user interaction and perception
K Ruhland, CE Peters, S Andrist… – Computer graphics …, 2015 – Wiley Online Library
Abstract A person’s emotions and state of mind are apparent in their face and eyes. As a Latin proverb states: ‘The face is the portrait of the mind; the eyes, its informers’. This presents a signi…
Artificial intelligence moving serious gaming: Presenting reusable game AI components
W Westera, R Prada, S Mascarenhas… – Education and …, 2020 – Springer
… 4.8.1 Nonverbal bodily motion: Behaviour mark-up language. The Behavior Mark-up Language (BML) Realizer created by Utrecht University defines and controls the on-screen representation of virtual characters, in particular their non-verbal behaviours: facial expressions, body …
Proposing a Co-simulation Model for Coupling Heterogeneous Character Animation Systems.
F Gaisbauer, J Lehwald, P Agethen, J Sues… – VISIGRAPP (1 …, 2019 – scitepress.org
… These controllers are embedded in the Smartbody platform, thus being limited in their interoperability … Using formats like the Behaviour Markup Language (BML) (Feng et al., 2012b), a basic scenario such as walk to, pick-up and put-down can be formulated …
Retrieving target gestures toward speech driven animation with meaningful behaviors
N Sadoughi, C Busso – Proceedings of the 2015 ACM on International …, 2015 – dl.acm.org
… Figure 7: The structure of the DBN which is designed to capture the joint states of speech and movements while constrained on target gestures. We synthesize animations with the Smartbody toolkit [30] …
Animating an Autonomous 3D Talking Avatar
D Borer, D Lutz, M Guay – arXiv preprint arXiv:1903.05448, 2019 – arxiv.org
… multimodal generation: The behavior markup language. In Intelligent Virtual Agents, J. Gratch, M. Young, R. Aylett, D. Ballin, and P. Olivier, Eds., 205–217. KOVAR, L., GLEICHER, M., AND PIGHIN, F. 2002 … Smartbody: Behavior realization for embodied conversational agents …
Embodied conversational agents and interactive virtual humans for training simulators
G Chetty, M White – AVSP 2019 International Conference on …, 2019 – isca-speech.org
… Marshall, A., Pelachaud, C., Pirker, H., Thorisson, K., Vilhjalmsson, H: Towards a Common Framework for Multimodal Generation: The Behavior Markup Language … M., Marshall, A., Marsella, S., Fast, E., Hill, A., Kallmann, M., Kenny, P., Lee, J., SmartBody Behavior Realization …
Conversational interfaces: devices, wearables, virtual agents, and robots
M McTear, Z Callejas, D Griol – The Conversational Interface, 2016 – Springer
… Different standards are being defined to establish common architectures and languages, such as the Situation, Agent, Intention, Behavior, Animation (SAIBA) framework, the Behavior Markup Language (BML), and the Functional Markup Language (FML) (described … SmartBody …
Combining heterogeneous digital human simulations: presenting a novel co-simulation approach for incorporating different character animation technologies
F Gaisbauer, E Lampen, P Agethen, E Rukzio – The Visual Computer, 2020 – Springer
… Most related to our approach, Smartbody [30] provides an animation system focusing on the generation of human motion using hierarchical motion controllers. These controllers are embedded in the Smartbody platform, thus being limited in their interoperability …
Learning individual styles of conversational gesture
S Ginosar, A Bar, G Kohavi, C Chan… – Proceedings of the …, 2019 – openaccess.thecvf.com
… Figure 1: Speech-to-gesture translation example …
Nonverbal Behavior in
A Cafaro, C Pelachaud… – The Handbook of …, 2019 – books.google.com
… Behavior Markup Language bml is an XML-like mark up language specially suited for representing communicative behavior … determines which nonverbal behaviors should be generated in a given context and those behaviors are than realized by using SmartBody [Thiebaux et …
Nonverbal behavior in multimodal performances
A Cafaro, C Pelachaud, SC Marsella – The Handbook of Multimodal …, 2019 – dl.acm.org
… They are typically biphasic (two movement components ), small, low energy, rapid flicks of the fingers or hand” [McNeill 1992]. Behavior Markup Language bml is an XML-like mark up language specially suited for representing communicative behavior …
Closing the gender gap in STEM with friendly male instructors? On the effects of rapport behavior and gender of a virtual agent in an instructional interaction
NC Krämer, B Karacora, G Lucas, M Dehghani… – Computers & …, 2016 – Elsevier
An eye gaze model for controlling the display of social status in believable virtual humans
M Nixon, S DiPaola, U Bernardet – 2018 IEEE Conference on …, 2018 – ieeexplore.ieee.org
… To do so, we first implemented a scenario within SmartBody that would facilitate the comparison of behavioral differences … The scenario was implemented in the SmartBody environment follow the scenario’s script. Four different evaluations were performed …
Investigating the role of social eye gaze in designing believable virtual characters
M Nixon – 2017 – summit.sfu.ca
… different videos of a scripted scenario between characters animated using the SmartBody procedural animation system. These studies found some significant … interactive system that incorporated eye tracking, dialogue, and control of a character in the SmartBody environment …
Combining Heterogeneous Digital Human Simulations
F Gaisbauer, E Lampen, P Agethen, E Rukzio – uni-ulm.de
… However, compared to Smartbody, it is the explicit target to create a platform and technology independent approach across the boundaries of … Using formats like the Behavior Markup Language (BML) [13], a basic scenario such as walk to, pick-up and put-down can be …
Lip Syncing Method for Realistic Expressive Three-dimensional Face Model
IRA Al-Rubaye – 2016 – eprints.utm.my
… 3.3 SmartBody 3D face model from USC Institute for Creative Technologies (Feng and Shapiro, 2013) … 3.4 Sample face reduction using PCA method from (Wei and Deng, 2015) …
Automating the production of communicative gestures in embodied characters
B Ravenet, C Pelachaud, C Clavel… – Frontiers in …, 2018 – frontiersin.org
In this paper we highlight the different challenges in modeling communicative gestures for Embodied Conversational Agents (ECAs). We describe models whose aim is to capture and understand the specific characteristics of communicative gestures in order to envision how an …
Towards adaptive social behavior generation for assistive robots using reinforcement learning
J Hemminahaus, S Kopp – 2017 12th ACM/IEEE International …, 2017 – ieeexplore.ieee.org
The effect of an animated virtual character on mobile chat interactions
SH Kang, AW Feng, A Leuski, D Casas… – Proceedings of the 3rd …, 2015 – dl.acm.org
… The user responses to questions are stored in a remote datastore (Amazon Web Services). The system runs on an Android device using the SmartBody animation system … 2006. Towards a common framework for multimodal generation: The behavior markup language …
Toolkit for social experiments in VR
U Kristjánsson – 2019 – skemman.is
… This is visible in two languages defined within the scope of SAIBA. BML (Behaviour Markup Language) [17] and FML (Function Markup Language) [18]. BML defines high level syntax for describing behaviour in abstract physical terms …
A Computational Framework for Expressive, Personality-based, Non-verbal Behaviour for Affective 3D Character Agents
M Saberi – 2016 – summit.sfu.ca
Building a backbone for multi-agent tutoring in GIFT (Work in progress)
BD Nye, D Auerbach, TR Mehta… – Proceedings of the 5th …, 2017 – books.google.com
… in Unity), the nonverbal behavioral generator (NVBG) to automatically determine gestures based on speech, and SmartBody to coordinate … K., Vilhjalmsson, H.,(2006) Towards a Common Framework for Multimodal Generation: The Behavior Markup Language Intelligent Virtual …
Games robots play: Once more, with feeling
R Aylett – Emotion in games, 2016 – Springer
… C, Pirker H, Thórisson KR, Vilhjálmsson H (2006) Towards a common framework for multimodal generation: the behavior markup language … Thiebaux M, Marsella S, Marshall AN, Kallmann M (2008) Smartbody: behavior realization for embodied conversational agents …
A virtual emotional freedom practitioner to deliver physical and emotional therapy
H Ranjbartabar – 2016 – researchonline.mq.edu.au
… Abbreviations and Acronyms: ASR Automatic Speech Recognition; BML Behaviour Mark-up Language; ECA Embodied Conversational Agent; EEG Electroencephalography; EFT Emotional Freedom Technique; GSR Galvanic Skin Response; IVA Intelligent Virtual Agent …
Expressive Inverse Kinematics Solving in Real-time for Virtual and Robotic Interactive Characters
T Ribeiro, A Paiva – arXiv preprint arXiv:1909.13875, 2019 – arxiv.org
… Smartbody is a popular procedural animation system in the virtual humans field [34] … Smartbody procedurally generates and adapts gestures using an example-based motion synthesis technique for locomotion, reach and object manipulation [35] …
Google, Inc.(search)
ACM Books – dl.acm.org
… FML expressions are input to behavior planning, which results in expressions in the Behavior Markup Language (BML) that describe the type and timing of head and body motions, deriving gestures from a "gesticon" (a set of primitive gestural elements) …
First Impressions Count! The Role of the Human’s Emotional State on Rapport Established with an Empathic versus Neutral Virtual Therapist
H Ranjbartabar, D Richards, A Bilgin… – IEEE Transactions on …, 2019 – ieeexplore.ieee.org
… The dialogue engine sends BML (Behavior Markup Language) message to NVBG (Non … The output of NVBG is also BML which are transformed into synchronized sequences of animations by Smartbody character animation system which is a BML realization engine [70] to …
An architecture for emotional facial expressions as social signals
R Aylett, C Ritter, MY Lim, F Broz… – IEEE Transactions …, 2019 – ieeexplore.ieee.org
… Interesting work in graphical characters has moved towards a standardised mark-up language for this purpose, Behavioural Markup Language (BML) [63] and to middleware based on this [64] such as SmartBody … Fig. 1. An example of BML from SmartBody, controlling Glance …
The TTS-driven affective embodied conversational agent EVA, based on a novel conversational-behavior generation algorithm
M Rojc, I Mlakar, Z Kačič – Engineering Applications of Artificial Intelligence, 2017 – Elsevier
… Function Markup Language (FML) (Cafaro et al., 2014), while co-verbal behavior is represented through Behavior Markup Language (BML) (Vilhjalmsson et al … et al., 2009), EMBR (Heloir and Kipp, 2010), EVA-Framework (Rojc and Mlakar, 2016), and SmartBody (Thiebaux et al …
Speech-driven animation with meaningful behaviors
N Sadoughi, C Busso – Speech Communication, 2019 – Elsevier
State of the art in hand and finger modeling and animation
N Wheatland, Y Wang, H Song, M Neff… – Computer Graphics …, 2015 – Wiley Online Library
Abstract The human hand is a complex biological system able to perform numerous tasks with impressive accuracy and dexterity. Gestures furthermore play an important role in our daily interactions, …
Challenge discussion: advancing multimodal dialogue
J Allen, E André, PR Cohen, D Hakkani-Tür… – The Handbook of …, 2019 – dl.acm.org
Multimodal conversational interaction with robots
G Skantze, J Gustafson, J Beskow – The Handbook of Multimodal …, 2019 – dl.acm.org
… 2.1 Introduction: Being able to communicate with machines through spoken interaction has been a long-standing vision in both science fiction and research labs …
Socially-Aware Dialogue System
R Zhao – 2019 – lti.cs.cmu.edu
Effective directed gaze for character animation
T Pejsa – 2016 – search.proquest.com
… A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Computer Sciences) at the University of Wisconsin–Madison, 2016 …
Capturing and Animating Hand and Finger Motion for 3D Communicative Characters
NS Wheatland – 2016 – escholarship.org
… UC Riverside Electronic Theses and Dissertations. Permalink: https://escholarship.org/uc/item/39w9397t …
Learning data-driven models of non-verbal behaviors for building rapport using an intelligent virtual agent
R Amini – 2015 – digitalcommons.fiu.edu
… FIU Electronic Theses and Dissertations, University Graduate School, 3-25-2015 …
This is the author’s version of a work that was published in the following source
J Feine, U Gnewuch, S Morana, A Maedche – 2019 – researchgate.net
… Feine, J., Gnewuch, U., Morana, S., & Maedche, A. (2019). A Taxonomy of Social Cues for Conversational Agents. International Journal of Human-Computer Studies, 132, 138-161 …