Notes:
The references below discuss virtual human toolkits: collections of modules, tools, and libraries that support the creation and control of embodied conversational agents. Such toolkits are used to build virtual humans for a range of applications, including simulations and clinical case studies. The ICT Virtual Human Toolkit works with the Unity game engine and includes modules for audio-visual sensing, automatic text-to-gesture rule generation, and more. It is designed for both expert and non-expert users and has a modular architecture in which modules communicate with one another; it also includes a graphical user interface called the NPCEditor. The Toolkit has been used to create Spanish-speaking characters and ships with a speech synthesizer and speech recognizer. One report on spoken language interaction with robots notes that several integrated frameworks, including the ICT Virtual Human Toolkit, have been used successfully to control the behavior of robots.
The Virtual Agent Interaction Framework (VAIF) is a software package for creating intelligent agents in the Unity game engine. VAIF lets users build virtual characters that interact naturally through speech and body language, and it targets both expert and non-expert users, in contrast with more sophisticated and complex tools such as the Virtual Human Toolkit. Its pre-built components, including natural language processing, real-time dialogue management, and advanced facial animation, can be customized and combined to create agents for applications such as virtual assistants, customer service, and educational tools.
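The "modular architecture, in which modules communicate with each other" that several of these papers attribute to the Virtual Human Toolkit is, at its core, a publish/subscribe message bus. A minimal Python sketch of that pattern follows; the topic names and module wiring here are hypothetical illustrations, not the Toolkit's actual API.

```python
from collections import defaultdict
from typing import Callable


class MessageBus:
    """Routes messages by topic to every module subscribed to that topic."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: str) -> None:
        # Deliver the payload to each subscriber of this topic in turn.
        for handler in self._subscribers[topic]:
            handler(payload)


# Hypothetical modules wired the way a speech recognizer and a dialogue
# manager would be in a toolkit of this kind (names are illustrative).
log = []
bus = MessageBus()
bus.subscribe("asr.result", lambda text: bus.publish("dialogue.reply", f"You said: {text}"))
bus.subscribe("dialogue.reply", lambda reply: log.append(reply))

bus.publish("asr.result", "hello")  # log now holds the dialogue manager's reply
```

The design point is that modules never call each other directly: the recognizer only publishes, the dialogue manager only subscribes, so either side can be swapped out without touching the other.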
Resources:
- isguser/vaif: Virtual Agent Interaction Framework
See also:
Virtual Human Toolkit 2010 (2x) | Virtual Human Toolkit 2011 (10x) | Virtual Human Toolkit 2012 (11x) | Virtual Human Toolkit 2013 (11x) | Virtual Human Toolkit 2014 (28x) | Virtual Human Toolkit 2015 (40x) | Virtual Human Toolkit 2016 (39x) | Virtual Human Toolkit 2017 (31x) | Virtual Human Toolkit 2018 (31x) | Virtual Human Toolkit 2019 (52x)
Multi-Platform Expansion of the Virtual Human Toolkit: Ubiquitous Conversational Agents
A Hartholt, E Fast, A Reilly, W Whitcup… – … Journal of Semantic …, 2020 – World Scientific
We present an extension of the Virtual Human Toolkit to include a range of computing platforms, including mobile, web, Virtual Reality (VR) and Augmented Reality (AR). The Toolkit uses a mix of in-house and commodity technologies to support audio-visual sensing …
Evaluating Feedback Strategies for Virtual Human Trainers
X Shang, AS Arif, M Kallmann – arXiv preprint arXiv:2011.11704, 2020 – arxiv.org
… modeling algorithms for accomplishing autonomous virtual humans and a number of software solutions have been developed for facilitating their integration in applications, such as the Virtual Agent Interaction Framework (VAIF) [12] and the Virtual Human Toolkit [14] …
Usability of the Virtual Agent Interaction Framework
D Novick, M Afravi, O Martinez, A Rodriguez… – … Conference on Human …, 2020 – Springer
… with embodied conversational agents. VAIF is intended for use by both expert and non-expert users, in contrast with more sophisticated and complex development tools such as the Virtual Human Toolkit. To determine if VAIF …
The Passive Sensing Agent: A Multimodal Adaptive mHealth Application
S Mozgai, A Hartholt, A Rizzo – 2020 IEEE International …, 2020 – ieeexplore.ieee.org
… B. Software Framework The PSA is a Unity application developed using a custom version of the Virtual Human Toolkit [14], which follows the SAIBA framework [15]. The Virtual Human Toolkit is “a collection of modules, tools …
The Design of Charismatic Behaviors for Virtual Humans
N Wang, L Pacheco, C Merchant, K Skistad… – Proceedings of the 20th …, 2020 – dl.acm.org
… the lecture. The virtual human can also speak through pre-recorded voice. The virtual human is implemented using a virtual human toolkit [27] and is capable of performing lip-syncing and other speech gestures. The study …
An Adaptive Agent-Based Interface for Personalized Health Interventions
S Mozgai, A Hartholt, AS Rizzo – … of the 25th International Conference on …, 2020 – dl.acm.org
… It is a Unity application developed using a custom version of the Virtual Human Toolkit [4]. This toolkit incorporates and enables automatic … al. 2013. All Together Now: Introducing the Virtual Human Toolkit. In 13th International Conference on Intelligent Virtual Agents …
Introducing Canvas: Combining Nonverbal Behavior Generation with User-Generated Content to Rapidly Create Educational Videos
A Hartholt, A Reilly, E Fast, S Mozgai – Proceedings of the 20th ACM …, 2020 – dl.acm.org
… 3 SYSTEM Canvas is built with the Virtual Human Toolkit, a collection of modules, tools, and libraries that support the research and development of intelligent virtual agents [6]. The Toolkit has a modular architecture, in which modules communicate with each other through …
Using String Metrics to Improve the Design of Virtual Conversational Characters: Behavior Simulator Development Study
S García-Carbajal, M Pipa-Muniz, JL Múgica – JMIR serious games, 2020 – games.jmir.org
… The most closely related works to that reported here are those of Hartholt et al [5] and Morie et al [11], where the virtual human toolkit is described. More recently, a framework for the rapid development of Spanish-speaking characters has been presented in Herrera et al [12] …
Agent-Based Dynamic Collaboration Support in a Smart Office Space
Y Wang, RC Murray, H Bao, C Rose – … of the 21th Annual Meeting of the …, 2020 – aclweb.org
… et al., 2003) and the Azure Speech Recognizer for speech recognition, the USC Institute for Creative Technologies Virtual Human Toolkit (VHT) to present an embodied conversational agent (Hartholt et al., 2013), OpenFace for face recognition (Amos et al., 2016), OpenPose for …
Triggering and measuring neural response indicative of inhibition in humans during conversations with virtual humans
G Ahamba – 2020 – usir.salford.ac.uk
PhD thesis, University of Salford, June 2020. Author: Godson Ahamba. Supervisors: Professor David Roberts, Dr. Peter Eachus.
Affective and Human-Like Virtual Agents
B Budnarain – 2020 – uwspace.uwaterloo.ca
Thesis by Neil Bhavendra Budnarain, presented to the University of Waterloo in fulfillment of the requirements for the degree of Master of Mathematics in Computer Science. Waterloo, Ontario, Canada, 2020.
Human swarm interaction using plays, audibles, and a virtual spokesperson
P Chaffey, R Artstein, K Georgila… – … Learning for Multi …, 2020 – spiedigitallibrary.org
SPIE Digital Library Proceedings.
Automatic text-to-gesture rule generation for embodied conversational agents
G Ali, M Lee, JI Hwang – Computer Animation and Virtual …, 2020 – Wiley Online Library
… We used the animations provided by ICT Virtual Human Toolkit [14]. This data set will be referred to as Gesture data set. Gesture data set contains variable-length animations … The maps are: Manual: ICT Virtual Human Toolkit's default rule-base available in the toolkit …
A Framework for Design of Conversational Agents to Support Health Self-Care for Older Adults
DG Morrow, HC Lane, WA Rogers – Human Factors, 2020 – journals.sagepub.com
Objective: We examined the potential of conversational agents (CAs) to support older adults' self-care related to chronic illness in light of lessons learned from decades of pedagogical agent researc…
Understanding the predictability of gesture parameters from speech and their perceptual importance
Y Ferstl, M Neff, R McDonnell – … of the 20th ACM International Conference …, 2020 – dl.acm.org
… All stimuli can be viewed at https://www.youtube.com/playlist?list=PLLrShDUC_FZzhemzr0g1ekt1jz45-y_u3. 5.2 Experiment The experiment was designed with the Unity3D game engine and the open-source Virtual Human Toolkit (VHTK) [16] …
Which Model Should We Use for a Real-World Conversational Dialogue System? a Cross-Language Relevance Model or a Deep Neural Net?
SH Alavi, A Leuski, D Traum – … of The 12th Language Resources and …, 2020 – aclweb.org
… We were interested in this approach as a baseline for our experiments because it seems to be robust and has a small number of parameters to tune. The NPCEditor is publicly available as part of the Virtual Human Toolkit (Hartholt et al., 2013) …
An introduction to clinical simulation (CS) for orofacial myologists: COVID-19’s impact on clinical education
HC Reed – International Journal of Orofacial Myology and …, 2020 – ijom.iaom.com
… Tools are also available to create your own simulations and case studies. These include SmartSparrow, SecondLife, Kynectiv: DecisionSim, Virtual Human Toolkit, iHuman, Virtual Case Creator, WebSP, AvatarKinect, X-box Kinect (SDK) (Williams et al., 2013) …
Conversational Systems Research in Spain: A Scientometric Approach
D Griol, Z Callejas – Conversational Dialogue Systems for the Next …, 2020 – Springer
… OR “spoken dialog system” OR “spoken dialogue system” OR “spoken dialog systems” OR “spoken dialogue systems” OR “spoken human-computer interaction” OR “spoken question answering” OR “spoken virtual agents” OR “spoken virtual human toolkit” OR “statistical dialog …
Spoken Language Interaction with Robots: Research Issues and Recommendations, Report from the NSF Future Directions Workshop
M Marge, C Espy-Wilson, N Ward – arXiv preprint arXiv:2011.05533, 2020 – arxiv.org
… none of these components are truly “robot-ready.” There are also a number of notable integrated frameworks and toolkits, several of which have been successfully used in robots, including InproTK, OpenDial, IrisTK, DIARC, RETICO, Plato, and the ICT Virtual Human Toolkit …
Conversational AI: Dialogue Systems, Conversational Agents, and Chatbots
M McTear – Synthesis Lectures on Human Language …, 2020 – morganclaypool.com
Synthesis Lectures on Human Language Technologies, Morgan & Claypool, 2020.
Toward Genuine Robot Teammates: Coordination Through Dialogue
F Gervits – 2020 – search.proquest.com
Contents excerpt: 6.2.2 Corpus and Annotation; 6.3 Dialogue Modeling in the Virtual Human Toolkit; 6.3.1 Classification Approach; 6.3.2 Data Processing …
Modeling Visual Minutiae: Gestures, Styles, and Temporal Patterns
SS Ginosar – 2020 – search.proquest.com
Modeling Visual Minutiae: Gestures, Styles, and Temporal Patterns. Abstract. The human visual system is highly adept at making use of the rich subtleties of the visual world such as non-verbal communication signals, style, emotion, and the fine-grained details of individuals …
From Systems to Services: Changing the Way We Conceptualize ITSs–A Theoretical Framework and Proof-of-concept
BR Colby – 2020 – scholarsarchive.byu.edu
BYU ScholarsArchive Theses and Dissertations, Brigham Young University, 2020-04-07. Author: Brice R. Colby.