Notes:
The Virtual Human Toolkit is a software platform for creating and simulating virtual human characters, including their appearance, behavior, and interactions with other characters or users. In 2016, the Virtual Human Toolkit was being used in a variety of ways, including:
- In education and training, to create virtual characters that could be used to simulate realistic scenarios and interactions for students to learn from.
- In psychology and neuroscience research, to study how people interact with and respond to virtual characters, and to test theories about human behavior and cognition.
- In entertainment and gaming, to create more realistic and immersive virtual environments and characters.
- In customer service and other business applications, to create virtual assistants and other digital characters that could interact with customers and provide information or support.
The references below discuss the use of virtual humans in various contexts, including human-computer interaction, dialogue systems, and cognitive science research. Several of them mention the Virtual Human Toolkit, a collection of tools and libraries for creating virtual human characters and facilitating their use in research and development; the Toolkit includes features for dialog and character design, nonverbal behaviors, and other aspects of social interaction. The references also cover the use of virtual humans in the creation of audio-visual prompts, the integration of virtual agents into classical task environments, and the detection of and adaptation to differences in communication style. They further discuss virtual humans in the context of chatbots, their potential use in delivering physical and emotional therapy, the importance of emotional expression in virtual agents, and the potential use of virtual humans in the examination of deception.
Virtual humans can be used to study deception by simulating social interactions in which one party (the virtual human) tries to deceive the other (typically a human participant). In these studies, researchers can manipulate variables such as the virtual human’s nonverbal behavior or the content of its statements to examine how these factors influence the human participant’s ability to detect deception. The results of these studies can help researchers understand how people detect deception in social interactions and can inform the development of more effective methods for detecting deception in real-world situations.
See also:
Virtual Human Toolkit 2010 | Virtual Human Toolkit 2011 | Virtual Human Toolkit 2012 | Virtual Human Toolkit 2013 | Virtual Human Toolkit 2014 | Virtual Human Toolkit 2015 | Virtual Human Toolkit 2017 | Virtual Human Toolkit 2018
A study of elderly people’s emotional understanding of prompts given by virtual humans
A Malhotra, J Hoey, A König, S van Vuuren – Proceedings of the 10th …, 2016 – dl.acm.org
… Our long-term aim is to build technology that will detect and adapt to these differences. Our study took place in two phases. In Phase I, described in [12, 13], we created a set of audio-visual prompts, using a virtual human developed with the USC Virtual Human Toolkit (VHT) …
Cross Modal Evaluation of High Quality Emotional Speech Synthesis with the Virtual Human Toolkit
B Potard, MP Aylett, DA Baude – International Conference on Intelligent …, 2016 – Springer
Emotional expression is a key requirement for intelligent virtual agents. In order for an agent to produce dynamic spoken content, speech synthesis is required. However, despite substantial work with pre-recorded prompts, very little work has explored the combined effect …
The benefits of virtual humans for teaching negotiation
J Gratch, D DeVault, G Lucas – International Conference on Intelligent …, 2016 – Springer
… problems. The current, wizard-of-Oz (WOz) system allows students to communicate through natural language and nonverbal expressions. CRA is implemented with the publicly-available Virtual Human Toolkit [25]. Low-level …
Human-robot interaction design using Interaction Composer eight years of lessons learned
DF Glas, T Kanda, H Ishiguro – 2016 11th ACM/IEEE …, 2016 – ieeexplore.ieee.org
… Several tools have also been developed for virtual agents, such as the NPCEditor framework in the Virtual Human Toolkit, which includes graphical tools for developing and editing question-answer dialogs, nonverbal behaviors, and other aspects of social interaction [14] …
Socially-aware animated intelligent personal assistant agent
Y Matsuyama, A Bhardwaj, R Zhao, O Romeo… – Proceedings of the 17th …, 2016 – aclweb.org
… Figure 1 shows the overview of the architecture. All modules of the system are built on top of the Virtual Human Toolkit (Hartholt et al., 2013). Main modules of our architecture are described below. 2.1 Visual and Vocal Input Analysis …
Conversational interfaces: devices, wearables, virtual agents, and robots
M McTear, Z Callejas, D Griol – The Conversational Interface, 2016 – Springer
… The Virtual Human Toolkit. The Institute for Creative Technologies (ICT) Virtual Human Toolkit is a collection of modules, tools, and libraries designed to aid and support researchers and developers with the creation of ECAs …
A Virtual Emotional Freedom Therapy Practitioner
H Ranjbartabar, D Richards – … of the 2016 International Conference on …, 2016 – dl.acm.org
… Our virtual EFT practitioner, known as EFFIE the Emotional Freedom FrIEnd, is based on Virtual Human Toolkit components [6]. As seen in Figure 1 for the system to deliver credible and effective communication in real time, we developed a dialogue engine written in C# …
What kind of stories should a virtual human swap?
SN Gilani, K Sheetz, G Lucas, D Traum – International Conference on …, 2016 – Springer
… Our agents were built using the Virtual Human Toolkit [11] … Hartholt, A., Traum, D., Marsella, S.C., Shapiro, A., Stratou, G., Leuski, A., Morency, L.-P., Gratch, J.: All together now: introducing the virtual human toolkit …
IAGO: interactive arbitration guide online
J Mell, J Gratch – Proceedings of the 2016 International Conference on …, 2016 – dl.acm.org
… real-time through the interface. We use under-development aspects of the Virtual Human Toolkit to create the virtual character on screen and simulate the animations, expressions, and backchannels. Users familiar with the …
Erica: The erato intelligent conversational android
DF Glas, T Minato, CT Ishi, T Kawahara… – 2016 25th IEEE …, 2016 – ieeexplore.ieee.org
… 1) Non-humanoid robots and virtual agents Highly realistic virtual agents have been created that can interact conversationally. The Virtual Human Toolkit [3] provides a set of tools for dialog and character design for photorealistic animated graphical avatars …
Niki and Julie: a robot and virtual human for studying multimodal social interaction
R Artstein, D Traum, J Boberg, A Gainer… – Proceedings of the 18th …, 2016 – dl.acm.org
… Figure 2). The interface runs in a web browser and sends messages using the VHMsg messaging protocol to trigger agent behaviors. The system architecture is shown in Figure 3. Behaviors of Julie, the virtual human, were created using the Virtual Human Toolkit [5], and her …
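The Artstein et al. snippet above refers to the VHMsg messaging protocol that Virtual Human Toolkit modules use to trigger agent behaviors; VHMsg is typically run over an ActiveMQ message broker. The following is a minimal sketch, not taken from the paper: it assumes a local ActiveMQ broker listening on the default STOMP port, uses the third-party stomp.py client library, and the topic name, character name, and "vrSpeak"-style verb are illustrative assumptions rather than the Toolkit's documented message vocabulary.

    # Minimal sketch (assumptions noted above): publish a VHMsg-style plain-text
    # message to an ActiveMQ broker over STOMP using the stomp.py client.
    import stomp

    # Connect to a local broker on ActiveMQ's default STOMP port.
    conn = stomp.Connection(host_and_ports=[("localhost", 61613)])
    conn.connect(wait=True)

    # VHMsg messages are plain text: the first token names the message type,
    # the remainder is the payload. The verb, character, and utterance id here
    # are hypothetical examples, not values taken from the cited system.
    message = "vrSpeak Julie user utterance-1 Hello, and welcome."
    conn.send(destination="/topic/DEFAULT_SCOPE", body=message)

    conn.disconnect()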
Chatbots’ Greetings to Human-Computer Communication
MJ Pereira, L Coheur, P Fialho, R Ribeiro – arXiv preprint arXiv …, 2016 – arxiv.org
… It should be clear that we exclude from this survey authoring platforms such as the IrisTK, the Visual SceneMaker (Gebhard et al., 2011), or the Virtual Human Toolkit (Hartholt et al., 2013), as these target multi-modal dialogue systems and not chatbots, as defined in the …
The Rise of the Conversational Interface: A New Kid on the Block?
MF McTear – International Workshop on Future and Emerging …, 2016 – Springer
… See [5] for descriptions of many of these standards. Toolkits, for example, the Virtual Human Toolkit and ACE (Articulated Communicator Engine). For more detailed descriptions of ECAs, see [14], especially Chaps. 13–16 …
Dialport: Connecting the spoken dialog research community to real user data
T Zhao, K Lee, M Eskenazi – 2016 IEEE Spoken Language …, 2016 – ieeexplore.ieee.org
… Popular frameworks from academia include VoiceXML [2], CSLU toolkit [3], Olympus [4], Trindikit [5], Opendial [6], Virtual Human Toolkit [7] and many others. Recently several enterprise-level services have been released from industry …
Interviewing suspects with avatars: Avatars are more effective when perceived as human
S Ströfer, EG Ufkes, M Bruijnes, E Giebels… – Frontiers in …, 2016 – frontiersin.org
It has been consistently demonstrated that deceivers generally can be discriminated from truth tellers by monitoring an increase in their physiological response. But is this still the case when deceivers interact with a virtual avatar? The present research investigated whether the mere …
Rapid low-cost virtual human bootstrapping via the crowd
M Borish, B Lok – ACM Transactions on Intelligent Systems and …, 2016 – dl.acm.org
… [2013]. In this work, Hartholt et al. describes the Virtual Human Toolkit for the quick assembling of all major components of a VH such as corpus, visualization, and animation. All of these attempts at standardization and streamlining share one common detail …
A Design Proposition for Interactive Virtual Tutors in an Informed Environment
J Taoum, B Nakhal, E Bevacqua, R Querrec – International Conference on …, 2016 – Springer
… The BehaviorPlanner and BehaviorRealizer classes are abstract, which means that we can propose several implementations according to the ECA platform that we want to use (Greta [17], Virtual Human Toolkit [18], MARC [19], etc.) …
Synthesizing realistic image-based avatars by body sway analysis
M Nishiyama, T Miyauchi, H Yoshimura… – Proceedings of the Fourth …, 2016 – dl.acm.org
… 7. A. Hartholt, D. Traum, S.C. Marsella, A. Shapiro, G. Stratou, A. Leuski, L.-P. Morency, and J. Gratch. 2013. All Together Now: Introducing the Virtual Human Toolkit. In Proceedings of the 13th International Conference on Intelligent Virtual Agents (IVA). 368–381 …
Design and Study of Emotions in Virtual Humans for Assistive Technologies
A Malhotra – 2016 – uwspace.uwaterloo.ca
… a set of audio-visual prompts, using a Virtual Human Toolkit (VHT) developed by the University of Southern California’s Institute for Creative Technologies. We built a set of six audio-visual prompts (for … Virtual Human Toolkit (https://vhtoolkit.ict.usc.edu) …
New dimensions in testimony demonstration
R Artstein, A Gainer, K Georgila, A Leuski… – Proceedings of the …, 2016 – aclweb.org
… The system is built on top of the components from the USC ICT Virtual Human Toolkit, which is publicly available (Hartholt et al., 2013). Specifically, we use the AcquireSpeech tool for capturing the user’s speech, CMU PocketSphinx and Google Chrome ASR tools for …
Virtual human technologies for cognitively-impaired older adults’ care: the LOUISE and Virtual Promenade experiments
P Wargnier – 2016 – pastel.archives-ouvertes.fr
Agent-Based Practices for an Intelligent Tutoring System Architecture
K Brawner, G Goodwin, R Sottilare – International Conference on …, 2016 – Springer
… Agents will be needed to broker the needs of systems like GIFT and the potential catalogue of services available in the larger learning ecosystem. As an example, the Virtual Human Toolkit (VHTk) provides the functionality to model the various aspects of virtual humans …
Establish Trust and Express Attitude for a Non-Humanoid Robot.
M Si, JD McDaniel – CogSci, 2016 – researchgate.net
… for driving the robot’s movements during the interaction. In the condition which included a virtual face on Baxter’s display screen, we utilized the Virtual Human Toolkit (VHT) (Hartholt et al., 2013). We used the face of a female …
Time-Offset Conversations on a Life-Sized Automultiscopic Projector Array
A Jones, J Unger, K Nagano, J Busch… – Proceedings of the …, 2016 – cv-foundation.org
… topic. Message passing between speech recognizer, dialog manager, and video player is based on the publicly available virtual human toolkit [6] … Gratch. All Together Now: Introducing the Virtual Human Toolkit …
Automated Questions for Chat Dialogues with a Student Office Virtual Agent
M Muron, K Jokinen – workshop.colips.org
… The work uses the Virtual Human Toolkit, and the main motivation has been to provide a simple and useful way to automatically transform information from websites into a dialogue with a virtual human. Keywords: Virtual Agent, Chat Dialogues, Web-based WikiTalk …
Intelligent Virtual Agents: 16th International Conference, IVA 2016, Los Angeles, CA, USA, September 20–23, 2016, Proceedings
D Traum, W Swartout, P Khooshabeh, S Kopp… – 2016 – books.google.com
… Cross Modal Evaluation of High Quality Emotional Speech Synthesis with the Virtual Human Toolkit ….. 190 Blaise …
project SENSE–Multimodal Simulation with Full-Body Real-Time Verbal and Nonverbal Interactions
H Miri, J Kolkmeier, PJ Taylor, R Poppe… – … Conference on Intelligent …, 2016 – Springer
… The Virtual Human Toolkit (VHTK) is a collection of modules, tools, and libraries, to allow for the creation of virtual conversational characters [8]. We fused the VHTK with Oculus Rift and Xsens MVN Awinda system, in order to augment the nonverbal method of examining …
A virtual emotional freedom practitioner to deliver physical and emotional therapy
H Ranjbartabar – 2016 – researchonline.mq.edu.au
… behaviour generation (DeVault et al., 2014). Simsensei Kiosk is based on the Virtual Human Toolkit (VHToolkit) architecture (Hartholt et al., 2013) as illustrated in Figure 5. It has two main functions, a virtual human called Ellie, who …
The Advanced Exploitation of Mixed Reality (AEMR) Community of Interest
MO Rodas, J Waters, D Rousseau – International Conference on Human …, 2016 – Springer
… This includes innovative software systems such as the Virtual Human Toolkit by the Institute for Creative Technologies (ICT) at University of Southern California (USC) [11], which enables the creation of nuanced virtual characters who move, talk, and interact in true to life forms …
Avatars 4 all: an avatar generation toolchain
M Sili, E Broneder, M Morandell, C Mayer – Proceedings of the 10th EAI …, 2016 – dl.acm.org
… One example for such a solution is the “Virtual Human Toolkit” provided by the University of Southern California [7]. The toolkit offers a large number of modules and features and it is available free of charge for academic research …
Westminster Serious Games Platform (wmin-SGP) a tool for real-time authoring of roleplay simulations for learning
D Economou, I Doumanis, F Pedersen… – … on Future Intelligent …, 2016 – clok.uclan.ac.uk
… 3.1 wmin-SGP development tools The wmin-SGP platform has been developed using Unity 3D game platform [58] coupled with the ICT Virtual Human toolkit [22]. The latter is an open-source collection of modules, tools and libraries that facilitates the creation of embodied VHs …
The dialport portal: Grouping diverse types of spoken dialog systems
T Zhao, K Lee, M Eskenazi – Workshop on Chatbots and …, 2016 – workshop.colips.org
… Popular frameworks from academia include VoiceXML [10], the CSLU toolkit [16], Olympus [2], Trindikit [8], Opendial [7], the Virtual Human Toolkit [5], and many others. Recently several services have been released from industry …
Affective computing as complex systems science
W Lee, MD Norman – Procedia Computer Science, 2016 – Elsevier
… Fig. 1: Visual description of proposed methodology: integrating complex systems based modelling with appraisal … with other applications (e.g., Virtual Human Toolkit) is also no easy task …
Design a simulated multimedia enriched immersive learning environment (SMILE) for nursing care of dementia patient
S Roomkham – 2016 – eprints.qut.edu.au
Interviewing Suspects with Avatars
S Ströfer, EG Ufkes, M Bruijnes, E Giebels… – DECEPTIVE INTENT – ris.utwente.nl
Chapter 4: Interviewing Suspects with Avatars. Avatars are more effective when perceived as human. This chapter is based on: Ströfer, S., Ufkes, E.G., Bruijnes, M., Giebels, E., & Noordzij, M.L. (submitted). Interviewing …
Geometric Energies for Haptic Assembly
M Behandish – 2016 – opencommons.uconn.edu
Rendering for Automultiscopic 3D Displays
A Jones – 2016 – search.proquest.com
Abstract: While a great deal of computer generated imagery is modelled and rendered in three dimensions, the vast majority of this 3D imagery is shown on two-dimensional displays …
Improving visual speech synthesis using Decision Tree Models
CF Rademan – 2016 – scholar.sun.ac.za
Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience: 10th International Conference, AC 2016, Held as Part of HCI …
DD Schmorrow, CM Fidopiastis – 2016 – books.google.com