Human-Robot Dialog 2018


Notes:

ADE (Agent Development Environment) contains components of the DIARC (Distributed Integrated Affect, Reflection, and Cognition) architecture.

  • Human-robot dialog
  • Human-robot dialogue

Resources:

Wikipedia:

See also:

100 Best Robot Speech Recognition Videos


"Thank you for sharing that interesting fact!": Effects of capability and context on indirect speech act use in task-based human-robot dialogue
T Williams, D Thames, J Novakoff… – Proceedings of the 2018 …, 2018 – dl.acm.org
Naturally interacting robots must be able to understand natural human speech. As such, recent work has sought to allow robots to infer the intentions behind commonly used non-literal utterances such as indirect speech acts (ISAs). However, it is still unclear to what …

Balancing Efficiency and Coverage in Human-Robot Dialogue Collection
M Marge, C Bonial, S Lukin, C Hayes, A Foots… – arXiv preprint arXiv …, 2018 – arxiv.org
We describe a multi-phased Wizard-of-Oz approach to collecting human-robot dialogue in a collaborative search and navigation task. The data is being used to train an initial automated robot dialogue system to support collaborative exploration tasks. In the first phase, a wizard …

Continually improving grounded natural language understanding through human-robot dialog
JD Thomason – 2018 – repositories.lib.utexas.edu
As robots become ubiquitous in homes and workplaces such as hospitals and factories, they must be able to communicate with humans. Several kinds of knowledge are required to understand and respond to a human’s natural language commands and questions. If a …

Experiments in proactive symbol grounding for efficient physically situated human-robot dialogue
J Arkin, TM Howard – Late-breaking Track at the SIGDIAL Special …, 2018 – robodial.github.io
Real-time performance of human-robot dialogue is important for making the interaction effective and worthwhile. Contemporary approaches treat language understanding and generation as reactive processes that construct a new inference or inverse-semantics model …

Improved Models and Queries for Grounded Human-Robot Dialog
A Padmakumar – cs.utexas.edu
The ability to understand and communicate in natural language can make robots much more accessible for naive users. Environments such as homes and offices contain many objects that humans describe in diverse language referencing perceptual properties. Robots …

Simultaneous Intention Estimation and Knowledge Augmentation via Human-Robot Dialog
S Bajracharya, S Amiri, J Thomason, S Zhang – cs.binghamton.edu
Robots have been able to interact with humans using natural language, and to identify service requests through human-robot dialog. However, few robots are able to improve their language capabilities from this experience. In this paper, we develop a dialog agent for …

Consequences and Factors of Stylistic Differences in Human-Robot Dialogue
SM Lukin, KA Pollard, C Bonial, M Marge… – arXiv preprint arXiv …, 2018 – arxiv.org
This paper identifies stylistic differences in instruction-giving observed in a corpus of human-robot dialogue. Differences in verbosity and structure (i.e., single-intent vs. multi-intent instructions) arose naturally without restrictions or prior guidance on how users should …

Faster Pace in Human-Robot Dialogue Leads to Fewer Dialogue Overlaps
C Henry, C Gordon, D Traum, SM Lukin, KA Pollard… – cassidyhenry.com
In this paper, dialogue overlap at the transaction unit structure level is examined. In particular, we investigate a corpus of multi-floor dialogue in a human-robot navigation domain. Two conditions are contrasted: a human wizard typing with a keyboard vs. using a …

Engagement Recognition based on Multimodal Behaviors for Human-Robot Dialogue
K Inoue – 2018 – repository.kulib.kyoto-u.ac.jp
It is becoming a reality for humans to interact with robots in daily life. Conversational robots are expected to be involved in human society in a symbiotic manner by playing a specific social role. The goal of this study is to realize natural and smooth dialogue between humans …

Multi-Modal Robot Apprenticeship: Imitation Learning Using Linearly Decayed DMP+ in a Human-Robot Dialogue System
Y Wu, R Wang, LF D’Haro, RE Banchs… – 2018 IEEE/RSJ …, 2018 – ieeexplore.ieee.org
Robot learning by demonstration gives robots the ability to learn tasks which they have not been programmed to do before. The paradigm allows robots to work in a greater range of real-world applications in our daily life. However, this paradigm has traditionally been …

The Bot Language Project: Moving Towards Natural Dialogue with Robots
C Henry, S Lukin, KA Pollard, C Bonial, A Foots… – cassidyhenry.com
… 2017. Laying Down the Yellow Brick Road: Development of a Wizard-of-Oz Interface for Collecting Human-Robot Dialogue. Proc … 2017. Towards Efficient Human-Robot Dialogue Collection: Moving Fido into the Virtual World. Vancouver, Canada …

Toward ethical natural language generation for human-robot interaction
T Williams – Companion of the 2018 ACM/IEEE International …, 2018 – dl.acm.org
… 2013. Toward Information Theoretic Human-Robot Dialog. Robotics 32 (2013), 409–417 … 2017. Resolution of Referential Ambiguity in Human-Robot Dialogue Using Dempster-Shafer Theoretic Pragmatics. In Proceedings of Robotics: Science and Systems …

Evaluating robot behavior in response to natural language
P Moolchandani, CJ Hayes, M Marge – Companion of the 2018 ACM …, 2018 – dl.acm.org
… in a series of tasks. We conducted a study with 21 volunteers that analyzed how a virtual robot behaved when executing eight navigation instructions from a corpus of human-robot dialogue. Initial findings suggest that movement …

Turn-Taking Strategies for Human-Robot Peer-Learning Dialogue
R Das, H Pon-Barry – Proceedings of the 19th Annual SIGdial Meeting …, 2018 – aclweb.org
… These two goals are also relevant to human-robot dialogue with a teachable robot: a robot who acts as a peer to a student and prompts the student to teach them the material (Jacq et al., 2016; Lubold et al., 2018b) … 5 Comparison with Human-Robot Dialogue Interaction …

Scoutbot: A dialogue system for collaborative navigation
SM Lukin, F Gervits, CJ Hayes, A Leuski… – arXiv preprint arXiv …, 2018 – arxiv.org
… It is trained on human-robot dialogue collected from Wizard-of-Oz experiments, where robot responses were initiated by a human wizard in previous interactions … 2016. Assessing Agreement in Human-Robot Dialogue Strategies: A Tale of Two Wizards. In Proc …

Reading with Robots: Towards a Human-Robot Book Discussion System for Elderly Adults
N Parde – Thirty-Second AAAI Conference on Artificial …, 2018 – aaai.org
… facilitate such exercise. In this thesis, I propose the first human-robot dialogue system designed specifically to promote cognitive exercise in elderly adults, through discussions about interesting metaphors in books. I describe …

Effects of Gender Stereotypes on Trust and Likability in Spoken Human-Robot Interaction
M Kraus, J Kraus, M Baumann, W Minker – Proceedings of the Eleventh …, 2018 – aclweb.org
… Section 3 deals with the integration of gender stereotypes in spoken human-robot dialogue using the humanoid robot Aldebaran NAO. Here … traits. 3. Integration of Gender Stereotypes in Spoken Human-Robot Dialogue For …

HRI Design Research for Intelligent Household Service Robots: Teler as a Case Study
S Zhang, J Qin, S Cao, J Dou – … of Design, User Experience, and Usability, 2018 – Springer
… information architecture in HRI design. So, in this paper we analyze the HRI relationships and initiative human-robot interaction, at the same time, we take human-robot dialogue mechanism into consideration. The paper analyzed the …

Using Deep Learning and an External Knowledge Base to Develop Human-Robot Dialogues
JY Huang, TA Lin, WP Lee – 2018 IEEE International …, 2018 – ieeexplore.ieee.org
… 2) to perform human-robot dialogue, in which the LSTM contains memory blocks in the recurrent hidden layer that can store the temporal … [3] J. Thomason, S. Zhang, R. Mooney, P. Stone, “Learning to interpret natural language commands through human-robot dialog,” in: Proc …

The Case for Systematically Derived Spatial Language Usage
B Dorr, C Voss – Proceedings of the First International Workshop on …, 2018 – aclweb.org
… The emphasis of this position paper is on the representational underpinnings of spatial expressions for problems such as natural-language mediated two-way human-robot dialogue … 2018. Human-robot dialogue and collaboration in search and navigation …

Interaction and Autonomy in RoboCup@ Home and Building-Wide Intelligence
J Hart, H Yedidsion, Y Jiang, N Walker, R Shah… – arXiv preprint arXiv …, 2018 – arxiv.org
… [Thomason et al. 2015] Thomason, J.; Zhang, S.; Mooney, R.; and Stone, P. 2015. Learning to interpret natural language commands through human-robot dialog … Jointly improving parsing and perception for natural language commands through human-robot dialog …

How We Talk with Robots: Eliciting Minimally-Constrained Speech to Build Natural Language Interfaces and Capabilities
KA Pollard, SM Lukin, M Marge… – Proceedings of the …, 2018 – journals.sagepub.com
… to, use the system. A core contribution of our work has been the creation of a body (corpus) of minimally-constrained human-robot dialogue drawn from participant interactions with a robot teammate. Our approach is derived …

"Who Should I Run Over?": Long-term ethical implications of natural language generation
T Williams – Proceedings of the 2018 HRI workshop on …, 2018 – researchgate.net
… 2013. Toward Information Theoretic Human-Robot Dialog. Robotics 32 (2013), 409–417 … 2017. Resolution of Referential Ambiguity in Human-Robot Dialogue Using Dempster-Shafer Theoretic Pragmatics. In Proceedings of Robotics: Science and Systems.

SMILEE: Symmetric multi-modal interactions with language-gesture enabled (AI) embodiment
S Kim, D Salter, L DeLuccia, K Son, MR Amer… – Proceedings of the …, 2018 – aclweb.org
… 2009. Evaluating description and reference strategies in a cooperative human-robot dialogue system. IJCAI … 2014. Back to the blocks world: Learning new actions through situated human-robot dialogue. In SIGDIAL Conference …

Effect of Explicit Emotional Adaptation on Prosocial Behavior of Humans towards Robots depends on Prior Robot Experience
B Kühnlenz, K Kühnlenz, F Busse… – 2018 27th IEEE …, 2018 – ieeexplore.ieee.org
… human-like facial Action Units. A human-robot dialog scenario is chosen using NAO pretending to work for a supermarket and involving humans providing object names to the robot for training purposes. In a user study, two …

Interactively picking real-world objects with unconstrained spoken language instructions
J Hatori, Y Kikuchi, S Kobayashi… – … on Robotics and …, 2018 – ieeexplore.ieee.org
… of natural language instructions has received attention in the field of robotics [7]–[10], our work is the first to propose a comprehensive system integrating the process of interactive clarification while supporting unconstrained spoken instructions through human–robot dialogue …

Dialogue Structure Annotation for Multi-Floor Interaction
D Traum, C Henry, S Lukin, R Artstein… – Proceedings of the …, 2018 – aclweb.org
… In the next section, we present the annotation scheme. This is applied to a corpus of human-robot dialogue (section 3), and the scheme is shown to have high inter-annotator reliability (section 4). Our objectives for the annotation are two-fold …

The SMOOTH Robot: Design for a Novel Modular Welfare Robot
WK Juel, F Haarslev, K Fischer… – … Workshop on Elderly …, 2018 – manoonpong.com
… The important aspect is motivating the elderly to drink. This involves human-robot dialog, and the technology developed can also be transferred to other contexts, such as receptions at conferences or celebrations. Use case 3 (fig …

Vibrational Artificial Subtle Expressions: Conveying System’s Confidence Level to Users by Means of Smartphone Vibration
T Komatsu, K Kobayashi, S Yamada… – Proceedings of the …, 2018 – dl.acm.org
… speech sounds [17]. ASEs can be implemented not only as auditory ASEs but also as visual or motion ASEs [8,13,31]. For example, Funakoshi et al. [8] proposed visual ASEs for human-robot dialogue. An LED was implemented …

Proceedings of the 19th Annual SIGdial Meeting on Discourse and Dialogue
K Komatani, D Litman, K Yu, A Papangelis… – Proceedings of the 19th …, 2018 – aclweb.org
… 99 Consequences and Factors of Stylistic Differences in Human-Robot Dialogue Stephanie Lukin, Kimberly Pollard, Claire Bonial, Matthew Marge, Cassidy Henry, Ron Artstein, David Traum and Clare Voss …

User affect and no-match dialogue scenarios: an analysis of facial expression
JB Wiggins, M Kulkarni, W Min, KE Boyer… – Proceedings of the 4th …, 2018 – dl.acm.org
… community. For example, multimodal considerations are central in the design of human-robot dialogue interactions [26] and for dialogue systems that engage in interviews [8], training [35], and interactions with children [42] …

Perception-enhancement based task learning and action scheduling for robotic limb in CPS environment
S Li, M Shi, R Huang, X Chen, G Pan – Future Generation Computer …, 2018 – Elsevier
… By combining speech recognition and web access, a smart approach for robotic task learning from human–robot dialog and web access is applied to collaborative robots [2], which improves understanding of human commands …

Modeling and Development of a Spoken Natural Language Interface for Autonomous Robot Interaction
AS Brandão, AG Caldeira – … and 2018 Workshop on Robotics in …, 2018 – ieeexplore.ieee.org
… There are various applications for robotics such as control – for general mobile robots [5], [6], [7], prosthetics and robot arms [8], [9], robotic wheelchairs [10] – as well as human-robot dialogue, security and surveillance robots [11], home automation [12], in-car recognition and …

Analysis of Turn-Taking in the Slovak Interview Corpus
S Ondáš, J Juhár – 2018 16th International Conference on …, 2018 – ieeexplore.ieee.org
… It was interesting to observe that even untrained listeners were able to predict the end of the utterance very effectively in the case of so-called "predictable sentences". This experience leads us to think about anticipation in the case of human-human and human-robot dialogue interactions …

Identification of Cognitive Navigational Commands for Mobile Robots Based on Hand Gestures and Vocal Commands
HMRT Bandara, BMSS Basnayake… – 2018 2nd …, 2018 – ieeexplore.ieee.org
… However, the system does not consider uncertainties in human gestures and vocal commands. Furthermore, [17] proposed a method to extract linguistic spatial descriptions and other spatial information from an evidence grid map for use in natural human-robot dialogue …

Representational Learning in Conversational Agents
S Savić, M Gnjatović, D Mišković, B Borovac – researchgate.net
… hotels. III. REPRESENTATIONAL LEARNING IN HUMAN-MACHINE INTERACTION Appropriate modelling of a dialogue domain is considered fundamental for successful human-robot dialogue management [17, 18]. In addition …

Towards Intelligent Arbitration of Diverse Active Learning Queries
K Bullard, AL Thomaz… – 2018 IEEE/RSJ …, 2018 – ieeexplore.ieee.org
… low-level action controllers [10], [11], [12], an optimal policy towards the end of imitating a human demonstrator’s behavior [4], [5], grounding of goal state symbols [7], inferring task sequencing constraints [8], and retrieval of objects by the use of curiosity in human-robot dialog [13 …

Augmented, mixed, and virtual reality enabling of robot deixis
T Williams, N Tran, J Rands, NT Dantam – International Conference on …, 2018 – Springer
… In: Proceedings of the 10th International Conference on Natural Language Generation (2017). 14. Williams, T., Scheutz, M.: Resolution of referential ambiguity in human-robot dialogue using Dempster-Shafer theoretic pragmatics …

Neural networks for recognizing human activities in home-like environments
FJ Rodriguez Lera, FM Rico… – Integrated Computer …, 2018 – content.iospress.com
… with the robot. Aspects such as environmental location and human-robot dialog have traditionally been used to infer the current situation, which can be helpful in the decision making process of any robot. However, the acoustic …

Anticipation in speech-based human-machine interfaces
S Ondáš, J Juhár, E Kiktová… – 2018 9th IEEE …, 2018 – ieeexplore.ieee.org
… very effectively in the case of so-called "predictable sentences". This experience leads us to think about anticipation in the case of human-human and human-robot dialogue interactions. There exist several works that confirm the human …

A multi-modal human robot interaction framework based on cognitive behavioral therapy model
N Rastogi, F Keshtkar, MS Miah – … of the Workshop on Human-Habitat …, 2018 – dl.acm.org
… extents. To summarize our primary goal: this paper introduces a multi-modal human-robot dialogue interaction system for automatic depression detection (ADD) based on the cognitive-behavior-therapy (CBT) process …

Visual Referring Expression Recognition: What Do Systems Actually Learn?
V Cirik, LP Morency, T Berg-Kirkpatrick – arXiv preprint arXiv:1805.11818, 2018 – arxiv.org
… 2014. Collaborative effort towards common ground in situated human-robot dialogue. In Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction. ACM, pages 33–40. Danqi Chen, Jason Bolton, and Christopher D. Manning. 2016a …

Dialogue Behavior Control Model for Expressing a Character of Humanoid Robots
K Yamamoto, K Inoue, S Nakamura… – 2018 Asia-Pacific …, 2018 – ieeexplore.ieee.org
… scenarios. The utterances of the user are made by the experimenter. These utterances are spoken in Japanese. Two scenarios are prepared with reference to a human-robot dialogue corpus recorded in a WOZ setting [15], [16]. The …

A conversational user interface for supporting individual and group decision-making in stock investment activities
MH Wen – 2018 IEEE International Conference on Applied …, 2018 – ieeexplore.ieee.org
… In academics, aside from considering how to promote content cognition in human–robot dialogue from the perspective of engineers, some researchers are committed to understanding human–computer interaction performance from the aspect of user experience …

Augmenting Robot Knowledge Consultants with Distributed Short Term Memory
T Williams, R Thielstrom, E Krause… – … Conference on Social …, 2018 – Springer
… (2017). 36. Williams, T., Acharya, S., Schreitter, S., Scheutz, M.: Situated open world reference resolution for human-robot dialogue … Williams, T., Scheutz, M.: Resolution of referential ambiguity in human-robot dialogue using Dempster-Shafer theoretic pragmatics …

A Bayesian Analysis of Moral Norm Malleability during Clarification Dialogues.
T Williams, RB Jackson, J Lockshin – CogSci, 2018 – researchgate.net
… (2013). Toward information theoretic human-robot dialog. Robotics, 32, 409–417. Traum, DR (1994) … Williams, T., & Scheutz, M. (2017). Resolution of referential ambiguity in human-robot dialogue using Dempster-Shafer theoretic pragmatics. In Proceedings of RSS.

Incrementally learning semantic attributes through dialogue interaction
A Vanzo, JL Part, Y Yu, D Nardi, O Lemon – Proceedings of the 17th …, 2018 – dl.acm.org
… Conversely, in [11] clarification dialogues are used to support the mapping process. Such an approach is further extended in [27] to create conceptual representations of indoor environments which are used in human-robot dialogue …

PREC 2018: Personal Robots for Exercising and Coaching
S Schneider, B Wrede, C Cifuentes… – Companion of the 2018 …, 2018 – dl.acm.org
… Since rejoining the Applied Informatics Group she has been working on human-robot dialog modeling, emotion recognition and modeling in HRI, developmentally inspired speech acquisition approaches, visual attention modeling, the analysis of tutoring behavior towards …

Pictobot: A Cooperative Painting Robot for Interior Finishing of Industrial Developments
E Asadi, B Li, IM Chen – IEEE Robotics & Automation Magazine, 2018 – ieeexplore.ieee.org
… Lee [8] introduces case studies on glazing robot technology for installing glass panels on construction sites. A human–robot dialogue system [9] has been developed in joint action science and technology to solve a construction task collaboratively with a human …

Human Capacity—Biopsychosocial Perspective
B Xing, T Marwala – Smart Maintenance for Human–Robot Interaction, 2018 – Springer
… spectrum: Query 11.1: How can dialogue policy of a human-robot dialogue system be better optimised so as to enhance its adaptability in terms of different user profiles? 11.2.1 Reinforcement Learning in Addressing Query 11.1. In …

AI, The Persona, and Rights
T Shepherd – A Networked Self and Human Augmentics, Artificial …, 2018 – taylorfrancis.com
… In particular, Sarah was created to test whether “reference to shared memories and shared friends in human-robot dialogue” would help support “more meaningful and sustainable relationships” between humans and robots (Mavridis et al., 2009 cited in 2011, p. 294) …

Real-Time Human-Robot Communication for Manipulation Tasks in Partially Observed Environments
J Arkin, R Paul, D Park, S Roy, N Roy… – International Symposium …, 2018 – rohanpaul.in
… References 1. J. Arkin and TM Howard. Experiments in proactive symbol grounding for efficient physically situated human-robot dialogue … 9. T. Kollar, V. Perera, D. Nardi, and M. Veloso. Learning environmental knowledge from task-based human-robot dialog. In Proc …

Dialogue Models for Socially Intelligent Robots
K Jokinen – International Conference on Social Robotics, 2018 – Springer
… 2, presents a dialogue architecture for the robot agent in Sect. 3, and concludes in Sect. 4. The theoretical approach is complemented by sketching a real application, and demonstration. 2 Human-Robot Dialogue Interaction …

Asking for Help Effectively via Modeling of Human Beliefs
T Kessler Faulkner, S Niekum, A Thomaz – Companion of the 2018 ACM …, 2018 – dl.acm.org
… 2014. Collaborative effort towards common ground in situated human-robot dialogue. In Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction. ACM, 33–40. http://dl.acm.org/citation.cfm?id=2559677 [2] Naoto Iwahashi. 2006 …

Generation of Head Motion During Dialogue Speech, and Evaluation in Humanoid Robots
CT Ishi, C Liu, H Ishiguro – Geminoid Studies, 2018 – Springer
… We plan to evaluate the effect of this face-up motion in other types of robots in the future. 7.6 Evaluation of Head Motion and Eye Gazing During Human–Robot Dialogue Interaction … 2012. Generation of nodding, head tilting and eye gazing for human-robot dialogue interaction …

Towards Learning User Preferences for Remote Robot Navigation
CJ Hayes, M Marge, E Stump, C Bonial… – Proceedings of the …, 2018 – ece.rochester.edu
… III. HUMAN-ROBOT DIALOGUE EXPERIMENTS … Laying down the yellow brick road: Development of a Wizard-of-Oz interface for collecting human-robot dialogue. AAAI Fall Symposium Series: Natural Communication for Human-Robot Collaboration, 2017 …

Planning to Give Information in Partially Observed Domains with a Learned Weighted Entropy Model
R Chitnis, LP Kaelbling, T Lozano-Pérez – CoRR, 2018 – rohanchitnis.com
… IEEE, 1997. [3] R. Deits, S. Tellex, P. Thaker, D. Simeonov, T. Kollar, and N. Roy. Clarifying commands with information-theoretic human-robot dialog … [4] S. Tellex, P. Thaker, R. Deits, D. Simeonov, T. Kollar, and N. Roy. Toward information theoretic human-robot dialog …

Learning What Information to Give in Partially Observed Domains
R Chitnis, LP Kaelbling, T Lozano-Pérez – arXiv preprint arXiv:1805.08263, 2018 – arxiv.org
… IEEE, 1997. [3] R. Deits, S. Tellex, P. Thaker, D. Simeonov, T. Kollar, and N. Roy. Clarifying commands with information-theoretic human-robot dialog … [4] S. Tellex, P. Thaker, R. Deits, D. Simeonov, T. Kollar, and N. Roy. Toward information theoretic human-robot dialog …

Advanced Mechanical Science and Technology for the Industrial Revolution 4.0
L Yao, S Zhong, H Kikuta, JG Juang, M Anpo – 2018 – Springer

Language to Action: Towards Interactive Task Learning with Physical Agents.
JY Chai, Q Gao, L She, S Yang, S Saba-Sadiya, G Xu – AAMAS, 2018 – ijcai.org
… In The 9th ACM/IEEE International Conference on Human-Robot Interaction, Bielefeld, Germany, 2014. [Chai et al., 2016] JY Chai, R. Fang, C. Liu, and L. She. Collaborative language grounding towards situated human robot dialogue. AI Magazine, 37(4):32–45, 2016 …

Robotics collaborative technology alliance (RCTA) program overview
SH Young, DG Patel – Unmanned Systems Technology XX, 2018 – spiedigitallibrary.org
… achieving several key autonomous robotic capabilities such as high-speed perception and mobility in rough terrain, situation awareness in unstructured environment, collaborative human-robot mission planning and execution, multimodal human-robot dialogue, and dexterous …

KTH Tangrams: A Dataset for Research on Alignment and Conceptual Pacts in Task-Oriented Dialogue
T Shore, T Androulakaki, G Skantze – … International Conference on …, 2018 – diva-portal.org
Postprint: the accepted version of a paper presented at the Eleventh International Conference on Language Resources and Evaluation (LREC 2018). Citation for the original published paper …

Learning Instructor Expectations in ITL Agent Interaction
P Ramaraj – 2018 – soar.eecs.umich.edu
Slides: Learning Instructor Expectations in ITL Agent Interaction, Preeti Ramaraj, May 2018. Motivation cites Clark, HH, & Brennan, SE (1991). Grounding in communication. Perspectives on socially shared cognition, 13(1991), 127–149 …

Relieving operators’ workload: Towards affective robotics in industrial scenarios
CT Landi, V Villani, F Ferraguti, L Sabattini, C Secchi… – Mechatronics, 2018 – Elsevier
… aerospace, edutainment and entertainment, home service, military and industrial applications [1], [2]. Robots can help humans in relieving physical effort tasks, carrying heavy loads and conducting repetitive tasks: in [3] authors describe a human-robot dialogue system that …

Grounded Language Learning: Where Robotics and NLP Meet.
C Matuszek – IJCAI, 2018 – iral.cs.umbc.edu
… rich existing corpus of research on learning. The first two suggested research directions directly address this question, while the third depends on solving it. Human-Robot Dialog for Active Learning One way of improving efficiency is to make use of active learning …

Text-Based Inference of Object Affordances for Human-Robot Interaction
M Persiani, T Hellström – idiap.ch
… The relevance of using corpora like YAMC to generate affordances for HRI has to be investigated further. The difference between usage of language in human-robot dialogue and in general corpora may affect accuracy, and alternative corpora could be considered …

User Input-Based Construction of Personal Knowledge Graphs
X Sun, S Zhang – International Conference on Applied Human Factors …, 2018 – Springer
… In: Robotics and Automation 2005, ICRA 2005, April 2005Google Scholar. 8. Kollar, T., Perera, V., Nardi, D., Veloso, M.: Learning environmental knowledge from task-based human-robot dialog. In: 2013 IEEE International Conference on Robotics and Automation (ICRA), pp …

POMDP-based coding of child–robot interaction within a robot-assisted ASD diagnostic protocol
F Petric, D Miklić, Z Kovačić – International Journal of Humanoid …, 2018 – World Scientific
… system navigating in unknown environments.17 Furthermore, researchers utilize POMDPs to improve robot plans for information gathering in unknown environments18 and to improve teaching strategies.19,20 POMDPs are also used to model human–robot dialogue21 and …

Reference in robotics: A givenness hierarchy theoretic approach
T Williams, M Scheutz – The Oxford handbook of …, 2018 – pdfs.semanticscholar.org
Reference in Robotics: A Givenness Hierarchy Theoretic Approach. Tom Williams and Matthias Scheutz, Human-Robot Interaction Laboratory, Tufts University, Medford, MA. 1 Introduction …

Annotating Reflections for Health Behavior Change Therapy
N Guntakandla, N Rodney – … of the Eleventh International Conference on …, 2018 – aclweb.org
… Unpublished manual. Nielsen, RD, Voyles, RM, Bolanos, D., Mahoor, MH, Pace, WD, Siek, KA, and Ward, WH (2010). A platform for human-robot dialog systems research. In AAAI Fall Symposium: Dialog with Robots. Pérez …

A comparison of visualisation methods for disambiguating verbal requests in human-robot interaction
E Sibirtseva, D Kontogiorgos, O Nykvist… – 2018 27th IEEE …, 2018 – ieeexplore.ieee.org
… Establishing language grounding, particularly in situated human-robot dialogue, can be challenging. Robots need to perceive human behaviour and build internal representations and spatial semantic understanding based on human intentions [23] …

THE DESIGN OF FINGER GESTURES VOCABULARY.
A Tsagaris – Academic Journal of Manufacturing …, 2018 – search.ebscohost.com
… 699–702). New York: ACM Press. Bodiroža, Saša, Stern, Helman I., and Edan, Yael (2012). Dynamic Gesture Vocabulary Design for Intuitive Human-Robot Dialog, HRI'12, March 5–8, Boston, Massachusetts, USA. Chang, MS, Kwak, SD, Kang, SM (2011) …

Photo-Realistic Blocksworld Dataset
M Asai – arXiv preprint arXiv:1812.01818, 2018 – arxiv.org
… She, L., Yang, S., Cheng, Y., Jia, Y., Chai, J., & Xi, N. (2014). Back to the blocks world: Learning new actions through situated human-robot dialogue. In Proceedings of the 15th Annual Meeting of the Special Interest Group on Discourse and Dialogue (SIGDIAL), pp. 89–97 …

Installation and Control of Building Automation Systems Using Human-Robot-Interaction
M Ohlenbusch, NFH Bartner, S Vöge… – … on Methods & …, 2018 – ieeexplore.ieee.org
… In general, the choice to use a human-like robot as dialog partner was rated reasonable. Additionally, the visual appearance of the human robot dialog partner was perceived as pleasant, which may influence the dialog process in a positive way …

A survey on deep learning toolkits and libraries for intelligent user interfaces
J Zacharias, M Barz, D Sonntag – arXiv preprint arXiv:1803.04818, 2018 – arxiv.org
… Jiang et al. [32] present an algorithm for learning new object classes and corresponding relations in a human-robot dialogue using CNN-based features. The relations are used to reason about future scenarios where known faces and objects are recognised. Cognolato et al …

Action verb corpus
S Gross, M Hirschmanner, B Krenn… – Proceedings of the …, 2018 – aclweb.org
… A speech and gesture spatial corpus in assisted living. In LREC, pages 2351–2354. Foster, ME, Bard, EG, Guhe, M., Hill, RL, Oberlander, J., and Knoll, A. (2008). The roles of haptic-ostensive referring expressions in cooperative, task-based human-robot dialogue …

The Niki and Julie Corpus: collaborative multimodal dialogues between humans, robots, and virtual agents
R Artstein, J Boberg, A Gainer, J Gratch… – Proceedings of the …, 2018 – aclweb.org
… Interaction, Chicago, March. Marge, M., Bonial, C., Pollard, KA, Artstein, R., Byrne, B., Hill, SG, Voss, C., and Traum, D. (2016). Assessing agreement in human-robot dialogue strategies: A tale of two wizards. In David Traum …

Imitating human movement using a measure of verticality to animate low degree-of-freedom non-humanoid virtual characters
R Kaushik, A LaViers – International Conference on Social Robotics, 2018 – Springer
… 3819–3824. IEEE (2011). 7. Liu, C., Ishi, CT, Ishiguro, H., Hagita, N.: Generation of nodding, head tilting and eye gazing for human-robot dialogue interaction. In: 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 285–292 …

Towards givenness and relevance-theoretic open world reference resolution
T Williams, E Krause, B Oosterveld… – RSS Workshop on …, 2018 – ece.rochester.edu
Towards Givenness and Relevance-Theoretic Open World Reference Resolution. Tom Williams, MIRRORLab, Colorado School of Mines, Golden, CO, USA, twilliams@mines.edu; Evan Krause, Bradley Oosterveld, Matthias …

Listening skills assessment through computer agents
H Tanaka, H Negoro, H Iwasaka… – Proceedings of the 2018 …, 2018 – dl.acm.org
… In ICASSP. [19] C. Liu, CT Ishi, H. Ishiguro, and N. Hagita. 2012. Generation of nodding, head tilting and eye gazing for human-robot dialogue interaction. In ACM/IEEE International Conference on Human-Robot Interaction (HRI). 285–292 …

Socially Believable Robots
M Moetesum, I Siddiqi – Human-Robot Interaction: Theory and …, 2018 – books.google.com
… When humans engage in a dialog, they usually rely on a variety of para-linguistic social cues (i.e., facial expressions, gestures, etc.) in addition to words. Research [39] has proven such non-verbal cues to be highly effective for controlling human-robot dialog …

Comparing cascaded LSTM architectures for generating head motion from speech in task-oriented dialogs
DC Nguyen, G Bailly, F Elisei – International Conference on Human …, 2018 – Springer
… 28, Article no. 172. ACM (2009). 14. Liu, C., Ishi, CT, Ishiguro, H., Hagita, N.: Generation of nodding, head tilting and eye gazing for human-robot dialogue interaction. In: Human-Robot Interaction (HRI), pp. 285–292. IEEE (2012). 15 …

The neural painter: Multi-turn image generation
RY Benmalek, C Cardie, S Belongie, X He… – arXiv preprint arXiv …, 2018 – arxiv.org
… language programming. In: AAAI. (2013) 13. She, L., Yang, S., Cheng, Y., Jia, Y., Chai, JY, Xi, N.: Back to the blocks world: Learning new actions through situated human-robot dialogue. In: SIGDIAL Conference. (2014) 14. Ling …

Surprising sequences for communication and conversation
TY Wu, X Ge, LR Varshney – 2018 52nd Annual Conference on …, 2018 – ieeexplore.ieee.org
Surprising Sequences for Communication and Conversation. Ting-Yi Wu, Xiou Ge, and Lav R. Varshney, Coordinated Science Laboratory, University of Illinois at Urbana-Champaign. Abstract—In most information-theoretic …

Light-based nonverbal signaling with passive demonstrations for mobile service robots
R Fernandez Jr – 2018 – repositories.lib.utexas.edu
Copyright by Rolando Fernandez Jr. 2018. The Thesis committee for Rolando Fernandez Jr. certifies that this is the approved version of the following thesis: Light-Based Nonverbal Signaling with Passive Demonstrations for Mobile Service Robots …

Gesture Recognition System Based on RFID
F Chen – Mobile Ad-hoc and Sensor Networks: 13th International …, 2018 – books.google.com
… devices. In: Usenix Conference on Networked Systems Design and Implementation, pp. 303–316 (2014) 2. Ziaie, P., Muller, T., Knoll, A.: A novel approach to hand-gesture recognition in a human-robot dialog system. In: First …

Principles of Human–Robot Interaction
H Ishiguro – Diversity in Harmony: Proceedings of the 31st …, 2018 – Wiley Online Library
… Interaction Studies, 16 (2), 249–271. Liu, C., Ishi, CT, Ishiguro, H., & Hagita, N.(2012). Generation of nodding, head tilting and eye gazing for human–robot dialogue interaction. In 7th ACM/IEEE International Conference on Human–Robot Interaction (HRI 2012), pp. 285–292 …

ARMAR-6: A collaborative humanoid robot for industrial environments
T Asfour, L Kaul, M Wächter… – 2018 IEEE-RAS 18th …, 2018 – ieeexplore.ieee.org
… real-world setting via the modalities speech, vision and haptics/force, and its capabilities in terms of force-based bimanual manipulation, vision-based grasping, fluent object handover, human activity recognition, natural language based human-robot dialog, navigation and more …

Reasoning about Complex and Uncertain Worlds in Physical, Social, and Virtual Realms: A Report of the Army Science Planning and Strategy Meetings Held in Fiscal …
B Glaz, B Piekarski, B Sadler, L Troyer… – 2018 – apps.dtic.mil
… Robotic physical partnering with humans is limited to fixed, repetitive tasks. • Ad hoc robotic tasking and human-robot dialog are in early-stage research. • Physics-based simulations are highly scenario-specific and not easily generalized or scaled.

Artificial intelligence and machine learning for future army applications
JM Fossaceca, SH Young – Ground/Air Multisensor …, 2018 – spiedigitallibrary.org

Guiding exploratory behaviors for multi-modal grounding of linguistic descriptions
J Thomason, J Sinapov, RJ Mooney, P Stone – Thirty-Second AAAI …, 2018 – aaai.org
… constraints. In future work, behavior annotations could be gathered from human users on-the-fly in an embodied dialog setting, using a learned human-robot dialog policy to know when behavior annotation questions are warranted …

Design of Computational Intelligence-Based Language Interface for Human-Machine Secure Interaction
R Damaševicius, W Wei – J. Univ. Comput. Sci, 2018 – jucs.org
Design of Computational Intelligence-based Language Interface for Human-Machine Secure Interaction. Marcin Wozniak, Dawid Połap (Institute of Mathematics, Silesian University of Technology, Kaszubska 23, 44-100 …

Using iterative design to create efficacy-building social experiences with a teachable robot
N Lubold, E Walker, H Pon-Barry, Y Flores, A Ogan – 2018 – repository.isls.org
… at fostering these four experiences. It consisted of human-robot dialogue based on two human learners who are friends and introduced a moderate level of difficulty for achieving mastery experience. Overall, we find several …

Augmenting Robot Knowledge Consultants with Distributed Short Term Memory
M Scheutz – … Robotics: 10th International Conference, ICSR 2018 …, 2018 – books.google.com
… Cogn. 43, 26 (2017). Williams, T.: A consultant framework for natural language processing in integrated robot architectures. IEEE Intell. Inform. Bull. (2017). Williams, T., Acharya, S., Schreitter, S., Scheutz, M.: Situated open world reference resolution for human-robot dialogue …

Human–Machine Synergism in High-Level Cognitive Functioning: The Human Component
RE Patterson, RG Eggleston – IEEE Transactions on Emerging …, 2018 – ieeexplore.ieee.org
… There have been recent studies of joint training involving humans and machines in the context of human-robot dialogue. For example, reference [61] grounded human language to the internal representation of perception of a robot using collaborative techniques …

Interactive, Collaborative Robots: Challenges and Opportunities.
D Kragic, J Gustafson, H Karaoguz, P Jensfelt, R Krug – IJCAI, 2018 – ijcai.org
… on Mobile Robots, pages 1–6, 2015. [Chai et al., 2014] Joyce Y. Chai, Lanbo She, Rui Fang, Spencer Ottarson, Cody Littley, Changsong Liu, and Kenneth Hanson. Collaborative effort towards common ground in situated human-robot dialogue. In Proc. ACM/IEEE Int. Conf …

Predicting Perceived Age: Both Language Ability and Appearance are Important
S Plane, A Marvasti, T Egan, C Kennington – Proceedings of the 19th …, 2018 – aclweb.org
Proceedings of the SIGDIAL 2018 Conference, pages 130–139, Melbourne, Australia, 12–14 July 2018. © 2018 Association for Computational Linguistics. Predicting Perceived Age: Both Language Ability and Appearance are Important …

Mapping navigation instructions to continuous control actions with position-visitation prediction
V Blukis, D Misra, RA Knepper, Y Artzi – arXiv preprint arXiv:1811.04179, 2018 – arxiv.org
Mapping Navigation Instructions to Continuous Control Actions with Position-Visitation Prediction. Valts Blukis, Dipendra Misra, Ross A. Knepper, Yoav Artzi, Department of Computer Science, Cornell University …

Facilitating Human–Robot Collaborative Tasks by Teaching-Learning-Collaboration From Human Demonstrations
W Wang, R Li, Y Chen, ZM Diekel… – IEEE Transactions on …, 2018 – ieeexplore.ieee.org
IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING. Facilitating Human–Robot Collaborative Tasks by …

Building and Learning Structures in a Situated Blocks World Through Deep Language Understanding
I Perera, J Allen, CM Teng, L Galescu – Proceedings of the First …, 2018 – aclweb.org
… Lanbo She, Shaohua Yang, Yu Cheng, Yunyi Jia, Joyce Y Chai, and Ning Xi. 2014. Back to the Blocks World: Learning New Actions through Situated Human-Robot Dialogue. Proceedings of the SIGDIAL 2014 Conference, (June):89–97 …

Using syntax to ground referring expressions in natural images
V Cirik, T Berg-Kirkpatrick, LP Morency – Thirty-Second AAAI Conference on …, 2018 – aaai.org
Using Syntax to Ground Referring Expressions in Natural Images. Volkan Cirik, Taylor Berg-Kirkpatrick, Louis-Philippe Morency, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213, {vcirik,tberg,morency}@cs.cmu.edu. Abstract …

Robot Communication Via Motion: Closing the Underwater Human-Robot Interaction Loop
M Fulton, C Edge, J Sattar – arXiv preprint arXiv:1809.07948, 2018 – arxiv.org
… [25] J. Sattar and JJ Little. Ensuring safety in human-robot dialog – a cost-directed approach. In 2014 IEEE International Conference on Robotics and Automation (ICRA), pages 6660–6666, May 2014. [26] T. Shimokawa and T. Sawaragi …

Robot Behavioral Exploration and Multi-modal Perception using Dynamically Constructed Controllers
S Amiri, S Wei, S Zhang, J Sinapov, J Thomason… – 2018 AAAI Spring …, 2018 – aaai.org
… In future work, we plan to investigate applying this approach to tasks that involve more human-robot interaction and mobile robot platforms, where exploration would require navigation actions and perceptual modalities such as human-robot dialog …

Learning interpretable spatial operations in a rich 3d blocks world
Y Bisk, KJ Shih, Y Choi, D Marcu – Thirty-Second AAAI Conference on …, 2018 – aaai.org
Learning Interpretable Spatial Operations in a Rich 3D Blocks World. Yonatan Bisk, Kevin J. Shih, Yejin Choi, Daniel Marcu. Paul G. Allen School of Computer Science & Engineering, University of Washington …

Toward Low-Cost Automated Evaluation Metrics for Internet of Things Dialogues
K Georgila, C Gordon, H Choi, J Boberg… – … on Spoken Dialogue …, 2018 – people.ict.usc.edu
… Science 5533), pp. 22–35. Springer (2009). 2. Foster, ME, Giuliani, M., Knoll, A.: Comparing objective and subjective measures of usability in a human-robot dialogue system. In: Proc. of ACL, pp. 879–887. Suntec, Singapore …

Poker face influence: persuasive robot with minimal social cues triggers less psychological reactance
AS Ghazali, J Ham, EI Barakova… – 2018 27th IEEE …, 2018 – ieeexplore.ieee.org
Abstract—Applications of social robotics in different domains such as education, healthcare, or as companions to people living alone, often entail that robots will act as persuasive agents. However, persuasive attempts …

A Multi-layer LSTM-based Approach for Robot Command Interaction Modeling
M Mensio, E Bastianelli, I Tiddi, G Rizzo – arXiv preprint arXiv:1811.05242, 2018 – arxiv.org
… 1, pp. 49–62, 2013. [8] J. Thomason, S. Zhang, R. Mooney, and P. Stone, “Learning to interpret natural language commands through human-robot dialog,” in Proceedings of the 24th International Conference on Artificial Intelligence (IJCAI), ser. IJCAI’15. AAAI Press, 2015, pp …

A Situated Dialogue System for Learning Structural Concepts in Blocks World
I Perera, J Allen, CM Teng, L Galescu – Proceedings of the 19th Annual …, 2018 – aclweb.org
… Lanbo She, Shaohua Yang, Yu Cheng, Yunyi Jia, Joyce Y Chai, and Ning Xi. 2014. Back to the Blocks World: Learning New Actions through Situated Human-Robot Dialogue. In Proceedings of the SIGDIAL 2014 Conference, June, pages 89–97 …

Visual curiosity: Learning to ask questions to learn visual recognition
J Yang, J Lu, S Lee, D Batra, D Parikh – arXiv preprint arXiv:1810.00912, 2018 – arxiv.org
Visual Curiosity: Learning to Ask Questions to Learn Visual Recognition. Jianwei Yang, Jiasen Lu, Stefan Lee, Dhruv Batra, Devi Parikh. Georgia Institute of Technology and Facebook AI Research. Abstract: In …

Using lexical alignment and referring ability to address data sparsity in situated dialog reference resolution
T Shore, G Skantze – Proceedings of the 2018 Conference on Empirical …, 2018 – aclweb.org
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2288–2297, Brussels, Belgium, October 31 – November 4, 2018. © 2018 Association for Computational Linguistics …

Towards a robust interactive and learning social robot
M de Jong, K Zhang, AM Roth, T Rhodes… – Proceedings of the 17th …, 2018 – dl.acm.org
Towards a Robust Interactive and Learning Social Robot (Robotics Track). Michiel de Jong, Carnegie Mellon University, Pittsburgh, USA, mdejong@andrew.cmu.edu; Kevin Zhang, Carnegie Mellon University, Pittsburgh, USA, klz1@andrew.cmu.edu …

Multi-modal Predicate Identification using Dynamically Learned Robot Controllers.
S Amiri, S Wei, S Zhang, J Sinapov, J Thomason… – IJCAI, 2018 – cs.utexas.edu
In Proceedings of the 27th International Joint Conference on Artificial Intelligence (IJCAI 2018), Stockholm, Sweden, July 2018. Multi-modal Predicate Identification using Dynamically Learned Robot Controllers. Saeid …

Learning Out-of-Vocabulary Words in Intelligent Personal Agents.
A Ray, Y Shen, H Jin – IJCAI, 2018 – ijcai.org
Learning Out-of-Vocabulary Words in Intelligent Personal Agents. Avik Ray, Yilin Shen and Hongxia Jin, Samsung Research America, Mountain View, California, USA, {avik.r, yilin.shen, hongxia.jin}@samsung.com. Abstract …

Language-Based Bidirectional Human and Robot Interaction Learning for Mobile Service Robots
V Perera – 2018 – reports-archive.adm.cs.cmu.edu
… The key contributions of this thesis, organized by the direction of interaction, are the following: Human-to-Robot Interaction KnoWDiaL is an approach that lets the robot learn task-relevant environmental knowledge from human-robot dialogue and access to the web …

Person following by autonomous robots: A categorical overview
MJ Islam, J Hong, J Sattar – arXiv preprint arXiv:1803.08202, 2018 – arxiv.org
Person Following by Autonomous Robots: A Categorical Overview. Preprint Version II, XX(X):1–30. © The Author(s) 2019. DOI: 10.1177/ToBeAssigned …

What is not where: the challenge of integrating spatial representations into deep learning architectures
JD Kelleher, S Dobnik – arXiv preprint arXiv:1807.08133, 2018 – arxiv.org
What is not where: the challenge of integrating spatial representations into deep learning architectures. John D. Kelleher, ADAPT Centre for Digital Content Technology, Dublin Institute of Technology, Ireland, john.d.kelleher@dit.ie …

Lifetime Achievement Award
M Steedman – 2018 – research.ed.ac.uk
Edinburgh Research Explorer: The Lost Combinator. Citation for published version: Steedman, M 2018, 'The Lost Combinator', Computational Linguistics, vol. 44, no. 4, pp. 613–629. DOI: 10.1162/coli_a_00328 …

Exploring the functional and geometric bias of spatial relations using neural language models
S Dobnik, M Ghanimifard, J Kelleher – 2018 – arrow.dit.ie
Dublin Institute of Technology, ARROW@DIT, Applied Intelligence Research Centre, 2018. Exploring the Functional and Geometric Bias of Spatial Relations Using Neural Language Models. Simon …

What action causes this? towards naive physical action-effect prediction
Q Gao, S Yang, J Chai, L Vanderwende – … of the 56th Annual Meeting of …, 2018 – aclweb.org
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Long Papers), pages 934–945, Melbourne, Australia, July 15–20, 2018. © 2018 Association for Computational Linguistics. What Action Causes This …

Towards a Robust Interactive and Learning Social Robot
M de Jong, K Zhang, AM Roth, T Rhodes, R Schmucker… – traversrhodes.com
Towards a Robust Interactive and Learning Social Robot (Robotics Track). Michiel de Jong, Carnegie Mellon University, Pittsburgh, USA, mdejong@andrew.cmu.edu; Kevin Zhang, Carnegie Mellon University, Pittsburgh, USA, klz1@andrew.cmu.edu …

The lost combinator
M Steedman – Computational Linguistics, 2018 – MIT Press

Modular Mechanistic Networks: On Bridging Mechanistic and Phenomenological Models with Deep Neural Networks in Natural Language Processing
S Dobnik, JD Kelleher – arXiv preprint arXiv:1807.09844, 2018 – arxiv.org
… Taipei, Taiwan, pages 441–450. Niels Schutte, Brian Mac Namee, and John D. Kelleher. 2017. Robot perception errors and human resolution strategies in situated human–robot dialogue. Advanced Robotics 31(5):243–257. Stuart Shieber. 1986 …

Hobbit: providing fall detection and prevention for the elderly in the real world
M Bajones, D Fischinger, A Weiss, D Wolf… – Journal of …, 2018 – hindawi.com

DNN-HMM based automatic speech recognition for HRI scenarios
J Novoa, J Wuth, JP Escudero, J Fredes… – Proceedings of the …, 2018 – dl.acm.org
Speech Processing and Transmission Laboratory, University of Chile, Av. Tupper 2007, Santiago, Chile. jose.novoa@ing.uchile.cl, jwuth@ing.uchile.cl …

Robot-to-human feedback and automatic object grasping using an RGB-D camera–projector system
J Shen, N Gans – Robotica, 2018 – cambridge.org
Robotica (2018), volume 36, pp. 241–260. © Cambridge University Press 2017. doi:10.1017/S0263574717000339. Robot-to-human feedback and automatic object grasping using an RGB-D camera–projector system. Jinglin Shen and Nicholas Gans …

Visual identification of biological motion for underwater human–robot interaction
J Sattar, G Dudek – Autonomous Robots, 2018 – Springer
We present an algorithm for underwater robots to visually detect and track human motion. Our objective is to enable human–robot interaction by allowing a robot to follow behind a human moving in (up…

Generating machine-executable plans from end-user’s natural-language instructions
R Liu, X Zhang – Knowledge-Based Systems, 2018 – Elsevier

One-Shot Interaction Learning from Natural Language Instruction and Demonstration
T Frasca, B Oosterveld, E Krause, M Scheutz – cogsys.org
Advances in Cognitive Systems 6 (2018) 1–18. Submitted 6/2018; published 8/2018. One-Shot Interaction Learning from Natural Language Instruction and Demonstration. Tyler Frasca (tyler.frasca@tufts.edu), Bradley Oosterveld …

Improving Mild Cognitive Impairment Prediction via Reinforcement Learning and Dialogue Simulation
F Tang, K Lin, I Uchendu, HH Dodge, J Zhou – arXiv preprint arXiv …, 2018 – arxiv.org
Improving Mild Cognitive Impairment Prediction via Reinforcement Learning and Dialogue Simulation. Fengyi Tang, Kaixiang Lin, Ikechukwu Uchendu, Hiroko H. Dodge, Jiayu Zhou. Computer Science and Engineering …

Robot following ahead of the leader and learning human-relevant navigational cues
P Nikdel – 2018 – summit.sfu.ca
Robot Following Ahead of the Leader and Learning Human-Relevant Navigational Cues, by Payam Nikdel, B.Sc., Shiraz University, 2015. Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science …

Imitation Learning for Object Manipulation Based on Position/Force Information Using Bilateral Control
T Adachi, K Fujimoto, S Sakaino… – 2018 IEEE/RSJ …, 2018 – ieeexplore.ieee.org
… networks, arXiv:1802.005201v. [7] M. Marge, C. Bonial, B. Byrne, T. Cassidy, W. Evans, S. Hill, and C. Voss, Applying the wizard-of-Oz technique to multimodal human-robot dialogue, arXiv:1703.03714. [8] P. Yang, K. Sasaki …

What My Eyes Can’t See, A Robot Can Show Me: Exploring the Collaboration Between Blind People and Robots
M Bonani, R Oliveira, F Correia, A Rodrigues… – Proceedings of the 20th …, 2018 – dl.acm.org
What My Eyes Can't See, A Robot Can Show Me: Exploring the Collaboration Between Blind People and Robots. Mayara Bonani, Raquel Oliveira, Filipa Correia, André Rodrigues, Tiago Guerreiro, Ana Paiva …

Synthesizing Robot Programs with Interactive Tutor Mode
H Li, YP Wang, TJ Mu – International Journal of Automation and Computing – Springer
Synthesizing Robot Programs with Interactive Tutor Mode. Hao Li, Yu-Ping Wang, Tai-Jiang Mu. Tsinghua National Laboratory for Information Science and Technology (TNList), Department of Computer Science and Technology …

Age-and gender-based differences in children’s interactions with a gender-matching robot
A Sandygulova, GMP O’Hare – International Journal of Social Robotics, 2018 – Springer
Social robots are increasingly being used to encourage social, emotional and cognitive growth in children. However, in order to establish social and bonding interactions, social robots need to be…

A Multimodal Deep Learning Network for Group Activity Recognition
S Rossi, R Capasso, G Acampora… – 2018 International Joint …, 2018 – ieeexplore.ieee.org
… 31 – June 7, 2014, pp. 4863–4868. [5] R. Caccavale, E. Leone, L. Lucignano, S. Rossi, M. Staffa, and A. Finzi, “Attentional regulations in a situated human-robot dialogue,” in RO-MAN. IEEE, 2014, pp. 844–849. [6] G. Li, C. Zhu …

Multi-users online recognition of technical gestures for natural human–robot collaboration in manufacturing
E Coupeté, F Moutarde, S Manitsaris – Autonomous Robots, 2018 – Springer
Autonomous Robots, https://doi.org/10.1007/s10514-018-9704-y. Multi-users online recognition of technical gestures for natural human–robot collaboration in manufacturing. Eva Coupeté, Fabien Moutarde, Sotiris Manitsaris …

Follownet: Robot navigation by following natural language directions with deep reinforcement learning
P Shah, M Fiser, A Faust, JC Kew… – arXiv preprint arXiv …, 2018 – arxiv.org
… [27] J. Thomason, S. Zhang, R. Mooney, and P. Stone. Learning to interpret natural language commands through human-robot dialog … [28] J. Thomason, S. Zhang, RJ Mooney, and P. Stone. Learning to interpret natural language commands through human-robot dialog …

Human–Robot Interaction
A Kirsch – Computation for Humanity, 2018 – taylorfrancis.com
Human–Robot Interaction. Alexandra Kirsch. Industrial robots have been utilized for decades to perform dirty, dull, and dangerous tasks. But maybe such machines could do more. In Hollywood movies, we have …

Robot representing and reasoning with knowledge from reinforcement learning
K Lu, S Zhang, P Stone, X Chen – arXiv preprint arXiv:1809.11074, 2018 – arxiv.org
… Fig. 6. Belief change in three dimensions (in order from the left: items, persons, and offices) over five turns in a human-robot dialog. Figure 6 shows the belief changes (in the dimensions of item, person, and room) as the robot interacts with a human user …

VGPN: Voice-Guided Pointing Robot Navigation for Humans
J Hu, Z Jiang, X Ding, T Mu… – 2018 IEEE International …, 2018 – ieeexplore.ieee.org
… pp. 91–100. [17] P. Yan, B. He, L. Zhang, and J. Zhang, “Task execution based on human-robot dialogue and deictic gestures,” in IEEE International Conference on Robotics and Biomimetics (ROBIO), 2017, pp. 1918–1923 …

Visual Diver Recognition for Underwater Human-Robot Collaboration
Y Xia, J Sattar – arXiv preprint arXiv:1809.10201, 2018 – arxiv.org
… [30] Junaed Sattar and James Joseph Little. Ensuring Safety in Human-Robot Dialog – a Cost-Directed Approach. In Proceedings of the IEEE International Conference on Robotics and Automation, ICRA., pages 6660–6666, Hong Kong, China, May 2014 …

Towards Intelligent Social Robots: From Naive Robots to Robot Sapiens
A Aly, S Griffiths, V Nitsch, K Pastra, T Taniguchi – 2018 – hal.archives-ouvertes.fr
… Experimental Robotics. Springer, 2013, pp. 403–415. [29] J. Thomason, S. Zhang, RJ Mooney, and P. Stone, “Learning to interpret natural language commands through human-robot dialog.” in IJCAI, 2015, pp. 1923–1929. [30] M …

Situated Human–Robot Collaboration: predicting intent from grounded natural language
J Brawer, O Mangin, A Roncone… – 2018 IEEE/RSJ …, 2018 – ieeexplore.ieee.org
Situated Human–Robot Collaboration: predicting intent from grounded natural language. Jake Brawer, Olivier Mangin, Alessandro Roncone, Sarah Widder, and Brian Scassellati. Abstract—Research in human teamwork …

Intuitive control of mobile robots: an architecture for autonomous adaptive dynamic behaviour integration
C Melidis, H Iizuka, D Marocco – Cognitive processing, 2018 – Springer
In this paper, we present a novel approach to human–robot control. Taking inspiration from behaviour-based robotics and self-organisation principles, we present an interfacing mechanism, with the…

An open vocabulary semantic parser for end-user programming using natural language
JE Sales, S Handschuh – 2018 IEEE 12th International …, 2018 – ieeexplore.ieee.org
An Open Vocabulary Semantic Parser for End-User Programming using Natural Language. Juliano Efson Sales, Department of Computer Science and Mathematics, University of Passau, Passau, Germany; André Freitas …

Precise but Natural Specification for Robot Tasks
I Gavran, B Boldt, E Darulova, R Majumdar – arXiv preprint arXiv …, 2018 – arxiv.org
arXiv:1803.02238v2 [cs.RO] 20 Sep 2018. Precise but Natural Specifications for Robot Tasks. Ivan Gavran, Brendon Boldt, Eva Darulova, Rupak Majumdar. Abstract—We present Flipper, a natural language interface …

Temporal spatial inverse semantics for robots communicating with humans
Z Gong, Y Zhang – 2018 IEEE International Conference on …, 2018 – ieeexplore.ieee.org
Temporal Spatial Inverse Semantics for Robots Communicating with Humans. Ze Gong and Yu Zhang. Abstract—Effective communication between humans often embeds both temporal and spatial context. While spatial …

A multimodal classifier generative adversarial network for carry and place tasks from ambiguous language instructions
A Magassouba, K Sugiura… – IEEE Robotics and …, 2018 – ieeexplore.ieee.org

Ambient Assisted Living: Systematic Review
A Queirós, NP da Rocha – Usability, Accessibility and Ambient Assisted …, 2018 – Springer
The active ageing paradigm aims to contribute to the expectation of a healthy, autonomous and independent life with quality. The technological solutions might have a key role in the promotion of…

Integrating multi-purpose natural language understanding, robot’s memory, and symbolic planning for task execution in humanoid robots
M Wächter, E Ovchinnikova, V Wittenbeck… – Robotics and …, 2018 – Elsevier

Object assembly guidance in child-robot interaction using RGB-D based 3d tracking
J Hadfield, P Koutras, N Efthymiou… – 2018 IEEE/RSJ …, 2018 – ieeexplore.ieee.org
Object Assembly Guidance in Child-Robot Interaction using RGB-D based 3D Tracking. Jack Hadfield, Petros Koutras, Niki Efthymiou, Gerasimos Potamianos, Costas S. Tzafestas, Petros Maragos. Abstract …

An Emotion and Memory Model for Social Robots: A Long-Term Interaction
MI Ahmad – 2018 – search.proquest.com
An Emotion and Memory Model for Social Robots: A Long-term Interaction, by Muneeb Imtiaz Ahmad. A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Human-Robot …

Attentive Speaking. From Listener Feedback to Interactive Adaptation
H Buschmeier – 2018 – pub.uni-bielefeld.de
Attentive Speaking: From Listener Feedback to Interactive Adaptation. Hendrik Buschmeier. A dissertation submitted in …

Human-Robot Interaction
Y Jia – downloads.hindawi.com
Journal of Robotics: Human-Robot Interaction. Lead Guest Editor: Yunyi Jia. Guest Editors: Biao Zhang, Miao Li, Brady King, and Ali Meghdari …

Understanding and resolving failures in human-robot interaction: Literature review and model development
S Honig, T Oron-Gilad – Frontiers in psychology, 2018 – frontiersin.org
While substantial effort has been invested in making robots more reliable, experience demonstrates that robots operating in unstructured environments are often challenged by frequent failures. Despite this, robots have not yet reached a level of design that allows effective …

APPINITE: A Multi-Modal Interface for Specifying Data Descriptions in Programming by Demonstration Using Natural Language Instructions
TJJ Li, I Labutov, XN Li, X Zhang, W Shi… – … IEEE Symposium on …, 2018 – ieeexplore.ieee.org
APPINITE: A Multi-Modal Interface for Specifying Data Descriptions in Programming by Demonstration Using Natural Language Instructions. Toby Jia-Jun Li, Igor Labutov …

A review of computational approaches for human behavior detection
S Nigam, R Singh, AK Misra – Archives of Computational Methods in …, 2018 – Springer
A Review of Computational Approaches for Human Behavior Detection. Swati Nigam, Rajiv Singh, AK Misra. Received: 12 October 2017 / Accepted: 2 May 2018. © CIMNE, Barcelona, Spain 2018 …

Three-layer weighted fuzzy support vector regression for emotional intention understanding in human–robot interaction
L Chen, M Zhou, M Wu, J She, Z Liu… – … on Fuzzy Systems, 2018 – ieeexplore.ieee.org
IEEE TRANSACTIONS ON FUZZY SYSTEMS, VOL. 26, NO. 5, OCTOBER 2018. Three-Layer Weighted Fuzzy Support Vector Regression for Emotional Intention Understanding in Human–Robot Interaction. Luefeng …

Interaction algorithm effect on human experience with reinforcement learning
S Krening, KM Feigh – ACM Transactions on Human-Robot Interaction …, 2018 – dl.acm.org
Interaction Algorithm Effect on Human Experience with Reinforcement Learning. Samantha Krening and Karen M. Feigh, Georgia Institute of Technology, USA. A goal of interactive machine learning (IML) is …

Natural Language Generation For Relative Position Description Using the PHI-Descriptor
J Francis – 2018 – atrium.lib.uoguelph.ca
Natural Language Generation For Relative Position Description Using The PHI-Descriptor, by Jesse Francis. A Thesis presented to The University of Guelph in partial fulfilment of requirements for the degree of Master of Science in Computer Science …

Inferring beliefs for search and rescue from natural language
ND Schurr – 2018 – dspace.mit.edu
Inferring Beliefs for Search and Rescue from Natural Language, by Naomi D. Schurr. Submitted to the Department of Aeronautics and Astronautics in partial fulfillment of the requirements for the degree of Master of Science in Aeronautics and Astronautics at the …

Tool: accessible automated reasoning for human robot collaboration
I Gavran, O Mailahn, R Müller, R Peifer… – Proceedings of the 2018 …, 2018 – dl.acm.org
Tool: Accessible Automated Reasoning for Human Robot Collaboration. Ivan Gavran, Max Planck Institute for Software Systems, Kaiserslautern, Germany, gavran@mpi-sws.org; Ortwin Mailahn, Research Group: Assembly …

Annotation Scaffolds for Object Modeling and Manipulation
P Frank-Bolton, R Simha – arXiv preprint arXiv:1808.06679, 2018 – arxiv.org
Annotation Scaffolds for Object Modeling and Manipulation. Pablo Frank-Bolton, The George Washington University, Washington, DC, pfrank@gwu.edu; Rahul Simha, The George Washington University, Washington, DC, simha@gwu.edu …

A Survey of Knowledge Representation and Retrieval for Learning in Service Robotics
D Paulius, Y Sun – arXiv preprint arXiv:1807.02192, 2018 – arxiv.org
A Survey of Knowledge Representation and Retrieval for Learning in Service Robotics. David Paulius, Yu Sun. University of South Florida, 4220 E Fowler Ave, Tampa, FL, United States, 33620. Abstract: Within the …

Interactive Motion Planning for Multi-Agent Systems with Physics-Based and Behavior Constraints
A Best – 2018 – search.proquest.com
Interactive Motion Planning for Multi-Agent Systems with Physics-Based and Behavior Constraints. Abstract. Man-made entities and humans rely on movement as an essential form of interaction with the world. Whether it is an …
