Notes:
The dialog management module is the component of a dialog system responsible for controlling the flow of the conversation and deciding which actions the system should take in response to the user’s input. It typically receives input from several other modules, such as the natural language understanding module, the user and task modeling module, and the knowledge representation and reasoning module, and uses this information to select an appropriate response given the current state of the conversation and the system’s goals.
The dialog management module typically encodes a set of rules or algorithms that define how the system should respond to different types of user input, how it should steer the conversation, and how it should keep track of the user’s goals and preferences. For example, rules may determine when the system should ask a clarifying question, provide additional information, or move on to a new topic, while statistical or planning algorithms may select actions based on the current context and the user’s past behavior.
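A minimal sketch of this kind of rule-based decision logic is shown below (in Python; the state fields, slot names, and confidence threshold are illustrative assumptions, not taken from any particular system cited on this page):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DialogState:
    """Hypothetical, simplified dialog state tracked across turns."""
    goal: Optional[str] = None                    # task the user is pursuing, e.g. "book_flight"
    slots: dict = field(default_factory=dict)     # task parameters collected so far
    history: list = field(default_factory=list)   # past (user_act, system_act) pairs

# Slots the (hypothetical) task needs before the system can act on it.
REQUIRED_SLOTS = {"book_flight": ["origin", "destination", "date"]}

def next_action(state: DialogState, nlu_result: dict) -> dict:
    """Rule-based action selection over the current dialog state.

    `nlu_result` is assumed to arrive from the natural language understanding
    module as {"intent": str, "confidence": float, "slots": {...}}.
    """
    # Rule 1: low recognition/understanding confidence -> ask a clarifying question.
    if nlu_result["confidence"] < 0.5:
        return {"act": "clarify", "text": "Sorry, could you rephrase that?"}

    # Update the dialog state with the new interpretation.
    state.goal = nlu_result["intent"]
    state.slots.update(nlu_result.get("slots", {}))

    # Rule 2: required information still missing -> request the next missing slot.
    missing = [s for s in REQUIRED_SLOTS.get(state.goal, []) if s not in state.slots]
    if missing:
        return {"act": "request", "slot": missing[0],
                "text": f"What is the {missing[0]}?"}

    # Rule 3: everything collected -> act on the goal and move the conversation on.
    return {"act": "inform", "text": f"Completing {state.goal} with {state.slots}."}

# Example turn: the NLU module reports a confident flight-booking request.
state = DialogState()
print(next_action(state, {"intent": "book_flight", "confidence": 0.9,
                          "slots": {"origin": "Boston"}}))
# -> {'act': 'request', 'slot': 'destination', 'text': 'What is the destination?'}
```

Statistical dialog managers replace such hand-written rules with a policy learned over the same kind of state, for example by reinforcement learning, as in several of the works cited below.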
See also:
Dialog Management | Dialog Management Middleware | Dialog Manager | Dialog System Modules | Olympus/Ravenclaw Dialog Management Framework | OwlSpeak
Towards a serious game playing empathic robotic tutorial dialogue system S Janarthanam, H Hastie, A Deshmukh… – Proceedings of the 2014 …, 2014 – dl.acm.org … robotic tutor as well as the dialogue context. In this paper, we will examine the issues related to development of these dialogue strategies as part of the dialogue manager module. … Cited by 1 Related articles All 4 versions
Exploiting knowledge base to generate responses for natural language dialog listening agents S Han, J Bang, S Ryu, GG Lee – 16th Annual Meeting of the Special …, 2015 – aclweb.org … We used Freebase popularity score to get top-10 popular instances. Extracted information is sent to the language generation module. 3.4 Dialog Management The dialog management module returns system intention based on interpretation of emotion and user intention. …
Dialogue Platform for Interactive Personal Assistant Software Y Park, S Kang, M Koo, J Seo – Natural Language Dialog Systems and …, 2015 – Springer … In this paper, we propose an effective knowledge platform structure that considers expanded structural application domains of language understanding and dialogue management modules. These modules form the core technology of the interactive personal assistant software. …
Interrogative Sentence Generation and Dialogue Management in Intelligent Tutoring System Z Zhang, X Shang – … on Advances in Mechanical Engineering and …, 2015 – atlantis-press.com … (Eq. 2), where C1, C2, K are constants. Score(Np, Nh) is a monotonically decreasing function of Np and Nh, following the intuition that the more hints or prompts the user receives, the lower the score. Figure 2 shows the workflow of the dialogue management module. Conclusions … Related articles All 2 versions
Social signal and user adaptation in reinforcement learning-based dialogue management E Ferreira, F Lefèvre – Proceedings of the 2nd Workshop on Machine …, 2013 – dl.acm.org … ABSTRACT This paper investigates the conditions under which cues from social signals can be used for user adaptation (or user tracking) of a learning agent. In this work we consider the case of the Reinforcement Learning (RL) of a dialogue management module. … Cited by 9 Related articles
A comparative study of various approaches for dialogue management M Ahmed, R Riyaz, S Afzal – Int. J. Adv. Comput. Technol, 2013 – core.ac.uk … The dialogue flow is controlled by the dialogue management module. … The dialogue management module is also responsible for detecting and repairing breakdowns in the dialogue through verifications, confirmations and corrections. … Cited by 3 Related articles All 3 versions
Co-adaptation in spoken dialogue systems S Chandramohan, M Geist, F Lefevre… – Natural Interaction with …, 2014 – Springer … interaction. The dialogue management module is responsible for navigating the system to accomplish a specific task. Proper functioning of the dialogue management module can be attributed to the so-called dialogue policy. … Cited by 6 Related articles All 11 versions
Interruptible Autonomy: Towards Dialog-Based Robot Task Management Y Sun, B Coltin, M Veloso – Workshops at the Twenty-Seventh AAAI …, 2013 – brian.coltin.org … It consists of three parts. First is the dialog parser, a dialog management module that interacts with the human user through speech. The dialog parser perceives a user’s speech command and converts the speech into text candidates. … Cited by 5 Related articles All 7 versions
Counseling Dialog System with 5W1H Extraction S Han, K Lee, D Lee, GG Lee – Proceedings of the SIGDIAL 2013 …, 2013 – sigdial.org … The emotion detection module detects the user’s emotions using the emotional keyword dictionary. The dialog management module decides the system’s action from the main action and the 5W1H information from the trained module from the example dialog corpus. … Cited by 4 Related articles All 8 versions
[BOOK] Natural Interaction with Robots, Knowbots and Smartphones: Putting Spoken Dialog Systems Into Practice J Mariani, S Rosset, M Garnier-Rizet, L Devillers – 2014 – books.google.com … The last two parts of this book are about the development of specific aspects of dialog systems. The fifth part (Spoken Dialog Systems components) is about specific components while the sixth specifically concerns the dialog management module. … Cited by 2 All 3 versions
Towards online planning for dialogue management with rich domain knowledge P Lison – Natural Interaction with Robots, Knowbots and …, 2014 – Springer … We have already described in our previous work the general dialogue system workflow [10] and will not repeat it here. Instead, we will concentrate on the dialogue manager module and in particular on its internal, domain-specific models. … Cited by 1 Related articles All 11 versions
Module for Dialog Management in the Interaction System Between User and Mobile Robotic Guide IM Kobozeva, AV Zimmerling – Trudy SPIIRAN, 2014 – mathnet.ru … I.M. Kobozeva (M.V. Lomonosov Moscow State University), G. Sidorov (Instituto Politécnico Nacional, Mexico City, Mexico), A.V. Zimmerling (Moscow State Humanitarian University named after M.A. Sholokhov). Abstract: The paper presents a dialogue management module for a … Cited by 1 Related articles
Meta-Learning for Fast Dialog System Habituation to New Users R Bakis, J Havelka, J Cuřín – Natural Language Processing and …, 2015 – books.google.com … hidden parameters of the model. From a mundane point of view, replacing an existing rule-based dialog management module in a spoken dialog system might be infeasible due to practical reasons and costs. In this work we … Related articles
Description of the PatientGenesys Dialogue System LCDBE Bilinski, AL Ligozat, PZS Rosset… – 16th Annual Meeting of …, 2015 – aclweb.org … Lists of medical and lay terms map acronyms or technical terms (eg ligamentoplasty) to lay terms (eg ligament repair). 3.4 Dialogue manager module The system uses a frame-based design in order to allow flexible interactions. …
A step towards adaptive multimodal virtual social interaction platform for children with autism E Bekele, M Young, Z Zheng, L Zhang… – Universal access in …, 2013 – Springer … 2.2 Spoken Dialog Manager The verbal conversation is managed by a spoken dialog management module which was developed using the Microsoft speech recognizer from the speech API (SAPI) with domain specific grammar and semantics. … Cited by 2 Related articles All 3 versions
Sentiment apprehension in human-robot interaction with NAO J Shen, O Rudovic, S Cheng… – Affective Computing and …, 2015 – ieeexplore.ieee.org … 6. This system is constructed from the high-level dialogue management modules (3 modules are developed, each corresponding to one stage described in subsection II.A), the modules used for the video capturing (‘NAO Vision’), face detection (‘Face Detector’) and facial …
Multimodal Interfaces and Sensory Fusion in VR for Social Interactions E Bekele, JW Wade, D Bian, L Zhang, Z Zheng… – Virtual, Augmented and …, 2014 – Springer … 1.3 Spoken Dialog Management The verbal conversation component of the VR system creates context for social interaction and emotion recognition in a social setting and is managed by a spoken dialog management module. … Related articles All 3 versions
An Argumentation-based dialogue system for human-robot collaboration MQ Azhar, S Parsons, E Sklar – … of the 2013 international conference on …, 2013 – dl.acm.org … considers the timing of dialogue delivery. Many existing HRI systems use scripted dialogue management modules (e.g., robot receptionist [2]). However, a scripted dialogue … Cited by 2 Related articles All 8 versions
Adaptive generation in dialogue systems using dynamic user modeling S Janarthanam, O Lemon – Computational Linguistics, 2014 – MIT Press … Like Demberg, Winterboer, and Moore (2011), wizards in our set-up did not make dialogue management decisions. These were computed by the dialogue manager module based on the user dialogue act and the current dialogue state. … Cited by 4 Related articles All 6 versions
A Proposal for Processing and Fusioning Multiple Information Sources in Multimodal Dialog Systems D Griol, JM Molina, J García-Herrero – Highlights of Practical Applications …, 2014 – Springer … The Modalities fusion and fission module manages input data and prepares it for processing by the application logic. When the fusion and fission engines reach an interpretation, it is passed to the dialog management module. … Related articles All 3 versions
Cognitively-inspired representational approach to meaning in machine dialogue M Gnjatović, V Delić – Knowledge-Based Systems, 2014 – Elsevier … Finally, the paper reports on a domain-independent framework for end-user programming of adaptive dialogue management modules. Keywords. Human–machine dialogue; Meaning representation; Cognition; Attention; Focus tree; Broca’s aphasia. 1. Introduction. … Cited by 3 Related articles All 2 versions
A unified approach for semantic-based multimodal interaction M Löckelt, M Deru, CH Schulz, S Bergweiler… – Towards the Internet of …, 2014 – Springer … plain web browser. Pattern-Based Interaction Management As mentioned before, the fusion and discourse engine FADE and the dialog management module FLINT are both based on a system of production rules. The general … Cited by 3 Related articles All 3 versions
Architecture of a socio-conversational agent in virtual worlds B Ravenet, M Ochs, C Pelachaud – Image Processing (ICIP), …, 2014 – ieeexplore.ieee.org … The result is composed of the recognized list of words with a confidence index. The keyword spotting module is connected to the dialog manager module. For the dialog manager module we have integrated the Disco For Games D4G module [13]. … Related articles
Context models for adaptive dialogs and multimodal interaction F Honold, F Schussel, M Weber… – … (IE), 2013 9th …, 2013 – ieeexplore.ieee.org … Fig. 4. A planning step D is brought to execution and requires user interaction. The dialog management module decomposes the planning step into dialog goals a, b, c, and d. These units are communicated towards the user in terms of dialog outputs as dialog acts. … Cited by 12 Related articles All 5 versions
End-user design of emotion-adaptive dialogue strategies for therapeutic purposes M Gnjatović, V Delić – Recent Advances of Neural Network Models and …, 2014 – Springer … The early-stage version of the dialogue management module implements three prespecified primitive actions: synthesizing a verbal dialogue act (utter()), instructing the external robotic system to perform a nonverbal act (perform()), and waiting until … Cited by 4 Related articles All 3 versions
Modelling multi-party interactions among virtual characters, robots, and humans Z Yumak, J Ren, NM Thalmann, J Yuan – PRESENCE: Teleoperators and …, 2014 – MIT Press … A speech event contains four parameters: event ID, speaking person, speech content, and listening person. These three events are sent to the multi-party dialogue manager module in order to decide what to do and what to say at each point in time. 3.1.1 Speaker Identification. … Cited by 7 Related articles All 3 versions
The evaluation of spoken dialog management models for multimodal HCIs. R Maskeliunas – Int. Arab J. Inf. Technol., 2014 – ccis2k.org … Next the task, the dialog model and the other processes in the dialog management module are activated to establish a dialog, to send a command operational instruction to the application backend and to generate a feedback to the user. 3. Dialog Management … Related articles All 4 versions
Research of Urban Planning and Design Based on 3D Visualization GIS XP Liang, Q Liu – Applied Mechanics and Materials, 2014 – Trans Tech Publ … [snippet consists of system architecture diagram labels only: VRML world server, learning object database, HTML/VRML browsers, Java application, user dialogue management module, data measurement module, 2D graphics display module, server, Internet] … Related articles All 2 versions
Demonstration of the Emote Wizard of Oz Interface for Empathic Robotic Tutors S Bhargava, S Janarthanam, H Hastie… – Proceedings of …, 2013 – sigdial.org … and cognitive states in tutorial tasks. In this study, the wizard plays the same role as that of affect recognition and dialogue management modules in the actual final system. 2 Previous work Wizard-of-Oz (WoZ) frameworks have … Cited by 5 Related articles All 10 versions
Censys: A Model for Distributed Embodied Cognition T Ribeiro, M Vala, A Paiva – Intelligent Virtual Agents, 2013 – Springer … The Intention and Behavior Planning is done within the Decision-Making and Dialogue Manager modules (DMs), which generate only ABML actions containing Behavior Markup Language (BML) blocks [8]. The DMs also receives only PPML perceptions containing a … Cited by 7 Related articles All 4 versions
Multimodal analysis of laughter for an interactive system J Urbain, R Niewiadomski, M Mancini, H Griffin… – Intelligent Technologies …, 2013 – Springer … controls the details of the expressive pattern of the laughter response by choosing, from the lexicon of pre-synthesized laughter samples, the most appropriate audiovisual episode, ie the episode that best matches the requirements specified by the Dialog Manager module. … Cited by 5 Related articles All 5 versions
User-awareness and adaptation in conversational agents V Delić, M Gnjatović, N Jakovljević… – Facta Universitatis, …, 2014 – casopisi.junis.ni.ac.rs … machine interaction. It focuses particularly on the development of speech recognition modules in cooperation with both modules for emotion recognition and speaker recognition, as well as the dialogue management module. Finally … Cited by 2 Related articles All 6 versions
What should we know to develop an information robot? S Satake, K Nakatani, K Hayashi, T Kanda… – PeerJ Computer …, 2015 – peerj.com … The robot ends the dialog when the user leaves the robot’s side (3 m away), or when the dialog management module decides to end the dialog. Dialog manager. We developed a rule-based mechanism for dialog management. …
KeJia Robot–An Attractive Shopping Mall Guider Y Chen, F Wu, W Shuai, N Wang, R Chen, X Chen – Social Robotics, 2015 – Springer … updating. The background server collect the real-time robots’ state (ie, coordinate, task state) and forward to the smart phone app. The dialog manager module attempts to understand the users’ intentions and dispatches tasks. By …
Research on the Universals of Voice Interactive Interface W Shan, YI Ding – Information Technology Journal, 2013 – docsdrive.com … Difficult to dialogue management. The dialogue management module records the current state and sub-state of the system and transfers into the next state when dialogue happens according to the definition of STN. The core of dialogue … Related articles All 2 versions
The Geranium System: Multimodal Conversational Agents for E-learning D Griol, JM Molina, AS de Miguel – Distributed Computing and Artificial …, 2014 – Springer … to a question, respectively. The natural language understanding and dialog management modules have been developed according to the Voice Extensible Markup Language (VoiceXML, www. w3. org/TR/voicexml20), defined … Related articles All 3 versions
Developing multimodal conversational agents for an enhanced e-learning experience D Griol, JM Molina, AS de Miguel – ADCAIJ: Advances in Distributed …, 2014 – rca.usal.es … The natural language understanding and dialog management modules have been developed according to the Voice Extensible Markup Language (VoiceXML) [DOMINGUEZ, K., 2014], defined by the W3C as the standard for implementing interactive voice dialogs for human … Related articles All 5 versions
Design of Dialog-Based Intelligent Tutoring Systems to Simulate Human-to-Human Tutoring S D’Mello, A Graesser – Where Humans Meet Machines, 2013 – Springer … Syntax plays an important role in speech act classification, but not in semantic matching algorithms for learner modeling. Dialog Management The dialog-management module in most versions of AutoTutor is an augmented, finite-state transition-network (see Fig. 11.2). … Cited by 4 Related articles All 3 versions
Multimodal interaction with virtual worlds XMMVR: eXtensible language for MultiModal interaction with virtual reality worlds H Olmedo, D Escudero, V Cardeñoso – Journal on Multimodal User …, 2015 – Springer … J Multimodal User Interfaces (2015) 9:153–172, DOI 10.1007/s12193-015-0176-5 … Cited by 1
Combining heterogeneous inputs for the development of adaptive and multimodal interaction systems D Griol, J García-Herrero, JM Molina – ADCAIJ: Advances in …, 2013 – revistas.usal.es … The Modalities fusion and fission module manages input data and prepares it for processing by the application logic. When the fusion and fission engines reach an interpretation, it is passed to the dialog management module. In … Cited by 5 Related articles All 5 versions
Tracking and fusion for multiparty interaction with a virtual character and a social robot Z Yumak, J Ren, NM Thalmann, J Yuan – SIGGRAPH Asia 2014 …, 2014 – dl.acm.org … Speech event contains four parameters: event ID, speaking person, speech content and listening person. These five events are sent to the multiparty-dialogue-manager module in order to decide what to do and what to say at each point of time. … Cited by 2 Related articles All 2 versions
Multi-party interaction with a virtual character and human-like robot: A case study Z Yumak, J Ren, NM Thalmann… – … Teleoperators and Virtual …, 2014 – eeeweba.ntu.edu.sg … Speech event contains four parameters: event ID, speaking person, speech content and listening person. These three events are sent to the multi-party dialogue manager module in order to decide what to do and what to say at each point in time. … Related articles
An Architecture to Develop Multimodal Educative Applications with Chatbots D Griol, Z Callejas – International Journal of Advanced Robotic …, 2013 – cdn.intechopen.com … The natural language understanding and dialogue management modules have been developed according to the Voice Extensible Markup Language (VoiceXML, www.w3.org/TR/voicexml20), defined by the W3C as the standard for implementing interactive voice dialogues for … Cited by 5 Related articles All 3 versions
A Decision Support System for Operating Room scheduling M Dios, JM Molina-Pariente… – Computers & Industrial …, 2015 – Elsevier In this paper we present a Decision Support System (DSS) for surgery scheduling which is currently in use in several Surgical Units in one of the largest hospit.
Restoring incorrectly segmented keywords and turn-taking caused by short pauses K Komatani, N Hotta, S Sato – Proc. IWSDS, 2014 – ei.sanken.osaka-u.ac.jp … the global message queue. Each plug-in receives the messages and functions in accordance with them; the FST in the dialogue management module uses the messages as conditions for FST transition. We added two kinds … Cited by 4 Related articles All 3 versions
Lattice Theoretic Relevance in Incremental Reference Processing J Hough, M Purver – 2014 – pub.uni-bielefeld.de … This requires strong interleaving of repair detection in the parser with the dialogue management module responsible for the lattice-based judgements for this to become possible. To explain the second example, we require (Knuth, 2005)’s notion of lattice-theoretic relevance. … Related articles All 7 versions
On three notions of grounding of artificial dialog companions A Lücking, A Mehler – Science, Technology & Innovation Studies, 2013 – sti-studies.de … Conversational grounding has to be seen as a sine qua non for the dialog management module of ADCs, since “[m]any of the errors that occur in human-computer interaction can be explained as failures of grounding, in which users and systems lack enough evidence to … Cited by 3 Related articles All 11 versions
Automatic dialogue acts classification in Slovak dialogues M Pleva, S Ondas, J Juhar – … RADIOELEKTRONIKA), 2015 25th …, 2015 – ieeexplore.ieee.org … The main motivation of our work was to prepare advanced dialogue management module for our previously developed systems or systems under development (see [7], [8]) that enables spoken dialogue-based interaction. In the past we developed two dialogue managers. …
Development of voice operated service for Žilina local transport timetable I Guoth, R Jarina – … (DT), 2014 10th International Conference on, 2014 – ieeexplore.ieee.org … the IRKR_UNIZA system. Firstly we had to improve the Dialog Manager module by modifying existing and adding new VoiceXML documents, to obtain a required dialog structure related to the new service. Next, ASR server … Related articles
Multimodal and Multi-party Social Interactions Z Yumak, N Magnenat-Thalmann – Context Aware Human-Robot and …, 2016 – Springer … and listening person. These three events are sent to the multi-party dialogue manager module in order to decide what to do and what to say at each point in time. 13.4.2.2 Multi-party Dialogue Manager. For natural interaction …
Therapist-centered design of a robot’s dialogue behavior M Gnjatović – Cognitive Computation, 2014 – Springer … speech synthesis. It should be kept in mind that a dialogue strategy defined by the therapist is automatically integrated in a dialogue management module. This module, in turn, may be integrated in a dialogue system. For example … Cited by 7 Related articles All 2 versions
Enabling effective design of multimodal interfaces for speech-to-speech translation system: An empirical study of longitudinal user behaviors over time and user … JH Shin, PG Georgiou, S Narayanan – Computer Speech & Language, 2013 – Elsevier … Table 1 shows sample log data. The system routing tag represents the information flows from the source module to the target module; for example, ‘FADT’ indicates that the data went from the audio server to the dialog management module in text form. … Cited by 4 Related articles All 4 versions
A Self-Management Service Framework to Support Chronic Disease Patients’ Self-Management T Supnithi, M Buranarach… – … in Knowledge and …, 2013 – books.google.com … Dialog Management Module: This module manages the system dialogs and interactions between the patient and the service. The service utilizes Vaja (http://vaja.nectec.or.th/), a Thai text-to-speech software, in synthesizing audio clips from a dialog corpus. … All 2 versions
Improving language models in speech-based human-machine interaction R Justo, O Saz, A Miguel, MI Torres… – Int J Adv Robotic …, 2013 – cdn.intechopen.com … International Journal of Advanced Robotic Systems, Regular Paper: Improving Language Models in Speech-Based Human-Machine Interaction. Raquel Justo, Oscar Saz, Antonio Miguel, M. Inés Torres and Eduardo Lleida … Cited by 4 Related articles All 4 versions
Influence of Agent Behaviour on Human-Virtual Agent Body Interaction I Stanković, B Popović, F Focone – Speech and Computer, 2014 – Springer … his emotional state. This requires the integration and synergy between the present modules, and the modules for emotion, speech and user recognition, as well as the dialogue management module. Emotion recognition cues … Cited by 1 Related articles All 3 versions
Chatting to Personalize and Plan Cultural Itineraries. A Sorgente, N Brancati, C Giannone… – UMAP …, 2013 – art.torvergata.it … Figure 1b shows the overall architecture of the Dialogue Agent. The modules do not interact directly with each other, but are governed by the Dialogue Manager module in a hub architecture fashion. The hub acts as a router which sends the requests to each specific module. … Related articles All 7 versions
Towards a Caring Home for Assisted Living B De Carolis, S Ferilli, D Greco – … Intelligence” co-located with the AI* IA, 2013 – ceur-ws.org … described later on. Starting from what has been inferred by the user model component, the dialog management module computes the agent’s move using a strategy based on the information state approach [35]. It represents … Cited by 2 Related articles
Laugh machine J Urbain, R Niewiadomski… – Proceedings …, 2013 – informatik.uni-augsburg.de … ENTERFACE’12 Summer Workshop – Final Report, Project P2: Laugh Machine. Jérôme Urbain, Radoslaw Niewiadomski, Jennifer Hofmann, Emeline Bantegnie, Tobias Baur, Nadia … Cited by 4 Related articles All 5 versions
Structural Safety and Integrity Assessment and Use of Expert Systems at MPA Stuttgart K Kussmaul, A Jovanovic – … of an International Course October 2- …, 2013 – books.google.com … Structural Safety and Integrity Assessment and Use of Expert Systems at MPA Stuttgart. K. Kussmaul, Professor Dr.-Ing., Dr. techn. E.h. and Director of the Staatliche Materialprüfungsanstalt (MPA), Universität Stuttgart … Cited by 1 Related articles
Navmetro: Preliminary Study Applications Of Usability Assessment Methods EJ Ferreira – Human Factors in Design, 2013 – revistas.udesc.br … Then, the dialogue management module processes the user’s response – it is in this stage that the sound guidance system reproduces the sounds through the sound buoys installed according to the scheme depicted by figure 3. … Related articles All 5 versions
[BOOK] Virtual, Augmented and Mixed Reality: Designing and Developing Augmented and Virtual Environments: 6th International Conference, VAMR 2014, Held as … R Shumaker, S Lackey – 2014 – books.google.com … Randall Shumaker, Stephanie Lackey (Eds.), Virtual, Augmented and Mixed Reality, 6th International Conference, VAMR 2014: Designing and Developing Virtual and Augmented Environments, Held as Part of HCI International … Related articles All 2 versions
Dialogue-based Exploration of Graphics for Users with a Visual Disability J Plhak – 2014 – is.muni.cz … Image semantics, as well as the dialogue interface, are supported by ontologies representing a formally defined and well-structured knowledge base. Communication with the user is operated by intelligent dialogue management modules. … Related articles All 3 versions
A Chatbot Dialogue Manager-Chatbots and Dialogue Systems: A Hybrid Approach A Woudenberg – 2014 – dspace.learningnetworks.org … A Chatbot Dialogue Manager. Chatbots and Dialogue Systems: A Hybrid Approach. AF van Woudenberg, June 17, 2014 … Related articles All 3 versions
Domain-sensitive topic management in a modular conversational agent framework D Macias Galindo – 2014 – researchbank.rmit.edu.au … Domain-Sensitive Topic Management in a Modular Conversational Agent Framework. A thesis submitted for the degree of Doctor of Philosophy, Daniel Macías-Galindo, B.Eng., M.Sc., School of Computer Science and … Related articles All 2 versions
Social talk capabilities for dialogue systems T Klüwer – 2015 – universaar.uni-saarland.de … Saarbrücken Dissertations in Language Science and Technology, Volume 39: Social Talk Capabilities for Dialogue Systems, Tina Klüwer … Related articles All 2 versions
Modelling Incremental Self-Repair Processing in Dialogue. J Hough – 2014 – qmro.qmul.ac.uk … Modelling Incremental Self-Repair Processing in Dialogue. Hough, Julian … Cited by 2
Dialog Systems Based on Markov Decision Processes Over two Real Tasks I Casanueva Pérez – 2013 – addiehu.ehu.es … The bibliography reviewed is about automatic dialogue management, so when developing the whole systems we will focus on the dialogue management module and the rest of the modules (ASR and semantic understanding …). Figure 1: Architecture of a SDS. … Related articles All 2 versions
An integrated system for voice command recognition and emergency detection based on audio signals E Principi, S Squartini, R Bonfigli, G Ferroni… – Expert Systems with …, 2015 – Elsevier … Information is extracted from audio signals with a speech recogniser and with a sound classifier. The results are then employed by the understanding and dialogue management module that undertakes appropriate decisions. … Cited by 3 Related articles All 2 versions
A System for Conversational Case-Based Reasoning in Multiple-Disease Medical Diagnosis SL Follesø, O Heimark, M Ekerholt – 2014 – brage.bibsys.no … A System for Conversational Case-Based Reasoning in Multiple-Disease Medical Diagnosis. Marius Ekerholt, Sondre Lucas Follesø, Øystein Heimark. Master of Science in Computer Science, Supervisor: Agnar Aamodt, IDI … Related articles All 3 versions
Multi-agent Pickup and Delivery Planning with Transfers B Coltin – 2014 – repository.cmu.edu … Carnegie Mellon University Research Showcase @ CMU, Theses and Dissertations, 5-2014: Multi-Agent Pickup And Delivery Planning With Transfers, Brian Coltin, Carnegie Mellon University … Cited by 2 Related articles All 2 versions
Using emotion as inferred from prosody in language modeling SA Karkhedkar – 2013 – Citeseer … While various components could use emotion information, so far attention has been devoted primarily towards (1) the dialog manager module of a spoken dialog system to decide on an emotion-appropriate response and (2) the synthesis of emotion-appropriate speech. … Related articles All 6 versions