Notes:
Spoken programming is the use of natural language, such as English, to write and execute code by voice. Speech recognition software transcribes spoken commands into text that the computer or programming environment can interpret and act on.
Spoken programming can be useful for several reasons. It can be faster than typing code manually, particularly for people who are more comfortable speaking than typing. It can also make programming more accessible for people whose disabilities make a keyboard or mouse difficult to use.
A number of tools and platforms support spoken programming, including code editors and integrated development environments (IDEs) with built-in or add-on voice recognition. Examples include Dragon NaturallySpeaking, a popular commercial speech recognition package for Windows, and VoiceCode, an open-source tool that lets programmers drive their code editor with voice commands.
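The core translation step described above, turning recognized speech into program text, can be sketched in a few lines. The alias table and function below are purely illustrative (they are not the API of Dragon NaturallySpeaking, VoiceCode, or any other real tool); they assume the speech recognizer has already produced a word-level transcript.

```python
# Illustrative sketch: mapping an already-transcribed spoken phrase onto
# code text. Names (SPOKEN_TOKENS, transcript_to_code) are hypothetical.

# Spoken aliases for symbols that are awkward to dictate directly.
SPOKEN_TOKENS = {
    "open paren": "(",
    "close paren": ")",
    "equals": "=",
    "plus": "+",
    "newline": "\n",
}

def transcript_to_code(transcript: str) -> str:
    """Translate a dictated phrase into code text.

    Multi-word aliases are matched greedily, longest first; any
    unrecognized word passes through as a literal identifier.
    """
    words = transcript.lower().split()
    # Try longer aliases ("open paren") before shorter ones ("plus").
    aliases = sorted(SPOKEN_TOKENS, key=lambda a: -len(a.split()))
    out = []
    i = 0
    while i < len(words):
        for alias in aliases:
            parts = alias.split()
            if words[i:i + len(parts)] == parts:
                out.append(SPOKEN_TOKENS[alias])
                i += len(parts)
                break
        else:
            out.append(words[i])  # plain identifier, keep as-is
            i += 1
    return " ".join(out)

print(transcript_to_code("total equals price plus tax"))
# prints: total = price + tax
```

Real systems layer much more on top of this, such as grammar-aware disambiguation and dynamic context, which is exactly what the papers listed under References address.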
Resources:
- sourceforge.net/projects/spel – Speakable Programming Every Language (formerly Human Speakable Programming Language)
References:
- Improving Spoken Programming Through Language Design and the Incorporation of Dynamic Context (2013)
- English for spoken programming (2012)
- Progress in Spoken Programming (2012)
- Developing a Language for Spoken Programming (2011)
See also:
NaturalJava | NLPA (Natural Language Program Analysis) | VoiceCode
Progress in programming the HRP-2 humanoid using spoken language. PF Dominey, A Mallet, E Yoshida – Robotics and Automation, …, 2007 – ieeexplore.ieee.org. … To do this in a semi-natural and real-time manner using spoken language. In this framework, a system for Spoken Language Programming (SLP) is presented, and experimental results are presented from this prototype system. … (cited by 28)
Real-time spoken-language programming for cooperative interaction with a humanoid apprentice. PF Dominey, A Mallet, E Yoshida – International Journal of …, 2009 – World Scientific. An apprentice is an able-bodied individual that will interactively assist an expert, and through this interaction, acquire knowledge and skill in the given task domain. A humanoid apprentice should have a useful repertoire of sensory-motor acts that the human can … (cited by 13)
Elements of a spoken language programming interface for robots. T Miller, A Exley, W Schuler – Proceedings of the ACM/IEEE international …, 2007 – dl.acm.org. In many settings, such as home care or mobile environments, demands on users' attention, or users' anticipated level of formal training, or other on-site conditions will make standard keyboard- and monitor-based robot programming interfaces impractical. In such … (cited by 3)
[BOOK] Spoken language and vision for adaptive human-robot cooperation. PF Dominey – 2007 – cdn.intechopen.com. … In this framework, a system for Spoken Language Programming (SLP) is presented. … They should be of general utility so that different tasks can be performed with the same set of primitives. … (cited by 8)
Real-time cooperative behavior acquisition by a humanoid apprentice. PF Dominey, A Mallet, E Yoshida – Humanoid Robots, 2007 – ieeexplore.ieee.org. … We previously defined a system for Spoken Language Programming (SLP) that allowed the user to guide the robot through an arbitrary, task-relevant motor sequence via spoken commands, and to store this sequence as a re-usable macro. … (cited by 18)
Anticipation and initiative in human-humanoid interaction. PF Dominey, G Metta, F Nori… – Humanoid Robots, 2008 – ieeexplore.ieee.org. … In addition to allowing the user to issue iCub-specific motion commands, the Spoken Language Programming system implements the different levels of anticipation described in Section II. … (cited by 22)
Towards a platform-independent cooperative human-robot interaction system: I. Perception. S Lallée, S Lemaignan, A Lenz… – … Robots and Systems …, 2010 – ieeexplore.ieee.org. One of the long-term objectives of robotics and artificial cognitive systems is that robots will increasingly be capable of interacting in a cooperative and adaptive manner with their human counterparts in open-ended tasks that can change in real-time. … (cited by 22)
Learning to collaborate by observation. S Lallee, F Warneken, P Dominey – Epirob, Venice, Italy, 2009 – pfdominey.perso.sfr.fr. … Learn by observation implemented here; engage in learned collaboration extended from [12]; learn by Spoken Language Programming implemented in [9-12]. The system then begins execution of the shared plan. … (cited by 6)
The coordinating role of language in real-time multimodal learning of cooperative tasks. M Petit, S Lallée, JD Boucher… – Autonomous Mental …, 2013 – ieeexplore.ieee.org. … Finally we will refer to "spoken language programming" [21] as the method described above, where well-formed sentences are used to specify robot actions and arguments, either in isolation or in structured sequences. … (cited by 8)
Linking language with embodied and teleological representations of action for humanoid cognition. S Lallee, C Madden, M Hoen… – Frontiers in …, 2010 – ncbi.nlm.nih.gov. (cited by 19)
Cochlear Implants: Advances, Issues, and Implications. PE Spencer, M Marschark… – The Oxford handbook of …, 2010 – books.google.com. … Geers (2006) found a small but statistically significant advantage for children using CIs in spoken language programming when compared to those in programs using sign language. However, there is both … (cited by 15)
Towards a platform-independent cooperative human robot interaction system: III. An architecture for learning and executing actions and shared plans. S Lallée, U Pattacini, S Lemaignan… – Autonomous Mental …, 2012 – ieeexplore.ieee.org. … This ability is the basis of the spoken language programming framework and, as we previously demonstrated [16, 29, 30], allows the interaction between the user and the robot to be faster and more efficient. … (cited by 9)
[BOOK] From motor learning to interaction learning in robots. O Sigaud, J Peters – 2010 – Springer. … The focus is put on the use of the Spoken Language Programming approach to facilitate the interaction. … Robot learning is a young, fruitful and exciting field. … (cited by 16)
Human-robot cooperation based on interaction learning. S Lallée, E Yoshida, A Mallet, F Nori, L Natale… – From motor learning to …, 2010 – Springer. … The implementation of this apprentice architecture is one of our long-term goals, for which we developed the Spoken Language Programming (SLP) system. … (cited by 12)
Using A "Wizard Of Oz" Study To Investigate Issues Related To A Spoken Language Interface For Programming. DE Price – 2007 – cs.utah.edu. A thesis submitted to the faculty of The University of Utah.
Early Education for Deaf Children and Their Families: Final Research Report. C Enns, S Kelly – 2008 – cc.umanitoba.ca. University of Manitoba, CSSE 2008, Vancouver, British Columbia.
A hybrid propositional-embodied cognitive architecture for human-robot cooperation. PF Dominey, I Tapiero, C Madden… – IJCNN 2008 – ieeexplore.ieee.org. … In this "apprentice" paradigm, the behavioral repertoire of the robot and its associated vocabulary is expanded as a function of the robot's experience with the human user, i.e. new behaviors are created through Spoken Language Programming (SLP) interaction with the human …
The tip of the language iceberg. PF Dominey – Language and Cognition, 2013 – Cambridge Univ Press. … Dominey P., A. Mallet & E. Yoshida. 2009. Real-time spoken-language programming for cooperative interaction with a humanoid apprentice. International Journal of Humanoid Robotics 6. 147–171. …
Grounding language in action. KJ Rohlfing, J Tani – Autonomous Mental Development, IEEE …, 2011 – ieeexplore.ieee.org. … [7] PF Dominey, A. Mallet, and E. Yoshida, "Real-time spoken-language programming for cooperative interaction with a humanoid apprentice," Int. J. Human. Robot., vol. 6, no. 2, pp. 147–171, 2009. … (cited by 1)
A Model for Verbal and Non-Verbal Human-Robot Collaboration. L Matignon, AB Karami, AI Mouaddib – AAAI Fall Symposium: Dialog with …, 2010 – aaai.org. … Dominey, PF; Mallet, A.; and Yoshida, E. 2009. Real-time spoken-language programming for cooperative interaction with a humanoid apprentice. IJ Humanoid Robotics 6(2):147–171. … (cited by 5)
An augmented reality system for teaching sequential tasks to a household robot. R Fung, S Hashimoto, M Inami, T Igarashi – RO-MAN, 2011 IEEE – ieeexplore.ieee.org. We present a method of instructing a sequential task … (cited by 3)
The language situation in Iceland. A Hilmarsson-Dunn, AP Kristinsson – Current Issues in Language …, 2010 – Taylor & Francis. (cited by 8)
A neurolinguistic model of grammatical construction processing. PF Dominey, M Hoen, T Inui – Journal of Cognitive Neuroscience, 2006 – MIT Press. One of the functions of everyday human language is to communicate meaning. Thus … (cited by 53)
Authentic Literacy and Communication in Inclusive Settings for Students With Significant Disabilities. AL Ruppar – TEACHING Exceptional Children, 2013 – CEC. Carmen is 15 years old and attends ninth grade at her neighborhood school. She is learning to communicate by using two-step switch scanning on a voice-output communication aid (VOCA) and can also use partner-assisted scanning to make choices. …
Integration of action and language knowledge: A roadmap for developmental robotics. A Cangelosi, G Metta, G Sagerer, S Nolfi… – Autonomous Mental …, 2010 – ieeexplore.ieee.org. (cited by 72)
The Action Game: A computational model for learning repertoires of goals and vocabularies to express them in a population of agents. B Jansen, J Cornelis – Interaction Studies 13:2, 2012 – ingentaconnect.com. doi 10.1075/is.13.2.06jan. (cited by 1)
CHRIS: Cooperative Human Robot Interaction Systems (CHRIS D5: Systems Engineering Analysis Report) – chrisfp7.eu. Deliverable 5 (D5), Month 17 – Systems Engineering Analysis Report, with appendices including a snapshot of YARP ports and the D5.1 Adaptive Planning Capability Report (internal deliverable).
An experiment on behavior generalization and the emergence of linguistic compositionality in evolving robots. E Tuci, T Ferrauto, A Zeschel… – …, IEEE Transactions on, 2011 – ieeexplore.ieee.org. (cited by 11)
[BOOK] Developing dialogues: Indigenous and ethnic community broadcasting in Australia. S Forde, K Foxwell, M Meadows – 2009 – books.google.com. … Ethnic audiences for community radio acknowledge the essential service nature of local stations in maintaining languages and cultures through specialist music and spoken language programming. … (cited by 17)
Using brain–computer interfaces to detect human satisfaction in human–robot interaction. ET Esfahani, V Sundararajan – International Journal of Humanoid Robotics 8(1): 87–101, 2011 – World Scientific. DOI: 10.1142/S0219843611002356. (cited by 5)
Interface Issues in Robot Scrub Nurse Design. A Agovic – 2011 – conservancy.umn.edu. A dissertation submitted to the faculty of the Graduate School of the University of Minnesota.
Robot cognitive control with a neurophysiologically inspired reinforcement learning model. M Khamassi, S Lallée, P Enel, E Procyk… – Frontiers in …, 2011 – ncbi.nlm.nih.gov. (cited by 15)