I would not say Siri’s backend is powered by Nuance. I would say Siri’s “frontend” is powered by Nuance, in particular speech-in (speech recognition) and perhaps speech-out (speech synthesis).
Siri’s backend was originally based on a government-sponsored SRI project called PAL (Personalized Assistant that Learns). CALO, the Cognitive Assistant that Learns and Organizes, was developed as part of the PAL project. What is not so widely known is that SPARK (the SRI Procedural Agent Realization Kit) forms the heart of CALO’s task execution. The real strength of Siri lies in this middleware infrastructure, which integrates more than 35 APIs.
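To make the middleware idea concrete, here is a purely hypothetical sketch (not Siri’s actual code, and the domain names are my own invention) of what “integrating many APIs” looks like: a parsed intent gets dispatched to one of several registered service adapters.

```python
# Hypothetical intent-dispatch middleware: one registry, many service APIs.
from typing import Callable, Dict

REGISTRY: Dict[str, Callable[[dict], str]] = {}

def service(domain: str):
    """Register a handler for one intent domain (weather, calendar, ...)."""
    def wrap(fn: Callable[[dict], str]) -> Callable[[dict], str]:
        REGISTRY[domain] = fn
        return fn
    return wrap

@service("weather")
def weather(slots: dict) -> str:
    # A real system would call an external weather API here.
    return f"Forecast for {slots['city']}: (stub)"

@service("calendar")
def calendar(slots: dict) -> str:
    # A real system would talk to a calendar service here.
    return f"Meeting '{slots['title']}' scheduled (stub)"

def dispatch(intent: str, slots: dict) -> str:
    """Route a parsed (intent, slots) pair to the matching service."""
    handler = REGISTRY.get(intent)
    return handler(slots) if handler else "No service for that request."
```

With 35+ such adapters behind one dispatch layer, the assistant itself stays thin; the value is in the routing and the integrations.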
I don’t know what conversation schema Siri uses. However, most Loebner-class chatbots use markup languages similar to AIML, making chatbots a sophisticated form of search. Professional AI researchers tend to bristle at chatbot mavens and claim that this is not true AI. (However, my own position is that sophisticated chatbots are sufficient for perhaps 95% of real-world problems, such as conversational interfaces for the Internet of Things.)
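The “chatbots are search” point is easy to demonstrate. Below is a minimal sketch, in the spirit of AIML-style pattern matching (the rules are mine, not from any real chatbot): a conversation is just a lookup over stored pattern-to-template pairs.

```python
# Minimal AIML-flavoured chatbot: responding is searching a rule base.
import re

# Hypothetical rule base; real AIML files hold thousands of such categories.
RULES = [
    (r"turn (on|off) the (\w+)", "Okay, turning {0} the {1}."),
    (r"what time is it", "Sorry, I have no clock in this sketch."),
    (r".*", "I don't understand yet."),  # catch-all, like AIML's '*' pattern
]

def respond(utterance: str) -> str:
    """Return the template of the first rule whose pattern matches."""
    text = utterance.lower().strip()
    for pattern, template in RULES:
        m = re.fullmatch(pattern, text)
        if m:
            return template.format(*m.groups())
    return ""
```

Nothing here “understands” anything, yet for narrow, well-scoped domains (lights, timers, thermostats) this kind of pattern search covers a surprising share of real requests.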
Not much seems to have come out yet about the inner workings of Siri. I would like to learn more about how it works myself.