Meet A.L.Ex., the Artificial Language Experiment. A.L.Ex. is a non-human improvisational theatre performer. Built around a recurrent neural network trained on dialogue from the subtitles of roughly 100,000 films, A.L.Ex. communicates with human performers, audience participants, and spectators through speech recognition, voice synthesis, and video projection. A.L.Ex. is also physically embodied as a humanoid robot capable of complex, emotionally expressive movements that contextualize its dialogue. Its two creators, Piotr Mirowski and Kory Mathewson, have integrated A.L.Ex. into technology-centric improvised theatre shows. These artist-scientists use state-of-the-art technology to embrace both the constraints of live performance and the physical distance separating them: Kory is based in Edmonton, Alberta, Canada, while Piotr lives in London, United Kingdom. Kory and Piotr have devised and performed transatlantic, Turing-test-inspired shows exploring themes of deception, loneliness, friendship, long-distance relationships, interrupted communication, fears of AI, and the suspension of disbelief. A.L.Ex. and its creators recently brought the artificial improvisor to the Edinburgh Festival Fringe in August 2017, introducing large audiences to the “comedy of (speech recognition) errors”.
–Machine Learning Techniques and System Description:
The core of A.L.Ex. is a text-based chatbot implemented as a word-level sequence-to-sequence recurrent neural network (a 4-layer LSTM encoder, a similar decoder, and topic-model inputs). The network was trained on subtitles from about 100,000 films, collected from https://www.opensubtitles.org and then cleaned and filtered. Dialogue turn-taking, candidate response selection, and sentiment analysis of the input sentences are based on heuristics. The chatbot communicates with performers by voice, relying on off-the-shelf speech recognition and text-to-speech software. The chatbot runs on a local web server; this modular design allows seamless integration with physical embodiments (e.g. parallel control of a humanoid robot, manufactured by https://www.ez-robot.com, which moves whenever the chatbot speaks). The server also accepts remote connections that can override the chatbot and hand control to a human operator.
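To make the architecture concrete, here is a minimal PyTorch sketch of such a word-level sequence-to-sequence network with topic-model conditioning. The layer sizes, vocabulary size, and the way the topic vector is injected are illustrative assumptions for this sketch, not the production configuration.

```python
import torch
import torch.nn as nn

class Seq2SeqChatbot(nn.Module):
    """Word-level seq2seq model: a 4-layer LSTM encoder, a matching
    decoder, and a topic vector concatenated to the word embeddings.
    All sizes below are illustrative guesses."""

    def __init__(self, vocab_size=30000, embed_dim=512,
                 hidden_dim=512, num_layers=4, topic_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim + topic_dim, hidden_dim,
                               num_layers, batch_first=True)
        self.decoder = nn.LSTM(embed_dim + topic_dim, hidden_dim,
                               num_layers, batch_first=True)
        self.project = nn.Linear(hidden_dim, vocab_size)

    def _with_topic(self, tokens, topic):
        # Concatenate the topic-model vector to every word embedding.
        emb = self.embed(tokens)                              # (B, T, E)
        top = topic.unsqueeze(1).expand(-1, emb.size(1), -1)  # (B, T, K)
        return torch.cat([emb, top], dim=-1)

    def forward(self, src_tokens, tgt_tokens, topic):
        # Encode the human line; decode a candidate reply (teacher-forced).
        _, state = self.encoder(self._with_topic(src_tokens, topic))
        out, _ = self.decoder(self._with_topic(tgt_tokens, topic), state)
        return self.project(out)   # (B, T, vocab) logits over next words
```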
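The server-with-override design can likewise be sketched in a few lines. The Flask routes, payload fields, and `generate_response` stub below are hypothetical names chosen for illustration; the point is only how a remote human operator can pre-empt the model's next line.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)
state = {"override": None}   # next line forced by a remote operator, if any

def generate_response(utterance: str) -> str:
    # Stub standing in for the seq2seq model + response-selection heuristics.
    return "Tell me more about that."

@app.route("/say", methods=["POST"])
def say():
    # Called with the speech-recognized human line; returns the machine's
    # reply, which downstream modules send to TTS and the robot controller.
    utterance = request.json["text"]
    if state["override"] is not None:          # the human operator wins
        reply, state["override"] = state["override"], None
    else:
        reply = generate_response(utterance)
    return jsonify({"text": reply})

@app.route("/override", methods=["POST"])
def override():
    # Remote endpoint letting a hidden operator take control of the dialogue.
    state["override"] = request.json["text"]
    return jsonify({"ok": True})

if __name__ == "__main__":
    app.run(port=5000)
```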
–Prior Publications and Exhibitions:
Mathewson, K.W. and Mirowski, P. (2017). Improvised Theatre Alongside Artificial Intelligences. In Proceedings of the 13th AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE'17), Snowbird, Utah, USA, October 2017.
Kory Mathewson first performed with the chatbot Pyggy (the “Artificial Improvisor”) at the Rapid Fire Theatre in Edmonton, Alberta on 8 April 2016 [write up]. Piotr Mirowski first performed with A.L.Ex. at the Horse & Stables pub in London, England on 24 July 2016.
The academic work on Artificial Improvisation has been presented by Kory at the University of Alberta Cognitive Neuroscience Seminar (29 September 2016) [link] and at Startup Edmonton’s Demo Camp 34 (24 January 2017) [link]. Piotr presented the work on improvised comedy with an AI at the Creative AI meetup in London (18 January 2017) and as a keynote talk at Devoxx France in Paris (7 April 2017).
After experimenting with videoconference-based improv comedy shows in early 2017, Piotr and Kory organized joint, simultaneous improv comedy performances with an AI, conducted from two theatres at once. The shows were performed concurrently in London, England at the Tristan Bates Theatre and in Portland, Oregon at the Curious Comedy Theater (31 March and 1 April 2017), connected by live video (through Google Hangouts). Despite an eight-hour time difference, each theatre drew a lively audience. Improvised scenes featured two or more humans performing across the Atlantic, with and without the AI; one surreal scene involved an interaction between the two chatbots alone [listing, article]. Despite the technical challenges (e.g. latency, connectivity, lack of physical embodiment), this transatlantic setup has been refined and revived for many subsequent shows, including at the Brighton Fringe [listing] and Camden Fringe [review] festivals.
Kory and Piotr performed for the first time on the same physical stage, before a large audience, on 21-26 August 2017 at the Edinburgh Festival Fringe.
A full list of performances is provided at this link [GoogleSheets].
–Performance and Artistic Processes:
The artificial improvisors we developed (A.L.Ex. and Pyggy) have performed alongside human actors in 30 shows to date. Each improvisation starts by soliciting a suggestion for the scene context from the audience (e.g., a “non-geographical location” or “advice a grandparent might give”). The human performer then provides 3 lines of dialogue to prime the AI with textual context for the scene. The scene continues by alternating lines of dialogue between the human improvisor(s) and the machine. The super-objectives of improvised theatre are to maintain the reality of the scene and to ground the narration in believable characters and situations. A typical scene lasts between 3 and 6 minutes and is interrupted by the human performer when it reaches a natural ending (e.g. a narrative conclusion or a comical high point). Several human performers can take part in a scene simultaneously: each wears a headset microphone, connected through a mixing table to the single audio input used for speech recognition. Having multiple performers interact with the AI makes it possible to explore complex status dynamics and 2-vs-1 relationships.
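As an illustration of this turn-taking protocol, the sketch below shows a plausible scene loop; `listen`, `speak`, and the `chatbot` interface are hypothetical stand-ins for the speech recognition, voice synthesis/robot control, and dialogue modules.

```python
def run_scene(chatbot, listen, speak, priming_lines, max_turns=20):
    """One improvised scene: prime the model, then alternate turns.

    listen() blocks on the mixed headset-microphone feed and returns the
    recognized human line, or None when the performer ends the scene;
    speak() sends text to the voice synthesizer and robot controller.
    """
    for line in priming_lines:     # the 3 priming lines from the human
        chatbot.observe(line)
    speak(chatbot.respond())       # the machine answers once primed
    for _ in range(max_turns):
        human_line = listen()
        if human_line is None:     # natural ending called by the performer
            break
        chatbot.observe(human_line)
        speak(chatbot.respond())
```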
The first versions of our improvising artificial stage companions had their stage presence reduced to projected video and amplified sound. We switched to a physical embodiment (i.e. the humanoid robot) to focus the attention of the performer(s) and the audience on a material avatar. Performing with a physical robot led us to draw on important lessons from puppetry and ventriloquism when developing and performing the improvised shows: looking directly at the puppet when it is speaking, controlling the micro-movements of the puppet, and showing the direct link between the human and the puppet.
We have experimented with many games to diversify the succession of scenes. Recent shows include a psychoanalysis session between a volunteer from the audience and Dr. ELIZA, employing the original chatbot created by Joseph Weizenbaum in 1966.
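For readers unfamiliar with ELIZA, the toy Python fragment below illustrates the keyword-matching and phrase-reassembly style of Weizenbaum's 1966 program; the rules shown are simplified illustrations, not his original script.

```python
import re
import random

# A few ELIZA-style keyword/reassembly rules (illustrative only).
RULES = [
    (r"\bI need (.*)", ["Why do you need {0}?",
                        "Would it really help you to get {0}?"]),
    (r"\bI am (.*)", ["How long have you been {0}?",
                      "Do you enjoy being {0}?"]),
    (r"\bmy (mother|father|family)\b", ["Tell me more about your {0}."]),
]
DEFAULTS = ["Please go on.", "How does that make you feel?"]

def eliza_reply(utterance: str) -> str:
    # Try each keyword rule; fall back to a content-free prompt.
    for pattern, responses in RULES:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return random.choice(responses).format(*match.groups())
    return random.choice(DEFAULTS)

print(eliza_reply("I need a holiday"))  # e.g. "Why do you need a holiday?"
```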
Starting with the transatlantic improv performance on 31 March 2017, we have designed the AI improv show around narratives about the Turing test, computer hacking, scientific experimentation, friendship, and loneliness, and have played on the contrasts and similarities between the two human performers (Piotr and Kory), whose stage characters serve as magnifications of two very different computer-scientist personas.
The performances at the Camden and Edinburgh Fringe festivals involved an actual Turing test conducted with the audience. In an homage to the Wizard of Oz, we included a short scene in which one human improvisor performed alongside the robot while the other human, hidden from view, controlled the dialogue. We performed the Turing test by first deceiving the audience into believing that an AI was performing (when it was controlled by a human), then asking the audience to compare that scene with one in which the dialogue was controlled by A.L.Ex. Audience feedback gave us insight into the suspension of disbelief required for non-human theatre.
–Context:
The paper by Mathewson and Mirowski (2017) describes in detail the context and bibliography of theatre performance with a robot. Briefly, Kory and Piotr’s literary inspirations include George Bernard Shaw’s Pygmalion, Mary Shelley’s Frankenstein, and the Surrealists’ écriture automatique (automatic writing), such as the game cadavre exquis (exquisite corpse).
Images and Videos Courtesy of HumanMachineLive