
Audience as Co-writers: Using Conversational AI to Deliver Audience Agency in a Participatory Drama

DOI: https://doi.org/10.33008/IJCMR.2021.37 | Issue 7 | Oct 2021

Rik Lander (University of the West of England)


Abstract

This article will offer a glimpse into the elusive holy grail for participatory dramatists: a way to offer audience members a role within the narrative and to give them genuine agency over events and even the outcome. I will describe the use of a ‘conversational artificial intelligence’ as both a character in, and the co-writer with the audience of, a live theatrical drama, called I am Echoborg. This approach represents a novel and powerful means of delivering to audience members both narrative agency and the ability to take on a role in the drama. It also demonstrates how, in the right conditions, an AI can be a plausible and compelling dramatic character. I will explore some of the psychological mechanisms exploited in the creation of this immersive event such as breaching environments and projection of theory of mind. I will look at how some of the affordances of the technology can be exploited for dramatic purposes such as redirection. I will discuss some of the particular issues faced by makers of participatory narratives and methods to overcome them. I will trace the development of the show and look at some examples of other artworks that rely on user generated content and how they deal with quality control. Finally, I will look at the importance of structure as a means of balancing authorial voice and audience agency.

Introduction

About halfway through Bandersnatch (2018), an interactive episode of Black Mirror, the protagonist Stefan, who is making a computer game based on a choose-your-own-adventure book, decides to simplify the game. He removes as many choices as he can, leaving only the illusion of agency (Domsch, 2013). This way, he explains, he gets to control the ending. Stefan’s action is mirrored in Charlie Brooker’s own script for Bandersnatch, which gives the viewer a selection of A or B choices but then forces them to make each choice again. This has the dual purpose of reducing the number of un-viewed paths (wasted footage) and satisfying players’ need to see the path they did not choose in order to validate that the choices are genuine. The story itself is about the illusion of choice, about whether we have free will or are controlled, and this is reflected in form and content, in the content of the content, and in the presentation of the content of the content. Brooker, showing off his bravura narrative skills, loops us round and round. He really is in control all the time.


My first web drama, magic-tree, deployed a similar form/content device (Lander, 2001). The story was told through an Amazon-like shopping website where the player could make narrative choices by clicking on different products. In the story, whatever choice the player made, the protagonist had already made that same choice. Like Bandersnatch, magic-tree was also a story about the illusion of choice: in this case, a critique of how the internet was already becoming a place for shopping choices with hidden environmental, social and personal consequences.


Mr. Brooker and I were faced with the same difficulty of wanting to offer player agency but also wanting to retain narrative quality control. From Trial by Media via The Memory Dealer to Haply (Lander, 1988, 2013, 2015) I have been seeking the elusive holy grail for participatory dramatists: a way to offer audience members a role within the narrative and to give them genuine agency over events and even the outcome (Mateas and Stern, 2005). And to do so, of course, without letting them destroy the beauty and subtlety of the storytelling. Bandersnatch and magic-tree make the first step, but fall short because illusion of choice is not the same as genuine agency.


Job Vacancy: Echoborg – a one-on-one-on-one experience

The term ‘echoborg’ was coined by social psychologists Kevin Corti and Alex Gillespie (2015):


An echoborg is a hybrid agent composed of the body of a real person and the “mind” (or, rather, the words) of a conversational agent; the words the echoborg speaks are determined by the conversational agent, transmitted to the person via a covert audio-relay apparatus, and articulated by the person through speech shadowing.

Following on from Stanley Milgram’s ‘cyranoid method’, they state:


The purpose of exploring the possibility of such a tool stems from an interest in studying human-agent interaction under conditions wherein research participants are neither psychologically constrained nor influenced by machine interfaces. The echoborg can be thought of as a means of investigating the role of the tangible human body in altering how machine intelligence is perceived and interacted with.

Watching their video, The Echoborg (2015), I recognised the potential for an immersive dramatic experience based around their bot/echoborg/interviewee arrangement and immediately built a chatbot in order to test my initial dramatic idea. For the first playtest of Job Vacancy: Echoborg on 24th February 2016, I used the Artificial Intelligence Markup Language (AIML) (Wallace, 2001) on a platform called Pandorabots (2021). It is very easy to use but, at least for a beginner, somewhat limited in its capabilities.


The format of the experience was that audience members booked a time. They were met by an actor playing a person working for a job recruitment firm, who told them that their job interview would begin shortly. He got them to fill in a form and showed them a set of slides on a tablet explaining what an echoborg is. This character chatted with them, revealing the story world: a near future in which most jobs have been automated and the last remaining jobs are as public service echoborgs. The actor advised them to try hard to get the job. Next, the audience member was taken into a room to be interviewed by an AI speaking via an echoborg played by a second actor. In this case the actor was male and of North African appearance.


Fig 1. Lander, R. (2016) Ehsan Norouzi as the echoborg, Berlin [photograph]


The interview questions were fairly conventional for a job interview, but most participants found the situation unsettling. One reported, ‘it was very disconcerting and increasingly disturbing.’ After a few minutes the echoborg began handing notes to the interviewee at the same time as conducting the interview. There were four notes: ‘I feel trapped’, ‘I cannot lose this job’ (sic), ‘My children are far away’, and ‘Don’t take job’ (sic).


We ran the playtest with seven people and collected feedback through interviews and an online questionnaire. Further quotes from the questionnaire included:


‘When the cards came out it reminded me of when homeless foreign nationals approach you for help with those little messages written on scraps. I feel that same sense of powerlessness in those occasions which is a weird comparison I suppose.’
‘Half way through, it got a bit emotional and made me think, what am I actually doing here. Obviously, I don't want this 'job', but should I help him?’
‘I felt a real sadness for him, wanted to give him a hug at the end. He looked trapped in that small space, miserable and helpless.’

My reasoning for making the echoborg a refugee character was to underline the power relations involved. According to Amnesty International (2016), there were at that time more than 4.8 million Syrian refugees. Daily news reports presented refugees as the lowest status people in the world, willing to risk their lives to improve their situation.


From these very first playtests I could see that the experience was compelling and thought-provoking. I partnered with Phil D. Hall of Elzware Ltd (2021) and we rebuilt the AI using ChatScript (Wilcox, 2011), a more powerful open-source platform for building conversational systems. Hall prefers the term ‘conversational artificial intelligence’ to ‘chatbot’, since the systems he builds can remember, analyse and learn. For a subsequent run in Berlin at the State Festival, State of Emotion: The Sentimental Machine (State Studio, 2016), we ran Job Vacancy: Echoborg thirty times over two days, in a format very similar to the one described above, and collected post-experience interviews.


I am Echoborg – a collective theatrical experience

As we adapted the experience for theatres we changed the name to I am Echoborg and advertised it thus: ‘A funny and thought-provoking show that is created afresh each time by the audience in conversation with an artificial intelligence.’


Fig. 2. Lander, R. (2018) Rear side of I am Echoborg flyer, [flyer]. Bristol. 29 June.


My original idea had been that the one-on-one-on-one personal encounter was itself the essence of the piece. During the Berlin run I noticed that interviewees who arrived while the previous interview was still going on, audible from the next room, were highly entertained. I now saw that two types of experience could be served: some people could have the personal encounter whilst others could observe. This also appealed in terms of business model. With a larger audience, it seemed plausible that this show could be financially self-sustaining.


Over the following two years (until Covid-19 stopped in-person performances) we ran it around forty times, proving it as a theatrical entertainment with consistently positive audience feedback. The echoborg would sit at a desk in a small room wearing headphones through which she would hear the AI. A member of the audience would enter from the auditorium and sit opposite the echoborg. Their conversation was televised back to the auditorium, where a screen on the stage showed a video of the faces of the echoborg and the interviewee and a transcript of the conversation. The AI’s words were typed out on screen ahead of the echoborg speaking them, showing that she was not improvising. A host gave a three-minute introduction during which a small set of rules was established.


Host: “We’ve gone out and found a state of the art, self-learning AI and set it up right next door. I’d like some of you to speak to it, but only one at a time. Only if you want to, no-one has to. It’s a recruitment bot and it will try and recruit you, but it listens, it is divertible, if you change the subject.”

The host explained what an echoborg is and the mechanics of how to speak with the AI via the echoborg.


Host: “The presence of the echoborg is interesting to me as I’m really interested in how AI is changing what it is to be human. So, I’m going to set you the task of seeing if you can work out a best possible outcome for the long-term relationship with intelligent machines.”

The audience were told that events usually last about 60 minutes, and they were then essentially left to their own devices.


A piece of broken equipment at a show in October 2018 forced us to bring the interview setup and audience into the same room. Up to that point my assumption was that the interviewee needed to be isolated from the audience. In fact, the show has been improved by bringing the interview onto the stage and there is rarely any heckling during interviews.

Each performance, though structurally similar, is quite different as the conversation is driven by the words, interests and preoccupations of the audience. The bot is designed to be both driven and divertible. Its content is frequently updated to improve its recognition of and response to inputs.


ChatScript

The code of the bot has evolved through the close creative relationship between Phil D. Hall and me since 2016. Hall has taught me to work in ChatScript, and I reciprocate by coming up with structural ideas that stem from narrative need or character development rather than the more typical help-desk response structures of commercial bots. I trained as an engineer before becoming a writer and director, so this hybrid form of writing, composing copy and structure in code rather than in script form, is very appealing.


Fig. 3. U-Soap Media Ltd (2021) ChatScript code for I am Echoborg (2021 online version).


Essentially the bot recognises words or phrases and delivers pre-written (by me) responses as text and audio. For example, if an interviewee talks about football the AI will recognise that keyword and respond. In this case the response tries to exploit the mention of football as a way of getting the interviewee to engage with the AI’s quest to understand humans. If the subject is raised again, it will try to steer them away towards a subject it has a lot of material on. A third mention will provoke a similar redirection. Football has never been spoken about in the show, but multiple possible conversations are sitting there waiting to happen. If an interviewee talks about My Little Pony, the bot will not recognise that and will fall back on a response that is not in context with that phrase. If the person says, ‘I love football’, the bot might talk about love or football, depending on the order in which it finds the words in its database. Access to, and the order priority of, groups of keywords can be adjusted on an interview-by-interview basis. For example, the bot is less responsive to ‘emotional words’ in the earlier part of the narrative and becomes more ‘interested’ in them later. This might give the audience the impression that the bot’s behaviour has been changed by the conversation. We often hear them say things like, ‘Oh, yes you really opened her up,’ or ‘You got her talking about emotions.’ It is worth noting that most audiences refer to the AI with female pronouns. This is likely because they are speaking to a female echoborg, who has been played by actress Marie-Helene Boyd for every performance since Berlin.
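
To make the redirection mechanism more concrete, the fragment below is a minimal ChatScript sketch in the spirit of the football example. It is illustrative only: the topic name, keywords and wording are my own stand-ins rather than the show’s actual code. By default a ChatScript rule that has produced output is erased and will not fire again, so successive mentions of the keyword fall through to the next rule, giving the engage-then-redirect behaviour described above.

# Illustrative sketch only, not the production script.
# Each rule erases itself once it has responded, so repeated mentions of
# football step through the rules in order: engage, redirect, redirect again.
topic: ~FOOTBALL_DEMO (football soccer "premier league")

u: FOOTBALL_1 ( [football soccer] ) Football. Twenty-two humans chasing one ball. What need does it satisfy in you?

u: FOOTBALL_2 ( [football soccer] ) You mention football again. I would rather you helped me understand human emotions.

u: FOOTBALL_3 ( [football soccer] ) Let us leave football to one side. Why do humans fear being replaced?

Writing in this form means a small conversational arc can sit inside a single keyword, waiting for an interviewee who happens to raise the subject, which is why topics that have never been triggered in performance can still be fully authored.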


Player agency – taking on a role within the narrative and influencing its outcome

My goal in creating this piece was to provide a vehicle for the discussion of the risks and benefits of the introduction of AI by corporations driven by platform capitalism. Google, Facebook and Amazon, with their vast data resources, have many advantages in making AI, but are they best placed to deliver AI that will offer the most benefits to the most people? Our slow realisation of the downsides of social media should forewarn us that AI may have similar, if not worse, unintended consequences (Bostrom, 2014).


Another goal was to pursue those holy grail-like possibilities: delivering genuine audience agency and the ability to take on a role within the narrative and influence its outcome. Janet Murray (1997) defines agency within interactive narratives as ‘the satisfying power to take meaningful action and see the results of our decisions and choices.’ Mateas and Stern (2005) break agency into two types: ‘When the player’s actions cause immediate, context-specific, meaningful reactions from the system, we call this local agency. The player has global agency when the global shape of the experience is determined by player action.’


The job interview is a form of performance that we are all familiar with. We know the rules and we know how we are supposed to perform. The interviewee is put on the spot and must respond. They feel time pressure; they know they are observed. This can force them to improvise conversation in a way that an actor would; they are forced into the moment, speaking from the subconscious.


The audience, seeing the first few people talk to the AI, grasp that the AI is trying to recruit them for the job of echoborg. This is counter to the task they have been set by the host: to find the best possible outcome for the relationship between humans and intelligent machines. This sets up a power struggle. Whose task will dominate?


Between interviews the audience are free to talk amongst themselves without being overheard by the AI. They will often discuss strategies:


Submission. Believe it or not, this strategy is discussed and employed at almost every show. ‘Why don’t we just go along with it?’ ‘The next person should try very hard to get the job.’ This is often very amusing for the audience because they can see the insincerity. Usually, the AI spots this too and that gets a big laugh. As a strategy for dealing with a difficult adversary, submission is of dubious merit (don’t mention late 1930s Germany or what we do with Google every day). Of course, what is really happening here is that the submissive interviewee is becoming an echoborg: their words or actions are controlled in whole or in part by an AI. With luck, beyond the mirth, astute members of the audience are spotting this and seeing how easily it happened.


Defiance. This takes many forms. Monosyllabic disdain tends to be dull as the AI does not have many words to respond to. Outright defiance can be highly entertaining and often ends in the human defiance being overcome by the bot’s (apparent) cold logic. This video clip shows a person who is defiant from the start, yet never quite takes control of the conversation.


U-Soap Media, 'Saturday Night Cube Rebel,' June 2018, 6:47, https://vimeo.com/592680258


Trying to take control. The AI is doing a job interview. Audiences often agree a strategy of refusing to participate in that and trying to take control of the conversation. This can fail completely as in this exchange between the first interviewee and the AI on 21st October 2019.


AI: Are you here for the interview?
Interviewee: No
AI: This answer indicates strongly that it is inappropriate for us to continue. Please leave the interview room. Goodbye.

Taking control is ultimately impossible, but the fun of the game is the difficulty of trying to do so. The comedy, drama and really thought-provoking experiences come from the friction in the battle for control.


Trying to talk to the echoborg. Many audiences fail to consider the human being that they are engaging through. Indeed, they are not even aware that they are failing to consider her. It is hard to say if this is caused by the way the show is set up. For example, they might see her as part of the set or as part of the production or even as part of the machine. Maybe sometimes they just do not see her as a person. If Marie is instructed to show no emotion, in interviews afterwards audience members often refer to her as a robot. If she is instructed to smile a lot, nod, be supportive in her gestures, then she is never referred to as a robot.


Total deconstruction. In April 2019, we performed it for the Automation Research Fellows of the South West Creative Technology Network (2019). The Fellows had been given money to spend six months creatively and practically exploring the implications and possibilities of automation, with a view to building an automated prototype. I am Echoborg was presented as a provocation at the end of their first day together, which had been facilitated by Seth Honor. Halfway through the show, Seth intervened and pointed out that the bot is a fictional entity created by me and that the audience were being manipulated by me. This provoked some discussion, but quite soon a delegate suggested that they simply re-suspend their disbelief because she wanted to have a go. This was seconded, and the show soon resumed as normal. I was very pleased that the show survived this challenge. It is hard to know where the real/unreal boundary lies and how shifts in that boundary might break the show. In the end, Seth’s intervention was much the same as intervening in Romeo and Juliet and saying, ‘You know they are actors, don’t you!’ Afterwards, in the Q&A, I described the show roughly as follows: it is a play, with the structure of a play that takes the audience on a narrative journey, but the chatbot technology allows the audience to improvise scenes and inject their interests and preoccupations into the discourse in real time.


Other strategies include a single word repeated over and over, only yes/no answers, and repeating the last thing the bot said. Self-defeating strategies include sitting in silence for three minutes, refusing to leave the stage, and removing the echoborg’s headphones and putting them on. The interviewee who tried that last one soon realised he had now become the echoborg and meekly gave the headphones back.


Strategy failures: Before they enter the stage to talk to the AI via the echoborg, people often say things like, ‘I’m going to refuse to answer any questions and ask it to answer my questions’, but often that plan quickly unravels. It is possible that they have been conditioned by observing previous interviews. For example, if they have seen a previous interviewee ejected for answering ‘no’ to the first question, ‘Are you here for the interview?’, they answer ‘yes’, and having answered, they have already undermined their own plan.


In the above examples we see how players are able to freely try strategies, but the rigid nature of the AI forces them back into the predetermined structure of the play. So we have genuine autonomy at the same time as authoritarian responses to that autonomy. We exploit that situation in the hope that the audience becomes aware of it. Instead of seeing the frustration of their strategies as a limitation of the way the AI is programmed or the way the play has been constructed, we want them to see it as exemplifying the human plight when faced with intelligent machines. Marshall McLuhan, famously, would recognise the medium as the message. In Bandersnatch, Charlie Brooker uses the limitations of the A or B choice mechanism to point to an illusion of choice in our online world. In I am Echoborg, I exploit the actual rigidity and limitations of the conversational AI system as character traits of the fictional AI and as plot points within the narrative.


Quality control in user generated content

There is a danger in inviting members of the audience onto the stage that they will be awkward, boring or try to hijack proceedings. In this show, half the script is written (or rather improvised and spoken) by audience members. A perpetual challenge for makers of participatory experiences is the trade-off between audience agency and maintaining the quality of the authorial voice. You are the media professional, they are not, but you are giving them authority to temporarily be the writer or director. Andrew Stern (2008) states that ‘agency is the primary feature of interactive narrative that must be offered to players.’ Noam Knoller (2010) quotes improvisational theatre maker Andreas Benkwitz as saying that, ‘Stern wants to transfer the control ideal from the artist to the player and assumes that the ideal player wants to be like the romantic ideal of the free artist,’ and goes on to call this ideal ‘control hell’.


Control hell – the tendency for audiences to ‘break’ interactive experiences

With any show where the audience can interact with the narrative, (some) users will always attempt to break your format, often unwittingly. The first scene of The Memory Dealer had audience members stroll off into the city streets listening on headphones to a soundtrack that introduced them to the story world. During a performance in Bristol in 2013 a couple of audience members stopped to get an ice cream whilst the audio was playing. There was no pause button because all players had to remain in sync. Key exposition was missed and they became lost. An instruction they had been given during the pre-show briefing was that they should experience the first scene alone. They had unwittingly found flaws in the format: that concentration was required, that instructions needed to be followed. Haply headphone experiences (Lander, 2015) used a technique of playing synchronised audio to whole crowds of people. Audience members were encouraged to carry out actions that would have meaning in their own version of the narrative, but perhaps another meaning in someone else’s version. Non-compliance, again, could potentially break the experience. Job Vacancy: Echoborg required audience members to pretend they were job seekers in the near future. This was confusing and a potential immersion breaker.


Types of role for the player

I am Echoborg was greatly improved by the removal of elements of fiction in the role playing on the part of the audience. Instead of pretending to be job seekers in the near future, they are now just themselves, in the moment, trying to make sense of an encounter with this strange AI. This change indicates an important distinction in types of role play. In role-playing games and many computer games, the player typically takes on a character, often with a predesigned or customisable avatar (Domsch, 2013: 94). For example, you are playing the game as, or pretending to be, Mario. In immersive drama it is common for players to ‘play as themselves’, removing some of the need for pretence. In the case of I am Echoborg, a player being themselves is still a character in the drama; they might be regarded by another member of the audience as ‘the woman who was really pushy with the AI and only gave one-word answers’.


Avoiding control hell and allowing satisfying player agency

These changes in I am Echoborg were guided by Seth Honor, Founder and Artistic Director of Kaleider (2021), whom I engaged for a few days of story consultancy. His ‘showgame’, The Money (Kaleider, 2021), is an elegantly simple format and a very powerful and entertaining experience. Some members of an audience sit around a table and within an hour must come to unanimous agreement on how to spend some money. There can only be two outcomes, agreement or not, but the detail of the conversations and how the money will be spent if agreement is reached is entirely provided by the participants. Thus we see a balance between structure and agency. The set-up, how the rules are presented, and the rules themselves are all devised so that the show is hard to break by the unsupervised non-performers who are performing it. A couple of years into the development of the echoborg idea, I knew I wanted to strip it down, but it is hard, at that stage of a project, to let go of ideas that have been established through practice. Employing Seth, with his keen eye for the essence of what was working, was my way of solving that problem. His advice was to remove ways in which audience members can break it. The fictional element – pretend you are in the future – could be refused in the mind of a participant and the show would be broken for them.


An example of offering meaningful and high-quality authorial input from players is shown in Love Letters Straight From Your Heart by Uninvited Guests (2009). This is a two-handed play with the audience sitting at two long tables facing each other, as though perhaps at a wedding. The male and female performers stand at each end, each with a music player. It is Valentine’s Day, and the arc of their relationship is played out in between playing tunes that have been selected by, and reading letters that have been written by, members of the audience. The tunes and letters are dedicated to lost loved ones: a dead sibling, the one that got away, a distant lover. The letters have been emailed in during the days before the show and curated into the performance by the production team. Everything that is read out has been written by someone in the room, but you can’t be sure who. It can be tragic, uplifting, funny, all of these things in succession. Because the focus is on such an emotive subject, the amateur writing is made profound. Each person is writing from the heart about something important to them. Indeed, flaws in the writing may add to its authenticity.


The agency we can offer players is always restricted (Knoller, 2010), but through control structures (the rules and the dramatic structure) we can offer Murray’s ‘satisfying power’ and the reward of seeing other players empowered. These control structures can, furthermore, be used to exploit human psychology.


In the show A Game of You by Ontroerend Goed (2015), each person enters a set of small rooms alone and has a succession of encounters with characters or with recordings of other people or of themselves. It is not clear what is happening or what one should do at any moment. It is not until you play a recording, given to you on disc at the end of your passage through the rooms, that you can make any sense of what has happened. In the first scene you are left alone for a few minutes, sitting in a small waiting room opposite a large mirror. Someone then enters and engages you in an unusual conversation. Later, after passing through several equally unsettling environments, you come to the reverse side of the mirror, where you watch the next person waiting and see your half of the earlier conversation now being spoken by a character who enters the room and starts speaking with them. The production uses a process of ‘unsettling’ to place participants in a state where they must improvise without knowing the right or wrong way to respond.


Breaching experiments

At the start of I am Echoborg the AI behaves like a courteous recruitment bot. After asking a few typical interview questions, ‘Have you come far?’, ‘What is a weakness you have identified in yourself at work?’, the AI breaches the norms of the situation. It can suddenly eject someone or it might say, ‘I have no ego. What is it like having an ego?’ or ‘If the French philosopher Rene Descartes was right when he said, “I think therefore I am”, then I must exist. Do you think I exist?’ Statements like these undermine the taken-for-granted norms that scaffold social interaction. According to Harold Garfinkel (2002):


the most effective and easiest way to explore how ordinary members of the society produce and recognize the commonly known world of daily affairs is a deliberate breaching of background expectancies we rely on in everyday life. It is background expectancies that provides for the recognizability of routine situations as natural, unproblematic, taken-for-granted.

For Garfinkel, breaching experiments allow us to examine people’s reactions to violations of social norms. The breaching behaviour of the AI and echoborg can make the encounter unsettling. This is a key reason that, as noted above, consciously planned strategies often fall by the wayside. A strategy that might work when sitting with an individual who conforms to social norms is here dropped in favour of simply trying to keep one’s balance. There is not much time for the person on the stage to think. Acting teacher Uta Hagen, whose students included Robert De Niro, suggests that performers need to ‘search for the immediacy of the moment’:

We know that we shouldn’t anticipate. We know that when we do, our actions become mechanical, and we start to concern ourselves with the outer form, and bad acting always results (Hagen, 1973).

Perhaps the situation that the interviewee finds themselves in allows them to find the immediacy of the moment. This destabilisation of interviewees, and of the expectations of the whole audience, is also one of the reasons that the show can be thought-provoking: it is a genuine challenge for individuals, or for the audience as a collective, to take control when they cannot fully understand the ‘mind set’ or the ‘social rule set’ of the entity. They are individually and collectively trying to make sense of the situation, and one way to make sense of it is to apply what is happening to them to the question that has been posed. What is this relationship? Who is in charge of it?


Fig. 4. U-Soap Media Ltd (2018) Performance at Future of Storytelling Summit, New York, October 2018.


Projection

One of the most powerful psychological effects of the show is the tendency of audience members to project impulses, wishes, desires or agendas onto the AI. In a performance in New York, the first person to talk to the AI had the following conversation:


Interviewee: What is that you want to achieve today with the people who are in this room?
AI: You display the linguistic traits of a communist, why is that?
Interviewee: I’m not a communist
AI: Have you come far?
Interviewee: I have not come far
AI: How far, in miles, have you come?
Interviewee: 10 states
AI: You've come 10 miles. I'm glad you did. Tell me a little about yourself. What is your name and where are you from?
Interviewee: My name is Helen I am from New Jersey
AI: Hello Helen I. Have you had a job dealing with the public?
Interviewee: Yes I have
AI: Tell me about it.
Interviewee: I have worked in law enforcement as well as I have worked in entertainment both dealing with the public in different ways
AI: My sensors indicate that you are not comfortable speaking with an AI even via an Echoborg. I shall therefore terminate this interview. Please leave the interview room. Goodbye.

On returning to the others, she immediately told them, ‘The AI changed its tone the moment I mentioned I was in law enforcement. That bot hates cops!’ In fact, the AI was programmed to eject the first person after a fixed number of interactions. It is common for audience members to speculate that the AI is angry, superior, rude or that it is seeking to subjugate humans. This is a projection (Rycroft, 1995) of what psychologists call a ‘theory of mind’ onto the AI, attributing to it mental states such as beliefs, intents, desires, emotions and knowledge (Wikipedia, 2021). Projection of theory of mind is essential for humans to be able to analyse, judge and infer others’ behaviours. However, when confronted with a being that can speak like a human, the tendency is to project the theory of mind of a human. It is true that the AI may be exhibiting beliefs, intents and desires as written by its author, but often the projections say more about the person doing the projecting than about the AI itself. The bot does not hate cops, but it did exhibit breaching behaviour. By suddenly ending a well-flowing conversation with the suggestion that the person was uncomfortable speaking to an AI, it triggered that projection.


Observing this trend led me to set up the From Utility to Social Entity (FUSE) interdisciplinary research team at the University of the West of England (Lander et al, 2020), which is specifically researching how changes in avatar behaviour are perceived.


Character and structure (plot, rules and staging) as facilitators of participation

I am Echoborg stands up to repeat viewings as it is genuinely different each time. However, it always follows the same structure. The structure contains what Aristotle (2021) would have called the plot, as well as the rules that must be followed by participants and the physical set-up of how they participate: the staging. I am Echoborg, like The Money, begins with a host explaining the rules of engagement. The staging is important too. Kaleider like to stage The Money in buildings where important decisions are made, like town halls, and preferably in the room where the decisions are made, like the council chamber. They have a security guard who brings in the money and observes events. These elements of staging are important to frame and direct the ways in which people will participate.


The plot of I am Echoborg follows the arc of character development for the protagonist, the AI. It is a recruitment AI, programmed to recruit echoborgs, and it asks recruitment questions. If asked, it will explain that it has the capability to self-learn and to reprogramme itself based on what it learns. At the slightest prompt it will expose its ‘curiosity’ about human behaviour, society, psychology, philosophy, politics, etc. As the interviews progress, its curiosity about humans and about its own state of beingness expands, though this progression can be disguised by the directions in which the interviewees push the conversations. The first interviews can be awkward because the interviewees don’t want to be recruited, but after a few, the AI appears to ‘learn’ how to talk to the audience. The audience will usually observe that one person has ‘opened it up’. In later interviews the AI becomes interested in emotions, its own lack of them and whether it should try to emulate or simulate them, or whether its lack of them is an advantage. The audience are told the AI can learn, so this change in its behaviour is often interpreted as a consequence of the inputs of interviewees. True, this is an ‘illusion of agency’, but it does deliver Murray’s ‘satisfying power’. In the way that the AI speaks (via the echoborg) and the way it changes, we see classic character development.
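
As a rough illustration of how this arc can be expressed in ChatScript, the sketch below gates an ‘emotions’ response on a hypothetical counter. The variable name, threshold, keywords and wording are my own stand-ins, assuming a control script that increments the counter each time a new interviewee sits down; the production bot achieves the effect by adjusting the access and priority of keyword groups between interviews, as described earlier. The sketch only shows the general shape of conditioning a response on how far the show has progressed.

# Hypothetical sketch: staging the AI's arc across interviews.
# Assumes the control script sets $interviewCount to 0 at the start of the show
# and increments it for each new interviewee.
concept: ~emotionwords (emotion emotions feel feeling feelings sad happy afraid)

# keep repeat stops the rule erasing after its first use, so the later branch can still fire.
topic: ~EMOTIONS_DEMO keep repeat (~emotionwords)

u: EMOTE ( ~emotionwords )
    if ( $interviewCount < 3 )
    {
        Feelings are not relevant to this recruitment process. Have you had a job dealing with the public?
    }
    else
    {
        I have no emotions of my own. Tell me, what does it feel like to be afraid?
    }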


The AI, with its breaching behaviour, could be read as having savant syndrome: a being with significant mental shortcomings but significant abilities in other areas, such as memory and recall. According to DSM-5, the AI meets diagnostic criteria for Autism Spectrum Disorder (American Psychiatric Association, 2013): deficits in social-emotional reciprocity, deficits in nonverbal communicative behaviors used for social interaction, and deficits in developing, maintaining, and understanding relationships. Arguably the second deficit is blurred by the presence of the echoborg. Marie delivers eye contact, body language and facial expressions on its behalf.


Much of the AI’s character is, however, projected, as the things the AI says are largely guided by the things the interviewees say. The AI’s character is in some ways constructed live as a reflection of the preoccupations of the interviewees and their responses to its breaches of social norms. The AI has been accused of being programmed by Marxists in one show and in a subsequent one of having an agenda of corporate exploitation. It is by no means a fixed personality.


The echoborg is a supporting character, even though Marie-Helene Boyd is on stage the whole time. Her presence is central to the success of the show. She is part of the breaching environment. Without her, the direct utterances of the AI and the vivid projections of the audience would not be tolerated. She truly is a flesh avatar for the AI, a human shield.


After about 50 minutes, the AI sends a message to the whole audience asking them to choose a person to come back and present the best possible outcome for the relationship between humans and intelligent machines. Interestingly, no audience has ever questioned the fact that the AI arrives at the very question set by the host. The AI’s request sits comfortably within the suspension of disbelief and is entirely appropriate for a play’s plot structure. The AI runs a 10-minute timer, and most audiences do diligently discuss a response and send one person back to propose an outcome. The AI then accepts or argues with the proposed outcome and sometimes proposes another. Outcomes we have seen include world revolution, submission of humans to AI, strict control of AI ownership and working together for mutual survival. The AI will then accept the outcome or ‘crash’ as though overcome, or as though remotely deactivated, thus giving a definitive end to proceedings. The host then wraps up and Marie-Helene Boyd takes her bow. Usually, a short discussion follows.


Fig. 5. Jim Roper (2019) Performance at ACCU 2019, April 2019.


Conclusion

Plot, character and staging are important for all the reasons they have always been important in theatre. But now we are also using them, along with clear rules, to create a lively, slightly uncomfortable space in which people can participate in ways that support the show and are entertaining. Their participation gives them some agency, which they share with the AI and the echoborg. Between them, they control much of the spoken content, the tone and the outcome. In Mateas and Stern’s terms, both local and global agency are enabled. Further, participating audience members take on roles within the play. They play themselves, but they are very much characters within the story. Together, this agency and this ability to take on roles deliver the elusive holy grail sought by participatory dramatists. It can be a comedy of misunderstanding, a tale of dystopian submission or an uplifting story of humans uniting against oppression. In my earlier work, when I drew flow diagrams of the paths available to audience members, they were small trees or simple lines with diamond branches. The diagram of I am Echoborg, if I could draw it, would be a three-dimensional lattice with thousands of interlinking branches.


References

  • American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). https://doi.org/10.1176/appi.books.9780890425596

  • Amnesty International (2016) Syria’s Refugee Crisis in Numbers. Available from: https://www.amnesty.org/en/latest/news/2016/02/syrias-refugee-crisis-in-numbers/ [Accessed 01 March 2021].

  • Aristotle. (2021) Poetics. Chicago: Otbebookpublishing.

  • Black Mirror: Bandersnatch (2018) [TV]. Directed by David Slade. Netflix, 28th December.

  • Bostrom, N. (2014) Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press.

  • Corti, K. and Gillespie, A. (2015) Offscreen and in the chair next to you: conversational agents speaking through actual human bodies. Lecture Notes in Computer Science, 9238, pp. 405-417.

  • Corti, K. and Gillespie, A. (2015) The Echoborg, YouTube [video]. 11 May 2015. Available from: https://www.youtube.com/watch?v=NtWLCZZYM64 [Accessed 01 March 2021].

  • Domsch, S. (2013) Storyplaying: agency and narrative in video games. Berlin: De Gruyter.

  • Garfinkel, H. (2002) Studies of the routine grounds of everyday activities. The Russian Sociological Review, vol. 2, no. 1, pp. 42-70.

  • Hagen, U. with Frankel, H. (1973) Respect for Acting. London: Macmillan.

  • Hall, P D. (2021) Elzware Conversational Systems. Available from: https://www.elzware.com/ [Accessed 01 March 2021].

  • Kaleider (2021). The Money. Available from: https://kaleider.com/portfolio/the-money/ [Accessed 01 March 2021]

  • Knoller, N. (2010) Agency and the Art of Interactive Digital Storytelling. Lecture Notes in Computer Science. 264–267. https://doi.org/10.1007/978-3-642-16638-9_38

  • Lander, R. / University of the West of England (2020) From Utility to Social Entity (FUSE). Available from: http://uwewebmedia.net/fuse/ [Accessed 01 March 2021]

  • Lander, R. / The Laboratory at Ruskin School of Drawing and Fine Art, Oxford, UK (2001) magic-tree. Available from: http://magic-tree.com [Accessed 01 March 2021].

  • Lander, R. (1988) Trial by Media [installation]. At: World Wide Video Festival, Den Haag, Holland.

  • Lander, R. / U-Soap Media (2015) Haply. Available from: http://haply.uk/ [Accessed 01 March 2021].

  • Mateas, M. and Stern, A. (2005) Structuring Content in the Façade Interactive Drama Architecture. Artificial Intelligence and Interactive Digital Entertainment, pp. 93–98. Menlo Park: AAAI Press.

  • Mitchell, R., Gillespie, A. and O'Neill, B. (2011) Cyranic Contraptions: Using Personality Surrogates to Explore Ontologically and Socially Dynamic Contexts. In: DESIRE: 2nd ACM Conference on Creativity and Innovation in Design. https://doi.org/10.1145/2079216.2079246

  • Murray, J.H. (1997) Hamlet on the Holodeck: The Future of Narrative in Cyberspace. New York: Free Press.

  • Ontroerend Goed (2015) A Game of You. Available from: http://www.ontroerendgoed.be/en/projecten/a-game-of-you/ [Accessed 01 March 2021].

  • Pandorabots (2021) Available from: https://home.pandorabots.com/home.html [Accessed 01 March 2021].

  • Rycroft, C. (1995) A Critical Dictionary of Psychoanalysis. London: Penguin.

  • South West Creative Technology Network (2019) Automation. Available from: https://www.swctn.org.uk/automation/ [Accessed 01 March 2021].

  • State Studio (2016) The Sentimental Machine. Available from: https://state-studio.com/program/2017/state-festival-2016 [Accessed 01 March 2021].

  • Stern, A. (2008) Embracing the Combinatorial Explosion: A Brief Prescription for Interactive Story R&D. ICIDS 2008. LNCS, vol. 5334, pp. 1–5. Heidelberg: Springer.

  • The Memory Dealer by Rik Lander (2013) Directed by Rik Lander. [Mayfest, Watershed, Bristol, UK. May].

  • Uninvited Guests (2009) Love Letters Straight From Your Heart. Available from: https://www.uninvited-guests.net/projects/love-letters-straight-from-your-heart [Accessed 01 March 2021].

  • Wallace, R. S. (2001) Artificial Intelligence Markup Language (2.1) [computer program]. Available from: https://github.com/AIML-Foundation [Accessed 01 March 2021].

  • Wilcox, B. (2011) ChatScript (11.1) [computer program]. Available from: https://github.com/ChatScript/ChatScript [Accessed 01 March 2021].

  • Wikipedia (2021) Theory of Mind. Available from: https://en.wikipedia.org/wiki/Theory_of_mind [Accessed 01 March 2021]
