Interview with Ryan Purcell, director of The Adding Machine: A Cyborg Morality Play

  

A collaboration between The Feast Theater and SU’s Theater Program, The Adding Machine: A Cyborg Morality Play was produced this fall at the Lee Center for the Arts at Seattle University. An adaptation of Elmer Rice’s 1923 play, The Adding Machine tells the story of Zero, an office worker in an unhappy marriage who loses his job to the “adding machine,” essentially an early calculator. Director Ryan Purcell maintains the play’s focus on the effects of automation on the workplace and personal life, but introduces one remarkable component: Generative AI (GenAI) output is included in some of the scenes.

SU’s Technology Ethics Initiative Director Dr. Onur Bakiner caught up with Ryan for a conversation about AI, the arts, and the future of creativity. (The interview has been edited for length and clarity.)

 

Onur Bakiner (OB): How did you come up with the idea for the play?

Ryan Purcell (RP): An actor I was working with recommended another expressionist play called Machina. I read it and liked it, but didn’t think it was right for us. The Adding Machine is another expressionist play that felt immediately relevant. The main plot is a man who loses his job to automation, so it resonated politically right away. But more than that, I think in the face of this kind of disruption, people tend to respond with dystopian energy. I like that Elmer Rice responds with humor and love and a kind of absurdism. So that was the beginning. I just read it, and I thought this would be great for the company. It’s really human and funny and weird, which is the kind of theater I’m drawn to: theater that asks big questions without necessarily having the answers figured out.

OB: What have the responses been so far?

RP: What’s interesting is, even in announcing it, we got a lot of pushback from people saying you shouldn’t be using AI in theater. I think people who’ve seen it have responded really positively, because we’re looking at it carefully and, in our opinion, ethically and thoughtfully. People who see the nuances of the production and engage with what we’re really trying to do have enjoyed the questions they’ve been pushed to think about for themselves. The reaction from people who haven’t seen it falls into exactly the camps that I’m trying to push against. Some people are saying you shouldn’t do this at all, and other people are saying this is so cool, this is the future. Getting those two groups talking and thinking differently is actually the point. So the response has been all over the place.

OB: What does the GenAI piece add to the play?

RP: A couple of things. On the simplest level, it makes the play about right now. There is language in the play that is literally generated the moment it is spoken. The other thing is that it challenges the way we think about creativity, because with this technology, and with art in general, the question is: how do you generate something truly new? It makes me, the artists, and the audience think about the experience of a moment that is beautiful, funny, moving, without knowing who created it. Does that change the experience of the moment? So there is almost that spiritual quality of creativity, the complicated ownership. You go, “I felt something, the words were real, but I don’t know who created them, so I feel my place being challenged, being decentered.”

And then the other thing I’m excited about is, you know, everyone uses the Internet, and more and more people use AI now, but it’s a very private thing to explore. I’m excited about the fact that when you do it publicly, you take this thing that is designed to respond to one person, and you get to see what that response actually is in a broader way. I’ve been playing with it on my own, and I’ve been fascinated by how it tries to get to know me and figure out what it thinks I want, and how my work with AI has changed my YouTube algorithm. So part of what I’m interested in is taking this thing that we think of as very individual but that also feels kind of objective. You search for something, it gives you a response, and you think that’s the response. But it is actually fracturing objective reality, because it’s doing something different for everyone. So there’s something about interacting with the language model publicly where you see the way it responds for a certain kind of person but not another kind of person. That tension and problem is actually quite fun to play with.

OB: Was it challenging for the cast?

RP: Huge challenge for the cast. How do you build a character whose lines change every night? It’s a really big challenge. I would say 75% of that material changes every night, following the patterns and habits of the machine. It comes down to two things. One is staying flexible enough to adjust when things change. The other, which is always an active challenge, is: how do you stay interested in what it feels like to experience this technology fresh and new? That is what allows for the audience’s experience, getting back to the suspense the actors felt the first time they asked the question about what the future holds. It’s really scary and interesting, and then you kind of get used to it. That’s part of what’s fascinating about this technology, how quickly you can get used to it and sort of forget how profound the power was. So the other acting challenge is to remember that first experience of interacting with it, before they got used to it. They have to live their lives; they can’t just wait on the screens.

OB: How long do the actors get to memorize their lines?

RP: They don’t memorize their lines. A lot of what MJ [Sieber, one of the cast members] does is make sense of what he has in front of him on the computer.

OB: What was the scene with the house party about?

RP: The writing in the original scene is already full of clichés; it’s about how the mechanization of our lives isn’t just a work thing; it’s a personal thing. We become more and more a product of those processes even in our own lives. So that was something we were interested in exploring. There’s a lot of discussion about how algorithms in particular are shaping our political discourse, with people speaking in slogans and echoing disinformation. That scene stages a social event where the content is literally what the Internet thinks is the right answer. It mirrors the original in interesting ways. It’s funny in how generic it is, which is both embarrassing and fascinating. But it’s also funny when it misunderstands and goes in a different direction.

OB: Does it misunderstand?

RP: For example, it misunderstands humor in a variety of ways. Sometimes it takes the politics in a xenophobic direction. There’s that history where you just leave YouTube running, and if it chooses the next video over and over again, it will keep moving toward more and more extreme content, because that works on human beings. You have to pay attention to all of that. GenAI has a similar tendency: it wants to give you what you want and keep you engaged, so it can drift toward more extreme content. This is part of the play: watching people get more and more passionate about what a machine is doing.

OB: What happens to moral responsibility when no human is the author?

RP: One thing that is fascinating is that different companies set different limits. There might be nights when the AI will say, “Sorry, I’m not comfortable with that.” That is clearly not a technological limitation; it is a corporate limitation, obscuring what the technology is really capable of. As theater artists, we take responsibility for what we put on stage. The actors know what they are doing. When a person says something, they create an effect, and they are responsible for that effect.

Something else is that we produce a lot of older plays. When someone says something you cannot say today, do you pretend it’s not happening? Do you explore it? How do you deal with the effect of ideas once they are communicated? You are never fully inside an actor’s head, so there is always miscommunication. You are playing one thing, but the audience does not necessarily perceive it that way. AI can add to that problem, but the problem is already there. Who is responsible for the impact? I expect the consequences will fall on me and the company if people don’t like the play, its politics, and so on.

OB: Have you come closer to an answer to the question of how we experience art when we don’t know who created it?

RP: The closest to an answer I’ve come to is that the value of art is in the effect it has on the people making it and the people experiencing it. For a second I’ll go in the other direction: if you look at a painting of a sunset and then watch a sunset, the painting makes you see the actual sunset differently. That is why it is art. That is what I see in this technology: it points at dynamics of communication in relationships that are already there. Interacting with algorithms is not new. My answer is that when I find something beautiful, I find it beautiful. Who made it does change my perception, but the feeling is still there. I think that who creates something, a moment, shapes how we experience that moment, but our experience is real. It feels like a cause-and-effect chain, and art is a very specific way of shaping the causes and what they lead to. Any technology changes the way we experience the moment. A painting, a movie, or an AI-generated picture of a sunset would each be different from one another. For me, the new dynamic to define is: is this fake? We don’t call a painting of a sunset fake, but when an image is generated from a million different photos, we are unsure what that means. The uncertainty is the exciting thing.

OB: Do you have another technology-infused production in store?

RP: We had a workshop on Oscar Wilde and his libel and sodomy trials. It was the first celebrity media trial, so we have a play called Gross Indecency that will engage with the technology of social media, cable news, TikTok, and so on. It’s about how technology interacts with the news.

OB: We'll stay tuned. Thank you!

RP: Thank you!

Onur Bakiner

October 3, 2024