Small step or a giant leap? What AI means for the dance world
“I think AI’s going to change everything,” Tamara Rojo, artistic director of San Francisco Ballet, told me earlier this year. “We just don’t know quite how.” The impact of artificial intelligence on the creative industries can already be seen across film, television and music, but to some extent dance seems insulated, as a form that relies so much on live bodies performing in front of an audience. But this week choreographers Aoi Nakamura and Esteban Lecoq, collectively known as AΦE, are launching what is billed as the world’s first AI-driven dance production, Lilith.Aeon. Lilith, the performer, is an AI entity who has co-created the work with Nakamura and Lecoq. “She” will appear on an LED cube that the audience move around, their motion triggering Lilith’s dance.
Nakamura and Lecoq insist they’re interested not in chasing the latest technology for its own sake but in enhancing their storytelling. Working as dancers with theatre company Punchdrunk turned them on to the idea of immersive experiences, which led to virtual reality (VR), augmented reality (AR) and now AI. Their question is always: “How can we make this tech come alive?” But not in a robots-taking-over-the-world way.
The story in Lilith.Aeon is inspired by transhumanism (using technology to evolve beyond human limitations) and began as a script written by an AI bot. Nakamura and Lecoq fed it all their research – images, audiobooks, discussions – “and we were able to talk and collaborate with the AI, and co-create the piece together”. The duo created steps, “like a dictionary”, that Lilith was trained on, but the AI went on to generate its own new “words”. They were excited when Lilith did something they never would have thought of, but the choreography remains tailored to their aesthetic. “It’s not random,” says Lecoq. “I’m not interested in seeing something that looks like a screensaver.”
You can’t talk about AI in dance without discussing Wayne McGregor. Always on the front foot when it comes to tech, he first started researching AI 20 years ago. With Google, McGregor developed AISOMA, a choreographic tool trained on his 25-year archive of work, analysing thousands of hours of video, which can then come up with real-time suggestions, much like a dancer improvising in the studio. He has used AISOMA to generate new versions of his 2017 piece Autobiography that are different at every performance. His latest project, opening next year, is On the Other Earth, developed with Professor Jeffrey Shaw in Hong Kong, which uses a 360-degree screen with sensing technology for the audience to construct their own experience.
Choreographer Alexander Whitley is also using AI to integrate the audience into the work. In a VR version of The Rite of Spring, he is exploring how audience members’ movements can trigger avatars trained on a database of his own choreography. The tech can make an amateur audience member’s movement more artful, and even place it in time with the music, like a dance version of Auto-Tune.
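As a rough illustration of what that timing correction might involve (a hypothetical sketch only, not Whitley’s actual software; every name and number below is invented), captured movement keyframes could simply be snapped to the nearest beat of the score:

```python
# Hypothetical sketch: quantising captured movement to the beat, in the spirit
# of the "dance version of Auto-Tune" analogy. All names and data are invented.
from dataclasses import dataclass

@dataclass
class Keyframe:
    time: float          # seconds: when the captured pose occurs
    pose: list[float]    # joint angles or positions from motion capture

def beat_times(bpm: float, duration: float) -> list[float]:
    """Return the timestamp of every beat across the given duration."""
    interval = 60.0 / bpm
    return [i * interval for i in range(int(duration / interval) + 1)]

def quantise_to_beats(keyframes: list[Keyframe], beats: list[float]) -> list[Keyframe]:
    """Snap each keyframe to its nearest beat, like pitch-correcting timing."""
    return [
        Keyframe(time=min(beats, key=lambda b: abs(b - kf.time)), pose=kf.pose)
        for kf in keyframes
    ]

# An audience member's gestures land slightly off the beat...
captured = [Keyframe(0.47, [0.1, 0.8]), Keyframe(1.06, [0.3, 0.5])]
aligned = quantise_to_beats(captured, beat_times(bpm=120, duration=4.0))
print([kf.time for kf in aligned])  # -> [0.5, 1.0]
```

In practice a system like this would also smooth and stylise the poses themselves, but the snapping step is what makes an untrained mover appear to dance in time.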
The technology is developing fast. The kind of motion capture that was once the domain of Hollywood studios is now accessible via an app on your phone (try Move.ai), and much of the progress is led by the gaming industry. But it’s worth looking there to see some of the pitfalls, too. Video game performers, including motion capture actors, are striking in the US over concerns about being replaced by AI (much like the actors’ strike of 2023). Dancers are already being recorded by companies building motion banks (“I’ve done about a million projects where I’ve had to motion capture, like, somebody spinning on their head,” says McGregor). And the question of rights and royalties for using dancers’ movement (and expertise) to train AI is a big one. McGregor says that in the past it was common for motion-capture contracts to include a complete buyout. “We didn’t understand what the application of that technology was in the future.” Now he wants to work with Arts Council England on intellectual property (IP), motion data and “ethical AI”. Done well, this could be another income stream for dancers. “Coding choreo creates coin,” says Jonzi D, quoting a line from his hip-hop show Fray, which features an AI-generated dancing avatar.
But what happens when it’s creating coin for someone else? Dance is an ever-morphing art form, passed on via dancefloors, studios and now social media, and it can be difficult to know, or prove, where an idea began. You can copyright a dance work, but not a step, as dancers found when they tried to sue the makers of the Fortnite video game. You can, however, copyright a file, and Nigerian choreographer Qudus Onikeku is researching how AI could recognise and classify movements to build a dance databank and protect IP, especially for Black artists, whose work has so often been misappropriated in the past.
Artists working seriously in AI are partnering with big corporations including Nvidia, Amazon and Dell. They get the tech, and in return the companies get the ideas, the kudos and, importantly, the data. Are they selling their souls, or is it just pragmatism? There’s influence both ways, potentially. “You don’t want to be the technology adopters,” says McGregor. “You want to be in the conversation at the beginning, being generators. You want to be ahead of it, otherwise you’re just servicing the technology.” Commercial funding is often the only way to develop tools, some of which could go on to democratise and demystify dance – Whitley is working on software that could be used in education, allowing students without previous dance knowhow to create their own choreography on screen.
“I think that humans and AI can do some beautiful things together,” says Jonzi D. But he has also noticed that most AI-created content he sees has a particular, samey look. “It’s always going to boil down to how creatively we’re able to use it.” Lecoq agrees that everything will look the same if it’s all trained on the same content; the art will eat itself. “It’s laziness in not pushing the boundaries further,” he says. AI isn’t a shortcut when, like AΦE, you’re creating the tech as you go. “It’s a longcut. It’s a very hard, lonely process.”
Rojo can think of some helpful applications for AI in dance. An algorithm that could solve the headache of recasting a ballet when someone is injured, for example, computing in seconds who is available, who knows the role and so on. Less helpful would be “if composers were replaced, set and lighting designers were replaced, if patterns in choreography were made by artificial intelligence,” she says. “And that is not out of the realms of possibility.”
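At its simplest, that recasting helper is just a filter over the company roster. Here is a minimal hypothetical sketch of the idea (the names, roles and dates are invented, and this is not San Francisco Ballet’s system):

```python
# Hypothetical sketch of the recasting helper Rojo imagines: given an injury,
# list who is fit, free and already knows the role. All data below is invented.
from dataclasses import dataclass

@dataclass
class Dancer:
    name: str
    knows_roles: set[str]
    available: set[str]          # performance slots the dancer is free for
    injured: bool = False

def recast(role: str, slots: set[str], company: list[Dancer]) -> list[Dancer]:
    """Return dancers who are uninjured, know the role and are free for every slot."""
    return [
        d for d in company
        if not d.injured
        and role in d.knows_roles
        and slots <= d.available
    ]

company = [
    Dancer("A", {"Giselle", "Myrtha"}, {"sat_eve", "sun_mat"}),
    Dancer("B", {"Giselle"}, {"sun_mat"}),
    Dancer("C", {"Giselle"}, {"sat_eve", "sun_mat"}, injured=True),
]
print([d.name for d in recast("Giselle", {"sat_eve", "sun_mat"}, company)])  # -> ['A']
```

A real scheduling tool would weigh far more constraints, such as rehearsal time, partnering and union rules, but the core query is this kind of availability-and-repertoire filter.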
The computer’s incursion into creativity is nothing new, though. “Computers are the future of dance,” said the choreographer Merce Cunningham back in 1995, by which time he had already been working with the software LifeForms for six years, manipulating avatars on screen and then transferring the results on to his dancers. The intention was to shed dancers’ natural habits, where one movement instinctively leads to another, and find something new, something choreographers have always tried to do.
So is it a good or a bad thing for the industry? “I try to avoid falling into the kind of binaries of technology as saviour or destroyer,” says Whitley. There will inevitably be a disruptive impact on industries, “but also really exciting possibilities to emerge with it.” Dancers aren’t all about to lose their jobs. “I never worry about the replacement argument,” says McGregor. For him, it’s about using technology to better understand the complexity of the human body. “And we’re so super-far from building a version that in any way replicates the brilliance of the human body. Human virtuosity and ingenuity is the thing that we connect to most of the time.”
Part of watching dance is knowing, intimately, the limitations of the human body, and seeing them being pushed. That’s meaningless if an avatar can do anything. As McGregor puts it: “There’s no jeopardy in the digital world.” But even if choreography uses AI interventions, “when it’s enacted by a living, breathing human it becomes a meaningful, tangible thing,” says Whitley. “Live performance never can be replaced by the digital experience, for sure,” says Nakamura. She’s not interested in seeing a real person replicated on screen (“What’s the point?”) but in Lilith.Aeon she does want to create something that couldn’t exist any other way. Yet despite the fact that AΦE are pioneering the latest in artificial intelligence, they’re not really into tech, insists Lecoq. “The best technology I like is my washing machine and my microwave.”
• Lilith.Aeon is at La Filature, Mulhouse, France, 30-31 October. Alexander Whitley is curating the Digital Body festival, Hackney Wick, London, 15-17 November.