Google's latest experiment teaches AI to dance like a human
In his high-ceilinged dance studio, nestled in the Queen Elizabeth Olympic Park in East London, award-winning British choreographer Wayne McGregor has been working hard on his latest partnership: teaching artificial intelligence to dance.
Working with Damien Henry, technical programme manager at Google Arts & Culture in Paris, McGregor has been using an AI-driven tool that can generate its own independent choreography from the hundreds of hours of video footage it has been fed – both from the choreographer’s archives and from the ten dancers in his company, whose individual styles were captured in solos performed for the technology.
McGregor’s interest in science and technology is not new; he holds an honorary doctorate of science from Plymouth University. The new project, he says, came about as he looked back on his 25-year career recorded on video and wondered whether technology could do anything to help keep the performances fresh.
“I wanted to make use of this massive archive of work in an interesting way,” says McGregor. “So I asked Damien if he could use it to generate something new. It all comes down to the same question that is crucial in choreography: how do you keep creating fresh content?”
AI can already do many things, from predicting the next word you will type on your phone to recognising your identity in pictures you upload to social media. But anticipating gestures is tricky. To create the tool, Henry says he took inspiration from a post he saw on the science website distill.pub, which used a neural network to predict the shape of the next letter a person will write based on their preceding handwriting. He came up with a similar algorithm capable of making predictions for a given movement. Based on a dancer’s pose, captured through video, the resulting tool comes up with several options for the most likely choreographic sequence to follow, and displays them on screen in real time.
The technology was fed McGregor’s archives and recorded videos of his dancers. “It learned how to dance, in a way,” says Henry – but in the style of Wayne McGregor. The tool takes video input from a webcam and extracts the “skeleton” of a dancer making a particular pose, plotting points at their joints and connecting them. It runs this input through three different algorithms to guess what the next pose could be – taking into account the individual style of that particular dancer, but also those of the nine other dancers in the company.
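The article does not publish Google’s code, but the pipeline it describes – flatten a skeleton of joint coordinates into a vector, run it through a learned predictor, and roll forward to produce candidate sequences – can be sketched in miniature. Everything below is an assumption for illustration: the joint count, the linear stand-in for a trained neural predictor, and all function names are hypothetical, not the actual tool.

```python
import numpy as np

# A skeleton pose as an (N_JOINTS, 2) array of (x, y) joint coordinates;
# 17 joints mirrors common pose-estimation keypoint sets (an assumption).
N_JOINTS = 17

def predict_next_pose(pose, weights, bias):
    """One-step next-pose prediction with a hypothetical learned linear
    model: next = W @ current + b, operating on the flattened pose."""
    flat = pose.reshape(-1)
    return (weights @ flat + bias).reshape(N_JOINTS, 2)

def propose_candidates(pose, models, horizon=10):
    """Roll each model forward `horizon` steps, producing one candidate
    pose sequence per model - loosely mirroring the multiple options
    the tool displays on screen."""
    candidates = []
    for weights, bias in models:
        sequence, current = [], pose
        for _ in range(horizon):
            current = predict_next_pose(current, weights, bias)
            sequence.append(current)
        candidates.append(sequence)
    return candidates

# Demo with identity "models" (each predicted pose equals its input),
# standing in for trained predictors.
rng = np.random.default_rng(0)
pose = rng.standard_normal((N_JOINTS, 2))
identity = (np.eye(N_JOINTS * 2), np.zeros(N_JOINTS * 2))
options = propose_candidates(pose, [identity] * 3)
print(len(options), len(options[0]))  # 3 candidate sequences, 10 poses each
```

In a real system the linear models would be replaced by trained sequence models, and the input pose would come from a pose-estimation step over webcam frames rather than random numbers.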
This produces a total of 30 potential choreography sequences – three predictions for each of the ten dancers’ styles – which at the moment last about ten seconds, and which are displayed on a computer screen using a similar “skeleton” visual. McGregor chooses which one he would like to use. “Then, whatever the dancer does can be recorded again and re-injected in the machine,” Henry says.
For McGregor, one of the most fascinating aspects of the technology is that it can learn and recreate the particular style of a dancer. Much like motion capture, described by Andy Serkis as the “bottling up” of performance, it can go some way toward capturing their creative identity. “I choose certain dancers because of their physical signatures, because of the particular way they move,” he says. “The genius of Damien’s invention is that it captures something of that essence of performance.”
Algorithms have been used in choreography for decades; as early as 1964, at the University of Pittsburgh, Jeanne Beaman and Paul Le Vasseur used a computer to generate random dance sequences from the different time variations, spatial directions and types of movement they had fed it. At the start of this century, choreographers were using software called The ChoreoGraph, a sequencing tool that let the choreographer set up digital variations on a timeline, which acted as a cue-sheet for dancers during performance. It was capable of updating in real time in response to the dancers’ moves, thanks to sensors placed around the stage.
But AI has limits when it comes to dance and creativity in general. Michael Klien, choreographer and director of the dance programme at Duke University, was an adept user of The ChoreoGraph. His work with the software was presented at the Royal Opera House in London, but he explains that he soon grew tired of the process, and in 2002 abandoned the system in favour of working directly with his dancers.
“We kept including more algorithms in the system to make it more intelligent,” he says. “I wanted to achieve an AI choreographic structure. But I realised it is not the strength of dance to be developed by AI. Our assumptions of intelligence build our AI machines and that spells a tragic limit of our imagination.”
Henry says that the Google tool is not meant to invent moves that have never been seen before. Like all AI, it works through interpolation – that is, it is built to predict what is most likely to happen from previous patterns that it has learned. “The point is not to replace the choreographer,” he says. “But to create options in a very efficient and fast way, so that the creative process never stops. The creativity will come from the use that Wayne will make of these options.”
McGregor, however, is excited to see what happens when the machine is programmed to extrapolate. And it can be: the technology navigates a “map” of different dance moves that it recognises, and recommends specific sequences of moves, but Henry explains that the way it transitions between the moves – the “trajectory”, in a way – could happen outside its training data. This means it has the potential to come up with movements that have never been done before. “I’m always interested in a thing I’ve never seen,” says McGregor.
The technology can also already be set to deliver a choreographic sequence that mixes the style of two dancers. This caused a few giggles when the company tried it out – imagine recognising yourself moving on a screen in someone else's style.
What would happen if, say, it was fed footage of Brazilian samba dancing on top of the McGregor archives it already knows? The result would be a hybrid dance that has probably never been seen before. But McGregor says he is not worried about becoming the Dr Frankenstein of dance. “The choices are still mine. I’m the source, I’m the person at the beginning,” he says. “I see this more as an opportunity – I love being in the game.”
McGregor wants to use the tool in real-time for a future work: with each dancer having their own screen and reproducing on stage the sequences that the algorithm is delivering. Some technical details still need sorting out – should the programme be adapted to ten individual phones? Or displayed on screens? – but McGregor is confident. “We’ll be constructing dance in real time,” he says. “It’s going to be fascinating.”