March 19, 2024 | Boston, MA
Shakespeare and artificial intelligence are not concepts typically associated with one another — yet, together, they can create something remarkable.
The Boston University College of Fine Arts production of “Dream,” inspired by William Shakespeare’s “A Midsummer Night’s Dream,” ran from Oct. 24-26 with a twist: The production used the generative AI software Random Actor to project immersive images onto the stage in real time.
Random Actor was created by Clay Hopper, a senior lecturer in CFA, and James Grady, creative director of Spark! Product Innovation at BU’s Center for Data Sciences. The software uses motion capture technology to project AI-generated images onto the stage and its actors, further immersing the audience in the show.
Nearly eight years in the making, Random Actor was designed by Hopper and Grady with the hope of revolutionizing interactive media design and changing the way viewers experience live performances.
“One of the things I noticed as a theater artist is that the projections were being used … in a way that were very limited and shackled to a kind of naturalism that I thought didn’t fulfill their full possibility,” Hopper said.
The technology combines computer vision, motion tracking and AI image generation into one system. Built primarily in C++ using openFrameworks, the software evolved from early prototypes that relied on Java-based environments designed for artists and designers, Hopper said.
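Random Actor’s code isn’t public, but the structure Hopper describes follows openFrameworks’ standard pattern: an application class whose setup, update and draw methods the framework calls in a loop. The minimal sketch below shows that skeleton; the class name and window size are illustrative assumptions, not details from the production’s software.

```cpp
// Minimal openFrameworks application skeleton (illustrative, not
// Random Actor's code). The framework calls setup() once, then
// update() and draw() once per frame for the life of the app.
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    void setup() override {
        ofSetFrameRate(60);   // target 60 frames per second
        ofBackground(0);      // black backdrop, as on a dark stage
    }
    void update() override {
        // Per-frame logic: read sensors, advance simulations.
    }
    void draw() override {
        // Per-frame rendering: draw generated imagery to the window.
        ofDrawBitmapString("frame " + ofToString(ofGetFrameNum()), 20, 20);
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);  // open a window and GL context
    ofRunApp(new ofApp());                // hand control to the framework
}
```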
Using the Microsoft Kinect, a motion-sensing device, the program captures color and depth information to track performers’ movement in three-dimensional space, Grady said.
Hopper added that this includes skeletal tracking, which maps up to 16 points on multiple bodies simultaneously without requiring actors to wear sensors.
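The article doesn’t name the software layer that reads the sensor, so the sketch below assumes ofxKinect, a widely used openFrameworks addon for the Kinect’s color and depth streams. Note that ofxKinect exposes raw per-pixel depth; joint-level skeletal tracking of the kind Hopper describes comes from separate middleware, such as OpenNI/NiTE or Microsoft’s Kinect SDK.

```cpp
// Reading Kinect color and depth streams in openFrameworks via the
// ofxKinect addon (the addon choice is an assumption; the article
// doesn't name one).
#include "ofMain.h"
#include "ofxKinect.h"

class TrackerApp : public ofBaseApp {
public:
    ofxKinect kinect;

    void setup() override {
        kinect.setRegistration(true); // align depth pixels to color pixels
        kinect.init();
        kinect.open();                // connect to the first Kinect found
    }

    void update() override {
        kinect.update();
        if (kinect.isFrameNew()) {
            int cx = (int)kinect.getWidth() / 2;
            int cy = (int)kinect.getHeight() / 2;
            // Distance to the center pixel, and the same point lifted
            // into 3D world coordinates: how a performer can be located
            // on stage in three-dimensional space.
            float dist = kinect.getDistanceAt(cx, cy);
            ofVec3f world = kinect.getWorldCoordinateAt(cx, cy);
            ofLogNotice() << "center: " << dist << " mm, world: " << world;
        }
    }

    void draw() override {
        kinect.drawDepth(0, 0);  // grayscale depth image
        kinect.draw(640, 0);     // RGB color image
    }

    void exit() override {
        kinect.close();
    }
};

int main() {
    ofSetupOpenGL(1280, 480, OF_WINDOW);
    ofRunApp(new TrackerApp());
}
```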
Once movement data is captured, Random Actor feeds it into a generative particle system, producing visuals that behave like physical materials, Grady said.
“There’s a physics engine that allows us to do the flames and the liquid [special effects], but it’s a smoother version,” he added.
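Grady’s description maps onto a familiar graphics technique: particles are emitted from tracked body points, advanced each frame by simple forces such as gravity and drag, then faded out. The sketch below is a generic illustration of that idea, not Random Actor’s engine; the mouse cursor stands in for a tracked joint.

```cpp
// Generic physics-driven particle system (illustrative; not Random
// Actor's engine). Particles spawn at a moving emitter (here the
// mouse cursor stands in for a tracked body point) and are advanced
// by gravity and drag so the cloud flows like smoke or flame.
#include "ofMain.h"
#include <algorithm>
#include <vector>

struct Particle {
    glm::vec2 pos;
    glm::vec2 vel;
    float life;  // 1 = newborn, 0 = dead
};

class ParticleApp : public ofBaseApp {
public:
    std::vector<Particle> particles;

    void setup() override {
        ofSetFrameRate(60);
        ofEnableAlphaBlending();  // let particles fade out smoothly
    }

    void update() override {
        glm::vec2 emitter(ofGetMouseX(), ofGetMouseY());

        // Emit a handful of particles per frame with random upward kicks.
        for (int i = 0; i < 5; ++i) {
            Particle p;
            p.pos = emitter;
            p.vel = {ofRandom(-1.0f, 1.0f), ofRandom(-3.0f, -1.0f)};
            p.life = 1.0f;
            particles.push_back(p);
        }

        const glm::vec2 gravity(0.0f, 0.05f);
        const float drag = 0.99f;
        for (auto& p : particles) {
            p.vel = p.vel * drag + gravity;  // simple Euler integration
            p.pos += p.vel;
            p.life -= 0.01f;                 // fade over roughly 100 frames
        }

        // Drop particles whose life has run out.
        particles.erase(
            std::remove_if(particles.begin(), particles.end(),
                           [](const Particle& p) { return p.life <= 0.0f; }),
            particles.end());
    }

    void draw() override {
        ofBackground(0);
        for (const auto& p : particles) {
            ofSetColor(255, 160, 40, 255 * p.life);  // flame-like orange
            ofDrawCircle(p.pos.x, p.pos.y, 2.0f + 4.0f * p.life);
        }
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ParticleApp());
}
```

Coupling the emitter to skeletal joints instead of the cursor, and swapping the circles for fluid or flame rendering, yields the kind of smoothed, material-like visuals Grady describes.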
Some aren’t optimistic about this kind of technology being used in performance spaces.
BU freshman Zara Stahl, who was in the audience Friday night, said she had mixed reactions to the software. She described “Dream” as one of the “most unique versions of Shakespeare’s play” she has ever seen.
“It wasn’t what I expected,” she said. “It definitely fit the play and performance well, which was probably the ultimate surprise on my part. I don’t really think of AI and that sort of technology as mixing well with art or at least performing arts.”
Though the technology helped her interpret the play’s themes in a unique way, Stahl said implementing Random Actor in other productions “would be an interesting challenge.”
“I don’t know if I would call it an evolution. I think it’s a new technique in performing arts,” Stahl said. “More shows could incorporate it. I don’t think all shows will, and I don’t think it’s a necessity.”
Hopper said skepticism is to be expected, and even the actors involved in the production had reservations at first.
“They were a little concerned at first that it wasn’t really about the play or the actor, [but] that it was more about the flashy technology,” Hopper said. “But throughout the rehearsal process, they came … to embrace it in a huge way.”
While some may view artificial intelligence in performance art as a threat to authenticity, Hopper said it deepens viewers’ experience of humanity rather than replacing it.
“These sets of tools … are machines,” Hopper said. “They will never, ever come close to that alchemical thing that happens in a dark room with people telling a story to people who are listening to it.”
While the use of AI in art remains the subject of ongoing debate, Hopper said there is enough room in the art world to encompass AI as well.
“To the naysayers, I say, ‘Get on board with this,’” Hopper said. “When you’re on board with this, you’re on board with the people. You’re not on board with the machine.”