Portfolio of Theatre Engine Research Project

This page contains excerpts from Theatre Engine, a three-year project exploring how media can move the audience into the role of performer.

Theatre Engine: Project Prototype

Created & Directed by Alison Dobbins
Choreography by Colleen Synk
Music Composed by Dr. Micah Levy
Edited by Josh Rickert
Filmed by Steven Van Maele & Taylor Reschka

Prototype video created to recruit collaborators for Theatre Engine. The initial concept for iOS development was discarded in favor of the more open Android platform. Performance workshops illustrated a key challenge of the project: helping the audience navigate the split attention between a performance on stage and a small performance on their screen. Each performance in the Theatre Engine series is treated as a workshop performance with an audience talkback at the conclusion. Performance cycles are planned with time between performance days so that audience suggestions can be implemented and tested.

Dancer Toss

Directed by Alison Dobbins
Software Designed by Dr. Charles Owen
Choreographed by Colleen Synk
Music Arranged by Dr. Carlos Mello
Lighting Designed by Chris Haug
Costumes Designed by Amber Marissa Cook
3D Model Created by Andrew Dennis
3D Animation Created by Jesus Ambriz

Performed at Michigan State University
(April 2013)

The piece begins with one dancer on stage, one on screen, and every audience member controlling a bird projected on the opposite wall. This trains the audience in the use of the mobile application. The first dancer enters the space and interacts with the birds. The birds coalesce onto the center back screen and turn into an additional dancer, who runs off the screen and onto the mobile devices. The audience then tosses this dancer from phone to phone and finally to the stage. The first dancer, surprised to find another live person on stage, races off-stage and onto the mobile devices. She is chased by the second dancer, and the audience can now toss both dancers back and forth.

Live instrumentalists play during the performance. They receive a signal from the server each time the dancer is tossed. The signal comes in the form of a UDP packet that is translated by an Arduino unit. A MIDI marimba accompaniment track is controlled by the musical performers through an Arduino pedal unit. The bassoon corresponds to dancer one, the flute to dancer two. Later in the performance, when both dancers are being tossed, the bassoon and flute play musical phrases to accompany each toss.
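A minimal sketch of what such a UDP-to-MIDI bridge could look like on the Arduino side. The port, the one-byte packet format, and the cue notes here are illustrative assumptions, not the production code:

```cpp
#include <Ethernet.h>
#include <EthernetUdp.h>

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };  // assumed MAC for the shield
const unsigned int LOCAL_PORT = 8888;                 // port the server targets (assumed)
EthernetUDP Udp;

// Write a three-byte MIDI note-on message out the serial MIDI port.
void sendNoteOn(byte channel, byte note, byte velocity) {
  Serial.write(0x90 | (channel & 0x0F));  // note-on status byte
  Serial.write(note);
  Serial.write(velocity);
}

void setup() {
  Ethernet.begin(mac);    // DHCP; a static stage-network IP would also work
  Udp.begin(LOCAL_PORT);
  Serial.begin(31250);    // standard MIDI baud rate
}

void loop() {
  if (Udp.parsePacket() > 0) {
    char buf[8] = {0};
    Udp.read(buf, sizeof(buf) - 1);
    // Assumed one-byte payload: '1' = dancer one tossed, '2' = dancer two.
    if (buf[0] == '1') sendNoteOn(0, 60, 100);  // cue for the bassoon line
    if (buf[0] == '2') sendNoteOn(1, 72, 100);  // cue for the flute line
  }
}
```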

Each time the dancer landed on a phone she performed a different dance move that mimicked the choreography of the live dancers. Choreography of the piece took place from October to December, and 3D modeling and animation began in December. The costume design accommodated the need for a low polygon count in the 3D models: flowing fabric was kept to a minimum, and a close-to-the-head hairstyle was chosen to avoid enlarging the model with hair geometry.
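A small sketch of how each landing could trigger a different move; the clip names and the no-repeat rule are assumptions for illustration:

```cpp
#include <cstdio>
#include <random>
#include <string>
#include <vector>

int main() {
  // Hypothetical animation clips mirroring the live choreography.
  std::vector<std::string> clips = {"spin", "leap", "reach", "turn"};
  std::mt19937 rng(std::random_device{}());
  std::uniform_int_distribution<int> pick(0, (int)clips.size() - 1);
  int last = -1;

  for (int landing = 0; landing < 6; ++landing) {
    int next = pick(rng);
    if (next == last) next = (next + 1) % (int)clips.size();  // never repeat the last move
    last = next;
    std::printf("landing %d: play clip \"%s\"\n", landing, clips[next].c_str());
  }
}
```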

To help the audience both take part in the toss and watch the performance, moving lights spotlit each person while the dancer was on their phone. This kept the audience engaged with the performance rather than looking down at their phones until cued to do so.

Results of Dancer Toss were presented at the MoMM2013 conference in Vienna, Austria, and published as "Integrating the audience into a theatre performance using mobile devices," International Journal of Pervasive Computing and Communications, Vol. 10, Iss. 1, pp. 4-26.

Flashmob

Directed by Alison Dobbins
Software Designed by Dr. Charles Owen
Choreography by Heather Vaughan-Southard
Music Composed by Dr. William Sallak
Set Design by Todd F. Edwards
Lighting Design by Michael Kraczek

Performed at Michigan State University
(April 2014)

Part two of a three-part project exploring mobile applications and performance. A dancer appears on each audience member's phone. Moving the phone makes the dancer on screen and on stage dance. Audience members are invited into the dance space, and their phones now give them poses. The remaining audience members are given control of moving lights, "magic" cubes, and audio devices. Soon everyone is moving in this Flashmob.

The dancers received an audio signal that corresponded to the audience member's movement of the mobile device. Control over the dancers passed from audience member to audience member randomly every sixteen counts.
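A minimal sketch of rotating control on a sixteen-count cycle; the tempo, device list, and console output stand in for the real server logic:

```cpp
#include <chrono>
#include <cstdio>
#include <random>
#include <thread>
#include <vector>

int main() {
  std::vector<int> deviceIds = {101, 102, 103, 104};  // connected phones (assumed)
  const double bpm = 120.0;                           // assumed tempo
  const auto sixteenCounts =
      std::chrono::duration<double>(16.0 * 60.0 / bpm);  // 16 beats in seconds
  std::mt19937 rng(std::random_device{}());
  std::uniform_int_distribution<int> pick(0, (int)deviceIds.size() - 1);

  for (int phrase = 0; phrase < 8; ++phrase) {
    int controller = deviceIds[pick(rng)];  // hand control to a random device
    std::printf("phrase %d: device %d controls the dancer\n", phrase, controller);
    std::this_thread::sleep_for(sixteenCounts);  // wait out the phrase
  }
}
```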

The music was composed using MIDI instruments in Max/MSP. The server sent a signal to the audio computer every time the pose information changed. The word "change" was heard over the audio system. This allowed the audience to focus on interaction with each other and the dancers, and to look down at their mobile devices only when they heard the word "change."
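A hedged sketch of that server-side cue as a plain UDP send; the host, port, and message format are assumptions (the production system may well have used OSC or another protocol to reach Max/MSP):

```cpp
#include <arpa/inet.h>
#include <cstdint>
#include <cstring>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

// Fire a one-shot UDP cue at the audio computer whenever pose data changes.
void sendChangeCue(const char* audioHost, uint16_t port) {
  int sock = socket(AF_INET, SOCK_DGRAM, 0);
  if (sock < 0) return;
  sockaddr_in addr{};
  addr.sin_family = AF_INET;
  addr.sin_port = htons(port);
  inet_pton(AF_INET, audioHost, &addr.sin_addr);
  const char* msg = "change";  // cue token the audio patch listens for (assumed)
  sendto(sock, msg, std::strlen(msg), 0,
         reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
  close(sock);
}

int main() {
  sendChangeCue("192.168.1.50", 9000);  // assumed address of the audio computer
}
```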

The seating consisted of inert benches and twelve magic cubes. The cubes contained various off-line interactions, such as noise-makers, periscope reflectors, and spinning wheels. Audience members who sat on a magic cube were given a command through their mobile device to "interact with your seat."

Five moving lights were used in this performance. The moving light colors corresponded to the dancer costumes during the first phase, when the audience had one-to-one control over the dancers. In this phase, the audience member controlling a particular dancer was spotlit with the moving light of that color. Once the poses section began, the moving lights came under the control of audience members, who could change the color of a light with a finger swipe and move it by moving the phone.
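One plausible mapping (assumptions throughout) from phone input to moving-light control values: pan and tilt from device orientation, color from a swipe counter, all scaled into 8-bit DMX levels:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Scale a normalized reading in [-1, 1] into an 8-bit DMX level.
uint8_t toDmx(float normalized) {
  float v = std::clamp((normalized + 1.0f) * 0.5f, 0.0f, 1.0f);
  return static_cast<uint8_t>(v * 255.0f);
}

int main() {
  float phonePitch = 0.25f;  // assumed orientation readings from the device
  float phoneRoll  = -0.4f;
  int   swipeStep  = 2;      // assumed: each swipe advances one color-wheel slot

  uint8_t pan   = toDmx(phoneRoll);
  uint8_t tilt  = toDmx(phonePitch);
  uint8_t color = static_cast<uint8_t>((swipeStep % 8) * 32);  // 8 color slots

  std::printf("DMX pan=%u tilt=%u color=%u\n", pan, tilt, color);
}
```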

The Digitizer

Directed by Alison Dobbins
Written by Alison Dobbins and Dennis Corsi
Lighting Designer Genesis Garza
Arduino Interaction Designer Nathan Bliton
Max/MSP Program Developer Deon Foster
3D Models by Erin Brandt
3D Animation by Joseph Valeen
iPhone Computer Programmer Kevin Dunlap

Performed at Riverwalk Theatre, Lansing, MI
(June 2011)

The play tracks the misadventures of the mad scientist Dr. Bob and her luckless lackey. E.R.T. (Efficient Recycling Technology Robot) is accidentally digitized and sent into the computer, where she proceeds to wreak havoc. To save Dr. Bob, the audience must attempt to capture ERT. Using a red ball and audio levels, the audience works together to try to capture ERT in a bubble and move her to a specific part of the screen. If they succeed, ERT is de-digitized and the story proceeds along path A; if they don't, the story proceeds along path B. The audience has an application installed on their phones that allows them to control a ninja avatar. During the final battle of the performance (path A: against Dr. Bob and ERT; path B: against unlucky Henry) the audience must get their avatars to work together to win and save the day.
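The branch itself reduces to a simple show-control decision. A hedged sketch, with the capture test simplified to two assumed conditions:

```cpp
#include <cstdio>

enum class StoryPath { A, B };

// Path A: ERT is caught in the bubble and moved to the target screen zone.
// Both conditions are stand-ins for the actual red-ball and audio-level tests.
StoryPath resolveCapture(bool bubbleClosed, bool movedToTarget) {
  return (bubbleClosed && movedToTarget) ? StoryPath::A : StoryPath::B;
}

int main() {
  StoryPath path = resolveCapture(true, false);
  std::printf("story proceeds along path %s\n", path == StoryPath::A ? "A" : "B");
}
```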

Physical interactions and audio cues were created using Arduinos, which connected the digitizer machine to the system and triggered its audio and motor cues.
