I create new immersive experiences for audiences. To do this, I work with complex software to build custom programs, audio systems, and interfaces, often pairing emergent technology with old, forgotten relics.
(2021) 32 Acres:
32 Acres is an app developed for the Los Angeles State Historic Park. It is a sound walk that tracks the user's location to trigger audio playback, and the audience is encouraged to wear headphones. It was presented by Center Theatre Group; the experience was created and written by Marike Splint, the music and sound were composed by Jonathon Snipes, and the app was coded and developed by me. It is available on Google Play and the App Store.
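The core mechanic of a sound walk like this — playing a clip when the listener enters a geofenced zone — can be sketched in a few lines. The coordinates, radius, and clip name below are invented for illustration; the shipped app was built natively for iOS and Android, not in Python:

```python
import math

# Hypothetical audio zone: a center point and a trigger radius in meters.
ZONES = {
    "entrance": {"lat": 34.0669, "lon": -118.2263, "radius_m": 30.0,
                 "clip": "entrance.mp3"},
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def clips_to_play(lat, lon):
    """Return the clips whose zones contain the listener's current position."""
    return [z["clip"] for z in ZONES.values()
            if haversine_m(lat, lon, z["lat"], z["lon"]) <= z["radius_m"]]
```

Each GPS update from the phone would be fed through `clips_to_play`, starting any clip whose zone the walker has just entered.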
A multimedia experience dedicated to the exploration of time. Using a combination of time-based video effects, we invited the audience into an interactive art-making experience. By recording their gestures in real time, we edited and distorted a reproduction of the audience's image across multiple surfaces. I created it with a frequent collaborator, Harry Foster. It used live camera feeds, generative music, lighting, custom interactivity programming, and multiple projectors.
(2020) Reflected Voices:
Reflected Voices is a virtual sound-art installation. It is meant as an amplification of four interviews with people important to me, each exploring a singular subject: America. Each of these unique voices is housed in a separate structure placed around the venue, and every room has a different story and person attached to it. In the center you hear my personal community speaking all at once, but when you walk into one of the structures you hear a single interview. The effect is that of zooming in on one voice among a community of voices.
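The "zoom in on one voice" effect can be approximated with distance-based gain mixing: each voice grows louder as the listener approaches its structure, and all four blend evenly at the center. The positions and rolloff value below are hypothetical stand-ins, not the installation's actual spatialization engine:

```python
# Hypothetical 2D positions (meters) for the four voice structures.
VOICES = {
    "voice_a": (0.0, 10.0),
    "voice_b": (10.0, 0.0),
    "voice_c": (0.0, -10.0),
    "voice_d": (-10.0, 0.0),
}

def mix_gains(x, y, rolloff=1.0):
    """Per-voice gains that fall off with distance, normalized to sum to 1.
    At the center all voices blend equally; near one structure, that
    structure's voice dominates the mix."""
    raw = {}
    for name, (vx, vy) in VOICES.items():
        d = ((x - vx) ** 2 + (y - vy) ** 2) ** 0.5
        raw[name] = 1.0 / (1.0 + rolloff * d)
    total = sum(raw.values())
    return {name: g / total for name, g in raw.items()}
```

Normalizing the gains keeps the overall loudness steady as the listener moves, so walking toward one voice pushes the others into the background rather than simply making everything louder.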
I had the pleasure of designing this show at the Cleveland Playhouse. The script is split into two halves: one half is in typical script format, and the other runs a constant monologue of commercials that underscores the entire show. For this production, we decided to perform these monologues live. To do that, I created a custom Ableton-based vocal-effect setup that let the performers add user-controlled effects onto their own voices. This process gave the performers more of a sense of ownership over their commercials, and as a result the commercials felt far more organic than if the underscoring had been played through QLab with the A1 controlling the vocal effects at the board.
(2018) Slow No Wake:
In early 2018, I was a contributing artist in Bricolage, an event put on by the art company Maelstrom Collaborative Arts in Cleveland, Ohio. The contributing artists were paired at random, and I was matched with the analog projection artist Nathan Melarangro. We decided to improvise live in front of our audience each night in a piece we entitled Slow No Wake. To do this, I created an Ableton-based live composition tool that allowed me to create sound design and music on the fly to match his improvised visuals.
(2017) Akron Haunted School & Lab:
This behemoth of a project happened in the summer of 2017. Eighty-two individual point-source speakers were hung across the seven floors of two haunted houses. As you walk through, the content shifts from room to room, sculpting unique environments and coloring each space.
Audio Reactive Programming:
These are examples of audio-reactive programming I created using Jitter in Max/MSP. With OpenGL, one can build real-time interactive audiovisual experiences; in these two examples, I wrote both the code and the music.
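The basic pattern behind audio-reactive visuals — analyze a buffer of audio, then map its loudness onto a drawing parameter — is the same in any environment. Here is a minimal sketch in Python rather than Max/MSP, with invented parameter names; in the actual patches this mapping is done with Jitter objects:

```python
import math

def rms(samples):
    """Root-mean-square amplitude of an audio buffer (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples)) if samples else 0.0

def amplitude_to_scale(samples, base=1.0, depth=2.0, smoothing=0.8, prev=0.0):
    """Map buffer loudness to a geometry scale factor, with one-pole
    smoothing so the visual swells rather than jittering frame to frame.
    Returns (scale, smoothed_level); feed smoothed_level back in as prev."""
    level = smoothing * prev + (1.0 - smoothing) * rms(samples)
    return base + depth * level, level
```

Each render frame grabs the latest audio buffer, updates the smoothed level, and scales (or rotates, colors, etc.) the OpenGL geometry by the result.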
Computer Vision:
This video uses a depth-sensing camera and custom code in Max/MSP to create 3D visuals of a space in real time.
I use creative show-control strategies to achieve show-specific effects. Obscure software packages can often create original design moments. I have controlled audio and visuals with cellphones, microphones, and body movements. Some free software can be found under the Tools and Research tab.
This is preliminary vocal-effect software written in the Max/MSP environment for a show in the UCSD Wagner New Play Festival that was canceled due to COVID-19. It makes use of the Spat~ suite created by IRCAM as well as an array of resonant filters.