TECHNICAL AUDIO REEL

Aside from composition, I have explored interactive media and game development as a programmer and implementer for various projects. Some examples include...

A Way Closer (Role: Creative Producer, Lead Programmer) | Unity/FMOD

Over 14 weeks, seven students from a wide range of disciplines across Humber College conceptualized and produced an artistic project driven by a narrative journey. As the team's Creative Producer and Lead Programmer, I helped create A Way Closer, a 10-minute interactive digital art piece that invites audiences to take on the role of an unnamed character trying to escape the commotion of everyday life. Inspired by Nuit Blanche Toronto's curatorial theme, this immersive video game experience reflects on the relationships we can sometimes take for granted, challenging participants to discover what human connections really mean in the wake of unexpected isolation and social upheaval. A Way Closer embraces the passing of time and welcomes audiences to celebrate the bond between art and technology as part of a shared human experience. The project was developed in Unity and used FMOD for audio implementation.

Borf Clorgman (Programmer, Solo Developer) | Unity/FMOD

The Adventures of Borf Clorgman: Space Intern is a 10-20 minute interactive virtual reality experience that follows Borf, an elderly alien lifeform, in their attempt to live out their lifelong dream: to work on the galaxy’s most renowned and prestigious space station. The purpose of my final project was to explore game-audio tools and concepts that are accessible to sound designers and composers who lack coding expertise. This included the integration and implementation of sound design, dynamic composition, and attention to psychoacoustics when developing for interactive media and virtual reality (VR).

Each room includes a unique audio-based puzzle; these situations showcase the importance of audio and its implementation in creating an immersive gaming experience. The experience was created in Unity and used custom scripts built around Google's Resonance Audio plugin.

Dynamic FMOD Demo (Implementation, Composer) | FMOD

This is an example of a dynamic orchestral composition built in FMOD. The composition exposes parameters that are set via code to trigger new sections. Using marker regions, random containers, and transition timelines, it avoids listener fatigue by continually varying its orchestration, melodies, and harmony.
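The parameter-driven logic described above can be sketched outside of FMOD. This Python sketch (all names are illustrative assumptions, not the FMOD API) shows how a single game-driven parameter might select the next marker region, with a random container choosing among variations so repeats differ:

```python
import random

# Illustrative sketch of FMOD-style logic: an "intensity" parameter
# picks a group of marker regions, and a random container chooses
# one variation within that group. Section names are hypothetical.
SECTIONS = {
    0: ["intro"],
    1: ["verse_a", "verse_b"],            # random container: two variations
    2: ["climax_a", "climax_b", "climax_c"],
}

def next_section(intensity: int, rng=random) -> str:
    """Map the current parameter value to the next marker region."""
    candidates = SECTIONS.get(intensity, SECTIONS[0])  # fall back to intro
    return rng.choice(candidates)
```

In FMOD itself this selection happens on the event's timeline; the sketch only mirrors the decision logic that the engine-side parameter calls drive.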

Seen and Heard (Programmer, Performer) | TouchDesigner

“Seen and Heard” is an interactive exhibit structured around the concept of uncovering a moving image through a holistic, improvised piece of music. The installation requires a MIDI keyboard and a screen showing an interactive visual element. Every time a note on the keyboard is pressed, the note's velocity and pitch are used to position an ink splotch that appears on the screen. These splotches slowly unveil an image underneath, in this case a moving image of Van Gogh’s “Starry Night”. The installation is presented as a sort of “compositional game”.
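The note-to-splotch mapping can be sketched as a simple function. The axis assignments and screen dimensions here are assumptions for illustration; the actual TouchDesigner network may scale the data differently:

```python
def splotch_position(pitch: int, velocity: int,
                     width: int = 1920, height: int = 1080) -> tuple:
    """Map a MIDI note (pitch and velocity, both 0-127) to screen
    coordinates: pitch spreads splotches horizontally, velocity
    places louder notes higher on the screen."""
    x = int(pitch / 127 * (width - 1))
    y = int((1 - velocity / 127) * (height - 1))  # y = 0 is the top row
    return x, y
```

Because every key press lands somewhere deterministic, a performer who favors one register or dynamic range will see that bias accumulate as an uneven ink pattern.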

Performers must structure their work around the unveiling of the image, ending only when the entire image is fully revealed. Representing an improvised composition in a visual format forces the performer to use all aspects of the instrument to create a fully realized performance and image, and it visually shows performers where their musical tendencies lie without overtly pointing them out during the performance. The installation was designed as an accessible piece that artists of all levels can enjoy and that can be repurposed in a variety of presentation formats. I could imagine it presented on a projector with a disklavier piano in front of the projection, or, for an outdoor and more family-oriented experience, with the MIDI keyboard replaced by a piano dance mat. Swapping out the image to be revealed would make the experience unique and replayable for each visitor.

Coding Chaos (Artistic Producer, Programmer) | Python/Google Magenta

Coding Chaos was a new-music concert presented by Spectrum Music that featured a jazz trio of GRAMMY Award-winning drummer Larnell Lewis, Canadian trumpeter Bruce Cassidy, and award-winning pianist Chris Pruden performing alongside a musical artificial intelligence.
The AI program was built on top of Google Magenta's "AI JAM" demo, which is written in Python. We created our own neural network by training it on over 1,000 jazz standards, classical pieces, and jazz licks. Since the instrument outputs MIDI values, we fed these back into a DAW (Digital Audio Workstation) to change the instrument's timbre in real time during the concert.
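That MIDI hand-off can be illustrated minimally. The function below builds a raw note-on message per the MIDI 1.0 spec, the kind of data the AI's output becomes before the DAW re-voices it; it is a generic sketch, not the actual Magenta-to-DAW routing:

```python
def note_on_bytes(pitch: int, velocity: int, channel: int = 0) -> bytes:
    """Build a raw MIDI note-on message: status byte 0x90 ORed with
    the channel, followed by 7-bit pitch and velocity data bytes."""
    return bytes([0x90 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])
```

Because the DAW only sees these pitch/velocity values, it is free to assign any synthesized timbre to them live, which is exactly what made the real-time re-voicing possible.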

Blips & Sparkles (Programmer, Performer) | Max/MSP

Blips & Sparkles is a composition using two custom-made Max/MSP instruments, named Blips and Sparkles respectively. Sparkles is a granular FFT texture machine that creates a crystal-like ambience, while Blips is a generative sample sequencer with reverb effects. Additionally, Blips has a melodic "blip" that is tied to a scale, which can be changed with the buttons at the bottom of the patch. The following video is a live performance showing how these two patches can be used together.