My research creations in the design of sound for dance, theatre, dance film, games, and xR (AR, VR and MR) date back to the mid-1990s, when I continued exploring what was then referred to as ‘electronic music’. I have been commissioned for sound design research creations by a wide range of collaborators from the performing arts, digital, corporate, conference and not-for-profit sectors. I have had the honour of being nominated for, and winning, multiple awards for live theatre and dance film. Some sound research creations have also involved xR development, including Augmented Reality (Small Stage, Shakespeare), Virtual Reality (iSpace Lab, VR Theatre) and Mixed Reality (Fun Palace, Small Stage). Each sound creation I undertake involves several types of research, outlined below, each greatly influenced by the medium I design for.
Types of Research Involved in the Creative Process
Investigating the field of previously conducted related productions and the sound design aesthetic that informed those works. While few productions provide access to their soundtracks, there are at times writings on the role of sound or composition. A recent example involved investigating the styles and the rhythmic, melodic and harmonic patterns common to murder mystery genres such as Murder She Wrote, Midsomer Murders, Agatha Christie’s corpus and the Knives Out film series. Approaches to establishing and sustaining tension and suspense informed the design of Shakespeare in the Metaverse, discussed below.
Persistent, iterative research into and creation of the sound palette for a production. Research activities have included sourcing pre-composed music and re-arranging it for new instrumentation, composing original music, creating original sound effects, designing sound and music to underscore spoken word scenes, composing more complex music for transitions or to support dance independently, and developing stylistic references that correlate to the director’s aesthetic and to where and when the production is situated historically. The medium, along with the constraints of the playback systems used in each production, greatly informs the aesthetic.
Design of the system. This includes working with existing sound systems and determining how they can be extended or augmented with supplementary equipment; playback; sound reinforcement, including mic’ing actors; speaker placement in the space; stereo versus spatial arrangements; and so on.
Investigating the sound design process itself through documentation, reflection and dissemination. The collaborative work with a director, and the continual improvement of that communication, is an important part of the toolset of a skilled sound designer. Past collaborations inform new ones. Lessons learned can be used to mentor students and shared with communities of practice, residencies and workshops. Many research creations involving sound for xR (AR, VR, and MR installations) have led to multiple peer-reviewed publications, including lead authorship on a paper for the International Symposium on Electronic Art in Barcelona 2022 based on a typology of Mixed Reality drawn from a production of the Fun Palace, a mixed reality event I produced featuring 11 installations and 5 simultaneous soundtracks. Of note is also a co-authored paper entitled Body Remixer: Extending Bodies to Stimulate Social Connection in an Immersive Installation, published in Leonardo through MIT Press Direct, the leading international peer-reviewed journal on the use of contemporary science and technology in the arts and music.
While it’s difficult to extract tracks from their interconnected scenes and contexts, their place within an overall design of sound can be articulated. Each track is usually a sample from a larger collection of sound files that contribute to one production. At times the number of cues is small, under 40, while I am regularly commissioned to design research creations with upwards of 100 cues.
xR Sound Research Creation
Many sound research creations have been oriented towards public Mixed Reality or Virtual Reality. The most recent, in 2023, was Shakespeare in the Metaverse: sound design for volumetrically captured Shakespearean actors in VR. Once participants enter VR they experience musical, voice-over and SFX sounds in 360 degrees. The sound design process will be detailed in a paper to be submitted to the SIGGRAPH 2024 Art Papers Special Interest Group as a unique contribution to a volumetric capture-to-VR pipeline. That process involved pre-recording actors remotely, then in a studio environment on the same day as the volumetric shoot, since only one actor could be captured at a time. The first track below is an ambient composition for a pre-show experience in a public space.
The second, short track was intended to be loopable, as file size in VR is a constant challenge and optimization ‘builds’ are a necessary part of the entire pipeline. The track was influenced by those typical scenes in murder mysteries where detectives seek clues to who committed a murder. The music accompanied participants in VR who attempted to wake the Porter in Act 2, Scene 3 of the Scottish Play. To wake the Porter, participants had to work together to trigger specific sequences that caused sound effects to play; the Porter needed to wake in order to answer the door and let in Macduff, who is concerned about the well-being of King Duncan. Once they had solved all the sequence puzzles, participants experienced a volumetric scene from the actual play.
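The interaction pattern described above, where completing trigger sequences plays sound effects and solving every puzzle unlocks the volumetric scene, can be sketched in simplified form. This is purely an illustrative outline; all names and trigger steps below are hypothetical and are not drawn from the production’s actual code.

```python
# Illustrative sketch only: trigger-sequence puzzles that play an SFX cue
# on completion, and unlock the next scene once all puzzles are solved.
# Names, steps, and the play_sfx stand-in are hypothetical.

def play_sfx(cue_name):
    # Stand-in for an engine audio call in the real application.
    print(f"playing SFX cue: {cue_name}")

class SequencePuzzle:
    def __init__(self, name, required_steps):
        self.name = name
        self.required_steps = required_steps  # ordered triggers to complete
        self.progress = 0

    def trigger(self, step):
        """Advance the puzzle on the expected step; return True on completion."""
        if self.progress == len(self.required_steps):
            return False  # already solved; ignore further triggers
        if step == self.required_steps[self.progress]:
            self.progress += 1
            if self.progress == len(self.required_steps):
                play_sfx(f"{self.name}_complete")
                return True
        else:
            self.progress = 0  # a wrong step resets the sequence
        return False

puzzles = [SequencePuzzle("knock", ["knock1", "knock2", "knock3"]),
           SequencePuzzle("bell", ["pull", "release"])]

def on_trigger(step):
    # Route each participant trigger to the puzzles; when the last puzzle
    # completes and all are solved, start the volumetric scene.
    completed = any(p.trigger(step) for p in puzzles)
    if completed and all(p.progress == len(p.required_steps) for p in puzzles):
        print("all puzzles solved: start volumetric scene")
```

The design choice worth noting is that each completed sequence gives immediate audio feedback, while the scene change is gated on the collective state, mirroring the cooperative play described above.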
Finally, the trailer for the production also features music that was used in the VR experience.
Ongoing collaborations with iSpace Lab and the work of Dr. Bernhard Riecke have resulted in Mixed Reality and VR experiences in public spaces, in addition to journal publications investigating the phenomenon of human connection.
Live Theatre Sound Research Creation
I have worked on many live productions of Shakespeare over the years with a host of talented directors, actors and design teams. Some of the most notable have been with Bard on the Beach. The next two tracks represent two very different productions of The Merchant of Venice. The first track, created in collaboration with director Rachel Ditor, tended towards a more orchestral and fully composed soundtrack to support the romantic themes of the play. The second is representative of a more rhythmic, less melodic and at times imposing soundtrack to support the vision of Nigel Shawn Williams, who described the production as a “sinister parable for our times”. Other musical tracks written for All’s Well that Ends Well and Richard II can also be accessed here.
My research creation process with the innovative Electric Company Theatre has always been an intense one. The creative team’s dedication to creating prototypical theatrical experiences is unique in the world of theatre. Easily one of the most challenging and rewarding sound design processes I have ever survived, Studies in Motion culminated in the design of over 140 cues that changed and shifted over a period of 5 years across four Canadian and American remounts. The tracks below are excerpts of compositions that accompanied the compelling choreography of Crystal Pite, Member of the Order of Canada. You can also read this post on the value of iterative research design when it comes to the compositional process.
Live Dance Sound Research Creation
My roles with Small Stage dance productions have been numerous, including sound designer, composer, performer, musical director and, currently, researcher-in-residence. Many of my research creations for Small Stage and for other dance companies and independent choreographers have focused on the composition of music to support choreographic vision. This highly iterative process can be much more demanding and subtle than a typical compositional track for theatre. Minute changes to a production are common, with the timing of musical or rhythmic gestures adjusted down to the second. There is, however, more freedom of melodic, harmonic and rhythmic variation in creating for dance, and compositions can often evolve over a longer period of time than in theatre productions.
The composition below underwent a minimum of 45 variations over a two-year period, as the research creation process it accompanied involved six choreographers shaping the music for completely different dance creations. Every layer in the music became malleable, as each rendered track differed in melody, rhythm, tempo, duration, instrumentation and gestural emphasis. The version below was chosen by master dance artist Chengxin Wei.
The second track is an excerpt from a suite of longer compositions that accompanied a 60-minute dance work for the company Dancers Dancing, choreographed by Judith Garay. Both tracks feature the altered voices of vocal artist Dr. Sheinagh Anderson, who has been a persistent ‘instrument’ in many compositions.
Click here to listen to more compositions for dance with Dancers Dancing and to see/listen to award-winning dance film featuring the talents of Alvin Erasga Tolentino.
Digital Dance Sound and xR Research Creation
With Small Stage we have been exploring the intersection of live dance with extended realities such as augmented, virtual and mixed reality for the past 5 years. These research creations have involved me in the capacity of sound designer and emerging technology developer. The results have included an AR application that allowed audience-goers at a live production to ‘take home’ the same dancer and choreography by scanning a QR code in front of the stage. The trailer below describes the process and the resultant AR application, which was developed by graduate students at the Centre for Digital Media’s xR Prototyping Lab.
Installation-based Sound Research Creation
More recently in my history of designing sound for different media, I have co-designed and led the design of sound installations at art galleries. In this capacity I work regularly with interdisciplinary sound artist Dr. Sheinagh Anderson. In a more supportive role, I designed the overall spatial audio for Anderson’s design at the Reach Gallery and mixed over 75 tracks of music, ocean recordings and sonified data into a 3-hour ambient track in collaboration with artist Erika Grimm. The work, entitled Salt Water Skin Boats, also enjoyed an installation in Amsterdam. Tracks are available to listen to on SoundCloud.
In addition, I led the design for two separate sound installations at an international exhibition on Art and Artificial Intelligence at the Vancouver Art Gallery in 2022. The design also experimented with motion-triggered hypersonic speaker technology that was mechanized using 3D-printed recyclable materials. Hypersonic speakers emit harmless ultrasonic tones that we cannot hear, exploiting the properties of air to create new tones that fall within the range of human hearing. As a result, sound is only heard from the first surface that the ultrasonic tones hit, creating a disorienting effect on the listener. The technology was originally devised for a production of The Crucible at Langara College’s Studio 58, where it was focused by actors. In the Vancouver Art Gallery installation, all sounds were manipulated sound effects from well-known sci-fi films featuring artificial intelligence, such as A.I. by Spielberg, The Terminator series by Cameron, and Pixar’s WALL-E.
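The principle behind these speakers is that the nonlinearity of air demodulates ultrasonic carriers into an audible difference tone. As a rough numerical sketch of that idea (the frequencies here are common textbook examples, not the installation’s actual settings):

```python
# Illustrative sketch of the parametric/hypersonic speaker principle:
# two inaudible ultrasonic carriers interact nonlinearly in air,
# producing an audible tone at their difference frequency.
# Example frequencies only; not the installation's actual parameters.

carrier_a = 40_000  # Hz, ultrasonic carrier (above human hearing)
carrier_b = 41_000  # Hz, second ultrasonic carrier (also inaudible)

difference_tone = abs(carrier_b - carrier_a)  # the tone listeners hear

audible_range = (20, 20_000)  # approximate limits of human hearing, in Hz
assert carrier_a > audible_range[1] and carrier_b > audible_range[1]
assert audible_range[0] <= difference_tone <= audible_range[1]
print(f"audible difference tone: {difference_tone} Hz")  # → 1000 Hz
```

Because the carriers stay in a tight, highly directional beam, the audible tone appears to come only from wherever that beam first lands, which is what produces the disorienting effect described above.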