Title: Ascendance: an experiment in colour music for VR
Abstract: This research project is a collaboration between composers of music and vision for 360° cinematic VR. Colour or Visual Music boasts a strong tradition in the field of abstract animation, including such luminaries as Oskar Fischinger, Jordan Belson, Norman McLaren, Mary Ellen Bute, and John and James Whitney. Often these practitioners worked with pre-recorded music, applying form, colour and movement to an interpretation of sonic scores. Sometimes the audio-visual synergy was the result of genuine collaboration between animator and musician. The uniqueness of the Ascendance project lies in the challenges the VR canvas presented in its seemingly limitless spatial affordances, and in the genuine cross-disciplinary collaboration required to meet these demands effectively. An original musical score was revisited and re-composed in VR space in anticipation of its animated visual expression, and in consideration of the viewer's experience. The musical composer and animator worked together to consider how musical aspects such as pitch, volume and tempo might inform both audio and visual staging, and the visual aspects of colour, form and movement.
Initial consideration of audio-visual correspondences was broad, taking in the proposals of, among others, visual artist and theorist Wassily Kandinsky, psychologist Robert Plutchik and the writer Goethe. Published schemas were then put aside as the three contributors (artist, musician, producer) noted their personal responses to the musical score in terms of colour, shape, size and dynamics. Correlations among the three were identified, and some basic tenets concerning volume-size and pitch-hue correspondences reconfirmed.
CGI artist and animator Louise Harvey set to work on crafting colours, settings and lighting; each draft was reviewed by composer Mark Douglas Williams, Harvey, and creative producer Peter Moyes, working towards complementary and impactful music-animation relations.
A key finding of the research project is the necessity of a flexible workflow that acknowledges the equal status of visual and audio elements (and contributors), embraces an iterative back-and-forth dynamic in testing and refining audio-visual relations, and features a nimble attitude in responding to the constraints and liberties of the VR canvas, discovered only through research as practice.
It was found that the specific dynamics of sound and vision in the VR space necessitate a conciliatory approach. When mapping vision to sound, the narrower field of attention available for staging visuals in the VR space can be at odds with the 360° immersive canvas enjoyed by the sound designer: vision occupies a distinct location in space, while sound can be broad and amorphous. Consequently, for our project, the visual interpretation of the music focused on key audio motifs and 'hero' instruments, and often on a reading of the broad emotional sweep of the music, in order to avoid too many visual elements competing for the viewer's attention across the wide VR space. The music then took its cues from the animation, as the original 3D sound mix was remapped to echo these visual stylings. Ultimately, the objective of engagement was deemed to be emotionally driven cohesion and flow over strict audio-visual correspondence.
The project served as an insightful proof of concept, testing the efficacy of technical, procedural and creative methods and, importantly, ways of working collaboratively across disciplines towards evocative immersive VR experiences in colour and music.
Publication Year: 2021
Publication Date: 2021-01-01
Language: en
Type: article