This performance merged live music with real-time, audio-reactive generative visuals, creating a sensory environment that complemented Yawn's ethereal sound. The visual design aimed to envelop the performer in a sea of dynamic imagery inspired by natural phenomena and biological simulations. These generative visuals evolved in real time, responding to the music's rhythm, pitch, and intensity. The result was a fluid, ever-changing backdrop that enhanced the dreamlike quality of the performance and echoed Yawn's musical themes of hope, resignation, and the tension between technology and nature.
Technical Setup:
Using a combination of TouchDesigner for real-time graphics processing and Processing for additional generative algorithms, I developed a system that could:

- Analyze the live audio feed for rhythm, pitch, and intensity
- Map those audio features to parameters of nature-inspired generative simulations
- Render the resulting imagery in real time as an evolving backdrop around the performer
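To make the Processing side of this concrete, the sketch below is a minimal, hypothetical illustration of that kind of feature-to-visual mapping; it is not the production code. The specific choices here are assumptions for demonstration: overall amplitude drives trail length and particle speed, a spectral centroid stands in as a rough pitch proxy for hue, and a Perlin-noise flow field stands in for the nature-inspired motion.

```processing
// Hypothetical sketch: audio-reactive particle flow field.
// Assumes the Processing Sound library (Sketch > Import Library > Sound).
import processing.sound.*;

AudioIn in;
FFT fft;
Amplitude amp;
int bands = 64;
float[] spectrum = new float[bands];
PVector[] particles = new PVector[400];

void setup() {
  size(800, 600);
  in = new AudioIn(this, 0);   // default audio input device
  in.start();
  fft = new FFT(this, bands);
  fft.input(in);
  amp = new Amplitude(this);
  amp.input(in);
  for (int i = 0; i < particles.length; i++) {
    particles[i] = new PVector(random(width), random(height));
  }
  noStroke();
}

void draw() {
  // Intensity: louder passages fade the canvas faster, shortening trails.
  float level = amp.analyze();
  fill(0, map(level, 0f, 0.5f, 5, 60));
  rect(0, 0, width, height);

  fft.analyze(spectrum);
  // Crude pitch proxy: spectral centroid of the FFT bands, mapped to hue.
  float centroid = 0, total = 0;
  for (int i = 0; i < bands; i++) {
    centroid += i * spectrum[i];
    total += spectrum[i];
  }
  centroid = total > 0 ? centroid / total : 0;

  colorMode(HSB, 360, 100, 100);
  fill(map(centroid, 0, bands, 120, 300), 80, 100);

  for (PVector p : particles) {
    // Perlin-noise flow field gives the organic motion; audio level warps speed.
    float angle = noise(p.x * 0.005f, p.y * 0.005f, frameCount * 0.01f) * TWO_PI * 2;
    p.x += cos(angle) * (1 + level * 20);
    p.y += sin(angle) * (1 + level * 20);
    if (p.x < 0) p.x = width;  if (p.x > width)  p.x = 0;
    if (p.y < 0) p.y = height; if (p.y > height) p.y = 0;
    ellipse(p.x, p.y, 3, 3);
  }
  colorMode(RGB, 255);
}
```

In the actual show, output like this fed into TouchDesigner, which handled the real-time compositing and final rendering around the performer.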
The event successfully showcased the powerful synergy between live music and responsive, nature-inspired visuals. The audio-reactive system created a seamless connection between sound and image, enhancing the immersive quality of the performance. While the documentation by Mitch Films beautifully captured the essence of the show, I noted that future collaborations could benefit from closer coordination between visual programming and video editing to fully capture the beat-synchronized effects.

This project exemplifies my ability to create complex, responsive visual systems for live performances, blending technology, nature-inspired algorithms, and art to enhance musical experiences. It also demonstrates my proficiency in advanced tools like TouchDesigner and Processing, as well as my capacity to work collaboratively in a multifaceted creative environment.