Generative Visuals for Yawn Performance

In collaboration with Jam in Jubilee, supported by the Abbotsford Arts Council and Raven Brewery, I created an immersive visual experience for Yawn, a dream-pop project by artist and musician Julia McDougall.

This performance merged live music with real-time, audio-reactive generative visuals, creating a unique sensory environment that complemented Yawn's ethereal sound. The visual design aimed to envelop the performer in a sea of dynamic imagery, inspired by natural phenomena and biological simulations. These nature-inspired, generative visuals evolved in real-time, responding to the music's rhythm, pitch, and intensity. This approach created a fluid, ever-changing backdrop that enhanced the dreamlike quality of the performance and echoed Yawn's musical themes of hope, resignation, and the tension between technology and nature.
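The mapping from rhythm, pitch, and intensity to visual parameters can be sketched as a small feature extractor. This is an illustrative Python example, not the actual show code; the function name, parameter names, and the specific features chosen (RMS loudness, dominant frequency, spectral centroid) are assumptions for demonstration.

```python
import numpy as np

def audio_to_visual_params(samples, sample_rate=44100):
    """Map a mono audio buffer to hypothetical visual parameters.

    Returns loudness (RMS), a rough pitch estimate (dominant FFT bin),
    and "brightness" (spectral centroid) -- the kinds of features an
    audio-reactive visual system can respond to.
    """
    # Overall loudness drives visual intensity
    intensity = float(np.sqrt(np.mean(samples ** 2)))

    # Magnitude spectrum for pitch / brightness estimates
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    # Dominant frequency as a crude pitch estimate
    pitch = float(freqs[np.argmax(spectrum)])

    # Spectral centroid: perceptual "brightness" of the sound
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))

    return {"intensity": intensity, "pitch": pitch, "brightness": centroid}
```

Each returned value can then be wired to a visual control, e.g. intensity to particle spawn rate or brightness to colour temperature.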


Technical Setup:

  • Custom-designed stage featuring a screen placed in front of the performer
  • Real-time generative visuals programmed using TouchDesigner and Processing
  • Audio-reactive system responding dynamically to Yawn's live music
  • Visual effects synchronized with musical beats and transitions
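Synchronizing effects with musical beats typically relies on some form of onset detection. As a hedged sketch (the actual show used TouchDesigner's built-in audio analysis; this standalone version, including the function name and threshold values, is illustrative only), a frame can be flagged as a beat when its energy jumps well above the recent average:

```python
import numpy as np

def detect_beats(rms_frames, threshold_ratio=1.5, history=43):
    """Flag frame indices whose energy spikes above the recent average.

    rms_frames: per-frame loudness values (e.g. RMS of short audio windows).
    A frame counts as a beat when its energy exceeds threshold_ratio
    times the mean of the previous `history` frames.
    """
    beats = []
    for i, energy in enumerate(rms_frames):
        window = rms_frames[max(0, i - history):i]
        avg = np.mean(window) if len(window) else 0.0
        if avg > 0 and energy > threshold_ratio * avg:
            beats.append(i)
    return beats
```

In a live setting the detected beat indices would trigger visual transitions (flashes, scene cuts, emission bursts) on the same frame.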

Using a combination of TouchDesigner for real-time graphics processing and Processing for additional generative algorithms, I developed a system that could:

  1. Analyze incoming audio in real-time
  2. Generate and manipulate visuals based on audio input
  3. Simulate natural phenomena like fluid dynamics, particle systems, and organic growth patterns
  4. Blend multiple layers of visuals for a rich, complex visual experience
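One of the simulated layers, an audio-reactive particle system, can be sketched in miniature. This is not the actual TouchDesigner network; the class, its parameters, and the spawn/fade rules are assumptions chosen to illustrate how loudness can drive particle count and speed:

```python
import random

class AudioReactiveParticles:
    """Minimal illustrative particle layer driven by audio intensity.

    Louder audio spawns more particles and makes them drift faster;
    particles fade out over their lifetime, giving an organic motion.
    """

    def __init__(self, max_particles=500):
        self.max_particles = max_particles
        self.particles = []  # each particle: dict with position, velocity, life

    def update(self, intensity, dt=1 / 60):
        # Spawn rate scales with audio intensity (expected range 0..1)
        for _ in range(int(intensity * 10)):
            if len(self.particles) < self.max_particles:
                self.particles.append({
                    "x": random.uniform(-1.0, 1.0),
                    "y": -1.0,
                    "vy": 0.2 + intensity,  # louder music -> faster drift
                    "life": 1.0,
                })

        # Drift particles upward and age them out
        for p in self.particles:
            p["y"] += p["vy"] * dt
            p["life"] -= dt * 0.5
        self.particles = [p for p in self.particles if p["life"] > 0]
```

Blending several such layers (fluid fields, growth patterns, particles), each listening to a different audio feature, produces the layered, evolving backdrop described above.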
Credits:

  • Music: Yawn (Julia McDougall)
  • Visuals: [Your Name]
  • Event Organization: Jam in Jubilee
  • Support: Abbotsford Arts Council, Raven Brewery
  • Documentation: Mitch Films

The event showcased the synergy between live music and responsive, nature-inspired visuals: the audio-reactive system created a seamless connection between sound and image, heightening the immersive quality of the performance. While the documentation by Mitch Films beautifully captured the essence of the show, future collaborations could benefit from closer coordination between visual programming and video editing to fully capture the beat-synchronized effects.

This project exemplifies my ability to create complex, responsive visual systems for live performance, blending technology, nature-inspired algorithms, and art to enhance musical experiences. It also demonstrates proficiency in tools like TouchDesigner and Processing, and the capacity to work collaboratively in a multifaceted creative environment.
