Live Performances
Real-time interactive scenographies
for live performances
My PhD final project: a real-time complex-system simulation that interacts with the performer during a live show, built with TouchDesigner and compute shaders.
In this practice-based research project, I led the R&D of a generative, evolving scenography. Built with compute shaders, it improvises in real time alongside a dancer.
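To give a flavor of the idea (not the actual show code, which runs as compute shaders inside TouchDesigner), here is a minimal CPU sketch of the per-particle update such a shader would execute in parallel: every particle of the scenography is nudged toward a tracked performer position. The particle count, attraction strength, and damping values are illustrative assumptions.

```python
import numpy as np

def step(positions, velocities, performer_pos, dt=1/60, attract=0.8, damping=0.98):
    """One simulation step: each particle is pulled toward the performer's
    tracked position -- the same per-particle update a compute shader
    would run in parallel on the GPU."""
    to_performer = performer_pos - positions                       # vector toward performer
    dist = np.linalg.norm(to_performer, axis=1, keepdims=True) + 1e-6
    velocities = damping * (velocities + attract * dt * to_performer / dist)
    return positions + dt * velocities, velocities

# Hypothetical setup: 1000 particles in 2D, performer tracked at the origin.
rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, (1000, 2))
vel = np.zeros((1000, 2))
pos, vel = step(pos, vel, np.array([0.0, 0.0]))
```

In the real piece, the "performer position" would come from a tracking feed, and many such forces are layered so the system improvises rather than simply following the dancer.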
I developed a website for collective participation in a performance: users connect remotely, join a guided performance, and interact with autonomous artificial pixels.
In collaboration with the theater company Syncope Collectif, I crafted the digital backdrop for a performance about an ecological disaster in Brazil. My contributions included dynamic animations and poignant illustrations that conveyed the gravity and emotion of the event.
For this project, I designed a wearable electronic costume that responds both to the performer's movements and to stimuli from the surrounding art exhibition. It turns the wearer into a hybrid of visitor and art intervention, blurring the line between observer and exhibit.
I've also ventured into VJing, crafting live generative visuals driven by MIDI and OSC messages and by audio-responsive triggers, bringing a distinctive vibrancy to events and performances.
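As an illustration of the audio-responsive side (a sketch, not the actual VJ patch), the snippet below turns a raw audio signal into discrete triggers: per-frame RMS loudness is compared against a threshold, with a lower release level so one sustained peak fires a single trigger. The frame size and threshold values are illustrative assumptions.

```python
import numpy as np

def onset_triggers(samples, frame_size=512, threshold=0.3, release=0.15):
    """Turn an audio signal into discrete triggers: compute per-frame RMS
    loudness and fire once each time it crosses the threshold; the signal
    must fall below the release level before a new trigger can fire."""
    triggers, armed = [], True
    for i in range(0, len(samples) - frame_size + 1, frame_size):
        rms = float(np.sqrt(np.mean(samples[i:i + frame_size] ** 2)))
        if armed and rms > threshold:
            triggers.append(i)          # sample index where the hit landed
            armed = False
        elif rms < release:
            armed = True                # re-arm once the signal falls back
    return triggers

# Hypothetical input: silence with two loud bursts.
sig = np.zeros(512 * 8)
sig[512 * 2:512 * 3] = 0.9
sig[512 * 5:512 * 6] = 0.9
hits = onset_triggers(sig)              # one trigger per burst
```

In a live setup, each trigger would be forwarded (e.g. as an OSC or MIDI message) to the visual engine to cut, flash, or spawn new geometry on the beat.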