They said, "You have a blue guitar, / You do not play things as they are." --- Wallace Stevens
The "Blue Guitar" is a motion-based spatial sound environment built on the Microsoft Kinect. The "blue guitar," inspired by David Hockney's painting, is a metaphor not only for an instrument that you can play but also for an unreal, distorted acoustic world that you might encounter.
The project intends to create a potential audio environment that users can interact with and that gradually reveals and evolves itself through interaction. The project also attempts to balance the designed narrative of the audio scene against audience-triggered audio elements. The story of each section is told automatically in a pre-designed sequence: the opening, the piano, the sea, and the memory. Meanwhile, users can participate in the performance by triggering the sound samples and changing the amplitude and panning of the sound according to their distance, position, and detected body parts.
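The distance-to-amplitude and position-to-panning mapping described above can be sketched as follows. This is an illustrative reconstruction, not the original code: the class name, coordinate ranges, and linear fall-off are all assumptions.

```java
// Hypothetical sketch of the piece's interaction mapping: a tracked
// user's Kinect depth controls amplitude, and their horizontal
// position controls stereo panning. Ranges are assumed, not measured.
public class SpatialMapping {

    // Amplitude falls off linearly with distance from the sensor.
    // z is the Kinect depth in millimetres; assumed range 0.5 m - 4 m.
    static float amplitudeForDepth(float zMillis) {
        float zNear = 500f, zFar = 4000f;
        float t = (zMillis - zNear) / (zFar - zNear);
        t = Math.max(0f, Math.min(1f, t));
        return 1f - t; // closer listener -> louder sound
    }

    // Pan from -1 (hard left) to +1 (hard right) based on the user's
    // horizontal position x within the sensor's view [xMin, xMax].
    static float panForX(float x, float xMin, float xMax) {
        float t = (x - xMin) / (xMax - xMin);
        t = Math.max(0f, Math.min(1f, t));
        return 2f * t - 1f;
    }

    public static void main(String[] args) {
        System.out.println(amplitudeForDepth(500f));    // nearest: full volume
        System.out.println(panForX(0f, -1000f, 1000f)); // centred: no pan
    }
}
```

In a Processing sketch these values would be fed each frame into the playing sound samples, e.g. via a library's gain and pan controls.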
In contrast to the complexity and direct control of the sound, the visualization is simplified to waves whose motion is controlled by changes in the total volume. Verlet integration is used to give the visualized object its creature-like motion. (The entire body looks like a jellyfish in the deep sea.)
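The Verlet-integration idea behind that motion can be sketched in a few lines. This is a minimal illustration of the general technique, not the project's VerletBall/VerletStick code; the names and parameters here are invented for the example.

```java
// Minimal Verlet-integration particle (illustrative, not the original
// implementation). Each particle stores its current and previous
// position; velocity is implicit in their difference, which yields the
// smooth, damped, creature-like motion described above.
public class VerletParticle {
    double x, y;         // current position
    double prevX, prevY; // position on the previous step

    VerletParticle(double x, double y) {
        this.x = x; this.y = y;
        this.prevX = x; this.prevY = y;
    }

    // One Verlet step:
    // new = current + (current - previous) * damping + accel * dt^2
    void step(double ax, double ay, double damping, double dt) {
        double vx = (x - prevX) * damping;
        double vy = (y - prevY) * damping;
        prevX = x; prevY = y;
        x += vx + ax * dt * dt;
        y += vy + ay * dt * dt;
    }

    public static void main(String[] args) {
        VerletParticle p = new VerletParticle(0, 0);
        for (int i = 0; i < 3; i++) p.step(0, 9.8, 0.99, 1.0); // constant pull
        System.out.println(p.x + " " + p.y); // particle drifts downward
    }
}
```

Sticks (distance constraints between pairs of such particles) are what turn a cloud of particles into a coherent, jellyfish-like body.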
The real-time synthesized sound is built with SuperCollider.
The program is developed with Kinect and Processing. [265 Optical-Computational Processes / George Legrady]
References: the Minim library example, Hilda's demo, and the SuperCollider examples.
The VerletBall and VerletStick classes are based on Ira Greenberg's VerletStick example (https://github.com/irajgreenberg/worksh ... tStick.pde).