The Time of Flight (TOF) project reported in this experimental work links computing and the creative arts, combining live movement performance with computer-controlled sound and image staging. The project is an example of an immersive, interdisciplinary, interactive autokinetic artwork that investigates the dialogic possibilities between forms that communicate with each other and their audience through sound and image. It allows direct performer-to-machine interactivity and control, without any external, hand-held accessories: arm and hand gestures act as inputs for image and sound projection, while head tracking and facial movement sensors enable head and facial movements to operate as further inputs into the performance.

Currently, in most stage performance, lighting and sound systems are controlled by a human operator, or indirectly through pre-specified software routines in computerised lighting and sound desks. In the TOF project, instead of pre-programmed routines or operator-assisted cues, the performer controls both sound and vision according to a movement-based notation enacted within the performance itself. Images of the artist's movements are captured and interpreted in real-time three-dimensional space using time-of-flight imaging technology. A time-of-flight camera measures image data in infrared light, from which it extrapolates three-dimensional data in real time. This data is used to monitor the body's performance, and the three-dimensional imaging data is fed into a processing computer where specifically devised computer vision and artificial intelligence algorithms (Levenberg-Marquardt 1963) recognise and track the artist's movements and gestures in real time, without the use of optical trackers. The tracking data is then fed into automated lighting and sound production software to produce a digital performance entirely under the control of the performer.
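The depth-sensing principle described above can be illustrated with a minimal sketch (not part of the original work): a time-of-flight camera emits infrared light and derives per-pixel depth from the round-trip travel time of the reflected pulse, so depth is simply half the round-trip distance. The function name and example timing below are hypothetical.

```python
C = 299_792_458.0  # speed of light in m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Depth in metres from a measured round-trip time (emit -> reflect -> return).

    The light travels to the subject and back, so the one-way
    distance is half of speed-of-light x elapsed time.
    """
    return C * t_seconds / 2.0

# A reflection returning after ~20 nanoseconds corresponds to roughly 3 m,
# a plausible performer-to-camera distance on stage.
print(round(depth_from_round_trip(20e-9), 3))
```

In practice, ToF sensors infer this timing indirectly (e.g. from the phase shift of a modulated infrared signal) rather than timing individual pulses, but the geometric relationship is the same.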
The design of the performance through a movement-based notation gives an artistic designer control over the final performance. Alternatively, an innovative approach followed in this experimental research allows performers to use the notation to improvise reflexive, original biodigital performance pieces encompassing sound, image and movement, much in the manner of a jazz improvisation.
Title of host publication: 1st Global Proceedings
Publication status: Published - 2010
Event: Global Conference of Performance visual aspect of performance practice - Prague, Czech Republic
Duration: 11 Nov 2010 → 13 Nov 2010
Conference: Global Conference of Performance visual aspect of performance practice
Period: 11/11/10 → 13/11/10