Conception design: LAb[au]
Manuel Abendroth, Jérôme Decock, Alexandre Plennevaux, Els Vermang
The interactive installation '12m4s' consists of a 12-metre-long, freely hanging projection screen, a distance that a 'man walking at ordinary speed' crosses in 4 seconds. In this space-time setting, human movements are tracked to generate graphic elements (particles) and sonic elements (grains).
The real-time tracking is achieved by two different techniques. Firstly, image recognition is used to generate sounds, diffused through 8 speakers placed along the installation. Secondly, ultrasound sensors are used to create echograms of the space, which constitute the background of the projection. The visual and sonic sensing techniques thus produce each other's representations (image recognition yields sound, ultrasound yields image), merging them into one common spatial construct.
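As an illustration only, the echogram background could be imagined as a scrolling buffer in which each ultrasound sensor's distance reading becomes a brightness value. This is a hypothetical sketch, not LAb[au]'s actual software: the sensor count, sensing range, and distance-to-brightness mapping are all assumptions.

```python
# Hypothetical echogram sketch: each frame of ultrasound distance readings
# becomes one row of brightness values in a scrolling history buffer.
from collections import deque

NUM_SENSORS = 8        # assumed number of ultrasound sensors along the screen
MAX_RANGE_M = 4.0      # assumed maximum sensing range in metres
HISTORY = 64           # number of past frames kept in the echogram

echogram = deque(maxlen=HISTORY)  # rows of brightness values, newest last

def sense_frame(distances_m):
    """Map one frame of sensor distances to a row of brightness values.

    Closer objects produce brighter pixels (0.0 = dark, 1.0 = bright)."""
    row = [max(0.0, 1.0 - min(d, MAX_RANGE_M) / MAX_RANGE_M)
           for d in distances_m]
    echogram.append(row)
    return row

# Example: a visitor standing close to sensor 2, empty space elsewhere
row = sense_frame([4.0, 4.0, 0.5, 4.0, 4.0, 4.0, 4.0, 4.0])
```

Scrolling the buffer frame by frame would yield the kind of space-time trace an echogram displays: presence as brightness, absence as darkness.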
The foreground of the projection consists of a graphical vector field. Each of its vectors reacts to people's movements by aligning with their orientation. In this manner some vectors display the current movement of a visitor while the others hold their position. The vector field thus becomes the expression, the sum, of all the flows that have taken place in the space, updated on the spot each time a new movement is tracked. Where motion is detected, graphic and sonic objects emerge and start to negotiate their own way through the vector field. This can lead to situations in which a passer-by is followed by sounds and particles, but as soon as s/he stops or changes direction, turbulences appear in the vector field. The result is a flow of sonic particles shaping themselves according to visitors' movements while inviting them to actively imprint their motion on the screen.
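The behaviour described above can be sketched in a few lines: a tracked movement re-orients nearby vectors, and particles then drift along whatever field they find themselves in. This is a minimal illustration under assumed parameters (grid resolution, influence radius, blending factor), not a reconstruction of the installation's code.

```python
# Minimal vector-field sketch: movements imprint direction onto nearby
# vectors; particles advect along the vector stored in their grid cell.
import math

GRID_W, GRID_H = 24, 4          # assumed field resolution
RADIUS = 2.0                    # assumed influence radius around a movement
BLEND = 0.5                     # how strongly a new movement re-orients a vector

# Every vector starts at rest, pointing along the screen
field = [[(1.0, 0.0) for _ in range(GRID_W)] for _ in range(GRID_H)]

def imprint_motion(x, y, dx, dy):
    """Re-orient vectors near (x, y) toward the movement direction (dx, dy)."""
    norm = math.hypot(dx, dy) or 1.0
    dx, dy = dx / norm, dy / norm
    for gy in range(GRID_H):
        for gx in range(GRID_W):
            if math.hypot(gx - x, gy - y) <= RADIUS:
                vx, vy = field[gy][gx]
                field[gy][gx] = ((1 - BLEND) * vx + BLEND * dx,
                                 (1 - BLEND) * vy + BLEND * dy)

def advect(px, py, step=0.5):
    """Move a particle one step along the vector stored in its grid cell."""
    gx = min(GRID_W - 1, max(0, int(px)))
    gy = min(GRID_H - 1, max(0, int(py)))
    vx, vy = field[gy][gx]
    return px + step * vx, py + step * vy

# A visitor moving upward near x=5 bends the local field;
# a particle released there now drifts upward as well.
imprint_motion(5, 2, 0.0, 1.0)
px, py = advect(5.0, 2.0)
```

Because vectors outside the influence radius keep their previous orientation, the field accumulates the sum of all past movements, which is exactly what lets a stop or change of direction produce local turbulence.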
In order to enhance this play between people's actions and the behaviour of the particles, and the relation between the physical space and the electronic construct, a specific projection screen has been developed by LAb[au] for the installation. The specificity of this semi-transparent mirror screen is that it turns transparent on strongly illuminated surfaces while remaining a mirror in non-illuminated zones. By varying the light intensity in front of or behind the screen, there are moments where the projection becomes visible on the surface of the screen while mixing with reflections of the space.
This leads to the following effects: at first glance, a passer-by will see the graphical 'white' elements emerging from his/her movements, but once s/he remains on a spot, his/her echogram, darkening the projection, causes his/her self-image to appear in the screen, which will have turned reflective again. This effect is based purely on the characteristics of the screen, which fuses projection and reflection while building a common space between the digital space and the body space; a space where technology and architecture merge in the play of light.
The installation is based on software developed by LAb[au] using Basic and C. Motion tracking is achieved via infrared cameras and infrared illuminators, along with ultrasound sensors. The projection screen was also developed by LAb[au].