Flyfire, a project initiated by the SENSEable City Laboratory in collaboration with the ARES Lab (Aerospace Robotics and Embedded Systems Laboratory) at MIT, aims to transform any ordinary space into a highly immersive and interactive display environment.

In its first implementation, the Flyfire project sets out to explore the capabilities of this display system by using a large number of self-organizing micro helicopters. Each helicopter contains small LEDs and acts as a smart pixel. Through precisely controlled movements, the helicopters perform elaborate, synchronized motions and form an elastic display surface for any desired scenario. With the self-stabilizing and precise control technology from the ARES Lab, the motion of the pixels is adaptable in real time. The Flyfire canvas can transform itself from one shape to another or morph a two-dimensional photographic image into an articulated shape. Because the pixels are physically engaged in transitioning images from one state to another, the Flyfire canvas delivers a spatially animated viewing experience. Flyfire serves as an initial step toward exploring and imagining the possibilities of this free-form display: a swarm of pixels in space.

For more information, please contact: senseable-fly@mit.edu
http://senseable.mit.edu/flyfire/
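The shape-to-shape transformation described above can be pictured as an assignment-and-interpolation problem: each "smart pixel" claims a destination point in the target shape and glides toward it over time. The sketch below is purely illustrative and is not the project's actual control code; the greedy nearest-target assignment and the linear interpolation are assumptions chosen for simplicity.

```python
# Illustrative sketch (NOT Flyfire's actual control system): morphing a swarm
# of pixels between two 2-D target shapes by assigning each pixel a
# destination point and interpolating its position over time.
from math import cos, sin, pi, hypot

def shape_circle(n, r=1.0):
    """n points evenly spaced on a circle of radius r."""
    return [(r * cos(2 * pi * i / n), r * sin(2 * pi * i / n)) for i in range(n)]

def shape_square(n, half=1.0):
    """n points spread along the perimeter of an axis-aligned square."""
    pts = []
    for i in range(n):
        t = 4.0 * i / n                 # position along perimeter in [0, 4)
        side, frac = int(t), t - int(t)
        if side == 0:
            pts.append((-half + 2 * half * frac, -half))
        elif side == 1:
            pts.append((half, -half + 2 * half * frac))
        elif side == 2:
            pts.append((half - 2 * half * frac, half))
        else:
            pts.append((-half, half - 2 * half * frac))
    return pts

def assign_targets(current, targets):
    """Greedy nearest-target assignment: each pixel takes the closest
    unclaimed destination, roughly reducing total travel distance."""
    remaining = list(targets)
    assigned = []
    for x, y in current:
        j = min(range(len(remaining)),
                key=lambda k: hypot(remaining[k][0] - x, remaining[k][1] - y))
        assigned.append(remaining.pop(j))
    return assigned

def morph(current, assigned, t):
    """Pixel positions at interpolation parameter t in [0, 1]."""
    return [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
            for (x0, y0), (x1, y1) in zip(current, assigned)]

if __name__ == "__main__":
    start = shape_circle(16)
    goal = assign_targets(start, shape_square(16))
    halfway = morph(start, goal, 0.5)   # swarm mid-transition
    final = morph(start, goal, 1.0)     # every pixel on its target
```

A real system would replace the linear interpolation with the helicopters' self-stabilizing flight controllers and would need collision avoidance between pixels; this sketch only captures the geometric idea of an elastic display surface reshaping itself.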