We manipulate ultrasonic wavefronts to create levitated objects that can be seen, heard and felt.
As we move away from traditional human-computer interaction techniques such as keyboards and mice towards touch interfaces (e.g., smartphones) and touchless interfaces (e.g., Kinect), our interactions lose physicality. Both touch and touchless interactions lack a controller or interface element that provides meaningful physical feedback; the same holds for voice-control interfaces.
Interactive Displays made of Levitating Particles
Therefore, we propose a radically different system that brings the physical interface to the user in mid-air. In our vision, the computer controls the existence, form, and appearance of complex levitating objects composed of "levitating particles". Users of this interactive display will be able to reach into the levitating matter, feel it, manipulate it, and hear how they deform it, with all feedback originating from the levitating object's position in mid-air, just as it does with objects in real life. Read more about our team and the technology we are developing to bring our vision to life.
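The wavefront manipulation described above is typically achieved with a phased array of ultrasonic transducers: by offsetting the phase of each transducer's signal so that all waves arrive in phase at a chosen point, the array focuses acoustic energy there, which is the building block for acoustic traps. The sketch below shows the basic phase computation; the array geometry, 40 kHz frequency, and focal point are illustrative assumptions, not this project's actual parameters.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
FREQUENCY = 40_000.0     # Hz; 40 kHz is a common choice for such arrays (assumption)
WAVENUMBER = 2 * math.pi * FREQUENCY / SPEED_OF_SOUND  # radians per metre

def focus_phases(transducers, focal_point):
    """Return the phase offset (radians) for each transducer so that all
    emitted waves arrive in phase at focal_point.

    Each position is an (x, y, z) tuple in metres.
    """
    fx, fy, fz = focal_point
    phases = []
    for (tx, ty, tz) in transducers:
        distance = math.sqrt((fx - tx) ** 2 + (fy - ty) ** 2 + (fz - tz) ** 2)
        # Advance each signal by its travel phase so the phases cancel in transit
        # and all waves peak together at the focal point.
        phases.append((-WAVENUMBER * distance) % (2 * math.pi))
    return phases

# Example: a 4x4 grid of transducers at 10 mm pitch in the z=0 plane,
# focusing 50 mm above the centre of the grid.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
phases = focus_phases(grid, (0.015, 0.015, 0.05))
```

Levitation traps build on this focusing step: for example, adding a fixed phase offset to part of the array produces a pressure null surrounded by high pressure, in which a small particle can be held.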