With the emergence of ever more intelligent devices and ongoing technological progress, the way we interact with the devices around us will change.
Among existing interaction modalities such as haptics, sound, and vision, the potential of contactless gestures has not yet been exhausted, and the design and application of these gestures are still largely unexplored. So far, interacting through touchless gestures still feels like shouting at a voice interface.
We set out to close this gap and conducted a study on touchless gesture interaction, in which we designed microgestures and potential corresponding use cases for everyday life on the basis of prototypes and usability tests.
We designed microgestures from the ground up: both the motion sequence and the function it triggers. The developed gestures were evaluated with prototypes and user tests. From the insights gained, we derived rules and principles for designing gesture-controlled interfaces and summarized them in a study.
In our study, we observed people's hand movements while they communicated or expressed thoughts non-verbally. We focused on small movements in which users received direct feedback about the status of the interaction. From all these possible interactions, we filtered out movements that are triggered by the thumb, easy to perform, and based on intuitive movement patterns transferred from smartphone usage.
To test and evaluate the developed microgestures, we created various functional prototypes. For their realization, we used the Leap Motion motion-capture camera as a tracking tool for the hand movements. With these functional prototypes, we conducted expert interviews and usability tests.
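As a rough illustration of how such a prototype can recognize a tap gesture, the sketch below checks whether the tracked thumb and index fingertips come close enough to count as a touch. The data layout, threshold, and function names are assumptions for the example, not the Leap Motion API or our actual implementation.

```python
import math

# Assumed contact distance in millimetres; in practice this would be
# tuned per device and per user.
TAP_THRESHOLD_MM = 20.0

def distance(a, b):
    """Euclidean distance between two 3D points (x, y, z)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_tap(thumb_tip, index_tip, threshold=TAP_THRESHOLD_MM):
    """True when thumb and index fingertips are close enough to count as a tap."""
    return distance(thumb_tip, index_tip) < threshold

# Fingertips far apart: no tap.
print(is_tap((0.0, 0.0, 0.0), (60.0, 0.0, 0.0)))  # → False
# Fingertips touching: tap detected.
print(is_tap((0.0, 0.0, 0.0), (5.0, 0.0, 0.0)))   # → True
```

A real tracker delivers these fingertip positions frame by frame, so the check would run continuously and debounce repeated contacts.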
While testing the usability of each prototype and its microgesture, we focused on six previously defined criteria. Based on the gathered data and the insights gained from the interviews, the microgestures were adjusted and improved accordingly. In the next stage, building on the taxonomy of gestures and the knowledge gained from the usability tests, we elaborated principles for designing microgestures that ensure a positive user experience and efficient use.
We eventually applied the revised microgestures to real use cases to see how they work in everyday use. To simulate an everyday situation, we applied gestural control interfaces to a living-room-like setting. Within this setting, we designed our own Apple TV and hacked a lamp, both of which can be controlled using microgestures alone.
For the Apple TV, we used the gestures to navigate through the user interface: the slide gesture to navigate horizontally, the scroll gesture to navigate vertically. With a tap on the left or right side of the index finger, the user moves through the navigation hierarchy.
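The mapping from the three microgestures to the TV navigation can be sketched as a small state model: slide moves a selection horizontally, scroll moves it vertically, and the side taps step up and down the hierarchy. The class and grid dimensions below are illustrative assumptions, not our actual prototype code.

```python
class TvNavigator:
    """Toy model of gesture-driven navigation over a grid of menu items."""

    def __init__(self, columns, rows):
        self.col, self.row, self.depth = 0, 0, 0
        self.columns, self.rows = columns, rows

    def slide(self, direction):
        """Horizontal slide gesture: -1 moves left, +1 moves right (clamped)."""
        self.col = max(0, min(self.columns - 1, self.col + direction))

    def scroll(self, direction):
        """Vertical scroll gesture: -1 moves up, +1 moves down (clamped)."""
        self.row = max(0, min(self.rows - 1, self.row + direction))

    def tap(self, side):
        """Tap on the finger's side: 'right' enters a level, 'left' backs out."""
        self.depth = max(0, self.depth + (1 if side == "right" else -1))

nav = TvNavigator(columns=5, rows=3)
nav.slide(+1)       # move one item to the right
nav.scroll(+1)      # move one row down
nav.tap("right")    # enter the selected item
print(nav.col, nav.row, nav.depth)  # → 1 1 1
```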
The lamp can be turned on and off with a simple tap gesture (touching the index fingertip to the thumb tip). By holding the pinch and moving the hand vertically, the light can be dimmed.
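The dimming behaviour boils down to mapping the vertical displacement of the pinched hand onto a brightness range. The following sketch assumes a linear mapping and an invented sensitivity constant; the real prototype may have used different values and smoothing.

```python
def dim(brightness, start_y, current_y, mm_per_percent=2.0):
    """Return the new lamp brightness (0-100 %) after a vertical pinch move.

    start_y / current_y are hand heights in millimetres above the sensor;
    moving the hand up raises brightness, moving it down lowers it.
    mm_per_percent is an assumed sensitivity: 2 mm of travel per percent.
    """
    delta = (current_y - start_y) / mm_per_percent
    return max(0.0, min(100.0, brightness + delta))

# Raising the hand 60 mm brightens the lamp by 30 %.
print(dim(50.0, start_y=200.0, current_y=260.0))  # → 80.0
# A large downward move clamps at fully off.
print(dim(50.0, start_y=200.0, current_y=0.0))    # → 0.0
```

Clamping the result keeps the lamp state valid even when the hand overshoots the tracked range.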
There is no doubt that this technology has a lot of potential, which will become even more relevant in the future. This will show as soon as technologies like Google's Soli sensor are precise enough and available to the masses. It will certainly change the relationship between humans and computers, since the interaction becomes much more natural than before. The implementation of touchless interactions in everyday life may become the next paradigm shift in interface design.
I am curious to see which applications will emerge in the future and which new sensors and algorithms will lead the field of gesture recognition.
If you have some ideas, I’d love to hear 📭 your thoughts.
This is a student project created during the 2016 semester as part of the Invention Design course.