Gestures

A bit more than two years ago, Granit Luzhniza, Christoffer Öjeling and I built a hobbyist maker glove that tracks physical movement. With this glove we ran an experiment on automated gesture analysis, which produced a paper, won some prizes, and is supposed to become my master's thesis (with some extensions). But the writing blocked this endeavour until now. I will finally write that stuff up, and on this blog I will document the story a bit. I will write less scientifically here, which hopefully helps me express myself better.

So, a basic question in this context is: what are gestures anyway? Intuitively, many would define gestures as an embodied means of communication, consisting mainly of arm movements. In its extreme form this leads to sign languages, which can form complete sentences; in fact, several gloves for automatically translating sign language exist as prototypes. Nonverbal communication can also be seen as a system of gestures. Here gestures involve more than just the arms: the whole body posture and especially facial expressions.

With the rise of smartphones and AR, gestures also took on another meaning, namely as a way to interact with an IT system. Early research in augmented reality distinguished between iconic gestures, like swiping, pointing in space, or describing a shape, and symbolic gestures, which have an unambiguous, fixed meaning and are used to trigger system actions. I guess the latter maps to the idea behind our glove, where we had gestures like a thumbs up that could trigger the same action as pressing an OK button in a GUI. The former found its way into commodity hardware like the smartphone; I guess everyone uses the swipe and pinch gestures there.
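To make the symbolic case a bit more concrete, here is a minimal Python sketch of how recognized gesture labels could be bound to system actions, the way a thumbs up stands in for an OK button. All the names here (GESTURE_ACTIONS, confirm_dialog, and so on) are made up for illustration, and it assumes a classifier that already outputs discrete labels.

```python
# Hypothetical sketch: each symbolic gesture is a discrete label with one
# fixed meaning, bound directly to a system action.

def confirm_dialog():
    print("Confirmed - same effect as clicking the OK button.")

def dismiss_dialog():
    print("Dismissed.")

# One fixed, unambiguous action per recognized gesture label.
GESTURE_ACTIONS = {
    "thumbs_up": confirm_dialog,
    "flat_hand_wave": dismiss_dialog,
}

def handle_gesture(label: str) -> None:
    """Trigger the action bound to a recognized gesture label, if any."""
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()

handle_gesture("thumbs_up")  # prints the confirmation message
```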

An interesting thought is how to combine the automatic detection of gestures with an augmented view of the physical world. Some research on multimodal input stresses that the real meaning of a gesture is also defined through the context around you. Saying "the thing on my left" requires knowing what is on my left. Similarly, calling the elevator by swiping the arm up next to the elevator door requires knowing that you are near the elevator. This is far away from the work in the thesis, which discusses these issues but technically, and through experiments, explores techniques for the gesture detection itself. Still, it is an interesting issue.
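To illustrate what such context dependence could look like in code, here is a small hedged sketch: the same swipe-up gesture only resolves to "call the elevator" when the surroundings say there is an elevator door nearby. The function name and the context keys are invented for this example, and where the context information would actually come from (indoor positioning, cameras, beacons) is left open.

```python
from typing import Optional

# Hypothetical sketch: the meaning of a gesture is resolved together with
# contextual information about the user's surroundings.

def resolve_command(gesture: str, context: dict) -> Optional[str]:
    """Combine a detected gesture label with context to get a concrete command."""
    if gesture == "swipe_up" and context.get("near") == "elevator_door":
        return "call_elevator"
    if gesture == "point_left" and "object_on_left" in context:
        # "the thing on my left" only makes sense if we know what is there
        return f"select:{context['object_on_left']}"
    return None  # the gesture carries no meaning in this context

print(resolve_command("swipe_up", {"near": "elevator_door"}))    # call_elevator
print(resolve_command("point_left", {"object_on_left": "lamp"})) # select:lamp
print(resolve_command("swipe_up", {}))                           # None
```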

If anyone reads this and knows of other classifications of gestures I have not thought of yet, or other usages or viewpoints, I would love to hear your comments.

Category(s): gestures, thesis