
Algerian Hand Gestures [AI Project]


This interactive project uses a camera to detect hand gestures and a speaker to play back corresponding recordings of spoken phrases.

I was inspired to produce a system that can preserve and demonstrate common Algerian and Mediterranean hand gestures, which are a big part of my culture. I remember, when I was young, watching my relatives have entire conversations with neighbours in other buildings using hand gestures alone. They would catch up on local news, share gossip and trade jokes.

This gestural language has been diluted and forgotten over time. The main reason is that it is seen as no longer necessary; most people now have a mobile phone for sending messages and emojis. Another factor is generational: many in my generation are influenced by, or have emigrated to, Western Europe and the US, where hand gestures and expressive communication are less common. Although I am not a traditionalist, I have come to realise that the English language and its mannerisms lack the range and richness of emotional expression found in Mediterranean cultures.


I wanted to develop a way of capturing the emotional vocabulary of these gestures before they are forgotten completely, and to make the case for non-verbal expressions of emotion that cannot be put into words. Ideally this would be an interactive installation where audience members can try out different hand gestures and learn what they mean. Even without understanding the language they hear, they can get a sense of the emotion conveyed in the tone and still enjoy the experience.

I wrote the piece in C++ and used a PlayStation Eye™ camera, recording my own voice for the audio and using my own hand for the gestures. I first produced a data file capturing the three gestures separately, then imported that into the final program, which recognised the visual input and matched it to the corresponding audio files.
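The recognise-then-speak loop can be pictured with a short sketch. Everything below is an assumption on my part rather than the project's actual code: I've used OpenCV for capture and contour matching (the original talks to the PlayStation Eye directly), the `Gesture` struct stands in for the pre-recorded data file, and `afplay` is just one example of a command-line audio player.

```cpp
// Hedged sketch of the recognise-then-speak loop, assuming OpenCV.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <cstdlib>
#include <iostream>
#include <string>
#include <vector>

struct Gesture {
    std::string name;
    std::string audioFile;           // pre-recorded voice clip
    std::vector<cv::Point> contour;  // template outline from the data file
};

// The largest contour in a thresholded frame stands in for the hand outline.
static std::vector<cv::Point> handContour(const cv::Mat& frame) {
    cv::Mat grey, mask;
    cv::cvtColor(frame, grey, cv::COLOR_BGR2GRAY);
    cv::threshold(grey, mask, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return {};
    return *std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });
}

int main() {
    // In practice these three templates would be loaded from the data file.
    std::vector<Gesture> gestures;

    cv::VideoCapture cam(0);
    cv::Mat frame;
    while (cam.read(frame)) {
        auto outline = handContour(frame);
        if (!outline.empty()) {
            // Compare the live outline to each template by Hu-moment shape
            // distance; lower means more similar.
            const Gesture* best = nullptr;
            double bestScore = 0.15;  // acceptance threshold, tuned by hand
            for (const auto& g : gestures) {
                double s = cv::matchShapes(outline, g.contour,
                                           cv::CONTOURS_MATCH_I1, 0);
                if (s < bestScore) { bestScore = s; best = &g; }
            }
            if (best) {
                std::cout << "Recognised: " << best->name << "\n";
                // afplay is macOS-specific; any CLI audio player would do.
                std::system(("afplay " + best->audioFile + " &").c_str());
            }
        }
        cv::imshow("camera", frame);
        if (cv::waitKey(30) == 27) break;  // Esc quits
    }
}
```

Matching whole outlines is crude, but it mirrors the structure of the pipeline: record the gesture templates once, then compare each live frame against them and trigger the paired audio.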

A more accurate reading of hand movement could come from a Kinect camera or a Leap Motion sensor, both of which detect depth, or indeed from sensors that pick up muscle movement. However, the wider purpose of this project is for audiences to be able to interact with it outside of an installation, on personal devices that typically have only a 2D camera. Therefore, to be as accessible as possible, I would put my effort into making movement detection more efficient, perhaps by using more sophisticated machine learning algorithms.
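As one sketch of that direction (again my assumption, not the project's code): hand shapes from a plain 2D camera can be reduced to fixed-length feature vectors, for example points sampled along the outline, and classified with a standard learner such as k-nearest-neighbours from OpenCV's ml module. The random placeholder data below would be replaced by real recorded samples.

```cpp
// Hedged sketch: k-NN over 2D-camera features, using OpenCV's ml module.
#include <opencv2/opencv.hpp>
#include <opencv2/ml.hpp>
#include <cstdio>

int main() {
    // Each row is one training sample, e.g. 16 normalised (x, y) points
    // sampled along the hand outline; labels are gesture IDs 0..2.
    cv::Mat samples(30, 32, CV_32F);
    cv::Mat labels(30, 1, CV_32S);
    cv::randu(samples, 0.0f, 1.0f);              // placeholder training data
    for (int i = 0; i < 30; ++i) labels.at<int>(i) = i % 3;

    auto knn = cv::ml::KNearest::create();
    knn->setDefaultK(3);
    knn->train(samples, cv::ml::ROW_SAMPLE, labels);

    // At runtime, each frame's feature vector gets a predicted gesture ID.
    cv::Mat query(1, 32, CV_32F), result;
    cv::randu(query, 0.0f, 1.0f);                // stands in for a live frame
    knn->findNearest(query, 3, result);
    std::printf("predicted gesture id: %d\n", (int)result.at<float>(0, 0));
}
```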

I would like to develop an open-source, decentralised gestural library that anyone can contribute to and curate. Machine learning algorithms could be employed to match and categorise gestures from different cultures. This library would act as a way to preserve gestural languages that are at risk of becoming extinct and to trace how they have evolved over time and across regions.