Adaptive Gesture Recognition Framework for Mobile Applications

gesture.jpg




Overview

With the rapid development of technology, touch input has become the most common input mechanism on mobile devices, and it continues to spread to other mainstream computers as well. This makes it a good time to study in depth how finger gestures work.

Inspiration

We have all played games like Temple Run, Subway Surfers, and Minion Rush. To play them, you provide finger gestures as input: for example, you swipe up, down, right, or left to make your character perform the corresponding action. As the game speeds up, recognizing user input becomes harder, because the user has to react and swipe more quickly. Game developers want the difficulty to grow over time, but it is frustrating to lose because an input was not recognized, especially when the game is intense. With that inspiration in mind, we came up with the idea of studying how exactly finger swipe gestures work in environments such as games and, if possible, finding a way to improve the user experience.

Goals

To study how finger gesture behavior changes with the time given to make the gesture, and to implement an improved gesture recognition framework that can be used in any touch-based application.

Methodology

First of all, we needed data about users' finger gesture movements. To collect this data, we developed a simple application.

The application shows an arrow pointing left, right, up, or down. After seeing the arrow, the user swipes in the corresponding direction, and once the correct input is given the arrow disappears. The direction of the arrow changes with time, the gap between two consecutive arrows decreases, and so the time the user has to react shrinks.


We collected the swipe start position, swipe end position, required direction, and detected direction against time.
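As a rough sketch, one collected sample could be represented as follows; the class and field names are our own illustration, not the exact schema of our app:

// Illustrative record of one collected sample: start/end of the swipe,
// the direction the arrow asked for, the direction the recognizer reported,
// and the time window the user was given.
class SwipeSample {
    enum Direction { UP, DOWN, LEFT, RIGHT }

    final float startX, startY;   // touch-down position (px)
    final float endX, endY;       // touch-up position (px)
    final Direction required;     // direction shown by the arrow
    final Direction detected;     // direction the recognizer reported
    final long allowedTimeMs;     // time window given for this swipe

    SwipeSample(float startX, float startY, float endX, float endY,
                Direction required, Direction detected, long allowedTimeMs) {
        this.startX = startX; this.startY = startY;
        this.endX = endX; this.endY = endY;
        this.required = required; this.detected = detected;
        this.allowedTimeMs = allowedTimeMs;
    }

    // Swipe distance across the screen, used in the distance graphs below.
    double distance() {
        return Math.hypot(endX - startX, endY - startY);
    }

    // Swipe angle measured from the +X axis, used in the angle analysis below.
    double angleDegrees() {
        return Math.toDegrees(Math.atan2(endY - startY, endX - startX));
    }
}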


Results and Data Analysis

To analyze the results, we plotted graphs of various gesture features against time; from those graphs we can draw interpretations of the data.

Swipe distance (random directions) against time

swipe Distance against time.JPG

In our implementation, the time given for a swipe decreases as the session progresses. This graph represents how the swipe distance across the screen changes with this decreasing time interval. Here all swipe directions are considered, and data is collected for random swipe directions. We can see that the swipe distance decreases as the time interval decreases; the trend line drawn for the collected data clearly shows this.
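For reference, such a trend line can be obtained with an ordinary least-squares fit. A minimal sketch, assuming the samples are stored as parallel arrays of elapsed time and swipe distance (class and parameter names are ours, not from our analysis code):

// Ordinary least-squares fit of swipe distance against elapsed time.
class TrendLine {
    // Returns {slope, intercept} for distance ~ slope * time + intercept.
    static double[] fit(double[] timeSec, double[] distancePx) {
        int n = timeSec.length;
        double st = 0, sd = 0, stt = 0, std = 0;
        for (int i = 0; i < n; i++) {
            st  += timeSec[i];
            sd  += distancePx[i];
            stt += timeSec[i] * timeSec[i];
            std += timeSec[i] * distancePx[i];
        }
        double slope = (n * std - st * sd) / (n * stt - st * st);
        double intercept = (sd - slope * st) / n;
        return new double[] { slope, intercept };
    }
}

A negative slope from this fit corresponds to the downward trend visible in the graph.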

Variation of Up Swipe distance against time

swipe_distance_poly.JPG

After measuring the swipe distance for random swipe directions against time, we tried to analyse the individual swipe directions. First we plotted a graph of swipe-up distance against time. The results shown in this graph are consistent with the previous graph for random swipe directions: swipe distance decreases as the time interval given for the swipe decreases.


Then we followed the same procedure for the other individual swipe directions and observed the same result for each of them.

Variation of Left Swipe distance against time

swipedistance_left.JPG



Variation of Down Swipe distance against time

swipe Distance_down.JPG


Variation of Right Swipe distance against time

swiperight_distance.JPG


Variation of Swipe angle against time

Capture.PNG

We measured the swipe angle and plotted a graph of swipe angle against time, but we could not draw any special interpretation from it: the swipe angle showed no noticeable change with time.


Swipe vector scatter plot (Right Hander)

swipe_vector_plot.JPG



We considered the point (0,0) as the start of each swipe, so the dots on the graph represent swipe endpoints. First we analyzed the swipe behavior of a right-hander for all four directions. When we plotted and analyzed the graphs, we saw that the swipe-up gesture bends towards the right and the swipe-down gesture bends slightly towards the left. Likewise, the left-swipe gesture bends downwards and the right-swipe gesture bends upwards.


Swipe vector scatter plot (Left Hander)


Then we followed the same procedure for a left-hander. From that analysis we noticed that the swipe-up gesture of a left-hander bends towards the left and the swipe-down gesture bends towards the right. Likewise, the left-swipe gesture bends upwards and the right-swipe gesture bends downwards.

The results observed for the right-hander were almost exactly opposite to those for the left-hander. We therefore realized that a universal adaptation rule for all users is not suitable: we have to adapt to each particular user, since gesture behavior changes with user characteristics.



Swipe vector variation with time

swipe_vector_variation_with time.JPG




This is the swipe vector variation with time, where the Y and Z axes represent swipe distances and the X axis represents time. The graph shows that as X increases, the time between two swipes decreases, which causes the swipe distance to shrink.

Adaptive swipe recognition scheme

UP = UP * (1 - LearningRate) + Input * LearningRate

This is our adaptive learning approach. The same update applies to all four direction vectors: up, down, left, and right; each vector is updated on its respective input. When the user swipes, we calculate the difference between the input and each stored direction vector, pick the vector with the minimal difference, and categorize the input as that known gesture. The matched vector is then updated with the rule above.
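A minimal sketch of this scheme, assuming swipes arrive as normalized (dx, dy) vectors; the class name, the initial template values, and the learning rate of 0.1 are our illustrations:

import java.util.EnumMap;
import java.util.Map;

// Each direction keeps a template vector. An input swipe is classified as the
// nearest template, and that template is then nudged toward the input with
//   template = template * (1 - learningRate) + input * learningRate
class AdaptiveSwipeRecognizer {
    enum Direction { UP, DOWN, LEFT, RIGHT }

    private final double learningRate = 0.1;  // illustrative value
    private final Map<Direction, double[]> templates = new EnumMap<>(Direction.class);

    AdaptiveSwipeRecognizer() {
        // Start from the ideal axis-aligned vectors; they then adapt per user.
        // Screen coordinates: Y grows downwards, so UP is (0, -1).
        templates.put(Direction.UP,    new double[] { 0, -1 });
        templates.put(Direction.DOWN,  new double[] { 0,  1 });
        templates.put(Direction.LEFT,  new double[] { -1, 0 });
        templates.put(Direction.RIGHT, new double[] { 1,  0 });
    }

    // Classify a normalized swipe vector (dx, dy), then adapt the matched
    // template toward it.
    Direction recognizeAndAdapt(double dx, double dy) {
        Direction best = Direction.UP;
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<Direction, double[]> e : templates.entrySet()) {
            double[] t = e.getValue();
            double dist = Math.hypot(dx - t[0], dy - t[1]);  // difference to template
            if (dist < bestDist) { bestDist = dist; best = e.getKey(); }
        }
        double[] t = templates.get(best);
        t[0] = t[0] * (1 - learningRate) + dx * learningRate;
        t[1] = t[1] * (1 - learningRate) + dy * learningRate;
        return best;
    }
}

For a right-handed player whose up swipes bend slightly to the right, the UP template gradually rotates in that direction, so later swipes with the same bias are still classified as up.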


Adjusted vectors (Right handed)

This is for a right-handed gamer: when he swipes up, there is a slight angle towards his right-hand side.


Adjusted vectors (Left handed)




And this is for a left-hander, where the swipe angle leans towards the left-hand side.


Conclusions

  • Swipe behavior changes from user to user.
  • Different users have different vector values as their gesture inputs; plotting these vectors for different users shows that variety.
  • Adaptive learning approach
We found an adaptive learning approach in which we update the gesture vectors with respect to the current inputs we receive. There is no fixed vector value for a particular input; instead, the vectors adapt to the user so that inputs are identified correctly.

Future Work

  • Touch Pressure and User's Mental Condition


Angry_iPhone_Owner_Wide.jpg



Another research area we could explore is finding a correlation between touch pressure and the user's mental state, with touch pressure being only one criterion for understanding it. Successful research here would allow applications to adapt by understanding the user's mental state. For example, if the pressure applied to the screen is far too high and the framework classifies the user as angry, the application could switch to calmer themes or do anything else that prevents the user from reaching a harmful mental state.
