Research department for
Intelligent User Interfaces

Android Gesture Recognition Tool


This tool is a follow-up to my master's thesis TaKG - A Toolkit for Automatic Classification of Gestures, written at Saarland University. It records hand-movement gestures by exploiting the accelerometers of an Android smartphone. A gesture learning tool makes it possible to create gesture training sets. These training sets contain accelerometer recordings of gestures together with a label that tags each gesture with a specific class.

The gesture recognition service recognizes newly performed gestures and delivers the recognition result to Android applications (activities) that subscribe to the service. The result contains the tag of the best matching gesture and the distance between the classified gesture and that best match; the smaller the distance, the greater the similarity. The service supports multiple training sets, which makes it possible to personalize recorded gestures or to define specific gesture training sets for individual applications. For gesture recognition the service uses the dynamic time warping (DTW) algorithm, which has been extended to classify multidimensional signals.
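The core idea of DTW on multidimensional signals can be sketched in a few lines: each accelerometer sample is a 3-axis vector, the local cost is the Euclidean distance between two samples, and the classic DTW recurrence accumulates the cheapest alignment. This is an illustrative sketch, not the toolkit's actual classifier; the class and method names are made up here.

```java
// Sketch of dynamic time warping over 3-axis accelerometer sequences.
// Class and method names are illustrative, not the toolkit's API.
public class MultiDimDtw {

    // Euclidean distance between two multi-axis samples.
    private static double dist(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return Math.sqrt(sum);
    }

    // Classic DTW recurrence: cost[i][j] = dist(i, j) plus the
    // cheapest of the three predecessor cells. Returns the total
    // warping distance; smaller means more similar.
    public static double distance(double[][] s, double[][] t) {
        int n = s.length, m = t.length;
        double[][] cost = new double[n + 1][m + 1];
        for (int i = 0; i <= n; i++)
            for (int j = 0; j <= m; j++)
                cost[i][j] = Double.POSITIVE_INFINITY;
        cost[0][0] = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                double d = dist(s[i - 1], t[j - 1]);
                cost[i][j] = d + Math.min(cost[i - 1][j - 1],
                        Math.min(cost[i - 1][j], cost[i][j - 1]));
            }
        }
        return cost[n][m];
    }
}
```

A recorded gesture is classified by computing this distance against every training example and picking the label of the closest one.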

To use the gesture recognition service and the training application you need the following:

  • An Android smartphone running Android 1.6 or later
  • The gesture trainer application. You will find the app here.
Figure 1 represents the package architecture. It consists of the following components:
Gesture Recorder The gesture recorder component listens to events emitted by the accelerometer sensors and thus continuously monitors the acceleration of the device. A gesture is detected when the absolute value of the acceleration exceeds a specific threshold for a sufficient period of time. This ensures that noise produced by unintended hand movements does not trigger gestures.
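The recorder's threshold rule can be illustrated with a minimal sketch: a gesture only starts once the acceleration magnitude stays above a threshold for a minimum number of consecutive samples, so brief spikes from hand tremor are discarded. The class name and constants below are assumptions for illustration, not the toolkit's actual recorder code.

```java
// Illustrative sketch of threshold-based gesture detection: the
// acceleration magnitude must exceed a threshold for a minimum
// number of consecutive samples before a gesture is reported.
// Names and constants are assumptions, not the toolkit's code.
public class GestureDetector {
    private final double threshold;   // magnitude threshold in m/s^2
    private final int minSamples;     // required consecutive samples
    private int activeCount = 0;

    public GestureDetector(double threshold, int minSamples) {
        this.threshold = threshold;
        this.minSamples = minSamples;
    }

    // Feed one accelerometer sample; returns true once the movement
    // has lasted long enough to count as an intentional gesture.
    public boolean onSample(double x, double y, double z) {
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        if (magnitude > threshold) {
            activeCount++;
        } else {
            activeCount = 0;          // short spikes are discarded
        }
        return activeCount >= minSamples;
    }
}
```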
Gesture Classifier The classifier implements the dynamic time warping algorithm and is responsible for the training and recognition of gestures. Furthermore, it manages the available training sets.
Gesture Recognition Service The Gesture Recognition Service is a service to which any Android application can subscribe. Gesture recognition runs in the background and reacts to new gestures recorded by the Gesture Recorder. Depending on the current mode, which can be set via the service's interface, new gestures are either added to a training set or classified. The result is delivered to the application registered as a listener.
Gesture Training Application The gesture training application uses the Gesture Recognition Service to manage training sets and extend them with new gestures. It uses the service interface to create and delete training sets, record new gestures, and delete existing gestures. Furthermore, it supports two modes. The first is the training mode: while it is active, newly recorded gestures are added to the currently selected training set. The second is the classification mode: here, new gestures are classified with respect to the current training set, and the result (label and distance) is displayed on the screen. The next section describes how to use this application.

Figure 1: The GestureTrainer package architecture

Train Gestures

The Gesture Trainer Application provides an easy way to create gesture training sets, to manage them, and to test the gesture recognition. After starting the application you see the window depicted in Figure 2 on the right.

The content in the red circle shows the active training set. It can be changed by editing the name in the box below and confirming the change with the "Change Training Set" button.

The entry in the "Gesture Name" box shows the label (tag) of the gesture that will be trained. You can start the training process by pressing the "Start Training" button. While training is active, every recorded gesture is added to the active training set, tagged with the label defined in the gesture name box. Press "Stop Training" to stop the training process.

If the system is not in training mode, it classifies every gesture performed with the smartphone. The result of the recognition process is displayed in an info box (note that there will be no result if the training set is empty).

The "Delete Training Set" button deletes the entire active training set.
Figure 2: Gesture Trainer Application
To see all known gestures, select the edit gestures view via the options menu. The menu lists the labels of all trained gestures in the active training set. You can delete single gestures with the "Delete" option in an item's context menu.
Figure 3: Edit Gesture Menu

Using the Recognition Service

The gesture recognition service can be accessed by any Android application.

Preparations, step by step:
  • Create a new Android project
  • Copy the content of the zip file into the src directory. The file contains the interfaces for the gesture recognition service in AIDL format (Android Interface Definition Language). When compiling the project, the Android SDK should automatically generate the files IGestureRecognitionListener and IGestureRecognitionService in the gen directory.
Integrating the Gesture Recognition Service into your Activity

Create a stub for the IGestureRecognitionListener interface. Your application will be registered as a listener of the service. In the stub you can override the event handling methods with your own code and implement your program's reactions:
IBinder gestureListenerStub = new IGestureRecognitionListener.Stub() {

   @Override
   public void onGestureLearned(String gestureName) throws RemoteException {
      System.out.println("Gesture " + gestureName + " learned!");
   }

   @Override
   public void onGestureRecognized(Distribution distribution) throws RemoteException {
      System.out.println(String.format("%s %f", distribution.getBestMatch(), distribution.getBestDistance()));
   }

   @Override
   public void onTrainingSetDeleted(String trainingSet) throws RemoteException {
      System.out.println("Training set " + trainingSet + " deleted!");
   }
};
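The Distribution object passed to onGestureRecognized exposes the best match and its distance. As a way to picture its semantics, here is a minimal pure-Java sketch of such a result type, holding one distance per known label with the best match being the smallest distance. This is an assumption for illustration; the service's actual Distribution class may differ.

```java
// Illustrative sketch of a recognition-result type: one DTW distance
// per known gesture label; the best match is the smallest distance.
// Not the service's actual Distribution class.
import java.util.LinkedHashMap;
import java.util.Map;

public class ResultDistribution {
    private final Map<String, Double> distances = new LinkedHashMap<>();

    public void addEntry(String label, double distance) {
        distances.put(label, distance);
    }

    // Label with the smallest distance, or null if empty.
    public String getBestMatch() {
        String best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (Map.Entry<String, Double> e : distances.entrySet()) {
            if (e.getValue() < bestDist) {
                bestDist = e.getValue();
                best = e.getKey();
            }
        }
        return best;
    }

    // Distance of the best match (infinity if empty).
    public double getBestDistance() {
        double bestDist = Double.POSITIVE_INFINITY;
        for (double d : distances.values()) {
            if (d < bestDist) bestDist = d;
        }
        return bestDist;
    }
}
```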
Create a service connection to the recognition service. When the recognition service is connected, we obtain an interface to it from the passed binder and assign it to the member variable recognitionService. Then we register the listener stub created before with the service:
private IGestureRecognitionService recognitionService;

private final ServiceConnection gestureConnection = new ServiceConnection() {

   @Override
   public void onServiceConnected(ComponentName className, IBinder service) {
      recognitionService = IGestureRecognitionService.Stub.asInterface(service);
      try {
         recognitionService.registerListener(IGestureRecognitionListener.Stub.asInterface(gestureListenerStub));
      } catch (RemoteException e) {
         e.printStackTrace();
      }
   }

   @Override
   public void onServiceDisconnected(ComponentName className) {
      recognitionService = null;
   }
};

The last step is to bind your application to the service at an adequate place, e.g. the onCreate-method:
Intent gestureBindIntent = new Intent("de.dfki.ccaal.gestures.GESTURE_RECOGNIZER");
bindService(gestureBindIntent, gestureConnection, Context.BIND_AUTO_CREATE);


The interface is declared as follows:

interface IGestureRecognitionService {
    void startClassificationMode(String trainingSetName);
    void stopClassificationMode();
    void registerListener(IGestureRecognitionListener listener);
    void unregisterListener(IGestureRecognitionListener listener);
    void startLearnMode(String trainingSetName, String gestureName);
    void stopLearnMode();
    void deleteTrainingSet(String trainingSetName);
    void deleteGesture(String trainingSetName, String gestureName);
    List<String> getGestureList(String trainingSet);
}

startClassificationMode(String trainingSetName) starts the recognition service in recognition mode with the given training set. Newly performed gestures are classified against this training set.
startLearnMode(String trainingSetName, String gestureName) starts the recognition service in training mode. Newly performed gestures are added to the training set named trainingSetName; the label of the newly trained gesture is gestureName.
getGestureList(String trainingSet) returns all known gestures in the given training set.
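The bookkeeping implied by these interface methods (training sets holding labeled gesture recordings) can be sketched in plain Java. The class below is an illustrative in-memory model, not the service's implementation; all names here are assumptions.

```java
// Illustrative in-memory model of the service's training-set
// bookkeeping: a training set maps gesture labels to recorded
// signals. Not the service's actual implementation.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TrainingSetStore {
    // training set name -> (gesture label -> recorded signals)
    private final Map<String, Map<String, List<double[][]>>> sets = new HashMap<>();

    public void addGesture(String trainingSet, String gestureName, double[][] signal) {
        sets.computeIfAbsent(trainingSet, k -> new LinkedHashMap<>())
            .computeIfAbsent(gestureName, k -> new ArrayList<>())
            .add(signal);
    }

    public void deleteGesture(String trainingSet, String gestureName) {
        Map<String, List<double[][]>> set = sets.get(trainingSet);
        if (set != null) set.remove(gestureName);
    }

    public void deleteTrainingSet(String trainingSet) {
        sets.remove(trainingSet);
    }

    // Labels of all known gestures in the given training set.
    public List<String> getGestureList(String trainingSet) {
        Map<String, List<double[][]>> set = sets.get(trainingSet);
        if (set == null) return new ArrayList<>();
        return new ArrayList<>(set.keySet());
    }
}
```

In the real service these operations additionally persist the training sets, so they survive across application restarts.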