Connected Objects and Services (OCS) Projects

Here is the code of our project on GitHub:

 https://github.com/SwanJumelle/GesturesRecognizer 

GROUP GR1

MAIN INFORMATION about the Project

MEMBERS

Swan Jumelle-Dupuy and Antoine Lavail

Title of the Project: GestureRecognizer - MusicPlayer

Name of the members of the Group

  • Jumelle-Dupuy, Swan, IHM, jumelled@polytech.unice.fr
  • Lavail, Antoine, IHM, alavail@polytech.unice.fr

Equipment

  • HTC 8s (personal phone), no. 358721050411700

Content of the Project

Our project aims to revamp the use of a music player. We chose to build on the added value of mobile sensors in order to redefine the usual interactions of a technophile who is passionate about music. Focusing on mobile sensors, we concentrate on making good use of their data through a machine learning algorithm: clustering. This algorithm allows us to recognize the user's gestures, after which our software triggers actions on the music player.

To recognize gestures, we planned to gather data from the phone's accelerometer. We therefore use the sensors API (Microsoft.Devices.Sensors) and the XNA Framework, because accelerometer data is delivered as an XNA Framework Vector3 object. Every 100 ms we read the sensor value, and every 2 s we compute the average of these readings. This was the best compromise between recognition quality and user experience.
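
As an illustration of this sampling scheme, here is a minimal sketch; the GestureSampler class and its WindowAveraged event are illustrative names, not the project's actual code, but the Accelerometer API is the one from Microsoft.Devices.Sensors documented below.

  // Minimal sketch (illustrative names): sample the accelerometer every
  // 100 ms and average the readings over a 2 s window (20 samples).
  using System;
  using Microsoft.Devices.Sensors;
  using Microsoft.Xna.Framework;

  public class GestureSampler
  {
      private readonly Accelerometer accelerometer = new Accelerometer();
      private Vector3 sum = Vector3.Zero;
      private int count;

      // Fired roughly every 2 s with the averaged acceleration vector.
      public event Action<Vector3> WindowAveraged;

      public void Start()
      {
          accelerometer.TimeBetweenUpdates = TimeSpan.FromMilliseconds(100);
          accelerometer.CurrentValueChanged += OnReading;
          accelerometer.Start();
      }

      private void OnReading(object sender,
                             SensorReadingEventArgs<AccelerometerReading> e)
      {
          sum += e.SensorReading.Acceleration; // XNA Vector3, in g units
          if (++count == 20)                   // 20 x 100 ms = 2 s window
          {
              var handler = WindowAveraged;
              if (handler != null) handler(sum / 20f);
              sum = Vector3.Zero;
              count = 0;
          }
      }
  }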

From the collected and processed data we build a clustering in an orthonormal frame and compute it with a k-means implementation. We aim to identify 2 clusters: one for the unmoving state and one for a specific move (forward and tilted, in our case). So when the user makes this move, we can recognize the gesture and then trigger the "play/pause music" action.
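
The project delegates this step to AlgLib; the standalone sketch below only illustrates the two alternating k-means steps (assign each point to its nearest centroid, then move each centroid to the mean of its points) on the averaged acceleration vectors.

  // Standalone k-means sketch (k = 2) over the averaged acceleration
  // vectors; the project itself relies on AlgLib for this step.
  // Cluster 0 ~ unmoving state, cluster 1 ~ the recognized move.
  using System.Collections.Generic;
  using System.Linq;
  using Microsoft.Xna.Framework;

  public static class KMeans
  {
      public static Vector3[] Run(IList<Vector3> points, int k, int iterations)
      {
          // Naive initialization: seed the centroids with the first k points.
          Vector3[] centroids = points.Take(k).ToArray();
          int[] assignment = new int[points.Count];

          for (int it = 0; it < iterations; it++)
          {
              // Assignment step: attach each point to its nearest centroid.
              for (int i = 0; i < points.Count; i++)
              {
                  int best = 0;
                  for (int c = 1; c < k; c++)
                      if (Vector3.DistanceSquared(points[i], centroids[c]) <
                          Vector3.DistanceSquared(points[i], centroids[best]))
                          best = c;
                  assignment[i] = best;
              }
              // Update step: move each centroid to the mean of its points.
              for (int c = 0; c < k; c++)
              {
                  Vector3 sum = Vector3.Zero;
                  int n = 0;
                  for (int i = 0; i < points.Count; i++)
                      if (assignment[i] == c) { sum += points[i]; n++; }
                  if (n > 0) centroids[c] = sum / n;
              }
          }
          return centroids;
      }
  }

Each new averaged window is then labeled with its nearest centroid; when it falls in the "move" cluster, the Play/Pause action is triggered.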

The result is shown to the user both audibly and visually: the song is paused or played, and the corresponding buttons are updated.

Just as we did without the clustering algorithm, we want to add 2 more clusters in order to recognize 2 other moves. These moves could be bound to the Previous and Next buttons. But any targeted action is conceivable, for example navigating through a picture gallery.

SOFTWARE PACKAGES of the Project

  • README File

You will need the Windows Phone 8.1 SDK with Visual Studio 2013 Update 2 installed on your computer. Then, import the project into a new Windows Phone 8.1 project. You don't need to import any .dll into the project. Finally, you just need to run the project (on a device, to have access to all features).

  • Project zip file
  • All required software

You don't need to configure anything to make the project work. Here is some documentation about the libraries we used during the development of our project:

  • AlgLib: http://www.alglib.net/
  • How to get data from the accelerometer: http://msdn.microsoft.com/en-us/library/windows/apps/ff431810%28v=vs.105%29.aspx
  • How to use the Media Library: http://msdn.microsoft.com/en-us/library/microsoft.xna.framework.media.medialibrary.aspx

HOW TO USE IT

Here's the main interface:

1) You need to have some music (at least two songs) on your phone to make it work.
2) You can tap the Play/Pause, Previous and Next song icons to launch the expected behavior in your music experience.
3) When the Shake button is Off, you can use the accelerometer to trigger actions (a code sketch follows step 5 below):

  • An acceleration on the Z axis triggers the Play/Pause behavior
  • An acceleration on the X axis triggers the NextSong behavior
  • An acceleration on the Y axis triggers the PreviousSong behavior.

4) Turn the Shake button to On.
5) The device now runs our clustering on the accelerometer data. If you shake your phone along the Y axis, the k-means algorithm will recognize that the device is shaking and will trigger the Play/Pause behavior.
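
As a rough illustration of the Shake = Off mode, here is a minimal sketch; the AxisActions class and the 1.5 g threshold are assumptions made for this example, while the MediaPlayer calls come from the XNA Media Library linked above.

  // Sketch of the axis-based triggering in Shake = Off mode. The 1.5 g
  // threshold and the class/method names are assumptions for illustration;
  // the MediaPlayer calls are from the XNA Media Library.
  using System;
  using Microsoft.Xna.Framework;
  using Microsoft.Xna.Framework.Media;

  public static class AxisActions
  {
      private const float Threshold = 1.5f; // in g; tune per device

      public static void Handle(Vector3 a)
      {
          if (Math.Abs(a.Z) > Threshold)      // Z axis -> Play/Pause
          {
              if (MediaPlayer.State == MediaState.Playing) MediaPlayer.Pause();
              else MediaPlayer.Resume();
          }
          else if (Math.Abs(a.X) > Threshold) // X axis -> next song
          {
              MediaPlayer.MoveNext();
          }
          else if (Math.Abs(a.Y) > Threshold) // Y axis -> previous song
          {
              MediaPlayer.MovePrevious();
          }
      }
  }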

RESULTS

The accuracy of the accelerometer values is pretty good, whether the Shake button is On or Off (we're using the same sensors anyway). When the Shake mode is Off, our calculations to detect the type of gesture made by the user work pretty well, but the user has to perform the gestures with some amplitude. When the Shake mode is On and we shake the device along the Y axis, our clustering determines that the device is moving, and our application then plays or pauses the current song. We could improve our clustering by feeding more data to our k-means algorithm. Another perspective is to refine our clustering to detect more gestures.
