====== Projets Objets Connectés et Services (OCS) ======


Here is the code of our project on GitHub: <code> https://github.com/SwanJumelle/GesturesRecognizer </code>


==== GROUP GR1 ====
  
//In case of X=1, your Web page is// <code> [[http://www.tigli.fr/doku.php?id=projets:plim:20142015:gr1|Web of the Project PLIM GR1]] </code>
  
// You will find instructions at the bottom of the page for filling in your site //
  
==== MAIN INFORMATION about the Project ====
  
=== MEMBERS ===

[Swan Jumelle-Dupuy] and [Antoine Lavail]


=== Title of the Project: GestureRecognizer - MusicPlayer ===
  
=== Names of the members of the Group ===
  
    * Last name: Jumelle-Dupuy
    * First name: Swan
    * Option: IHM
    * Email: jumelled@polytech.unice.fr

    * Last name: Lavail
    * First name: Antoine
    * Option: IHM
    * Email: alavail@polytech.unice.fr
  
=== Equipment ===
    
    * Phone make and model: HTC 8S
    * Personal phone: No
    * IMEI: 358721050411700
  
=== Content of the Project ===
  
Our project aims to revamp the way a music player is used.
We have chosen to exploit the added value of mobile sensors in order to redefine the usual interactions of a music-loving technophile.
Focusing on these sensors, we make good use of their data with a machine learning classification technique, clustering. This algorithm allows us to recognize the user's gestures, which our software then maps to actions on the music player.

To recognize gestures, we gather data from the phone's accelerometer. We use the sensors API (Microsoft.Devices.Sensors) together with the XNA Framework, because accelerometer data is delivered as an XNA Framework Vector3 object.
Every 100 ms we read the sensor value, and every 2 s we compute the average of these readings. This was the best compromise between recognition quality and user experience.

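As an illustration of this sampling scheme, here is a minimal sketch (not the project's actual code; the class and member names are hypothetical) of how readings from Microsoft.Devices.Sensors can be taken every 100 ms and averaged over 2 s windows:

<code csharp>
using System;
using System.Collections.Generic;
using Microsoft.Devices.Sensors;   // Accelerometer, AccelerometerReading
using Microsoft.Xna.Framework;     // Vector3

// Hypothetical helper: samples the accelerometer every ~100 ms and
// raises an event with the 2 s average of the readings.
public class AccelerometerSampler
{
    private readonly Accelerometer _accelerometer = new Accelerometer();
    private readonly List<Vector3> _window = new List<Vector3>();

    // Raised every ~2 seconds with the averaged acceleration vector.
    public event Action<Vector3> WindowAveraged;

    public void Start()
    {
        _accelerometer.TimeBetweenUpdates = TimeSpan.FromMilliseconds(100);
        _accelerometer.CurrentValueChanged += OnCurrentValueChanged;
        _accelerometer.Start();
    }

    public void Stop()
    {
        _accelerometer.Stop();
    }

    private void OnCurrentValueChanged(object sender,
        SensorReadingEventArgs<AccelerometerReading> e)
    {
        _window.Add(e.SensorReading.Acceleration);   // XNA Vector3

        if (_window.Count >= 20)                     // 20 x 100 ms = 2 s
        {
            Vector3 sum = Vector3.Zero;
            foreach (Vector3 v in _window) sum += v;
            Vector3 average = sum / _window.Count;
            _window.Clear();

            var handler = WindowAveraged;
            if (handler != null) handler(average);
        }
    }
}
</code>
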
With the collected and processed data we build a clustering in an orthonormal frame and compute it with a k-means implementation.
We aim to identify 2 clusters, one for the unmoving state and one for a specific move (forward and tilted, in our case).
So when the user performs this move, we can recognize the gesture and then trigger the "play/pause music" action.

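Our implementation relies on the AlgLib library (see the software packages below); the sketch that follows is only an illustrative, self-contained 2-cluster k-means on averaged acceleration vectors, with hypothetical names, to show the principle:

<code csharp>
using System.Collections.Generic;
using Microsoft.Xna.Framework;   // Vector3

// Illustrative 2-means clustering of averaged acceleration vectors.
// The real project uses AlgLib; this only shows the principle.
public static class TwoMeans
{
    // Returns two centroids: ideally one for the "unmoving" state
    // and one for the recognized move.
    public static Vector3[] Cluster(IList<Vector3> samples, int iterations)
    {
        // Initial centroids: first and last samples (arbitrary choice).
        Vector3[] centroids = { samples[0], samples[samples.Count - 1] };

        for (int it = 0; it < iterations; it++)
        {
            Vector3[] sums = { Vector3.Zero, Vector3.Zero };
            int[] counts = { 0, 0 };

            // Assignment step: attach each sample to the nearest centroid.
            foreach (Vector3 s in samples)
            {
                int nearest =
                    Vector3.DistanceSquared(s, centroids[0]) <=
                    Vector3.DistanceSquared(s, centroids[1]) ? 0 : 1;
                sums[nearest] += s;
                counts[nearest]++;
            }

            // Update step: move each centroid to the mean of its cluster.
            for (int k = 0; k < 2; k++)
                if (counts[k] > 0)
                    centroids[k] = sums[k] / counts[k];
        }
        return centroids;
    }
}
</code>
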
The result is presented to the user both audibly and visually: the song is paused or played, and the corresponding buttons are updated.

As we already did without the clustering algorithm, we want to add 2 more clusters in order to recognize 2 other moves. These moves could be mapped to the previous and next buttons.
But any target action can be imagined, for example navigating through a picture gallery.

=== SOFTWARE PACKAGES of the Project ===

   * README File


You will need the Windows Phone 8.1 SDK with Visual Studio 2013 SP2 installed on your computer.
Then, import the project into a new Windows Phone 8.1 project. You don't need to import any .dll into the project.
Finally, you just need to run the project (on a device, to have access to all functionalities).

   * Project zip file

[[https://mega.co.nz/#!BJwzCLra!th2BEZJOyhiCrrNtziD-zqOpaubdgh_-Tj-U8h8QIwk]]

   * All required software

You don't need to configure anything to make the project work; here is some documentation about the libraries we used during the development of our project:

AlgLib: http://www.alglib.net/

How to get data from the accelerometer: http://msdn.microsoft.com/en-us/library/windows/apps/ff431810%28v=vs.105%29.aspx

How to use the Media Library: http://msdn.microsoft.com/en-us/library/microsoft.xna.framework.media.medialibrary.aspx
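
For reference, here is a minimal, hypothetical sketch (not the project's actual player code) of how the XNA Media Library can drive playback:

<code csharp>
using Microsoft.Xna.Framework.Media;   // MediaLibrary, MediaPlayer, MediaState

// Hypothetical wrapper showing the XNA media calls used to control playback.
public class SimplePlayer
{
    private readonly MediaLibrary _library = new MediaLibrary();

    // Starts playing the phone's song collection from the beginning.
    public void PlayAll()
    {
        MediaPlayer.Play(_library.Songs);
    }

    // Toggles between pause and resume.
    public void TogglePlayPause()
    {
        if (MediaPlayer.State == MediaState.Playing)
            MediaPlayer.Pause();
        else
            MediaPlayer.Resume();
    }

    public void Next()     { MediaPlayer.MoveNext(); }
    public void Previous() { MediaPlayer.MovePrevious(); }
}
</code>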
  
=== HOW TO USE IT ===
  
  
Here's the main interface:
  
[[http://image.noelshack.com/fichiers/2014/47/1416307718-recognizer.png]]
  
1) You need to have some music (at least two songs) on your phone to make it work.
  
2) You can tap the Play/Pause, Previous and Next song icons to trigger the expected behavior in your music experience.
  
3) When the Shake button is Off, you can use the accelerometer to trigger actions (see the sketch after this list):
     * An acceleration on the Z axis triggers the Play/Pause behavior
     * An acceleration on the X axis triggers the NextSong behavior
     * An acceleration on the Y axis triggers the PreviousSong behavior.
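
A hypothetical sketch of this mapping (the 1.5f threshold and the names are illustrative assumptions, not values taken from the project):

<code csharp>
using System;
using Microsoft.Xna.Framework;   // Vector3

// Possible actions triggered by a gesture.
public enum PlayerAction { None, PlayPause, NextSong, PreviousSong }

// Illustrative mapping of the dominant acceleration axis to a player action
// (Shake mode Off). The threshold below is an arbitrary example value.
public static class GestureDispatcher
{
    private const float Threshold = 1.5f;

    public static PlayerAction Dispatch(Vector3 acceleration)
    {
        if (Math.Abs(acceleration.Z) > Threshold) return PlayerAction.PlayPause;     // Z axis
        if (Math.Abs(acceleration.X) > Threshold) return PlayerAction.NextSong;      // X axis
        if (Math.Abs(acceleration.Y) > Threshold) return PlayerAction.PreviousSong;  // Y axis
        return PlayerAction.None;
    }
}
</code>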
  
4) Turn the Shake button to On.
  
5) Now the device uses our clustering on the accelerometer data. If you shake your phone along the Y axis, the k-means algorithm will detect that the device is shaking and will trigger the Play/Stop behavior.
  
  
=== RESULTS ===
  
The accuracy of the accelerometer values is pretty good, whether the Shake button is in the On or Off state (we are using the same sensors either way).
  
When the Shake mode is Off, our computations to detect the type of gesture made by the user work pretty well, but the user has to perform the gestures with some amplitude.
  
If the Shake mode is On, when we shake the device along the Y axis our clustering determines that the device is moving, and our application then starts or pauses the current song. We can improve our clustering by giving the k-means algorithm more data. Another perspective is to refine our clustering to detect more gestures.