====== Projets Objets Connectés et Services (OCS) ======
  
  
=== MEMBERS ===
  
[Swan Jumelle-Dupuy] and [Antoine Lavail]
  
=== Title of the Project: GestureRecognizer - MusicPlayer ===
  
=== Name of the members of the Group ===
  
    * Name: Jumelle-Dupuy
    * First name: Swan
    * Option: IHM
    * Email: jumelled@polytech.unice.fr

    * Name: Lavail
    * First name: Antoine
    * Option: IHM
    * Email: alavail@polytech.unice.fr
  
=== Equipment ===
=== Content of the Project ===
  
Our project aims to revamp the way a music player is used. We chose to leverage the added value of mobile sensors in order to redefine the usual interactions of a music-loving technophile.
Focusing on those mobile sensors, we concentrate on making good use of the data through a machine learning classification technique: clustering. This algorithm allows us to recognize the user's gestures, which our software then maps to actions on the music player.
  
To recognize gestures, we planned to gather data from the phone's accelerometer. We therefore use the sensors API (Microsoft.Devices.Sensors) together with the XNA Framework, because accelerometer data is delivered as an XNA Framework Vector3 object.
Every 100 ms we sample the sensor, and every 2 s we compute the average of the collected values. This was the best compromise between recognition quality and user experience.
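The sampling scheme above can be sketched as follows. This is an illustrative Python sketch, not the project's C# code: the (x, y, z) tuples stand in for the Vector3 readings delivered by the sensors API, one reading per 100 ms, averaged over 2 s windows of 20 samples.

```python
# Illustrative sketch of the sampling scheme: one (x, y, z) reading per
# 100 ms, averaged per axis over each 2 s window (20 samples).
# Not the project's C# code; readings stand in for Vector3 samples.

def average_window(samples):
    """Average a list of (x, y, z) accelerometer readings per axis."""
    n = len(samples)
    sx = sum(s[0] for s in samples)
    sy = sum(s[1] for s in samples)
    sz = sum(s[2] for s in samples)
    return (sx / n, sy / n, sz / n)

def windows(readings, size=20):
    """Group readings (one per 100 ms) into 2 s windows and average each."""
    for i in range(0, len(readings) - size + 1, size):
        yield average_window(readings[i:i + size])
```

Each averaged window, rather than each raw reading, is what gets fed to the classifier, which smooths out sensor noise at the cost of some latency.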
  
From the collected and processed data we build a clustering in an orthonormal frame and compute it with a k-means implementation.
We aim to identify two clusters: one for the motionless state and one for a specific move (forward and tilted, in our case).
When the user performs that move, we recognize the gesture and trigger the play/pause action on the music player.
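A minimal two-cluster k-means over averaged (x, y, z) vectors can be sketched like this. This is an illustrative Python sketch under assumed sample values; the project itself relies on an AlgLib implementation in C#.

```python
# Minimal k-means (k = 2) over averaged accelerometer vectors.
# Illustrative sketch only; the project uses AlgLib in C#.
# Cluster 0 ~ the resting pose, cluster 1 ~ the forward-tilted move.

def dist2(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def kmeans(points, centroids, iters=10):
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        groups = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)), key=lambda i: dist2(p, centroids[i]))
            groups[i].append(p)
        # Update step: move each centroid to the mean of its group.
        for i, g in enumerate(groups):
            if g:
                centroids[i] = tuple(sum(c) / len(g) for c in zip(*g))
    return centroids

def assign(p, centroids):
    """Index of the nearest centroid for a new averaged sample."""
    return min(range(len(centroids)), key=lambda i: dist2(p, centroids[i]))
```

Once trained, classifying a new 2 s window is just `assign(window, centroids)`: cluster 0 means "do nothing", cluster 1 means "toggle play/pause".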
  
The information is presented to the user both audibly and visually: the song is paused or playing, and the corresponding buttons are updated.
  
As we already did without the clustering algorithm, we want to add two more clusters to recognize two other moves. These moves could be bound to the Previous and Next buttons.
In fact, any action could be targeted this way, for example navigating through a picture gallery.
  
=== SOFTWARE PACKAGES of the Project ===
  
   * README File
You will need the Windows Phone 8.1 SDK, with Visual Studio 2013 SP2 installed on your computer.
Then, import the project as a new Windows Phone 8.1 project. You don't need to import any .dll into the project.
Finally, just run the project (on a physical device, to have access to all functionalities).
  
   * Project zip file
[[https://mega.co.nz/#!BJwzCLra!th2BEZJOyhiCrrNtziD-zqOpaubdgh_-Tj-U8h8QIwk]]
  
   * All required software
You don't need to configure anything to make the project work. Here is some documentation about the libraries we used during the development of our project:

  * AlgLib: [[http://www.alglib.net/]]
  * How to get data from the accelerometer: [[http://msdn.microsoft.com/en-us/library/windows/apps/ff431810%28v=vs.105%29.aspx]]
  * How to use the MediaLibrary: [[http://msdn.microsoft.com/en-us/library/microsoft.xna.framework.media.medialibrary.aspx]]
  
=== HOW TO USE IT ===
  
  
Here's the main interface:
  
[[http://image.noelshack.com/fichiers/2014/47/1416307718-recognizer.png]]
  
1) You need to have some music (at least two songs) on your phone to make it work.
  
2) You can tap the Play/Pause, Previous and Next icons to trigger the corresponding action on the music player.
  
3) When the Shake button is Off, you can use the accelerometer to trigger actions:
     * An acceleration on the Z axis triggers the Play/Pause behavior.
     * An acceleration on the X axis triggers the NextSong behavior.
     * An acceleration on the Y axis triggers the PreviousSong behavior.
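The Shake-Off mode above amounts to a dominant-axis test. A simplified, hypothetical sketch follows: the 0.8 threshold is an assumption, not a documented project value, and a real implementation would also compensate for gravity on the Z axis before comparing magnitudes.

```python
# Sketch of the Shake-Off mode: map the dominant acceleration axis to a
# player action. The 0.8 threshold is assumed for illustration; a real
# implementation would subtract gravity from z before comparing.

THRESHOLD = 0.8

def axis_action(x, y, z):
    """Return the action for the strongest axis above THRESHOLD, else None."""
    mags = {"play_pause": abs(z), "next_song": abs(x), "previous_song": abs(y)}
    action, mag = max(mags.items(), key=lambda kv: kv[1])
    return action if mag >= THRESHOLD else None
```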
  
4) Turn the Shake button to On.
  
5) The device now runs our clustering on the accelerometer data. If you shake your phone along the Y axis, the k-means algorithm detects that the device is shaking and triggers the Play/Pause behavior.
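Step 5 can be pictured as a nearest-centroid lookup followed by an action dispatch. The centroid values and action names below are hypothetical, chosen only to illustrate the idea, not measured from the project.

```python
# Sketch of step 5: classify each 2 s averaged sample against learned
# centroids and trigger the mapped player action. Centroid values and
# action names are illustrative, not the project's actual ones.

def nearest(sample, centroids):
    """Index of the centroid closest to the sample."""
    return min(
        range(len(centroids)),
        key=lambda i: sum((s - c) ** 2 for s, c in zip(sample, centroids[i])),
    )

CENTROIDS = [(0.0, 0.0, 1.0),   # cluster 0: device at rest
             (0.0, 1.2, 0.4)]   # cluster 1: shaking along the Y axis
ACTIONS = {0: "none", 1: "play_pause"}

def on_window(sample):
    """Called once per averaged window; returns the action to trigger."""
    return ACTIONS[nearest(sample, CENTROIDS)]
```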
  
=== RESULTS ===
  
The accuracy of the accelerometer values is quite good, whether the Shake button is On or Off (we are using the same sensor in both cases).
  
When Shake mode is Off, our computations detect the type of gesture made by the user and work quite well, although the user has to perform the gestures with some amplitude.
  
When Shake mode is On, shaking the device along the Y axis makes our clustering determine that the device is moving, and the application then starts or pauses the current song. We could improve the clustering by giving the k-means algorithm more data; another perspective is to refine the clustering to detect more gestures.
projets/plim/20142015/gr1.1416303301.txt.gz · Last modified: 2014/11/18 09:35 by lavail