Revisions of projets:plim:20142015:gr1 — 2014/10/14 09:25 (tigli) → 2014/11/18 10:53, current version (lavail)
====== Projets Objets Connectés et Services (OCS) ======
Here is the code of our project on GitHub: <

==== GROUP GR1 ====

//In case X=1, your Web page is// <

//You will find instructions at the bottom of the page for filling in your site.//

==== MAIN INFORMATION about the Project ====

=== MEMBERS ===

[Swan Jumelle-Dupuy] and [Antoine Lavail]

=== Title of the Project : GestureReconizer - MusicPlayer ===
=== Name of the members of the Group ===
  * Jumelle-Dupuy
  * Swan
  * IHM
  * jumelled@polytech.unice.fr

  * Lavail
  * Antoine
  * IHM
  * alavail@polytech.unice.fr
=== Equipment ===
  * HTC 8s
  * Personal phone: N
  * IMEI: 358721050411700
=== Content of the Project ===
Our project aims to revamp the use of a music player.
We have chosen to exploit the added value of mobile sensors in order to redefine the usual interactions of a technophile who is passionate about music.
Focusing on those sensors, we make good use of the data with a machine-learning algorithm: clustering. This algorithm allows us to recognize user gestures, after which our software triggers actions on the music player.

In order to recognize gestures, we planned to gather data from the mobile accelerometer. We therefore use the sensors API (Microsoft.Devices.Sensors) and the XNA Framework, because accelerometer data is passed in the form of an XNA Framework Vector3 object.
Every 100 ms we measure the value of the sensor, and every 2 s we compute the average of those values. This was the best compromise between recognition and user experience.
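The sampling scheme above can be sketched as follows. This is a minimal Python sketch, not the app's C# code; the constants simply mirror the 100 ms / 2 s figures in the text, and the function name is ours:

```python
from statistics import mean

# Assumed constants mirroring the text: sample every 100 ms,
# average over 2-second windows (20 samples per window).
SAMPLE_PERIOD_MS = 100
WINDOW_MS = 2000

def window_averages(samples):
    """Average (x, y, z) accelerometer readings over consecutive
    2-second windows; `samples` is a list of (x, y, z) tuples."""
    per_window = WINDOW_MS // SAMPLE_PERIOD_MS  # 20 samples per window
    averages = []
    for start in range(0, len(samples) - per_window + 1, per_window):
        window = samples[start:start + per_window]
        # zip(*window) groups the x, y and z components separately
        averages.append(tuple(mean(axis) for axis in zip(*window)))
    return averages
```

Each averaged (x, y, z) tuple is then one data point fed to the recognition step.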
+ | |||
+ | With the collected and processed data we build a Clustering in an orthonormal frame and compute it with a kMeans implementation. | ||
+ | We aim to identify 2 clusters, one considering the unmoving state, and one considering a specifical move (forward and tilted for us). | ||
+ | So when the user make this actions, we can recognize the gesture, and then target the “play/pause music” actions. | ||
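The two-cluster recognition can be illustrated with a plain k-means sketch. The project itself relies on an AlgLib implementation on Windows Phone; the function names, parameters, and cluster labels below are illustrative assumptions:

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two 3-D vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k=2, iters=50, seed=0):
    """Minimal k-means: pick k initial centroids, then alternate
    between assigning points and recomputing centroid means."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(axis) / len(cl) for axis in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

def classify(sample, centroids):
    """Index of the nearest centroid, e.g. one index for the unmoving
    state and the other for the recognized move."""
    return min(range(len(centroids)), key=lambda i: dist2(sample, centroids[i]))
```

Each 2-second averaged accelerometer vector is classified against the centroids; a sample landing in the "move" cluster triggers the play/pause action.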
+ | |||
+ | The informations are auditively and visualy showed | ||
+ | |||
+ | As we done without the Clustering algorithm, we want to add 2 more clusters to recognize 2 other moves. This moves could be connected to the previous and next buttons. | ||
+ | But we can imagine every possible targeted actions, for example the navigation in a picture gallery. | ||
+ | |||
+ | === SOFTWARE PACKAGES of the Project === | ||
+ | |||
+ | * README File | ||
+ | |||
+ | |||
+ | You will need the Windows Phone 8.1 SDK with Visual Studio 2013 SP2 installed on your computer. | ||
+ | Then, import the project into a new Windows Phone 8.1 project. You don’t need to import any .dll into the project. | ||
+ | Finally, you just need to run the project (on a device, to have access to all functionalities) | ||
+ | |||
+ | * Project zip file | ||
+ | |||
+ | [[https://mega.co.nz/# | ||
+ | |||
+ | * All required softwares | ||
+ | |||
+ | You don’t need to configure anything to make the project works, here’s some documentation about the libraries we use during the development of our project: | ||
+ | |||
+ | AlgLib: http:// | ||
+ | |||
+ | How to get data from the accelerometer: | ||
+ | |||
+ | How to use the Media Library: http:// | ||
=== HOW TO USE IT ===

Here's [[http://
1) You need to have some music on the phone (at least two songs).

2) You can tap on the Play/Pause, Previous song and Next song icons to launch the corresponding action.

3) When the Shake button is Off, you can use the accelerometer:
  * An acceleration on the Z axis triggers the Play/Pause behavior.
  * An acceleration on the X axis triggers the NextSong behavior.
  * An acceleration on the Y axis triggers the PreviousSong behavior.
4) Turn the Shake button to On.

5) Now the device uses our clustering on the accelerometer data. If you shake your phone along the Y axis, the k-means algorithm will understand that the device is shaking and will trigger the Play/Stop behavior.
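The Shake-off mapping in step 3 can be sketched as a simple per-axis threshold check. The threshold value and the axis priority order below are assumptions for illustration, not values taken from the app (which reads a Vector3 from Microsoft.Devices.Sensors in C#):

```python
# Assumed threshold (in g) beyond which an axis counts as a gesture.
THRESHOLD = 1.5

def dispatch(x, y, z):
    """Map one acceleration reading to a player action, mirroring the
    Shake-off mode above. Axis priority (Z, then X, then Y) is assumed."""
    if abs(z) > THRESHOLD:
        return "play_pause"
    if abs(x) > THRESHOLD:
        return "next_song"
    if abs(y) > THRESHOLD:
        return "previous_song"
    return None  # no gesture detected
```

In Shake-on mode this direct check is replaced by the clustering classification described in the Content section.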
=== RESULTS ===

The accuracy of the accelerometer values is pretty good, whether the Shake button is in the On or Off state (we're using the same sensors anyway).

When the Shake mode is Off, our calculations to detect the type of gesture made by the user work pretty well, but the user has to perform the gestures with some amplitude.
If the Shake mode is On, when we shake our device along the Y axis, our clustering determines that the device is moving, and our application then starts or pauses the played song. We could improve our clustering by giving more data to our k-means algorithm. Another perspective is to polish our clustering to detect more gestures.
projets/plim/20142015/gr1.1413278744.txt.gz · Last modified: 2014/10/14 09:25 by tigli