==== Activity Recognition Using Cell Phone Accelerometer and GPS Sensors ====
=== Authors ===
  * Palma, Adam [IFI/IAM], palma.adam@etu.unice.fr
  * Rocher, Gerald [IFI/IAM], rocher.gerald@gmx.fr

== Abstract ==
//**The aim of this project is to evaluate an approach to accurately recognize a range of user activities and report the duration of each activity. For that purpose, the tri-axial accelerometer and GPS sensors available in all modern smartphones are used for the classification of four activities: resting, walking, running and driving a car. Time-domain features are extracted from the GPS sensor (the user's average speed) and the tri-axis accelerometer (means, standard deviations). The raw accelerometer data is cleaned up using a Butterworth low-pass filter, and a Fast Fourier Transform (FFT) is then applied to extract frequency-domain features on each axis. Finally, the unsupervised k-means clustering algorithm is used to recognize and classify activities from the time- and frequency-domain features. Cluster centroids are made persistent to retain what has been learned and constantly improve the recognition accuracy.**//
  
=== Equipment ===
=== Content of the Project ===
  
== Activities ==

We considered four user activities in this study: resting, walking, running and driving a car. These activities are common to all users. They are also not "transient": each one lasts for a long period of time, which facilitates the subsequent clustering.

== Data collection ==

The sensor data were gathered on an HTC Windows Phone 8S, using an application developed with the Windows Phone SDK 8.0.
Sensor data are gathered over a 6s measurement window. The accelerometer sampling period is set to 35ms by default, which gives up to 171 measurements per axis per measurement window. GPS latitude and longitude are measured every second during the measurement window, but only if the user has moved by more than 10m within that period.

__We have used the following namespaces to read the sensor data (a sketch of the setup follows the list):__
  - Windows.Devices.Sensors ([[http://msdn.microsoft.com/en-us/library/ie/windows.devices.sensors]])
  - Windows.Devices.Geolocation ([[http://msdn.microsoft.com/library/windows/apps/BR225603]])
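
As an illustration only, a minimal sketch of this sensor setup (class and field names are our own; buffering and window management are omitted) could look as follows:

<code csharp>
using System;
using Windows.Devices.Sensors;      // accelerometer (WinRT API available on Windows Phone 8)
using Windows.Devices.Geolocation;  // GPS

// Hypothetical collector: 35 ms accelerometer sampling, GPS positions at most once per
// second and only after a movement of more than 10 m, as described above.
public class SensorCollector
{
    // may be null on devices without an accelerometer (not handled in this sketch)
    private readonly Accelerometer _accelerometer = Accelerometer.GetDefault();
    private readonly Geolocator _geolocator = new Geolocator();

    public void Start()
    {
        // 35 ms period -> up to ~171 samples per axis in a 6 s window (6000 / 35)
        _accelerometer.ReportInterval = Math.Max(_accelerometer.MinimumReportInterval, 35u);
        _accelerometer.ReadingChanged += (s, e) =>
        {
            AccelerometerReading r = e.Reading;
            // buffer r.AccelerationX, r.AccelerationY, r.AccelerationZ for the current window
        };

        _geolocator.DesiredAccuracy = PositionAccuracy.High;
        _geolocator.MovementThreshold = 10;   // metres
        _geolocator.ReportInterval = 1000;    // milliseconds
        _geolocator.PositionChanged += (s, e) =>
        {
            Geocoordinate c = e.Position.Coordinate;
            // buffer c.Latitude and c.Longitude for the speed computation
        };
    }
}
</code>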

== Data filtering ==

In order to remove the high-frequency noise that appears on the accelerometer axis measurements in real conditions, we implemented a [[http://fr.wikipedia.org/wiki/Filtre_de_Butterworth|Butterworth]] low-pass filter with a cutoff frequency set to 100Hz.
Also, the walking and running activities generate a periodic pattern on the accelerometer axis data in a frequency range from 2Hz to 6Hz, while the resting and driving activities do not produce any periodic pattern on the accelerometer axes. The frequency of this periodic pattern cannot be measured in the time domain, hence the use of a [[http://en.wikipedia.org/wiki/Fast_Fourier_transform|Fast Fourier Transform]] (FFT) applied to the raw (or filtered) accelerometer data of each axis in order to extract the pattern frequency feature.
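
The exact order and coefficients of the Butterworth filter used in the project are not detailed here; purely as an illustration, the simplest possible stand-in is a first-order (single-pole) low-pass filter applied to one axis:

<code csharp>
using System;

public static class Filtering
{
    // First-order low-pass filter (illustrative stand-in for the project's Butterworth filter).
    // samplePeriodSec would be 0.035 for the default 35 ms accelerometer sampling period.
    public static double[] LowPass(double[] samples, double cutoffHz, double samplePeriodSec)
    {
        // smoothing factor: alpha = dt / (RC + dt), with RC = 1 / (2 * pi * fc)
        double rc = 1.0 / (2.0 * Math.PI * cutoffHz);
        double alpha = samplePeriodSec / (rc + samplePeriodSec);

        var filtered = new double[samples.Length];
        if (samples.Length == 0) return filtered;

        filtered[0] = samples[0];
        for (int i = 1; i < samples.Length; i++)
        {
            // each output sample moves a fraction alpha towards the new input sample
            filtered[i] = filtered[i - 1] + alpha * (samples[i] - filtered[i - 1]);
        }
        return filtered;
    }
}
</code>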

== Features extraction ==

We describe in this part the time-domain and frequency-domain features extracted from the raw data and made available to the classification algorithm once the 6s measurement window is over:
  * Average[3]: the average acceleration of each axis in the time domain,
  * StDev[3]: the standard deviation of each axis in the time domain,
  * Freq[3]: the highest FFT frequency peak of each axis in the frequency domain,
  * Speed[1]: computed from the distance travelled during the measurement window.
The averages and standard deviations are computed over the 171 measurements gathered during the measurement window.
The frequency step of the FFT on the tri-axis accelerometer data is given by the accelerometer sampling rate divided by the number of samples in the FFT result array (which is a power of 2). For instance, with a 35ms sampling period for the accelerometer, we get 171 measurements once the 6s measurement window is over, and therefore 256 values in the FFT result array. The frequency step is thus (1/35ms)/256 = 0.111Hz, and the maximum frequency the FFT can resolve is half the sampling rate (Nyquist), i.e. about 14.3Hz.
So, by estimating the periodic pattern of the walking/running activities at 10Hz max on the acceleration measurements of each axis, the accelerometer sampling period should not be set above 50ms (Shannon-Nyquist sampling theorem).
Note also that increasing the measurement window increases the FFT accuracy, but with a higher memory footprint since more sensor measurements have to be gathered. Besides the fact that such accuracy is unnecessary, increasing the measurement window might generate cluster mapping errors, since a user activity transition (from resting to walking, from walking to running) can occur inside a (long) measurement window. On the other hand, reducing the measurement window prevents cluster mapping errors but degrades the FFT results and the user speed computation.
So the validity of the features depends on a few parameters that have to be carefully tuned depending on the activities to be classified. The most important parameters are listed below (a sketch of the feature extraction follows the list):
  * Measurement window (default 6s),
  * Accelerometer sampling period (default 35ms),
  * GPS position change reporting time (default 1s),
  * GPS position change reporting trigger (default 10m).
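
A sketch of how the resulting ten-dimensional feature vector could be assembled is given below. All names and the ordering of the features are illustrative, ''FftMagnitudes'' stands for whatever FFT routine the project uses (assumed to return the magnitude spectrum of a 256-point FFT), and the km/h speed unit is an assumption:

<code csharp>
using System;
using System.Linq;

public static class Features
{
    // Builds the 10-dimensional feature vector: Average[3], StDev[3], Freq[3], Speed[1].
    // axes[0..2] hold the filtered X, Y and Z samples of the current 6 s window.
    public static double[] Extract(double[][] axes, double distanceMetres,
                                   double windowSec, double samplePeriodSec)
    {
        var features = new double[10];
        double sampleRateHz = 1.0 / samplePeriodSec;      // ~28.57 Hz for a 35 ms period
        int fftSize = 256;                                // next power of two above 171 samples
        double freqStepHz = sampleRateHz / fftSize;       // ~0.111 Hz per FFT bin

        for (int a = 0; a < 3; a++)
        {
            double mean = axes[a].Average();
            double std = Math.Sqrt(axes[a].Select(v => (v - mean) * (v - mean)).Average());
            features[a] = mean;                           // Average
            features[3 + a] = std;                        // StDev

            double[] spectrum = FftMagnitudes(axes[a], fftSize);  // assumed FFT helper
            int peakBin = 1;                              // skip the DC bin (index 0)
            for (int k = 2; k < fftSize / 2; k++)
                if (spectrum[k] > spectrum[peakBin]) peakBin = k;
            features[6 + a] = peakBin * freqStepHz;       // Freq (highest spectral peak)
        }

        // Speed from the distance travelled during the window (km/h is an assumption).
        features[9] = distanceMetres / windowSec * 3.6;
        return features;
    }

    // Placeholder for the FFT routine used by the project (not shown here).
    private static double[] FftMagnitudes(double[] samples, int fftSize)
    {
        throw new NotImplementedException();
    }
}
</code>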

== Classification ==

The [[http://home.deib.polimi.it/matteucc/Clustering/tutorial_html/kmeans.html|k-means]] unsupervised classification algorithm is used to cluster the feature vectors into categories that map onto user activities. It is well suited to our purpose since it is fast and we know upfront the number of clusters, which corresponds to the user activities we want to track (resting, walking, running and driving a car).
For each ten-dimensional feature vector (the ten features defined earlier), the algorithm computes the Euclidean distance between the vector and each cluster's mean value (the cluster's centroid). The vector is assigned to the nearest cluster (the one with the lowest Euclidean distance), and that cluster's centroid is then updated to take the newly assigned vector into account.
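
A minimal sketch of this assignment/update step (array-based, with illustrative names) could be:

<code csharp>
public static class KMeans
{
    // Assigns a feature vector to the nearest centroid (Euclidean distance) and updates
    // that centroid incrementally, weighted by the number of vectors already assigned to it.
    // Returns the index of the cluster, i.e. the recognized activity.
    public static int AssignAndUpdate(double[] vector, double[][] centroids, int[] sampleCounts)
    {
        int best = 0;
        double bestDistance = double.MaxValue;
        for (int c = 0; c < centroids.Length; c++)
        {
            double distance = 0;
            for (int i = 0; i < vector.Length; i++)
            {
                double diff = vector[i] - centroids[c][i];
                distance += diff * diff;   // squared distance is enough for the comparison
            }
            if (distance < bestDistance) { bestDistance = distance; best = c; }
        }

        // running-mean update: newCentroid = oldCentroid + (vector - oldCentroid) / n
        sampleCounts[best]++;
        for (int i = 0; i < vector.Length; i++)
            centroids[best][i] += (vector[i] - centroids[best][i]) / sampleCounts[best];

        return best;
    }
}
</code>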

== Learning ==

As it is an iterative approach, the k-means classification algorithm needs a large number of input vectors to accurately define the cluster centroids. For that reason we first repeat each of the defined activities several times until the centroid values of the four clusters are accurate. Once done, we manually annotate each cluster with the name of the corresponding activity.
The cluster centroids have to be persistent in the application in order to enable continuous iterative learning. For that purpose, we record in the phone's internal file system the four centroid values together with the number of samples used to compute each of them. Recording the number of samples used to compute each centroid is important to ensure that the centroid value is correctly weighted (see the sketch below).
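
As an illustration, the centroids and their sample counts could be persisted in the application's isolated storage along these lines (the file name and the text format are our own, not necessarily the project's):

<code csharp>
using System.Globalization;
using System.IO;
using System.IO.IsolatedStorage;
using System.Text;

public static class CentroidStore
{
    // Writes one line per cluster: the sample count followed by the centroid coordinates.
    public static void Save(double[][] centroids, int[] sampleCounts)
    {
        using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
        using (var writer = new StreamWriter(store.OpenFile("centroids.txt", FileMode.Create)))
        {
            for (int c = 0; c < centroids.Length; c++)
            {
                var line = new StringBuilder();
                line.Append(sampleCounts[c].ToString(CultureInfo.InvariantCulture));
                foreach (double v in centroids[c])
                    line.Append(';').Append(v.ToString(CultureInfo.InvariantCulture));
                writer.WriteLine(line.ToString());
            }
        }
    }
}
</code>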
  
=== SOFTWARE PACKAGES of the Project ===
  
== Requirements ==
  - Windows 8/8.1 Professional.
  - Visual Studio 2013 SP2 update 3.
  - Windows Phone SDK 8.0.
  
== Project zip file ==

[[https://drive.google.com/file/d/0B6t-5TDyw60heUtzTXlja3hndUE/view?usp=sharing|Project.zip]]

== Installation ==

  - Unzip the project zip file (see above).
  - Open the project in Visual Studio 2013 (the "UserBehavior.sln" file in the "UserBehavior\" directory).

== Deployment on the cell phone ==

You need a Microsoft Windows developer account in order to deploy this application on a cell phone (see [[http://msdn.microsoft.com/en-us/library/windows/apps/hh868184.aspx]]).
  - Connect the device (phone) to the computer with the USB cable,
  - in the "BUILD" menu, select "Deploy Solution".
  
== The application requires some external libraries to work: ==
''PM> Install-Package SilverlightToolkit-DataViz''
  
__This will install:__
  
  - System.Windows.Controls.DataVisualization.Toolkit
  - System.Windows.Controls
    
=== HOW TO USE IT ===
  
Once started, you will get the following user interface:
  
  * [[https://www.flickr.com/photos/129285131@N04/15241381073/|Main interface]]

This interface is mainly designed for debugging purposes. It displays, in real time, the data gathered during each measurement window from both the accelerometer and the GPS. Some sliders are also available to modify a few parameters (although we do not recommend modifying them).
The GPS sensor status is also shown by a LED (RED: GPS sensor not ready, GREEN: GPS sensor ready).

You have nothing to do: just take the cell phone with you and rest, walk, run or drive a car so that sensor data get gathered.

Finally, the "statistics" button opens a new page displaying the overall results.
So, at the end of the day, you can open the statistics window to get a summary of your daily activities:

  * [[https://www.flickr.com/photos/129285131@N04/15673692180/|Statistics interface]]

=== RESULTS ===
  
Results are given in real time to the user, indicating his current activity. More interestingly, results are also aggregated in the form of a histogram representing the number of vectors assigned to each activity. In this way we are able to provide, on a daily, weekly or monthly basis, a summary of the user's activity. For that purpose a naive algorithm can be used to compute each activity's share as the ratio of its histogram value over the total number of vectors in the histogram.
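
As an illustration of this naive summary (names are our own), assuming a histogram of the number of vectors assigned to each activity:

<code csharp>
using System.Linq;

public static class ActivitySummary
{
    // Returns the share of each activity: the number of windows assigned to its cluster
    // divided by the total number of windows in the period of interest.
    public static double[] Shares(int[] histogram)
    {
        double total = histogram.Sum();
        return histogram.Select(count => total > 0 ? count / total : 0.0).ToArray();
    }
}
</code>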
  
=== CONCLUSION AND FUTURE WORK ===

We have developed a methodology based on the mobile phone's accelerometer and GPS sensors to measure user activity. The first results are very encouraging, despite some instabilities noticed in the GPS measurements that lead to wrong cluster mappings. This phenomenon is aggravated by the fact that the resting-to-walking and walking-to-running vectors are close to each other (from a user-speed standpoint) and very sensitive to the accuracy of the speed (hence GPS) measurements. On the other hand, GPS measurements are key when dealing with the driving activity. We could therefore improve the clustering efficiency by disregarding speed measurements below 15km/h (setting them to 0km/h) and relying only on the periodic pattern of each accelerometer axis to classify the resting, walking and running activities (see the sketch below).
Also, the number of dimensions used in the k-means vectors is quite large, which makes it hard to ensure the data globularity required for the k-means algorithm to work properly [19]. The number of dimensions actually needed might be refined through additional experiments.
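
Purely as an illustration, the speed thresholding suggested above could be as simple as the following (the 15km/h threshold comes from the text; the function name is our own):

<code csharp>
public static class SpeedPreprocessing
{
    // Speeds below 15 km/h are considered unreliable and set to 0 so that only the
    // accelerometer pattern discriminates resting, walking and running.
    public static double ClampLowSpeed(double speedKmh)
    {
        return speedKmh < 15.0 ? 0.0 : speedKmh;
    }
}
</code>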

=== PUBLICATION ===

A preliminary publication of this study can be found below. It still needs to be updated with the real sensor results gathered from the cell phone; reporting the successful matching rate for each activity would be particularly valuable.
[[https://drive.google.com/file/d/0B6t-5TDyw60hQTdvekVUWC1QVlU/view?usp=sharing|Publication]]
  
=== REFERENCES ===

  * [1] Ismail, S. A., Matin, A. F. A., & Mantoro, T. (2012). A Comparison Study of Classifier Algorithms for Mobile-phone's Accelerometer Based Activity Recognition. Procedia Engineering, 41, 224-229.
  * [2] Lee, M.-H., Kim, J., Kim, K., Lee, I., Jee, S. H., & Yoo, S. K. (2009). Physical Activity Recognition Using a Single Tri-Axis Accelerometer. Vol. I, pp. 20-23.
  * [3] Long, X., Yin, B., & Aarts, R. M. (2009). Single-Accelerometer-Based Daily Physical Activity Classification. pp. 6107-6110.
  * [4] Sun, L., Zhang, D., & Li, N. (2011). Physical Activity Monitoring with Mobile Phones. pp. 104-111.
  * [5] Kwapisz, J. R., Weiss, G. M., & Moore, S. A. (2010). Activity Recognition using Cell Phone Accelerometers. Human Factors, 12(2), 74-82.
  * [6] Sun, L., Zhang, D., Li, B., Guo, B., & Li, S. (2010). Activity Recognition on an Accelerometer Embedded Mobile Phone with Varying Positions and Orientations. pp. 548-562.
  * [7] Brezmes, T., Gorricho, J.-L., & Cotrina, J. (2009). Activity Recognition from Accelerometer Data on a Mobile Phone. pp. 796-799.
  * [8] Brezmes, T., Gorricho, J.-L., & Cotrina, J. (2009). Activity Recognition from accelerometer data on mobile phones. In IWANN '09: Proceedings of the 10th International Work-Conference on Artificial Neural Networks, 796-799.
  * [9] Ravi, N., & Dandekar, N. (2005). Activity recognition from accelerometer data. In Proceedings of the Seventeenth Conference on Innovative Applications of Artificial Intelligence.
  * [10] Anguita, D., Ghio, A., Oneto, L., Parra, X., & Reyes-Ortiz, J. L. (2012). Human activity recognition on smartphones using a multiclass hardware-friendly support vector machine. In Proceedings of the International Workshop of Ambient Assisted Living.
  * [11] Bernecker, T., Graf, F., Kriegel, H. P., Moennig, C., Dill, D., & Tuermer, C. Activity Recognition on 3D Accelerometer Data (Technical Report).
  * [12] Gartner survey on smartphone worldwide penetration (http://www.gartner.com/newsroom/id/2665715)
  * [13] Connaissances et comportements de la population française en matière d'alimentation et d'activité physique (http://www.inpes.sante.fr/CFESBases/catalogue/pdf/1283.pdf)
  * [14] Fujiki, Y. (2010). iPhone as a Physical Activity Measurement Platform. In CHI '10, USA.
  * [15] Ayu, M. A., Mantoro, T., Abdul Matin, A. F., & Basamh, S. S. O. (2011). Recognizing user activity based on accelerometer data from a mobile phone. In Proc. 2011 IEEE Symposium on Computers and Informatics, Kuala Lumpur, Malaysia, pp. 617-621.
  * [16] Subramanya, A., Raj, A., Bilmes, J., & Fox, D. (2006). Recognizing activities and spatial context using wearable sensors. In Proceedings of the 22nd Conference on Uncertainty in Artificial Intelligence.
  * [17] Bao, L., & Intille, S. S. (2004). Activity recognition from user-annotated acceleration data. In Pervasive Computing (pp. 1-17). Springer Berlin Heidelberg.
  * [18] Krause, A., Siewiorek, D. P., Smailagic, A., & Farringdon, J. Unsupervised, dynamic identification of physiological and activity context in wearable computing. In Proceedings of the 7th International Symposium on Wearable Computers, pp. 88-97. IEEE Press.
  * [19] Alizadeh, A. (2014). Gesture Recognition based on Hidden Markov Models from Joints' Coordinates of a Depth Camera for Kids age of 3-8.