arturjordao/WearableSensorData


This repository provides the code and data used in our paper "Human Activity Recognition Based on Wearable Sensor Data: A Standardization of the State-of-the-Art", where we implement and evaluate several state-of-the-art approaches, ranging from handcrafted-feature methods to convolutional neural networks. We also standardize a large number of datasets, which vary in terms of sampling rate, number of sensors, activities, and subjects.

Requirements

Quick Start

  1. Clone this repository
  2. Run
    python <Catal2015|...|ChenXue2015>.py data/<SNOW|FNOW|LOTO|LOSO>/<MHEALTH|USCHAD|UTD-MHAD1_1s|UTD-MHAD2_1s|WHARF|WISDM>.npz
    For example:
    python Catal2015.py data/LOSO/MHEALTH.npz

Data Format

The raw signal provided by each original dataset was segmented using a temporal sliding window of 5 seconds. Its format is (number of samples, 1, temporal window size, number of sensors).
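
As a quick sanity check, the snippet below loads one of the pre-processed .npz files and prints the arrays it contains along with their shapes. This is only a sketch: the key names inside the archive depend on how the file was generated, so inspect them before indexing.

    import numpy as np

    # Example path; see Quick Start for the available protocols and datasets.
    data = np.load('data/LOSO/MHEALTH.npz', allow_pickle=True)

    # List the arrays stored in the archive (key names are not assumed here).
    print(data.files)

    # Each sample tensor should follow the layout
    # (number of samples, 1, temporal window size, number of sensors).
    for key in data.files:
        arr = data[key]
        if hasattr(arr, 'shape'):
            print(key, arr.shape)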

Contributing

Contributions to this repository are welcome. Examples of things you can contribute:

  • Implementation of other methods. See template_hancrafted.py and template_convNets.py
  • Accuracy Improvements.
  • Reporting bugs.

The table below shows the mean accuracy achieved by each method using Leave-One-Subject-Out (LOSO) as the validation protocol. The symbol 'x' denotes that it was not possible to execute the method on the respective dataset.

| Method         | MHEALTH | PAMAP2 | USCHAD | UTD-MHAD1 | UTD-MHAD2 | WHARF | WISDM | Mean Accuracy |
|----------------|---------|--------|--------|-----------|-----------|-------|-------|---------------|
| Kwapisz et al. | 90.41   | 71.27  | 70.15  | 13.04     | 66.67     | 42.19 | 75.31 | 61.29         |
| Catal et al.   | 94.66   | 85.25  | 75.89  | 32.45     | 74.67     | 46.84 | 74.96 | 69.29         |
| Kim et al.     | 93.90   | 81.57  | 64.20  | 38.05     | 64.60     | 51.48 | 50.22 | 63.43         |
| Chen and Xue   | 88.67   | 83.06  | 75.58  | x         | x         | 61.94 | 83.89 | 78.62         |
| Jiang and Yin  | 51.46   | x      | 74.88  | x         | x         | 65.35 | 79.97 | 67.91         |
| Ha et al.      | 88.34   | 73.79  | x      | x         | x         | x     | x     | 81.06         |
| Ha and Choi    | 84.23   | 74.21  | x      | x         | x         | x     | x     | 79.21         |
| Mean Accuracy  | 84.52   | 78.19  | 72.14  | 27.84     | 68.64     | 53.55 | 72.87 | x             |
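
For reference, the sketch below illustrates the LOSO protocol with scikit-learn's LeaveOneGroupOut, holding out all windows of one subject per fold. It assumes arrays X (windows), y (activity labels), and subjects (one subject id per window) are available, and it uses a generic random-forest classifier purely for illustration; the repository's scripts and pre-generated LOSO splits implement the actual evaluation.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import LeaveOneGroupOut

    def loso_mean_accuracy(X, y, subjects):
        """Illustrative LOSO evaluation: each fold holds out one subject.

        X        -- windows, shape (n_samples, 1, window size, n_sensors)
        y        -- activity labels, shape (n_samples,)
        subjects -- subject id per window, shape (n_samples,)
        """
        # Flatten each window into a feature vector for a generic classifier.
        X_flat = X.reshape(len(X), -1)

        accuracies = []
        for train_idx, test_idx in LeaveOneGroupOut().split(X_flat, y, groups=subjects):
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            clf.fit(X_flat[train_idx], y[train_idx])
            accuracies.append(accuracy_score(y[test_idx], clf.predict(X_flat[test_idx])))
        return float(np.mean(accuracies))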

Please cite our paper in your publications if it helps your research.

@article{Jordao:2018,
  author  = {Artur Jordao and Antonio Carlos Nazare and Jessica Sena and William Robson Schwartz},
  title   = {Human Activity Recognition Based on Wearable Sensor Data: A Standardization of the State-of-the-Art},
  journal = {arXiv},
  year    = {2018},
  eprint  = {1806.05226},
}
