Ensembles of Deep LSTM Learners for Activity Recognition using Wearables

Author(s) Guan Y, Ploetz T
Publication type Article
Journal Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Year 2017
Volume 1
Issue 2
Pages
ISSN (electronic) 2474-9567
Full text is available for this publication:
Recently, deep learning (DL) methods have been introduced very successfully into human activity recognition (HAR) scenarios in ubiquitous and wearable computing. In particular, the prospect of overcoming the need for manual feature design, combined with superior classification capabilities, renders deep neural networks very attractive for real-life HAR applications. Even though DL-based approaches now outperform the state-of-the-art in a number of recognition tasks, substantial challenges remain. Most prominently, issues with real-life datasets, typically class imbalance and problematic data quality, still limit the effectiveness of activity recognition using wearables. In this paper, we tackle such challenges through ensembles of deep Long Short-Term Memory (LSTM) networks. LSTM networks currently represent the state-of-the-art, with superior classification performance on relevant HAR benchmark datasets. We have developed modified training procedures for LSTM networks and combine sets of diverse LSTM learners into classifier collectives. We demonstrate that ensembles of deep LSTM learners outperform individual LSTM networks and thus push the state-of-the-art in human activity recognition using wearables. Through an extensive experimental evaluation on three standard benchmarks (Opportunity, PAMAP2, Skoda), we demonstrate the excellent recognition capabilities of our approach and its potential for real-life applications of human activity recognition.
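
The general idea of a classifier collective of LSTM learners can be illustrated with a minimal, self-contained Python/PyTorch sketch. Note that this is not the authors' exact method: the paper develops modified training procedures to obtain diverse LSTM learners, whereas the sketch below uses plain bootstrap resampling for diversity and averages softmax probabilities at prediction time; all class names, shapes, and hyperparameters are illustrative assumptions.

# Illustrative sketch only: an ensemble of LSTM classifiers for windowed
# sensor data, fused by averaging class probabilities. Shapes and
# hyperparameters are assumptions, not values from the paper.
import torch
import torch.nn as nn


class LSTMClassifier(nn.Module):
    """A single LSTM learner: sliding-window sensor frames -> class logits."""

    def __init__(self, n_channels=77, hidden_size=128, n_classes=18):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):              # x: (batch, time, channels)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # logits from the last time step


def train_learner(model, dataset, epochs=5, batch_size=64, lr=1e-3):
    """Train one ensemble member on a (bootstrapped) subset of the data."""
    loader = torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model


def ensemble_predict(models, x):
    """Fuse the collective by averaging softmax probabilities (score fusion)."""
    with torch.no_grad():
        probs = torch.stack([torch.softmax(m(x), dim=-1) for m in models])
    return probs.mean(dim=0).argmax(dim=-1)


if __name__ == "__main__":
    # Synthetic stand-in data: 512 windows of 30 time steps x 77 channels.
    X = torch.randn(512, 30, 77)
    y = torch.randint(0, 18, (512,))
    full = torch.utils.data.TensorDataset(X, y)

    # Diversity here comes from bootstrap resampling; the paper instead
    # derives diverse learners via modified training procedures.
    models = []
    for _ in range(3):
        idx = torch.randint(0, len(full), (len(full),))
        subset = torch.utils.data.Subset(full, idx.tolist())
        models.append(train_learner(LSTMClassifier(), subset, epochs=1))

    print(ensemble_predict(models, X[:8]))

The fusion step is the key design choice: each learner votes with its full probability distribution rather than a hard label, which tends to be more robust when individual learners are uncertain on imbalanced or noisy wearable data.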
Publisher ACM
URL https://doi.org/10.1145/3090076
DOI 10.1145/3090076