Add resource "Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition" Accepted
Changes: 3
Add Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition
- Title
- Unchanged
- Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition
- Type
- Unchanged
- Paper
- Created
- Unchanged
- 2016-01-18
- Description
- Unchanged
- Human activity recognition (HAR) tasks have traditionally been solved using engineered features obtained by heuristic processes. Current research suggests that deep convolutional neural networks are suited to automate feature extraction from raw sensor inputs. However, human activities are made of complex sequences of motor movements, and capturing these temporal dynamics is fundamental for successful HAR. Based on the recent success of recurrent neural networks for time series domains, we propose a generic deep framework for activity recognition based on convolutional and LSTM recurrent units, which: (i) is suitable for multimodal wearable sensors; (ii) can perform sensor fusion naturally; (iii) does not require expert knowledge in designing features; (iv) explicitly models the temporal dynamics of feature activations.
- Link
- Unchanged
- http://www.mdpi.com/1424-8220/16/1/115
- Identifier
- Unchanged
- ISSN: 1424-8220
Resource | v1 | current (v1)
Add Mobile phone based sensing software treated in Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition
- Current
- treated in
Topic to resource relation | v1
Add Deep learning relates to Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition
- Current
- relates to
Topic to resource relation | v1