A real-time sleeping position recognition system using IMU sensor motion data

Odongo Steven Eyobu, Young Woo Kim, Daewoong Cha, Dong Seog Han

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

The emergence of miniature wearable inertial measurement unit (IMU) sensors is a powerful enabler for extracting lying-motion data. Consumer wearable sleep devices with inertial measurement capability are already on the market, though many offer only limited functions such as automatic sleep detection, awakening, detection of sleep position changes, and sleep efficiency estimation. In this study, an IMU sensor is used to capture 3D motion data. A spectrogram-based algorithm for extracting features from the motion data is proposed and implemented. Using the generated spectrogram-based features, a long short-term memory (LSTM) recurrent neural network (RNN) model is used to recognize sleeping positions. Test results show that an accuracy of 99.09% can be achieved in a supervised learning mode. A real-time feature extraction and recognition system is developed to implement the proposed algorithm.
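
As a rough illustration only, and not taken from the paper, the processing chain the abstract describes can be sketched as follows: per-axis spectrograms are computed over a window of 3D IMU data, and the resulting frame sequence is classified by an LSTM. The sampling rate, window parameters, number of position classes, and the use of SciPy and PyTorch are assumptions made for this sketch, not details from the authors' system.

import numpy as np
from scipy.signal import spectrogram
import torch
import torch.nn as nn

FS = 50          # assumed IMU sampling rate (Hz)
N_CLASSES = 4    # assumed sleeping positions, e.g. supine, prone, left, right

def spectrogram_features(accel_xyz, fs=FS, nperseg=64, noverlap=32):
    """Stack per-axis log-spectrograms into a (time, feature) sequence."""
    feats = []
    for axis in range(accel_xyz.shape[1]):           # x, y, z accelerometer axes
        f, t, Sxx = spectrogram(accel_xyz[:, axis], fs=fs,
                                nperseg=nperseg, noverlap=noverlap)
        feats.append(np.log(Sxx + 1e-10).T)          # (time, freq) per axis
    return np.concatenate(feats, axis=1).astype(np.float32)

class PositionLSTM(nn.Module):
    """LSTM classifier over a sequence of spectrogram frames."""
    def __init__(self, n_features, hidden=64, n_classes=N_CLASSES):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                            # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.fc(out[:, -1, :])                # classify from the last frame

# Usage with synthetic data standing in for a 10-second IMU window.
imu = np.random.randn(FS * 10, 3)
seq = torch.from_numpy(spectrogram_features(imu)).unsqueeze(0)
logits = PositionLSTM(n_features=seq.shape[-1])(seq)
print(logits.shape)  # torch.Size([1, 4]) -> one score per assumed position class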

Original language: English
Title of host publication: 2018 IEEE International Conference on Consumer Electronics, ICCE 2018
Editors: Saraju P. Mohanty, Peter Corcoran, Hai Li, Anirban Sengupta, Jong-Hyouk Lee
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-2
Number of pages: 2
ISBN (Electronic): 9781538630259
DOIs
State: Published - 26 Mar 2018
Event: 2018 IEEE International Conference on Consumer Electronics, ICCE 2018 - Las Vegas, United States
Duration: 12 Jan 2018 - 14 Jan 2018

Publication series

Name: 2018 IEEE International Conference on Consumer Electronics, ICCE 2018
Volume: 2018-January

Conference

Conference: 2018 IEEE International Conference on Consumer Electronics, ICCE 2018
Country/Territory: United States
City: Las Vegas
Period: 12/01/18 - 14/01/18
