Explore Intrinsic Geometry of Sleep Dynamics and Predict Sleep Stage by Unsupervised Learning Techniques

Gi Ren Liu, Yu Lun Lo, Yuan Chung Sheu, Hau Tieng Wu

Research output: Chapter in Book/Report/Conference proceeding (Chapter)

Abstract

We propose a novel unsupervised approach for exploring sleep dynamics and automatically annotating sleep stages by combining modern harmonic analysis tools. Specifically, we apply two diffusion-based algorithms, the diffusion map (DM) and alternating diffusion (AD), to reconstruct the intrinsic geometry of sleep dynamics by reorganizing the spectral information of an electroencephalogram (EEG) extracted with a nonlinear-type time-frequency analysis tool, the synchrosqueezing transform (SST). Visualization is achieved through the nonlinear dimension-reduction properties of DM and AD. Moreover, the reconstructed nonlinear geometric structure of the sleep dynamics enables automatic annotation: a hidden Markov model is trained on the embedded features to predict the sleep stage. The prediction performance is validated on a publicly available benchmark database, Physionet Sleep-EDF [extended] SC and ST, with leave-one-subject-out cross-validation. The overall accuracy and macro F1 reach 82.57% and 76% on Sleep-EDF SC and 77.01% and 71.53% on Sleep-EDF ST, which is comparable with state-of-the-art results from supervised learning-based algorithms. The results suggest the potential of the proposed algorithm for clinical applications.
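The embedding step described above can be illustrated with a minimal diffusion map sketch. This is not the authors' full pipeline (it omits the SST feature extraction, the alternating-diffusion sensor fusion, and the HMM stage predictor); it only shows the core DM computation, assuming a plain Gaussian kernel with a hypothetical bandwidth `epsilon` and generic feature vectors in place of EEG spectral features.

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2):
    """Embed rows of X into n_components diffusion coordinates."""
    # Pairwise squared Euclidean distances between samples
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    # Gaussian affinity kernel with bandwidth epsilon
    W = np.exp(-d2 / epsilon)
    # Row-normalize to obtain a Markov transition matrix
    P = W / W.sum(axis=1, keepdims=True)
    # Eigendecompose P; sort eigenpairs by descending eigenvalue
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1) and
    # scale the remaining eigenvectors by their eigenvalues
    return vecs[:, 1:n_components + 1] * vals[1:n_components + 1]

# Toy data standing in for per-epoch EEG spectral feature vectors
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
Y = diffusion_map(X, epsilon=5.0, n_components=2)
print(Y.shape)  # (50, 2)
```

In the chapter's setting, each row of `X` would be a spectral feature vector of one EEG epoch, and the low-dimensional coordinates `Y` would serve both for visualizing the sleep dynamics and as input to the hidden Markov model.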

Original language: English
Title of host publication: Springer Optimization and Its Applications
Publisher: Springer
Pages: 279-324
Number of pages: 46
Publication status: Published - 2021

Publication series

Name: Springer Optimization and Its Applications
Volume: 168
ISSN (Print): 1931-6828
ISSN (Electronic): 1931-6836

All Science Journal Classification (ASJC) codes

  • Control and Optimization
