Deep learning-based sleep staging of cross-subject EEG

Authors: Zhang Jinhui, Wang Peng, Li Lei
Affiliations: Equipment Support Office, Service Support Center, Chinese PLA General Hospital, Beijing 100853; School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing 100876
Corresponding authors: Zhang Jinhui, E-mail: [email protected]; Wang Peng, E-mail: [email protected]
關(guān)鍵詞: 跨對(duì)象睡眠分期;腦電;深度學(xué)習(xí);AttnSleep模型;類(lèi)感知損失函數(shù) 
分類(lèi)號(hào):&nbsp;R318.04
Year, Volume, Issue (Pages): 2022, 41(4): 399-404
Abstract:

Objective Cross-subject EEG sleep staging is a challenging task recently proposed at the top international conference NeurIPS 2021, and it aims to address the shortage of target data in current EEG sleep staging. This paper carries out a preliminary exploration of the task based on deep learning. Through a thorough analysis of the dataset, we design and implement a single-channel EEG sleep staging method that combines the AttnSleep (attention-based deep learning approach for sleep stage classification) model with a class-aware loss function. Methods The cross-subject experimental data come from the official dataset provided for Task 1 of the NeurIPS 2021 BEETL Competition. The EEG data are first standardized during preprocessing, and the proposed method is then applied to sleep staging and its results are evaluated. Results On the two age groups provided by the dataset, the proposed method achieved task scores of 67.33 and 66.68, respectively, and the effect of the class-aware loss function was also verified. Conclusions The experimental results show that the single-channel AttnSleep model with the class-aware loss function helps improve cross-subject EEG sleep staging when target data are scarce. The code will be available at https://github.com/MatrixWP/EEG-sleep-stage-classification.
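The abstract does not specify how the EEG is standardized; the snippet below is only a minimal sketch, assuming per-recording z-score normalization of single-channel epochs stored as a NumPy array of shape (n_epochs, n_samples). The function name, epoch length, and sampling rate are illustrative assumptions, not details from the paper.

```python
import numpy as np

def standardize_epochs(epochs: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Z-score single-channel EEG epochs.

    epochs: array of shape (n_epochs, n_samples), e.g. 30-s epochs at 100 Hz
            (epoch length and rate are assumptions; the paper only states that
            the data are standardized during preprocessing).
    Statistics are computed over the whole recording; per-epoch normalization
    would be an equally plausible variant.
    """
    mean = epochs.mean()
    std = epochs.std()
    return (epochs - mean) / (std + eps)

# Illustrative usage with random data standing in for one recording:
# epochs = np.random.randn(800, 3000).astype(np.float32)
# epochs = standardize_epochs(epochs)
```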
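Likewise, the exact form of the class-aware loss function is not given here; the sketch below assumes one common realization, a cross-entropy loss weighted by inverse class frequency, so that under-represented sleep stages (such as N1) contribute more to the gradient. The helper name make_class_aware_loss and the weighting scheme are assumptions, not the authors' implementation.

```python
import numpy as np
import torch
import torch.nn as nn

def make_class_aware_loss(train_labels: np.ndarray, n_classes: int = 5) -> nn.Module:
    """Build a class-weighted cross-entropy loss from the training label
    distribution (assuming the usual five stages W, N1, N2, N3, REM).

    Rare stages receive larger weights so the majority classes do not dominate
    training; inverse-frequency weighting is an assumption, not necessarily the
    weighting chosen in the paper.
    """
    counts = np.bincount(train_labels, minlength=n_classes).astype(np.float64)
    weights = counts.sum() / (n_classes * np.maximum(counts, 1.0))
    return nn.CrossEntropyLoss(weight=torch.tensor(weights, dtype=torch.float32))

# Usage inside a training loop (model outputs logits of shape [batch, n_classes]):
# criterion = make_class_aware_loss(train_labels)
# loss = criterion(logits, targets)
```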

參考文獻(xiàn):

[1] Luyster FS, Strollo Jr PJ, Zee PC, et al. Sleep: a health imperative[J]. Sleep, 2012, 35(6): 727-734.
[2] Finan PH, Quartana PJ, Remeniuk B, et al. Partial sleep deprivation attenuates the positive affective system: effects across multiple measurement modalities[J]. Sleep, 2017, 40(1): zsw017.
[3] Memar P, Faradji F. A novel multi-class EEG-based sleep stage classification system[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2018, 26(1): 84-95.
[4] Lin YP, Jung TP. Improving EEG-based emotion classification using conditional transfer learning[J]. Frontiers in Human Neuroscience, 2017, 11: 334.
[5] Tsinalis O, Matthews PM, Guo Y, et al. Automatic sleep stage scoring with single-channel EEG using convolutional neural networks[EB/OL]. (2016-10-05)[2022-04-29]. https://arxiv.org/abs/1610.01683
[6] Chambon S, Galtier MN, Arnal PJ, et al. A deep learning architecture for temporal sleep stage classification using multivariate and multimodal time series[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2018, 26(4): 758-769.
[7] Supratak A, Dong H, Wu C, et al. DeepSleepNet: a model for automatic sleep stage scoring based on raw single-channel EEG[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2017, 25(11): 1998-2008.
[8] Mousavi S, Afghah F, Acharya UR. SleepEEGNet: automated sleep stage scoring with sequence to sequence deep learning approach[J]. PLoS One, 2019, 14(5): e0216456.
[9] Chawla NV, Bowyer KW, Hall LO, et al. SMOTE: synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002, 16: 321-357.
[10] Lin TY, Goyal P, Girshick R, et al. Focal loss for dense object detection[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, 42(2): 318-327.
[11] Schirrmeister RT, Springenberg JT, Fiederer LDJ, et al. Deep learning with convolutional neural networks for EEG decoding and visualization[J]. Human Brain Mapping, 2017, 38: 5391-5420.
[12] Eldele E, Chen Z, Liu C, et al. An attention-based deep learning approach for sleep stage classification with single-channel EEG[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2021, 29: 809-818.
[13] Huang W, Cheng J, Yang Y, et al. An improved deep convolutional neural network with multi-scale information for bearing fault diagnosis[J]. Neurocomputing, 2019, 359: 77-92.
[14] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[C]// 31st Conference on Neural Information Processing Systems (NIPS 2017). Long Beach, CA, USA: NIPS, 2017: 5998-6008.
[15] Kingma DP, Ba J. Adam: a method for stochastic optimization[C]// 3rd International Conference on Learning Representations. San Diego, USA: ICLR, 2015: 1-15.
[16] Lawhern VJ, Solon AJ, Waytowich NR, et al. EEGNet: a compact convolutional neural network for EEG-based brain–computer interfaces[J]. Journal of Neural Engineering, 2018, 15(5): 056013.
[17] Goldberger AL, Amaral LA, Glass L, et al. PhysioBank, PhysioToolkit, and PhysioNet: components of a new research resource for complex physiologic signals[J]. Circulation, 2000, 101(23): E215-E220.

服務(wù)與反饋:
文章下載】【加入收藏
提示:您還未登錄,請(qǐng)登錄!點(diǎn)此登錄
 
友情鏈接  