Automatic Sleep Staging Based on 3CNN-BiGRU
Cite this article: TANG Jie, WEN Yuan-mei. Automatic Sleep Staging Based on 3CNN-BiGRU[J]. Computer and Modernization, 2022, 0(2): 120-126.
Authors: TANG Jie  WEN Yuan-mei
Affiliation: College of Information Engineering, Guangdong University of Technology, Guangzhou 510006, Guangdong, China
Abstract: To address the efficiency and accuracy of automatic sleep staging from single-channel EEG signals, this paper proposes 3CNN-BiGRU, an automatic sleep staging model that uses a three-scale parallel convolutional neural network to extract sleep signal features and a bidirectional gated recurrent unit to learn the internal temporal relationships between sleep stages. The raw single-channel EEG signal is first band-pass filtered and class-balanced with the synthetic minority oversampling technique, then fed into the constructed model for training and validation experiments, in which pre-training and fine-tuning are used to optimize the model, and 10-fold and 20-fold cross-validation is used to improve training reliability. Comparative experiments with different models on different datasets show that the 3CNN-BiGRU model achieves higher training efficiency and better staging accuracy.

Keywords: EEG signal  sleep staging  convolutional neural network  bidirectional gated recurrent unit  synthetic minority oversampling technique
Received: 2022-03-31
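
The paper itself does not include code; the following is a minimal sketch of how a three-scale parallel CNN feature extractor followed by a bidirectional GRU could be assembled in PyTorch. All kernel sizes, channel counts, the 30 s / 100 Hz epoch length, and the five-class output are illustrative assumptions, not the authors' exact configuration.

```python
# Illustrative sketch of a 3-scale parallel CNN + BiGRU sleep-staging model.
# Kernel sizes, channel counts and sequence handling are assumptions, not
# the configuration reported in the paper.
import torch
import torch.nn as nn

class ThreeScaleCNNBiGRU(nn.Module):
    def __init__(self, n_classes: int = 5, hidden: int = 64):
        super().__init__()
        # Three parallel 1-D convolution branches with small, medium and
        # large kernels to capture features at different time scales.
        def branch(kernel_size: int) -> nn.Sequential:
            return nn.Sequential(
                nn.Conv1d(1, 32, kernel_size, padding=kernel_size // 2),
                nn.BatchNorm1d(32),
                nn.ReLU(),
                nn.MaxPool1d(8),
            )
        self.branch_small = branch(kernel_size=7)
        self.branch_mid = branch(kernel_size=31)
        self.branch_large = branch(kernel_size=127)
        # Bidirectional GRU models temporal dependencies across the
        # concatenated multi-scale feature sequence.
        self.bigru = nn.GRU(input_size=96, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, samples) single-channel EEG epoch, e.g. 30 s at 100 Hz.
        feats = torch.cat(
            [self.branch_small(x), self.branch_mid(x), self.branch_large(x)],
            dim=1)                      # (batch, 96, time)
        feats = feats.transpose(1, 2)   # (batch, time, 96) for the GRU
        out, _ = self.bigru(feats)
        return self.classifier(out[:, -1, :])  # logits for the sleep stage

model = ThreeScaleCNNBiGRU()
logits = model(torch.randn(4, 1, 3000))  # 4 epochs of 30 s at 100 Hz
print(logits.shape)  # torch.Size([4, 5])
```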

Automatic Sleep Staging Based on 3CNN-BiGRU
TANG Jie, WEN Yuan-mei. Automatic Sleep Staging Based on 3CNN-BiGRU[J]. Computer and Modernization, 2022, 0(2): 120-126.
Authors: TANG Jie  WEN Yuan-mei
Affiliation:(College of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China)
Abstract: Aiming at the efficiency and accuracy of automatic sleep staging from single-channel EEG signals, this paper proposes a 3CNN-BiGRU automatic sleep staging model in which a three-scale parallel convolutional neural network extracts sleep signal features and a bidirectional gated recurrent unit learns the internal temporal relationship between sleep stages. First, the original single-channel EEG signal is band-pass filtered and class-balanced with the synthetic minority oversampling technique, and then sent to the built model for training and verification experiments. Pre-training and fine-tuning are used to optimize the model, and 10-fold and 20-fold cross-validation is used to improve training reliability. Experimental comparisons of different models on different datasets show that the 3CNN-BiGRU model achieves higher training efficiency and better staging accuracy.
Keywords:EEG signal  sleep staging  convolutional neural network  bidirectional gated recurrent unit  synthetic minority oversampling technique
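
As a complement, the preprocessing and validation steps named in the abstract (band-pass filtering, SMOTE class balancing, and k-fold cross-validation) could be prototyped as below. The 0.5-30 Hz pass band, 100 Hz sampling rate, fold count, and library choices (SciPy, imbalanced-learn, scikit-learn) are assumptions made for illustration, not details taken from the paper.

```python
# Illustrative preprocessing / validation pipeline for single-channel EEG
# epochs: band-pass filter, SMOTE class balancing, stratified k-fold CV.
# The 0.5-30 Hz band, 100 Hz sampling rate and fold count are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from imblearn.over_sampling import SMOTE
from sklearn.model_selection import StratifiedKFold

FS = 100              # assumed sampling rate (Hz)
EPOCH_LEN = 30 * FS   # 30 s epochs

def bandpass(epochs: np.ndarray, low=0.5, high=30.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter applied to each epoch."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

# X: (n_epochs, EPOCH_LEN) raw EEG epochs, y: (n_epochs,) stage labels 0..4
# (random placeholder data; replace with real epochs and labels)
rng = np.random.default_rng(0)
X = rng.standard_normal((200, EPOCH_LEN))
y = rng.integers(0, 5, size=200)

X = bandpass(X)

# 10-fold cross-validation; SMOTE is fitted on the training folds only,
# so synthetic oversampled epochs never leak into the validation fold.
skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for fold, (train_idx, val_idx) in enumerate(skf.split(X, y)):
    X_train, y_train = SMOTE(random_state=0).fit_resample(X[train_idx], y[train_idx])
    X_val, y_val = X[val_idx], y[val_idx]
    # ... train the 3CNN-BiGRU model on (X_train, y_train), evaluate on (X_val, y_val)
    print(f"fold {fold}: {len(y_train)} balanced training epochs")
```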