Similar Articles
20 similar articles found (search time: 9 ms)
1.
This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extrema extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding. The algorithm has been verified using eight of the most frequent normal and pathological types of cardiac beats and a multi-layer perceptron (MLP) neural network trained with original cardiac patterns and tested with reconstructed ones. The applicability of principal component analysis (PCA) to cardiac pattern classification has been investigated as well. A new compression measure called "quality score," which takes into account both the reconstruction errors and the compression ratio, is proposed.
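The abstract does not give the exact formula for the "quality score," so the definition below — compression ratio divided by percent RMS difference, assuming 16-bit samples — is an illustrative assumption, not the paper's metric:

```python
import numpy as np

def percent_rms_difference(original, reconstructed):
    # PRD: reconstruction error as a percentage of the signal's RMS energy
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                           / np.sum(original ** 2))

def quality_score(original, reconstructed, compressed_bits):
    # Hypothetical "quality score": compression ratio over PRD, higher is
    # better. The paper's actual definition may combine the terms differently.
    cr = (original.size * 16) / compressed_bits  # assumes 16-bit input samples
    prd = percent_rms_difference(original, reconstructed)
    return cr / prd

ecg = np.sin(np.linspace(0, 8 * np.pi, 1000))
noisy = ecg + 0.01 * np.random.default_rng(0).standard_normal(1000)
score = quality_score(ecg, noisy, compressed_bits=2000)
```

Any score of this shape rewards high compression ratio and penalizes reconstruction error simultaneously, which is the trade-off the abstract describes.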

2.
Ji, T.Y., Lu, Z., Wu, Q.H., Ji, Z. Electronics Letters, 2008, 44(2): 82-83
An approach to remove baseline wander from electrocardiogram (ECG) signals, based on empirical mode decomposition and mathematical morphology, is described.

3.
In this paper, we develop a general framework for a granular representation of ECG signals. The crux of the approach lies in the design and ongoing processing realized in the setting of information granules (fuzzy sets). They serve as basic, semantically meaningful conceptual entities with which we describe signals and build their models (such as various predictive schemes or classifiers). A comprehensive two-phase scheme for the design of the information granules is proposed and described. In the first phase, we discuss temporal granulation through a series of temporal windows (granular windows) and an aggregation of the signal values by means of fuzzy sets. To address this issue, we offer a detailed method of building a fuzzy set from numeric data and a certain optimization criterion that strikes a balance between the highest experimental relevance of the fuzzy set supported by the numeric data and its substantial specificity. In the second phase of the granular design, the collection of information granules is further summarized with the use of fuzzy clustering (Fuzzy C-Means). The resulting prototypes (centroids) formed by this grouping process serve as elements of the granular vocabulary. We discuss ways of using these vocabularies in the knowledge-based representation, modeling, and classification of ECG beats.

4.
A genetic segmentation of ECG signals

5.
Starting from the idea of information fusion, independent component analysis (ICA) is briefly introduced. Using Matlab as an auxiliary tool, the well-established FastICA fast algorithm is applied to give a concrete ICA-based procedure for separating speech signals, and the separation quality is analyzed. The method is then successfully applied to extract the characteristic fault-noise signal of a bearing. The results of these two examples show that the ICA method is effective for separating noise source signals and holds broad promise for application.
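The FastICA procedure referenced above can be sketched in plain NumPy on a hypothetical two-source mixture (synthetic signals and mixing matrix are illustrative, not the paper's speech or bearing data): center and whiten the observations, then run the symmetric fixed-point iteration with a tanh nonlinearity.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 4000)
s1 = np.sin(2 * np.pi * 5 * t)              # stand-in for a smooth source
s2 = np.sign(np.sin(2 * np.pi * 3 * t))     # impulsive, fault-noise-like source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.5, 1.0]])      # hypothetical mixing matrix
X = S @ A.T                                  # observed mixtures

# Centering and PCA-based whitening
X = X - X.mean(axis=0)
d, E = np.linalg.eigh(np.cov(X, rowvar=False))
Z = X @ E @ np.diag(1.0 / np.sqrt(d)) @ E.T

# Symmetric FastICA fixed-point iteration, g = tanh
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(Z @ W.T)
    W_new = (G.T @ Z) / len(Z) - np.diag((1.0 - G ** 2).mean(axis=0)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                               # symmetric decorrelation
Y = Z @ W.T                                  # recovered sources (up to sign/order)
```

The recovered components match the originals only up to permutation and sign, which is the usual ICA ambiguity.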

6.
A wavelet-based electrocardiogram (ECG) data compression algorithm is proposed in this paper. The ECG signal is first preprocessed, and the discrete wavelet transform (DWT) is then applied to the preprocessed signal. Preprocessing guarantees that the magnitudes of the wavelet coefficients are less than one and reduces the reconstruction errors near both ends of the compressed signal. The DWT coefficients are divided into three groups, and each group is thresholded using a threshold based on a desired energy packing efficiency. A binary significance map is then generated by scanning the wavelet decomposition coefficients and outputting a binary one if the scanned coefficient is significant and a binary zero if it is insignificant. Compression is achieved by 1) using a variable-length code based on run-length encoding to compress the significance map and 2) using direct binary representation for the significant coefficients. The ability of the coding algorithm to compress ECG signals was investigated by compressing and decompressing the test signals. The proposed algorithm was compared with direct and wavelet-based compression algorithms and showed superior performance. A compression ratio of 24:1 was achieved for MIT-BIH record 117 with a percent root-mean-square difference as low as 1.08%.
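A minimal sketch of the thresholding and significance-map steps described above, using a single-level Haar transform as a stand-in for the paper's full DWT (the energy-packing-efficiency value is an illustrative choice):

```python
import numpy as np

def haar_dwt(x):
    # One level of the Haar wavelet transform: (approximation, detail)
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def threshold_by_epe(c, epe):
    # Keep the largest-magnitude coefficients carrying `epe` of the energy
    order = np.argsort(np.abs(c))[::-1]
    cum = np.cumsum(c[order] ** 2) / np.sum(c ** 2)
    kept_idx = order[: np.searchsorted(cum, epe) + 1]
    out = np.zeros_like(c)
    out[kept_idx] = c[kept_idx]
    return out

def run_length_encode(bits):
    # (symbol, run length) pairs for the binary significance map
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((prev, count))
            count = 1
    runs.append((bits[-1], count))
    return runs

x = np.sin(np.linspace(0, 4 * np.pi, 256))
approx, detail = haar_dwt(x)
coeffs = np.concatenate([approx, detail])
kept = threshold_by_epe(coeffs, epe=0.999)
sig_map = (kept != 0).astype(int).tolist()
rle = run_length_encode(sig_map)
```

On smooth signals most of the energy sits in a contiguous block of coefficients, so the significance map compresses to a handful of runs — which is exactly why the paper's variable-length run code pays off.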

7.
A method is presented for representing signals made up of discrete component waves separated by iso-electric regions, such as electrocardiogram (ECG), respiratory and blood pressure signals. The computational complexity is minimised by treating the discrete cosine transform of a group of component waves as the sum of a finite number of biphasic rational functions.

8.
This paper presents hybrid approaches for human identification based on the electrocardiogram (ECG). The proposed approaches consist of four phases, namely data acquisition, preprocessing, feature extraction and classification. In the first phase, data acquisition, data sets are collected from two different databases, the ECG-ID and MIT-BIH Arrhythmia databases. In the second phase, noise reduction of the ECG signals is performed using the wavelet transform and a series of de-noising filters. In the third phase, features are obtained using three different intelligent approaches: a non-fiducial approach, a fiducial approach and a fusion of the two. In the last phase, classification, three classifiers are developed to classify subjects. The first classifier is based on an artificial neural network (ANN). The second is based on K-nearest neighbours (KNN), relying on Euclidean distance. The last classifier is a support vector machine (SVM). Classification accuracies of 95% for the ANN, 98% for the KNN and 99% for the SVM are obtained on the ECG-ID database, while 100% is obtained for the ANN, KNN and SVM on the MIT-BIH Arrhythmia database. The results show that the proposed approaches are robust and effective compared with other recent works.
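A Euclidean-distance KNN classifier of the kind used as the second classifier can be sketched in a few lines; the feature clusters and subject labels below are hypothetical toy data, not features extracted from ECG-ID:

```python
import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    # K-nearest-neighbour majority vote under Euclidean distance
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(train_y[nearest].tolist()).most_common(1)[0][0]

rng = np.random.default_rng(0)
# Two hypothetical subjects' 4-dimensional feature clusters
subj_a = rng.normal(loc=0.0, scale=0.3, size=(20, 4))
subj_b = rng.normal(loc=2.0, scale=0.3, size=(20, 4))
X = np.vstack([subj_a, subj_b])
y = np.array([0] * 20 + [1] * 20)
pred = knn_predict(X, y, query=np.full(4, 1.9), k=3)
```

With well-separated feature clusters the vote is unambiguous; the reported accuracies suggest the fiducial/non-fiducial features produce exactly that kind of separation per subject.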

9.
Recently, as recognizing emotion has become one of the hallmarks of affective computing, more attention has been paid to physiological signals for emotion recognition. This paper presents an approach to emotion recognition using electrocardiography (ECG) signals from multiple subjects. To collect reliable affective ECG data, we applied an arousal method based on movie clips to make subjects experience specific emotions without external interference. Through precise location of the P-QRS-T wave by continuous wavelet transform...

10.
The phenomenon of frequency ambiguity may appear in radar or communication systems. S. Barbarossa (1991) unwrapped the frequency ambiguity of single-component undersampled signals using the Wigner-Ville distribution (WVD), but until now there has been no effective algorithm for analyzing multicomponent undersampled signals. A new algorithm for analyzing multicomponent undersampled signals using the high-order ambiguity function (HAF) is proposed here. HAF analyzes polynomial phase signals by the method of phase rank reduction; its advantages are that it has no boundary effect and is not sensitive to the cross-terms of multicomponent signals. The simulation results prove the effectiveness of the HAF algorithm.
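The phase-rank-reduction idea behind the HAF can be sketched for a single linear-FM component: multiplying the signal by a conjugated, delayed copy of itself reduces the quadratic phase to a constant-frequency tone proportional to the chirp rate (all parameter values here are illustrative):

```python
import numpy as np

fs = 1000.0                        # sampling rate, Hz (illustrative)
t = np.arange(2048) / fs
a1, a2 = 30.0, 50.0                # start frequency (Hz) and chirp rate (Hz/s)
x = np.exp(2j * np.pi * (a1 * t + a2 * t ** 2))

# One step of phase rank reduction: second-order instantaneous moment
lag = 200                          # samples; tau = lag / fs = 0.2 s
y = x[lag:] * np.conj(x[:-lag])    # constant-frequency tone at 2 * a2 * tau

spec = np.abs(np.fft.fft(y))
freqs = np.fft.fftfreq(len(y), d=1.0 / fs)
f_peak = abs(freqs[np.argmax(spec)])
expected = 2 * a2 * lag / fs       # 20 Hz for these parameters
```

The FFT peak of the reduced-rank signal directly yields the chirp rate; iterating the same product reduces higher polynomial phase orders one degree at a time.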

11.
Based on the positional characteristics of the displacement sensors in a fast-steering mirror, the ICA method is used to de-noise the sensor signals. A drawback of ICA algorithms is that they require higher-order statistics; the resulting computational load can affect real-time performance, but high-performance computing hardware can compensate for this shortcoming. Simulation results show that ICA de-noising improves the signal-to-noise ratio by about 15 dB, and when the method is applied to the acquired sensor signals the SNR gain lies between 13 and 16 dB. The ICA method can therefore effectively raise the SNR of a signal and is of good practical value.

12.
This paper describes an improved morphological approach to remove baseline wander from neonatal electrocardiogram (ECG) signals, with particular emphasis on preserving the ST segment of the original signal. The algorithm consists of two stages of morphological processing. First, the QRS complex and the impulsive noise component due to skeletal muscle contractions, etc., are detected and removed from the input signal. Second, the corrected QT interval (QTc) and the RR interval are used to determine a structuring element. With this structuring element, the same morphological operation as in the first stage is applied to the QRS-removed signal to obtain and remove the baseline wander. The performance of the algorithm is evaluated on simulated and real ECGs. Compared with an existing morphological method, there is a substantial improvement, especially in reducing distortion of the baseline waveform within the PR and QT intervals.
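A hedged sketch of grey-scale morphological baseline estimation with a flat structuring element; the fixed SE length and the synthetic spike-plus-drift signal are illustrative choices, not the QTc/RR-adaptive structuring element the paper actually derives:

```python
import numpy as np

def erode(x, k):
    xp = np.pad(x, k // 2, mode="edge")
    return np.array([xp[i:i + k].min() for i in range(len(x))])

def dilate(x, k):
    xp = np.pad(x, k // 2, mode="edge")
    return np.array([xp[i:i + k].max() for i in range(len(x))])

def opening(x, k):
    return dilate(erode(x, k), k)

def closing(x, k):
    return erode(dilate(x, k), k)

fs = 250                                   # Hz, illustrative sampling rate
t = np.arange(4 * fs) / fs
baseline = 0.5 * np.sin(2 * np.pi * 0.3 * t)
ecg = np.zeros_like(t)
ecg[::fs] = 1.0                            # crude 60-bpm "QRS" spikes
signal = ecg + baseline

k = int(0.6 * fs)                          # SE much longer than the spike width
# Averaging open-close and close-open suppresses peaks of both polarities
est = (closing(opening(signal, k), k) + opening(closing(signal, k), k)) / 2
corrected = signal - est
```

Because the structuring element is longer than any QRS-like feature, the opening/closing pair flattens the beats while tracking the slow drift, so subtracting `est` removes the wander but leaves the spikes intact.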

13.
We present a compact approach for mitigating the presence of electrocardiograms (ECG) in surface electromyographic (EMG) signals by means of time-variant harmonic modeling of the cardiac artifact. Heart rate and QRS complex variability, which often account for amplitude and frequency time variations of the ECG, are simultaneously captured by a set of third-order constant-coefficient polynomials modulating a stationary harmonic basis in the analysis window. Such a characterization allows us to significantly suppress ECG from the mixture by preserving most of the EMG signal content at low frequencies (less than 20 Hz). Moreover, the resulting model is linear in parameters and the least-squares solution to the corresponding linear system of equations efficiently provides model parameter estimates. The comparative results suggest that the proposed method outperforms two reference methods in terms of the EMG preservation at low frequencies.
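The linear-in-parameters structure described above can be sketched directly: build a design matrix of heart-rate harmonics modulated by polynomial terms up to order three, solve by least squares, and subtract the fitted artifact (the simulated signals, fundamental frequency and harmonic count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
fs, dur = 1000.0, 2.0                        # Hz, seconds (illustrative)
t = np.arange(int(fs * dur)) / fs
f0 = 1.2                                     # assumed cardiac fundamental, Hz

# Simulated mixture: broadband "EMG" plus a harmonic "ECG" artifact whose
# amplitude drifts slowly over the analysis window
emg = 0.2 * rng.standard_normal(t.size)
artifact = sum((1 + 0.3 * t) * np.cos(2 * np.pi * h * f0 * t + 0.5 * h)
               for h in range(1, 6))
x = emg + artifact

# Design matrix: cos/sin of each harmonic times polynomial terms t^0..t^3
cols = []
for h in range(1, 6):
    for p in range(4):
        cols.append(t ** p * np.cos(2 * np.pi * h * f0 * t))
        cols.append(t ** p * np.sin(2 * np.pi * h * f0 * t))
M = np.column_stack(cols)

theta, *_ = np.linalg.lstsq(M, x, rcond=None)
cleaned = x - M @ theta                      # ECG-suppressed EMG estimate
```

Since the model spans only a low-dimensional subspace, the projection removes the structured cardiac artifact while absorbing very little of the broadband EMG — the property the abstract emphasizes below 20 Hz.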

14.
The author proposed an effective wavelet-based ECG compression algorithm (Rajoub, 2002). The reported extraordinary performance motivated us to explore the findings and to use them in our research activity. During the implementation of the proposed algorithm, several important points regarding accuracy, methodology, and coding were found to be improperly substantiated. This paper discusses these findings and provides specific subjective and objective measures that could improve the interpretation of compression results in these research-type problems.

15.
ECG compression using long-term prediction
A new algorithm for ECG signal compression is introduced. The compression system is based on the subautoregression (SAR) model, also known as the long-term prediction (LTP) model. The periodicity of the ECG signal is exploited to further reduce redundancy, yielding high compression ratios. The suggested algorithm was evaluated using an in-house database. Very low bit rates, on the order of 70 b/s, are achieved with a relatively low reconstruction error (percent RMS difference, PRD) of less than 10%. The algorithm was compared, using the same database, with the conventional linear prediction (short-term prediction, STP) method and was found superior at any bit rate. The suggested algorithm can be considered a generalization of the recently published average beat subtraction method.
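The core long-term-prediction idea — predicting each sample from one ECG period earlier so that only a small residual needs coding — can be sketched with a single-tap predictor on a synthetic periodic signal (the period is assumed known here; the paper's SAR model estimates it and uses more taps):

```python
import numpy as np

rng = np.random.default_rng(2)
period = 250                                  # samples per beat (illustrative)
beat = np.exp(-0.5 * ((np.arange(period) - 60) / 6.0) ** 2)  # QRS-like pulse
x = np.tile(beat, 8) + 0.01 * rng.standard_normal(8 * period)

# Single-tap long-term predictor: x_hat[n] = b * x[n - period]
past, cur = x[:-period], x[period:]
b = past @ cur / (past @ past)                # least-squares predictor gain
residual = cur - b * past                     # what actually gets coded

signal_power = np.mean(cur ** 2)
residual_power = np.mean(residual ** 2)
```

For a strongly periodic signal the residual power is a small fraction of the signal power, which is where the high compression ratios come from.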

16.
Broadband high-density WDM transmission using superluminescent diodes
Fibre-loop architectures employing high-density WDM for passive routing have been demonstrated using both DFB-laser and LED transmitters. The latter approach is known as 'spectrum slicing'. The first demonstration of broadband spectrally sliced transmission is reported: superluminescent diodes were used to transmit 150 Mbit/s on each of 10 WDM channels, or 50 Mbit/s on each of 16 WDM channels.

17.
ECG beat detection using filter banks
The authors have designed a multirate digital signal processing algorithm to detect heartbeats in the electrocardiogram (ECG). The algorithm incorporates a filter bank (FB) which decomposes the ECG into subbands with uniform frequency bandwidths. The FB-based algorithm enables independent time and frequency analysis to be performed on a signal. Features computed from a set of the subbands and a heuristic detection strategy are used to fuse decisions from multiple one-channel beat detection algorithms. The overall beat detection algorithm has a sensitivity of 99.59% and a positive predictivity of 99.56% against the MIT/BIH database. Furthermore, this is a real-time algorithm, since its beat detection latency is minimal. The FB-based beat detection algorithm also inherently lends itself to a computationally efficient structure, since the detection logic operates at the subband rate. The FB-based structure is potentially useful for performing multiple ECG processing tasks using one set of preprocessing filters.

18.
ECG beat classification using GreyART network
The grey relational grade is a similarity measure. On the basis of the grey relational grade, an adaptive resonance theory (ART) type network, GreyART, has been developed. When GreyART is used to classify a dataset with a varying amount of data, the measurement between two specific data points may vary, since the measurement is affected by newly added data. In this case, the grey relational grade is not a global measure. As the measurement varies, it is hard for GreyART to use a fixed vigilance threshold value for determining whether the current input data belong to one of the existing clusters or become the template of a new online-created cluster. A method to solve this problem has been proposed and then applied to develop an electrocardiogram (ECG) beat classifier. The proposed ECG beat classification involves two phases. One is the off-line learning phase: with the proposed performance index, the product of the classification accuracy and the partition quality, an optimal value of the vigilance threshold and the corresponding cluster centres can be determined from the learning results. The other is the online examining phase, which classifies the input ECG beats; in this phase, the vigilance threshold value and the initial cluster centres are the optimal ones obtained in the learning phase. Under these conditions, the GreyART network enables real-time classification of ECG beats. Simulation results show that the proposed network achieves good accuracy with good computational efficiency for ECG beat classification problems.
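The grey relational grade can be computed directly; a minimal sketch with Deng's distinguishing coefficient ρ = 0.5 (a conventional default, and the toy sequences are hypothetical). Note that Δmin and Δmax are taken over all comparison sequences, which is exactly why the measure shifts as new data are added:

```python
import numpy as np

def grey_relational_grades(reference, comparisons, rho=0.5):
    # Deng's grey relational grade of each comparison sequence w.r.t. the
    # reference. Delta_min / Delta_max are global over ALL comparisons, so
    # the grades change when new sequences join the dataset.
    ref = np.asarray(reference, dtype=float)
    deltas = np.abs(np.asarray(comparisons, dtype=float) - ref)
    d_min, d_max = deltas.min(), deltas.max()
    coeff = (d_min + rho * d_max) / (deltas + rho * d_max)
    return coeff.mean(axis=1)

ref = np.array([0.2, 0.5, 0.8, 0.4, 0.1])
close = ref + np.array([0.01, 0.02, 0.0, 0.01, 0.03])
far = np.array([0.9, 0.1, 0.2, 0.8, 0.7])
grades = grey_relational_grades(ref, [close, far])
```

A sequence resembling the reference scores near 1, a dissimilar one much lower — the quantity GreyART compares against its vigilance threshold.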

19.
A morphology-based method for correcting baseline wander in ECG signals
Baseline wander is the main noise component in the electrocardiogram (ECG), and correcting it is an important step in ECG preprocessing. Current methods for correcting baseline wander in ECG signals include wavelet transforms, curve fitting, and FIR and IIR filtering, but heavy computation, real-time performance that depends on signal length, and fixed cut-off frequencies make such filtering inconvenient. A mathematical morphological filter is proposed to correct baseline wander, with its parameters designed according to the geometric characteristics of the ECG signal. Experiments verify that the method effectively corrects baseline wander while preserving the signal's features well.

20.
This paper presents an effective and efficient preprocessing algorithm for two-dimensional (2-D) electrocardiogram (ECG) compression to better compress irregular ECG signals by exploiting their inter- and intra-beat correlations. To better reveal the correlation structure, we first convert the ECG signal into a proper 2-D representation, or image. This involves a few steps including QRS detection and alignment, period sorting, and length equalization. The resulting 2-D ECG representation is then ready to be compressed by an appropriate image compression algorithm. We choose the state-of-the-art JPEG2000 for its high efficiency and flexibility. In this way, the proposed algorithm is shown to outperform several existing methods in the literature by simultaneously achieving high compression ratio (CR), low percent root mean squared difference (PRD), low maximum error (MaxErr), and low standard deviation of errors (StdErr). In particular, because the proposed period sorting method rearranges the detected heartbeats into a smoother image that is easier to compress, this algorithm is insensitive to irregular ECG periods. Thus either the irregular ECG signals or the QRS false-detection cases can be better compressed. This is a significant improvement over existing 2-D ECG compression methods. Moreover, this algorithm is not tied exclusively to JPEG2000. It can also be combined with other 2-D preprocessing methods or appropriate codecs to enhance the compression performance in irregular ECG cases.
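The 1-D-to-2-D conversion steps can be sketched as: cut the signal at (assumed already detected) QRS locations, sort the beats by period length, and pad them to equal length before handing the matrix to an image codec. QRS detection itself is omitted, and the signal and beat boundaries below are hypothetical:

```python
import numpy as np

def ecg_to_image(signal, qrs_locs):
    # Cut the signal into beats at the detected QRS sample indices
    beats = [signal[a:b] for a, b in zip(qrs_locs[:-1], qrs_locs[1:])]
    # Period sorting: beats of similar length land in adjacent rows,
    # which smooths the resulting image
    beats.sort(key=len)
    # Length equalization: pad each row with its own last value
    width = max(len(b) for b in beats)
    rows = [np.pad(b, (0, width - len(b)), mode="edge") for b in beats]
    return np.vstack(rows)            # 2-D array, ready for an image codec

rng = np.random.default_rng(3)
sig = rng.standard_normal(2000)
locs = [0, 240, 500, 745, 1010, 1255, 1520, 1760, 2000]  # hypothetical QRS indices
img = ecg_to_image(sig, locs)
```

The resulting matrix exposes inter-beat correlation column-wise and intra-beat correlation row-wise, which is what a 2-D codec such as JPEG2000 can exploit.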
