Similar Articles
20 similar articles found (search time: 31 ms)
1.
Compression of the ECG by Prediction or Interpolation and Entropy Encoding   (Cited by: 2, of which 0 self-citations)
Compression of digital electrocardiogram (ECG) signals is desirable for two reasons: economic use of storage space for databases and reduction of the data transmission rate for compatibility with telephone lines. In a sample of 220 Frank-lead ECG's the removal of signal redundancy by second-order prediction or interpolation with subsequent entropy encoding of the respective residual errors was investigated. At the sampling rate of 200 Hz, interpolation provided a 6 dB smaller residual error variance than prediction. A near-optimal value for the interpolation coefficients is 0.5, permitting simple implementation of the algorithm and requiring a word length for arithmetic processing of only 2 bits in excess of the signal precision. For linear prediction, the effects of occasional transmission errors decay exponentially, whereas for interpolation they do not, necessitating error control in certain applications. Encoding of the interpolation errors by a Huffman code truncated to ±5 quantization levels of 30 µV required an average word length of 2.21 bits/sample (upper 96th percentile 3 bits/sample), resulting in data transmission rates of 1327 bits/s (1800 bits/s) for three simultaneous leads sampled at the rate of 200 Hz. Thus, compared with the original signal of 8 bit samples at 500 Hz, the average compression is 9:1. Encoding of the prediction errors required an average word length of 2.67 bits/sample with a 96th percentile of 5.5 bits/sample, making this method less suitable for synchronous transmission.

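The interpolation scheme in item 1 can be sketched in a few lines: each interior sample is predicted as the average of its two neighbours (the near-optimal coefficient 0.5), and only the residual is kept for entropy encoding. A minimal illustration on a hypothetical toy waveform, not the original 220-record Frank-lead data set:

```python
def interpolation_residuals(samples):
    """Residual e[n] = x[n] - 0.5 * (x[n-1] + x[n+1]) for interior samples.

    On smooth signal segments the residuals are small and concentrated
    around zero, which is what makes the subsequent Huffman stage effective.
    """
    return [samples[n] - 0.5 * (samples[n - 1] + samples[n + 1])
            for n in range(1, len(samples) - 1)]

ecg = [0, 2, 4, 5, 4, 2, 0]            # toy waveform, not real ECG data
print(interpolation_residuals(ecg))    # residuals stay near zero
```

Unlike one-sided prediction, this predictor uses the *future* neighbour, which is why a single transmission error does not decay on its own, as the abstract notes.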
2.
This comment seeks to point out that the ground-free ECG recording with a two-electrode scheme proposed by Thakor and Webster1 appears to be useful only in certain applications. Many biotelemetry devices work successfully with two-electrode single-ended input amplifiers since their ground-referenced capacitance is so small that 60 Hz line-coupled displacement currents do not cause interference.

3.
Combining the classical lossless predictive coding method with the idea of variable-length coding, an adaptive predictive grouped-coding method is proposed. Adaptive predictive coding effectively decorrelates images with different texture characteristics; lossless compression is then achieved through grouped coding that exploits the distribution of the image data. Experimental results show that the method achieves a high compression ratio and high compression efficiency.

4.
5.
In an earlier paper1 an adaptive filtering algorithm was proposed for the elimination of 60 Hz interference in the ECG. It is shown here that this algorithm is not truly adaptive, but is approximately equivalent to a fixed 60 Hz notch filter.

6.
A fractal image compression scheme based on hybrid coding is proposed, improving both fractal coding and the SPIHT algorithm: after a lifting wavelet transform, the lowest-frequency subband is compressed with the improved fractal coding and the remaining subbands with the improved SPIHT algorithm. Experimental results show that the method shortens image compression time while markedly reducing the blocking artifacts of fractal-reconstructed images.

7.
The author proposed an effective wavelet-based ECG compression algorithm (Rajoub, 2002). The reported extraordinary performance motivated us to explore the findings and use the algorithm in our own research. During implementation, several important points regarding accuracy, methodology, and coding were found to be improperly substantiated. This paper discusses these findings and provides specific subjective and objective measures that could improve the interpretation of compression results in research problems of this type.

8.
This paper reports the effect of applying delta encoding and Huffman coding together to American-English and Hindi speech signals from the International Phonetic Alphabet database. The speech signals are first delta encoded and then compressed by Huffman coding. Huffman coding achieves a higher compression ratio on the delta-encoded speech signals than on the input speech signals alone.

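The delta-plus-Huffman pipeline of item 8 is easy to sketch. The code below derives Huffman code lengths from symbol frequencies and compares the average bits/sample before and after delta encoding; `signal` is a hypothetical toy ramp, not data from the IPA database:

```python
import heapq
from collections import Counter

def huffman_lengths(symbols):
    """Map each symbol to its Huffman code length (bits), built bottom-up."""
    freq = Counter(symbols)
    if len(freq) == 1:                       # degenerate one-symbol alphabet
        return {next(iter(freq)): 1}
    # Heap entries: (frequency, unique tiebreaker, {symbol: depth-so-far}).
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        fa, _, a = heapq.heappop(heap)
        fb, _, b = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**a, **b}.items()}  # one level deeper
        count += 1
        heapq.heappush(heap, (fa + fb, count, merged))
    return heap[0][2]

def avg_bits(symbols):
    lengths = huffman_lengths(symbols)
    return sum(lengths[s] for s in symbols) / len(symbols)

signal = [10, 12, 14, 16, 18, 20, 22, 24]               # toy ramp "speech"
delta = [signal[0]] + [b - a for a, b in zip(signal, signal[1:])]
# Delta encoding collapses the ramp to mostly one symbol, so Huffman wins.
print(avg_bits(delta), avg_bits(signal))
```

The gain comes entirely from the skewed symbol distribution after differencing; on signals whose raw histogram is already concentrated, the advantage shrinks.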
9.
We implemented a method for compression of the ambulatory ECG that includes average beat subtraction and first differencing of residual data. Our previous investigations indicated that this method is superior to other compression methods with respect to data rate as a function of mean-squared-error distortion. Based on previous results we selected a sample rate of 100 samples per second and a quantization step size of 35 µV. These selections allow storage of 24 h of two-channel ECG data in 4 Mbytes of memory with a minimum rms distortion. For this sample rate and quantization level, we show that estimation of beat location and quantizer location can significantly affect compression performance. Improved compression resulted when beats were located with a temporal resolution of 5 ms and coarse quantization was performed in the compression loop. For the 24-h MIT/BIH arrhythmia database our compression algorithm coded a single channel of ECG data with an average data rate of 174 bits per second.

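Item 9's front end, average beat subtraction followed by first differencing of the residual, can be illustrated with a toy beat matrix (hypothetical values and a fixed beat length; the actual method additionally handles beat alignment at 5 ms resolution and in-loop quantization):

```python
import numpy as np

# One row per heartbeat; columns are samples within the beat (toy values).
beats = np.array([[0, 5, 20, 5, 0],
                  [0, 6, 21, 5, 0],
                  [0, 5, 19, 4, 0]], dtype=float)

average_beat = beats.mean(axis=0)
residual = beats - average_beat            # small values centered on zero
diffed = np.diff(residual, axis=1, prepend=0.0)   # first differencing

# The decoder reverses both steps exactly (lossless before quantization):
restored = np.cumsum(diffed, axis=1) + average_beat
print(np.allclose(restored, beats))
```

Both stages shrink the dynamic range of what must be entropy-coded: subtraction removes the repetitive beat shape, differencing removes what slow trend remains.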
10.
High-frame-rate cameras must record image information about a target's state over long periods; because the storage capacity of the recording medium is limited, the images must be compressed, recorded, and transmitted in real time. The EZW and SPIHT coding methods are slow, difficult to implement on an FPGA, and demand large buffers. Exploiting the redundancy and distribution characteristics of the coefficients after the wavelet transform, a new coding algorithm is proposed: coefficient magnitudes are compared directly, and the wavelet coefficients are then grouped dynamically according to their numerical and distribution characteristics, so that the image coefficient information is recorded in less storage space. The algorithm is well suited to FPGA implementation, and experimental results verify its feasibility.

11.
罗瑜  唐博 《电子学报》2018,46(4):969-974
To further improve the compression performance of lossless reference-frame compression, this paper proposes a frame-memory lossless compression algorithm based on texture-direction prediction and run-length Golomb coding. The algorithm first uses dual scanning and adaptive prediction to select, along the texture direction, the optimal reference pixel for each pixel and computes the prediction residual; the residuals are then entropy-coded with a hybrid run-length/Golomb code, improving the compression performance of lossless reference-frame compression. Results show that, compared with an intra-prediction Golomb coding algorithm, the proposed algorithm improves the average compression ratio by 16% while also reducing the average coding time.

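Item 11 entropy-codes its prediction residuals with a hybrid run-length/Golomb code. As a minimal sketch, the Golomb-Rice variant (the common power-of-two Golomb code; the paper's exact hybrid is not specified here) with a zigzag map for signed residuals:

```python
def zigzag(v):
    """Map signed residuals to non-negative ints: 0,-1,1,-2,2 -> 0,1,2,3,4."""
    return 2 * v if v >= 0 else -2 * v - 1

def rice_encode(v, k=1):
    """Golomb-Rice code: unary quotient, '0' terminator, k-bit remainder."""
    u = zigzag(v)
    q, r = u >> k, u & ((1 << k) - 1)
    return '1' * q + '0' + format(r, f'0{k}b')

residuals = [0, 1, -1, 0, 2]      # toy residual stream, not real frame data
bitstream = ''.join(rice_encode(v) for v in residuals)
print(bitstream)
```

Small residuals, which dominate after a good texture-direction predictor, get the shortest codewords; the parameter `k` trades codeword growth against the cost of large outliers.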
12.
13.
Image Magnification and Compression Coding by Fractal Interpolation   (Cited by: 2, of which 0 self-citations)
This paper discusses the random fractal interpolation method and its application to image magnification and image compression coding. Experimental results show that fractal interpolation yields good results for both image magnification and compression coding.

14.
Nonlinear Analysis of ECG Signals Based on Permutation Entropy   (Cited by: 2, of which 0 self-citations)
Permutation entropy is a nonlinear dynamical parameter based on complexity measures that reflects the characteristics of a system quickly and effectively. It has so far seen little application in ECG signal analysis. The algorithm is studied here and applied to the detection of ventricular disorders: using the MIT-BIH database, permutation entropy is computed over the onset periods of ventricular ECG episodes. Experiments show that the entropy value responds sensitively to episode onset, with high detection accuracy for ventricular fibrillation, ventricular tachycardia, and similar conditions. Permutation entropy is therefore a sensitive parameter for early detection of these disorders and can serve as an effective basis for clinical diagnosis.

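The permutation entropy used in item 14 is straightforward to compute: count the ordinal patterns of overlapping windows and take the normalized Shannon entropy of their distribution. A minimal sketch on hypothetical toy sequences (not MIT-BIH data):

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe): Shannon entropy of the
    ordinal-pattern distribution over overlapping windows, scaled to [0, 1]."""
    patterns = Counter()
    for i in range(len(series) - (order - 1) * delay):
        window = series[i:i + order * delay:delay]
        # Ordinal pattern: index permutation that sorts the window.
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    return h / math.log2(math.factorial(order))     # normalize by log2(order!)

regular = [0, 1, 2, 3, 4, 5, 6, 7]      # monotone: a single ordinal pattern
irregular = [4, 1, 7, 2, 6, 0, 5, 3]    # shuffled: many ordinal patterns
print(permutation_entropy(regular), permutation_entropy(irregular))
```

Low values indicate regular dynamics; values near 1 indicate disorder, which is the property the abstract exploits to flag fibrillation and tachycardia onsets.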
15.
16.
17.
18.
雷婷  史承兴 《电子科技》2010,23(10):107-109
Through adjustment of the program structure, optimization of the coding structure, and assembly-level code optimization, an efficient DSP implementation of the encoder is achieved. Experimental results show that the optimized encoder reduces computational complexity and improves the real-time performance of the CCSDS image compression algorithm.

19.
黄博强  陈建华  汪源源 《电子学报》2008,36(9):1810-1813
A two-dimensional ECG compression scheme based on context modeling is proposed. R-wave features are identified by modulus-maxima detection and cyclic matching, an ECG image is constructed automatically, and a coding data map is built from the cardiac-cycle information. The ECG image then undergoes a one-dimensional discrete wavelet transform and uniform quantization with a dead zone; the quantized coefficients are decomposed into a significance map, a sign stream, a most-significant-bit position stream, and a residual bit stream, and context-based adaptive arithmetic coding is finally applied together with the coding data map. Compression experiments on two data sets from the MIT-BIH arrhythmia database show that at a compression ratio of 20 the new scheme's percentage root-mean-square difference is 2.93% and 4.31% respectively, below the 3.26% and 4.8% of a JPEG2000-based scheme. The results indicate that the new scheme outperforms other ECG compression algorithms.

20.
陈军波  陈亚光 《电视技术》2004,(4):69-71,74
The core algorithms of JPEG2000 are discussed, a hardware design for an image compression coding system is given, and the compressed code stream is organized according to the JPEG2000 standard on a DSP system. Experimental results show that medical images compressed according to the JPEG2000 standard achieve good compression quality and meet the specific requirements of medical image processing.

