Similar Articles
20 similar articles found (search time: 15 ms)
1.
Among algorithms based on the wavelet transform and zerotree quantization, Said and Pearlman's set partitioning in hierarchical trees (SPIHT) algorithm is well known for its simplicity and efficiency. However, SPIHT's high memory requirement is a major obstacle to hardware implementation. In this study, we present a modification of SPIHT, named modified SPIHT (MSPIHT), which requires less execution time at low bit rates and less working memory than SPIHT. The MSPIHT coding algorithm uses a single list to store the coordinates of wavelet coefficients instead of SPIHT's three lists, defines two notions (the number of error bits and the absolute zerotree), and merges the sorting pass and the refinement pass into a single scan pass. Comparisons of MSPIHT with SPIHT on different test images show that MSPIHT reduces execution time by up to a factor of 7 when coding a 512 × 512 grayscale image, and by up to a factor of 11 at low bit rates; it saves at least 0.5625 MB of memory, at the cost of only minor reductions in peak signal-to-noise ratio (PSNR). These properties make it highly promising for real-time and memory-limited mobile communications. Published in Russian in Radiotekhnika i Elektronika, 2008, Vol. 53, No. 6, pp. 676–685. The text was submitted by the authors in English.
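The core machinery shared by SPIHT-style coders is bit-plane coding of wavelet coefficients, alternating a sorting (significance) pass with a refinement pass. The following is a minimal illustrative sketch of that idea only; real SPIHT additionally exploits zerotrees across subbands, and all names here are hypothetical.

```python
# Minimal sketch of bit-plane significance coding, the mechanism behind
# SPIHT-style embedded coders (illustrative only; real SPIHT also codes
# zerotrees across subbands). All names are hypothetical.

def significance_passes(coeffs, num_planes=4):
    """Emit sorting and refinement bits per bit-plane, MSB first."""
    max_val = max(abs(c) for c in coeffs)
    # Start threshold at the highest power of two not exceeding max |coeff|.
    threshold = 1
    while threshold * 2 <= max_val:
        threshold *= 2
    significant = [False] * len(coeffs)
    bitstream = []
    for _ in range(num_planes):
        for i, c in enumerate(coeffs):
            if not significant[i]:
                # Sorting pass: flag coefficients that become significant now.
                if abs(c) >= threshold:
                    significant[i] = True
                    bitstream.append(1)
                    bitstream.append(0 if c >= 0 else 1)  # sign bit
                else:
                    bitstream.append(0)
            else:
                # Refinement pass: next magnitude bit of a known-significant coeff.
                bitstream.append((abs(c) // threshold) & 1)
        threshold //= 2
        if threshold == 0:
            break
    return bitstream

bits = significance_passes([34, -20, 3, -1])
```

Truncating `bits` at any point still yields a decodable, progressively refined approximation, which is what makes such streams attractive for low-bit-rate transmission.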

2.
An Improved 3-D Set Partitioning in Hierarchical Trees Video Coding Method in the Wavelet Domain   Cited by: 5 (self-citations: 0, others: 5)
This paper studies the three-dimensional set partitioning in hierarchical trees (3D-SPIHT) method for compressing video images in the wavelet domain and proposes an improved 3D-SPIHT method (I3D-SPIHT) that overcomes limitations of 3D-SPIHT. Analysis and simulation experiments show that the improved algorithm essentially matches the coding performance of 3D-SPIHT while reducing the coding delay, and achieves better coding results when the coding delay is held equal.

3.
A wavelet-based electrocardiogram (ECG) data compression algorithm is proposed in this paper. The ECG signal is first preprocessed, and the discrete wavelet transform (DWT) is then applied to the preprocessed signal. Preprocessing guarantees that the magnitudes of the wavelet coefficients are less than one and reduces the reconstruction errors near both ends of the compressed signal. The DWT coefficients are divided into three groups, and each group is thresholded using a threshold chosen to meet a desired energy packing efficiency. A binary significance map is then generated by scanning the wavelet decomposition coefficients and outputting a binary one if the scanned coefficient is significant and a binary zero if it is insignificant. Compression is achieved by 1) compressing the significance map with a variable-length code based on run-length encoding and 2) representing the significant coefficients in direct binary form. The ability of the coding algorithm to compress ECG signals was investigated by compressing and decompressing test signals. The proposed algorithm was compared with direct and wavelet-based compression algorithms and showed superior performance. A compression ratio of 24:1 was achieved for MIT-BIH record 117 with a percent root-mean-square difference as low as 1.08%.
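The two coding steps described above can be sketched as follows. This is a hedged simplification: the grouping into three coefficient sets is skipped, the energy-packing-efficiency threshold search is done by a simple greedy scan, and all names are hypothetical.

```python
# Sketch of thresholding by energy packing efficiency plus run-length
# encoding of the resulting binary significance map. Simplified relative
# to the paper (no three-group split); names are hypothetical.

def threshold_by_epe(coeffs, epe=0.99):
    """Smallest magnitude threshold that keeps `epe` of the total energy."""
    ordered = sorted(coeffs, key=abs, reverse=True)
    total = sum(c * c for c in coeffs)
    kept, cut = 0.0, 0.0
    for c in ordered:
        kept += c * c
        cut = abs(c)
        if kept >= epe * total:
            break
    return cut

def encode(coeffs, epe=0.99):
    t = threshold_by_epe(coeffs, epe)
    sig_map = [1 if abs(c) >= t else 0 for c in coeffs]
    # Run-length code of the significance map: (bit value, run length) pairs.
    runs = []
    run, prev = 0, sig_map[0]
    for b in sig_map:
        if b == prev:
            run += 1
        else:
            runs.append((prev, run))
            run, prev = 1, b
    runs.append((prev, run))
    # Significant coefficients are kept for direct binary representation.
    significant = [c for c in coeffs if abs(c) >= t]
    return runs, significant

runs, significant = encode([9.0, 0.1, 0.2, 8.0, 0.05, 7.5])
```

Because wavelet coefficients of ECG are sparse, the significance map contains long zero runs, which is exactly what run-length coding compresses well.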

4.
This paper presents an effective and efficient preprocessing algorithm for two-dimensional (2-D) electrocardiogram (ECG) compression that better compresses irregular ECG signals by exploiting their inter- and intra-beat correlations. To better reveal the correlation structure, we first convert the ECG signal into a proper 2-D representation, or image. This involves several steps, including QRS detection and alignment, period sorting, and length equalization. The resulting 2-D ECG representation is then ready to be compressed by an appropriate image compression algorithm. We choose the state-of-the-art JPEG2000 for its high efficiency and flexibility. The proposed algorithm is shown to outperform several existing methods in the literature by simultaneously achieving high compression ratio (CR), low percent root-mean-squared difference (PRD), low maximum error (MaxErr), and low standard deviation of errors (StdErr). In particular, because the proposed period sorting method rearranges the detected heartbeats into a smoother image that is easier to compress, the algorithm is insensitive to irregular ECG periods. Thus both irregular ECG signals and QRS false-detection cases can be better compressed. This is a significant improvement over existing 2-D ECG compression methods. Moreover, the algorithm is not tied exclusively to JPEG2000; it can also be combined with other 2-D preprocessing methods or appropriate codecs to enhance compression performance in irregular ECG cases.
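The 1-D-to-2-D conversion chain described above can be sketched in a few lines. This is an illustrative simplification: QRS detection itself is assumed and replaced by given fiducial indices, and padding with the last sample stands in for the paper's length equalization; all names are hypothetical.

```python
# Illustrative sketch of turning a 1-D ECG into a 2-D "image": cut at
# detected QRS locations, sort periods by length, and pad rows to equal
# length. QRS detection is assumed done; names are hypothetical.

def ecg_to_image(signal, qrs_indices):
    # Cut the signal into beats between consecutive fiducial points.
    beats = [signal[a:b] for a, b in zip(qrs_indices, qrs_indices[1:])]
    beats.sort(key=len)                  # period sorting -> smoother image
    width = len(beats[-1])
    # Length equalization: pad each row with its last sample.
    return [beat + [beat[-1]] * (width - len(beat)) for beat in beats]

img = ecg_to_image([0, 5, 1, 0, 6, 1, 0, 0, 7, 2, 0], [1, 4, 8])
```

Aligning beats row-by-row turns inter-beat correlation into vertical correlation in the image, which a 2-D codec such as JPEG2000 can then exploit.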

5.
Image compression using binary space partitioning trees   Cited by: 1 (self-citations: 0, others: 1)
For low bit-rate compression applications, segmentation-based coding methods generally provide high compression ratios compared with traditional (e.g., transform and subband) coding approaches. In this paper, we present a new segmentation-based image coding method that divides the desired image using binary space partitioning (BSP). The BSP approach partitions the desired image recursively by arbitrarily oriented lines in a hierarchical manner. This recursive partitioning generates a binary tree, which is referred to as the BSP-tree representation of the desired image. The most critical aspect of the BSP-tree method is the criterion used to select the partitioning lines of the BSP-tree representation. In previous works, we developed novel methods for selecting the BSP-tree lines and showed that the BSP approach provides efficient segmentation of images. In this paper, we describe a hierarchical approach for coding the partitioning lines of the BSP-tree representation. We also show that the image signal within the different regions (resulting from the recursive partitioning) can be represented using low-order polynomials. Furthermore, we employ an optimal pruning algorithm to minimize the distortion of the BSP-tree representation under a given bit-rate budget. Simulation results and comparisons with other compression methods are also presented.

6.
Wavelet packet-based compression of single lead ECG   Cited by: 5 (self-citations: 0, others: 5)
A preliminary investigation of a wavelet packet-based algorithm for the compression of single-lead ECG is presented. The algorithm combines the efficiency and flexibility of wavelet packet expansions with the methodology of the Karhunen-Loeve transform (KLT). For selected records from the MIT-BIH arrhythmia database, an average data rate of 184.7 bits per second, corresponding to a compression ratio of 21.4:1, is achieved. When compared with the KLT applied to the same data, the wavelet packet algorithm generates significantly lower data rates with less than one-third the computational effort.

7.
This paper presents a listless implementation of the wavelet-based block tree coding (WBTC) algorithm for varying root block sizes. The WBTC algorithm improves the image compression performance of set partitioning in hierarchical trees (SPIHT) at lower rates by efficiently encoding both inter- and intra-scale correlation using block trees. Although WBTC lowers the memory requirement relative to SPIHT by using block trees, it still makes use of three ordered auxiliary lists. This feature makes WBTC undesirable for hardware implementation, as it requires substantial memory management when the list nodes grow exponentially on each pass. The proposed listless implementation of WBTC uses special markers instead of lists, reducing the dynamic memory requirement by 88% with respect to WBTC and 89% with respect to SPIHT. The proposed algorithm is combined with the discrete cosine transform (DCT) and discrete wavelet transform (DWT) to show its superiority over DCT- and DWT-based embedded coders, including JPEG 2000, at lower rates. The compression performance on most standard test images is nearly the same as WBTC's and outperforms SPIHT by a wide margin, particularly at lower bit rates.

8.
The author proposed an effective wavelet-based ECG compression algorithm (Rajoub, 2002). The reported extraordinary performance motivated us to explore the findings and to use them in our research activity. During implementation of the proposed algorithm, several important points regarding accuracy, methodology, and coding were found to be improperly substantiated. This paper discusses these findings and provides specific subjective and objective measures that could improve the interpretation of compression results in research problems of this type.

9.
The delay performance of compression algorithms is particularly important when time-critical data transmission is required. In this paper, we propose a wavelet-based electrocardiogram (ECG) compression algorithm with a low-delay property for instantaneous, continuous ECG transmission suitable for telecardiology applications over a wireless network. The proposed algorithm reduces the frame size as much as possible to achieve a low delay while maintaining reconstructed signal quality. To attain both low delay and high quality, it employs waveform partitioning, adaptive frame size adjustment, wavelet compression, flexible bit allocation, and header compression. The performance of the proposed algorithm in terms of reconstructed signal quality, processing delay, and error resilience was evaluated using the Massachusetts Institute of Technology–Beth Israel Hospital (MIT-BIH) and Creighton University Ventricular Tachyarrhythmia (CU) databases and a code division multiple access-based simulation model with mobile channel noise.

10.
A method is presented for representing signals made up of discrete component waves separated by iso-electric regions, such as electrocardiogram (ECG), respiratory, and blood pressure signals. The computational complexity is minimised by treating the discrete cosine transform of a group of component waves as the sum of a finite number of biphasic rational functions.

11.
This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering, and Lempel-Ziv-Welch (LZW) coding. The algorithm has been verified using eight of the most frequent normal and pathological types of cardiac beats and a multi-layer perceptron (MLP) neural network trained with original cardiac patterns and tested with reconstructed ones. Aspects regarding the possibility of using principal component analysis (PCA) for cardiac pattern classification have been investigated as well. A new compression measure called the "quality score," which takes into account both the reconstruction errors and the compression ratio, is proposed.
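The abstract states that the quality score combines reconstruction error and compression ratio but does not give its exact form; a commonly used choice, assumed here purely for illustration, is the ratio CR/PRD. All names below are hypothetical.

```python
import math

# Hedged sketch: PRD (percent root-mean-square difference) and a
# quality score assumed to be QS = CR / PRD. The paper's exact
# definition is not reproduced here; names are hypothetical.

def prd(original, reconstructed):
    """Percent root-mean-square difference between two signals."""
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)

def quality_score(original, reconstructed, compression_ratio):
    """Higher is better: more compression per unit of distortion."""
    return compression_ratio / prd(original, reconstructed)

score = quality_score([10.0, 0.0], [8.0, 0.0], compression_ratio=10.0)
```

A single figure of merit like this makes it possible to rank codecs that trade distortion against rate differently.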

12.
This paper describes the implementation of the recently introduced color set partitioning in hierarchical trees (CSPIHT)-based scheme for video coding. The intra- and interframe coding performance of a CSPIHT-based video coder (CVC) is compared against that of H.263 at bit rates lower than 64 kbit/s. The CVC performs comparably to or better than H.263 at lower bit rates, whereas H.263 performs better than the CVC at higher bit rates. We identify areas that hamper the performance of the CVC and propose an improved scheme that yields better performance in image and video coding in low bit-rate environments.

13.
An adaptive real-time ECG compression algorithm with variable threshold   Cited by: 1 (self-citations: 0, others: 1)
A real-time compression algorithm has been developed which is suitable for both real-time ECG (electrocardiogram) transmission and ECG data storage. The algorithm is a modification of the amplitude zone time epoch coding (AZTEC) technique, extended with several statistical parameters used to calculate a variable threshold. The proposed algorithm has been applied in the design of a pacemaker follow-up system for online ECG data transmission from a pacemaker implanted in a patient to the computer system located at the clinic.
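The AZTEC idea this algorithm builds on replaces runs of samples that stay inside a threshold band with (length, value) plateaus. The sketch below shows only that core step, with a fixed threshold standing in for the paper's statistical, adaptive one; all names are hypothetical.

```python
# Simplified AZTEC-style plateau extraction: a run of samples whose
# min-max band stays within `threshold` is replaced by one
# (run_length, plateau_value) pair. The paper's adaptive threshold is
# replaced by a fixed one; names are hypothetical.

def aztec_plateaus(samples, threshold):
    plateaus = []
    start = 0
    lo = hi = samples[0]
    for i in range(1, len(samples)):
        new_lo = min(lo, samples[i])
        new_hi = max(hi, samples[i])
        if new_hi - new_lo > threshold:
            # Band exceeded: close the current plateau at the previous sample.
            plateaus.append((i - start, (lo + hi) / 2.0))
            start, lo, hi = i, samples[i], samples[i]
        else:
            lo, hi = new_lo, new_hi
    plateaus.append((len(samples) - start, (lo + hi) / 2.0))
    return plateaus

plats = aztec_plateaus([0, 1, 0, 1, 10, 11, 10, 0], threshold=2)
```

Flat iso-electric segments of ECG collapse into single plateaus, which is where AZTEC gets its compression; steep QRS slopes are where a variable threshold matters most.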

14.
For the compression of medical signals such as electrocardiogram (ECG), excellent reconstruction quality of a highly compressed signal can be obtained by using a wavelet-based approach. The most widely used objective quality criterion for the compressed ECG is called the percent of root-mean-square difference (PRD). In this paper, given a user-specified PRD, an algorithm is proposed to meet the PRD demand by searching for an appropriate bit rate in an automatic, smooth, and fast manner for the wavelet-based compression. The bit rate searching is modeled as a root-finding problem for a one-dimensional function, where an unknown rate-distortion curve represents the function and the desired rate is the root to be sought. A solution derived from root-finding methods in numerical analysis is proposed. The proposed solution is incorporated in a well-known wavelet-based coding strategy called set partitioning in hierarchical trees. ECG signals taken from the MIT/BIH database are tested, and excellent results in terms of convergence speed, quality variation, and coding performance are obtained.
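The root-finding formulation above can be sketched with simple bisection: treat "PRD at rate r minus target PRD" as a 1-D function of the rate and find its zero. In practice the rate-distortion curve comes from the codec itself; the made-up monotone curve below merely stands in for it, and all names are hypothetical.

```python
# Sketch of rate search as root finding. `prd_at_rate` stands in for the
# codec's (unknown) rate-distortion curve; bisection assumes PRD decreases
# monotonically as bit rate increases. Names are hypothetical.

def find_rate_for_prd(prd_at_rate, target_prd, lo=0.01, hi=8.0, tol=1e-6):
    """Return the bit rate at which prd_at_rate(rate) == target_prd."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if prd_at_rate(mid) > target_prd:
            lo = mid            # too much distortion -> need more bits
        else:
            hi = mid            # quality goal met -> try fewer bits
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0

# Purely illustrative stand-in curve: PRD = 10 / rate.
rate = find_rate_for_prd(lambda r: 10.0 / r, target_prd=4.0)
```

Faster root finders (secant, false position) converge in fewer codec evaluations, which matters because each evaluation means an encode/decode cycle.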

15.
董良, 马正新, 王毓晗. 《电讯技术》 (Telecommunication Engineering), 2015, 55(2): 182–186
To resolve the conflict between Doppler frequency-offset estimation accuracy and fast Fourier transform (FFT) computational load in classical acquisition algorithms for direct-sequence spread-spectrum signals in low-SNR, high-dynamic environments, a two-stage non-coherent accumulation acquisition algorithm based on frequency-offset compensation is proposed. Built on the partial matched filter combined with FFT (PMF-FFT) structure, the algorithm first compensates the Doppler offset using a coarse estimate from the first stage, then obtains a fine estimate of the residual offset by adjusting the partial matched filter length in the second-stage PMF-FFT and applying a sliding average window. Theoretical analysis and simulation results show that, compared with classical acquisition algorithms and with the same number of FFT points, the proposed algorithm effectively improves the Doppler frequency-offset estimation accuracy and achieves better detection performance in high-dynamic environments.
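The PMF-FFT building block the abstract relies on can be sketched as follows: sum the (already despread) signal over P partial segments, then take a DFT across the P partial sums and read the Doppler offset off the spectral peak. This is a toy illustration, not the paper's two-stage algorithm; parameters and names are hypothetical.

```python
import cmath

# Toy PMF-FFT sketch: partial sums over segments, then a DFT across the
# partial sums locates the Doppler frequency as a spectral peak. A plain
# DFT stands in for the FFT; names and parameters are hypothetical.

def pmf_fft_doppler(samples, num_segments, sample_rate):
    seg_len = len(samples) // num_segments
    partial = [sum(samples[k * seg_len:(k + 1) * seg_len])
               for k in range(num_segments)]
    spectrum = [abs(sum(p * cmath.exp(-2j * cmath.pi * m * k / num_segments)
                        for k, p in enumerate(partial)))
                for m in range(num_segments)]
    m_peak = max(range(num_segments), key=spectrum.__getitem__)
    # Partial sums are seg_len / fs seconds apart, so one DFT bin spans
    # fs / (seg_len * num_segments) Hz.
    return m_peak * sample_rate / (seg_len * num_segments)

fs, f0 = 1000.0, 31.25   # f0 chosen to sit exactly on a DFT bin here
sig = [cmath.exp(2j * cmath.pi * f0 * n / fs) for n in range(64)]
est = pmf_fft_doppler(sig, num_segments=8, sample_rate=fs)
```

The trade-off the abstract addresses is visible here: longer partial filters shrink the FFT size but attenuate large Doppler offsets, which is why the second stage shortens/adjusts the filter length after coarse compensation.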

16.
A novel, efficient algorithm is proposed for the problem of partitioning a set with predefined element weights into equal parts. The algorithm is based on computing a linear group that preserves an invariant: the set of zeros of a cubic form. Algorithms for related problems, including finding a second solution when the first is known, are also discussed.

17.
彭自然, 王国军. 《信号处理》 (Journal of Signal Processing), 2017, 33(8): 1122–1131
Wavelet filters with different numbers of vanishing moments have different amplitude-frequency response curves, so filtering a signal with different wavelet filters yields markedly different denoising results. This paper analyzes the amplitude-frequency characteristics of wavelet filters through mathematical modeling and clarifies the relationship between those characteristics and the number of vanishing moments, providing a theoretical basis for selecting the optimal wavelet filter. A precise notch-denoising scheme targeted at the frequency characteristics of ECG noise is proposed, which effectively preserves the signal's singular points and feature values and reduces signal distortion. Experimental results show that denoising ECG with a lifting wavelet filter having a (relatively) optimal number of vanishing moments concentrates the signal energy more and yields better denoising performance.
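To make the "lifting wavelet filter" and "vanishing moments" ingredients concrete, the sketch below shows one lifting step of the Haar wavelet, whose single vanishing moment annihilates constant signals. This is illustrative only; the paper's notch design and higher-order filters are not reproduced, and all names are hypothetical.

```python
# One split/predict/update lifting step of the Haar wavelet (one vanishing
# moment). Illustrative only; not the paper's filter. Names hypothetical.

def haar_lifting_forward(x):
    """Forward Haar lifting for an even-length signal."""
    approx, detail = [], []
    for a, b in zip(x[::2], x[1::2]):
        d = b - a              # predict: detail = odd minus even sample
        s = a + d / 2.0        # update: preserves the running mean
        detail.append(d)
        approx.append(s)
    return approx, detail

approx, detail = haar_lifting_forward([2, 4, 6, 8])
```

Because one vanishing moment kills constants, a flat segment produces all-zero detail coefficients; higher-order filters similarly kill low-degree polynomial trends, which shapes the amplitude-frequency response the paper analyzes.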

18.
In this paper, we study the problem of multi-terminal net layout optimization. To solve the problem, we propose a polynomial-time algorithm which first constructs a minimum spanning tree and then generates a spanning tree set. An algorithm for building a set of near-minimum spanning trees with controllable length deviation from the minimum is proposed. The effective tree-length deviation and the number of trees are investigated.
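The first step the abstract describes, minimum-spanning-tree construction, can be sketched with Kruskal's algorithm; the subsequent spanning-tree-set generation is omitted, and all names are hypothetical.

```python
# Minimal Kruskal's MST construction (the abstract's first step); the
# spanning-tree-set generation that follows it is omitted. Names are
# hypothetical.

def kruskal_mst(num_nodes, edges):
    """edges: list of (weight, u, v); returns the MST edge list."""
    parent = list(range(num_nodes))      # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):        # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                     # skip edges that would close a cycle
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

tree = kruskal_mst(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)])
```

Near-minimum spanning trees can then be enumerated by swapping one MST edge at a time for a slightly heavier non-tree edge, which is one plausible way to bound the length deviation the abstract mentions.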

19.
A progressive image-coding algorithm that uses a combination of the set partitioning in hierarchical trees (SPIHT) algorithm and an embedded trellis-coded quantiser (E-TCQ) is proposed. Using the hierarchical quadtree structure, a significance map of coefficients is encoded. The E-TCQ is then used to refine the amplitudes of the significant pixels. The proposed method is shown to give good performance with very little increase in computational complexity and storage memory.

20.
Described is an efficient, low-complexity technique for encoding binary sequences such as the significance maps generated in wavelet-based compression methods. This is achieved using a generalised form of bintree partitioning for 1-D signals, the vector K-tree partitioning. Results show a higher compression rate and lower compression time than other binary sequence compression methods. Application to the compression of phonocardiograms for teletransmission also showed a higher compression rate than set partitioning in hierarchical trees.
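The bintree partitioning idea this technique generalises can be sketched recursively: an all-zero segment is coded with a single 0 bit, otherwise a 1 bit is emitted and the segment is split in half, down to single symbols. This illustrates the general idea only, not the paper's vector K-tree code; names are hypothetical.

```python
# Hedged sketch of bintree-style coding of a 1-D binary sequence:
# all-zero segments cost one bit, others split recursively. Not the
# paper's vector K-tree code; names are hypothetical.

def bintree_encode(bits):
    out = []

    def code(seg):
        if not any(seg):
            out.append(0)        # whole segment insignificant: one bit
        elif len(seg) == 1:
            out.append(1)        # single significant symbol
        else:
            out.append(1)        # segment contains a 1: split in half
            mid = len(seg) // 2
            code(seg[:mid])
            code(seg[mid:])

    code(bits)
    return out

code_bits = bintree_encode([0, 0, 0, 0, 1, 0, 0, 0])
```

Sparse significance maps, where ones are rare and clustered, are exactly the inputs for which this hierarchical zero-pruning beats naive bit-by-bit transmission.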


Copyright©北京勤云科技发展有限公司  京ICP备09084417号