Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
Cascaded differential and wavelet compression of chromosome images (Cited: 2; self-citations: 0; others: 2)
This paper proposes a new method for chromosome image compression based on an important characteristic of these images: the regions of interest (ROIs) to cytogeneticists for evaluation and diagnosis are well determined and segmented. Such information is utilized to advantage in our compression algorithm, which combines lossless compression of chromosome ROIs with lossy-to-lossless coding of the remaining image parts. This is accomplished by first performing a differential operation on chromosome ROIs for decorrelation, followed by critically sampled integer wavelet transforms on these regions and the remaining image parts. The well-known set partitioning in hierarchical trees (SPIHT) (Said and Pearlman, 1996) algorithm is modified to generate separate embedded bit streams for both chromosome ROIs and the rest of the image that allow continuous lossy-to-lossless compression of both (although lossless compression of the former is commonly used in practice). Experiments on two sets of sample chromosome spread and karyotype images indicate that the proposed approach significantly outperforms current compression techniques used in commercial karyotyping systems and JPEG-2000 compression, which does not provide the desirable support for lossless compression of arbitrary ROIs.
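As a rough illustration of the ROI-driven decorrelation step described above, the sketch below applies a reversible first-difference operation only to pixels inside a binary ROI mask; the raster-order traversal and the mask itself are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def roi_forward_diff(image, roi_mask):
    """Replace each ROI pixel (raster order) by its difference from the previous
    ROI pixel; the first ROI pixel is stored verbatim. Non-ROI pixels are untouched."""
    out = image.astype(np.int64)
    idx = np.flatnonzero(roi_mask.ravel())        # ROI pixel positions in raster order
    vals = out.ravel()[idx].copy()
    out.ravel()[idx[1:]] = np.diff(vals)          # small residuals -> cheap to code losslessly
    return out

def roi_inverse_diff(coded, roi_mask):
    """Exact (lossless) inverse of roi_forward_diff."""
    out = coded.astype(np.int64)
    idx = np.flatnonzero(roi_mask.ravel())
    out.ravel()[idx] = np.cumsum(out.ravel()[idx])
    return out

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64))
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 10:50] = True                          # hypothetical segmented chromosome ROI
assert np.array_equal(roi_inverse_diff(roi_forward_diff(img, mask), mask), img)
```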

2.
Partial encryption of compressed images and videos (Cited: 12; self-citations: 0; others: 12)
The increased popularity of multimedia applications places a great demand on efficient data storage and transmission techniques. Network communication, especially over a wireless network, can easily be intercepted and must be protected from eavesdroppers. Unfortunately, encryption and decryption are slow, and it is often difficult, if not impossible, to carry out real-time secure image and video communication and processing. Methods have been proposed to combine compression and encryption together to reduce the overall processing time, but they are either insecure or too computationally intensive. We propose a novel solution called partial encryption, in which a secure encryption algorithm is used to encrypt only part of the compressed data. Partial encryption is applied to several image and video compression algorithms in this paper. Only 13-27% of the output from quadtree compression algorithms is encrypted for typical images, and less than 2% is encrypted for 512×512 images compressed by the set partitioning in hierarchical trees (SPIHT) algorithm. The results are similar for video compression, resulting in a significant reduction in encryption and decryption time. The proposed partial encryption schemes are fast, secure, and do not reduce the compression performance of the underlying compression algorithm.
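A minimal sketch of the partial-encryption idea follows: only a small, structurally critical prefix of the compressed bitstream is enciphered and the remainder is left in the clear. The SHA-256 counter-mode keystream and the `fraction` parameter are placeholders for illustration (a real deployment would use an established cipher such as AES), and zlib merely stands in for a quadtree or SPIHT coder.

```python
import hashlib, zlib

def keystream(key: bytes, n: int) -> bytes:
    """Simple counter-mode keystream (illustrative stand-in for a real cipher)."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def partial_encrypt(compressed: bytes, key: bytes, fraction: float = 0.05) -> bytes:
    n = max(1, int(len(compressed) * fraction))        # e.g. <2% suffices for SPIHT output
    ks = keystream(key, n)
    head = bytes(a ^ b for a, b in zip(compressed[:n], ks))
    return head + compressed[n:]                        # same length; re-XORing the head decrypts

data = zlib.compress(b"example image payload" * 100)    # stand-in for a SPIHT/quadtree bitstream
cipher = partial_encrypt(data, b"secret key")
assert partial_encrypt(cipher, b"secret key") == data   # XOR keystream is its own inverse
```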

3.
An efficient preprocessing technique of arranging an electroencephalogram (EEG) signal in matrix form is proposed for real-time lossless EEG compression. The compression algorithm consists of an integer lifting wavelet transform as the decorrelator with set partitioning in hierarchical trees as the source coder. Experimental results show that the preprocessed EEG signal yielded improved rate-distortion performance, especially at low bit rates, and a shorter encoding delay than the conventional one-dimensional compression scheme.
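The sketch below illustrates the matrix-arrangement preprocessing in its simplest form: consecutive epochs of the 1-D EEG record are stacked as rows of a 2-D array before the decorrelating transform. The epoch length and zero-padding policy are illustrative choices rather than the paper's parameters.

```python
import numpy as np

def eeg_to_matrix(signal, row_len=256):
    """Stack consecutive epochs of 'row_len' samples as matrix rows."""
    n_rows = int(np.ceil(len(signal) / row_len))
    padded = np.zeros(n_rows * row_len, dtype=signal.dtype)
    padded[:len(signal)] = signal                 # zero-pad the last, partial epoch
    return padded.reshape(n_rows, row_len), len(signal)

def matrix_to_eeg(matrix, original_len):
    return matrix.ravel()[:original_len]          # exact inverse of the arrangement

sig = np.sin(np.linspace(0, 50, 10_000)).astype(np.float32)
mat, n = eeg_to_matrix(sig, row_len=256)
assert np.array_equal(matrix_to_eeg(mat, n), sig)
```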

4.
In this paper, the problem of progressive lossless image coding is addressed. A nonlinear decomposition for progressive lossless compression is presented. The decomposition into subbands is called rank-order polynomial decomposition (ROPD) after the polynomial prediction models used. The decomposition method presented here is a further development and generalization of the morphological subband decomposition (MSD) introduced earlier by the same research group. It is shown that ROPD provides similar or slightly better results than the compared coding schemes, such as the codec based on set partitioning in hierarchical trees (SPIHT) and the codec based on wavelet/trellis-coded quantization (WTCQ), and substantially outperforms the standard JPEG. The proposed lossless compression scheme produces a completely embedded bit stream, which allows for data browsing. It is shown that ROPD not only has a better lossless rate than the MSD but also much better browsing quality when only part of the bit stream is decompressed. Finally, the possibility of hybrid lossy/lossless compression is demonstrated using ultrasound images. As with other compression algorithms, considerable gain can be obtained if only the regions of interest are compressed losslessly.

5.
Compression and encryption are often performed together for image sharing and/or storage. The order in which the two operations are carried out affects the overall efficiency of digital image services; for example, encrypted data has little or no compressibility. On the other hand, it is challenging to ensure reasonable security without degrading the compression performance, so incorporating one requirement into the other is an interesting approach. In this study, we propose a novel hybrid image encryption and compression scheme that allows compression in the encryption domain. The encryption is based on chaos theory and is carried out in two steps, i.e., permutation and substitution. Lossless compression is performed on the shuffled image, and the compressed bitstream is then grouped into 8-bit elements for the substitution stage. The lossless nature of the proposed method makes it suitable for medical image compression and encryption applications. The experimental results show that the proposed method achieves the necessary level of security and preserves the compression efficiency of a lossless algorithm. In addition, to improve the performance of the entropy encoder of the compression algorithm, we propose a data-to-symbol mapping method based on number theory that represents adjacent pixel values as a block. With this representation, the compression saving improves on average from 5.76% to 15.45% on the UCID dataset.
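The following sketch walks through the permutation, lossless compression and byte-wise substitution stages under stated assumptions: a logistic map serves as a generic chaos-based key schedule, and zlib stands in for the actual lossless coder; none of the parameters are taken from the paper.

```python
import zlib
import numpy as np

def logistic_sequence(x0, n, r=3.99):
    """Iterate the logistic map x <- r*x*(1-x) and return n samples."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def encrypt(image, x0=0.3141592):
    flat = image.ravel()
    perm = np.argsort(logistic_sequence(x0, flat.size))       # permutation (shuffling) stage
    shuffled = flat[perm].astype(np.uint8)
    compressed = zlib.compress(shuffled.tobytes())             # lossless coding of the shuffled image
    ks = (logistic_sequence(x0 / 2, len(compressed)) * 255).astype(np.uint8)
    return bytes(np.frombuffer(compressed, np.uint8) ^ ks)     # substitution on 8-bit groups

def decrypt(cipher, shape, x0=0.3141592):
    n = shape[0] * shape[1]
    ks = (logistic_sequence(x0 / 2, len(cipher)) * 255).astype(np.uint8)
    compressed = bytes(np.frombuffer(cipher, np.uint8) ^ ks)   # undo substitution
    shuffled = np.frombuffer(zlib.decompress(compressed), np.uint8)
    flat = np.empty(n, np.uint8)
    flat[np.argsort(logistic_sequence(x0, n))] = shuffled      # undo permutation
    return flat.reshape(shape)

img = (np.arange(64 * 64) % 256).astype(np.uint8).reshape(64, 64)
assert np.array_equal(decrypt(encrypt(img), img.shape), img)
```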

6.
Error-resilient SPIHT image coding (Cited: 2; self-citations: 0; others: 2)
The authors develop an efficient error-resilient scheme for the set partitioning in hierarchical trees (SPIHT) technique, one of the most successful image compression algorithms. By partitioning the coded data sequence and adding appropriate side information, the proposed algorithm provides significantly better PSNR performance over noisy channels with a minimal increase in coding complexity.
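One simple way to realize "partitioning the coded data sequence and adding side information" is to frame the embedded bitstream into small packets, each carrying a sequence number, length and checksum so a corrupted packet can be detected and dropped without desynchronizing the rest. The sketch below is such a generic framing, not the authors' exact scheme.

```python
import struct, zlib

def packetize(bitstream: bytes, payload_size: int = 512):
    """Split an embedded bitstream into packets: 8-byte header + payload."""
    packets = []
    for seq, start in enumerate(range(0, len(bitstream), payload_size)):
        payload = bitstream[start:start + payload_size]
        header = struct.pack(">HHI", seq, len(payload), zlib.crc32(payload))
        packets.append(header + payload)
    return packets

def depacketize(packets):
    good = []
    for pkt in packets:
        seq, length, crc = struct.unpack(">HHI", pkt[:8])
        payload = pkt[8:8 + length]
        if zlib.crc32(payload) == crc:            # keep only packets that verify
            good.append((seq, payload))
    good.sort()
    return b"".join(p for _, p in good)

stream = bytes(range(256)) * 10                    # stand-in for a SPIHT bitstream
assert depacketize(packetize(stream)) == stream
```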

7.
A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm [1] has achieved notable success in still image coding. We modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.
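For a 1-D adaptation of SPIHT, the spatial orientation tree reduces to each detail coefficient at level j spawning two children at level j-1. The helper below computes that mapping for one plausible coefficient layout ([approx | d_L | ... | d_1]); it is an illustration, not necessarily this codec's exact convention.

```python
def children_1d(pos, n, levels):
    """Return the child positions of the coefficient at absolute index 'pos'
    for a length-'n' signal decomposed over 'levels' dyadic levels."""
    offsets = [0]                                  # start of each stored subband
    size = n >> levels                             # approximation band size
    offsets.append(size)                           # d_L starts right after approx_L
    for _ in range(levels - 1):
        offsets.append(offsets[-1] + size)
        size *= 2
    for j in range(1, len(offsets) - 1):           # find which detail band 'pos' belongs to
        start, nxt = offsets[j], offsets[j + 1]
        if start <= pos < nxt:
            k = pos - start
            return [nxt + 2 * k, nxt + 2 * k + 1]  # two children in the next finer band
    return []                                      # approximation or finest band: no descendants here

# For n=16, levels=3 the layout is [a3(2) | d3(2) | d2(4) | d1(8)];
# the d3 coefficient at index 2 has children at indices 4 and 5 in d2.
assert children_1d(2, 16, 3) == [4, 5]
```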

8.
In this paper, a hybrid fractal zerotree wavelet (FZW) image coding algorithm is proposed. The algorithm couples a zerotree-based encoder, such as the embedded zerotree wavelet (EZW) coder or set partitioning in hierarchical trees, and a fractal image coder; this coupling is done in the wavelet domain. Based on perceptually-weighted distortion-rate calculations, a fractal method is adaptively applied to the parts of an image that can be encoded more efficiently relative to an EZW coder at a given rate. In addition to improving compression performance, the proposed algorithm also allows one to impose desirable properties from each type of image coder, such as progressive transmission, the zerotree structure, and range-domain block decoding.

9.
In this paper, multiwavelets are considered in the context of image compression based on the human visual system (HVS). First, selecting the BSA (4/4)* filters, a two-dimensional image is transformed with our proposed algorithm I. Second, we apply the HVS coefficients to the subbands of the transformed image. Third, we split the coefficients into two parts: the significance map and the residue map. A new modified set partitioning in hierarchical trees (SPIHT) algorithm is then proposed to encode the significance map. Fourth, algorithm III is presented for coding the residue map. Finally, we adopt context-based adaptive arithmetic coding to encode the bit stream. We also provide experimental results showing that multiwavelets are worth studying, and compare them with other multiwavelet and JPEG2000 algorithms.

10.
In a prior work, a wavelet-based vector quantization (VQ) approach was proposed to perform lossy compression of electrocardiogram (ECG) signals. In this paper, we investigate and fix its coding inefficiency problem in lossless compression and extend it to allow both lossy and lossless compression in a unified coding framework. The well-known 9/7 filters and 5/3 integer filters are used to implement the wavelet transform (WT) for lossy and lossless compression, respectively. The codebook updating mechanism, originally designed for lossy compression, is modified to allow lossless compression as well. In addition, a new and cost-effective coding strategy is proposed to enhance the coding efficiency of set partitioning in hierarchical trees (SPIHT) at the less significant bit representation of a WT coefficient. ECG records from the MIT-BIH Arrhythmia and European ST-T Databases are selected as test data. In terms of the coding efficiency for lossless compression, experimental results show that the proposed codec improves the direct SPIHT approach and the prior work by about 33% and 26%, respectively.
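The 5/3 integer filter mentioned above can be realized with lifting so that the transform is exactly reversible, which is what enables the lossless path. The sketch below shows one level of the reversible LeGall 5/3 lifting for an even-length 1-D signal; the boundary-extension choice is one common convention, not necessarily the paper's.

```python
import numpy as np

def lift_53_forward(x):
    """One level of the reversible LeGall 5/3 lifting transform (even-length 1-D signal)."""
    x = np.asarray(x, dtype=np.int64)
    s, d = x[0::2].copy(), x[1::2].copy()
    s_ext = np.append(s, s[-1])                        # symmetric extension on the right
    d -= (s_ext[:-1] + s_ext[1:]) // 2                 # predict step
    d_ext = np.insert(d, 0, d[0])                      # symmetric extension on the left
    s += (d_ext[:-1] + d_ext[1:] + 2) // 4             # update step
    return s, d                                        # integer approximation and detail bands

def lift_53_inverse(s, d):
    s, d = s.copy(), d.copy()
    d_ext = np.insert(d, 0, d[0])
    s -= (d_ext[:-1] + d_ext[1:] + 2) // 4             # undo update
    s_ext = np.append(s, s[-1])
    d += (s_ext[:-1] + s_ext[1:]) // 2                 # undo predict
    x = np.empty(s.size + d.size, dtype=np.int64)
    x[0::2], x[1::2] = s, d
    return x

sig = np.random.default_rng(1).integers(-2048, 2048, 1024)
a, det = lift_53_forward(sig)
assert np.array_equal(lift_53_inverse(a, det), sig)     # exactly reversible -> lossless path
```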

11.
This paper presents a listless implementation of the wavelet-based block tree coding (WBTC) algorithm for varying root block sizes. WBTC improves the image compression performance of set partitioning in hierarchical trees (SPIHT) at lower rates by efficiently encoding both inter- and intra-scale correlation using block trees. Though WBTC lowers the memory requirement compared to SPIHT by using block trees, it makes use of three ordered auxiliary lists. This feature makes WBTC undesirable for hardware implementation, as it needs a lot of memory management when the list nodes grow exponentially on each pass. The proposed listless implementation of WBTC uses special markers instead of lists, which reduces the dynamic memory requirement by 88% with respect to WBTC and 89% with respect to SPIHT. The proposed algorithm is combined with the discrete cosine transform (DCT) and discrete wavelet transform (DWT) to show its superiority over DCT- and DWT-based embedded coders, including JPEG 2000, at lower rates. The compression performance on most standard test images is nearly the same as WBTC, and outperforms SPIHT by a wide margin, particularly at lower bit rates.

12.
The authors propose a highly scalable image compression scheme based on the set partitioning in hierarchical trees (SPIHT) algorithm. The proposed algorithm, called highly scalable SPIHT (HS-SPIHT), adds the spatial scalability feature to the SPIHT algorithm through the introduction of multiple resolution-dependent lists and a resolution-dependent sorting pass. It keeps the important features of the original SPIHT algorithm such as compression efficiency, full SNR scalability and low complexity. The flexible bitstream of the HS-SPIHT encoder can easily be adapted to various resolution requirements at any bit rate. The parsing process can be carried out on-the-fly without decoding the bitstream by a simple parser (transcoder) that forms a part of a smart network. The HS-SPIHT algorithm is further developed for fully scalable coding of arbitrarily shaped visual objects. The proposed highly scalable algorithm finds applications in progressive web browsing, visual databases and especially in image transmission over heterogeneous networks.
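The parsing/transcoding step can be illustrated with a toy container format: if a short header records, for each sorting/refinement pass, how many bytes each resolution level contributed, a parser can keep only the levels a client needs without entropy-decoding anything. The header layout below is hypothetical and serves only to make the idea concrete.

```python
import struct

def parse_for_resolution(bitstream: bytes, target_level: int):
    """Keep only the segments belonging to resolution levels <= target_level."""
    n_passes, n_levels = struct.unpack(">HH", bitstream[:4])
    count = n_passes * n_levels
    lengths = struct.unpack(">" + "I" * count, bitstream[4:4 + 4 * count])
    out, pos = bytearray(), 4 + 4 * count
    for p in range(n_passes):
        for lv in range(n_levels):
            seg_len = lengths[p * n_levels + lv]
            if lv <= target_level:
                out += bitstream[pos:pos + seg_len]
            pos += seg_len                           # skip segments of finer resolutions
    return bytes(out)

# Toy stream: 2 passes over 3 resolution levels; keep levels 0-1 only.
segs = [b"A0", b"B1x", b"C2xx", b"D0", b"E1y", b"F2yy"]
header = struct.pack(">HH", 2, 3) + struct.pack(">IIIIII", *[len(s) for s in segs])
toy = header + b"".join(segs)
assert parse_for_resolution(toy, 1) == b"A0B1xD0E1y"
```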

13.
14.
The application of multiwavelet transform to image coding (Cited: 3; self-citations: 0; others: 3)
This work presents a new image coding scheme based on multiwavelet filter banks. First, a two-dimensional (2-D) multiwavelet decomposition is performed on the original image. Then, several hierarchical trees are constructed in the transform domain, and an extension of the set partitioning in hierarchical trees algorithm is proposed to quantize the multiwavelet coefficients. Our simulations show that this scheme is effective and promising.

15.
This paper proposes a new lossless image compression and coding method. The gray-level values of an image are dynamically segmented into four cases for coding and compression; different images lead to different segmentation choices, so the method has a degree of adaptivity. Experiments show that this compression method achieves good compression results.

16.
SAR image compression is very important in reducing the costs of data storage and transmission in relatively slow channels. The authors propose a compression scheme driven by texture analysis, homogeneity mapping and speckle noise reduction within the wavelet framework. The image compressibility and interpretability are improved by incorporating speckle reduction into the compression scheme. The authors begin with the classical set partitioning in hierarchical trees (SPIHT) wavelet compression scheme, and modify it to control the amount of speckle reduction, applying different encoding schemes to homogeneous and nonhomogeneous areas of the scene. The results compare favorably with the conventional SPIHT wavelet and JPEG compression methods.

17.
李群迎, 张晓林. 《电子学报》 (Acta Electronica Sinica), 2010, 38(11): 2655-2659
This paper proposes a joint source-channel coding method for the transmission of aerial remote-sensing images. The wavelet-transformed image is grouped into wavelet trees to form multiple descriptions, with the important low-frequency subband coefficients described redundantly; each description is then encoded independently with a modified set partitioning in hierarchical trees (SPIHT) algorithm and given unequal error protection. To keep the encoding real-time, a fast rate-allocation search algorithm is proposed. Simulation results show that the method achieves robust transmission of remote-sensing images over a frequency-selective Rician fading channel with relatively low complexity.

18.
A linear quadtree compression scheme for image encryption (Cited: 5; self-citations: 0; others: 5)
A private-key encryption scheme for two-dimensional image data is proposed in this work. The scheme is designed on the basis of a lossless data compression principle and is developed so that data encryption and compression are performed simultaneously. For the lossless compression effect, the quadtree data structure is used to represent the image; for the encryption purpose, various scanning sequences of the image data are provided. The scanning sequences comprise a private key for encryption. Twenty-four possible combinations of scanning sequences are defined for accessing the four quadrants, thereby making available 24^n × 4^(n(n−1)/2) possibilities to encode an image of resolution 2^n × 2^n. The security of the proposed encryption scheme therefore relies on the computational infeasibility of an exhaustive search. Three images of 512 × 512 pixels are used to verify the feasibility of the proposed scheme. The testing results and analysis demonstrate the characteristics of the proposed scheme. The scheme can be applied to problems of data storage or transmission in a public network.
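Reading the quoted expression as 24^n × 4^(n(n−1)/2) for a 2^n × 2^n image, the key-space size for a 512 × 512 image (n = 9) can be checked with a few lines of arithmetic:

```python
# Quick check of the key-space size quoted above, assuming the reconstructed
# expression 24**n * 4**(n*(n-1)/2) for a 2**n x 2**n image (n = 9 for 512x512).
n = 9
key_space = 24**n * 4**(n * (n - 1) // 2)
print(f"{key_space:.3e} possible scanning-sequence keys (~{key_space.bit_length()} bits)")
# -> roughly 1.2e+34 keys, i.e. about 114 bits of key space, which is why an
#    exhaustive search is computationally infeasible.
```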

19.
A new method for the compression of angiogram video sequences is presented. The method is based on the philosophy that diagnostically significant areas of the image should be allocated the greatest proportion of the total bit budget. The approach uses a three-dimensional wavelet coder based on the popular set partitioning in hierarchical trees algorithm. Incorporated into this framework are a region-of-interest (ROI) detection stage and a texture-modeling stage. The combined result is an approach that models the high-frequency wavelet coefficients of some diagnostically unimportant regions of the image in an extremely efficient manner. This allows additional bits to be used within the ROI to improve the quality of the diagnostically significant areas. Results are compared for a number of real data sets and evaluated by trained cardiologists.

20.
A 2-D ECG compression method based on wavelet transform and modified SPIHT (Cited: 8; self-citations: 0; others: 8)
A two-dimensional (2-D) wavelet-based electrocardiogram (ECG) data compression method is presented that employs a modified set partitioning in hierarchical trees (SPIHT) algorithm. The modified SPIHT algorithm further exploits the redundancy among the medium- and high-frequency subbands of the wavelet coefficients, and the proposed 2-D approach utilizes the fact that ECG signals generally show redundancy between adjacent beats and between adjacent samples. An ECG signal is cut and aligned to form a 2-D data array, to which the 2-D wavelet transform and the modified SPIHT are then applied. Records selected from the MIT-BIH arrhythmia database are tested. The experimental results show that the proposed method achieves a high compression ratio with relatively low distortion and is effective for various kinds of ECG morphologies.
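A bare-bones version of the "cut and align" step is sketched below: given already-detected R-peak positions, each beat is cut out, padded to a common length and stacked as a row of the 2-D array. The fixed row length, zero padding and synthetic peak positions are illustrative assumptions.

```python
import numpy as np

def beats_to_matrix(ecg, r_peaks, row_len=300):
    """Cut the signal at R-peaks and stack the beats as equal-length rows."""
    rows = []
    for start, end in zip(r_peaks[:-1], r_peaks[1:]):
        beat = ecg[start:end][:row_len]                        # one beat per row, truncated if long
        rows.append(np.pad(beat, (0, row_len - len(beat))))    # zero-pad short beats
    return np.vstack(rows)

ecg = np.random.default_rng(2).normal(size=2000).astype(np.float32)
peaks = np.arange(0, 2000, 250)                                # stand-in for detected R-peaks
matrix = beats_to_matrix(ecg, peaks)
print(matrix.shape)                                            # (7, 300): beats x aligned samples
```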
