20 matching records found; search time: 15 ms
1.
An increasing amount of image data is produced in daily life, which creates a major challenge for storing and transmitting it. For fields requiring high fidelity, lossless image compression is significant because it reduces the size of image data without quality loss. To address the difficulty of improving the lossless compression ratio, we propose an improved lossless image compression algorithm that theoretically provides approximately quadruple compression by combining linear prediction, integer wavelet transform (IWT) with output-coefficient processing, and Huffman coding. The main contribution of the algorithm is a new hybrid transform exploiting a new prediction template and a coefficient processing of the IWT. Experimental results on three different image sets show that the proposed algorithm outperforms state-of-the-art algorithms, improving compression ratios by between 6.22% and 72.36%. The algorithm is most suitable for compressing images with complex texture and higher resolution, at an acceptable compression speed.
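As a hedged illustration of the kind of reversible transform step such an algorithm relies on, here is a minimal integer Haar (S-transform) lifting pair in Python. This is a generic sketch of one IWT building block, not the paper's specific prediction template or coefficient processing.

```python
def s_transform_pair(a, b):
    # Forward integer Haar (S-transform) lifting step:
    # d is the detail (difference); s is the floor-average,
    # computed from d so the pair inverts exactly in integer arithmetic.
    d = a - b
    s = b + (d >> 1)
    return s, d

def inverse_s_transform_pair(s, d):
    # Undo the lifting step; recovers (a, b) losslessly.
    b = s - (d >> 1)
    a = d + b
    return a, b
```

Because each step is integer-reversible, the transform coefficients can be entropy-coded (e.g. with Huffman codes) and the original pixels recovered bit-exactly.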
2.
To obtain a more accurate probability distribution of the residual image and thereby achieve better lossless image compression, a block-based lossless image compression scheme is proposed that combines adaptive arithmetic coding (AAC) with a finite mixture of geometric distributions. The method first partitions the residual image into non-overlapping blocks, then estimates the mixture parameters by maximum likelihood using the expectation-maximization algorithm, and models the statistics of each block with the fitted geometric mixture. In addition, a histogram truncation step is applied to each prediction-error block to reduce the number of symbols in arithmetic coding, mitigating the effect of zero-occurrence symbols. Experimental results show that the proposed scheme outperforms the state-of-the-art JPEG 2000 and JPEG-LS standards in compression ratio on the test image set.
3.
In this paper, we propose a new wavelet-based lossless image coder that builds on a state-of-the-art algorithm, namely SPIHT (set partitioning in hierarchical trees). An algorithmic modification is introduced to increase its efficiency: a new test on direct descendants is added in the sets of type A to process parent coefficients that are significant due to their non-direct descendants. Also, new sets of type C are defined to perform a separate sorting of the sets that have insignificant children. The idea behind the second proposition is to remove all tests over the entries (A, B and C), since the number of significant sets is much higher than that of insignificant sets. A number of experiments, carried out on various test images, demonstrate significant improvement over conventional SPIHT for both greyscale and colour images.
4.
A grouped lossless image compression coding method is proposed. Codewords are predetermined and consist of two parts, a group number and a within-group codeword; each group contains 2^n within-group codewords of varying length. Grey levels are reordered by probability and mapped one-to-one to codewords, yielding a new code table and thereby compressing the image. A comparison with Huffman coding shows that the method is superior in both compression ratio and encoding/decoding efficiency.
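The group-number-plus-offset structure described above resembles Exp-Golomb coding, where group g holds 2^g codewords: a unary group prefix followed by a g-bit within-group index. A hypothetical sketch, assuming symbols have already been reindexed by descending probability as the abstract describes:

```python
def exp_golomb_encode(k):
    # Group number g = floor(log2(k + 1)); group g holds 2^g codewords.
    g = (k + 1).bit_length() - 1
    prefix = '0' * g + '1'              # unary group identifier
    offset = k + 1 - (1 << g)           # index within the group
    return prefix + (format(offset, '0{}b'.format(g)) if g else '')

def exp_golomb_decode(bits):
    # Count leading zeros to find the group, then read the g-bit offset.
    g = bits.index('1')
    offset = int(bits[g + 1:2 * g + 1] or '0', 2)
    return (1 << g) + offset - 1
```

Frequent symbols (small k) land in small groups and get short codes, which is the same probability-ordering idea the paper exploits.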
5.
Copyright protection and information security have become serious problems due to the ever-growing amount of digital data on the Internet. Reversible data hiding is a special type of data hiding technique that guarantees not only the secret data but also the cover media can be reconstructed without any distortion. Traditional schemes are based on the spatial, discrete cosine transformation (DCT) and discrete wavelet transformation (DWT) domains. Recently, some vector quantization (VQ) based reversible data hiding schemes have been proposed. This paper proposes an improved reversible data hiding scheme based on VQ-index residual value coding. Experimental results show that our scheme outperforms two recently proposed schemes, namely side-match vector quantization (SMVQ)-based data hiding and modified fast correlation vector quantization (MFCVQ)-based data hiding.
6.
For the information-coding and data-representation topics in an introductory university computer course, a teaching case was designed that uses Huffman coding to implement lossless image compression. Classroom use shows that the case is very effective in helping students understand information coding and compression.
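A minimal classroom-style Huffman table builder in Python; this is a generic sketch of the technique, not the case's actual teaching materials:

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Return {symbol: bitstring} for the symbol sequence `data`."""
    freq = Counter(data)
    if len(freq) == 1:                        # degenerate: a single symbol
        return {next(iter(freq)): '0'}
    # Heap entries carry a unique tiebreaker so dicts are never compared.
    heap = [(w, i, {s: ''}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)       # two lowest-weight subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: '0' + b for s, b in c1.items()}
        merged.update({s: '1' + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]
```

Running it on pixel values (e.g. the bytes of a greyscale image) gives a prefix-free table in which frequent values receive the shortest codes.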
7.
The latest advancements in capture and display technologies demand better compression techniques for the storage and transmission of still images and video. High efficiency video coding (HEVC) is the latest video compression standard developed by the joint collaborative team on video coding (JCT-VC) with this objective. Although the main design goal of HEVC is the compression of high-resolution video, its performance in still image compression is on par with state-of-the-art still image compression standards. This work explores the possibility of incorporating the efficient intra prediction techniques employed in HEVC into the compression of high-resolution still images. In the lossless coding mode of HEVC, sample-based angular intra prediction (SAP) methods have shown better prediction accuracy than conventional block-based prediction (BP). In this paper, we propose an improved sample-based angular intra prediction (ISAP), which enhances the accuracy of the highly crucial intra prediction within HEVC. The experimental results show that ISAP in lossless compression of still images outclasses archival tools, state-of-the-art image compression standards and other HEVC-based lossless image compression codecs.
8.
A new lossless image coding method competitive with the best known image coding techniques in terms of efficiency and complexity is suggested. It is based on adaptive color space transform, adaptive context coding, and improved prediction of pixel values of image color components. Examples of application of the new algorithm to a set of standard images are given and comparison with known algorithms is performed.
9.
The DEM (digital elevation model) is the basis of 3-D terrain visualization. As DEM data volumes keep growing, coding and compressing DEMs has become an important research topic in 3-D terrain visualization. Arithmetic coding is a lossless entropy-based coding that preserves important detail. Current prediction models for arithmetic coding fall into three classes, simple linear prediction, Lagrange prediction, and least-squares prediction, and this paper compares and analyzes the three. Problems with arithmetic coding algorithms in practical operation are pointed out, and prospects for future development are given.
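A hedged sketch of the simplest of the three predictor classes: a planar linear predictor applied to a DEM height grid (assumed here to be stored as a list of rows of integer heights). The residuals, not the raw heights, would then be fed to the arithmetic coder.

```python
def linear_predict_residuals(grid):
    """Residuals of the planar predictor z ~ left + up - upper_left.
    First row and column are passed through unpredicted."""
    rows, cols = len(grid), len(grid[0])
    res = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if i == 0 or j == 0:
                res[i][j] = grid[i][j]          # no causal neighbours yet
            else:
                pred = grid[i][j - 1] + grid[i - 1][j] - grid[i - 1][j - 1]
                res[i][j] = grid[i][j] - pred   # small on smooth terrain
    return res
```

On locally planar terrain the interior residuals are zero, which is exactly what makes the subsequent entropy coding effective.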
10.
Efficiently compressing medical images to reduce storage space and transmission time has become an urgent problem. Based on an analysis of existing image compression methods and the characteristics of medical images, and addressing the respective shortcomings of lossy and lossless methods for medical images, a lossless medical image compression method based on ROI (Region of Interest) is studied.
11.
Abstract: Fractal image compression is a recent tool for encoding natural images. It builds on local self-similarities and the generation of copies of blocks via mathematical transformations. The technique is interesting in both theory and application, but its high resource requirements when encoding large data make real-time usage a drawback. Heuristic algorithms, by contrast, are a set of approaches for solving hard optimisation tasks with reasonable resource consumption; they are characterised by fast convergence and reduced search complexity. The purpose of this paper is to provide, for the first time, a more detailed study of the Wolf Pack Algorithm for fractal image compression. The whole image is treated as a search space divided into blocks; scouting wolves explore the space to find smaller blocks similar to a given block according to its parameters, traverse the whole space, and select the blocks with the best fitness. The process stops after a fixed number of iterations or when the lead wolf's solution no longer improves. Results show that, compared with the exhaustive search method, the proposed method greatly reduces the encoding time and obtains a rather good compression ratio. The experiments performed show its effectiveness on this problem, and a brief comparison with different methods establishes this advantage.
13.
Effective compression of on-board hyperspectral images has been an active topic in the field of hyperspectral remote sensing. To solve it, a new distributed near-lossless compression algorithm based on multilevel coset codes is proposed. Because the bands differ in importance, a new adaptive rate allocation algorithm is proposed, which allocates a rational rate to each band according to a weight factor defined for hyperspectral images, subject to the target rate constraints. Multiband prediction is introduced for Slepian-Wolf lossless coding, and an optimal quantization algorithm is presented under correct reconstruction by the Slepian-Wolf decoder, minimizing the distortion of the reconstructed hyperspectral images at the target rate. The Slepian-Wolf encoder then exploits the correlation of the quantized values to generate the final bit streams. Experimental results show that the proposed algorithm has both higher compression efficiency and lower encoder complexity than several existing classical algorithms.
14.
For the compression of memoryless vector quantization (VQ) indexes, most lossless index coding algorithms are not suitable across varied test images. We therefore present a hybrid of a dynamic tree-coding scheme (DTCS) and a modified search-order coding scheme (MSOC) to re-encode the output index map efficiently without introducing any extra coding distortion. The main idea behind this scheme is that the blocks adjacent to the left of and above the currently processed block usually provide more useful information than its upper-left and upper-right neighbours, so two different coding methods are employed according to the corresponding left or upper spatial relations. In addition, we apply the HLIC method to information hiding; the proposed method does not modify the contents of the secret data or the compressed image. Experimental results show that the newly proposed algorithm achieves a significant bit-rate reduction compared with other lossless index coding schemes for various test images and codebook sizes. The proposed information hiding scheme can hide a large amount of information in the index map of an image and allows complete reconstruction of the image's indexes.
15.
Current research on distributed arithmetic coding addresses lossy compression with known prior probabilities. To achieve probability-adaptive lossless compression, a coding scheme using an end-of-input symbol and adaptive probabilities is studied, and a lossless adaptive distributed arithmetic coding is proposed. Experimental results show that the algorithm achieves better compression and lower decoding complexity, and that in practical applications encoding and decoding can proceed simultaneously. Because lossless adaptive distributed arithmetic coding is simple to encode and compresses well, it is combined with bit-plane coding to compress hyperspectral images; compared with the 3D-SPECK algorithm, simulation results show the method improves the SNR by 0.13-0.37 dB.
16.
A new binary (bit-level) lossless image compression method is proposed that introduces error-correcting BCH codes into the compression algorithm. The image's binary data are divided into codewords of size 7; these blocks enter a BCH decoder, and after the parity bits are removed the original block size is reduced to 4 bits. Experimental results show that the compression algorithm is effective and gives a good compression ratio without losing data, and that using BCH codes improves the compression ratio over plain Huffman compression.
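The [7,4] code the abstract describes behaves like the Hamming(7,4) code, the smallest binary BCH code. A minimal sketch of the encode and decode steps that map between 7-bit blocks and 4 data bits; the bit positions and parity layout follow a standard convention assumed here, not necessarily the paper's:

```python
def hamming74_encode(d):
    # Positions 1..7: parity at 1, 2, 4; data at 3, 5, 6, 7.
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4        # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(bits):
    """Correct up to one bit error in a 7-bit block; return the 4 data bits."""
    b = list(bits)
    syndrome = 0
    for pos in range(1, 8):  # XOR of positions of set bits = error location
        if b[pos - 1]:
            syndrome ^= pos
    if syndrome:
        b[syndrome - 1] ^= 1
    return [b[2], b[4], b[5], b[6]]
```

Stripping the three parity positions after decoding is what shrinks the 7-bit block to 4 bits in the scheme above.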
17.
An improved fractal image coding algorithm based on quadtree partitioning is proposed, which controls encoding speed and decoding quality by adjusting the error between parent and child blocks. Simulation experiments show that the algorithm effectively improves image coding efficiency and is feasible.
18.
The characteristics of hyperspectral remote-sensing images are analysed, and, given their special requirements on compression algorithms, a bit-plane-based lossless compression algorithm is proposed. Highly correlated high-order bit planes are decorrelated directly by computing the plane's difference matrix, while weakly correlated low-order bit planes are reorganized by quadtree partitioning into blocks described by size, position and grey-level information, yielding a hybrid coding of the image. Experimental results show that the algorithm achieves compression ratios comparable to other common lossless algorithms while cutting compression time by about 50%. The algorithm is simple and practical, suitable for hyperspectral remote-sensing image compression with real-time requirements.
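A hedged sketch of the bit-plane split and a horizontal XOR "difference matrix" of the kind used to decorrelate the smooth high-order planes; the quadtree path for the low-order planes is not reproduced here:

```python
def bit_planes(img, depth=8):
    """Split an image (list of rows of ints) into `depth` binary planes, MSB first."""
    return [[[(px >> b) & 1 for px in row] for row in img]
            for b in range(depth - 1, -1, -1)]

def plane_difference(plane):
    """Horizontal XOR differences: smooth planes become mostly zeros."""
    return [[row[0]] + [row[j] ^ row[j - 1] for j in range(1, len(row))]
            for row in plane]
```

On a high-order plane of a smooth image, long runs of identical bits XOR to zero, so the difference matrix is sparse and cheap to code.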
19.
Considering that the compression efficiency of the highly efficient still-image compression algorithm SPIHT (Set Partitioning in Hierarchical Trees) can be improved further, an improved SPIHT algorithm is proposed: on top of the original algorithm, a new tree type is introduced, and the wavelet-transform coefficients are preserved to the greatest possible extent at initialization. Applying the improved SPIHT algorithm to medical image compression achieves good compression results.
20.
This paper describes a color image compression technique based on block truncation coding using pattern fitting (BTC-PF). The high degree of correlation between the RGB planes of a color image is reduced by transforming them to O1O2O3 planes. Each Oi plane (1 ≤ i ≤ 3) is then encoded using the BTC-PF method; the size of the pattern book and the block size are selected based on the information content of the corresponding plane. The result of the proposed method is compared with that of several BTC-based methods and the former is found superior. Though this method is a spatial-domain technique, it is also compared with JPEG compression, one of the most popular frequency-domain techniques. The performance of the proposed method is found to be slightly inferior to that of JPEG in terms of reconstructed image quality. Decoding time is another important criterion when the compressed image is decoded frequently for various purposes; as the proposed method requires negligible decoding time compared to JPEG, the former is preferred over the latter in those cases.
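A minimal sketch of the two-level quantization at the heart of any BTC variant (an AMBTC-style mean split; the pattern-fitting stage and the O1O2O3 transform of BTC-PF are not reproduced here):

```python
def btc_block(block):
    """Two-level quantization of a flat list of pixels: returns
    (low, high, bitmap), the per-block data BTC actually stores."""
    n = len(block)
    mean = sum(block) / n
    highs = [p for p in block if p >= mean]
    lows = [p for p in block if p < mean]
    high = round(sum(highs) / len(highs))               # mean of bright side
    low = round(sum(lows) / len(lows)) if lows else high  # mean of dark side
    bitmap = [1 if p >= mean else 0 for p in block]
    return low, high, bitmap

def btc_reconstruct(low, high, bitmap):
    # Each pixel is replaced by its side's representative level.
    return [high if b else low for b in bitmap]
```

Storing only two levels plus a one-bit-per-pixel map is what gives BTC its fixed low rate and near-instant decoding, the property the comparison with JPEG turns on.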