Similar Articles
20 similar articles found
1.
2.
A new algorithm for high-frequency subband error concealment in wavelet-based picture coding is presented. It is based on a wavelet patch-repetition approach: the LBG algorithm of Linde et al. is used to generate a codebook of patches, and, according to a boundary distance measure, one of these patches is selected to mask the damaged area. Experiments show noteworthy results.
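A minimal sketch of the patch-selection step described above, assuming the codebook has already been trained (e.g. with LBG/k-means clustering of intact wavelet patches); the boundary distance is taken here to be a sum of squared differences over the known border samples, which is only one plausible reading of the measure used in the paper, and all names are illustrative.

```python
import numpy as np

def select_concealment_patch(codebook, boundary, mask):
    """Pick the codebook patch whose border best matches the known boundary.

    codebook : (K, h, w) array of candidate wavelet patches
    boundary : (h, w) array holding the received neighbourhood of the loss
    mask     : (h, w) boolean array, True where boundary samples are known
    """
    best_idx, best_dist = 0, np.inf
    for k, patch in enumerate(codebook):
        # Boundary distance: SSD restricted to the known border samples.
        dist = np.sum((patch[mask] - boundary[mask]) ** 2)
        if dist < best_dist:
            best_idx, best_dist = k, dist
    return codebook[best_idx]

# Usage: conceal a lost 8x8 high-frequency block with the best-matching patch.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(64, 8, 8))      # stand-in for an LBG-trained codebook
received = rng.normal(size=(8, 8))
known = np.zeros((8, 8), dtype=bool)
known[0, :] = known[-1, :] = known[:, 0] = known[:, -1] = True  # only the border survived
patch = select_concealment_patch(codebook, received, known)
```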

3.
4.
Real-time rate control for wavelet image coding requires characterizing the rate needed to code quantized wavelet data. An ideal, robust solution can be used with any wavelet coder and any quantization scheme. A large number of wavelet quantization schemes (perceptual and otherwise) are based on scalar dead-zone quantization of wavelet coefficients. A key to performing rate control is therefore a fast, accurate characterization of the relationship between rate and quantization step size, the R-Q curve. A solution is presented that uses two invocations of the coder to estimate the slope of each R-Q curve via probability modeling. The method is robust to the choice of probability model, quantization scheme, and wavelet coder. Because of this extreme robustness to probability modeling, a fast approximation to spatially adaptive probability modeling can be used in the solution as well. With respect to achieving a target rate, the proposed approach and its associated fast approximation yield average percentage errors of around 0.5% and 1.0% on images in the test set. By comparison, two-coding-pass rho-domain modeling yields errors of around 2.0%, and post-compression rate-distortion optimization yields average errors of around 1.0% at rates below 0.5 bits per pixel (bpp), decreasing to about 0.5% at 1.0 bpp; both methods perform more competitively on the larger images. The proposed method and its fast approximation are also similar in speed to the other state-of-the-art methods. In addition to being fast and accurate, the proposed method requires no training and maintains precise control over wavelet step sizes, which adds flexibility to a wavelet-based image-coding system.
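The paper's two-pass, probability-model-based slope estimation is not reproduced here; the sketch below only illustrates the underlying R-Q relationship being characterized, by measuring the first-order entropy of dead-zone scalar-quantized coefficients and bisecting the step size toward a target rate. The Laplacian test data and all function names are illustrative assumptions.

```python
import numpy as np

def deadzone_quantize(coeffs, step):
    """Scalar dead-zone quantization: indices are 0 inside the dead zone."""
    return np.sign(coeffs) * np.floor(np.abs(coeffs) / step)

def entropy_rate(indices):
    """First-order entropy in bits per coefficient, a proxy for coded rate."""
    _, counts = np.unique(indices, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def step_for_target_rate(coeffs, target_bpc, lo=0.01, hi=100.0, iters=30):
    """Bisect the quantization step so the modeled rate hits the target.

    Rate decreases monotonically as the step grows, so bisection is valid.
    """
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if entropy_rate(deadzone_quantize(coeffs, mid)) > target_bpc:
            lo = mid          # rate too high -> need a coarser step
        else:
            hi = mid          # rate too low  -> need a finer step
    return 0.5 * (lo + hi)

# Usage: Laplacian-like stand-in for a wavelet subband, target of 0.5 bits/coefficient.
rng = np.random.default_rng(1)
subband = rng.laplace(scale=10.0, size=100_000)
step = step_for_target_rate(subband, target_bpc=0.5)
```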

5.
6.
7.
8.
9.
10.
Wavelet image decompositions generate a tree-structured set of coefficients, providing a hierarchical data structure for representing images. A class of recently proposed image compression algorithms has focused on new ways of exploiting dependencies within this hierarchy of wavelet coefficients using "zero-tree" data structures. This paper presents a new framework for understanding the efficiency of one specific algorithm in this class, which we introduced previously and dubbed the space-frequency quantization (SFQ)-based coder. It describes, at a higher level, how the SFQ-based image coder of our earlier work can be construed as a simplified attempt to design a global entropy-constrained vector quantizer (ECVQ) with two noteworthy features: (i) it uses an image-sized codebook dimension (departing from conventional small-dimensional codebooks applied to small image blocks); and (ii) it uses an on-line, image-adaptive application of constrained ECVQ (which typically uses off-line training data in its codebook design phase). The principal insight offered by the new framework is that improved performance is achieved by more accurately characterizing the joint probabilities of arbitrary sets of wavelet coefficients. We also present an empirical statistical study of the distribution of the wavelet coefficients of the high-frequency bands, which are responsible for most of the performance gain of the new class of algorithms. This study verifies that the improved performance achieved by algorithms like the SFQ-based coder can be attributed to their being designed around one conveniently structured and efficient collection of such sets, namely, the zero-tree data structure. The results of this study further inspire the design of alternative, novel data structures based on nonlinear morphological operators.
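A minimal sketch of the zero-tree notion the paper builds on: a coefficient and all of its descendants across finer scales are jointly declared insignificant when every one of them falls below a threshold. The dyadic parent-child indexing below is a common convention assumed for illustration; this is not the SFQ coder itself.

```python
import numpy as np

def children(level, r, c, num_levels):
    """Four children of coefficient (r, c) at the next finer level, if any."""
    if level + 1 >= num_levels:
        return []
    return [(level + 1, 2 * r + dr, 2 * c + dc) for dr in (0, 1) for dc in (0, 1)]

def is_zerotree(subbands, level, r, c, threshold, num_levels):
    """True if the coefficient and all of its descendants are insignificant."""
    if abs(subbands[level][r, c]) >= threshold:
        return False
    return all(is_zerotree(subbands, l, rr, cc, threshold, num_levels)
               for l, rr, cc in children(level, r, c, num_levels))

# Usage: three toy "levels" of one orientation band, coarse to fine.
rng = np.random.default_rng(2)
subbands = [rng.normal(scale=s, size=(4 * 2**i, 4 * 2**i))
            for i, s in enumerate((8.0, 4.0, 2.0))]
root_is_zerotree = is_zerotree(subbands, 0, 0, 0, threshold=16.0, num_levels=3)
```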

11.
12.
Recently, the wavelet-based contourlet transform (WBCT) has been adopted for image coding because it better matches image textures of different orientations. However, its computational complexity is very high. In this paper, we propose three tools to enhance the WBCT coding scheme, in particular by reducing its computational complexity. First, we propose short-length 2-D filters for the directional transform. Second, the directional transform is applied to only a few selected subbands, with the selection made by a mean-shift-based decision procedure. Third, we fine-tune the context tables used by the arithmetic coder in WBCT coding to improve coding efficiency and reduce computation. Simulations show that, at comparable coded image quality, the proposed scheme saves over 92% of the computing time of the original WBCT scheme. Compared to conventional 2-D wavelet coding schemes, it produces clearly better subjective image quality.
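A hedged sketch of the subband-selection idea only: subbands with enough directional structure receive the extra directional transform, the rest are left alone. The mean-shift decision procedure of the paper is replaced here by a simple directional energy-ratio test, so this illustrates the control flow rather than the actual WBCT criterion.

```python
import numpy as np

def directional_energy_ratio(subband):
    """Ratio of horizontal- to vertical-difference energy, a crude orientation cue."""
    eh = np.sum(np.diff(subband, axis=1) ** 2)   # energy of horizontal variation
    ev = np.sum(np.diff(subband, axis=0) ** 2)   # energy of vertical variation
    return max(eh, ev) / (min(eh, ev) + 1e-12)

def select_subbands_for_directional_transform(subbands, ratio_threshold=1.5):
    """Return indices of subbands worth the extra directional filtering cost."""
    return [i for i, sb in enumerate(subbands)
            if directional_energy_ratio(sb) > ratio_threshold]

# Usage: one strongly oriented subband and one isotropic-noise subband.
rng = np.random.default_rng(3)
oriented = np.tile(np.sin(np.linspace(0, 8 * np.pi, 64)), (64, 1))
noise = rng.normal(size=(64, 64))
selected = select_subbands_for_directional_transform([oriented, noise])  # -> [0]
```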

13.
14.
15.
Locally adaptive wavelet-based image interpolation.
We describe a spatially adaptive algorithm for image interpolation. The algorithm uses a wavelet transform to extract information about sharp variations in the low-resolution image and then implicitly applies an interpolation that adapts to the local smoothness/singularity characteristics of the image. The proposed algorithm yields sharper images than several other methods considered in this paper. The better performance comes at the expense of higher complexity.
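A toy 1-D sketch of the adapt-to-local-smoothness idea: a Haar-like detail signal flags sharp variations, and the interpolator switches from a linear estimate in smooth regions to copying the nearer sample across an edge so that edges are not smeared. This is an illustrative stand-in, not the wavelet-domain algorithm of the paper.

```python
import numpy as np

def adaptive_upsample_1d(signal, edge_threshold=0.5):
    """Double the resolution of a 1-D signal, interpolating adaptively.

    A Haar-like detail (difference of neighbours) marks sharp variations;
    midpoints in smooth regions use linear interpolation, while midpoints
    across an edge copy the left sample so the edge is not smeared.
    """
    detail = np.abs(np.diff(signal))               # crude high-pass / edge detector
    out = np.empty(2 * len(signal) - 1)
    out[0::2] = signal                             # keep the original samples
    smooth_mid = 0.5 * (signal[:-1] + signal[1:])  # linear estimate
    sharp_mid = signal[:-1]                        # copy across an edge
    out[1::2] = np.where(detail < edge_threshold, smooth_mid, sharp_mid)
    return out

# Usage: a step edge stays a step instead of being averaged into a ramp.
x = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
y = adaptive_upsample_1d(x)
```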

16.
17.
18.
19.
20.
A multiple-description error-resilient compression method for wavelet images based on zero-block coding
This paper proposes a balanced multiple-description error-resilient compression method for wavelet images, the MD-EZBC algorithm. The method compresses an image into a set of descriptions that can be transmitted and decoded independently, so the transmission bandwidth of unreliable channels can be fully exploited without any priority support. Experiments show that the method not only achieves good compression efficiency, but also spreads the image noise caused by transmission errors evenly over the whole spatial domain, yielding stable image quality.
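A minimal sketch of the multiple-description idea in this abstract: coefficients are split into independently decodable descriptions, and a receiver holding only a subset still reconstructs every sample position, with the loss spread over the whole array. The polyphase split and the crude concealment below are illustrative assumptions; the actual MD-EZBC description formation is not reproduced.

```python
import numpy as np

def split_descriptions(coeffs):
    """Split a coefficient array into four polyphase descriptions."""
    return {(i, j): coeffs[i::2, j::2] for i in (0, 1) for j in (0, 1)}

def merge_descriptions(descriptions, shape):
    """Rebuild the array; a missing description is filled from a received one."""
    out = np.zeros(shape)
    received = [k for k in descriptions if descriptions[k] is not None]
    for (i, j) in ((0, 0), (0, 1), (1, 0), (1, 1)):
        src = (i, j) if (i, j) in received else received[0]   # crude concealment
        out[i::2, j::2] = descriptions[src]
    return out

# Usage: lose one of four descriptions and still fill every sample position.
rng = np.random.default_rng(4)
coeffs = rng.normal(size=(8, 8))
desc = split_descriptions(coeffs)
desc[(1, 1)] = None                      # description lost on an unreliable channel
recon = merge_descriptions(desc, coeffs.shape)
```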
