Similar Articles
16 similar articles were found.
1.
Context-Based Adaptive Binary Arithmetic Coding in H.264/AVC   (Cited by 1: 0 self-citations, 1 by others)
周名芬  陈磊 《电视技术》2004,(9):18-19,32
Context-based Adaptive Binary Arithmetic Coding (CABAC) is one of the high-efficiency entropy coding methods adopted in H.264/AVC; it consists of three steps: binarization, context modeling, and arithmetic coding. This paper describes the complete CABAC encoding process in detail and compares its coding performance with that of VLC/CAVLC.
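Since the abstract lists the three CABAC stages (binarization, context modeling, binary arithmetic coding), the following minimal sketch shows how they fit together, using a toy unary binarizer and a counter-based context model. The function names, the unary binarizer, and the capped context index are illustrative assumptions rather than the H.264/AVC definitions; the (bin, probability) pairs produced here would be fed to a binary arithmetic coder such as the one sketched under item 3 below.

```python
def unary_binarize(value):
    """Toy binarization: map a non-negative value to a bin string, e.g. 3 -> 1,1,1,0."""
    return [1] * value + [0]

class Context:
    """Adaptive probability estimate for one bin position, kept as simple counters."""
    def __init__(self):
        self.count = [1, 1]                  # Laplace-smoothed counts of 0-bins and 1-bins
    def p_zero(self):
        return self.count[0] / sum(self.count)
    def update(self, bin_val):
        self.count[bin_val] += 1

def encode_syntax_element(value, contexts):
    """Binarize, select a context per bin index, and return (bin, p_zero) pairs
    for a binary arithmetic coder to consume."""
    pairs = []
    for idx, b in enumerate(unary_binarize(value)):
        ctx = contexts.setdefault(min(idx, 3), Context())   # cap the context index at 3
        pairs.append((b, ctx.p_zero()))
        ctx.update(b)                                       # adapt after coding the bin
    return pairs
```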

2.
H.264/AVC is the new-generation video coding standard developed jointly by the International Telecommunication Union and the International Organization for Standardization. The standard specifies two entropy coding modes: Context-based Adaptive Binary Arithmetic Coding (CABAC) and Context-based Adaptive Variable-Length Coding (CAVLC). As a new entropy coding method, CABAC organically combines adaptive techniques, context modeling, and binary arithmetic coding to achieve high compression efficiency, and its framework introduces several novel techniques that make it easier to implement in both software and hardware. To verify CABAC's practical effect, the authors ran direct tests with the reference software; the experimental results show that, at the same picture quality, CABAC does save a considerable amount of average bit rate compared with CAVLC.

3.
Context-based Adaptive Binary Arithmetic Coding (CABAC) is a high-efficiency entropy coding method adopted in H.264/AVC. This paper outlines the basic principles of arithmetic coding and the steps of CABAC, and analyzes the binary arithmetic coding process in detail.
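To make the interval-subdivision and renormalization steps mentioned in the abstract concrete, here is a minimal integer binary arithmetic encoder in the classic low/high style with underflow (pending-bit) handling. The 16-bit precision and the externally supplied probability are illustrative choices and do not reproduce the H.264/AVC coding engine.

```python
PRECISION = 16
TOP = (1 << PRECISION) - 1      # 0xFFFF
HALF = 1 << (PRECISION - 1)     # 0x8000
QUARTER = 1 << (PRECISION - 2)  # 0x4000

class BinaryArithmeticEncoder:
    def __init__(self):
        self.low, self.high = 0, TOP
        self.pending = 0        # bits delayed by underflow (E3) expansions
        self.bits = []

    def _emit(self, bit):
        self.bits.append(bit)
        self.bits.extend([1 - bit] * self.pending)   # release delayed opposite bits
        self.pending = 0

    def encode(self, bin_val, p_zero):
        """Encode one bin given the estimated probability of a 0-bin (0 < p_zero < 1)."""
        span = self.high - self.low + 1
        split = self.low + max(1, int(span * p_zero)) - 1
        split = min(split, self.high - 1)            # keep both sub-intervals non-empty
        if bin_val == 0:
            self.high = split                        # take the lower sub-interval
        else:
            self.low = split + 1                     # take the upper sub-interval
        while True:                                  # renormalize
            if self.high < HALF:
                self._emit(0)
            elif self.low >= HALF:
                self._emit(1)
                self.low -= HALF
                self.high -= HALF
            elif self.low >= QUARTER and self.high < HALF + QUARTER:
                self.pending += 1                    # underflow: defer the bit
                self.low -= QUARTER
                self.high -= QUARTER
            else:
                break
            self.low = 2 * self.low
            self.high = 2 * self.high + 1

    def finish(self):
        """Terminate the interval and return the coded bit list."""
        self.pending += 1
        self._emit(0 if self.low < QUARTER else 1)
        return self.bits
```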

4.
董洁  楼剑  陆亮  虞露 《电视技术》2003,(8):9-11
CABAC is a high-efficiency entropy coding technique suited to video compression. It uses context modeling to reduce inter-symbol redundancy; it uses recursive binary arithmetic coding so that the information content of the output codewords approaches the entropy rate of the symbols, which also benefits real-time video coding; and it uses an adaptive mechanism to track the statistical characteristics of the video stream in real time. Experiments show that CABAC saves bit rate very effectively compared with UVLC, but at somewhat higher complexity.

5.
AVS+ is the new-generation video coding standard issued in China in 2012. AVS+ adopts two entropy coding methods: context-based adaptive variable-length coding (CAVLC) and context-based adaptive binary arithmetic coding (CABAC). The relative merits of the two schemes have already been compared for the H.264 standard; targeting AVS+ applications, this paper briefly analyzes the principles of the two algorithms and compares their characteristics. Tests show that CABAC takes somewhat longer to run but is more efficient than CAVLC.

6.
H.264 adopts context-based adaptive binary arithmetic coding (CABAC) in its Main profile. CABAC is a high-efficiency entropy coding method that strikes a balance between computational complexity and coding efficiency: it builds a table-lookup-based probability model and optimizes away the multiplication. This paper describes the CABAC encoding and decoding process, analyzes the renormalization operation and the selection of the ModelNumber, and compares its coding performance with CAVLC.
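Because the abstract highlights the table-lookup probability model, the removal of the multiplication, and renormalization, the sketch below shows that mechanism in isolation: the current range is quantized to two bits and the LPS sub-range is read from a precomputed table. The number of states, the assumed geometric probability sequence, the state transitions, and the omission of the low-register and outstanding-bit bookkeeping are all simplifications of this sketch, not the rangeTabLPS or transition tables of the standard.

```python
NUM_STATES = 8
# assumed geometric sequence of LPS probabilities, one per probability state
P_LPS = [0.5 * (0.8 ** s) for s in range(NUM_STATES)]
# precomputed LPS sub-ranges for the four quantized range intervals;
# a 9-bit range in [256, 511] is split into quarters indexed by (range >> 6) & 3
RANGE_TAB_LPS = [
    [int(p * (256 + 64 * q + 32)) for q in range(4)]   # evaluated at each quarter's midpoint
    for p in P_LPS
]

def encode_decision(state, mps, bin_val, rng):
    """Code one bin without a multiplication: look up the LPS sub-range, update the
    probability state, and renormalize the range. Returns the new (state, mps, range)
    plus the number of output bits the renormalization would produce."""
    r_lps = RANGE_TAB_LPS[state][(rng >> 6) & 3]
    if bin_val == mps:
        rng -= r_lps                              # MPS keeps the remaining part of the interval
        state = min(state + 1, NUM_STATES - 1)    # illustrative MPS transition
    else:
        rng = r_lps                               # LPS takes the looked-up sub-interval
        if state == 0:
            mps = 1 - mps                         # swap MPS at the weakest state
        state = max(state - 1, 0)                 # illustrative LPS transition
    bits_emitted = 0
    while rng < 256:                              # renormalize the 9-bit range to [256, 511]
        rng <<= 1
        bits_emitted += 1
    return state, mps, rng, bits_emitted
```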

7.
Entropy coding in the H.264 standard consists of Exponential-Golomb coding combined with either context-adaptive variable-length coding (CAVLC) or context-adaptive binary arithmetic coding (CABAC). This paper gives a comprehensive introduction to entropy coding in the H.264 video compression standard and to the basic principles of Exponential-Golomb codes, focuses on the hardware implementation of an Exponential-Golomb encoder, and presents the hardware synthesis results.
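The hardware design itself is not reproduced here, but a short sketch of the order-0 Exponential-Golomb code the paper builds on may help: a codeNum N is written as (len - 1) zeros followed by the binary form of N + 1, where len is the bit length of N + 1. The helper names are illustrative, and the signed se(v) mapping used by some syntax elements is omitted.

```python
def exp_golomb_encode(code_num):
    """Order-0 Exp-Golomb codeword for a non-negative integer, e.g. 3 -> '00100'."""
    x = code_num + 1
    return '0' * (x.bit_length() - 1) + format(x, 'b')   # zero prefix + binary of N+1

def exp_golomb_decode(bits):
    """Decode one order-0 codeword from the front of a bit string;
    returns (code_num, remaining_bits)."""
    zeros = 0
    while bits[zeros] == '0':                    # count the prefix zeros
        zeros += 1
    value = int(bits[zeros:2 * zeros + 1], 2)    # the next zeros+1 bits encode N+1
    return value - 1, bits[2 * zeros + 1:]
```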

8.
Analysis and Study of Binary Arithmetic Coding in H.264/AVC   (Cited by 1: 0 self-citations, 1 by others)
张杰  童胜 《电子科技》2003,(24):28-31
Arithmetic coding is a high-efficiency entropy coding method that has been widely applied in image and video coding. This paper outlines the basic principles of arithmetic coding, introduces practical arithmetic coding algorithms, analyzes in detail the adaptive binary arithmetic coding algorithm used in the CABAC of H.264/AVC, and tests its performance.
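Complementing the counter-based estimate sketched under item 1, the probability adaptation in adaptive binary arithmetic coders is often described as an exponential aging of the LPS probability: it decays after every most-probable symbol and jumps up after every least-probable symbol. The constants below are assumed round numbers for illustration, not the exact values behind the H.264/AVC state machine.

```python
ALPHA = 0.95        # assumed decay factor per coded bin
P_MAX = 0.5         # the LPS probability can never exceed one half
P_MIN = 0.01875     # assumed floor so the model never becomes deterministic

def update_lps_probability(p_lps, observed_lps):
    """One adaptation step of the least-probable-symbol probability estimate."""
    if observed_lps:
        p_lps = ALPHA * p_lps + (1.0 - ALPHA)   # LPS seen: raise the estimate
    else:
        p_lps = ALPHA * p_lps                   # MPS seen: decay the estimate
    return min(P_MAX, max(P_MIN, p_lps))
```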

9.
H.264/AVC is the latest video compression standard, put forward by the Joint Video Team (JVT) in May 2003. Context-based adaptive binary arithmetic coding (CABAC) is one of the most important tools by which H.264/AVC improves coding efficiency. This paper proposes a CABAC decoder architecture that greatly improves system performance and decoding speed and can decode high-definition bitstreams.

10.
Addressing the security of stereoscopic video, this paper proposes an entropy-coding-based encryption and information-hiding algorithm for stereoscopic video. First, based on the stereoscopic video coding structure, the physical mechanism of error drift is analyzed, and the frames of the left and right views to be encrypted and the frames to carry the hidden information are chosen according to the stereoscopic visual masking effect. Then, within context-based adaptive binary arithmetic coding (CABAC) entropy coding, equal-length codeword substitution is used to encrypt the stereoscopic video and hide the information. Experimental results show that after encryption and information hiding the video bitstream remains format-compliant, the bit rate is unchanged, and the perceptual quality of the video shows no obvious degradation, while the scheme has clear advantages in computational complexity and bit-rate increase.
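As a purely conceptual sketch of why equal-length codeword substitution leaves the bit rate untouched (this is not the paper's algorithm), consider bins that CABAC codes in bypass mode, such as sign bins: each occupies a fixed amount of the bitstream, so XOR-ing them with a keystream replaces every codeword with another codeword of the same length. The hash-based keystream below is a stand-in for illustration, not a recommended cipher.

```python
import hashlib

def keystream(key: bytes, n: int):
    """Toy keystream from counter-mode hashing; illustrative only."""
    out, counter = [], 0
    while len(out) < n:
        block = hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        out.extend((byte >> i) & 1 for byte in block for i in range(8))
        counter += 1
    return out[:n]

def scramble_bypass_bins(bins, key):
    """XOR fixed-length (bypass-coded) bins with the keystream; codeword lengths,
    and hence the bit rate, are unchanged. Applying the same function again with
    the same key restores the original bins."""
    return [b ^ k for b, k in zip(bins, keystream(key, len(bins)))]
```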

11.
彭芬 《山西电子技术》2007,(3):86-87,91
CABAC is the new entropy coding technique adopted in the new-generation video compression standard H.264/AVC; using it effectively improves coding efficiency and saves bit rate. This article introduces the principles of the arithmetic coding theory behind CABAC and the basic types of context models, and analyzes the CABAC encoding process in detail, taking the coding of the motion vector difference (MVD) as an example.
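For a concrete picture of the MVD case discussed in the article, the sketch below follows the commonly described CABAC binarization of a motion vector difference: a truncated-unary prefix up to a cutoff, a k-th order Exp-Golomb suffix for the part above the cutoff, and a trailing sign bin. The cutoff of 9, the order k = 3, and the sign convention are treated here as assumptions of the sketch rather than a restatement of the specification.

```python
CUTOFF = 9   # assumed truncated-unary cutoff for |MVD|
ORDER = 3    # assumed Exp-Golomb order for the suffix

def binarize_mvd(mvd):
    """Return the bin string for one motion vector difference component."""
    mag, bins = abs(mvd), []
    bins += [1] * min(mag, CUTOFF)           # truncated-unary prefix
    if mag < CUTOFF:
        bins.append(0)                       # terminating zero of the unary code
    else:
        v, k = mag - CUTOFF, ORDER           # k-th order Exp-Golomb suffix
        while v >= (1 << k):
            bins.append(1)
            v -= 1 << k
            k += 1
        bins.append(0)
        bins += [(v >> i) & 1 for i in range(k - 1, -1, -1)]   # k fixed bits of the remainder
    if mvd != 0:
        bins.append(0 if mvd > 0 else 1)     # sign bin (convention assumed)
    return bins
```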

12.
In this paper, two context-based entropy coding schemes for the AVS Part-2 video coding standard are presented. One is Context-based 2D Variable Length Coding (C2DVLC), a low-complexity entropy coding scheme for the AVS Part-2 Jizhun profile. C2DVLC uses multiple 2D-VLC tables to exploit the statistical features of DCT coefficients for higher coding efficiency. Exponential-Golomb codes are applied in C2DVLC to code the pairs of zero-coefficient run-lengths and non-zero coefficient values, lowering the storage requirement. The other is Context-based Binary Arithmetic Coding (CBAC), an enhanced entropy coding scheme for the AVS Part-2 Jiaqiang profile. CBAC utilizes all previously coded coefficient magnitudes in a DCT block for context modeling. This enables adaptive arithmetic coding to exploit the redundancy of the high-order Markov process in the DCT domain with only a few contexts. In addition, a context weighting technique is used to further improve CBAC's coding efficiency. Moreover, CBAC is designed to be compatible with C2DVLC in its coding elements, which simplifies implementation. The experimental results demonstrate that both C2DVLC and CBAC can achieve comparable or even slightly higher coding performance when compared with Context-Adaptive Variable Length Coding (CAVLC) in the H.264/AVC Baseline profile and Context-Based Adaptive Binary Arithmetic Coding (CABAC) in the H.264/AVC Main profile, respectively.
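To illustrate the run-level pairing that C2DVLC codes with Exponential-Golomb codewords, here is a toy version: zeros are counted into runs along the scan, and each (run, level) pair is written out. Coding the run and the sign-mapped level as two separate order-0 Exp-Golomb codewords is a simplification of this sketch; the actual scheme indexes the pair jointly through multiple context-dependent 2D-VLC tables.

```python
def exp_golomb(code_num):
    """Order-0 Exp-Golomb codeword as a bit string."""
    x = code_num + 1
    return '0' * (x.bit_length() - 1) + format(x, 'b')

def run_level_pairs(scanned_coeffs):
    """Collect (run-of-zeros, non-zero level) pairs from a scanned DCT block."""
    pairs, run = [], 0
    for c in scanned_coeffs:
        if c == 0:
            run += 1
        else:
            pairs.append((run, c))
            run = 0
    return pairs

def code_block(scanned_coeffs):
    """Toy block coder: one Exp-Golomb codeword for the run and one for the level."""
    bits = ''
    for run, level in run_level_pairs(scanned_coeffs):
        mapped = 2 * abs(level) - (1 if level > 0 else 0)   # signed-to-unsigned mapping
        bits += exp_golomb(run) + exp_golomb(mapped)
    return bits
```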

13.
This paper uses joint algorithm and architecture design to enable high coding efficiency in conjunction with high processing speed and low area cost. Specifically, it presents several optimizations that can be performed on Context-Adaptive Binary Arithmetic Coding (CABAC), a form of entropy coding used in H.264/AVC, to achieve the throughput necessary for real-time, low-power, high-definition video coding. The combination of syntax element partitions and interleaved entropy slices, referred to as Massively Parallel CABAC, increases the number of binary symbols that can be processed in a cycle. Subinterval reordering is used to reduce the cycle time required to process each binary symbol. Under common conditions using the JM12.0 software, Massively Parallel CABAC increases the bins per cycle by 2.7 to 32.8× at a cost of 0.25 to 6.84% coding loss compared with sequential single-slice H.264/AVC CABAC. It also provides a 2× reduction in area cost and reduces memory bandwidth. Subinterval reordering reduces the critical path delay by 14 to 22%, while modifications to context selection reduce the memory requirement by 67%. This work demonstrates that accounting for implementation cost during video coding algorithm design can enable higher processing speed and reduce hardware cost, while still delivering high coding efficiency in the next-generation video coding standard.

14.
The High Efficiency Video Coding (HEVC) video codec applies different techniques in order to achieve the high compression ratios and video quality that support real-time applications. One of the critical techniques in HEVC is Context-Adaptive Binary Arithmetic Coding (CABAC), a type of entropy coding. CABAC comes at the cost of increased computational complexity, especially for parallelizing and pipelining its constituent blocks: binarization, context modeling, and binary arithmetic encoding. The binarization (BZ) and de-binarization (DBZ) methods are important stages of the HEVC CABAC encoder and decoder, respectively. An important goal is therefore to achieve high throughput in hardware architectures for CABAC BZ and DBZ in order to support high-resolution applications. This work is the only one found in the recent literature that focuses on the design and implementation of a full BZ and a full DBZ compatible with both H.265 and H.264. Hardware architectures for the BZ and DBZ are designed and implemented in VHDL, targeting an FPGA Virtex-4 xc4vsx25-12ff668 board, and emulated with ModelSim. As a result, the BZ and DBZ implementations can process 2 bins/cycle for each syntax element when operated at 697.83 MHz and 789.26 MHz, respectively. The proposed designs exhibit an improved high throughput of 1395.66 Mbins/s for the BZ and 1578.52 Mbins/s for the DBZ. The obtained area efficiencies of the proposed BZ and DBZ are about 0.544 Mbins/s/slice and 0.606 Mbins/s/slice, respectively, which is better than many recent works.

15.
Since context-based adaptive binary arithmetic coding (CABAC), the entropy coding method in H.264/AVC, was originally designed for lossy video compression, it is not well suited to lossless video compression. Based on the fact that the statistics of residual data differ between lossy and lossless video compression, we propose an efficient differential pixel value coding method in CABAC for H.264/AVC lossless video compression. Considering the observed statistical properties of the differential pixel values in lossless coding, we modify the CABAC encoding mechanism with a newly designed binarization table and context-modeling method. Experimental results show that the proposed method achieves approximately 12% bit savings compared with the original CABAC method in the H.264/AVC standard.

16.
袁夫全  韩军 《中国有线电视》2006,(23):2298-2303
The H.264 video compression standard adopts the high-efficiency arithmetic coding scheme CABAC, which improves coding efficiency but also increases coding complexity. This paper analyzes the complexity of CABAC, identifies directions for its optimization, and proposes a low-complexity implementation algorithm. The CABAC software algorithm is optimized mainly in three areas: the context models, the CABAC coding of the significance map, and the CABAC coding of the macroblock type. The optimizations were verified in the JM reference software, and the experimental results show that the encoding complexity of CABAC is reduced by 40%.
