Similar Documents
20 similar documents found (search time: 31 ms).
1.
In this work, we give good concatenated code ensembles for the binary erasure channel (BEC). In particular, we consider repeat multiple-accumulate (RMA) code ensembles formed by the serial concatenation of a repetition code with multiple accumulators, and the hybrid concatenated code (HCC) ensembles recently introduced by Koller et al. (5th Int. Symp. on Turbo Codes & Rel. Topics, Lausanne, Switzerland) consisting of an outer multiple parallel concatenated code serially concatenated with an inner accumulator. We introduce stopping sets for iterative constituent-code-oriented decoding using maximum a posteriori erasure correction in the constituent codes. We then analyze the asymptotic stopping set distribution for RMA and HCC ensembles and show that their stopping distance hmin, defined as the size of the smallest nonempty stopping set, asymptotically grows linearly with the block length. Thus, these code ensembles are good for the BEC. It is shown that for RMA code ensembles, contrary to the asymptotic minimum distance dmin, whose growth rate coefficient increases with the number of accumulate codes, the hmin growth rate coefficient diminishes with the number of accumulators. We also consider random puncturing of RMA code ensembles and show that for sufficiently high code rates, the asymptotic hmin does not grow linearly with the block length, contrary to the asymptotic dmin, whose growth rate coefficient approaches the Gilbert-Varshamov bound as the rate increases. Finally, we give iterative decoding thresholds for the different code ensembles to compare their convergence properties.

2.
This paper is devoted to the finite-length analysis of turbo decoding over the binary erasure channel (BEC). The performance of iterative belief-propagation decoding of low-density parity-check (LDPC) codes over the BEC can be characterized in terms of stopping sets. We describe turbo decoding on the BEC, which is simpler than turbo decoding on other channels. We then adapt the concept of stopping sets to turbo decoding and state an exact condition for decoding failure: if turbo decoding is applied until either the transmitted codeword has been recovered or the decoder fails to progress further, then the set of erased positions remaining when the decoder stops is equal to the unique maximum-size turbo stopping set contained in the set of erased positions. Furthermore, we present some improvements of the basic turbo decoding algorithm on the BEC. The proposed improved turbo decoding algorithm has substantially better error performance, as illustrated by the given simulation results. Finally, we give an expression for the turbo stopping set size enumerating function under the uniform interleaver assumption, and an efficient algorithm to enumerate small-size turbo stopping sets for a particular interleaver. The solution is based on the algorithm proposed by Garello et al. in 2001 to compute an exhaustive list of all low-weight codewords in a turbo code.

3.
Weight hierarchies of extremal non-chain binary codes of dimension 4
The weight hierarchy of a linear [n,k;q] code C over GF(q) is the sequence (d1, d2, ..., dk), where dr is the smallest size of the support of an r-dimensional subcode of C. An [n,k;q] code is extremal non-chain if, for any r and s with 1 ⩽ r < s ⩽ k, there exist no subcodes D ⊂ E of C such that dim(D) = r, dim(E) = s, wS(D) = dr, and wS(E) = ds. The possible weight hierarchies of such binary codes of dimension 4 are determined.

4.
In this correspondence, we first investigate some analytical aspects of the recently proposed improved decoding algorithm for low-density parity-check (LDPC) codes over the binary erasure channel (BEC). We derive a necessary and sufficient condition for the improved decoding algorithm to successfully complete decoding when the decoder is allowed a predetermined number of guesses after standard message passing terminates at a stopping set. Furthermore, we present improved bounds on the number of bits that must be guessed for successful completion of the decoding process when a stopping set is encountered. Under suitable conditions, we derive a lower bound on the number of iterations to be performed for complete decoding of the stopping set. We then present a novel improved decoding algorithm for LDPC codes over the BEC. The proposed algorithm combines the observation that a considerable fraction of unsatisfied check nodes in the neighborhood of a stopping set have degree two with the concept of guessing bits, in order to perform simple and intuitive graph-theoretic manipulations on the Tanner graph. The proposed decoding algorithm has a complexity similar to that of previous improved decoding algorithms. Finally, we present simulation results for short-length codes over the BEC that demonstrate the superiority of our algorithm over previous improved decoding algorithms for a wide range of bit error rates.
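
As a rough, self-contained illustration of the kind of procedure discussed above (a sketch under our own assumptions, not the authors' algorithm), the following Python snippet runs ordinary peeling-style erasure decoding and, once it stalls at a stopping set, guesses the value of a bit attached to a check with two erased neighbours before continuing; all function names and the guessing policy are ours.

```python
import numpy as np

def peel(H, y):
    """Standard iterative (peeling) erasure decoding over the BEC.
    H: parity-check matrix as a 0/1 numpy array; y: received word as a list
    with None marking erased positions.  Resolves as many erasures as possible."""
    y = list(y)
    progress = True
    while progress:
        progress = False
        for row in H:
            erased = [j for j in np.flatnonzero(row) if y[j] is None]
            if len(erased) == 1:
                # a check with exactly one erased neighbour determines that bit by parity
                known_sum = sum(y[j] for j in np.flatnonzero(row) if y[j] is not None) % 2
                y[erased[0]] = known_sum
                progress = True
    return y

def peel_with_guessing(H, y, max_guesses=2):
    """If peeling stalls at a stopping set, guess a bit attached to a check with
    exactly two erased neighbours (such checks are typically plentiful) and resume.
    Illustrative only: a practical algorithm would also track and verify guesses."""
    y = peel(H, y)
    guesses = 0
    while any(b is None for b in y) and guesses < max_guesses:
        target = None
        for row in H:
            erased = [j for j in np.flatnonzero(row) if y[j] is None]
            if len(erased) == 2:
                target = erased[0]
                break
        if target is None:      # fall back to any erased position
            target = next(j for j, b in enumerate(y) if b is None)
        y[target] = 0           # arbitrary guess; 0 is used here for simplicity
        guesses += 1
        y = peel(H, y)
    return y
```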

5.
The Hamming weight hierarchy of a linear [n,k;q] code C over GF(q) is the sequence (d1, d2, ..., dk), where dr is the smallest support weight of an r-dimensional subcode of C. According to some new necessary conditions, the class VI Hamming weight hierarchies of q-ary linear codes of dimension 5 can be divided into six subclasses. Using the finite projective geometry method, the VI-2 subclass is studied and almost all weight hierarchies of the VI-2 subclass of q-ary linear codes of dimension 5 are determined.

6.
Bounds on the minimum support weights
The minimum support weight, dr(C), of a linear code C over GF(q) is the minimal size of the support of an r-dimensional subcode of C. A number of bounds on dr(C) are derived, generalizing the Plotkin bound and the Griesmer bound, as well as giving two new existential bounds. As the main result, it is shown that there exist codes of any given rate R whose ratio dr/d1 is lower bounded by a number ranging from (q^r - 1)/(q^r - q^(r-1)) to r, depending on R.
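
For very small codes, the minimum support weights dr(C) can be computed directly from the definition. The sketch below (illustrative only; the function names and the example generator matrix are our choices) enumerates r-dimensional subcodes of a small binary code by brute force and returns the smallest support size.

```python
import itertools
import numpy as np

def codewords(G):
    """All codewords of the binary code generated by G (k x n, entries 0/1)."""
    k, n = G.shape
    return [tuple(np.mod(np.array(m) @ G, 2)) for m in itertools.product([0, 1], repeat=k)]

def rank_gf2(rows):
    """Rank over GF(2) by Gaussian elimination."""
    M = [list(r) for r in rows]
    rank, ncols = 0, len(M[0])
    for col in range(ncols):
        pivot = next((i for i in range(rank, len(M)) if M[i][col]), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        for i in range(len(M)):
            if i != rank and M[i][col]:
                M[i] = [(a + b) % 2 for a, b in zip(M[i], M[rank])]
        rank += 1
    return rank

def min_support_weight(G, r):
    """d_r(C): smallest support size over all r-dimensional subcodes (brute force)."""
    cw = [c for c in codewords(G) if any(c)]
    best = None
    for basis in itertools.combinations(cw, r):
        if rank_gf2(basis) < r:
            continue  # the chosen codewords are not linearly independent
        # the support of the spanned subcode is the union of the basis supports
        support = {j for c in basis for j, bit in enumerate(c) if bit}
        best = len(support) if best is None else min(best, len(support))
    return best

# Example: the [7,4] Hamming code.  d_1 is its minimum distance (3), and
# d_4 = 7 since no coordinate of the code is identically zero.
G = np.array([[1,0,0,0,0,1,1],
              [0,1,0,0,1,0,1],
              [0,0,1,0,1,1,0],
              [0,0,0,1,1,1,1]])
print([min_support_weight(G, r) for r in range(1, 5)])
```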

7.
This article contains results on the generalized Hamming weights (GHW) of the Goethals and Preparata codes over Z4. We give an upper bound on the rth generalized Hamming weight dr(m,j) of the Goethals code Gm(j) of length 2^m over Z4, when m is odd. We also determine d3.5(m,j) exactly. The upper bound is shown to be tight up to r=3.5. Furthermore, we determine the rth generalized Hamming weight dr(m) for the Preparata code of length 2^m over Z4 when r=3.5 and r=4.

8.
We introduce the notion of the stopping redundancy hierarchy of a linear block code as a measure of the tradeoff between performance and complexity of iterative decoding for the binary erasure channel. We derive lower and upper bounds for the stopping redundancy hierarchy via Lovász's Local Lemma (LLL) and Bonferroni-type inequalities, and specialize them for codes with cyclic parity-check matrices. Based on the observed properties of parity-check matrices with good stopping redundancy characteristics, we develop a novel decoding technique, termed automorphism group decoding, that combines iterative message passing and permutation decoding. We also present bounds on the smallest number of permutations of an automorphism group decoder needed to correct any set of erasures up to a prescribed size. Simulation results demonstrate that for a large number of algebraic codes, the performance of the new decoding method is close to that of maximum-likelihood (ML) decoding.

9.
Stopping set distribution of LDPC code ensembles
Stopping sets determine the performance of low-density parity-check (LDPC) codes under iterative decoding over erasure channels. We derive several results on the asymptotic behavior of stopping sets in Tanner-graph ensembles, including the following: an expression for the normalized average stopping set distribution, yielding, in particular, a critical fraction of the block length above which codes have exponentially many stopping sets of that size; a relation between the degree distribution and the likely size of the smallest nonempty stopping set, showing that for a √(1 - λ'(0)ρ'(1)) fraction of codes with λ'(0)ρ'(1) < 1, and in particular for almost all codes with smallest variable degree > 2, the smallest nonempty stopping set is linear in the block length; and bounds on the average block error probability as a function of the erasure probability ε, showing in particular that for codes with lowest variable degree 2, if ε is below a certain threshold, the asymptotic average block error probability is 1 - √(1 - λ'(0)ρ'(1)ε).
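
As a quick numerical illustration of the quantities appearing above, using the usual edge-perspective degree distributions λ(x) = Σ λi x^(i-1) and ρ(x) = Σ ρi x^(i-1), the sketch below evaluates λ'(0)ρ'(1) and the asymptotic average block error expression 1 - √(1 - λ'(0)ρ'(1)ε); the particular degree distribution is made up for the example.

```python
def lambda_rho_product(lam, rho):
    """lam[i] / rho[i]: fraction of edges attached to degree-i variable / check
    nodes (edge perspective).  Returns lambda'(0) * rho'(1)."""
    lam_prime_0 = lam.get(2, 0.0)                            # lambda'(0) = lambda_2
    rho_prime_1 = sum((i - 1) * f for i, f in rho.items())   # rho'(1) = sum_i (i-1) rho_i
    return lam_prime_0 * rho_prime_1

# Made-up example ensemble with a small fraction of degree-2 variable nodes.
lam = {2: 0.1, 3: 0.9}     # edge-perspective variable degree distribution
rho = {6: 1.0}             # all check nodes of degree 6
p = lambda_rho_product(lam, rho)          # 0.1 * 5 = 0.5 < 1
eps = 0.3                                 # erasure probability (assumed below threshold)
print("lambda'(0)*rho'(1) =", p)
print("asymptotic average block error probability ~", 1 - (1 - p * eps) ** 0.5)
```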

10.
A New Iteration Stopping Algorithm for Turbo Decoding
To address the large decoding latency of iterative Turbo decoding, this paper proposes a new iteration stopping method that accelerates decoding. The method exploits the statistical properties of the log-likelihood ratios (LLRs) output by the two component decoders and is referred to as the LCB (LLR Characters Based) algorithm. Compared with existing classical stopping criteria, the new method effectively reduces the average number of decoding iterations at the same bit error rate performance, thereby speeding up Turbo decoding, and it can be applied to streaming media transmission systems.
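
Since the paper's exact LCB rule is not reproduced here, the following is only a generic sketch of a stopping criterion driven by LLR statistics of the two component decoders: stop iterating once their hard decisions agree and the mean LLR magnitude indicates sufficient reliability. The threshold value and the hypothetical siso_decode calls in the usage comment are our own placeholders.

```python
import numpy as np

def should_stop(llr_dec1, llr_dec2, min_mean_abs_llr=5.0):
    """Generic LLR-statistics stopping rule for iterative turbo decoding
    (illustrative only; this is NOT the LCB rule of the paper): stop when the
    two component decoders' hard decisions agree and their soft outputs are
    sufficiently reliable on average."""
    llr_dec1 = np.asarray(llr_dec1, dtype=float)
    llr_dec2 = np.asarray(llr_dec2, dtype=float)
    agree = np.array_equal(llr_dec1 < 0, llr_dec2 < 0)   # identical hard decisions
    reliable = min(np.mean(np.abs(llr_dec1)),
                   np.mean(np.abs(llr_dec2))) > min_mean_abs_llr
    return agree and reliable

# How such a rule would sit inside a turbo decoding loop (hypothetical names):
# for it in range(max_iterations):
#     llr1 = siso_decode_1(...)     # a-posteriori LLRs from component decoder 1
#     llr2 = siso_decode_2(...)     # a-posteriori LLRs from component decoder 2
#     if should_stop(llr1, llr2):
#         break
```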

11.
The design of low-density parity-check (LDPC) codes under hybrid iterative/maximum-likelihood decoding is addressed for the binary erasure channel (BEC). Specifically, we focus on generalized irregular repeat-accumulate (GeIRA) codes, which offer both efficient encoding and design flexibility. We show that properly designed GeIRA codes tightly approach the performance of an ideal maximum distance separable (MDS) code, even for short block sizes. For example, our (2048,1024) code reaches a codeword error rate of 10^-5 at channel erasure probability ε = 0.450, where an ideal (2048,1024) MDS code would reach the same error rate at ε = 0.453.
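
For reference, the "ideal MDS" benchmark mentioned above is easy to evaluate: an (n,k) MDS code recovers the codeword over the BEC if and only if at most n - k symbols are erased, so its codeword error rate is a binomial tail probability. The short sketch below computes it in log space (a standard calculation, not code from the paper).

```python
from math import lgamma, log, exp

def log_binom(n, k):
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def mds_block_error_rate(n, k, eps):
    """Codeword error rate of an ideal (n, k) MDS code on a BEC with erasure
    probability eps: decoding fails iff more than n - k symbols are erased.
    Summed in log space to avoid overflow for long codes."""
    total = 0.0
    for e in range(n - k + 1, n + 1):
        total += exp(log_binom(n, e) + e * log(eps) + (n - e) * log(1 - eps))
    return total

# The (2048, 1024) benchmark discussed above (close to 1e-5 per the abstract):
print(mds_block_error_rate(2048, 1024, 0.453))
```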

12.
The iterative decoding threshold of low-density parity-check (LDPC) codes over the binary erasure channel (BEC) fulfills an upper bound that depends only on the variable and check nodes with minimum distance 2. This bound is a consequence of the stability condition and is here referred to as the stability bound. In this paper, a stability bound over the BEC is developed for doubly-generalized LDPC codes, where variable and check nodes can be generic linear block codes, assuming maximum a posteriori erasure correction at each node. It is proved that also in this generalized context the bound depends only on the variable and check component codes with minimum distance 2. A condition, namely the derivative matching condition, is also developed under which the bound is achieved with equality. The stability bound points to single parity-check codes used as variable nodes as an appealing option for overcoming common problems created by generalized check nodes.
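
For orientation, in the classical (non-generalized) LDPC case the stability condition over the BEC gives the well-known bound on the iterative decoding threshold shown below (stated here as background, with the usual edge-perspective degree distributions λ, ρ); the paper's contribution is the extension of this type of bound to doubly-generalized LDPC ensembles.

```latex
% Stability bound for standard LDPC ensembles over the BEC:
\varepsilon^{\mathrm{BP}} \;\le\; \frac{1}{\lambda'(0)\,\rho'(1)}
```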

13.
Hammons et al. (see ibid., vol.40, p.301-19, 1994) showed that, when properly defined, the binary nonlinear Preparata code can be considered as the Gray map of a linear code over Z4, the so-called Preparata code over Z4. We consider the rth generalized Hamming weight dr(m) of the Preparata code of length 2^m over Z4. For any m⩾3, dr(m) is exactly determined for r=0.5, 1, 1.5, 2, 2.5 and 3.0. For a composite m, we give an upper bound on dr(m) using the lifting technique. For m=3, 4, 5, 6 and 8, the weight hierarchy is completely determined. In the case of m=7, the weight hierarchy is completely determined except for d4(7).

14.
We show how asymptotic estimates of powers of polynomials with nonnegative coefficients can be used in the analysis of low-density parity-check (LDPC) codes. In particular, we show how these estimates can be used to derive the asymptotic distance spectrum of both regular and irregular LDPC code ensembles. We then consider the binary erasure channel (BEC). Using these estimates we derive lower bounds on the error exponent, under iterative decoding, of LDPC codes used over the BEC. Both regular and irregular code structures are considered. These bounds are compared to the corresponding bounds when optimal (maximum-likelihood (ML)) decoding is applied.

15.
Mutual information transfer characteristics of soft-in/soft-out decoders are proposed as a tool to better understand the convergence behavior of iterative decoding schemes. The exchange of extrinsic information is visualized as a decoding trajectory in the extrinsic information transfer chart (EXIT chart). This allows the prediction of the turbo cliff position and the bit error rate after an arbitrary number of iterations. The influence of code memory and code polynomials, as well as of different constituent codes, on the convergence behavior is studied for parallel concatenated codes. A code search based on the EXIT chart technique has been performed, yielding new recursive systematic convolutional constituent codes exhibiting turbo cliffs at lower signal-to-noise ratios than attainable by previously known constituent codes.
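
A standard ingredient of EXIT-chart analysis is the Monte Carlo estimation of the mutual information between the transmitted bits and the decoder's (extrinsic) LLRs. The sketch below uses the common sample-average estimator I ≈ 1 - E[log2(1 + e^(-x·L))] with bipolar x = ±1; it is a generic building block under our own assumptions, not code from the paper.

```python
import numpy as np

def mutual_information(bits, llrs):
    """Estimate I(X; L) from samples, where bits are in {0, 1} and llrs are
    LLRs defined as log p(y | bit = 0) / p(y | bit = 1).  Uses the
    sample-average estimator I ~= 1 - E[ log2(1 + exp(-x * L)) ],
    with bipolar x = +1 for bit 0 and x = -1 for bit 1."""
    x = 1.0 - 2.0 * np.asarray(bits, dtype=float)      # 0 -> +1, 1 -> -1
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-x * np.asarray(llrs, dtype=float))))

# Quick check with channel LLRs of BPSK over an AWGN channel:
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 100_000)
sigma = 1.0                                            # noise standard deviation
y = (1.0 - 2.0 * bits) + sigma * rng.normal(size=bits.size)
llrs = 2.0 * y / sigma**2                              # true channel LLRs
print(mutual_information(bits, llrs))  # roughly the BI-AWGN capacity at this noise level
```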

16.
Efficient algorithms are derived for maximum likelihood (ML) soft-decision decoding of some binary self-dual codes. A family of easily decodable self-dual codes is derived by modifying a known F24, which has a weight distribution resembling that of the [24, 12, 8] Golay code G24. The ML decoding of F24 is accomplished by only 227 real additions, compared to 651 required for G24, yet the error rates of the two decoders are similar for moderate noise conditions.

17.
Message-passing iterative decoders for low-density parity-check (LDPC) block codes are known to be subject to decoding failures due to so-called pseudocodewords. These failures can cause the high signal-to-noise ratio (SNR) performance of message-passing iterative decoding to be worse than that predicted by the maximum-likelihood (ML) decoding union bound.

18.
Stopping sets, and in particular their numbers and sizes, play an important role in determining the performance of iterative decoders of linear codes over binary erasure channels. In the 2004 Shannon Lecture, McEliece presented an expression for the number of stopping sets of size three for a full-rank parity-check matrix of the Hamming code. In this correspondence, we derive an expression for the number of stopping sets of any given size for the same parity-check matrix.
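
The defining property used in such counts is easy to check directly: a set S of variable positions is a stopping set of H if no row of H has exactly one of its nonzero positions in S. For small codes the stopping sets of a given size can therefore be counted by brute force, as in the sketch below (the particular full-rank parity-check matrix of the [7,4] Hamming code is our example, not necessarily the matrix used in the correspondence).

```python
from itertools import combinations
import numpy as np

def is_stopping_set(H, S):
    """S is a stopping set of H iff no row of H has exactly one neighbour in S."""
    S = set(S)
    for row in H:
        if sum(1 for j in np.flatnonzero(row) if j in S) == 1:
            return False
    return True

def count_stopping_sets(H, size):
    """Count stopping sets of the given size by exhaustive enumeration."""
    n = H.shape[1]
    return sum(1 for S in combinations(range(n), size) if is_stopping_set(H, S))

# A full-rank parity-check matrix of the [7,4] Hamming code
# (its columns are the seven nonzero binary vectors of length 3):
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
print(count_stopping_sets(H, 3))   # number of size-3 stopping sets for this H
```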

19.
This correspondence studies the performance of the iterative decoding of low-density parity-check (LDPC) code ensembles that have linear typical minimum distance and stopping set size. We first obtain a lower bound on the achievable rates of these ensembles over memoryless binary-input output-symmetric channels. We improve this bound for the binary erasure channel. We also introduce a method to construct the codes meeting the lower bound for the binary erasure channel. Then, we give upper bounds on the rate of LDPC codes with linear minimum distance when their right degree distribution is fixed. We compare these bounds to the previously derived upper bounds on the rate when there is no restriction on the code ensemble.

20.
The correcting properties of concatenated codes with parallel decoding over an additive channel are investigated. The ith inner decoder's output is a codeword if the Euclidean distance between the received vector and some codeword is less than Δi, and an erasure otherwise. The outer decoders correct errors and erasures. The error-correcting capability, which is taken to be the minimum length of any noise vector that can cause an error, is obtained for a bank of z inner and outer decoders as a function of the thresholds used. The set of thresholds that maximizes the error-correcting capability is also found. It is shown that for a small number of branches, the error-correcting capability is nearly as large as that of any decoder.
