1.
This paper analyzes the performance of concatenated coding systems operating over the binary-symmetric channel (BSC) by examining the loss of capacity resulting from each of the processing steps. The techniques described in this paper allow the separate evaluation of codes and decoders and thus the identification of where loss of capacity occurs. They are, moreover, very useful for the overall design of a communications system, e.g., for evaluating the benefits of inner decoders that produce side information. The first two sections of this paper provide a general technique (based on the coset weight distribution of a binary linear code) for calculating the composite capacity of the code and a BSC in isolation. The later sections examine the composite capacities of binary linear codes, the BSC, and various decoders. The composite capacities of the (8,4) extended Hamming, (24,12) extended Golay, and (48,24) quadratic residue codes appear as examples throughout the paper. The calculations in these examples show that, in a concatenated coding system, having an inner decoder provide more information than the maximum-likelihood (ML) estimate to an outer decoder is not a computationally efficient technique, unless generalized minimum-distance decoding of an outer code is extremely easy. Specifically, for the (8,4) extended Hamming and (24,12) extended Golay inner codes, the gains from using any inner decoder providing side information, instead of a strictly ML inner decoder, are shown to be no greater than 0.77 and 0.34 dB, respectively, for a BSC crossover probability of 0.1 or less. However, if computationally efficient generalized minimum-distance decoders for powerful outer codes, e.g., Reed-Solomon codes, become available, they will allow the use of simple inner codes, since both simple and complex inner codes have very similar capacity losses.
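The coset weight distribution mentioned in this abstract is easy to enumerate for small codes. The sketch below is illustrative only and is not taken from the paper; it assumes one standard generator matrix for the (8,4) extended Hamming code and simply lists the weight distribution of each of the 16 cosets, the raw ingredient that the paper's composite-capacity calculation then combines with the BSC crossover probability.

```python
# Minimal sketch (not the paper's capacity formula): enumerate the coset
# weight distribution of the (8,4) extended Hamming code.
# The generator matrix G below is one standard choice and is an assumption here.
from itertools import product
from collections import Counter

G = [
    (1, 0, 0, 0, 0, 1, 1, 1),
    (0, 1, 0, 0, 1, 0, 1, 1),
    (0, 0, 1, 0, 1, 1, 0, 1),
    (0, 0, 0, 1, 1, 1, 1, 0),
]
n = 8

def add(u, v):
    """Componentwise addition over GF(2)."""
    return tuple((a + b) % 2 for a, b in zip(u, v))

# All 2^4 = 16 codewords: every GF(2) combination of the rows of G.
codewords = set()
for bits in product((0, 1), repeat=len(G)):
    c = (0,) * n
    for bit, row in zip(bits, G):
        if bit:
            c = add(c, row)
    codewords.add(c)

# Partition F_2^8 into the 16 cosets v + C and record each coset's weight distribution.
seen = set()
coset_weights = []
for v in product((0, 1), repeat=n):
    if v in seen:
        continue
    coset = {add(v, c) for c in codewords}
    seen |= coset
    coset_weights.append(Counter(sum(word) for word in coset))

for i, dist in enumerate(coset_weights):
    print(f"coset {i:2d}: " + ", ".join(f"{w}:{cnt}" for w, cnt in sorted(dist.items())))
```

The coset containing the all-zero word is the code itself, with weight distribution 1, 14, 1 at weights 0, 4, 8.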
2.
This paper calculates new bounds on the size of the performance gap between random codes and the best possible codes. The first result shows that, for large block sizes, the ratio of the error probability of a random code to the sphere-packing lower bound on the error probability of every code on the binary symmetric channel (BSC) is small for a wide range of useful crossover probabilities. Thus, even far from capacity, random codes have nearly the same error performance as the best possible long codes. The paper also demonstrates that a small reduction k − k̃ in the number of information bits conveyed by a codeword will make the error performance of an (n,k̃) random code better than the sphere-packing lower bound for an (n,k) code, as long as the channel crossover probability is somewhat greater than a critical probability. For example, the sphere-packing lower bound for a long (n,k), rate-1/2 code will exceed the error probability of an (n,k̃) random code if k − k̃ > 10 and the crossover probability is between 0.035 and 0.11 = H⁻¹(1/2). Analogous results are presented for the binary erasure channel (BEC) and the additive white Gaussian noise (AWGN) channel. The paper also presents substantial numerical evaluation of the performance of random codes and existing standard lower bounds for the BEC, BSC, and AWGN channels. These last results provide a useful standard against which to measure many popular codes, including turbo codes; e.g., there exist turbo codes that perform within 0.6 dB of the bounds over a wide range of block lengths.
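The endpoint 0.11 = H⁻¹(1/2) quoted in this abstract can be checked numerically. The sketch below is illustrative only and is not from the paper; it inverts the binary entropy function by bisection and confirms that at this crossover probability the BSC capacity 1 − H(p) equals the rate 1/2 used in the example.

```python
# Minimal sketch: reproduce the figure 0.11 = H^{-1}(1/2) cited above and
# check that the BSC capacity 1 - H(p) equals 1/2 at that crossover probability.
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

def inverse_binary_entropy(h: float, tol: float = 1e-12) -> float:
    """Return p in [0, 1/2] with H(p) = h, by bisection (H is increasing there)."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if binary_entropy(mid) < h:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p_star = inverse_binary_entropy(0.5)
print(f"H^-1(1/2) ~= {p_star:.4f}")                              # ~= 0.1100
print(f"BSC capacity at p*: {1 - binary_entropy(p_star):.4f}")   # ~= 0.5000 = rate 1/2
```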