20 similar documents found (search time: 15 ms)
1.
Zamir R. 《IEEE transactions on information theory / Professional Technical Group on Information Theory》2002,48(2):523-528
Entropy coding is a well-known technique to reduce the rate of a quantizer. It plays a particularly important role in universal quantization, where the quantizer codebook is not matched to the source statistics. We investigate the gain due to entropy coding by considering the entropy of the index of the first codeword, in a mismatched random codebook, that D-matches the source word. We show that the index entropy is strictly lower than the "uncoded" rate of the code, provided that the entropy is conditioned on the codebook. The number of bits saved by conditional entropy coding is equal to the divergence between the "favorite type" (the limiting empirical distribution of the first D-matching codeword) and the codebook-generating distribution. Specific examples are provided.
2.
《IEEE transactions on information theory / Professional Technical Group on Information Theory》1978,24(3):331-338
It is known that the expected codeword length $L_{UD}$ of the best uniquely decodable (UD) code satisfies $H(X) \leq L_{UD}$. Let $X$ be a random variable which can take on $n$ values. Then it is shown that the average codeword length $L_{1:1}$ for the best one-to-one (not necessarily uniquely decodable) code for $X$ is shorter than the average codeword length $L_{UD}$ for the best uniquely decodable code by no more than $(\log_2 \log_2 n) + 3$. Let $Y$ be a random variable taking on a finite or countable number of values and having entropy $H$. Then it is proved that $L_{1:1} \geq H - \log_2(H+1) - \log_2 \log_2(H+1) - \cdots - 6$. Some relations are established among the Kolmogorov, Chaitin, and extension complexities. Finally it is shown that, for all computable probability distributions, the universal prefix codes associated with the conditional Chaitin complexity have expected codeword length within a constant of the Shannon entropy.
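A minimal numeric sketch of the gap quoted above, assuming a Huffman code as the optimal uniquely decodable code and the greedy shortest-string assignment as the optimal one-to-one code (both standard constructions, not taken from the paper itself):

```python
import heapq
import itertools
import math

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def huffman_expected_length(p):
    """Expected codeword length of an optimal uniquely decodable (Huffman) code.
    Each merge of two subtrees adds one bit to every symbol underneath, so the
    expected length is the sum of all merged probabilities."""
    ids = itertools.count()
    heap = [(pi, next(ids)) for pi in p]
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        total += p1 + p2
        heapq.heappush(heap, (p1 + p2, next(ids)))
    return total

def one_to_one_expected_length(p):
    """Best one-to-one binary code: the i-th most probable value gets the i-th
    shortest binary string ('', '0', '1', '00', ...), of length floor(log2 i)."""
    return sum(pi * math.floor(math.log2(i))
               for i, pi in enumerate(sorted(p, reverse=True), start=1))

# Example: a geometric-like distribution on n = 16 values.
n = 16
weights = [2.0 ** -k for k in range(1, n + 1)]
p = [w / sum(weights) for w in weights]

H, L_ud, L_11 = entropy(p), huffman_expected_length(p), one_to_one_expected_length(p)
print(f"H(X) = {H:.3f}, L_UD = {L_ud:.3f}, L_1:1 = {L_11:.3f}")
print(f"L_UD - L_1:1 = {L_ud - L_11:.3f} <= log2 log2 n + 3 = {math.log2(math.log2(n)) + 3:.3f}")
```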
3.
Falkowski B.J. Shixing Yan 《IEEE transactions on circuits and systems. I, Regular papers》2006,53(5):1119-1129
The mutual relationships between Hadamard-Haar and Arithmetic transforms and their corresponding spectra in the form of matrix decomposition as layered vertical and horizontal Kronecker matrices are discussed here together with their proofs, fast algorithms, and computational costs. The new relations apply to an arbitrary dimension of the transform matrices and allow performing direct conversions between Arithmetic and Hadamard-Haar functions and their corresponding spectra. In addition, analysis of butterfly diagrams for these new relations is also introduced and it is shown that they are more efficient than the matrix decomposition method.
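The layered Kronecker structure and the direct spectrum-to-spectrum conversion can be sketched as follows; the 2x2 base matrices here are illustrative stand-ins, not the paper's exact Hadamard-Haar and Arithmetic definitions, so only the mechanics of the conversion carry over:

```python
import numpy as np

# Two illustrative 2x2 base matrices (assumptions for this sketch, not the
# paper's definitions): a Walsh-Hadamard base and a lower-triangular
# arithmetic-style base.
H1 = np.array([[1, 1],
               [1, -1]], dtype=float)
A1 = np.array([[1, 0],
               [-1, 1]], dtype=float)

def kron_power(base, n):
    """n-fold Kronecker power, the layered structure used for fast transforms."""
    out = np.array([[1.0]])
    for _ in range(n):
        out = np.kron(out, base)
    return out

n = 3                       # 3 variables -> 8x8 transform matrices
H = kron_power(H1, n)
A = kron_power(A1, n)

f = np.random.default_rng(0).integers(0, 2, size=2 ** n).astype(float)
spec_H = H @ f              # Hadamard-type spectrum
spec_A = A @ f              # arithmetic-type spectrum

# Direct spectrum-to-spectrum conversion: spec_A = (A H^-1) spec_H.
T = A @ np.linalg.inv(H)
assert np.allclose(T @ spec_H, spec_A)
print("conversion matrix A.H^-1:\n", T)
```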
4.
《IEEE transactions on information theory / Professional Technical Group on Information Theory》1973,19(4):533-536
We consider an $n$-dimensional vector space over $GF(q)$ which has a probability distribution defined on it. The sum of the probabilities over a proper $k$-dimensional subspace is compared to a sum over a coset of this subspace. The difference of these set probabilities is related to a sum of the Fourier transforms of the distribution over a subset of the domain of the transforms. We demonstrate the existence of a coset and both an upper and a lower bound on the difference associated with this coset. The bounds depend on the maximum and nonzero minimum of the transforms as defined on a special subset of the transform domain. Two examples from coding theory are presented. The first deals with a $q$-ary symmetric channel while the second is concerned with a binary compound channel.
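For the binary special case ($q = 2$, Walsh transform), the relation between a coset-probability difference and a sum of transform values can be checked numerically; this sketch uses an assumed $k = 1$ subspace in $GF(2)^3$ and is not the paper's general $GF(q)$ construction:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 3
points = list(itertools.product((0, 1), repeat=n))      # GF(2)^3
p = rng.random(2 ** n); p /= p.sum()                    # a random distribution
prob = dict(zip(points, p))

dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v)) % 2

# A proper k = 1 dimensional subspace V and its annihilator V_perp.
V = [(0, 0, 0), (1, 1, 0)]
V_perp = [w for w in points if all(dot(w, v) == 0 for v in V)]

def coset_prob(a):
    """Probability of the coset V + a, summed directly."""
    return sum(prob[tuple((vi + ai) % 2 for vi, ai in zip(v, a))] for v in V)

def walsh(w):
    """Fourier (Walsh) transform of p at frequency w."""
    return sum(prob[x] * (-1) ** dot(w, x) for x in points)

a = (0, 0, 1)                                            # a coset representative
lhs = coset_prob((0, 0, 0)) - coset_prob(a)
# The difference of set probabilities as a sum of transforms over the
# subset of characters w in V_perp with w.a = 1.
rhs = (2 / len(V_perp)) * sum(walsh(w) for w in V_perp if dot(w, a) == 1)
print(f"P(V) - P(V+a) = {lhs:.6f}, transform-side sum = {rhs:.6f}")
```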
5.
This paper proposes an approach in which the echo canceller (EC) operates as part of the LD-CELP codec, merging the EC function into the LD-CELP coder/decoder. The feasibility of implementing LD-CELP and EC together in a single system in real time on two TMS320C30 chips is discussed.
6.
Survey and comparative analysis of entropy and relative entropy thresholding techniques
Chang C.-I. Du Y. Wang J. Guo S.-M. Thouin P.D. 《Vision, Image and Signal Processing, IEE Proceedings -》2006,153(6):837-850
Entropy-based image thresholding has received considerable interest in recent years. Two types of entropy are generally used as thresholding criteria: Shannon's entropy and relative entropy, also known as the Kullback-Leibler information distance. The former measures uncertainty in an information source, with an optimal threshold obtained by maximising Shannon's entropy, whereas the latter measures the information discrepancy between two different sources, with an optimal threshold obtained by minimising relative entropy. Many thresholding methods have been developed for both criteria and reported in the literature. These two entropy-based thresholding criteria are investigated here and the relationship among entropy and relative entropy thresholding methods is explored. In particular, a survey and comparative analysis is conducted among several widely used methods, including Pun's and Kapur's maximum entropy, Kittler and Illingworth's minimum error thresholding, Pal and Pal's entropy thresholding and Chang et al.'s relative entropy thresholding methods. In order to assess these methods objectively, two measures, uniformity and shape, are used for performance evaluation.
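As a concrete instance of one of the surveyed criteria, here is a short sketch of Kapur-style maximum-entropy thresholding under its usual formulation (the histogram and all variable names are illustrative, not the paper's test data):

```python
import numpy as np

def kapur_threshold(hist):
    """Maximum-entropy (Kapur-style) threshold for a grayscale histogram:
    pick t maximising the sum of the entropies of the normalised histograms
    of the two classes {0..t} and {t+1..L-1}."""
    p = hist.astype(float) / hist.sum()
    cdf = np.cumsum(p)
    best_t, best_score = 0, -np.inf
    for t in range(1, len(p) - 1):
        w0, w1 = cdf[t], 1.0 - cdf[t]
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t + 1] / w0, p[t + 1:] / w1
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_score:
            best_score, best_t = h0 + h1, t
    return best_t

# Bimodal test histogram: two noisy modes around grey levels 60 and 180.
rng = np.random.default_rng(0)
levels = np.concatenate([rng.normal(60, 12, 4000), rng.normal(180, 15, 6000)])
hist, _ = np.histogram(np.clip(levels, 0, 255), bins=256, range=(0, 256))
print("maximum-entropy threshold:", kapur_threshold(hist))
```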
7.
Malone D. Sullivan W.G. 《IEEE transactions on information theory / Professional Technical Group on Information Theory》2004,50(3):525-526
We derive the moments of the guesswork, the number of attempts required to correctly guess the output of a random source, for a source determined by a Markov chain, via a large-deviations-type estimate. These moments are related to the Perron-Frobenius eigenvalue of the matrix formed by element-wise powers of the Markov chain's transition matrix.
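A hedged numerical sketch: the element-wise power and Perron-Frobenius eigenvalue mentioned in the abstract are computed directly, while the $(1+\rho)\log\lambda$ scaling is an assumption patterned on the i.i.d. (Arikan) case rather than the paper's exact statement:

```python
import numpy as np

def guesswork_exponent(P, rho):
    """Asymptotic growth-rate sketch for the rho-th moment of guesswork of a
    Markov source: take the element-wise power P_ij^(1/(1+rho)), compute its
    Perron-Frobenius eigenvalue, and scale by (1+rho).  The (1+rho)*log(lam)
    form is an assumption patterned on the i.i.d. case, not the paper's
    exact normalisation."""
    M = P ** (1.0 / (1.0 + rho))
    lam = np.max(np.abs(np.linalg.eigvals(M)))   # Perron-Frobenius eigenvalue
    return (1.0 + rho) * np.log2(lam)            # bits per source symbol

# A two-state Markov chain.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
for rho in (1, 2):
    print(f"rho = {rho}: moment growth exponent ~ {guesswork_exponent(P, rho):.4f} bits/symbol")
```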
8.
In recent years, UML has been widely applied to software analysis and design. However, owing to the complexity of software systems, inconsistencies between different diagrams in a UML model, particularly between dynamic views, are hard to avoid. A method is proposed for verifying the consistency between state diagrams and sequence diagrams in UML 2.0 models. First, XYZ/E is used to formally describe the state diagrams and translate them into the Promela input language; then, LTL is used to express the interactions in the sequence diagrams; finally, the model-checking tool Spin verifies model consistency by checking whether the Promela description of the state diagrams satisfies the LTL formulas.
9.
Dorrer C. Doerr C.R. Kang I. Ryf R. Leuthold J. Winzer P.J. 《Lightwave Technology, Journal of》2005,23(1):178-186
We demonstrate the characterization of optical sources with high sensitivity, high temporal resolution, and phase sensitivity using linear optical sampling. Eye diagrams and constellation diagrams are reconstructed using the interference of the source under test with a train of sampling pulses. This concept is implemented using a waveguide optical hybrid, which splits and recombines the sources and adjusts the phase between the recombined signals to provide optimal detection. This diagnostic is used to characterize on-off keyed (OOK) waveforms at rates up to 640 Gb/s and various phase-shift keyed (PSK) signals at 10 and 40 Gb/s.
10.
《IEEE transactions on information theory / Professional Technical Group on Information Theory》1985,31(5):589-593
A unified approach is given for constructing cross entropy and dissimilarity measures between probability distributions, based on a given entropy function or a diversity measure. Special properties of quadratic entropy introduced by Rao [7] are described. In particular it is shown that the square root of the Jensen difference (dissimilarity measure) arising out of a quadratic entropy provides a metric on a probability space. Several characterizations of quadratic entropy are obtained.
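A small check of the metric claim, using the simplest quadratic entropy (uniform pairwise weights, i.e. the Gini-Simpson index) rather than Rao's general kernel-weighted form:

```python
import numpy as np

def quadratic_entropy(p):
    """Quadratic entropy in its simplest form (Gini-Simpson index),
    i.e. all pairwise differences weighted equally: H(p) = 1 - sum p_i^2."""
    return 1.0 - np.sum(p ** 2)

def jensen_difference(p, q):
    """Dissimilarity J(p,q) = H((p+q)/2) - (H(p)+H(q))/2 built from H."""
    return quadratic_entropy((p + q) / 2) - 0.5 * (quadratic_entropy(p) + quadratic_entropy(q))

# Numerically check the claim that sqrt(J) behaves like a metric
# (triangle inequality) on random probability vectors.
rng = np.random.default_rng(0)
d = lambda a, b: np.sqrt(jensen_difference(a, b))
for _ in range(1000):
    p, q, r = (rng.dirichlet(np.ones(5)) for _ in range(3))
    assert d(p, r) <= d(p, q) + d(q, r) + 1e-12
print("triangle inequality held for sqrt(Jensen difference) on 1000 random triples")
```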
11.
Ziv J. Merhav N. 《IEEE transactions on information theory / Professional Technical Group on Information Theory》1993,39(4):1270-1279
A new notion of empirical informational divergence (relative entropy) between two individual sequences is introduced. If the two sequences are independent realizations of two finite-order, finite-alphabet, stationary Markov processes, the empirical relative entropy converges to the relative entropy almost surely. This empirical divergence is based on a version of the Lempel-Ziv data compression algorithm. A simple universal algorithm for classifying individual sequences into a finite number of classes, which is based on the empirical divergence, is introduced. The algorithm discriminates between the classes whenever they are distinguishable by some finite-memory classifier, for almost every given training set and almost any test sequence from these classes. It is universal in the sense that it is independent of the unknown sources.
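A simplified sketch of the classification idea: cross-parse the test sequence against each training sequence and pick the source it parses against most economically. The phrase count stands in for the paper's empirical divergence; the exact Ziv-Merhav normalisation is not reproduced here:

```python
import random

def cross_parse_count(z, x):
    """Number of phrases when z is sequentially parsed into the longest
    substrings that occur somewhere in x (a simplified Lempel-Ziv-style
    cross parsing; the paper's estimator uses an exact normalisation that
    is not reproduced here).  Unseen symbols count as length-1 phrases."""
    phrases, i = 0, 0
    while i < len(z):
        length = 1
        while i + length <= len(z) and z[i:i + length] in x:
            length += 1
        i += max(length - 1, 1)
        phrases += 1
    return phrases

def classify(test, training):
    """Assign the test sequence to the training source it parses against
    most economically (fewest phrases)."""
    return min(training, key=lambda label: cross_parse_count(test, training[label]))

random.seed(0)
# Two finite-memory binary sources: one bursty, one close to fair coin flips.
def bursty(n, stay=0.95):
    s, out = 0, []
    for _ in range(n):
        out.append(str(s))
        s = s if random.random() < stay else 1 - s
    return "".join(out)

fair = lambda n: "".join(random.choice("01") for _ in range(n))
training = {"bursty": bursty(20000), "fair": fair(20000)}
print(classify(bursty(2000), training), classify(fair(2000), training))
```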
12.
A recent paper has presented a method of state assignment based upon the maximisation of true (or false) minterms in the next-state functions. This method has the advantage of very elegant implementation. This letter seeks to interpret these results in the light of the concept of entropy. It is shown that there is some justification for using the maximisation of true (false) minterms for optimal state assignment and some indication is given of the limitations of the method.
13.
A multilayer hierarchical structure for an intelligent analysis system is described in this paper. Four levels (patients', measurement, Web-based, and interpreting) are used to collect massive amounts of clinical information and analyze it with both traditional and artificial-intelligence methods. To support this, a novel fuzzy pain demand (FPD) index, derived from the interval between boluses of patient-controlled analgesia (PCA), is designed and documented in a large-scale clinical survey. The FPD index is modeled with a fuzzy modeling algorithm to interpret the self-titration of drug delivery. The index was tested by offline analysis of data from 255 patients receiving intravenous PCA with morphine (1 mg/ml). We found that the FPD index can show the patients' dynamic demand and their past efforts to overcome postoperative pain. Moreover, it could be run as an online system that monitors patients' demand or intent to treat their pain, so that these factors could be entered into a patient's chart along with temperature, blood pressure, pulse, and respiration rates when medical practitioners check on the patients.
14.
Banihashemi A.H. Blake I.F. 《IEEE transactions on information theory / Professional Technical Group on Information Theory》1998,44(5):1829-1847
This paper presents results on trellis complexity and low-complexity trellis diagrams of lattices. We establish constructive upper bounds on the trellis complexity of lattices. These bounds both improve and generalize the similar results of Tarokh and Vardy (see ibid., vol. 43, p. 1294-1300, 1997). We also construct trellis diagrams with the minimum number of paths for some important lattices. Such trellises are called minimal. The constructed trellises, which are novel in many cases, can be employed to efficiently decode the lattices via the Viterbi algorithm. In particular, a general structure for minimal trellis diagrams of $D_n$ lattices is obtained. This structure corresponds to a new code formula for $D_n$. Moreover, we develop some important duality results which are used both in deriving the upper bounds and in finding the minimal trellises. All the discussions are based on a universal approach to the construction and analysis of trellis diagrams of lattices using their bases.
15.
Lin Yuan Kesavan H.K. 《IEEE transactions on systems, man and cybernetics. Part C, Applications and reviews》1998,28(3):488-491
Kapur et al. (1995) introduced the MinMax information measure, which is based on both maximum and minimum entropy. The major obstacle to using this measure in practice is the difficulty of finding the minimum entropy. An analytical expression has already been developed for calculating the minimum entropy when only the variance is specified. An analytical formula is obtained for calculating the minimum entropy when only the mean is specified, and numerical examples are given for illustration.
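The paper's closed form is not reproduced here, but the minimum can be checked by brute force: entropy is concave, so its minimum over the constraint polytope sits at an extreme point, and with only the normalisation and mean constraints an extreme point has at most two nonzero probabilities:

```python
import itertools
import math

def min_entropy_given_mean(xs, m):
    """Minimum Shannon entropy over all distributions on the support xs with
    mean m.  Entropy is concave, so the minimum sits at an extreme point of
    the constraint polytope, and with two linear constraints (normalisation
    and mean) an extreme point has at most two nonzero probabilities.  This
    brute-force search over support pairs is a check on any analytical
    formula, not the paper's closed form."""
    best = math.inf
    for xi, xj in itertools.combinations(xs, 2):
        if not (min(xi, xj) <= m <= max(xi, xj)):
            continue
        a = (xj - m) / (xj - xi)          # weight on xi so that a*xi + (1-a)*xj = m
        h = 0.0
        for w in (a, 1 - a):
            if w > 0:
                h -= w * math.log2(w)
        best = min(best, h)
    if m in xs:                            # a point mass achieves zero entropy
        best = 0.0
    return best

xs = [0, 1, 2, 3, 4]
for m in (1.0, 1.5, 2.0, 2.5):
    print(f"mean {m}: minimum entropy = {min_entropy_given_mean(xs, m):.4f} bits")
```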
16.
Maximum entropy and conditional probability
《IEEE transactions on information theory / Professional Technical Group on Information Theory》1981,27(4):483-489
It is well-known that maximum entropy distributions, subject to appropriate moment constraints, arise in physics and mathematics. In an attempt to find a physical reason for the appearance of maximum entropy distributions, the following theorem is offered. The conditional distribution of $X_1$ given the empirical observation $(1/n)\sum_{i=1}^{n} h(X_i) = \alpha$, where $X_1, X_2, \cdots$ are independent identically distributed random variables with common density $g$, converges to $f_{\lambda}(x) = e^{\lambda^{t} h(x)} g(x)$ (suitably normalized), where $\lambda$ is chosen to satisfy $\int f_{\lambda}(x) h(x)\,dx = \alpha$. Thus the conditional distribution of a given random variable $X$ is the (normalized) product of the maximum entropy distribution and the initial distribution. This distribution is the maximum entropy distribution when $g$ is uniform. The proof of this and related results relies heavily on the work of Zabell and Lanford.
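A discrete sketch of the tilted-distribution limit, with an illustrative support, initial distribution $g$, constraint $h$, and target $\alpha$ of my own choosing; $\lambda$ is found by root finding and the result is compared with an empirical conditional histogram:

```python
import numpy as np
from scipy.optimize import brentq

# Discrete sketch of f_lambda(x) proportional to exp(lambda*h(x)) g(x): the
# support, g, h and alpha below are illustrative choices, not from the paper.
xs = np.arange(1, 7)                       # faces of a die
g = np.full(6, 1 / 6)                      # initial (uniform) distribution
h = xs.astype(float)                       # constrain the empirical mean of X
alpha = 4.2                                # observed average, above the fair-die mean 3.5

def tilted(lam):
    w = np.exp(lam * h) * g
    return w / w.sum()

lam = brentq(lambda l: tilted(l) @ h - alpha, -10, 10)
f_lambda = tilted(lam)

# Monte-Carlo comparison: condition i.i.d. draws from g on their sample mean
# being (essentially) alpha, then look at the marginal of a single draw.
rng = np.random.default_rng(0)
n, trials = 20, 200_000
samples = rng.choice(xs, size=(trials, n), p=g)
kept = samples[np.abs(samples.mean(axis=1) - alpha) < 0.03]
empirical = np.bincount(kept.ravel(), minlength=7)[1:] / kept.size
print("lambda =", round(lam, 4))
print("tilted f_lambda :", np.round(f_lambda, 3))
print("conditional hist:", np.round(empirical, 3))
```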
17.
Encoding and retrieval of information in maps and diagrams
18.
To evaluate and predict the intelligibility of Chinese speech in rooms with an objective metric, subjective Chinese speech intelligibility experiments were carried out in rooms to examine the correlation between the objective useful-to-detrimental sound energy ratio U50 and Chinese speech intelligibility, and a regression equation for predicting the latter from the former was obtained. The study shows that Chinese speech intelligibility (SI) is strongly correlated with the objective metric U50, so U50 can be used to evaluate Chinese speech intelligibility effectively.
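Assuming the usual definition of U50 (early energy within 50 ms of the direct sound over later-arriving energy plus background-noise energy), a sketch of how the objective metric could be computed from a room impulse response; the synthetic response and any regression coefficients are placeholders, not the paper's measured data:

```python
import numpy as np

def u50(impulse_response, fs, noise_energy=0.0, split_ms=50.0):
    """Useful-to-detrimental sound energy ratio U50 (in dB): energy arriving
    within the first 50 ms of the direct sound over the later-arriving energy
    plus background-noise energy.  This is the usual textbook definition; the
    paper's measurement details (bands, noise normalisation) are not modelled."""
    h2 = impulse_response ** 2
    onset = int(np.argmax(np.abs(impulse_response)))        # direct-sound arrival
    split = onset + int(fs * split_ms / 1000.0)
    early, late = h2[onset:split].sum(), h2[split:].sum()
    return 10.0 * np.log10(early / (late + noise_energy))

# Synthetic exponentially decaying impulse response (~1 s reverberation time).
fs, rt60 = 16000, 1.0
t = np.arange(int(fs * 1.5)) / fs
rng = np.random.default_rng(0)
h = rng.standard_normal(t.size) * np.exp(-6.91 * t / rt60)

print(f"U50 = {u50(h, fs, noise_energy=0.01 * np.sum(h ** 2)):.1f} dB")
# A linear regression SI = a + b * U50 of the kind the abstract reports could
# then be fitted to paired subjective scores; a and b would come from the data.
```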
19.
We have developed a simple method for determining coincidence attenuation-correction factors C (the inverse of the total attenuation factors) from collimated singles (SPECT) and coincidence [positron emission tomography (PET)] projections without transmission data. Attenuation-correction factor estimates are determined for individual lines of response (LORs) independently. The required data can be acquired using a gamma-camera system with coincidence capabilities. A first-order approximation (R) of C for an LOR is given by the product of the singles count rates taken at each end of the LOR, divided by the square of the coincidence count rate. The method was tested using simulated singles and coincidence projections starting with emission and attenuation maps from patient PET scans. Noise and resolution effects were modeled in separate studies. In the noise-free, high-resolution simulations, a scatter plot of the C values versus the corresponding R values for all LORs produces a well-defined trajectory with little variance. Values of ln R were reconstructed into good-quality attenuation maps that compare favorably with the originals. We conclude that the method works well on ideal data. The introduction of noise results in degraded images. In a simulated patient study, lung and outer-body boundaries were visible in images produced with 3.2 × 10^4 coincidence counts.
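A toy one-LOR illustration of the first-order approximation described above; the emission strength and attenuation factors are made-up numbers, not the paper's simulation:

```python
import numpy as np

def first_order_correction(singles_1, singles_2, coincidences):
    """First-order approximation R of the coincidence attenuation-correction
    factor for one line of response (LOR): product of the singles count rates
    at the two ends divided by the square of the coincidence count rate."""
    return (singles_1 * singles_2) / coincidences ** 2

# Toy one-LOR model (an illustration, not the paper's simulation): a source of
# emission strength E sits on the LOR; a1 and a2 are the survival (attenuation)
# factors from the source to detectors 1 and 2.
E, a1, a2 = 1.0e4, 0.30, 0.55
S1, S2 = E * a1, E * a2              # collimated singles rates at each detector
coin = E * a1 * a2                   # coincidences need survival to both ends

R = first_order_correction(S1, S2, coin)
print(f"R = {R:.3f}  vs  true correction factor 1/(a1*a2) = {1 / (a1 * a2):.3f}")
```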
20.
《IEEE transactions on information theory / Professional Technical Group on Information Theory》1987,33(2):263-266
A new proof of Birkhoff's ergodic theorem is given using a sample path covering idea, an idea created by Ornstein and Weiss in their extension of the information convergence theorem to random fields. A sketch of the Ornstein-Weiss proof for processes is included in the Appendix.