Similar Literature
20 similar articles found.
1.
The encoding of independent data symbols as a sequence of discrete amplitude, real variables with a given power spectrum is considered. The maximum rate of such an encoding is determined by the achievable entropy of the discrete sequence under the given constraints. An upper bound to this entropy is expressed in terms of the rate distortion function for a memoryless finite alphabet source and mean-square error distortion measure. A class of simple dc-free power spectra is considered in detail, and a method for constructing Markov sources with such spectra is derived. It is found that these sequences have greater entropies than most codes with similar spectra that have been suggested earlier, and that they often come close to the upper bound. When the constraint on the power spectrum is replaced by a constraint on the variance of the sum of the encoded symbols, a stronger upper bound to the rate of dc-free codes is obtained. Finally, the optimality of the binary biphase code and of the ternary bipolar code is settled.
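
The rate of such a code is the entropy rate of the underlying Markov source, H = -Σ_i π_i Σ_j P_ij log2 P_ij. A minimal sketch of computing it, with a hypothetical two-state transition matrix (not one of the paper's constructions):

    import numpy as np

    # Entropy rate of a stationary Markov source:
    # H = -sum_i pi_i * sum_j P[i,j] * log2 P[i,j]
    P = np.array([[0.9, 0.1],          # hypothetical transition matrix,
                  [0.5, 0.5]])         # not taken from the paper

    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
    pi /= pi.sum()

    H = -sum(pi[i] * P[i, j] * np.log2(P[i, j])
             for i in range(len(pi)) for j in range(len(pi)) if P[i, j] > 0)
    print(f"entropy rate: {H:.4f} bits/symbol")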

2.
The global maximum of an entropy function over different decision levels for a three-level scalar quantizer performed after a discrete wavelet transform was derived. We considered entropy-constrained scalar quantization, which avoids much of the compression-ratio reduction incurred when only the mean squared error is minimized. We also dealt with the problem of minimum entropy under an error bound, referred to as the rate distortion function. For generalized Gaussian distributed input signals, the Shannon bound decreases monotonically as the distribution parameter γ departs from 2; that is, among generalized Gaussian distributions, the Gaussian has the highest Shannon bound. Additionally, we proposed two numerical approaches, the secant and false-position methods, implemented on real cases to solve the problems of entropy-constrained scalar quantization and minimum entropy under an error bound. The convergence condition of the secant method was also addressed.
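
As a sketch of how a secant iteration applies in this setting, the snippet below solves for the decision level of a symmetric three-level quantizer on a unit Gaussian input so that the output entropy hits a target; the quantizer, starting points, and target are illustrative assumptions, not the paper's.

    from math import erf, log2, sqrt

    def Phi(x):                      # standard normal CDF
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def entropy_3level(t):
        """Output entropy (bits) of a symmetric 3-level quantizer with
        decision levels at -t and +t on a unit Gaussian input."""
        p_mid = Phi(t) - Phi(-t)
        p_side = (1.0 - p_mid) / 2.0
        return -sum(p * log2(p) for p in (p_mid, p_side, p_side) if p > 0)

    def secant(f, x0, x1, tol=1e-10, max_iter=100):
        for _ in range(max_iter):
            f0, f1 = f(x0), f(x1)
            x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
            if abs(x2 - x1) < tol:
                return x2
            x0, x1 = x1, x2
        return x1

    target = 1.2                     # illustrative entropy target in bits
    t = secant(lambda t: entropy_3level(t) - target, 0.3, 1.0)
    print(f"decision level t = {t:.6f}, entropy = {entropy_3level(t):.6f} bits")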

3.
The covariance of complex random variables and processes, when defined consistently with the corresponding notion for real random variables, is shown to be determined by the usual complex covariance together with a quantity called the pseudo-covariance. A characterization of uncorrelatedness and wide-sense stationarity in terms of covariance and pseudo-covariance is given. Complex random variables and processes with a vanishing pseudo-covariance are called proper. It is shown that properness is preserved under affine transformations and that the complex-multivariate Gaussian density assumes a natural form only for proper random variables. The maximum-entropy theorem is generalized to the complex-multivariate case. The differential entropy of a complex random vector with a fixed correlation matrix is shown to be maximum if and only if the random vector is proper, Gaussian, and zero-mean. The notion of circular stationarity is introduced. For the class of proper complex random processes, a discrete Fourier transform correspondence is derived relating circular stationarity in the time domain to uncorrelatedness in the frequency domain. An application of the theory is presented.
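
A quick numerical illustration of the distinction (synthetic data, not from the paper): the usual covariance E[(z-m)(z-m)*] and the pseudo-covariance E[(z-m)(z-m)] are estimated for one proper and one improper sequence.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # A proper complex Gaussian: i.i.d. real and imaginary parts.
    z_proper = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    # An improper example: real and imaginary parts fully correlated.
    x = rng.standard_normal(n)
    z_improper = x + 1j * x

    def cov_and_pseudocov(z):
        zc = z - z.mean()
        cov = np.mean(zc * np.conj(zc))   # usual complex covariance E[(z-m)(z-m)*]
        pcov = np.mean(zc * zc)           # pseudo-covariance E[(z-m)(z-m)]
        return cov, pcov

    for name, z in [("proper", z_proper), ("improper", z_improper)]:
        cov, pcov = cov_and_pseudocov(z)
        print(f"{name:9s} cov = {cov:.3f}   pseudo-cov = {pcov:.3f}")
    # The pseudo-covariance vanishes (up to sampling error) only in the proper case.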

4.
We provide new bounds on the expected length L of a binary one-to-one code for a discrete random variable X with entropy H. We prove that L ≥ H − log(H+1) − H log(1+1/H). This bound improves on previous results. Furthermore, we provide upper bounds on the expected length of the best code as a function of H and the probability of the most likely source letter.
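
A small numerical check of a bound of this form (all logarithms base 2; the geometric source is an illustrative assumption). The best one-to-one binary code assigns the i-th most probable symbol the i-th shortest binary string, which has length floor(log2 i) when the empty string is counted:

    import numpy as np

    # Illustrative source: truncated geometric distribution.
    p = 0.5 ** np.arange(1, 21)
    p /= p.sum()
    p = np.sort(p)[::-1]                      # most probable first

    H = -np.sum(p * np.log2(p))               # source entropy, bits
    L_opt = np.sum(p * np.floor(np.log2(np.arange(1, len(p) + 1))))

    bound = H - np.log2(H + 1) - H * np.log2(1 + 1 / H)
    print(f"H = {H:.4f}  optimal one-to-one length = {L_opt:.4f}  bound = {bound:.4f}")
    # The bound stays below the achievable expected length.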

5.
Guy Jumarie, Annales des Télécommunications, 1993, 48(5-6): 243-259
This paper provides new results on the entropy of functions and exhibits a unified approach to the entropy of functions and the quantum entropy of matrices, with or without probability. The entropy of continuously differentiable functions is extended to stair-wise (piecewise constant) functions, and a measure of relative information between two functions is obtained that is fully consistent with Kullback cross-entropy, Rényi cross-entropy, and Fisher information. The theory is then applied to stochastic processes to yield concepts of random geometrical entropies defined on path space, which are related to fractal dimension in the special case when the process is a fractional Brownian motion. It is then shown how one can obtain the Shannon entropy of random variables by combining the maximum entropy principle with Hartley entropy. Lastly, the quantum entropy of non-probabilistic matrices (an extension of von Neumann's quantum mechanical entropy) is derived as a consequence of the Shannon entropy of random variables.

6.
We present a simple lower bound on the entropy of a quantized signal and a method for determining the minimum entropy quantization, subject to a maximum distortion, for a discrete memoryless random process. This quantization allows more efficient use of variable-order models for compression of images and sound.
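
As a sketch of the kind of search this enables (assuming a small, hypothetical discrete source and a three-level quantizer with contiguous cells; this is not the paper's algorithm), one can exhaustively minimize output entropy subject to an MSE budget:

    import numpy as np
    from itertools import combinations

    # Hypothetical discrete memoryless source (values and pmf are illustrative).
    vals = np.arange(8, dtype=float)
    pmf = np.array([.05, .1, .2, .25, .2, .1, .06, .04])
    D_max = 1.0                                # maximum allowed MSE

    def cell_stats(idx):
        p = pmf[idx].sum()
        rep = (pmf[idx] * vals[idx]).sum() / p        # centroid reproduction level
        mse = (pmf[idx] * (vals[idx] - rep) ** 2).sum()
        return p, mse

    best = None
    # Partition the 8 ordered values into 3 contiguous cells via 2 cut points.
    for c1, c2 in combinations(range(1, len(vals)), 2):
        cells = [range(0, c1), range(c1, c2), range(c2, len(vals))]
        stats = [cell_stats(np.array(list(c))) for c in cells]
        dist = sum(m for _, m in stats)
        if dist <= D_max:
            ent = -sum(p * np.log2(p) for p, _ in stats)
            if best is None or ent < best[0]:
                best = (ent, (c1, c2), dist)

    print(f"min entropy = {best[0]:.4f} bits at cuts {best[1]}, MSE = {best[2]:.4f}")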

7.
A general trellis coding scheme has been described previously for error control of L simultaneous users of a discrete memoryless multiple-access channel. A new random coding performance bound for this trellis coding is given in an algebraic form. Since this bound lacks ease of computation, a transfer function bound is derived as a more tractable alternative.

8.
Given n discrete random variables Ω = {X1, …, Xn}, associated with any subset α of {1, 2, …, n} there is a joint entropy H(Xα), where Xα = {Xi : i ∈ α}. This can be viewed as a function defined on the power set 2^{1,2,…,n} taking values in [0, +∞); we call this function the entropy function of Ω. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that it is nondecreasing; and the nonnegativity of the conditional mutual information implies that it is two-alternative (i.e., submodular). These properties are the so-called basic information inequalities of Shannon's information measures. An entropy function can be viewed as a (2^n − 1)-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set {1, 2, …, n}. As introduced by Yeung (see ibid., vol. 43, no. 6, p. 1923-34, 1997), Γn stands for the cone in ℝ^(2^n − 1) consisting of all vectors that have all these properties, and Γn* for the set of all (2^n − 1)-dimensional vectors that correspond to the entropy functions of some set of n discrete random variables. A fundamental information-theoretic problem is whether or not Γ̄n* = Γn, where Γ̄n* stands for the closure of the set Γn*. We show that Γ̄n* is a convex cone, that Γ2* = Γ2 and Γ3* ≠ Γ3, but that Γ̄3* = Γ3. For four random variables, we have discovered a conditional inequality that is not implied by the basic information inequalities of the same set of random variables. This lends evidence to the plausible conjecture that Γ̄n* ≠ Γn for n > 3.
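
The basic inequalities are easy to verify numerically for a given joint distribution. A minimal sketch (hypothetical 3-variable pmf) that computes the entropy function on all subsets and checks monotonicity and submodularity, the latter encoding nonnegative conditional mutual information:

    import numpy as np
    from itertools import chain, combinations

    def subsets(ground):
        return chain.from_iterable(combinations(ground, r) for r in range(len(ground) + 1))

    def entropy_function(joint):
        """Entropy (bits) of every subset of coordinates of a joint pmf array."""
        n = joint.ndim
        h = {}
        for a in subsets(range(n)):
            axes = tuple(i for i in range(n) if i not in a)
            m = joint.sum(axis=axes) if axes else joint
            p = np.ravel(m)
            h[frozenset(a)] = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        return h

    # Hypothetical joint pmf of 3 binary variables (values are illustrative).
    rng = np.random.default_rng(1)
    joint = rng.random((2, 2, 2))
    joint /= joint.sum()
    h = entropy_function(joint)

    ground = range(3)
    mono = all(h[frozenset(a)] <= h[frozenset(b)]
               for a in subsets(ground) for b in subsets(ground)
               if set(a) <= set(b))
    sub = all(h[frozenset(a)] + h[frozenset(b)]
              >= h[frozenset(set(a) | set(b))] + h[frozenset(set(a) & set(b))]
              for a in subsets(ground) for b in subsets(ground))
    print(f"nondecreasing: {mono}, submodular: {sub}")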

9.
A general coding scheme for the nonrestricted memoryless discrete two-way channel is presented based on the introduction of auxiliary random variables forming a stationary Markov process. The coding scheme yields an achievable rate region which exceeds the inner bound of Shannon in the general case. A finite cardinality bound for the auxiliary random variables is given, showing that the region is computable. Finally, the capacity region for the memoryless Gaussian two-way channel is established.

10.
Using ideas from one-dimensional maximum entropy spectral estimation, a two-dimensional spectral estimator is derived by extrapolating the two-dimensional sampled autocorrelation (or covariance) function. The method maximizes the entropy of a set of random variables. The extrapolation (or prediction) process under this maximum entropy condition is shown to correspond to the most random extension, or equivalently to the maximization of the mean-square prediction error when the optimum predictor is used. The two-dimensional extrapolation must be terminated by the investigator. The Fourier transform of the extrapolated autocorrelation function is the two-dimensional spectral estimator. Using this method, one can apply windowing prior to calculating the spectral estimate. A specific algorithm for estimating the two-dimensional spectrum is presented, and its computational complexity is estimated. The algorithm has been programmed, and computer examples are presented.
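
The one-dimensional version of this extrapolation, which the paper generalizes to two dimensions, can be sketched as follows: solve the Toeplitz normal equations by Levinson-Durbin, extend the autocorrelation with the optimum predictor, and read off the maximum entropy spectrum. The lag values below are illustrative assumptions:

    import numpy as np

    def levinson_durbin(r, order):
        """Solve the Toeplitz normal equations for the optimum linear predictor."""
        a = [1.0] + [0.0] * order
        err = r[0]
        for k in range(1, order + 1):
            acc = r[k] + sum(a[j] * r[k - j] for j in range(1, k))
            ref = -acc / err                 # reflection coefficient
            prev = a[:]
            for j in range(1, k):
                a[j] = prev[j] + ref * prev[k - j]
            a[k] = ref
            err *= 1.0 - ref * ref           # prediction-error power
        return np.array(a), err

    # Illustrative 1-D autocorrelation lags r[0..p] (hypothetical values).
    r = np.array([1.0, 0.7, 0.3, 0.05])
    a, err = levinson_durbin(r, order=3)

    # Maximum-entropy extrapolation of the autocorrelation beyond lag p:
    r_ext = list(r)
    for n in range(len(r), 20):
        r_ext.append(-sum(a[k] * r_ext[n - k] for k in range(1, len(a))))

    # The MEM spectral estimate is err / |A(e^{j 2 pi f})|^2.
    f = np.linspace(0, 0.5, 256)
    A = np.array([np.sum(a * np.exp(-2j * np.pi * fi * np.arange(len(a)))) for fi in f])
    S = err / np.abs(A) ** 2
    print(f"extrapolated lags 4..6: {np.round(r_ext[4:7], 4)}")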

11.
A new information inequality of non-Shannon type is proved for three discrete random variables under conditional independence constraints, using the framework of entropy functions and polymatroids. Tightness of the inequality is described via quasi-groups.

12.
The principle of minimum entropy of error estimation (MEEE) is formulated for discrete random variables. In the case when the random variable to be estimated is binary, we show that the MEEE is given by a Neyman-Pearson-type strictly monotone test. In addition, the asymptotic behavior of the error probabilities is proved to be equivalent to that of the Bayesian test.

13.
Generalizing the Fano inequality
The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. The authors show several simple lower bounds on mutual information which do not assume such a restriction. In particular, this can be accomplished by replacing log M with the infinite-order Rényi entropy in the Fano inequality. Applications to hypothesis testing are exhibited, along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
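
Since H(X) ≥ H∞(X) = −log max_x p(x), substituting the infinite-order Rényi entropy into the classical Fano bound I(X;Y) ≥ H(X) − h(Pe) − Pe log(M−1) immediately yields a bound free of the equiprobability assumption (the paper's bounds are presumably sharper; this is just the simplest such substitution). A numeric sketch with an illustrative joint pmf:

    import numpy as np

    # Illustrative joint pmf p(x, y): M = 3 source letters, 3 observations.
    P = np.array([[0.20, 0.05, 0.05],
                  [0.04, 0.22, 0.04],
                  [0.05, 0.05, 0.30]])
    px = P.sum(axis=1)
    py = P.sum(axis=0)

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    I = H(px) + H(py) - H(P.ravel())        # mutual information, bits
    Pe = 1.0 - P.max(axis=0).sum()          # MAP decision error probability
    M = P.shape[0]
    h_Pe = H(np.array([Pe, 1 - Pe]))
    H_inf = -np.log2(px.max())              # infinite-order Renyi entropy of X

    bound = H_inf - h_Pe - Pe * np.log2(M - 1)
    print(f"I(X;Y) = {I:.4f} >= {bound:.4f}")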

14.
The entropy of a sequence of random variables under order restrictions is examined. A theorem that shows the amount of entropy reduction when the sequence is ordered is presented. Upper and lower bounds to the entropy reduction, and conditions under which they are achieved, are derived. Some interesting properties of the entropy of the individual order statistics are also presented. It is shown that the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed (IID) population is a constant, regardless of the original distribution. Finally, the entropies of the individual order statistics are found to be symmetric about the median when the probability density function (PDF) of the original IID sequence is symmetric about its mean.
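
A numerical check of that distribution-free difference (a sketch with n = 3, comparing uniform and exponential parents; differential entropies in nats by numerical integration on illustrative grids):

    import numpy as np
    from math import lgamma

    def entropy(pdf_vals, x):
        p = np.clip(pdf_vals, 1e-300, None)
        return -np.trapz(pdf_vals * np.log(p), x)

    def avg_order_stat_entropy(x, f, F, n):
        total = 0.0
        for k in range(1, n + 1):
            c = np.exp(lgamma(n + 1) - lgamma(k) - lgamma(n - k + 1))
            fk = c * F**(k - 1) * (1 - F)**(n - k) * f   # pdf of k-th order statistic
            total += entropy(fk, x)
        return total / n

    n = 3
    # Uniform(0,1) parent: differential entropy 0.
    x = np.linspace(1e-9, 1 - 1e-9, 400_001)
    d_unif = avg_order_stat_entropy(x, np.ones_like(x), x, n) - 0.0

    # Exponential(1) parent: differential entropy 1 nat.
    x = np.linspace(1e-9, 40, 400_001)
    F = 1 - np.exp(-x)
    d_exp = avg_order_stat_entropy(x, np.exp(-x), F, n) - 1.0

    print(f"uniform: {d_unif:.5f}   exponential: {d_exp:.5f}   (should agree)")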

15.
Capacity results for the discrete memoryless network
A discrete memoryless network (DMN) is a memoryless multiterminal channel with discrete inputs and outputs. A sequence of inner bounds to the DMN capacity region is derived by using code trees. Capacity expressions are given for three classes of DMNs: (1) a single-letter expression for a class with a common output, (2) a two-letter expression for a binary-symmetric broadcast channel (BC) with partial feedback, and (3) a finite-letter expression for push-to-talk DMNs. The first result is a consequence of a new capacity outer bound for common output DMNs. The third result demonstrates that the common practice of using a time-sharing random variable does not include all time-sharing possibilities, namely, time sharing of channels. Several techniques for improving the bounds are developed: (1) causally conditioned entropy and directed information simplify the inner bounds, (2) code trellises serve as simple code trees, (3) superposition coding and binning with code trees improve rates. Numerical computations show that the last technique enlarges the best known rate regions for a multiple-access channel (MAC) and a BC, both with feedback. In addition to the rate bounds, a sequence of inner bounds to the DMN reliability function is derived. A numerical example for a two-way channel illustrates the behavior of the error exponents.

16.
In communications networks, the capacity region of multisource network coding is given in terms of the set of entropy functions $\Gamma^{\ast}$. More broadly, determination of $\Gamma^{\ast}$ would have an impact on converse theorems for multi-terminal problems in information theory. This paper provides several new dualities between entropy functions and network codes. Given a function $g \geq 0$ defined on all subsets of $N$ random variables, we provide a construction for a network multicast problem which is “solvable” if and only if $g$ is the entropy function of a set of quasi-uniform random variables. The underlying network topology is fixed and the multicast problem depends on $g$ only through link capacities and source rates. A corresponding duality is developed for linear network codes, where the constructed multicast problem is linearly solvable if and only if $g$ is linear group characterizable. Relaxing the requirement that the domain of $g$ be subsets of random variables, we obtain a similar duality between polymatroids and the linear programming bound. These duality results provide an alternative proof of the insufficiency of linear (and abelian) network codes, and demonstrate the utility of non-Shannon inequalities to tighten outer bounds on network coding capacity regions.

17.
Cumulative residual entropy: a new measure of information
In this paper, we use the cumulative distribution of a random variable to define its information content and thereby develop an alternative measure of uncertainty that extends Shannon entropy to random variables with continuous distributions. We call this measure cumulative residual entropy (CRE). The salient features of CRE are as follows: 1) it is more general than the Shannon entropy in that its definition is valid in the continuous and discrete domains, 2) it possesses more general mathematical properties than the Shannon entropy, and 3) it can be easily computed from sample data and these computations asymptotically converge to the true values. The properties of CRE and a precise formula relating CRE and Shannon entropy are given in the paper. Finally, we present some applications of CRE to reliability engineering and computer vision.
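
For a nonnegative random variable, CRE(X) = −∫₀^∞ P(X > x) log P(X > x) dx, which is straightforward to estimate from the empirical survival function. A minimal sketch (exponential samples chosen because Exp(λ) has CRE = 1/λ exactly):

    import numpy as np

    def cre_empirical(samples):
        """Cumulative residual entropy of a nonnegative sample, in nats,
        computed from the empirical survival function (a step function)."""
        x = np.sort(samples)
        n = len(x)
        s = (n - np.arange(1, n)) / n          # P(X > x_i) on [x_i, x_{i+1})
        return -np.sum(np.diff(x) * s * np.log(s))

    rng = np.random.default_rng(0)
    samples = rng.exponential(scale=1.0, size=200_000)
    print(f"empirical CRE = {cre_empirical(samples):.4f}  (Exp(1) has CRE = 1)")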

18.
A conditional entropy bound for a pair of discrete random variables
Let X, Y be a pair of discrete random variables with a given joint probability distribution. For 0 ≤ x ≤ H(X), the entropy of X, define the function F(x) as the infimum of H(Y|W), the conditional entropy of Y given W, with respect to all discrete random variables W such that (a) H(X|W) = x, and (b) W and Y are conditionally independent given X. This paper concerns the function F, its properties, its calculation, and its applications to several problems in information theory.

19.
Upper bounds on the entropy of a countable integer-valued random variable are furnished in terms of the expectation of the logarithm function. In particular, an upper bound is derived that is sharper than that of P. Elias (ibid., vol. IT-21, no. 2, p. 194-203, 1975) for all values of E_p(log). Bounds that improve on previously known upper bounds only for large values of E_p(log) are also provided.

20.
In this paper, we consider the robust filtering problem for discrete time-varying systems with delayed sensor measurements subject to norm-bounded parameter uncertainties. The delayed sensor measurement is assumed to be a linear function of a stochastic variable that follows a Bernoulli binary distribution. An upper bound on the actual covariance of the uncertain stochastic parameter system is derived and used as the estimation variance constraint. This upper bound is then minimized over the filter parameters for all stochastic sensor delays and admissible deterministic uncertainties. It is shown that the desired filter can be obtained in terms of the solutions to two discrete Riccati difference equations of a form suitable for recursive computation in online applications. An illustrative example is presented to show the applicability of the proposed method.
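
The recursive structure referred to can be sketched with the standard (non-robust) Riccati difference equation; the system matrices below are illustrative assumptions, and the paper's robust filter propagates two coupled recursions of this general form:

    import numpy as np

    # Illustrative system x_{k+1} = A x + w,  y = C x + v
    # (matrices hypothetical; the paper's robust filter additionally handles
    # uncertain A, C and randomly delayed measurements).
    A = np.array([[0.95, 0.10], [0.0, 0.90]])
    C = np.array([[1.0, 0.0]])
    Q = 0.01 * np.eye(2)                     # process noise covariance
    R = np.array([[0.04]])                   # measurement noise covariance

    P = np.eye(2)                            # initial error covariance
    for k in range(50):
        # Riccati difference equation for the one-step prediction covariance:
        # P+ = A P A' + Q - A P C' (C P C' + R)^{-1} C P A'
        K = A @ P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
        P = A @ P @ A.T + Q - K @ C @ P @ A.T
    print("steady-state prediction covariance:\n", np.round(P, 5))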

