Similar Literature
20 similar documents found.
1.
Wavelet-domain hidden Markov models have proven to be useful tools for statistical signal and image processing. The hidden Markov tree (HMT) model captures the key features of the joint probability density of the wavelet coefficients of real-world data. One potential drawback of the HMT framework is the need for computationally expensive iterative training to fit an HMT model to a given data set (e.g., using the expectation-maximization algorithm). We greatly simplify the HMT model by exploiting the inherent self-similarity of real-world images. The simplified model specifies the HMT parameters with just nine meta-parameters (independent of the size of the image and the number of wavelet scales). We also introduce a Bayesian universal HMT (uHMT) that fixes these nine parameters and requires no training of any kind. Although extremely simple, these new models retain nearly all of the key image structure modeled by the full HMT, as we show using a series of image estimation/denoising experiments. Finally, we propose a fast shift-invariant HMT estimation algorithm that outperforms other wavelet-based estimators in the current literature, both visually and in mean square error.
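To make the parameter-tying idea concrete, the following minimal Python sketch generates per-scale two-state HMT parameters from a handful of meta-parameters; the exponential-decay form and all constants below are illustrative placeholders, not the paper's actual nine-parameter specification.

```python
import numpy as np

# Illustrative sketch (not the paper's exact parameterization): tie the
# two-state HMT variances and state-persistence probabilities to scale j
# through a few meta-parameters, instead of training per-coefficient values.
def tied_hmt_parameters(n_scales, c_large=2.5, c_small=0.1,
                        alpha_large=2.0, alpha_small=1.5, persist=0.9):
    """Return per-scale HMT parameters generated from meta-parameters."""
    params = []
    for j in range(1, n_scales + 1):          # j = 1 is the coarsest scale
        sigma2_large = c_large * 2.0 ** (-alpha_large * j)   # "edge" state
        sigma2_small = c_small * 2.0 ** (-alpha_small * j)   # "smooth" state
        # State persistence across scale (large stays large, small stays small).
        trans = np.array([[persist, 1.0 - persist],
                          [1.0 - persist, persist]])
        params.append({"sigma2": (sigma2_large, sigma2_small), "trans": trans})
    return params

print(tied_hmt_parameters(4)[0])
```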

2.
In this paper, we propose a Bayesian sampling solution to the noisy blind separation of generalized hyperbolic signals. Generalized hyperbolic models, introduced by Barndorff-Nielsen in 1977, represent a parametric family able to cover a wide range of real signal distributions. The alternative construction of these distributions as a normal mean-variance (continuous) mixture leads to an efficient implementation of the Markov chain Monte Carlo method applied to source separation. The incomplete-data structure of the generalized hyperbolic distribution is indeed compatible with the hidden-variable nature of the source separation problem. Both overdetermined and underdetermined noisy mixtures are solved by the same algorithm without a prewhitening step. Our algorithm also involves hyperparameter estimation; it can therefore be used independently to fit the parameters of the generalized hyperbolic distribution to real data.
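As an illustration of the normal mean-variance mixture construction mentioned above, the sketch below simulates such a signal in Python with a gamma mixing law (which yields the variance-gamma limiting case of the generalized hyperbolic family; the full family uses a generalized inverse Gaussian mixing law). All parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of the normal mean-variance mixture construction behind the
# generalized hyperbolic family: X = mu + beta*W + sqrt(W)*Z with Z ~ N(0,1)
# and W a positive mixing variable.  The full GH family uses a generalized
# inverse Gaussian mixing law; here W ~ Gamma is used for simplicity, which
# gives the variance-gamma limiting case.
def sample_mean_variance_mixture(n, mu=0.0, beta=0.5, shape=1.5, scale=1.0):
    w = rng.gamma(shape, scale, size=n)       # positive mixing variable
    z = rng.standard_normal(n)
    return mu + beta * w + np.sqrt(w) * z

x = sample_mean_variance_mixture(10_000)
print(x.mean(), x.var())
```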

3.
The main problems in hyperspectral image analysis are spectral classification, segmentation, and data reduction. In this paper, we propose a Bayesian estimation approach which gives a joint solution for these problems. The problem is modeled as blind source separation (BSS). The data are M hyperspectral images and the sources are K < M images which are composed of compact homogeneous regions and have mutually disjoint supports. The set of all these regions covers the total surface of the observed scene. To ensure these properties, we propose a hierarchical Markov model for the sources with a common hidden classification field which is modeled via a Potts-Markov field. The joint Bayesian estimation of the hidden variable, the sources, and the mixing matrix of the BSS gives a solution for all three problems: spectral classification, segmentation, and data reduction of hyperspectral images. A mean field approximation (MFA) algorithm for the posterior laws is proposed for effective Bayesian computation. Finally, results on simulated and real data are given to illustrate the performance of the proposed method compared to classical methods such as PCA and ICA.

4.
Dual-tree complex wavelet hidden Markov tree model for image denoising   (Total citations: 2; self-citations: 0; by others: 2)
Electronics Letters, 2007, 43(18): 973-975
A new non-training complex wavelet hidden Markov tree (HMT) model, based on the dual-tree complex wavelet transform and a fast parameter estimation technique, is proposed for image denoising. This new model simultaneously mitigates the two problems of the conventional wavelet HMT model: high computational cost and shift variance. Experiments show that the denoising approach with this new model achieves better performance than other related HMT-based image denoising algorithms.

5.
We propose applying hidden Markov model (HMM) theory to the problem of blind channel estimation and data detection. The Baum-Welch (BW) algorithm, which is able to estimate all the parameters of the model, is enriched by introducing some linear constraints emerging from a linear FIR hypothesis on the channel. Additionally, a version of the algorithm that is suitable for time-varying channels is also presented. Performance is analyzed in a GSM environment using standard test channels and is found to be close to that obtained with a nonblind receiver.

6.
Wavelet-based statistical signal processing techniques such as denoising and detection typically model the wavelet coefficients as independent or jointly Gaussian. These models are unrealistic for many real-world signals. We develop a new framework for statistical signal processing based on wavelet-domain hidden Markov models (HMMs) that concisely models the statistical dependencies and non-Gaussian statistics encountered in real-world signals. Wavelet-domain HMMs are designed with the intrinsic properties of the wavelet transform in mind and provide powerful, yet tractable, probabilistic signal models. Efficient expectation-maximization algorithms are developed for fitting the HMMs to observational signal data. The new framework is suitable for a wide range of applications, including signal estimation, detection, classification, prediction, and even synthesis. To demonstrate the utility of wavelet-domain HMMs, we develop novel algorithms for signal denoising, classification, and detection.
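For intuition, the following minimal sketch fits an independent two-state, zero-mean Gaussian mixture to wavelet coefficients with EM; the full wavelet-domain HMM/HMT additionally ties the hidden states across scale, which this sketch deliberately omits.

```python
import numpy as np

# Minimal sketch: EM for an independent two-state, zero-mean Gaussian mixture
# on wavelet coefficients.  It illustrates only the per-coefficient E/M updates,
# not the tree-structured training of the full HMT.
def fit_two_state_mixture(w, n_iter=50):
    w = np.asarray(w, dtype=float).ravel()
    p = 0.5                                   # prior prob. of the "large" state
    var = np.array([4.0 * w.var() + 1e-12, 0.25 * w.var() + 1e-12])
    for _ in range(n_iter):
        # E-step: responsibilities of the large-variance state.
        lik_l = np.exp(-0.5 * w**2 / var[0]) / np.sqrt(2 * np.pi * var[0])
        lik_s = np.exp(-0.5 * w**2 / var[1]) / np.sqrt(2 * np.pi * var[1])
        resp = p * lik_l / (p * lik_l + (1 - p) * lik_s + 1e-300)
        # M-step: update state probability and state variances.
        p = resp.mean()
        var[0] = (resp * w**2).sum() / (resp.sum() + 1e-300)
        var[1] = ((1 - resp) * w**2).sum() / ((1 - resp).sum() + 1e-300)
    return p, var

rng = np.random.default_rng(1)
coeffs = np.where(rng.random(5000) < 0.2, 3.0, 0.3) * rng.standard_normal(5000)
print(fit_two_state_mixture(coeffs))
```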

7.
Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, the wavelet coefficients of natural images have significant dependencies. Here, we consider in detail only the dependencies between a coefficient and its parent. For this purpose, new non-Gaussian bivariate distributions are proposed, and the corresponding nonlinear threshold functions (shrinkage functions) are derived from the models using Bayesian estimation theory. The new shrinkage functions do not assume the independence of the wavelet coefficients. We present three image denoising examples to demonstrate the performance of these new bivariate shrinkage rules. In the second example, a simple subband-dependent, data-driven image denoising system is described and compared with effective data-driven techniques in the literature, namely VisuShrink, SureShrink, BayesShrink, and hidden Markov models. In the third example, the same idea is applied to the dual-tree complex wavelet coefficients.
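A shrinkage rule of the form derived in this line of work shrinks a coefficient using the joint magnitude of the coefficient and its parent; the sketch below implements that form, with the noise and signal standard deviations assumed to have been estimated beforehand (e.g., via the median absolute deviation of the finest subband).

```python
import numpy as np

# Sketch of a bivariate shrinkage rule derived from a non-Gaussian parent-child
# model: the child coefficient w1 is shrunk using the magnitude of (w1, parent w2).
# sigma_n is the noise standard deviation, sigma the (local) signal standard
# deviation; both would be estimated from the data in practice.
def bivariate_shrink(w1, w2, sigma_n, sigma):
    r = np.sqrt(w1**2 + w2**2)
    gain = np.maximum(r - np.sqrt(3.0) * sigma_n**2 / sigma, 0.0) / (r + 1e-12)
    return gain * w1

# Toy usage: shrink child coefficients given their parents.
print(bivariate_shrink(np.array([2.0, 0.3]), np.array([1.5, 0.2]), 0.5, 1.0))
```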

8.
We introduce the notion of a generalized mixture and propose some methods for estimating it, along with applications to unsupervised statistical image segmentation. A distribution mixture is said to be "generalized" when the exact nature of the components is not known, but each belongs to a finite known set of families of distributions. For instance, we can consider a mixture of three distributions, each being exponential or Gaussian. The problem of estimating such a mixture thus contains a new difficulty: we have to label each of the three components (there are eight possibilities). We show that the classical mixture estimation algorithms, namely expectation-maximization (EM), stochastic EM (SEM), and iterative conditional estimation (ICE), can be adapted to such situations once we have a method for recognizing each component separately; that is, given a sample known to come from one of the families of the set considered, a decision rule determines which family it belongs to. Considering the Pearson system, which is a set of eight families, this decision rule is defined using skewness and kurtosis. The resulting algorithms are then applied to the problem of unsupervised Bayesian image segmentation. We propose adaptive versions of SEM, EM, and ICE in the case of "blind", i.e., pixel-by-pixel, segmentation. "Global" segmentation methods require modeling by hidden Markov random fields, and we propose adaptations of two traditional parameter estimation algorithms, Gibbsian EM (GEM) and ICE, allowing the estimation of generalized mixtures corresponding to Pearson's system. The efficiency of the different methods is compared via numerical studies, and the results of unsupervised segmentation of three real radar images by the different methods are presented.
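To illustrate the moment-based recognition step, the toy sketch below estimates sample skewness and kurtosis and picks the nearest of two candidate families; the actual criterion in the paper partitions the (skewness, kurtosis) plane according to the eight Pearson families, which this two-family rule only hints at.

```python
import numpy as np

# Toy illustration of a skewness/kurtosis-based recognition rule: estimate the
# sample (skewness, kurtosis) pair and pick the candidate family whose
# theoretical pair is closest.  Only two candidate families are used here.
def skew_kurt(x):
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    skew = ((x - m)**3).mean() / s**3
    kurt = ((x - m)**4).mean() / s**4          # non-excess kurtosis
    return skew, kurt

FAMILIES = {"gaussian": (0.0, 3.0), "exponential": (2.0, 9.0)}

def recognize(x):
    sk, ku = skew_kurt(x)
    return min(FAMILIES, key=lambda f: (FAMILIES[f][0] - sk)**2 + (FAMILIES[f][1] - ku)**2)

rng = np.random.default_rng(2)
print(recognize(rng.standard_normal(10_000)), recognize(rng.exponential(1.0, 10_000)))
```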

9.
A Markov model for blind image separation by a mean-field EM algorithm   (Total citations: 1; self-citations: 0; by others: 1)
This paper deals with blind separation of images from noisy linear mixtures with unknown coefficients, formulated as a Bayesian estimation problem. This is a flexible framework in which any kind of prior knowledge about the source images and the mixing matrix can be accounted for. In particular, we describe local correlation within the individual images through the use of Markov random field (MRF) image models. These are naturally suited to expressing the joint pdf of the sources in a factorized form, so that the statistical independence requirements of most independent component analysis approaches to blind source separation are retained. Our model also includes edge variables to preserve intensity discontinuities. MRF models have proved to be very efficient in many visual reconstruction problems, such as blind image restoration, and allow separation and edge detection to be performed simultaneously. We propose an expectation-maximization algorithm with the mean field approximation to derive a procedure for estimating the mixing matrix, the sources, and their edge maps. We tested this procedure on both synthetic and real images in the fully blind case (i.e., no prior information on mixing is exploited) and found that a source model accounting for local autocorrelation is able to increase robustness against noise, even when the noise is space-variant. Furthermore, when the model closely fits the source characteristics, independence is no longer a strict requirement, and cross-correlated sources can be separated as well.

10.
SAR speckle reduction using wavelet denoising and Markov random field modeling   (Total citations: 28; self-citations: 0; by others: 28)
The granular appearance of speckle noise in synthetic aperture radar (SAR) imagery makes it very difficult to visually and automatically interpret SAR data. Therefore, speckle reduction is a prerequisite for many SAR image processing tasks. In this paper, we develop a speckle reduction algorithm by fusing the wavelet Bayesian denoising technique with Markov-random-field-based image regularization. Wavelet coefficients are modeled independently and identically by a two-state Gaussian mixture model, while their spatial dependence is characterized by a Markov random field imposed on the hidden states of the Gaussian mixtures. The expectation-maximization algorithm is used to estimate hyperparameters and specify the mixture model, and the iterated-conditional-modes method is implemented to optimize the state configuration. The noise-free wavelet coefficients are finally estimated by a shrinkage function based on local weighted averaging of the Bayesian estimator. Experimental results show that the proposed method outperforms standard wavelet denoising techniques in terms of the signal-to-noise ratio and the equivalent-number-of-looks measures in most cases. It also achieves better performance than the refined Lee filter.
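The pointwise part of such a Bayesian shrinkage can be sketched as a posterior-weighted Wiener gain under a two-state zero-mean Gaussian mixture prior, as below; the spatial MRF coupling and the ICM optimization used in the paper are omitted, and all variances are assumed given.

```python
import numpy as np

# Pointwise sketch: MMSE shrinkage of a noisy coefficient y under a two-state
# zero-mean Gaussian mixture prior with additive Gaussian noise.  The noisy
# coefficient is scaled by a posterior-weighted Wiener gain.
def mixture_wiener_shrink(y, p_large, var_large, var_small, var_noise):
    def gauss(y, v):                          # zero-mean Gaussian density
        return np.exp(-0.5 * y**2 / v) / np.sqrt(2 * np.pi * v)
    num = p_large * gauss(y, var_large + var_noise)
    den = num + (1 - p_large) * gauss(y, var_small + var_noise)
    post_large = num / (den + 1e-300)
    gain = (post_large * var_large / (var_large + var_noise)
            + (1 - post_large) * var_small / (var_small + var_noise))
    return gain * y

print(mixture_wiener_shrink(np.array([3.0, 0.2]), 0.2, 4.0, 0.05, 0.25))
```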

11.
Iterative learning algorithms for linear Gaussian observation models   (Total citations: 1; self-citations: 0; by others: 1)
In this paper, we consider a signal/parameter estimation problem that is based on a linear model structure and a given setting of statistical models with unknown hyperparameters. We consider several combinations of Gaussian and Laplacian models. We develop iterative algorithms based on two typical machine learning methods, the evidence-based method and the integration-based method, to deal with the hyperparameters. We have applied the proposed algorithms to adaptive prediction and wavelet denoising. In linear prediction, we show that the proposed algorithms are efficient tools for tackling the difficult problem of simultaneously adapting the order and the coefficients of the predictor. In wavelet denoising, we show that by using the proposed algorithms, the noisy wavelet coefficients are subject to shrinkage and thresholding.
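For reference, a generic wavelet shrinkage/thresholding baseline looks like the sketch below (MAD noise estimate plus the universal threshold); the paper itself learns the relevant hyperparameters with evidence-based or integration-based iterative algorithms rather than using this fixed rule.

```python
import numpy as np

# Classical baseline sketch: soft-threshold detail coefficients using a
# median-absolute-deviation noise estimate and the universal threshold.
def soft_threshold(coeffs, thr):
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

def denoise_detail(coeffs):
    sigma = np.median(np.abs(coeffs)) / 0.6745            # MAD noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(coeffs.size))      # universal threshold
    return soft_threshold(coeffs, thr)

rng = np.random.default_rng(3)
d = 0.1 * rng.standard_normal(1024)
d[::64] += 2.0                                            # sparse "signal" + noise
print(np.count_nonzero(denoise_detail(d)))
```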

12.
Images captured with digital imaging devices often contain noise. In the literature, many algorithms exist for the removal of white uncorrelated noise, but they usually fail when applied to images with correlated noise. In this paper, we design a new denoising method for the removal of correlated noise by modeling the significance of the noise-free wavelet coefficients in a local window, using a new significance measure that defines the “signal of interest” and that is applicable to correlated noise. We combine the intrascale model with a hidden Markov tree model to capture the interscale dependencies between the wavelet coefficients. We propose a denoising method based on the combined model and a less redundant wavelet transform. We present results showing that the new method performs as well as state-of-the-art wavelet-based methods, while having a lower computational complexity.

13.
Improved hidden Markov models in the wavelet domain   (Total citations: 11; self-citations: 0; by others: 11)
Wavelet-domain hidden Markov models (HMMs), in particular the hidden Markov tree (HMT) model, have been introduced and applied to signal and image processing, e.g., signal denoising. We develop a simple initialization scheme for efficient HMT model training and then propose a new four-state HMT model called HMT-2. We find that the new initialization scheme fits the HMT-2 model well. Experimental results show that the performance of signal denoising using the HMT-2 model is often improved over the two-state HMT model developed by Crouse et al. (see ibid., vol. 46, p. 886-902, 1998).

14.
Hidden Markov tree models were introduced by Crouse et al. in 1998 for modeling nonindependent, non-Gaussian wavelet transform coefficients. In their paper, they developed the equivalent of the forward-backward algorithm for hidden Markov tree models and called it the "upward-downward algorithm". This algorithm is subject to the same numerical limitations as the forward-backward algorithm for hidden Markov chains (HMCs). In this paper, adapting the ideas of Devijver from 1985, we propose a new "upward-downward" algorithm, which is a true smoothing algorithm and is immune to numerical underflow. Furthermore, we propose a Viterbi-like algorithm for global restoration of the hidden state tree. The contribution of these algorithms as diagnostic tools is illustrated through the modeling of statistical dependencies between wavelet coefficients, with a special emphasis on local regularity changes.
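The scaling idea can be illustrated on the simpler hidden Markov chain case: the sketch below implements a Devijver-style scaled forward-backward smoother in which every forward vector is renormalized, so all quantities stay in [0, 1] and underflow is avoided; the paper extends the same normalization to the upward-downward passes on a tree.

```python
import numpy as np

# Scaled forward-backward smoothing for a hidden Markov *chain*: each forward
# vector is renormalized and the scale factors are reused in the backward pass,
# which keeps every quantity bounded and avoids numerical underflow.
def scaled_smoother(pi, A, lik):
    """pi: (K,) initial probs, A: (K,K) row-stochastic transitions,
    lik: (T,K) observation likelihoods p(obs_t | state)."""
    T, K = lik.shape
    alpha = np.zeros((T, K)); beta = np.ones((T, K)); c = np.zeros(T)
    alpha[0] = pi * lik[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (lik[t + 1] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta                      # smoothed marginals p(state_t | obs)
    return gamma / gamma.sum(axis=1, keepdims=True)

pi = np.array([0.5, 0.5]); A = np.array([[0.9, 0.1], [0.2, 0.8]])
lik = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9]])
print(scaled_smoother(pi, A, lik))
```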

15.
We develop a hidden Markov mixture model based on a Dirichlet process (DP) prior for representing the statistics of sequential data for which a single hidden Markov model (HMM) may not be sufficient. The DP prior has an intrinsic clustering property that encourages parameter sharing, and this naturally reveals the proper number of mixture components. The evaluation of posterior distributions for all model parameters is achieved in two ways: 1) via a rigorous Markov chain Monte Carlo method; and 2) approximately and efficiently via a variational Bayes formulation. Using DP HMM mixture models in a Bayesian setting, we propose a novel scheme for music analysis, highlighting the effectiveness of the DP HMM mixture model. Music is treated as a time-series data sequence and each music piece is represented as a mixture of HMMs. We approximate the similarity of two music pieces by computing the distance between the associated HMM mixtures. Experimental results are presented for synthesized sequential data and for classical music clips. Music similarities computed using DP HMM mixture modeling are compared to those computed from Gaussian mixture modeling, for which the mixture modeling is also performed using a DP. The results show that the performance of DP HMM mixture modeling exceeds that of DP Gaussian mixture modeling.

16.
A stochastic maximum likelihood approach for blind estimation of co-channel signals received at an antenna array is proposed in this letter. A hidden Markov model formulation of the problem is introduced, and the Baum-Welch algorithm for the associated stochastic maximum likelihood estimation procedure is modified accordingly. The performance of the proposed algorithm is studied via the evaluation of approximate Cramer-Rao bounds. Finally, some simulation results are presented.

17.
Free-space THz imaging currently faces several challenges: atmospheric loss and water absorption, low radiated power (so that obtaining a high signal-to-noise ratio requires higher-power radiation sources), long data-acquisition times, and image quality that still needs improvement. This paper analyzes the latest development trends in THz imaging technology and its current status at home and abroad. The basic principles of THz synthetic-aperture imaging and THz compressed-sensing imaging are described, and the characteristics of the THz images produced by these two imaging methods are analyzed. Several image-denoising algorithms, including Wiener2 filtering, wavelet-coefficient thresholding with the threshold selected by the entropy-based ddencmp criterion, the wavelet thresholding method proposed by Donoho, and an asymptotically optimal denoising method based on wavelet-coefficient magnitudes, are applied to THz images, and their results are compared qualitatively and quantitatively in terms of root-mean-square error, signal-to-noise ratio, and correlation coefficient. A wavelet-domain Markov random field is then proposed for THz image denoising. The main work is as follows: two hidden states are introduced for each wavelet coefficient, one corresponding to non-stationary regions of the image such as edges, and the other to stationary regions. The wavelet coefficients in each state are described by a Gaussian distribution; although the coefficients in each state are Gaussian, the resulting two-state mixture model for each coefficient is non-Gaussian. The EM (Expectation-Maximization) algorithm is then used to estimate the parameters of the mixture model, and a Bayesian criterion is used to obtain an initial shrinkage factor for the wavelet coefficients of the ideal image. Finally, the wavelet-domain hidden Markov model denoising algorithm is compared against the other methods; simulation results show that it is more effective and performs better.
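The quantitative comparison criteria mentioned above can be computed as in the sketch below (root-mean-square error, SNR in dB, and correlation coefficient between a reference image and a denoised result); these are common textbook definitions, and the exact conventions may differ from those used in the paper.

```python
import numpy as np

# Comparison metrics between a reference image and a denoised estimate:
# root-mean-square error, signal-to-noise ratio in dB, and correlation coefficient.
def rmse(ref, est):
    return np.sqrt(np.mean((ref - est) ** 2))

def snr_db(ref, est):
    return 10.0 * np.log10(np.sum(ref ** 2) / np.sum((ref - est) ** 2))

def corr_coef(ref, est):
    return np.corrcoef(ref.ravel(), est.ravel())[0, 1]

rng = np.random.default_rng(4)
ref = rng.random((64, 64))
est = ref + 0.05 * rng.standard_normal((64, 64))
print(rmse(ref, est), snr_db(ref, est), corr_coef(ref, est))
```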

18.
We propose a new method for detecting activation in functional magnetic resonance imaging (fMRI) data. We project the fMRI time series onto a low-dimensional subspace spanned by wavelet packets in order to create projections that are as non-Gaussian as possible. Our approach achieves two goals: it reduces the dimensionality of the problem by explicitly constructing a sparse approximation to the dataset, and it creates meaningful clusters allowing the separation of the activated regions from the clutter formed by the background time series. We use a mixture of Gaussian densities to model the distribution of the wavelet packet coefficients. Because we expect activated areas to be connected, we impose a spatial prior in the form of a Markov random field. Our approach was validated with in vivo data and realistic synthetic data, where it outperformed a linear model equipped with knowledge of the true hemodynamic response.

19.
Due to the enormous quantity of radar images acquired by satellites and through shuttle missions, there is an evident need for efficient automatic analysis tools. This paper describes unsupervised classification of radar images in the framework of hidden Markov models and generalized mixture estimation. Hidden Markov chain models, applied to a Hilbert-Peano scan of the image, constitute a fast and robust alternative to hidden Markov random field models for spatial regularization of image analysis problems, even though the latter provide a finer and more intuitive modeling of spatial relationships. Here we compare the two approaches and show that they can be combined in a way that conserves their respective advantages. We also describe how the distribution families and parameters of classes with constant or textured radar reflectivity can be determined through generalized mixture estimation. Sample results obtained on real and simulated radar images are presented.

20.
Directional multiscale modeling of images using the contourlet transform   (Total citations: 43; self-citations: 0; by others: 43)
The contourlet transform is a new two-dimensional extension of the wavelet transform using multiscale and directional filter banks. The contourlet expansion is composed of basis images oriented at various directions in multiple scales, with flexible aspect ratios. Given this rich set of basis images, the contourlet transform effectively captures smooth contours, which are the dominant feature in natural images. We begin with a detailed study of the statistics of the contourlet coefficients of natural images, using histograms to estimate the marginal and joint distributions and mutual information to measure the dependencies between coefficients. This study reveals the highly non-Gaussian marginal statistics and strong interlocation, interscale, and interdirection dependencies of contourlet coefficients. We also find that, conditioned on the magnitudes of their generalized neighborhood coefficients, contourlet coefficients can be approximately modeled as Gaussian random variables. Based on these findings, we model contourlet coefficients using a hidden Markov tree (HMT) model with Gaussian mixtures that can capture all interscale, interdirection, and interlocation dependencies. We present experimental results using this model in image denoising and texture retrieval applications. In denoising, the contourlet HMT outperforms other wavelet methods in terms of visual quality, especially around edges. In texture retrieval, it shows improvements in performance for various oriented textures.
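The histogram-based dependency measurement described above can be sketched as follows: estimate the mutual information between two sets of coefficients (for example, a subband and its parent) from their joint 2-D histogram; the bin count and the toy data below are illustrative.

```python
import numpy as np

# Histogram-based mutual information between two coefficient sets, estimated
# from their joint 2-D histogram (in bits).
def mutual_information(x, y, bins=64):
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(5)
a = rng.standard_normal(10_000)
b = 0.7 * a + 0.3 * rng.standard_normal(10_000)   # correlated toy "coefficients"
print(mutual_information(a, b))
```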
