Similar Documents
20 similar records found.
1.
Minimum mean squared error equalization using a priori information
A number of important advances have been made in the area of joint equalization and decoding of data transmitted over intersymbol interference (ISI) channels. Turbo equalization is an iterative approach to this problem, in which a maximum a posteriori probability (MAP) equalizer and a MAP decoder exchange soft information in the form of prior probabilities over the transmitted symbols. A number of reduced-complexity methods for turbo equalization have been introduced in which MAP equalization is replaced with suboptimal, low-complexity approaches. We explore a number of low-complexity soft-input/soft-output (SISO) equalization algorithms based on the minimum mean square error (MMSE) criterion. This includes the extension of existing approaches to general signal constellations and the derivation of a novel approach requiring less complexity than the MMSE-optimal solution. All approaches are qualitatively analyzed by observing the mean-square error averaged over a sequence of equalized data. We show that for the turbo equalization application, the MMSE-based SISO equalizers perform well compared with a MAP equalizer while providing a tremendous complexity reduction.
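The prior-aided linear MMSE estimation described above can be sketched as follows. This is a minimal, generic form, not the paper's algorithm: the function and variable names are ours, and it assumes a known channel convolution matrix, independent symbol priors (means and variances supplied by the decoder), and white Gaussian noise.

```python
import numpy as np

def mmse_soft_estimate(y, H, x_prior_mean, x_prior_var, noise_var, k):
    """Linear MMSE estimate of symbol k given priors on all symbols.

    y            : received vector, y = H @ x + n
    H            : channel convolution matrix
    x_prior_mean : prior means of the transmitted symbols (from the decoder)
    x_prior_var  : prior variances of the transmitted symbols
    noise_var    : variance of the additive white Gaussian noise
    """
    V = np.diag(x_prior_var)
    # Covariance of the observation under the symbol priors
    cov_y = H @ V @ H.T + noise_var * np.eye(len(y))
    h_k = H[:, k]
    # Prior mean plus the MMSE correction drawn from the observation
    return x_prior_mean[k] + x_prior_var[k] * (
        h_k @ np.linalg.solve(cov_y, y - H @ x_prior_mean))
```

With an identity channel and vanishing noise, the estimate collapses to the received sample itself, which is a quick sanity check on the formula.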

2.
A joint source-channel decoding (JSCD) scheme which exploits the combined a priori information of source and channel in an iterative manner is proposed. A sequence minimum mean-square error (SMMSE) estimator based on bit or symbol error transition probability of the channel with memory is proposed and used in the iterative decoding process. Simulation results show that our proposed scheme leads to significant improvement over the scheme without using the a priori information of the source or channel.

3.
The two-dimensional inverse electromagnetic-scattering problem of reconstructing the material properties of inhomogeneous lossy dielectric cylindrical objects is considered. The material properties are reconstructed from measured far-field scattering data, provided by the USAF Rome Laboratory Electromagnetic Measurement Facility in Ipswich. The targets in the Ipswich data set include perfectly electrically conducting (PEC) targets, penetrable (PEN) targets, and hybrid targets. A new method, which incorporates a priori information about the material properties, is proposed to solve the nonlinear inverse-scattering problem.

4.
We introduce a method for multichannel restoration of images in which there is severely limited knowledge about the undegraded signal, and possibly the noise. We assume that we know the channel degradations and that there will be a significant noise reduction in a postprocessing stage in which multiple realizations are combined. This post-restoration noise reduction is often performed when working with micrographs of biological macromolecules. The restoration filters are designed to enforce a projection constraint upon the entire system. This projection constraint results in a system that provides an oblique projection of the input signal into the subspace defined by the reconstruction device in a direction orthogonal to a space defined by the channel degradations and the restoration filters. The approach achieves noise reduction without distorting the signal by exploiting the redundancy of the measurements.

5.
On the use of a priori information for sparse signal approximations
Recent results have underlined the importance of incoherence in redundant dictionaries for a good behavior of decomposition algorithms like matching and basis pursuit. However, appropriate dictionaries for a given application may not be able to meet the incoherence condition. In such a case, decomposition algorithms may completely fail in the retrieval of the sparsest approximation. This paper studies the effect of introducing a priori knowledge when recovering sparse approximations over redundant dictionaries. Theoretical results show how the use of reliable a priori information (which in this paper appears under the form of weights) can improve the performance of standard approaches such as greedy algorithms and relaxation methods. Our results reduce to the classical case when no prior information is available. Examples validate and illustrate our theoretical statements.
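A weighted greedy pursuit of the kind discussed above can be sketched as follows. This is our own minimal illustration, not the paper's algorithm: prior weights simply bias the atom-selection step of matching pursuit toward atoms believed a priori to be active.

```python
import numpy as np

def weighted_matching_pursuit(signal, dictionary, weights, n_iter):
    """Greedy sparse approximation where a priori weights bias atom selection.

    dictionary : (dim, n_atoms) matrix with unit-norm columns
    weights    : prior confidence per atom in (0, 1]; higher = more likely picked
    """
    residual = np.asarray(signal, dtype=float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_iter):
        corr = dictionary.T @ residual
        k = int(np.argmax(weights * np.abs(corr)))  # prior-weighted selection
        coeffs[k] += corr[k]
        residual = residual - corr[k] * dictionary[:, k]
    return coeffs, residual
```

With uniform weights this reduces to plain matching pursuit, mirroring the paper's remark that the results reduce to the classical case when no prior is available.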

6.
The concept of combining multiple experts in a unified framework to generate a combined decision based on individual decisions delivered by the cooperating experts has been exploited in solving the problem of handwritten and machine printed character recognition. The level of performance achieved in terms of the absolute recognition performance and increased confidences associated with these decisions is very encouraging. However, the underlying philosophy behind this success is still not completely understood. The authors analyse the problem of decision combination of multiple experts from a completely different perspective. It is demonstrated that the success or failure of the decision combination strategy largely depends on the extent to which the various possible sources of information are exploited in designing the decision combination framework. Seven different multiple expert decision combination strategies are evaluated in terms of this information management issue. It is demonstrated that it is possible to treat the comparative evaluation of the multiple expert decision combination approaches, based on their capability for exploiting diverse information extracted from the various sources, as a yardstick in estimating the level of performance that is achievable from these combined configurations.
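The classical fixed combination rules studied in this literature (sum, product, max over per-expert class posteriors) can be sketched as follows. This is a generic illustration; the seven specific strategies evaluated in the paper are not reproduced here.

```python
import numpy as np

def combine_experts(posteriors, rule="sum"):
    """Fuse per-expert class posteriors into one decision.

    posteriors : (n_experts, n_classes) array of estimated class probabilities
    rule       : 'sum', 'product', or 'max' combination
    Returns the winning class index and the combined scores.
    """
    P = np.asarray(posteriors, dtype=float)
    if rule == "sum":
        scores = P.sum(axis=0)       # robust to single-expert errors
    elif rule == "product":
        scores = P.prod(axis=0)      # assumes expert independence
    elif rule == "max":
        scores = P.max(axis=0)       # trusts the most confident expert
    else:
        raise ValueError(f"unknown rule: {rule}")
    return int(np.argmax(scores)), scores
```

For instance, two experts voting 0.6/0.4 and 0.2/0.8 under the sum rule yield combined scores 0.8 and 1.2, so the second class wins despite the first expert's preference.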

7.
A Bayesian method is presented for simultaneously segmenting and reconstructing emission computed tomography (ECT) images and for incorporating high-resolution, anatomical information into those reconstructions. The anatomical information is often available from other imaging modalities such as computed tomography (CT) or magnetic resonance imaging (MRI). The Bayesian procedure models the ECT radiopharmaceutical distribution as consisting of regions, such that radiopharmaceutical activity is similar throughout each region. It estimates the number of regions, the mean activity of each region, and the region classification and mean activity of each voxel. Anatomical information is incorporated by assigning higher prior probabilities to ECT segmentations in which each ECT region stays within a single anatomical region. This approach is effective because anatomical tissue type often strongly influences radiopharmaceutical uptake. The Bayesian procedure is evaluated using physically acquired single-photon emission computed tomography (SPECT) projection data and MRI for the three-dimensional (3-D) Hoffman brain phantom. A clinically realistic count level is used. A cold lesion within the brain phantom is created during the SPECT scan but not during the MRI to demonstrate that the estimation procedure can detect ECT structure that is not present anatomically.

8.
A new approach is presented to determine atmospheric temperature profiles by combining measurements coming from different sources and taking into account evolution models derived by conventional meteorological observations. Using a historical database of atmospheric parameters and related microwave brightness temperatures, the authors have developed a data assimilation procedure based on the geostatistical Kriging method and the Kalman filtering suitable for processing satellite radiometric measurements available at each satellite pass, data of a ground-based radiometer, and temperature profiles from radiosondes released at specific times and locations. The Kalman filter technique and the geostatistical Kriging method as well as the principal component analysis have proved very powerful in exploiting climatological a priori information to build spatial and temporal evolution models of the atmospheric temperature field. The use of both historical radiosoundings (RAOBs) and a radiative transfer code allowed the estimation of the statistical parameters that appear in the models themselves (covariance and cross-covariance matrices, observation matrix, etc.). The authors have developed an algorithm, based on a Kalman filter supplemented with a Kriging geostatistical interpolator, that shows a significant improvement of accuracy in vertical profile estimations with respect to the results of a standard Kalman filter when applied to real satellite radiometric data.
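The standard linear Kalman predict/update cycle at the core of such an assimilation scheme can be sketched as follows. This is the textbook filter, not the paper's Kriging-supplemented variant; matrix names follow the usual convention and the evolution/observation models are assumed given.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : prior state estimate and its covariance
    z    : new observation (e.g. a radiometric measurement)
    F, H : state-evolution and observation matrices
    Q, R : process- and measurement-noise covariances
    """
    # Predict with the temporal evolution model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the new observation
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In the paper's setting, the climatological a priori information enters through the evolution model and the covariance matrices estimated from historical RAOBs.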

9.
Brain magnetic resonance imaging segmentation is accomplished in this work by applying nonparametric density estimation, using the mean shift algorithm in the joint spatial-range domain. The quality of the class boundaries is improved by including an edge confidence map, which represents the confidence of truly being in the presence of a border between adjacent regions; an adjacency graph is then constructed with the labeled regions, and analyzed and pruned to merge adjacent regions. In order to assign image regions to a cerebral tissue type, a spatial normalization between image data and standard probability maps is carried out, so that for each structure a maximum a posteriori probability criterion is applied. The method was applied to synthetic and real images, keeping all parameters constant throughout the process for each type of data. The combination of region segmentation and edge detection proved to be a robust technique, as adequate clusters were automatically identified, regardless of the noise level and bias. In a comparison with reference segmentations, average Tanimoto indexes of 0.90-0.99 were obtained for synthetic data and of 0.59-0.99 for real data, considering gray matter, white matter, and background.
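The mean-shift mode-seeking iteration underlying this segmentation can be sketched as follows. This is a simplified single-point version with a flat kernel, not the paper's full joint spatial-range implementation; feature vectors such as (row, col, intensity) are assumed.

```python
import numpy as np

def mean_shift_point(points, start, bandwidth, n_iter=50, tol=1e-6):
    """Mean-shift mode seeking with a flat kernel of given bandwidth.

    points : (n, d) feature vectors in the joint spatial-range domain,
             e.g. (row, col, intensity) for a grayscale MR slice.
    start  : initial position; the iteration climbs to a density mode.
    """
    points = np.asarray(points, dtype=float)
    x = np.asarray(start, dtype=float)
    for _ in range(n_iter):
        # Average the neighbors inside the kernel window
        mask = np.linalg.norm(points - x, axis=1) <= bandwidth
        new_x = points[mask].mean(axis=0)
        if np.linalg.norm(new_x - x) < tol:
            return new_x
        x = new_x
    return x
```

Pixels whose trajectories converge to the same mode are grouped into one region, which is the clustering step that precedes the adjacency-graph pruning described above.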

10.
A method that incorporates a priori uniform or nonuniform source distribution probabilistic information and data fluctuations of a Poisson nature is presented. The source distributions are modeled in terms of a priori source probability density functions. Maximum a posteriori probability solutions, as determined by a system of equations, are given. Iterative Bayesian imaging algorithms for the solutions are derived using an expectation-maximization technique. Comparisons of the a priori uniform and nonuniform Bayesian algorithms to the maximum-likelihood algorithm are carried out using computer-generated noise-free and Poisson randomized projections. Improvement in image reconstruction from projections with the Bayesian algorithm is demonstrated. Superior results are obtained using the a priori nonuniform source distribution.
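The expectation-maximization update for Poisson projection data can be sketched as follows. With a zero penalty weight this is the plain maximum-likelihood (MLEM) update; the optional one-step-late penalty term is our generic stand-in for a prior, not the paper's specific source density functions.

```python
import numpy as np

def mlem_update(x, A, y, prior_grad=None, beta=0.0):
    """One EM iteration for Poisson emission data y ~ Poisson(A @ x).

    x          : current nonnegative image estimate
    A          : system (projection) matrix
    y          : measured projection counts
    prior_grad : optional gradient of a prior penalty, applied one-step-late
    beta       : penalty weight (0 gives plain maximum likelihood)
    """
    sens = A.sum(axis=0)                        # sensitivity image
    if beta and prior_grad is not None:
        sens = sens + beta * prior_grad(x)      # hypothetical prior term
    ratio = y / np.maximum(A @ x, 1e-12)        # measured / predicted counts
    return x * (A.T @ ratio) / np.maximum(sens, 1e-12)
```

The multiplicative form preserves nonnegativity of the image at every iteration, one reason EM-type updates are standard for emission tomography.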

11.
In personal communication and networked paging, how to reach the desired user as quickly as possible with the minimum network load and cost is a problem well worth studying. Based on the intelligent network, this paper proposes the concept of a priori location information and presents an algorithm for applying this information in the location register. Computer performance simulation shows that it improves the intelligent-network performance of the existing location register.

12.
Features of the digital filtering of signals with an unknown structure are investigated. The analysis of combined application of various digital filtering algorithms aimed at the increase of the signal-to-noise ratio is provided. A quasi-optimal filtering algorithm for the input signal with a priori unknown structural parameters is developed.

13.
To combat the effects of intersymbol interference (ISI), the optimal equalizer to be used is based on maximum a posteriori (MAP) detection. In this paper, we consider the case where the MAP equalizer is fed with a priori information on the transmitted data and propose to study analytically its impact on the MAP equalizer performance. We assume that the channel is not perfectly estimated and show that the use of both the a priori information and the channel estimate is equivalent to a shift in terms of signal-to-noise ratio (SNR), for which we provide an analytical expression. Simulation results show that the analytical expression approximates the equalizer behavior well.

14.
This paper presents a retrieval algorithm that estimates spatial and temporal distribution of volumetric soil moisture content, at an approximate depth of 5 cm, using multitemporal ENVISAT Advanced Synthetic Aperture Radar (ASAR) alternating polarization images, acquired at low incidence angles (i.e., from 15° to 31°). The algorithm appropriately assimilates a priori information on soil moisture content and surface roughness in order to constrain the inversion of theoretical direct models, such as the integral equation method model and the geometric optics model. The a priori information on soil moisture content is obtained through simple lumped water balance models, whereas that on soil roughness is derived by means of an empirical approach. To update prior estimates of surface parameters, when no reliable a priori information is available, a technique based solely on the use of multitemporal SAR information is proposed. The developed retrieval algorithm is assessed on the Matera site (Italy) where multitemporal ground and ASAR data were simultaneously acquired in 2003. Simulated and experimental results indicate the possibility of attaining an accuracy of approximately 5% in the retrieved volumetric soil moisture content, provided that sufficiently accurate a priori information on surface parameters (i.e., within 20% of their whole variability range) is available. As an example, multitemporal soil moisture maps at watershed scale, characterized by a spatial resolution of approximately 150 m, are derived and illustrated in the paper.
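The general shape of such a prior-constrained inversion can be sketched as follows. This toy example minimizes a data-misfit-plus-prior-penalty cost over a grid of candidate states; the direct model, prior, and weight here are all illustrative placeholders, not the paper's integral equation or geometric optics models.

```python
import numpy as np

def constrained_retrieval(forward, y_obs, x_prior, gamma, candidates):
    """Pick the candidate state minimizing data misfit plus a prior penalty.

    forward    : direct model mapping a state to a predicted observation
    y_obs      : measured observation (e.g. backscatter)
    x_prior    : a priori estimate (e.g. from a lumped water balance model)
    gamma      : weight of the prior constraint (0 = pure data fit)
    candidates : 1-D array of admissible states (e.g. moisture values)
    """
    cost = [(forward(x) - y_obs) ** 2 + gamma * (x - x_prior) ** 2
            for x in candidates]
    return candidates[int(np.argmin(cost))]
```

Raising gamma pulls the retrieval toward the prior, which is how a priori information regularizes an otherwise ill-posed inversion.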

15.
This paper presents an innovative microwave technique, which is suitable for the detection of defects in nondestructive-test and nondestructive-evaluation (NDT/NDE) applications where a lot of a priori information is available. The proposed approach is based on the equations of the inverse scattering problem, which are solved by means of a minimization procedure based on a genetic algorithm. To reduce the number of problem unknowns, the available a priori information is efficiently exploited by introducing an updating procedure for the electric field computation based on the Sherman-Morrison-Woodbury formula. The results of a representative set of numerical experiments as well as comparisons with state-of-the-art methods are reported. They confirm the effectiveness, feasibility, and robustness of the proposed approach, which shows some interesting features from a computational point of view as well.
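The Sherman-Morrison-Woodbury identity mentioned above can be sketched as follows. This is the generic formula, not the paper's field-update code: given a precomputed inverse of A, the inverse of a low-rank modification A + UCV is obtained without refactoring the full matrix.

```python
import numpy as np

def woodbury_inverse_update(A_inv, U, C, V):
    """(A + U C V)^-1 from a precomputed A^-1 (Sherman-Morrison-Woodbury).

    Only a small inner system of the rank of the update is solved, which
    is why perturbing a few cells of a scatterer model is cheap.
    """
    C_inv = np.linalg.inv(C)
    inner = np.linalg.inv(C_inv + V @ A_inv @ U)
    return A_inv - A_inv @ U @ inner @ V @ A_inv
```

For a rank-k change to an n-by-n matrix the cost drops from O(n^3) to O(n^2 k), which is the computational feature the paper exploits inside its genetic-algorithm loop.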

16.
Nonparametric snakes.
Active contours, or so-called snakes, require some parameters to determine the form of the external force or to adjust the tradeoff between the internal forces and the external forces acting on the active contour. However, the optimal values of these parameters cannot be easily identified in a general sense. The usual way to find these required parameters is to run the algorithm several times for a different set of parameters, until a satisfactory performance is obtained. Our nonparametric formulation translates the problem of seeking these unknown parameters into the problem of seeking a good edge probability density estimate. Density estimation is a well-researched field, and our nonparametric formulation allows using well-known concepts of density estimation to get rid of the exhaustive parameter search. Indeed, with the use of kernel density estimation these parameters can be defined locally, whereas, in the original snake approach, all the shape parameters are defined globally. We tested the proposed method on synthetic and real images and obtained comparatively better results.
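The kernel density estimation the abstract leans on can be sketched as follows. This is the standard 1-D Gaussian KDE, our generic illustration rather than the paper's edge-density formulation.

```python
import numpy as np

def kde(samples, x, bandwidth):
    """Gaussian kernel density estimate at points x from 1-D samples.

    Each sample contributes a Gaussian bump of width `bandwidth`;
    the estimate is the normalized sum of all bumps.
    """
    samples = np.asarray(samples, dtype=float)[:, None]
    x = np.asarray(x, dtype=float)[None, :]
    kernels = np.exp(-0.5 * ((x - samples) / bandwidth) ** 2)
    return kernels.sum(axis=0) / (len(samples) * bandwidth * np.sqrt(2 * np.pi))
```

The bandwidth plays the role of a locally adaptable smoothing parameter, which is the property that lets a density-based snake avoid the global hand-tuned parameters of the original formulation.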

17.
Sequential detection under conditions of a priori uncertainty is investigated. A MAP sequential detector is developed and its performance is evaluated using mean path approximation. The results obtained are verified via comparison with previously published computer simulation research. The comparison shows a good agreement between theory and experiment. The sequential approach is shown to provide a greatly reduced error rate as compared with a nonsequential approach under the same signal/noise conditions.

18.
Interconnections are quickly becoming a dominant factor in the design of computer chips. Techniques to estimate interconnection lengths a priori (very early in the design flow) therefore gain attention and will become important for making the right design decisions when one still has the freedom to do so. However, at that time, one also knows least about the possible results of subsequent design steps. Conventional models for a priori estimation of wire lengths in computer chips use Rent's rule to estimate the number of terminals needed for communication between sets of gates. The number of interconnections then follows by taking into account that most nets are point-to-point connections. In this paper, we apply our previously introduced model for multiterminal nets to show that such nets have a fundamentally different influence on the wire length estimations than point-to-point nets. We then estimate the wire length distribution of Steiner tree lengths for applications related to routing resource estimation. Experiments show that the new estimated Steiner-length distributions capture the multiterminal effects much better than the previous point-to-point length distributions. The accuracy of the estimated values is still too low, as for the conventional point-to-point models, because we are still lacking a good model for placement optimization. However, the new results are a step closer to the application of wire length estimation techniques in real-world situations.
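Rent's rule, the starting point of the conventional models above, is a one-line power law. The sketch below uses illustrative default values for the Rent coefficient and exponent; both are design-dependent and are not taken from the paper.

```python
def rent_terminals(n_gates, t=3.5, p=0.6):
    """Rent's rule: expected external terminals for a block of n_gates gates.

    T = t * G**p, where t is the average number of terminals per gate and
    p is the Rent exponent. The defaults here are illustrative only.
    """
    return t * n_gates ** p
```

The sublinear exponent (p < 1) captures the empirical fact that a block of gates communicates mostly internally, so its external terminal count grows slower than its size; multiterminal nets break the simple step from terminal counts to wire counts, which is the point the paper develops.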

19.
One of the difficulties in the development of a reliable artificial pancreas for people with type 1 diabetes mellitus (T1DM) is the lack of accurate models of an individual's response to insulin. Most control algorithms proposed to control the glucose level in subjects with T1DM are model-based. Avoiding postprandial hypoglycemia (<60 mg/dl) while minimizing prandial hyperglycemia (>180 mg/dl) has shown to be difficult in a closed-loop setting due to the patient-model mismatch. In this paper, control-relevant models are developed for T1DM, as opposed to models that minimize a prediction error. The parameters of these models are chosen conservatively to minimize the likelihood of hypoglycemia events. To limit the conservatism due to large intersubject variability, the models are personalized using a priori patient characteristics. The models are implemented in a zone model predictive control algorithm. The robustness of these controllers is evaluated in silico, where hypoglycemia is completely avoided even after large meal disturbances. The proposed control approach is simple and the controller can be set up by a physician without the need for control expertise.

20.
This paper presents a new minimal and backward stable QR-LSL algorithm obtained through the proper interpretation of the system matrix that describes the adaptation and filtering operations of QR-RLS algorithms. The new algorithm is based on a priori prediction errors normalized by the a posteriori prediction error energy, as suggested by the interpretation of the system matrix, and uses the fact that the latter quantities can be computed via a lattice structure. Backward consistency and backward stability become guaranteed under simple numerical conventions. In contrast with the known a posteriori QR-LSL algorithm, the new algorithm presents lower numerical complexity, and backward consistency is guaranteed without the constraint of passive rotations in the recursive lattice section. Furthermore, reordering of some operations results in a version with identical numerical behavior and inherent parallelism that can be exploited for fast implementations. Both a priori and a posteriori QR-LSL algorithms are compared by means of simulations. For small mantissa wordlengths and forgetting factors λ not too close to 1, the proposed algorithm performs better due to dispensing with passive rotations. For forgetting factors very close to one and small wordlengths, both algorithms are sensitive to the accuracy of some well-identified computations.

