20 similar records found; search took 46 ms
1.
Vauhkonen M. Vadasz D. Karjalainen P.A. Somersalo E. Kaipio J.P. 《IEEE transactions on medical imaging》1998,17(2):285-293
The solution of impedance distribution in electrical impedance tomography is a nonlinear inverse problem that requires the use of a regularization method. The generalized Tikhonov regularization methods have been popular in the solution of many inverse problems. The regularization matrices that are usually used with the Tikhonov method are more or less ad hoc and the implicit prior assumptions are, thus, in many cases inappropriate. In this paper, the authors propose an approach to the construction of the regularization matrix that conforms to the prior assumptions on the impedance distribution. The approach is based on the construction of an approximating subspace for the expected impedance distributions. It is shown by simulations that the reconstructions obtained with the proposed method are better than with two other schemes of the same type when the prior is compatible with the true object. On the other hand, when the prior is incompatible with the true object, the method will still give reasonable estimates.
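The generalized Tikhonov step described above can be sketched in a few lines. This is a toy Python/NumPy illustration under assumed data: the forward operator `A`, the smooth "impedance" profile, and the first-difference penalty `L` are hypothetical stand-ins, not the paper's EIT forward model or its subspace-derived regularization matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))      # toy forward operator
x_true = np.linspace(0.0, 1.0, 10)     # smooth "impedance" profile (assumed prior)
b = A @ x_true + 0.01 * rng.standard_normal(20)

# Generalized Tikhonov: minimize ||A x - b||^2 + alpha^2 ||L x||^2,
# where L encodes the prior; here a first-difference matrix penalizing roughness.
L = np.eye(10) - np.eye(10, k=1)
alpha = 0.1
x_hat = np.linalg.solve(A.T @ A + alpha**2 * (L.T @ L), A.T @ b)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

Because the smoothness prior here matches the smooth test profile, the regularized solution stays close to the truth; with a prior that conflicts with the true object, the same machinery would bias the estimate.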
2.
Ghodrati A Brooks DH Tadmor G MacLeod RS 《IEEE transactions on bio-medical engineering》2006,53(9):1821-1831
We introduce two wavefront-based methods for the inverse problem of electrocardiography, which we term wavefront-based curve reconstruction (WBCR) and wavefront-based potential reconstruction (WBPR). In the WBCR approach, the epicardial activation wavefront is modeled as a curve evolving on the heart surface, with the evolution governed by factors derived phenomenologically from prior measured data. The body surface potential/wavefront relationship is modeled via an intermediate mapping of wavefront to epicardial potentials, again derived phenomenologically. In the WBPR approach, we iteratively construct an estimate of epicardial potentials from an estimated wavefront curve according to a simplified model and use it as an initial solution in a Tikhonov regularization scheme. Initial simulation results using measured canine epicardial data show considerable improvement in reconstructing activation wavefronts and epicardial potentials with respect to standard Tikhonov solutions. In particular, the WBCR method accurately finds the anisotropic propagation early after epicardial pacing, and the WBPR method finds the wavefront (regions of sharp gradient of the potential) both accurately and with minimal smoothing.
3.
Numeric regularization methods most often used to solve the ill-posed inverse problem of electrocardiography are spatial and ignore the temporal nature of the problem. In this paper, a Kalman filter reformulation incorporated temporal information to regularize the inverse problem, and was applied to reconstruct left ventricular endocardial electrograms based on cavitary electrograms measured by a noncontact, multielectrode probe. These results were validated against in situ electrograms measured with an integrated, multielectrode basket-catheter. A three-dimensional, probe-endocardium model was determined from multiplane fluoroscopic images. The boundary element method was applied to solve the boundary value problem and determine a linear relationship between endocardial and probe potentials. The Duncan and Horn formulation of the Kalman filter was employed and was compared to the commonly used zero- and first-order Tikhonov spatial regularization as well as the Twomey temporal regularization method. Endocardial electrograms were reconstructed during both sinus and paced rhythms. The Paige and Saunders solution of the Duncan and Horn formulation reconstructed endocardial electrograms with a relative error of 13% in potential amplitude, which was superior to solutions obtained with zero-order Tikhonov (relative error, 31%), first-order Tikhonov (relative error, 19%), and Twomey regularization (relative error, 44%). Likewise, activation time error in the inverse solution using the Duncan and Horn formulation (2.9 ms) was smaller than that of zero-order Tikhonov (4.8 ms), first-order Tikhonov (5.4 ms), and Twomey regularization (5.8 ms). Therefore, temporal regularization based on the Duncan and Horn formulation of the Kalman filter improves the solution of the inverse problem of electrocardiography.
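The idea of letting a Kalman filter supply the temporal regularization can be sketched minimally in Python/NumPy. Everything here is an illustrative assumption: a random-walk state model, a toy transfer matrix `H`, and small dimensions; the paper's Duncan and Horn formulation and boundary-element geometry are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, T = 4, 6, 50
H = rng.standard_normal((m, n))   # toy stand-in for the endocardium-to-probe transfer matrix
Q = 0.01 * np.eye(n)              # process noise for the random walk x_t = x_{t-1} + w_t
R = 0.05 * np.eye(m)              # measurement noise covariance

# Simulate slowly varying "electrogram" states and noisy probe measurements.
x = np.zeros((T, n))
for t in range(1, T):
    x[t] = x[t - 1] + rng.multivariate_normal(np.zeros(n), Q)
y = x @ H.T + rng.multivariate_normal(np.zeros(m), R, size=T)

# Kalman filter: predict with the random-walk model, then update with the measurement.
x_hat = np.zeros((T, n))
x_est = np.zeros(n)
P = np.eye(n)
for t in range(T):
    P = P + Q                                       # predict covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    x_est = x_est + K @ (y[t] - H @ x_est)          # measurement update
    P = (np.eye(n) - K @ H) @ P
    x_hat[t] = x_est

err = np.linalg.norm(x_hat - x, axis=1).mean()
```

The filter couples successive time frames through the state model, which is exactly the temporal information a frame-by-frame spatial regularizer discards.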
4.
5.
This paper explores regularization options for the ill-posed spline coefficient equations in the realistic Laplacian computation. We investigate the use of the Tikhonov regularization, truncated singular value decomposition, and the so-called lambda-correction with the regularization parameter chosen by the L-curve, generalized cross-validation, quasi-optimality, and the discrepancy principle criteria. The range of regularization techniques considered is much wider than in previous work. The improvement of the realistic Laplacian is investigated by simulations on the three-shell spherical head model. The conclusion is that the best performance is provided by the combination of the Tikhonov regularization and the generalized cross-validation criterion, a combination that has not been suggested for this task before.
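Of the parameter-choice rules listed above, generalized cross-validation (GCV) is straightforward to sketch using the SVD filter-factor form of Tikhonov regularization. The test problem below is a synthetic Python/NumPy stand-in, not the spline-Laplacian system.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 10))   # synthetic linear system
x_true = np.ones(10)
b = A @ x_true + 0.05 * rng.standard_normal(30)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
beta = U.T @ b
m = A.shape[0]

def gcv(lam):
    # GCV score: squared residual norm over squared effective residual dof.
    f = s**2 / (s**2 + lam**2)                       # Tikhonov filter factors
    resid2 = np.sum(((1 - f) * beta) ** 2) + (b @ b - beta @ beta)
    return resid2 / (m - np.sum(f)) ** 2

lams = np.logspace(-4, 2, 200)
lam_best = lams[int(np.argmin([gcv(l) for l in lams]))]
x_hat = Vt.T @ ((s / (s**2 + lam_best**2)) * beta)
```

Minimizing the GCV score over a grid avoids needing the noise level, which is what makes it attractive compared with the discrepancy principle.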
6.
Alberto Salinas R. Richardson C. Abidi M.A. Gonzalez R.C. 《Industrial Electronics, IEEE Transactions on》1996,43(3):355-363
Data fusion provides tools for solving problems which are characterized by distributed and diverse information sources. In this paper, the authors focus on the problem of extracting features such as image discontinuities from both synthetic and real images. Since edge detection and surface reconstruction are ill-posed problems in the sense of Hadamard, Tikhonov's regularization paradigm is proposed as the basic tool for solving this inversion problem and restoring well-posedness. The proposed framework includes: (1) a systematic view of one- and two-dimensional regularization; (2) extension of the standard Tikhonov regularization method by allowing space-variant regularization parameters; and (3) further extension of the regularization paradigm by adding multiple data sources to allow for data fusion. The theoretical approach is complemented by developing a series of algorithms, then solving the early vision problems of color edge detection and surface reconstruction. An evaluation of these methods reveals that the output of this new analytical data fusion technique is consistently better than each of the individual RGB edge maps, and noisy, corrupted images are reconstructed into smooth, noiseless surfaces.
7.
8.
Crimi A Lillholm M Nielsen M Ghosh A de Bruijne M Dam EB Sporring J 《IEEE transactions on medical imaging》2011,30(8):1514-1526
The estimation of covariance matrices is a crucial step in several statistical tasks. Especially when using few samples of a high dimensional representation of shapes, the standard maximum likelihood estimation (ML) of the covariance matrix can be far from the truth, is often rank deficient, and may lead to unreliable results. In this paper, we discuss regularization by prior knowledge using maximum a posteriori (MAP) estimates. We compare ML to MAP using a number of priors and to Tikhonov regularization. We evaluate the covariance estimates on both synthetic and real data, and we analyze the estimates' influence on a missing-data reconstruction task, where high resolution vertebra and cartilage models are reconstructed from incomplete and lower dimensional representations. Our results demonstrate that our methods outperform the traditional ML method and Tikhonov regularization.
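The rank-deficiency problem in the few-samples regime, and one simple Tikhonov-style remedy, can be demonstrated directly. The fixed shrinkage weight below is an illustrative assumption; the paper's MAP priors are more elaborate than shrinking toward a scaled identity.

```python
import numpy as np

rng = np.random.default_rng(3)
d, n = 50, 10                      # dimension >> sample count: ML estimate is rank deficient
X = rng.standard_normal((n, d))    # n samples of a d-dimensional "shape" representation

ml = np.cov(X, rowvar=False, bias=True)        # ML covariance, rank <= n-1 < d
target = np.trace(ml) / d * np.eye(d)          # scaled identity prior target
gamma = 0.5                                    # assumed shrinkage weight (not a MAP estimate)
shrunk = (1 - gamma) * ml + gamma * target     # regularized estimate: full rank, invertible

ml_rank = np.linalg.matrix_rank(ml)
shrunk_rank = np.linalg.matrix_rank(shrunk)
```

The shrunk estimate is positive definite, so downstream steps that invert the covariance (e.g., for reconstruction from incomplete data) become well posed.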
9.
10.
《IEEE transactions on bio-medical engineering》1997,44(6):447-454
The authors have previously proposed two novel solutions to the inverse problem of electrocardiography, the generalized eigensystem technique (GES) and the modified generalized eigensystem technique (tGES), and have compared these techniques with other numerical techniques using both homogeneous and inhomogeneous eccentric spheres model problems. In those studies the authors found their generalized eigensystem approaches generally gave superior performance over both truncated singular value decomposition (SVD) and zero-order Tikhonov regularization (TIK). Here, the authors extend the comparison to the case of a realistic heart-torso geometry. With this model, the GES and tGES approaches again provide smaller relative errors between the true potentials and the numerically derived potentials than the other methods studied. In addition, the isopotential maps recovered using GES and tGES appear to be more accurate than the maps recovered using either SVD or TIK.
11.
Li M Cao X Liu F Zhang B Luo J Bai J 《IEEE transactions on bio-medical engineering》2012,59(7):1799-1803
In fluorescence molecular tomography, the highly scattering property of biological tissues leads to an ill-posed inverse problem and reduces the accuracy of detection and localization of fluorescent targets. A regularization technique is usually employed to obtain a stable solution. Conventional Tikhonov regularization is based on singular value decomposition (SVD) and the L-curve strategy, which attempts to find a tradeoff between the residual norm and image norm. It is computationally demanding and may fail when there is no optimal turning point in the L-curve plot. In this letter, a neighborhood regularization method is presented. It achieves a reliable solution by computing the geometric mean of multiple regularized solutions. These solutions correspond to different regularization parameters with neighboring orders of magnitude. The main advantages lie in three aspects. First, it can produce comparable or better results in comparison with the conventional Tikhonov regularization with L-curve routine. Second, it replaces the time-consuming SVD computation with a trace-based pseudoinverse strategy, thus greatly reducing the computational cost. Third, it is robust and practical even when the L-curve technique fails. Results from numerical and phantom experiments demonstrate that the proposed method is easy to implement and effective in improving the quality of reconstruction.
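The core combination step, a geometric mean of Tikhonov solutions computed at neighboring orders of magnitude of the regularization parameter, can be sketched as follows. The forward matrix is a synthetic Python/NumPy stand-in, and the solutions are clipped positive before taking logarithms, an assumption made here because a geometric mean needs positive entries (fluorescence yields are nonnegative anyway).

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((40, 15))                  # synthetic forward matrix
x_true = np.abs(rng.standard_normal(15)) + 0.1     # nonnegative "fluorescence" target
b = A @ x_true + 0.02 * rng.standard_normal(40)

def tikhonov(lam):
    # Standard Tikhonov solve, clipped to positive values for the geometric mean.
    x = np.linalg.solve(A.T @ A + lam * np.eye(15), A.T @ b)
    return np.clip(x, 1e-6, None)

lams = [1e-3, 1e-2, 1e-1]                  # neighboring orders of magnitude
sols = np.array([tikhonov(l) for l in lams])
x_gm = np.exp(np.log(sols).mean(axis=0))   # element-wise geometric mean of the solutions

rel_err = np.linalg.norm(x_gm - x_true) / np.linalg.norm(x_true)
```

Averaging in the log domain damps components that are unstable across neighboring parameters while preserving those on which all solutions agree.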
12.
13.
Blind image restoration recovers the original image from a degraded observation when the point spread function is unknown. This paper proposes a blind restoration algorithm that alternates between wavelet denoising and total variation regularization. The observation model is first decomposed into two coupled sub-models; this decomposition converts the blind restoration problem into separate image denoising and image restoration problems, which can be solved by alternately applying denoising and restoration algorithms. In the blur identification stage, total variation regularization is used to estimate the point spread function; in the image restoration stage, the image is recovered by an algorithm combining wavelet denoising with total variation regularization. Experimental results and comparisons with other methods show that the proposed algorithm achieves better restoration quality.
14.
This paper describes a new approach to reconstruction of the conductivity field in electrical impedance tomography. Our goal is to improve the tradeoff between the quality of the images and the numerical complexity of the reconstruction method. In order to reduce the computational load, we adopt a linearized approximation to the forward problem that describes the relationship between the unknown conductivity and the measurements. In this framework, we focus on finding a proper way to cope with the ill-posed nature of the problem, mainly caused by strong attenuation phenomena; this is done by devising regularization techniques well suited to this particular problem. First, we propose a solution which is based on Tikhonov regularization of the problem. Second, we introduce an original regularized reconstruction method in which the regularization matrix is determined by space-uniformization of the variance of the reconstructed conductivities. Both methods are unsupervised, i.e., all tuning parameters are automatically determined from the measured data. Tests performed on simulated and real data indicate that Tikhonov regularization provides results similar to those obtained with iterative methods, but with a much smaller amount of computations. Regularization using a variance uniformization constraint yields further improvements, particularly in the central region of the unknown object where attenuation is most severe. We anticipate that the variance uniformization approach could be adapted to iterative methods that preserve the nonlinearity of the forward problem. More generally, it appears as a useful tool for solving other severely ill-posed reconstruction problems such as eddy current tomography.
15.
16.
Tikhonov regularization with a modified total variation regularization functional is used to recover an image from noisy, blurred data. This approach is appropriate for image processing in that it does not place a priori smoothness conditions on the solution image. An efficient algorithm is presented for the discretized problem that combines a fixed point iteration to handle nonlinearity with a new, effective preconditioned conjugate gradient iteration for large linear systems. Reconstructions, convergence results, and a direct comparison with a fast linear solver are presented for a satellite image reconstruction application.
17.
The performance of two methods for selecting the corner in the L-curve approach to Tikhonov regularization is evaluated via computer simulation. These methods are selecting the corner as the point of maximum curvature in the L-curve, and selecting it as the point where the product of abscissa and ordinate is a minimum. It is shown that both these methods resulted in significantly better regularization parameters than that obtained with an often-used empirical Composite REsidual and Smoothing Operator approach, particularly in conditions where correlated geometry noise exceeds Gaussian measurement noise. It is also shown that the regularization parameter that results with the minimum-product method is identical to that selected with another empirical zero-crossing approach proposed earlier.
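The minimum-product criterion can be sketched as follows on a synthetic ill-conditioned problem (Python/NumPy). One practical assumption is made explicit here: the search is restricted to a finite grid of lambda values, since the raw product of residual norm and solution norm tends toward zero again as lambda grows without bound.

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((30, 10)) @ np.diag(np.logspace(0, -3, 10))  # ill-conditioned system
x_true = np.ones(10)
b = A @ x_true + 0.01 * rng.standard_normal(30)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
beta = U.T @ b

def lcurve_point(lam):
    # Tikhonov solution via SVD, and its L-curve coordinates
    # (residual norm = abscissa, solution norm = ordinate).
    x = Vt.T @ ((s / (s**2 + lam**2)) * beta)
    return np.linalg.norm(A @ x - b), np.linalg.norm(x)

lams = np.logspace(-4, 0, 80)                  # assumed finite search range
prods = [r * xn for r, xn in (lcurve_point(l) for l in lams)]
lam_corner = lams[int(np.argmin(prods))]
```

Minimizing the product favors points near the corner, where neither the residual nor the solution norm dominates.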
18.
Direct inference of heart-surface potentials from body-surface potentials has been the goal of most recent work on electrocardiographic inverse solutions. We developed and tested indirect methods for inferring heart-surface potentials based on estimation of regularized multipole sources. Regularization was done using Tikhonov, constrained-least-squares, and multipole-truncation techniques. These multipole-equivalent methods (MEMs) were compared to the conventional mixed boundary-value method (BVM) in a realistic torso model with up to 20% noise added to body-surface potentials and +/-1 cm error in heart position and size. Optimal regularization was used for all inverse solutions. The relative error of inferred heart-surface potentials of the MEM was significantly less (p < 0.05) than that of the BVM using zeroth-order Tikhonov regularization in 10 of the 12 cases tested. These improvements occurred with a fourth-degree (24 coefficients) or smaller multipole moment. From these multipole coefficients, heart-surface potentials can be found at an unlimited number of heart-surface locations. Our indirect methods for estimating heart-surface potentials based on multipole inference appear to offer significant improvement over the conventional direct approach.
19.
Singular value decomposition-based reconstruction algorithm for seismic traveltime tomography
Lin-Ping Song Shu-Yi Zhang 《IEEE transactions on image processing》1999,8(8):1152-1154
A reconstruction method is given for seismic transmission traveltime tomography. The method is implemented via the combination of singular value decomposition, appropriate weighting matrices, and a variable regularization parameter. The problem is scaled through the weighting matrices so that the singular spectrum is normalized. To match the normalized singular values, the regularization parameter varies within the interval [0, 1], increasing linearly with the singular-value index from a small initial value rather than being held fixed, so that the contributions of the smaller singular values are suppressed. The experimental results show that the proposed method is superior to the ordinary singular value decomposition (SVD) methods such as truncated SVD and Tikhonov (1977) regularization.
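The index-dependent damping can be sketched as per-component filter factors applied to a normalized singular spectrum (Python/NumPy; the matrix, noise level, and the endpoints of the linear lambda ramp are illustrative assumptions, not the paper's choices).

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((30, 12)) @ np.diag(np.logspace(0, -4, 12))  # ill-conditioned system
x_true = np.ones(12)
b = A @ x_true + 0.01 * rng.standard_normal(30)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
s_norm = s / s[0]                           # normalized singular spectrum
lam = np.linspace(0.01, 1.0, len(s))        # lambda_i rises linearly with index i
filt = s_norm**2 / (s_norm**2 + lam**2)     # per-component filter factors
x_hat = Vt.T @ (filt / s * (U.T @ b))       # filtered SVD solution
```

Because lambda grows as the normalized singular values shrink, the filter factors decrease monotonically with index: large components pass almost unchanged while the noise-dominated small-singular-value components are damped smoothly rather than truncated outright.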
20.
Spectral reflectance reconstruction based on an improved R-matrix method
To address the ill-posedness that arises when the traditional R-matrix method is used to reconstruct spectral reflectance, an improved, regularized R-matrix method is proposed. The work studies how Tikhonov regularization can be used to control the ill-conditioning produced when the R-matrix method computes the Moore-Penrose pseudoinverse, and uses the L-curve method to obtain, from training-sample data, the optimal regularization parameter that effectively limits this ill-conditioning. Experimental results show that, compared with the plain R-matrix method, the improved regularized R-matrix method reduces the mean root-mean-square error of the reconstructed spectra by 0.00425, raises the mean goodness-of-fit coefficient by 1.325%, lowers the mean color difference by 0.1419, and performs well in practical applications. The method can meet the accuracy requirements for color reproduction of artworks and cultural artifacts.
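The substitution at the heart of the method, replacing a plain Moore-Penrose pseudoinverse with a Tikhonov-regularized one, can be illustrated on a synthetic ill-conditioned matrix (Python/NumPy; the matrix and the fixed lambda are assumptions, and the L-curve selection of lambda is not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(8)
# Synthetic ill-conditioned stand-in for the training-sample matrix
# (e.g., camera responses of training patches).
M = rng.standard_normal((31, 8)) @ np.diag(np.logspace(0, -6, 8))

pinv_plain = np.linalg.pinv(M)                 # plain Moore-Penrose pseudoinverse
lam = 1e-2                                     # assumed fixed parameter (L-curve in the paper)
pinv_tik = np.linalg.solve(M.T @ M + lam**2 * np.eye(8), M.T)   # Tikhonov-regularized pseudoinverse

# Regularization bounds the operator norm, so measurement noise is no longer
# amplified by the reciprocals of the tiny singular values.
norm_plain = np.linalg.norm(pinv_plain, 2)
norm_tik = np.linalg.norm(pinv_tik, 2)
```

The spectral norm of the regularized pseudoinverse is capped at 1/(2 lambda), whereas the plain pseudoinverse's norm is the reciprocal of the smallest singular value, which is what makes the unregularized reconstruction ill-conditioned.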