Similar Articles
20 similar articles found (search time: 15 ms)
1.
This paper presents a geometric mean scheme (GMS) to determine an optimal regularization factor for the Tikhonov regularization technique in system identification problems of linear elastic continua. The characteristics of non-linear inverse problems and the role of the regularization are investigated by the singular value decomposition of a sensitivity matrix of responses. It is shown that the regularization results in a solution of a generalized average between the a priori estimates and the a posteriori solution. Based on this observation, the optimal regularization factor is defined as the geometric mean between the maximum singular value and the minimum singular value of the sensitivity matrix of responses. The validity of the GMS is demonstrated through two numerical examples with measurement errors and modelling errors. Copyright © 2001 John Wiley & Sons, Ltd.
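The core idea can be sketched numerically. In this minimal example (the sensitivity matrix, noise level, and zero a priori estimate are all illustrative, not from the paper), the regularization factor is taken as the geometric mean of the extreme singular values and plugged into a standard Tikhonov solve:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-conditioned sensitivity matrix of responses
A = rng.standard_normal((40, 20)) @ np.diag(np.logspace(0, -6, 20))
x_true = rng.standard_normal(20)
b = A @ x_true + 1e-4 * rng.standard_normal(40)

# Optimal regularization factor: geometric mean of the extreme singular values
s = np.linalg.svd(A, compute_uv=False)
lam = np.sqrt(s.max() * s.min())

# Tikhonov-regularized solution (a priori estimate taken as zero here)
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(20), A.T @ b)
```

The geometric mean places the factor midway (on a log scale) between the best- and worst-conditioned modes, balancing data fit against stability.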

2.
In this paper, we discuss a deterministic regularization algorithm to handle the missing cone problem of three-dimensional optical diffraction tomography (ODT). The missing cone problem arises in most practical applications of ODT and is responsible for elongation of the reconstructed shape and underestimation of the value of the refractive index. By applying positivity and piecewise-smoothness constraints in an iterative reconstruction framework, we effectively suppress the missing cone artifact and recover sharp edges rounded out by the missing cone, and we significantly improve the accuracy of the predictions of the refractive index. We also show the noise-handling capability of our algorithm in the reconstruction process.
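The constrained iterative idea can be illustrated with a 1D analogue of the missing-cone problem, alternating between enforcing the measured spectral data and a positivity constraint (the signal, measured band, and iteration count below are invented for illustration; this is the generic projection-onto-convex-sets scheme, not the paper's exact algorithm):

```python
import numpy as np

n = 128
x_true = np.zeros(n)
x_true[40:60] = 1.0                          # nonnegative object

X = np.fft.fft(x_true)
known = np.abs(np.fft.fftfreq(n)) < 0.15     # measured band; the rest is "missing"

# Naive reconstruction from incomplete data: ringing and negative values
x = np.real(np.fft.ifft(np.where(known, X, 0)))
x_naive = x.copy()

for _ in range(200):
    x = np.maximum(x, 0.0)                   # positivity constraint (object domain)
    S = np.fft.fft(x)
    S[known] = X[known]                      # data-consistency constraint (Fourier domain)
    x = np.real(np.fft.ifft(S))
```

Because both constraint sets are convex and contain the true object, each projection can only move the iterate closer to it, which is what suppresses the artifact.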

3.
Diffuse tomography with near-infrared light has biomedical application for imaging hemoglobin, water, lipids, cytochromes, or exogenous contrast agents and is being investigated for breast cancer diagnosis. A Newton-Raphson inversion algorithm is used for image reconstruction of tissue optical absorption and transport scattering coefficients from frequency-domain measurements of modulated phase shift and light intensity. A variant of Tikhonov regularization is examined in which radial variation is allowed in the value of the regularization parameter. This method minimizes high-frequency noise in the reconstructed image near the source-detector locations and can produce constant image resolution and contrast across the image field.

4.
5.
For optical coherence tomography (OCT), ultrasound, synthetic-aperture radar, and other coherent ranging methods, speckle can cause spurious detail that detracts from the utility of the image. It is a problem inherent to imaging densely scattering objects with limited bandwidth. Using a method of regularization by minimizing Csiszar's I-divergence measure, we derive a method of speckle minimization that produces an image that both is consistent with the known data and extrapolates additional detail based on constraints on the magnitude of the image. This method is demonstrated on a test image and on an OCT image of a Xenopus laevis tadpole.

6.
An Optimal Regularized Smoothing Method for Experimental Observation Data
To filter out measurement noise, a data-processing method for optimal regularized smoothing of experimental observation data is proposed. The paper explains the basic principle of the method and analyzes the key issues of choosing the stabilizing functional and the regularization parameter. The effectiveness of the regularized smoothing method is verified through a numerical simulation example; this method offers distinct advantages in fields such as solving inverse problems of mathematical physics.
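A sketch of this kind of regularized smoothing, using a second-difference stabilizing functional and a hand-picked regularization parameter (the signal, noise level, and parameter value are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)   # noisy observations

# Second-difference operator as the stabilizing functional
n = t.size
D = np.diff(np.eye(n), n=2, axis=0)

# Minimize ||x - y||^2 + lam * ||D x||^2 (closed-form normal equations)
lam = 1.0
y_smooth = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
```

In practice the regularization parameter would itself be chosen optimally (e.g. from the noise level) rather than fixed by hand as here.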

7.
A new nonlinear level-set regularization method to reconstruct the complex refractive index distribution with in-line phase contrast tomography measurements is presented under the assumption that the index is piecewise constant. The nonlinear iterative approach is based on the Fréchet derivative of the intensity recorded at a single propagation distance and for several projection angles. The algorithm is successfully applied to a multi-material object for several noise levels. Better reconstruction results are achieved with a stochastic perturbation of the level-set function. This evolution corresponds to a stochastic evolution of the shape of the reconstructed regions. The reconstruction errors can be further decreased with topological derivatives. The different algorithms are tested on various multi-material objects.

8.
Chen LY, Pan MC, Pan MC. Applied Optics, 2012, 51(1): 43-54.
In this study, we first propose the use of edge-preserving regularization in optimizing an ill-conditioned problem in the reconstruction procedure for diffuse optical tomography to prevent unwanted edge smoothing, which usually degrades the attributes of images for distinguishing tumors from background tissues when using Tikhonov regularization. In the edge-preserving regularization method presented here, a potential function with edge-preserving properties is introduced as a regularized term in an objective function. With the minimization of this proposed objective function, an iterative method to solve this optimization problem is presented in which half-quadratic regularization is introduced to simplify the minimization task. Both numerical and experimental data are employed to justify the proposed technique. The reconstruction results indicate that edge-preserving regularization provides a superior performance over Tikhonov regularization.
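The half-quadratic trick can be sketched in 1D: an edge-preserving potential phi(t) = sqrt(delta^2 + t^2) is majorized by a quadratic at each iterate, so every step reduces to a weighted least-squares solve (the signal, delta, and lambda below are invented for illustration; this is the generic scheme, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
f_true = np.where(np.arange(n) < n // 2, 0.0, 1.0)   # signal with a sharp edge
f = f_true + 0.05 * rng.standard_normal(n)

D = np.diff(np.eye(n), axis=0)                       # first-difference operator
lam, delta = 0.5, 1e-2
phi = lambda t: np.sqrt(delta**2 + t**2)             # edge-preserving potential
J = lambda x: np.sum((x - f) ** 2) + lam * np.sum(phi(D @ x))

x = f.copy()
for _ in range(30):
    w = 1.0 / (2.0 * phi(D @ x))                     # half-quadratic weights
    # Weighted least-squares step: minimize ||x - f||^2 + lam * sum(w_i (Dx)_i^2)
    x = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), f)
```

Because each quadratic surrogate touches the true objective at the current iterate, the objective J is guaranteed not to increase, while large gradients (edges) receive small weights and are preserved.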

9.
This paper focuses on optimization of the geometrical parameters of peripheral milling tools by taking the dynamic effect into account. A substructure synthesis technique is used to calculate the frequency response function of the tool point, which is adopted to determine the stability lobe diagram. Based on the Taguchi design method, simulations are first conducted for varying combinations of tool overhang length, helix angle, and teeth number. The optimal geometrical parameters of the tool are determined through an orthogonal analysis of the maximum axial depth of cut, which is obtained from the predicted stability lobe diagram. It was found that the order of influence of the factors on the optimal tool geometry is tool overhang length, teeth number, and helix angle. Finally, a series of experiments was carried out as a parameter study to determine the influence of the tool overhang length, helix angle, and teeth number on the cutting stability of the mill. The same conclusion as that obtained through the simulation was observed. The full text can be downloaded at https://link.springer.com/content/pdf/10.1007%2Fs40436-018-0226-9.pdf

10.
Feng J, Qin C, Jia K, Zhu S, Liu K, Han D, Yang X, Gao Q, Tian J. Applied Optics, 2012, 51(19): 4501-4512.
Regularization methods have been broadly applied to bioluminescence tomography (BLT) to obtain stable solutions, including l2 and l1 regularizations. However, l2 regularization can oversmooth reconstructed images and l1 regularization may sparsify the source distribution, which degrades image quality. In this paper, the use of total variation (TV) regularization in BLT is investigated. Since a nonnegativity constraint can lead to improved image quality, the nonnegative constraint should be considered in BLT. However, TV regularization with a nonnegativity constraint is extremely difficult to solve due to its nondifferentiability and nonlinearity. The aim of this work is to validate the split Bregman method to minimize the TV regularization problem with a nonnegativity constraint for BLT. The performance of the split Bregman-resolved TV (SBRTV) based BLT reconstruction algorithm was verified with numerical and in vivo experiments. Experimental results demonstrate that the SBRTV regularization can provide better regularization quality over l2 and l1 regularizations.  
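In 1D, split Bregman TV denoising with a nonnegativity projection can be sketched as follows (the signal, lambda, mu, and iteration count are illustrative, and this is the generic Goldstein-Osher scheme with a simple projection step, not the paper's exact BLT formulation):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
f_true = np.where(np.arange(n) < n // 2, 0.0, 1.0)     # nonnegative piecewise-constant source
f = f_true + 0.1 * rng.standard_normal(n)

D = np.diff(np.eye(n), axis=0)                          # first-difference (TV) operator
lam, mu = 0.1, 1.0
shrink = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

u = f.copy()
d = np.zeros(n - 1)                                     # auxiliary variable for D @ u
bb = np.zeros(n - 1)                                    # Bregman variable
M = np.eye(n) + mu * D.T @ D                            # constant system matrix
for _ in range(100):
    u = np.linalg.solve(M, f + mu * D.T @ (d - bb))     # quadratic subproblem
    u = np.maximum(u, 0.0)                              # nonnegativity projection
    d = shrink(D @ u + bb, lam / mu)                    # TV shrinkage
    bb = bb + D @ u - d                                 # Bregman update
```

Splitting moves the nondifferentiable TV term into a cheap soft-thresholding step, leaving only a well-conditioned linear solve per iteration.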

11.
Diffuse optical tomographic imaging is known to be an ill-posed problem, and a penalty/regularization term is used in image reconstruction (inverse problem) to overcome this limitation. Two schemes that are prevalent are spatially varying (exponential) and constant (standard) regularizations/penalties. A scheme that is also spatially varying but uses the model information is introduced based on the model-resolution matrix. This scheme, along with exponential and standard regularization schemes, is evaluated objectively based on model-resolution and data-resolution matrices. This objective analysis showed that resolution characteristics are better for spatially varying penalties compared to standard regularization; and among spatially varying regularization schemes, the model-resolution based regularization fares well in providing improved data-resolution and model-resolution characteristics. The verification of the same is achieved by performing numerical experiments in reconstructing 1% noisy data involving simple two- and three-dimensional imaging domains.
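For a linearized problem with Jacobian J, the model-resolution matrix under Tikhonov regularization is R = (J'J + lambda*I)^(-1) J'J, and its diagonal indicates how well each parameter is resolved. A sketch (J and lambda are invented placeholders, and the weighting at the end is only one plausible spatially varying scheme, not necessarily the paper's):

```python
import numpy as np

rng = np.random.default_rng(4)
J = rng.standard_normal((60, 30))                 # hypothetical Jacobian
lam = 1.0

# Model-resolution matrix: maps the true model to the reconstructed one
R = np.linalg.solve(J.T @ J + lam * np.eye(30), J.T @ J)

# Spatially varying penalty idea: penalize poorly resolved parameters more
resolution = np.diag(R)                           # each entry lies in [0, 1]
penalty = lam * (1.0 + (1.0 - resolution / resolution.max()))
```

A perfectly resolved parameter has a diagonal entry near 1; entries near 0 flag directions the data barely constrain, which is where extra regularization is most defensible.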

12.
Han D, Yang X, Liu K, Qin C, Zhang B, Ma X, Tian J. Applied Optics, 2010, 49(36): 6930-6937.
Fluorescence molecular tomography (FMT) is a promising technique for in vivo small animal imaging. In this paper, the sparsity of the fluorescent sources is considered as the a priori information and is promoted by incorporating L1 regularization. Then a reconstruction algorithm based on stagewise orthogonal matching pursuit is proposed, which treats the FMT problem as the basis pursuit problem. To evaluate this method, we compare it to the iterated-shrinkage-based algorithm with L1 regularization. Numerical simulations and physical experiments show that the proposed method can obtain comparable or even slightly better results. More importantly, the proposed method was at least 2 orders of magnitude faster in these experiments, which makes it a practical reconstruction algorithm.
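Stagewise OMP is a batched variant of plain orthogonal matching pursuit; the greedy core can be sketched as follows (the dictionary, sparsity level, and signal are illustrative, and this is plain OMP rather than the stagewise version used in the paper):

```python
import numpy as np

def omp(A, y, k):
    """Greedy sparse recovery: pick k atoms, refit by least squares each step."""
    resid, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ resid))))   # most correlated atom
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        resid = y - A[:, support] @ coef                      # update residual
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(5)
A = rng.standard_normal((50, 100))
A /= np.linalg.norm(A, axis=0)                                # unit-norm atoms
x_true = np.zeros(100)
x_true[[7, 33, 80]] = [2.0, -1.5, 1.0]
x_hat = omp(A, A @ x_true, 3)
```

Greedy pursuit avoids the iterative shrinkage loop entirely, which is the source of the speedup the paper reports.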

13.
The method of Tarantola [1], based on Bayesian statistical theory for solving general inverse problems, is applied to inverse elasticity problems and is compared to the spatial regularization technique presented in Schnur and Zabaras [2]. It is shown that when normal Gaussian distributions are assumed and the error in the data is uncorrelated, the Bayesian statistical theory takes a form similar to the deterministic regularization method presented earlier in Schnur and Zabaras [2]. As such, the statistical theory can be used to provide a statistical interpretation of regularization and to estimate the error in the solution of the inverse problem. Examples are presented to demonstrate the effect of the regularization parameters and of the error in the initial data on the solution.
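The correspondence can be stated compactly: with uncorrelated Gaussian data errors of variance sigma_d^2 and a Gaussian prior of variance sigma_m^2 about an initial model m_0, the maximum a posteriori estimate minimizes (symbols follow the usual conventions, not necessarily the papers' exact notation):

```latex
\hat{m} \;=\; \arg\min_{m}\; \|d - G(m)\|^{2} \;+\; \lambda\,\|m - m_{0}\|^{2},
\qquad \lambda = \frac{\sigma_d^{2}}{\sigma_m^{2}}
```

so the deterministic regularization parameter acquires a statistical meaning as a noise-to-prior variance ratio.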

14.
The nonlinear variation regularization algorithm (NVRA) is an effective method to enhance the contrast and robustness of reconstruction in medical imaging. To improve the reconstruction quality, variation regularization is introduced into the nonlinear algorithm based on one component of the magnetic flux density obtained by injecting one current. First, we propose a novel algorithm for magnetic resonance electrical impedance tomography (MREIT) using the NVRA and clarify its implementation. Second, we analyze the performance of the proposed nonlinear algorithm and the linear sensitivity method with noisy data in phantom models. Finally, for a 0.36 T low-field-intensity magnetic resonance scanner, we present a method for reducing the electrode model error and evaluate the performance of the two reconstruction algorithms in an agar gel model. The results indicate that the NVRA improves reconstruction quality with sharp contrast and is more robust to noise than the sensitivity method. In addition, this study shows that with just one current injection and one component of the magnetic flux density a high-quality image can be obtained, which promotes the clinical application of MREIT. © 2015 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 25, 68–76, 2015

15.
Cone-beam X-ray luminescence computed tomography (CB-XLCT) is an attractive hybrid imaging modality with the potential to monitor the metabolic processes of nanophosphor-based drugs in vivo. However, XLCT imaging suffers from a severely ill-posed inverse problem. In this work, a sparse nonconvex Lp (0 < p < 1) regularization was investigated as an alternative to L1 regularization. Further, an iteratively reweighted split augmented Lagrangian shrinkage algorithm (IRW_SALSA-Lp) was proposed to efficiently solve the nonconvex Lp (0 < p < 1) problem. Its performance was evaluated over a range of p-values (1/16, 1/8, 1/4, 3/8, 1/2, 5/8, 3/4, 7/8) in both 3D digital mouse experiments and in vivo experiments. The results demonstrate that the proposed nonconvex methods outperform L2 and L1 regularization in accurately recovering sparse targets in CB-XLCT.

16.
Akesson EO, Daun KJ. Applied Optics, 2008, 47(3): 407-416.
Deconvolution of optically collected axisymmetric flame data is equivalent to solving an ill-posed problem subject to severe error amplification. Tikhonov regularization has recently been shown to be well suited for stabilizing this deconvolution, although the success of this method hinges on choosing a suitable regularization parameter. Incorporating a parameter selection scheme transforms this technique into a reliable automatic algorithm that outperforms unregularized deconvolution of a smoothed data set, which is currently the most popular way to analyze axisymmetric data. We review the discrepancy principle, L-curve curvature, and generalized cross-validation parameter selection schemes and conclude that the L-curve curvature algorithm is best suited to this problem.
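The discrepancy principle, the simplest of the three schemes reviewed, can be sketched for a Tikhonov-regularized deconvolution: sweep lambda from large to small and stop once the residual drops to the expected noise level (the blurring kernel, noise level, and lambda grid below are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 64
# Gaussian blurring kernel as a mildly ill-posed forward operator
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 3.0) ** 2)
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
sigma = 1e-3
b = A @ x_true + sigma * rng.standard_normal(n)

# Discrepancy principle: largest lambda whose residual matches the noise level
target = sigma * np.sqrt(n)
for lam in np.logspace(2, -8, 40):
    x_lam = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
    if np.linalg.norm(A @ x_lam - b) <= target:
        break
```

The L-curve and generalized cross-validation schemes differ only in the stopping criterion; the discrepancy principle needs an estimate of the noise level, which the other two avoid.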

17.
Yi H, Chen D, Qu X, Peng K, Chen X, Zhou Y, Tian J, Liang J. Applied Optics, 2012, 51(7): 975-986.
In this paper, a multilevel, hybrid regularization method is presented for fluorescent molecular tomography (FMT) based on the hp-finite element method (hp-FEM) with a continuous wave. The hybrid regularization method combines sparsity regularization and Landweber iterative regularization to improve the stability of the solution of the ill-posed inverse problem. In the first coarse mesh level, considering the fact that the fluorescent probes are sparsely distributed in the entire reconstruction region in most FMT applications, the sparse regularization method is employed to take full advantage of this sparsity. In the subsequent refined mesh levels, since the reconstruction region is reduced and the initial value of the unknown parameters is provided from the previous mesh, these mesh levels differ from the first level. As a result, the Landweber iterative regularization method is applied for reconstruction. Simulation experiments on a 3D digital mouse atlas and physical experiments on a phantom are conducted to evaluate the performance of our method. The reconstructed results show the potential and feasibility of the proposed approach.
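Landweber iteration itself is just gradient descent on the data misfit, with early stopping acting as the regularizer. A minimal sketch on a synthetic linear problem (the matrix, step size, and iteration count are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((80, 40))
x_true = rng.standard_normal(40)
y = A @ x_true

# Landweber: x_{k+1} = x_k + tau * A^T (y - A x_k), with tau < 2 / ||A||_2^2
tau = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(40)
for _ in range(2000):
    x = x + tau * A.T @ (y - A @ x)
```

With noisy data the iteration count would be capped early, since later iterations start fitting the noise in the poorly conditioned directions.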

18.
There is a trade-off between uniformity and diffraction efficiency in the design of diffractive optical elements. It is caused by the inherent ill-posedness of the design problem itself. For the optimal design, the optimum trade-off needs to be obtained. The trade-off between uniformity and diffraction efficiency in the design of diffractive optical elements is theoretically investigated based on the Tikhonov regularization theory. A novel scheme of an iterative Fourier transform algorithm with regularization to obtain the optimum trade-off is proposed.

19.
Several mathematical models have been developed separately for determining production and recycled lot quantities that minimize total production cost, and for setting the optimal process mean to minimize total quality cost. For a single-stage discrete-part production system, this paper presents a mathematical model that combines these two interrelated aspects. The production process has a controllable parameter, the process mean, which determines the output lot quality. A one-sided tolerance is used to judge the quality of finished goods. Bad units are reworked before they can be reprocessed with the fresh input. For such a situation, the model determines the production lot size, the recycling lot quantity, and a process mean while minimizing total system cost. The model has been validated using sample data from a pharmaceutical company. Results indicate that the production lot size and the recycled quantity have an inverse relationship with the process mean and a direct relationship with the process variance.

20.
Ma L, Cai W. Applied Optics, 2008, 47(21): 3751-3759.
We investigate the simultaneous tomographic reconstruction of temperature and species concentration using hyperspectral absorption spectroscopy. Previous work on absorption tomography has relied on a small number of wavelengths, resulting in the requirement of a large number of projections and limited measurement capability. Here we develop a tomographic inversion method to exploit the increased spectral information content enabled by recent advancement in laser technologies. The simulation results clearly demonstrate that the use of hyperspectral absorption data significantly reduces the number of projections, enables simultaneous mapping of temperature and species concentration, and provides more stable reconstruction compared with traditional absorption tomographic techniques.
