Similar Documents
20 similar documents found (search time: 109 ms)
1.
We present a regularized phenomenological multiscale model where elastic properties are computed using direct homogenization and subsequently evolved using a simple three‐parameter orthotropic continuum damage model. The salient feature of the model is a unified regularization framework based on the concept of effective softening strain. The unified regularization scheme has been employed in the context of constitutive law rescaling and the staggered nonlocal approach. We show that an element erosion technique for crack propagation, when exercised with one of the two regularization schemes, (1) possesses a characteristic length, (2) is consistent with the fracture mechanics approach, and (3) does not suffer from pathological mesh sensitivity. Copyright © 2014 John Wiley & Sons, Ltd.

2.
We present an efficient adjoint-based framework for computing sensitivities of quantities of interest with respect to material parameters for coupled fluid-structural acoustic systems with explicit interface coupling. The fluid is modeled using the Helmholtz equation and the structure is modeled using the Navier-Cauchy equations. Sensitivities are used to drive a gradient-based optimization algorithm to solve important problems in structural acoustics, viz., noise minimization and vibration isolation. For each problem, we consider two different priors: one where the optimal solution has a smooth variation and another with a bimaterial distribution. These priors are imposed with the help of suitable regularization terms. The effectiveness of this approach is demonstrated on both interior and exterior structural acoustic problems.

3.
ABSTRACT

Medical, satellite, and microscopic images differ in the imaging techniques used; hence their underlying noise distributions also differ. Most restoration methods, including regularization models, make prior assumptions about the noise in order to perform an efficient restoration. Here we propose a system that estimates and classifies the noise into different distributions by extracting the relevant features. The system provides information about the noise distribution, which is then passed to the restoration module, where an appropriate regularization method (based on the non-local framework) is employed to provide an efficient restoration of the data. We have effectively addressed the distortion due to data-dependent noise distributions such as Poisson and Gamma, along with data-uncorrelated Gaussian noise. The studies have shown a 97.7% accuracy in classifying noise in the test data. Moreover, the system also shows the capability to cater to other popular noise distributions such as Rayleigh, Chi, etc.

4.
Electrical capacitance tomography (ECT) attempts to image the permittivity distribution of an object by measuring the electrical capacitance between sets of electrodes placed around its periphery. Image reconstruction in ECT is a nonlinear ill-posed inverse problem, and regularization methods are needed to stabilize this inverse problem. The reconstruction of complex shapes (sharp edges) and absolute permittivity values is a more difficult task in ECT, and the commonly used regularization methods in Tikhonov minimization are unable to solve these problems. In the standard Tikhonov regularization method, the regularization matrix has a Laplacian-type structure, which encourages smoothing reconstruction. A Helmholtz-type regularization scheme has been implemented to solve the inverse problem with complicated-shape objects and the absolute permittivity values. The Helmholtz-type regularization has a wavelike property and encourages variations of permittivity. The results from experimental data demonstrate the advantage of the Helmholtz-type regularization for recovering sharp edges over the popular Laplacian-type regularization in the framework of Tikhonov minimization. Furthermore, this paper presents examples of the reconstructed absolute value permittivity map in ECT using experimental phantom data.
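The Tikhonov setup this abstract contrasts its Helmholtz-type scheme against can be sketched on a generic linear inverse problem. The following is a minimal NumPy illustration under stated assumptions: J, b, and the second-difference matrix are toy stand-ins for a sensitivity matrix, measured data, and the Laplacian-type regularization matrix; it is not the authors' ECT implementation.

```python
import numpy as np

def tikhonov_solve(J, b, L, lam):
    """Minimize ||J x - b||^2 + lam ||L x||^2 via the normal equations."""
    return np.linalg.solve(J.T @ J + lam * (L.T @ L), J.T @ b)

def laplacian_1d(n):
    """Second-difference (Laplacian-type) regularization matrix."""
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    return L

# Toy ill-posed problem: a Gaussian blur of a piecewise-constant profile.
n = 50
x_true = np.where(np.arange(n) < n // 2, 1.0, 2.0)
d = np.arange(n)[:, None] - np.arange(n)[None, :]
J = np.exp(-0.1 * d ** 2)
b = J @ x_true + 0.01 * np.random.default_rng(0).normal(size=n)

x_hat = tikhonov_solve(J, b, laplacian_1d(n), lam=1e-2)
```

The Laplacian-type LᵀL term penalizes curvature, which is exactly why it smooths sharp edges; swapping in a different operator, as the abstract describes, changes which permittivity variations are penalized.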

5.
We consider tomographic imaging problems where the goal is to obtain both a reconstructed image and a corresponding segmentation. A classical approach is to first reconstruct and then segment the image; more recent approaches use a discrete tomography approach where reconstruction and segmentation are combined to produce a reconstruction that is identical to the segmentation. We consider instead a hybrid approach that simultaneously produces both a reconstructed image and segmentation. We incorporate priors about the desired classes of the segmentation through a Hidden Markov Measure Field Model, and we impose a regularization term for the spatial variation of the classes across neighbouring pixels. We also present an efficient implementation of our algorithm based on state-of-the-art numerical optimization algorithms. Simulation experiments with artificial and real data demonstrate that our combined approach can produce better results than the classical two-step approach.

6.
In the past, nonlinear unconstrained optimization of the optical imaging problem has focused on Newton-Raphson techniques. Besides requiring expensive computation of the Jacobian, the unconstrained minimization with Tikhonov regularization can pose significant storage problems for large-scale reconstructions, involving a large number of unknowns necessary for realization of optical imaging. We formulate the inverse optical imaging problem as both simple-bound constrained and unconstrained minimization problems in order to illustrate the reduction in computational time and storage associated with constrained image reconstructions. The forward simulator of excitation and generated fluorescence, consisting of the Galerkin finite-element formulation, is used in an inverse algorithm to find the spatial distribution of absorption and lifetime that minimizes the difference between predicted and synthetic frequency-domain measurements. The inverse approach employs the truncated Newton method with trust region and a modification of automatic reverse differentiation to speed the computation of the optimization problem. The reconstruction results confirm that the physically based, constrained minimization with efficient optimization schemes may offer a more logical approach to the large-scale optical imaging problem than unconstrained minimization with regularization.

7.
We develop rate‐dependent regularization approaches for three‐dimensional frictional contact constraints based on the Kelvin and Maxwell viscoelastic constitutive models. With the present regularization schemes, we aim to provide a basis to better model friction and to stabilize the contact analysis while keeping the contact model as simple as possible. The key feature of the regularization approaches, implemented using an implicit time integrator, is that one can recover in the limit the widely used rate‐independent elastoplastic regularization framework without encountering numerical difficulties. Intermediate contact tractions are defined in terms of the relative displacement, the relative velocity, and the regularization parameters. The projection operators operate on the intermediate tractions and yield contact tractions that satisfy all the discretized contact constraints. The use of projection operators allows a systematic implementation of the present regularization schemes. Through numerical simulations, we observed that the Maxwell‐type regularization effectively avoids convergence problems, even for relatively large time step sizes, while the Kelvin‐type regularization does not. Copyright © 2003 John Wiley & Sons, Ltd.

8.
We discuss a one-dimensional inverse material profile reconstruction problem that arises in layered media underlain by a rigid bottom, when total wavefield surficial measurements are used to guide the reconstruction. To tackle the problem, we adopt the systematic framework of PDE-constrained optimization and construct an augmented misfit functional that is further endowed by a regularization scheme. We report on a comparison of spatial regularization schemes such as Tikhonov and total variation against a temporal scheme that treats the model parameters as time-dependent. We study numerically the effects of inexact initial estimates, data noise, and regularization parameter choices for all three schemes, and report inverted profiles for the modulus, and for simultaneous inversion of both the modulus and viscous damping. Our numerical experiments demonstrate comparable or superior performance of the time-dependent regularization over the Tikhonov and total variation schemes for both smooth and sharp target profiles, albeit at increased computational cost. Support for this work was provided by the US National Science Foundation under grant awards ATM-0325125 and CMS-0348484.

9.
Components based on shape‐memory alloys are often subjected to several loading cycles that result in substantial alteration of material behavior. In such a framework, accurate models, as well as robust and efficient numerical approaches, become essential to allow for the simulation of complex devices. The present paper focuses on the numerical simulation of quasi‐static problems involving shape‐memory alloy structures or components subjected to multiple loading‐unloading cycles. A novel state‐update procedure for a three‐dimensional phenomenological model able to describe the saturation of permanent inelasticity, including degradation effects, is proposed here. The algorithm, being of the predictor‐corrector type and relying on an incremental energy minimization approach, is based on elastic checks, closed‐form solutions of polynomial equations, and nonlinear scalar equations solved through a combination of Newton‐Raphson and bisection methods. This allows for an easy implementation of model equations and to avoid the use of regularization parameters for the treatment of nonsmooth functions. Numerical results assess the good performances of the proposed approach in predicting both pseudoelastic and shape‐memory material behavior under cyclic loading as well as algorithm robustness.

10.
This article presents a systematic approach to analysing linear integer multi-objective optimization problems with uncertainty in the input data. The goal of this approach is to provide decision makers with meaningful information to facilitate the selection of a solution that meets performance expectations and is robust to input parameter uncertainty. Standard regularization techniques often deal with global stability concepts. The concept presented here is based on local quasi-stability and includes a local regularization technique that may be used to increase the robustness of any given efficient solution or to transform efficient solutions that are not robust (i.e. unstable), into robust solutions. An application to a multi-objective problem drawn from the mining industry is also presented.

11.
Bilateral filtering for structural topology optimization
Filtering has been a major approach used in the homogenization‐based methods for structural topology optimization to suppress the checkerboard pattern and relieve the numerical instabilities. In this paper a bilateral filtering technique originally developed in image processing is presented as an efficient approach to regularizing the topology optimization problem. A non‐linear bilateral filtering process leads to a suitable problem regularization to eliminate the checkerboard instability, pronounced edge preserving smoothing characteristics to favour the 0–1 convergence of the mass distribution, and computational efficiency due to its single pass and non‐iterative nature. Thus, we show that the application of the bilateral filtering brings more desirable effects of checkerboard‐free, mesh independence, crisp boundary, computational efficiency and conceptual simplicity. The proposed bilateral technique has a close relationship with the conventional domain filtering and range filtering. The proposed method is implemented in the framework of a power‐law approach based on the optimality criteria and illustrated with 2D examples of minimum compliance design that has been extensively studied in the recent literature of topology optimization and its efficiency and accuracy are highlighted. Copyright © 2005 John Wiley & Sons, Ltd.
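The domain/range weighting idea behind bilateral filtering can be sketched in one dimension. This is an illustrative toy (not the paper's topology-optimization implementation): spatial closeness and value similarity jointly set the weights, so a checkerboard-like oscillation is smoothed while a crisp 0-1 edge survives. All parameter values are assumptions chosen for the demo.

```python
import numpy as np

def bilateral_filter_1d(x, sigma_d=2.0, sigma_r=0.25, radius=5):
    """Single-pass bilateral filter: each output is a weighted average
    whose weights combine spatial closeness (domain filtering) and
    value similarity (range filtering)."""
    n = len(x)
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        j = np.arange(lo, hi)
        w = np.exp(-((j - i) ** 2) / (2 * sigma_d ** 2)
                   - ((x[j] - x[i]) ** 2) / (2 * sigma_r ** 2))
        out[i] = np.sum(w * x[j]) / np.sum(w)
    return out

# Checkerboard-like oscillation superimposed on a crisp 0-1 step.
x = np.where(np.arange(40) < 20, 0.0, 1.0)
x_noisy = x + 0.1 * (-1.0) ** np.arange(40)
x_filt = bilateral_filter_1d(x_noisy)
```

Because neighbours across the step differ in value by about 1, their range weight is essentially zero, which is the edge-preserving behaviour the abstract credits for crisp 0-1 boundaries.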

12.
We present an adaptive regularization approach to retrieve vertical state parameter profiles from limb-sounding measurements with high accuracy. This is accomplished by introducing a dedicated regularization functional based on a reasonable assumption of the profile characteristics. The approach results in shape-dependent weighting during least-squares computations and relies on a Cholesky decomposition of a preselected LᵀL matrix. Our method is compared with established regularization functionals such as optimal estimation and Tikhonov with respect to errors and achievable height resolution. The results show an improved height resolution of the retrieved profiles together with a reduction of absolute and relative errors obtained by test-bed simulations.

13.
Based on duality theory, this paper provides a relatively comprehensive a posteriori error analysis for the regularization method applied to elliptic variational inequalities. We consider the frictional contact problem and the obstacle problem in turn; by choosing a different form of bounded operator and functional, we derive the corresponding dual formulation and give $H^1$-norm a posteriori error estimates for the regularization method. Finally, using duality theory from convex analysis, we establish a general framework for residual-type a posteriori error estimates for the obstacle problem. We also choose a particular dual variable and functional form to obtain a residual-type error estimate for this problem, together with its efficiency. A posteriori error estimates of numerical solutions are the foundation for developing effective adaptive algorithms, while a posteriori estimates of model error are very useful in analyzing the influence of data uncertainty in a problem.

14.
We introduce a modified Tikhonov regularization method to include three-dimensional x-ray mammography as a prior in the diffuse optical tomography reconstruction. With simulations we show that the optical image reconstruction resolution and contrast are improved by implementing this x-ray-guided spatial constraint. We suggest an approach to find the optimal regularization parameters. The presented preliminary clinical result indicates the utility of the method.

15.
This contribution is concerned with a coupling approach for nonconforming NURBS patches in the framework of an isogeometric formulation for solids in boundary representation. The boundary representation modeling technique in CAD is the starting point of this approach. We parameterize the solid according to the scaled boundary finite element method and employ NURBS basis functions for the approximation of the solution. Therefore, solid surfaces consist of several sections, which can be regarded as patches and discretized independently. The main objective of this study is to derive an approach for the connection of independent sections in order to allow for local refinement and thus an accurate and efficient discretization of the computational domain. Nonconforming sections are coupled with a mortar approach within a master-slave framework. The coupling of adjacent sections ensures the equality of mutual deformations along the interface in a weak sense and is enforced by constraining the NURBS basis functions on the interface. We apply this approach to nonlinear problems in two dimensions and compare the results with conforming discretizations.

16.
In this article, we study the balancing principle for Tikhonov regularization in Hilbert scales for deterministic and statistical nonlinear inverse problems. While the rates of convergence in the deterministic setting are order optimal, they prove to be order optimal only up to a logarithmic term in the stochastic framework. The two-step approach allows us to consider a data-driven algorithm in a general error model for which an exponential behaviour of the tail of the estimator chosen in the first step is valid. Finally, we compute the overall rate of convergence for a Hammerstein operator equation and for a parameter identification problem. Moreover, we illustrate these rates for the latter application after studying some large-sample properties of the local polynomial estimator in a general stochastic framework.

17.
Multi-agent optimization is a nature-inspired framework that supports the cooperative search for an optimal solution of an optimization problem by a group of algorithmic agents connected through an environment with a predefined information-sharing protocol. In this work, we propose a novel heterogeneous multi-agent optimization (HTMAO) framework. The proposed framework is validated using a set of benchmark problems and a real-world radioactive waste blending problem. The optimal radioactive waste blending problem is formulated as a mixed integer nonlinear program. The total frit used for the vitrification process is minimized subject to thermodynamic properties and process-model constraints. The model simultaneously determines the optimal decisions, which include the combination of waste tanks that form each waste blend and the amount of frit needed for the vitrification of each blend. In developing the HTMAO framework, efficient ant colony optimization algorithms, efficient simulated annealing, an efficient genetic algorithm, and a sequential quadratic programming solver are considered as algorithmic agents. We illustrate this approach through a real-world case study of optimal radioactive waste blending at the Hanford site in southern Washington, where nuclear waste is stored.

18.
Regularization in statistics
This paper is a selective review of the regularization methods scattered in the statistics literature. We introduce a general conceptual approach to regularization and fit most existing methods into it. We have tried to focus on the importance of regularization when dealing with today's high-dimensional objects: data and models. A wide range of examples is discussed, including nonparametric regression, boosting, covariance matrix estimation, principal component estimation, and subsampling.
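The high-dimensional setting the review emphasizes has a canonical example: ridge regression, where the penalty makes the estimator well defined even when there are more predictors than samples and ordinary least squares is not. A minimal sketch on synthetic data (all names and values here are illustrative):

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge estimator: (X^T X + lam I)^{-1} X^T y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(1)
n, p = 30, 100                       # more predictors than samples
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]          # only a few true signals
X = rng.normal(size=(n, p))
y = X @ beta + 0.1 * rng.normal(size=n)

beta_hat = ridge(X, y, lam=1.0)      # X^T X alone is singular here
```

Without the lam * I term the normal equations are singular for p > n; regularization is what makes the problem solvable at all, which is the review's central point.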

19.
Magnetic resonance imaging (MRI) reconstruction models based on total variation (TV) regularization can suffer from problems such as incomplete reconstruction, blurred boundaries, and residual noise. In this article, a non‐convex isotropic TV regularization reconstruction model is proposed to overcome these drawbacks. The Moreau envelope and the minimax‐concave penalty are first used to construct a non‐convex regularization of the L2 norm, which is then applied to the TV regularization to construct the sparse reconstruction model. The proposed model can extract the edge contour of the target effectively, since it avoids the underestimation of large nonzero elements typical of convex regularization. In addition, the global convexity of the cost function can be guaranteed under certain conditions. An efficient algorithm based on the alternating direction method of multipliers is then proposed to minimize the new cost function. Experimental results show that, compared with several typical image reconstruction methods, the proposed model performs better: both the relative error and the peak signal‐to‐noise ratio are significantly improved, and the reconstructed images also show better visual quality. The competitive experimental results indicate that the proposed approach is not limited to MRI reconstruction but is general enough to be used in other fields with natural images.
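The "avoids underestimating large nonzero elements" property has a simple scalar illustration. The proximal operator of the minimax-concave penalty is the firm-thresholding rule, a standard closed form (this sketch is not the authors' full ADMM reconstruction): unlike soft thresholding, it passes large coefficients through unshrunk.

```python
import numpy as np

def firm_threshold(x, lam, gamma):
    """Firm thresholding, the proximal map of the minimax-concave
    penalty (requires gamma > 1). Small entries go to zero, mid-range
    entries are shrunk linearly, and large entries are left untouched,
    avoiding the bias of soft thresholding."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    mid = (np.abs(x) > lam) & (np.abs(x) <= gamma * lam)
    big = np.abs(x) > gamma * lam
    out[mid] = np.sign(x[mid]) * gamma * (np.abs(x[mid]) - lam) / (gamma - 1)
    out[big] = x[big]
    return out

vals = np.array([-3.0, -1.5, 0.5, 1.2, 4.0])
# Small 0.5 is zeroed; -1.5 and 1.2 shrink; -3.0 and 4.0 pass unchanged.
print(firm_threshold(vals, lam=1.0, gamma=2.0))
```

Soft thresholding would subtract lam from every surviving coefficient, so a true value of 4.0 would be reported as 3.0; the non-convex penalty removes exactly this bias on large elements.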

20.
Many methods have been proposed in the literature to assess the probability of a sum-of-products. This problem has been shown to be computationally hard (namely, #P-hard); therefore, algorithms can be compared only from a practical point of view. In this article, we first propose an efficient implementation of the pivotal decomposition method. This kind of algorithm is widely used in the artificial intelligence framework; it is unfortunately almost never considered in the reliability engineering framework, except as a pedagogical tool. We report experimental results showing that this method is in general much more efficient than classical methods that rewrite the sum-of-products under study into an equivalent sum of disjoint products. We then derive from our method a factorization algorithm to be used as a preprocessing step for binary decision diagrams. We show by means of experimental results that this latter approach outperforms the former.
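The pivotal (Shannon) decomposition at the heart of the first algorithm is easy to sketch for monotone sums-of-products: Pr[f] = p(v) Pr[f|v=1] + (1 - p(v)) Pr[f|v=0]. This is a simplified illustration without the authors' optimizations; products are sets of independent positive variables, and the representation is an assumption of the sketch.

```python
def sop_probability(products, prob):
    """Probability that a monotone sum-of-products is true, computed by
    pivotal (Shannon) decomposition on one variable at a time.
    `products` is an iterable of variable sets; `prob` maps each
    variable to its probability of being true (variables independent)."""
    products = [frozenset(t) for t in products]
    if not products:
        return 0.0                 # empty sum: always false
    if frozenset() in products:
        return 1.0                 # empty product: always true
    v = next(iter(products[0]))    # pick a pivot variable
    on = [t - {v} for t in products]              # f with v := 1
    off = [t for t in products if v not in t]     # f with v := 0
    return (prob[v] * sop_probability(on, prob)
            + (1 - prob[v]) * sop_probability(off, prob))

# f = ab + ac with Pr[a] = Pr[b] = Pr[c] = 0.5:
# Pr = 0.5 * Pr[b + c] = 0.5 * 0.75
p = {'a': 0.5, 'b': 0.5, 'c': 0.5}
print(sop_probability([{'a', 'b'}, {'a', 'c'}], p))  # → 0.375
```

Naively multiplying out Pr[ab] + Pr[ac] would double-count the case where all three variables are true; the decomposition handles the overlap exactly, which is why it competes with sum-of-disjoint-products rewriting.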


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号