Similar Documents
20 similar documents found (search time: 31 ms)
1.
In this paper, we introduce a new framework for generating synthetic vascular trees, based on rigorous model-based mathematical optimization. Our main contribution is the reformulation of finding the optimal global tree geometry into a nonlinear optimization problem (NLP). This rigorous mathematical formulation accommodates efficient solution algorithms such as the interior point method and allows us to easily change boundary conditions and constraints applied to the tree. Moreover, it creates trifurcations in addition to bifurcations. A second contribution is the addition of an optimization stage for the tree topology. Here, we combine constrained constructive optimization (CCO) with a heuristic approach to search among possible tree topologies. We combine the NLP formulation and the topology optimization into a single algorithmic approach. Finally, we attempt the validation of our new model-based optimization framework using a detailed corrosion cast of a human liver, which allows a quantitative comparison of the synthetic tree structure with the tree structure determined experimentally down to the fifth generation. The results show that our new framework is capable of generating asymmetric synthetic trees that match the available physiological corrosion cast data better than trees generated by the standard CCO approach.
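As a rough, hypothetical illustration of posing tree geometry as an NLP (this is not the authors' objective, constraint set, or solver configuration), the sketch below optimizes the location of a single bifurcation point so that a volume-like cost of the three connecting segments is minimized, using SciPy's trust-region constrained solver; the radii and node coordinates are made up.

```python
# Toy single-bifurcation NLP (hypothetical data, not the paper's formulation):
# choose the branch point x that minimizes the summed segment volumes pi*r^2*L.
import numpy as np
from scipy.optimize import minimize

root = np.array([0.0, 0.0])            # inlet node
leaf1 = np.array([1.0, 1.0])           # terminal node 1
leaf2 = np.array([1.0, -1.0])          # terminal node 2
radii = np.array([0.30, 0.20, 0.20])   # assumed fixed segment radii

def tree_volume(x):
    """Sum of pi * r^2 * length over the three segments meeting at x."""
    nodes = [root, leaf1, leaf2]
    lengths = np.array([np.linalg.norm(x - p) for p in nodes])
    return float(np.pi * np.sum(radii**2 * lengths))

res = minimize(tree_volume, x0=np.array([0.5, 0.0]), method="trust-constr")
print("optimal branch point:", res.x, "tree volume:", res.fun)
```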

2.
In this paper we develop an analytical framework we refer to as “Becoming an Engineer” that focuses upon changes occurring over time as students traverse their undergraduate educations in engineering. This analytical framework involves three related dimensions that we track over time: disciplinary knowledge, identification, and navigation. Our analysis illustrates how these three interrelated dimensions, examined as they unfold over time, enable us to understand how students become, or do not become, engineers. This study is based on longitudinal ethnographic data from which we have developed “person‐centered ethnographies” focused on individual students' pathways through engineering. We present comparative analysis, spanning four schools and four years. We also present person‐centered ethnographic case studies that illustrate how our conceptual dimensions interrelate. Our discussion draws some educational implications from our analysis and proposes further lines of research.

3.
Freed AD, Einstein DR, Sacks MS. Acta Mechanica, 2010, 213(1-2): 205-222
In Part I, a novel hypoelastic framework for soft-tissues was presented. One of the hallmarks of this new theory is that the well-known exponential behavior of soft-tissues arises consistently and spontaneously from the integration of a rate based formulation. In Part II, we examine the application of this framework to the problem of biaxial kinematics, which are common in experimental soft-tissue characterization. We confine our attention to an isotropic formulation in order to highlight the distinction between non-linearity and anisotropy. In order to provide a sound foundation for the membrane extension of our earlier hypoelastic framework, the kinematics and kinetics of in-plane biaxial extension are revisited, and some enhancements are provided. Specifically, the conventional stress-to-traction mapping for this boundary value problem is shown to violate the conservation of angular momentum. In response, we provide a corrected mapping. In addition, a novel means for applying loads to in-plane biaxial experiments is proposed. An isotropic, isochoric, hypoelastic, constitutive model is applied to an in-plane biaxial experiment done on glutaraldehyde-treated bovine pericardium. The experiment is comprised of eight protocols that radially probe the biaxial plane. Considering its simplicity (two adjustable parameters), the model does a reasonably good job of describing the non-linear normal responses observed in these experimental data, which are more prevalent than are the anisotropic responses exhibited by this tissue.

4.
A very general and robust approach to solving optimization problems involving probabilistic uncertainty is through the use of Probabilistic Ordinal Optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the probabilistic merits of local design alternatives, rather than on precise quantification of the alternatives. Thus, we simply ask the question: “Is that alternative better or worse than this one?” to some level of statistical confidence we require, not: “HOW MUCH better or worse is that alternative than this one?”. In this paper we illustrate an elementary application of probabilistic ordinal concepts in a 2-D optimization problem. Two uncertain variables contribute to uncertainty in the response function. We use a simple Coordinate Pattern Search non-gradient-based optimizer to step toward the statistical optimum in the design space. We also discuss more sophisticated implementations, and some of the advantages and disadvantages versus other approaches to optimization under uncertainty.
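A minimal sketch of the ordinal idea, assuming a hypothetical noisy objective and a plain coordinate pattern search (not the authors' implementation): a candidate design replaces the incumbent only when repeated paired evaluations say it is better at the requested confidence level.

```python
# Ordinal comparison inside a coordinate pattern search (toy illustration).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def noisy_response(x):
    """Hypothetical uncertain objective: quadratic plus noise from two uncertain variables."""
    return (x[0] - 1.0)**2 + (x[1] + 0.5)**2 + rng.normal(0.0, 0.2, size=2).sum()

def better_with_confidence(x_new, x_old, n=30, alpha=0.05):
    """Ordinal question: is x_new better than x_old at confidence 1 - alpha?"""
    diffs = np.array([noisy_response(x_new) - noisy_response(x_old) for _ in range(n)])
    t, p_two_sided = stats.ttest_1samp(diffs, 0.0)
    return (diffs.mean() < 0.0) and (p_two_sided / 2.0 < alpha)

x, step = np.array([0.0, 0.0]), 0.5
for _ in range(40):                      # coordinate pattern search
    moved = False
    for d in (np.array([1.0, 0.0]), np.array([-1.0, 0.0]),
              np.array([0.0, 1.0]), np.array([0.0, -1.0])):
        cand = x + step * d
        if better_with_confidence(cand, x):
            x, moved = cand, True
            break
    if not moved:
        step *= 0.5                      # contract the pattern if no move wins
print("estimated optimum near:", x)
```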

5.
Engineering, 2020, 6(3): 310-345
Recent progress in deep learning is essentially based on a “big data for small tasks” paradigm, under which massive amounts of data are used to train a classifier for a single narrow task. In this paper, we call for a shift that flips this paradigm upside down. Specifically, we propose a “small data for big tasks” paradigm, wherein a single artificial intelligence (AI) system is challenged to develop “common sense,” enabling it to solve a wide range of tasks with little training data. We illustrate the potential power of this new paradigm by reviewing models of common sense that synthesize recent breakthroughs in both machine and human vision. We identify functionality, physics, intent, causality, and utility (FPICU) as the five core domains of cognitive AI with humanlike common sense. When taken as a unified concept, FPICU is concerned with the questions of “why” and “how,” beyond the dominant “what” and “where” framework for understanding vision. They are invisible in terms of pixels but nevertheless drive the creation, maintenance, and development of visual scenes. We therefore coin them the “dark matter” of vision. Just as our universe cannot be understood by merely studying observable matter, we argue that vision cannot be understood without studying FPICU. We demonstrate the power of this perspective to develop cognitive AI systems with humanlike common sense by showing how to observe and apply FPICU with little training data to solve a wide range of challenging tasks, including tool use, planning, utility inference, and social learning. In summary, we argue that the next generation of AI must embrace “dark” humanlike common sense for solving novel tasks.

6.
We propose a unification framework for three-dimensional shape reconstruction using physically based models. A variety of 3D shape reconstruction techniques have been developed in the past two decades, such as shape from stereopsis, from shading, from texture gradient, and from structured lighting. However, the lack of a general theory that unifies these shape reconstruction techniques into one framework hinders the effort of a synergistic image interpretation scheme using multiple sensors/information sources. Most shape-from-X techniques use an “observable” (e.g., the stereo disparity, intensity, or texture gradient) and a model, which is based on specific domain knowledge (e.g., the triangulation principle, reflectance function, or texture distortion equation) to predict the observable in 3D shape reconstruction. We show that all these “observable–prediction-model” types of techniques can be incorporated into our framework of energy constraint on a flexible, deformable image frame. In our algorithm, if the observable does not conform to the predictions obtained using the corresponding model, a large “error” potential results. The error potential gradient forces the flexible image frame to deform in space. The deformation brings the flexible image frame to “wrap” onto the surface of the imaged 3D object. Surface reconstruction is thus achieved through a “package wrapping” or a “shape deformation” process by minimizing the discrepancy in the observable and the model prediction. The dynamics of such a wrapping process are governed by the least action principle, which is physically correct. A physically based model is essential in this general shape reconstruction framework because of its capability to recover the desired 3D shape, to provide an animation sequence of the reconstruction, and to include the regularization principle into the theory of surface reconstruction.
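The "error potential drives deformation" idea can be caricatured in one dimension. The sketch below uses invented data, a single scalar observable, and plain gradient descent rather than the full least-action dynamics: a flexible profile is pulled toward a noisy observation while a smoothness term keeps it coherent.

```python
# 1-D caricature: gradient descent on data-fit energy plus smoothness energy.
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0.0, 1.0, 100)
observed = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)  # noisy "observable"

z = np.zeros_like(x)                  # flexible frame, initially flat
alpha, step = 2.0, 0.1
for _ in range(1000):
    error_grad = z - observed                        # gradient of 0.5*||z - observed||^2
    smooth_grad = -np.gradient(np.gradient(z))       # gradient of the smoothness energy (-z'')
    z -= step * (error_grad + alpha * smooth_grad)   # deform the frame toward the data
print("residual after wrapping:", np.linalg.norm(z - observed))
```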

7.
The significance of human values in everyday life highlights the integral role of this concept in any design that aims to improve the quality of human life. By emphasizing the need for a comprehensive value framework for design, the present study explores a new value framework to be used as a common ground in design. For this purpose, we empirically investigate how different people group human values. By distributing the link to our Human Values Survey worldwide via the internet, we reached a variety of participants with different cultural backgrounds, and hierarchical cluster analysis was used to analyze the data. As a result, 568 complete answers were collected, from which nine value groups were derived: “carefulness”, “justice”, “ecology”, “respect for others”, “meaningfulness”, “status”, “pleasure”, “respect for oneself” and “personal development”. After clustering our data, we propose a value framework with four themes, nine value groups, 42 key values, and 135 extra values. This framework, raising designers’ awareness and widening their view of human values, provides the opportunity to address a diverse range of human values in design.
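For readers unfamiliar with the method, the sketch below shows the kind of hierarchical cluster analysis involved, applied to placeholder ratings and a handful of example value labels rather than the survey's 568 actual responses.

```python
# Hierarchical clustering of value labels from hypothetical participant ratings.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
values = ["justice", "ecology", "status", "pleasure", "personal development"]
# Rows: participants, columns: values; entries: strength of association (made up).
ratings = rng.random((50, len(values)))

# Cluster the values (columns), using correlation distance between their rating profiles.
Z = linkage(ratings.T, method="average", metric="correlation")
groups = fcluster(Z, t=2, criterion="maxclust")   # ask for two groups as a demo
for v, g in zip(values, groups):
    print(f"{v}: group {g}")
```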

8.
Rocks can be anisotropic due to a variety of reasons. When estimating rock velocities from seismic data, failure to introduce anisotropy into earth models could generate distortions in the final images that can have enormous economic impact. To estimate anisotropic earth velocities by tomographic methods, it is necessary to trace rays or to solve the wave equation in models where anisotropy has been properly considered. Thus, in this work we present a 3-D generalized ellipsoidal travel time formulation that allows us to trace rays in an anisotropic medium. We propose to trace rays in anisotropic media by solving a set of nonlinear optimization problems, where the group velocities for P and S wave propagation modes are 3-D ellipsoidal approximations that have been recently obtained. Moreover, we prove that this 3-D ellipsoidal anisotropic ray tracing formulation is a convex nonlinear optimization problem, and therefore any solution of the problem is a global minimum. Each optimization problem is solved by the global spectral gradient method, which requires first order information and has low computation and low storage requirements. Our approach for tracing rays in anisotropic media is a generalization in the sense that it handles tilted axes of symmetry and, close to the axis of symmetry, it is an accurate formulation for 2-D transversely isotropic media and 3-D orthorhombic media, depending on the input parameters. Moreover, this formulation gives the exact ray trajectories in 2-D and 3-D homogeneous isotropic media. The simplicity of the formulation and the low computational cost of the optimization method allow us to present a variety of numerical results that illustrate the behavior and computational advantages of the approach, and the difficulties when working in anisotropic media. Partially supported by Fonacit project UCV-97-003769
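The solver named in the abstract is a first-order, low-storage method. The sketch below shows a Barzilai-Borwein (spectral) gradient iteration on a generic convex quadratic, which conveys the flavor of such methods; it does not reproduce the ellipsoidal travel-time objective or the globalization strategy of the paper.

```python
# Spectral (Barzilai-Borwein) gradient iteration on a convex quadratic.
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # SPD matrix -> convex quadratic objective
b = np.array([1.0, -1.0])

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(2)
g = grad(x)
alpha = 1.0                               # initial step length
for _ in range(100):
    x_new = x - alpha * g
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = (s @ s) / (s @ y)             # BB1 spectral step length
print("minimizer:", x, "f(x):", f(x), "expected:", np.linalg.solve(A, b))
```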

9.
Automatic design optimization is highly sensitive to problem formulation. The choice of objective function, constraints and design parameters can dramatically affect the computational cost of optimization and the quality of the resulting design. The best formulation varies from one application to another. A design engineer will usually not know the best formulation in advance. To address this problem, we have developed a system that supports interactive formulation, testing and reformulation of design optimization strategies. Our system includes an executable, data-flow language for representing optimization strategies. The language allows an engineer to define multiple stages of optimization, each using different approximations of the objective and constraints or different abstractions of the design space. We have also developed a set of transformations that reformulate strategies represented in our language. The transformations can approximate objective and constraint functions, abstract or reparameterize search spaces, or divide an optimization process into multiple stages. The system is applicable in principle to any design problem that can be expressed in terms of constrained optimization; however, we expect the system to be most useful when the design artifact is governed by algebraic and ordinary differential equations. We have tested the system on problems of racing yacht design and jet engine nozzle design. We report experimental results demonstrating that our reformulation techniques can significantly improve the performance of automatic design optimization. Our research demonstrates the viability of a reformulation methodology that combines symbolic program transformation with numerical experimentation. It is an important first step in a research program aimed at automating the entire strategy formulation process.
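The staged-reformulation idea can be illustrated with two invented objective functions: a cheap smooth surrogate locates a promising region, and the exact objective is then optimized from that starting point. This is only a caricature of one transformation such a system could produce, not its data-flow language.

```python
# Two-stage strategy: optimize a surrogate first, then refine with the exact objective.
import numpy as np
from scipy.optimize import minimize

exact = lambda x: (x[0] - 2.0)**2 + 10.0 * np.sin(x[0])**2 + (x[1] + 1.0)**2
approx = lambda x: (x[0] - 2.0)**2 + (x[1] + 1.0)**2        # smoothed surrogate

stage1 = minimize(approx, x0=np.array([10.0, 10.0]), method="Nelder-Mead")
stage2 = minimize(exact, x0=stage1.x, method="Nelder-Mead")  # warm start from stage 1
print("stage-1 start:", stage1.x, "-> final design:", stage2.x)
```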

10.
In this paper we study incompressible fluids described by constitutive equations from a different perspective than that usually adopted, namely that of expressing kinematical quantities in terms of the stress. Such a representation is the appropriate way to express fluids like the classical Bingham fluid or fluids whose material moduli depend on the pressure. We consider models wherein the symmetric part of the velocity gradient is given by a “power-law” of the stress. This stress power-law model automatically satisfies the constraint of incompressibility without our having to introduce a Lagrange multiplier to enforce the constraint. The model also includes the classical incompressible Navier–Stokes model as a special subclass. We compare the stress power-law model with the classical power-law models and we show that the stress power-law model can, for certain parameter values, exhibit qualitatively different response characteristics than the classical power-law models and—on the other hand—it can be, for certain parameter values, used as a substitute for the classical power-law models. Using a stress power-law model we study several steady flow problems and obtain exact analytical solutions, and we argue that the possibility to obtain an exact analytical solution suggests, among others, that using these models provides an interesting alternative to the classical power-law models for which reasonable exact analytical solutions cannot be obtained. Finally, we discuss the issue of the choice of boundary conditions, and we show that the choice of boundary conditions has, at least for one of the problems that we study, a profound impact on the solvability of the boundary value problem.

11.
We study the TV-L1 image approximation model from primal and dual perspective, based on a proposed equivalent convex formulation. More specifically, we apply a convex TV-L1 based approach to globally solve the discrete constrained optimization problem of image approximation, where the unknown image function $u(x) \in \{f_1, \ldots, f_n\}$ for all $x \in \Omega$. We show that the TV-L1 formulation does provide an exact convex relaxation model to the non-convex optimization problem considered. This result greatly extends recent studies of Chan et al., from the simplest binary constrained case to the general gray-value constrained case, through the proposed rounding scheme. In addition, we construct a fast multiplier-based algorithm based on the proposed primal-dual model, which properly avoids the variability of the associated TV-L1 energy function. Numerical experiments validate the theoretical results and show that the proposed algorithm is reliable and effective.
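For orientation, the sketch below merely evaluates a discrete (anisotropic) TV-L1 energy, i.e., the kind of objective being relaxed; it is not the paper's primal-dual multiplier algorithm, and the image data are random placeholders.

```python
# Evaluate lambda * ||u - f||_1 + TV(u) for a small image (anisotropic, discrete TV).
import numpy as np

def tv_l1_energy(u, f, lam=1.0):
    """Anisotropic total variation plus L1 data fidelity."""
    dx = np.abs(np.diff(u, axis=1)).sum()      # horizontal differences
    dy = np.abs(np.diff(u, axis=0)).sum()      # vertical differences
    fidelity = np.abs(u - f).sum()
    return dx + dy + lam * fidelity

rng = np.random.default_rng(2)
f = rng.random((8, 8))               # noisy input image (hypothetical)
u = np.full_like(f, f.mean())        # a piecewise-constant candidate approximation
print("TV-L1 energy:", tv_l1_energy(u, f))
```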

12.
This paper deals with automated guided vehicles (AGVs) which transport containers between the quay and the stack on automated container terminals. The focus is on the assignment of transportation jobs to AGVs within a terminal control system operating in real time. First, we describe a rather common problem formulation based on due times for the jobs and solve this problem both with a greedy priority rule based heuristic and with an exact algorithm. Subsequently, we present an alternative formulation of the assignment problem, which does not include due times. This formulation is based on a rough analogy to inventory management and is solved using an exact algorithm. The idea behind this alternative formulation is to avoid estimates of driving times, completion times, due times, and tardiness because such estimates are often highly unreliable in practice and do not allow for accurate planning. By means of simulation, we then analyze the different approaches. We show that the inventory-based model leads to better productivity on the terminal than the due-time-based formulation.
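A minimal sketch of a due-time-based greedy priority rule of the kind described (the job data, driving-time estimates, and dispatch rule are invented stand-ins, not the paper's heuristic):

```python
# Greedy due-time assignment of transport jobs to AGVs (toy data).
from dataclasses import dataclass, field

@dataclass
class AGV:
    name: str
    free_at: float = 0.0                      # time the vehicle becomes available
    jobs: list = field(default_factory=list)

jobs = [("J1", 10.0, 4.0), ("J2", 6.0, 3.0), ("J3", 12.0, 5.0), ("J4", 7.0, 2.0)]
#        (job id, due time, estimated driving time)
agvs = [AGV("AGV-1"), AGV("AGV-2")]

for job_id, due, drive in sorted(jobs, key=lambda j: j[1]):   # earliest due time first
    agv = min(agvs, key=lambda a: a.free_at)                  # earliest available vehicle
    start = agv.free_at
    agv.free_at = start + drive
    tardiness = max(0.0, agv.free_at - due)
    agv.jobs.append((job_id, start, tardiness))

for agv in agvs:
    print(agv.name, agv.jobs)
```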

13.
A transient finite strain viscoplastic model is implemented in a gradient‐based topology optimization framework to design impact mitigating structures. The model's kinematics relies on the multiplicative split of the deformation gradient, and the constitutive response is based on isotropic hardening viscoplasticity. To solve the mechanical balance laws, the implicit Newmark‐beta method is used together with a total Lagrangian finite element formulation. The optimization problem is regularized using a partial differential equation filter and solved using the method of moving asymptotes. Sensitivities required to solve the optimization problem are derived using the adjoint method. To demonstrate the capability of the algorithm, several protective systems are designed, in which the absorbed viscoplastic energy is maximized. The numerical examples demonstrate that transient finite strain viscoplastic effects can successfully be combined with topology optimization.
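As a small, self-contained illustration of the implicit Newmark-beta integrator mentioned above, the sketch below time-steps a linear single-degree-of-freedom oscillator with the average-acceleration parameters; the paper itself applies the scheme to a finite element model with finite strain viscoplasticity, which this toy does not attempt.

```python
# Implicit Newmark-beta integration of m*a + c*v + k*u = F for one DOF.
m, c, k = 1.0, 1.0, 10.0                 # mass, damping, stiffness (hypothetical)
beta, gamma = 0.25, 0.5                  # average-acceleration Newmark parameters
dt, n_steps = 0.01, 2000

u, v = 0.0, 0.0
F = 1.0                                  # constant external load
a = (F - c * v - k * u) / m              # initial acceleration from the balance law
for _ in range(n_steps):
    # Standard effective stiffness and effective load of the Newmark-beta scheme.
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    f_eff = (F
             + m * (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
             + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                    + dt * (gamma / (2 * beta) - 1) * a))
    u_new = f_eff / k_eff
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
    v_new = v + dt * ((1 - gamma) * a + gamma * a_new)
    u, v, a = u_new, v_new, a_new
print("displacement after", n_steps * dt, "s:", u, "(static limit F/k =", F / k, ")")
```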

14.
“What being walks sometimes on two feet, sometimes on three, and sometimes on four, and is weakest when it has the most?” —The Sphinx's Riddle Pattern recognition is one of the most important functionalities for intelligent behavior and is displayed by both biological and artificial systems. Pattern recognition systems have four major components: data acquisition and collection, feature extraction and representation, similarity detection and pattern classifier design, and performance evaluation. In addition, pattern recognition systems are successful to the extent that they can continuously adapt and learn from examples; the underlying framework for building such systems is predictive learning. The pattern recognition problem is a special case of the more general problem of statistical regression; it seeks an approximating function that minimizes the probability of misclassification. In this framework, data representation requires the specification of a basis set of approximating functions. Classification requires an inductive principle to design and model the classifier and an optimization or learning procedure for classifier parameter estimation. Pattern recognition also involves categorization: making sense of patterns not previously seen. The sections of this paper deal with the categorization and functional approximation problems; the four components of a pattern recognition system; and trends in predictive learning, feature selection using “natural” bases, and the use of mixtures of experts in classification. © 2000 John Wiley & Sons, Inc. Int J Imaging Syst Technol 11, 101–116, 2000

15.
We review progress in designing and transforming multi-functional yield-stress fluids and give a perspective on the current state of knowledge that supports each step in the design process. We focus mainly on the rheological properties that make yield-stress fluids so useful and the trade-offs which need to be considered when working with these materials. Thinking in terms of “design with” and “design of” yield-stress fluids motivates how we can organize our scientific understanding of this field. “Design with” involves identification of rheological property requirements independent of the chemical formulation, e.g. for 3D direct-write printing which needs to accommodate a wide range of chemistry and material structures. “Design of” includes microstructural considerations: conceptual models relating formulation to properties, quantitative models of formulation-structure-property relations, and chemical transformation strategies for converting effective yield-stress fluids to be more useful solid engineering materials. Future research directions are suggested at the intersection of chemistry, soft-matter physics, and material science in the context of our desire to design useful rheologically-complex functional materials.

16.
A challenge in engineering design is to choose suitable objectives and constraints from many quantities of interest, while ensuring an optimization is both meaningful and computationally tractable. We propose an optimization formulation that can take account of more quantities of interest than existing formulations, without reducing the tractability of the problem. This formulation searches for designs that are optimal with respect to a binary relation within the set of designs that are optimal with respect to another binary relation. We then propose a method of finding such designs in a single optimization by defining an overall ranking function to use in optimizers, reducing the cost required to solve this formulation. In a design under uncertainty problem, our method obtains the most robust design that is not stochastically dominated faster than a multiobjective optimization. In a car suspension design problem, our method obtains designs that are superior, according to a k-optimality condition, to those found by previously suggested multiobjective approaches to this problem. In an airfoil design problem, our method obtains designs closer to the true lift/drag Pareto front using the same computational budget as a multiobjective optimization.
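One way to read the "overall ranking function" idea is as a composite sort key: a primary binary relation orders designs first, and a secondary relation breaks ties among its optima. The toy below (random designs, hypothetical objectives and robustness score) ranks by Pareto non-domination first and by a robustness value second; it is not the authors' formulation.

```python
# Composite ranking key: non-dominated designs first, then lower robustness score.
import numpy as np

rng = np.random.default_rng(3)
designs = rng.random((20, 3))           # columns: objective 1, objective 2, robustness score

def dominated(i, pts):
    """True if design i is Pareto-dominated on the first two columns (minimization)."""
    better_eq = np.all(pts[:, :2] <= pts[i, :2], axis=1)
    strictly = np.any(pts[:, :2] < pts[i, :2], axis=1)
    return np.any(better_eq & strictly)

def overall_rank(i):
    # Primary relation: membership in the non-dominated set; secondary: robustness score.
    return (1 if dominated(i, designs) else 0, designs[i, 2])

best = min(range(len(designs)), key=overall_rank)
print("selected design:", best, designs[best])
```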

17.
We present a numerical technique to model the buckling of a rolled thin sheet. It consists in coupling, within the Arlequin framework, a three-dimensional model based on 8-node tri-linear hexahedra, used in the part of the sheet upstream of the roll bite, with a well-suited finite element shell model in the part of the sheet downstream of the roll bite, in order to cope with buckling phenomena. The resulting nonlinear problem is solved by the Asymptotic Numerical Method (ANM), which is efficient at capturing buckling instabilities. The originality of the paper lies, first, in an Arlequin procedure with moving meshes and, second, in an efficient application to a thin-sheet rolling process. The suggested algorithm is applied to very thin sheet rolling scenarios involving “edge-waves” and “center-waves” defects. The obtained results show the effectiveness of our global approach.

18.
We present a strategy for the recovery of a sparse solution of a common problem in acoustic engineering, which is the reconstruction of sound source levels and locations from microphone array measurements. The considered task bears similarities to the basis pursuit formalism but also relies on additional model assumptions that are challenging from a mathematical point of view. Our approach reformulates the original task as a convex optimisation model. The sought solution shall be a matrix with a certain desired structure. We enforce this structure through additional constraints. By combining popular splitting algorithms and matrix differential theory in a novel framework we obtain a numerically efficient strategy. Besides a thorough theoretical consideration, we also provide an experimental setup that certifies the usability of our strategy. Finally, we also address practical issues, such as the handling of inaccuracies in the measurement and corruption of the given data. We provide a post processing step that is capable of yielding an almost perfect solution in such circumstances.
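To fix ideas, the sketch below runs a standard proximal splitting iteration (ISTA) on a plain basis-pursuit-denoising style problem; the paper's model additionally enforces structural constraints on a matrix-valued unknown, which this toy omits.

```python
# ISTA for min 0.5*||Ax - b||^2 + lam*||x||_1 on synthetic sparse data.
import numpy as np

rng = np.random.default_rng(4)
m, n, k = 30, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true

lam = 0.01
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth part's gradient
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - b)
    z = x - grad / L                      # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft-thresholding prox
print("recovery error:", np.linalg.norm(x - x_true))
```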

19.
An optimization procedure is developed to address the problem of minimizing the drive system weight of high-speed prop-rotor aircraft, which are required to demonstrate fixed-wing-like efficiencies in high-speed forward flight while maintaining an acceptable hover figure of merit similar to that of helicopters. The optimization is performed using the method of feasible directions. A hybrid approximate analysis procedure is also used to reduce the computational effort of using exact analysis for every function evaluation necessary within the optimizer. The results, compared to a reference rotor, show significant weight reductions. The aerodynamic performance of the optimized rotor, analyzed at “off-design” points to judge the strength of the optimization problem formulation and the validity of the resulting design, shows considerable improvements. The results are compared to the reference values, and a significant reduction in weight is achieved.

20.
Due to natural or man-made disasters, the evacuation of a whole region or city may become necessary. Apart from private traffic, the emergency services also need to consider transit-dependent evacuees which have to be transported from collection points to secure shelters outside the endangered region with the help of a bus fleet. We consider a simplified version of the arising bus evacuation problem (BEP), which is a vehicle scheduling problem that aims at minimizing the network clearance time, i.e., the time needed until the last person is brought to safety. In this paper, we consider an adjustable robust formulation without recourse for the BEP, the robust bus evacuation problem (RBEP), in which the exact numbers of evacuees are not known in advance. Instead, a set of likely scenarios is known. After some reckoning time, this uncertainty is eliminated and planners are given exact figures. The problem is to decide, for each bus, whether it is better to send it right away—using uncertain information on the evacuees—or to wait until the scenario becomes known. We present a mixed-integer linear programming formulation for the RBEP and discuss solution approaches; in particular, we present a tabu search framework for finding heuristic solutions of acceptable quality within short computation time. In computational experiments using both randomly generated instances and the real-world scenario of evacuating the city of Kaiserslautern, Germany, we compare our solution approaches.
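A compact tabu search skeleton in the spirit of the heuristic described, on invented data: each bus is either dispatched immediately or held until the scenario is revealed, and one-flip neighborhood moves are explored under a short tabu tenure. It is not the paper's algorithm, only a sketch of the search pattern.

```python
# Tabu search over send-now / wait decisions, minimizing a scenario-averaged cost surrogate.
import numpy as np

rng = np.random.default_rng(5)
n_buses, n_scenarios = 8, 4
# cost[b, s, d]: surrogate cost of bus b in scenario s under decision d (0 = wait, 1 = send now)
cost = rng.random((n_buses, n_scenarios, 2))

def objective(decision):
    per_scenario = cost[np.arange(n_buses), :, decision].sum(axis=0)
    return per_scenario.mean()           # average clearance-time surrogate over scenarios

current = rng.integers(0, 2, n_buses)
best, best_val = current.copy(), objective(current)
tabu, tenure = {}, 5
for it in range(200):
    moves = []
    for b in range(n_buses):             # neighborhood: flip one bus decision
        if tabu.get(b, -1) >= it:
            continue
        cand = current.copy()
        cand[b] ^= 1
        moves.append((objective(cand), b, cand))
    if not moves:
        break
    val, b, cand = min(moves)            # best admissible neighbor
    current = cand
    tabu[b] = it + tenure                # forbid flipping this bus again for a while
    if val < best_val:
        best, best_val = cand.copy(), val
print("best decision vector:", best, "objective:", best_val)
```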
