Full-text access type
Paid full text | 107 articles |
Free | 0 articles |
Subject classification
Chemical industry | 1 article |
Machinery and instrumentation | 1 article |
Building science | 2 articles |
Light industry | 2 articles |
Radio electronics | 1 article |
General industrial technology | 25 articles |
Metallurgical industry | 4 articles |
Automation technology | 71 articles |
Publication year
2020 | 1 article |
2019 | 1 article |
2018 | 3 articles |
2017 | 2 articles |
2016 | 6 articles |
2015 | 3 articles |
2014 | 1 article |
2013 | 2 articles |
2012 | 2 articles |
2010 | 5 articles |
2009 | 3 articles |
2008 | 2 articles |
2007 | 4 articles |
2006 | 7 articles |
2005 | 3 articles |
2004 | 12 articles |
2003 | 3 articles |
2002 | 3 articles |
2001 | 2 articles |
2000 | 1 article |
1999 | 2 articles |
1998 | 5 articles |
1997 | 5 articles |
1996 | 2 articles |
1995 | 2 articles |
1994 | 3 articles |
1993 | 2 articles |
1992 | 2 articles |
1991 | 3 articles |
1990 | 1 article |
1989 | 5 articles |
1988 | 2 articles |
1987 | 1 article |
1985 | 1 article |
1981 | 2 articles |
1980 | 1 article |
1979 | 1 article |
1972 | 1 article |
Sort order: 107 results found (search took 15 ms)
1.
2.
J. Lee, R. T. Haftka, O. H. Griffin Jr., L. T. Watson, M. D. Sensmeier 《Structural and Multidisciplinary Optimization》1994, 8(2-3): 93-100
The present study proposes a detection technique for delaminations in a laminated beam. The proposed technique optimizes the spatial distribution of harmonic excitation so as to magnify the difference in response between the delaminated and intact beams. The technique is evaluated by numerical simulation of two-layered aluminum beams. Effects of measurement and geometric noise are included in the analysis. A finite element model for a delaminated composite, based on a layer-wise laminated plate theory, is used in conjunction with a step function to simulate delaminations.
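The core idea above, choosing the excitation distribution that maximizes the intact-vs-damaged response difference, can be sketched on a toy model. The spring-mass chain below is a hypothetical stand-in for the paper's finite element beam model (a delamination is crudely mimicked as a local stiffness loss); the optimal unit-norm force pattern is then the leading right singular vector of the difference of the two frequency-response matrices.

```python
import numpy as np

def chain_matrices(n, k):
    """Mass/stiffness matrices for an n-DOF fixed-fixed spring-mass chain.
    k holds the n+1 spring stiffnesses; unit masses are assumed."""
    M = np.eye(n)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] = k[i] + k[i + 1]
        if i + 1 < n:
            K[i, i + 1] = K[i + 1, i] = -k[i + 1]
    return M, K

def best_excitation(K_intact, K_delam, M, omega):
    """Unit-norm harmonic force distribution maximizing the response
    difference between the damaged and intact models at frequency omega."""
    H_i = np.linalg.inv(K_intact - omega**2 * M)   # intact FRF matrix
    H_d = np.linalg.inv(K_delam - omega**2 * M)    # damaged FRF matrix
    # leading right singular vector of the FRF difference gives the
    # force pattern that magnifies the response difference the most
    _, s, Vt = np.linalg.svd(H_d - H_i)
    return Vt[0], s[0]

n = 8
k0 = np.full(n + 1, 100.0)
k_dam = k0.copy()
k_dam[4] *= 0.6                      # "delamination": 40% local stiffness loss
M, K_i = chain_matrices(n, k0)
_, K_d = chain_matrices(n, k_dam)
f, gain = best_excitation(K_i, K_d, M, omega=5.0)
```

The singular-value `gain` measures the best achievable response-difference norm per unit force, so it doubles as a crude detectability indicator.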
3.
4.
Sunil Kumar, Richard J. Pippy, Erdem Acar, Nam H. Kim, Raphael T. Haftka 《Structural and Multidisciplinary Optimization》2009, 38(6): 613-626
Probabilistic structural design deals with uncertainties in response (e.g., stresses) and capacity (e.g., failure stresses). The calculation of the structural response is typically expensive (e.g., finite element simulations), while the capacity is usually available from tests. Furthermore, the random variables that influence response and capacity are often disjoint. In previous work we have shown that this disjoint property can be used to reduce the cost of obtaining the probability of failure via Monte Carlo simulations. In this paper we propose to use this property for an approximate probabilistic optimization based on exact capacity and approximate response distributions (ECARD). In ECARD, the change in response distribution is approximated as the structure is redesigned, while the capacity distribution is kept exact, thus significantly reducing the number of expensive response simulations. ECARD may be viewed as an extension of SORA (Sequential Optimization and Reliability Assessment), which proceeds with deterministic optimization iterations. In contrast, ECARD has probabilistic optimization iterations, but in each iteration the response distribution is approximated so as not to require additional response calculations. The use of inexpensive probabilistic optimization allows easy incorporation of system reliability constraints and optimal allocation of risk between failure modes. The method is demonstrated using a beam problem and a ten-bar truss problem. The former allocates risk between two different failure modes, while the latter allocates risk between members. It is shown that ECARD provides most of the improvement from risk re-allocation that can be obtained from full probabilistic optimization.
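The disjoint-variable trick behind ECARD can be illustrated with a small Monte Carlo sketch. Everything below is hypothetical: the normal distributions stand in for expensive response simulations and capacity tests, and the deterministic scaling of stored response samples stands in for the paper's approximate response-distribution update during redesign.

```python
import numpy as np

rng = np.random.default_rng(0)

# One expensive offline batch of response samples (stand-in for FE runs)
# and an exact capacity sample from tests; the two use disjoint variables.
resp = rng.normal(loc=100.0, scale=10.0, size=20_000)   # stresses
cap = rng.normal(loc=150.0, scale=15.0, size=20_000)    # failure stresses

def pf_exact_capacity(resp_samples, cap_samples):
    """P(capacity < response), exploiting the disjoint structure:
    average the empirical capacity CDF over the response samples."""
    cap_sorted = np.sort(cap_samples)
    F_C = np.searchsorted(cap_sorted, resp_samples) / cap_sorted.size
    return F_C.mean()

def pf_after_redesign(scale):
    """ECARD-style approximation: when the design changes, shift the
    stored response samples by a cheap deterministic update instead of
    re-running the expensive response simulations; capacity stays exact."""
    return pf_exact_capacity(resp * scale, cap)

pf0 = pf_exact_capacity(resp, cap)
pf_lighter = pf_after_redesign(1.10)   # 10% higher stress (lighter design)
```

The point is that `pf_after_redesign` costs only a resort of existing samples, so a probabilistic optimizer can call it freely inside each iteration.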
5.
This paper introduces an approach for dealing with constraints when using particle swarm optimization. The constrained, single objective optimization problem is converted into an unconstrained, bi-objective optimization problem that is solved using a multi-objective implementation of the particle swarm optimization algorithm. A specialized bi-objective particle swarm optimization algorithm is presented and an engineering example problem is used to illustrate the performance of the algorithm. An additional set of 13 test problems from the literature is used to further validate the performance of the newly proposed algorithm. For the example problems considered here, the proposed algorithm produced promising results, indicating that it is an approach that deserves further consideration. The newly proposed algorithm provides performance similar to that of a tuned penalty function approach, without having to tune any penalty parameters.
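The conversion described above, treating total constraint violation as a second objective, can be sketched as follows. This is not the paper's algorithm: the feasibility-first comparison used here is a simplified lexicographic stand-in for its bi-objective dominance machinery, and the toy problem and all PSO parameters are illustrative assumptions.

```python
import numpy as np

def violation(g_values):
    """Second objective: total constraint violation, zero when feasible
    (convention: g(x) <= 0 means the constraint is satisfied)."""
    return float(np.sum(np.maximum(g_values, 0.0)))

def better(a, b):
    """Compare (objective, violation) pairs: less violation wins,
    ties broken by objective value. No penalty parameters needed."""
    fa, va = a
    fb, vb = b
    return va < vb if va != vb else fa < fb

def pso_constrained(f, g, bounds, n=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n, lo.size))
    v = np.zeros_like(x)
    score = lambda xi: (f(xi), violation(g(xi)))
    pbest, pscore = x.copy(), [score(xi) for xi in x]
    gi = min(range(n), key=lambda i: (pscore[i][1], pscore[i][0]))
    gbest, gscore = pbest[gi].copy(), pscore[gi]
    for _ in range(iters):
        r1, r2 = rng.random((2, n, lo.size))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        for i in range(n):
            s = score(x[i])
            if better(s, pscore[i]):
                pbest[i], pscore[i] = x[i].copy(), s
                if better(s, gscore):
                    gbest, gscore = x[i].copy(), s
    return gbest, gscore

# Toy problem: minimize x0 + x1 subject to x0*x1 >= 1, x in [0, 5]^2
# (known optimum: x0 = x1 = 1, objective 2)
f = lambda x: x[0] + x[1]
g = lambda x: np.array([1.0 - x[0] * x[1]])
xb, (fb, vb) = pso_constrained(f, g, (np.zeros(2), np.full(2, 5.0)))
```

Because feasibility is preferred outright, the swarm is pulled toward the feasible region first and then minimizes the objective along it.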
6.
7.
8.
Schutte JF, Reinbolt JA, Fregly BJ, Haftka RT, George AD 《International journal for numerical methods in engineering》2004, 61(13): 2296-2315
Present day engineering optimization problems often impose large computational demands, resulting in long solution times even on a modern high-end processor. To obtain enhanced computational throughput and global search capability, we detail the coarse-grained parallelization of an increasingly popular global search method, the particle swarm optimization (PSO) algorithm. Parallel PSO performance was evaluated using two categories of optimization problems possessing multiple local minima: large-scale analytical test problems with computationally cheap function evaluations, and medium-scale biomechanical system identification problems with computationally expensive function evaluations. For load-balanced analytical test problems formulated using 128 design variables, speedup was close to ideal and parallel efficiency above 95% for up to 32 nodes on a Beowulf cluster. In contrast, for load-imbalanced biomechanical system identification problems with 12 design variables, speedup plateaued and parallel efficiency decreased almost linearly with increasing number of nodes. The primary factor affecting parallel performance was the synchronization requirement of the parallel algorithm, which dictated that each iteration must wait for completion of the slowest fitness evaluation. When the analytical problems were solved using a fixed number of swarm iterations, a single population of 128 particles produced a better convergence rate than did multiple independent runs performed using sub-populations (8 runs with 16 particles, 4 runs with 32 particles, or 2 runs with 64 particles). These results suggest that (1) parallel PSO exhibits excellent parallel performance under load-balanced conditions, (2) an asynchronous implementation would be valuable for real-life problems subject to load imbalance, and (3) larger population sizes should be considered when multiple processors are available.
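The synchronization requirement the abstract identifies, each iteration waiting for the slowest fitness evaluation, shows up directly in a synchronous parallel PSO loop. The sketch below uses a thread pool purely to illustrate that structure; the paper's coarse-grained implementation ran across Beowulf cluster nodes, and the test function and all parameters here are illustrative assumptions.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def rastrigin(x):
    """Multimodal test function, a stand-in for an expensive evaluation."""
    return float(10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def parallel_pso(f, dim=16, n=64, iters=150, workers=8, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.12, 5.12, (n, dim))
    v = np.zeros_like(x)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map is the synchronous barrier: the iteration cannot
        # proceed until the slowest fitness evaluation returns
        fit = np.array(list(pool.map(f, x)))
        pbest, pfit = x.copy(), fit.copy()
        g = pbest[np.argmin(pfit)].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n, dim))
            v = 0.72 * v + 1.49 * r1 * (pbest - x) + 1.49 * r2 * (g - x)
            x = x + v
            fit = np.array(list(pool.map(f, x)))  # one barrier per iteration
            imp = fit < pfit
            pbest[imp], pfit[imp] = x[imp], fit[imp]
            g = pbest[np.argmin(pfit)].copy()
    return g, pfit.min()

gbest, best = parallel_pso(rastrigin)
```

An asynchronous variant, as the abstract recommends for load-imbalanced problems, would instead update particles as individual evaluations complete, removing the per-iteration barrier.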
9.
10.
A probabilistic sufficiency factor approach is proposed that combines safety factor and probability of failure. The probabilistic sufficiency factor represents a factor of safety relative to a target probability of failure. It provides a measure of safety that can be used more readily than the probability of failure or the safety index by designers to estimate the required weight increase to reach a target safety level. The probabilistic sufficiency factor can be calculated from the results of Monte Carlo simulation with little extra computation. The paper presents the use of the probabilistic sufficiency factor with a design response surface approximation, which fits it as a function of design variables. It is shown that the design response surface approximation for the probabilistic sufficiency factor is more accurate than that for the probability of failure or for the safety index. Unlike the probability of failure or the safety index, the probabilistic sufficiency factor does not suffer from accuracy problems in regions of low probability of failure when calculated by Monte Carlo simulation. The use of the probabilistic sufficiency factor accelerates the convergence of reliability-based design optimization.
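The "little extra computation" claim can be made concrete under one common reading of the probabilistic sufficiency factor: the target-probability quantile of the capacity/response safety-factor samples from the same Monte Carlo run. The distributions and numbers below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Samples from one Monte Carlo run: response (stress) and capacity
# (failure stress); both distributions are purely illustrative.
response = rng.normal(100.0, 10.0, 100_000)
capacity = rng.normal(160.0, 12.0, 100_000)

def sufficiency_factor(capacity, response, p_target):
    """Probabilistic sufficiency factor as the p_target-quantile of the
    safety-factor samples capacity/response. A value below 1 means the
    design misses the target reliability, and its shortfall from 1 gives
    the designer a direct handle on how much the design must change."""
    return float(np.quantile(capacity / response, p_target))

psf = sufficiency_factor(capacity, response, p_target=1e-3)
```

Unlike an estimated probability of failure, which needs on the order of `1/p_target` failure events to be resolved, this quantile stays well estimated even when almost no Monte Carlo samples actually fail, which is the accuracy advantage the abstract highlights.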