Subscription full text: 14 articles; free: 0 articles
By subject: Architectural Science (1), Automation Technology (13)
By year: 2018 (2), 2017 (2), 2016 (4), 2015 (1), 2014 (1), 2013 (3), 2012 (1)
Found 14 results (search time: 15 ms)
1.
2.
A new metaheuristic optimization algorithm is developed to solve truss optimization problems. The new algorithm, called cuckoo search (CS), is examined by solving five truss design optimization problems with increasing numbers of design variables and constraint complexity. The performance of the CS algorithm is further compared with various classical and advanced algorithms, selected from a wide range of state-of-the-art algorithms in the area. The results show that the final solutions obtained by CS are superior to the best solutions obtained by the other algorithms. Finally, the unique search features used in CS and the implications for future research are discussed in detail. Copyright © 2012 John Wiley & Sons, Ltd.
3.
This study proposes a novel chaotic cuckoo search (CCS) optimization method by incorporating chaotic theory into the cuckoo search (CS) algorithm. In CCS, chaotic dynamics are combined with CS with the intention of further enhancing its performance, and an elitism scheme is incorporated to preserve the best cuckoos. In the CCS method, 12 chaotic maps are applied to tune the step size used in the original CS method. Twenty-seven benchmark functions and an engineering case are used to investigate the efficiency of CCS. The results clearly demonstrate that the performance of CCS with a suitable chaotic map is comparable to, and often superior to, that of CS and other metaheuristic algorithms.
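The abstract does not reproduce the update equations, so the following is only a minimal sketch of the core idea, assuming a logistic chaotic map is used to modulate the step size of the cuckoo moves; the function names, the choice of map, and the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def logistic_map(x, mu=4.0):
    """One iterate of the logistic map, a common choice among chaotic maps."""
    return mu * x * (1.0 - x)

def chaotic_step_sizes(n_iters, x0=0.7, alpha_max=1.0):
    """Sketch of the CCS idea: instead of a fixed step size alpha,
    each iteration draws alpha from a chaotic sequence (here logistic)."""
    alphas = np.empty(n_iters)
    x = x0
    for t in range(n_iters):
        x = logistic_map(x)
        alphas[t] = alpha_max * x   # map the chaotic value in (0, 1) to a step size
    return alphas

# Example: chaotic step sizes replacing the constant alpha of standard CS
print(chaotic_step_sizes(5))
```

Swapping the logistic map for any of the other chaotic maps only changes `logistic_map`; the rest of the CS loop is untouched.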
4.
Many design problems in engineering are typically multiobjective, under complex nonlinear constraints. The algorithms needed to solve multiobjective problems can be significantly different from the methods for single-objective optimization, and computational effort and the number of function evaluations often increase significantly. Metaheuristic algorithms are starting to show their advantages in dealing with multiobjective optimization. In this paper, we formulate a new cuckoo search for multiobjective optimization. We validate it against a set of multiobjective test functions and then apply it to structural design problems such as beam design and disc brake design. In addition, we analyze the main characteristics of the algorithm and their implications.
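The abstract does not spell out the paper's multiobjective machinery; as a rough illustration of the bookkeeping any such method needs, here is a hedged sketch of a Pareto-dominance test and a non-dominated filter (minimization assumed), with made-up example objective vectors.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(front):
    """Filter a list of objective vectors down to the non-dominated set."""
    return [p for p in front
            if not any(dominates(q, p) for q in front if q is not p)]

# Toy example with two objectives (e.g., beam weight vs. deflection)
points = [(2.0, 5.0), (3.0, 3.0), (4.0, 4.0), (1.0, 6.0)]
print(non_dominated(points))   # (4.0, 4.0) is dominated by (3.0, 3.0)
```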
5.
The performance of an optimization tool is largely determined by the efficiency of the search algorithm used in the process. The fundamental nature of a search algorithm essentially determines its search efficiency and thus the types of problems it can solve. Modern metaheuristic algorithms are generally more suitable for global optimization. This paper carries out extensive global optimization of unconstrained and constrained problems using the recently developed eagle strategy of Yang and Deb in combination with efficient differential evolution. After a detailed formulation and explanation of its implementation, the proposed algorithm is first verified on twenty unconstrained optimization benchmarks. For validation on constrained problems, the algorithm is subsequently applied to thirteen classical benchmarks and three benchmark engineering problems reported in the engineering literature. The performance of the proposed algorithm is further compared with various state-of-the-art algorithms in the area. The optimal solutions obtained in this study are better than the best solutions obtained by the existing methods. The unique search features used in the proposed algorithm are analyzed, and their implications for future research are discussed in detail.
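The two-stage structure described here (coarse global exploration followed by intensive local search with differential evolution) can be sketched roughly as the loop below. The stage lengths, the Lévy-flight exploration, and the plain DE/rand/1/bin inner stage are assumptions made for illustration, not the authors' exact formulation.

```python
import math
import numpy as np

def levy_flight(dim, beta=1.5):
    """One Lévy-distributed step per dimension (Mantegna's algorithm)."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def de_local_search(f, pop, n_gens=50, F=0.7, CR=0.9):
    """Plain DE/rand/1/bin, standing in for the intensive local-search stage."""
    n, dim = pop.shape
    fit = np.array([f(x) for x in pop])
    for _ in range(n_gens):
        for i in range(n):
            idx = np.random.choice([j for j in range(n) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            cross = np.random.rand(dim) < CR
            trial = np.where(cross, a + F * (b - c), pop[i])
            f_trial = f(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial
    return pop[np.argmin(fit)], float(fit.min())

def eagle_strategy(f, dim, bounds, n_outer=10, pop_size=20):
    """Sketch of the two-stage eagle strategy: Lévy-flight exploration, then DE refinement."""
    lo, hi = bounds
    best_x, best_f = None, float("inf")
    for _ in range(n_outer):
        # Stage 1: scatter a fresh population with Lévy-flight jumps (global exploration)
        base = np.random.uniform(lo, hi, (pop_size, dim))
        steps = np.array([levy_flight(dim) for _ in range(pop_size)])
        pop = np.clip(base + steps, lo, hi)
        # Stage 2: refine that population intensively with differential evolution
        x, fx = de_local_search(f, pop)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Toy usage on the sphere function
print(eagle_strategy(lambda x: float(np.sum(x ** 2)), dim=5, bounds=(-5.0, 5.0)))
```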
6.

A variant of particle swarm optimization (PSO) is presented to solve the infinite impulse response (IIR) system identification problem. Called improved PSO (IPSO), it makes significant enhancements over PSO. First, the population initialization step uses the golden ratio to segment the solution space so as to obtain high-quality initial solutions. Second, each particle uses a different inertia weight in the velocity-update step, which helps preserve the balance between global and local search. Third, IPSO uses a normal distribution to disturb the global best particle, which enhances its ability to escape from local optima. These three operations not only yield high-quality solutions, strong global search capacity, and a fast convergence rate, but also avoid low diversity, excessive local search, and premature stagnation. These properties make IPSO well suited to IIR system identification problems. IPSO is applied to 12 examples, and the experimental results demonstrate that it obtains the best objective function values in all cases. Compared with four other PSO approaches, IPSO shows stronger convergence and higher stability, indicating superior search accuracy and identification efficiency.
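The three modifications listed above can be illustrated with a hedged sketch: golden-ratio segmentation of the search interval during initialization, a per-particle inertia weight, and a Gaussian perturbation of the global best. The specific formulas below (how the golden ratio splits the interval, how the per-particle weights are spread, the perturbation scale) are illustrative guesses, not the paper's exact equations.

```python
import numpy as np

PHI = (np.sqrt(5.0) - 1.0) / 2.0   # golden-ratio conjugate, about 0.618

def golden_ratio_init(n_particles, dim, lo, hi):
    """Initialization sketch: split [lo, hi] at the golden-section point and
    sample half of the swarm from each segment (assumed reading of the
    golden-ratio segmentation idea)."""
    split = lo + PHI * (hi - lo)
    half = n_particles // 2
    left = np.random.uniform(lo, split, (half, dim))
    right = np.random.uniform(split, hi, (n_particles - half, dim))
    return np.vstack([left, right])

def per_particle_inertia(n_particles, w_min=0.4, w_max=0.9):
    """Each particle gets its own inertia weight, spreading the swarm between
    global search (large w) and local search (small w)."""
    return np.linspace(w_max, w_min, n_particles)

def perturb_gbest(gbest, sigma=0.1):
    """Disturb the global best with Gaussian noise to help escape local optima."""
    return gbest + np.random.normal(0.0, sigma, size=gbest.shape)

# Toy usage: initialize a swarm and exercise the three operators
swarm = golden_ratio_init(10, 5, -1.0, 1.0)
weights = per_particle_inertia(10)
print(perturb_gbest(swarm[0]), weights)
```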
7.
Cuckoo search (CS) is a relatively new algorithm, developed by Yang and Deb in 2009, which has been found to be efficient in solving global optimization problems. In this paper, we review the fundamental ideas of cuckoo search, its latest developments, and its applications. We analyze the algorithm to gain insight into its search mechanisms and to find out why it is efficient. We also discuss the essence of such algorithms and their link to self-organizing systems, and finally we propose some important topics for further research.
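For readers unfamiliar with the mechanics being reviewed, a minimal sketch of standard cuckoo search follows: new solutions are generated by Lévy-flight-style moves around existing nests, and a fraction pa of the worst nests is abandoned and rebuilt each generation. The step-size constant and the heavy-tailed Cauchy draw standing in for the Lévy distribution are simplifying assumptions kept deliberately short.

```python
import numpy as np

def cuckoo_search(f, dim, lo, hi, n_nests=15, pa=0.25, alpha=0.01, n_iters=200):
    """Minimal standard cuckoo search (minimization).

    - A heavy-tailed Cauchy draw stands in for the Lévy flight to keep the sketch short.
    - pa is the fraction of worst nests abandoned and rebuilt each generation.
    """
    nests = np.random.uniform(lo, hi, (n_nests, dim))
    fit = np.array([f(x) for x in nests])

    for _ in range(n_iters):
        # Generate a new candidate by a heavy-tailed jump from a random nest
        i = np.random.randint(n_nests)
        step = alpha * np.random.standard_cauchy(dim) * (hi - lo)
        new = np.clip(nests[i] + step, lo, hi)
        j = np.random.randint(n_nests)          # compare against a randomly chosen nest
        if f(new) < fit[j]:
            nests[j], fit[j] = new, f(new)

        # Abandon a fraction pa of the worst nests and rebuild them at random
        n_abandon = int(pa * n_nests)
        worst = np.argsort(fit)[-n_abandon:]
        nests[worst] = np.random.uniform(lo, hi, (n_abandon, dim))
        fit[worst] = [f(x) for x in nests[worst]]

    best = np.argmin(fit)
    return nests[best], fit[best]

# Toy usage on the sphere function
print(cuckoo_search(lambda x: float(np.sum(x ** 2)), dim=5, lo=-5.0, hi=5.0))
```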
8.
9.
Feng Yanhong, Wang Gai-Ge, Deb Suash, Lu Mei, Zhao Xiang-Jun. Neural Computing & Applications, 2017, 28(7): 1619-1634
This paper presents a novel binary monarch butterfly optimization (BMBO) method, intended for addressing the 0–1 knapsack problem (0–1 KP). Two...
10.
The performance of any algorithm will largely depend on the setting of its algorithm-dependent parameters. The optimal setting should allow the algorithm to achieve the best performance for solving a range of optimization problems. However, such parameter tuning itself is a tough optimization problem. In this paper, we present a framework for self-tuning algorithms so that an algorithm to be tuned can be used to tune the algorithm itself. Using the firefly algorithm as an example, we show that this framework works well. It is also found that different parameters may have different sensitivities and thus require different degrees of tuning. Parameters with high sensitivities require fine-tuning to achieve optimality.
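One way to read the framework (an assumption on my part, not the paper's exact formulation) is that each candidate solution is augmented with its own copy of the algorithm parameters, so that selection on the objective simultaneously selects good parameter settings. The sketch below illustrates that augmentation with a bare-bones attraction-plus-noise optimizer standing in for the firefly algorithm; the parameter-adaptation rule is likewise illustrative.

```python
import numpy as np

def self_tuning_optimize(f, dim, lo, hi, n_agents=20, n_iters=300):
    """Self-tuning sketch: every agent carries its own step-size parameter,
    and good parameter values ride along with good solutions through selection."""
    x = np.random.uniform(lo, hi, (n_agents, dim))       # candidate solutions
    alpha = np.random.uniform(0.01, 1.0, n_agents)       # per-agent step size (the tuned parameter)
    fit = np.array([f(xi) for xi in x])

    for _ in range(n_iters):
        best = np.argmin(fit)
        for i in range(n_agents):
            # Move toward the current best using the agent's OWN step size
            trial = x[i] + alpha[i] * (x[best] - x[i]) \
                    + alpha[i] * np.random.normal(0.0, 0.1, dim)
            trial = np.clip(trial, lo, hi)
            f_trial = f(trial)
            if f_trial < fit[i]:
                x[i], fit[i] = trial, f_trial
            else:
                # Unsuccessful moves also nudge the parameter (assumed self-tuning rule)
                alpha[i] = np.clip(alpha[i] * np.random.uniform(0.8, 1.2), 0.01, 1.0)

    best = np.argmin(fit)
    return x[best], fit[best], alpha[best]

# Toy usage: the returned alpha is the step size that worked best for this problem
print(self_tuning_optimize(lambda v: float(np.sum(v ** 2)), dim=5, lo=-5.0, hi=5.0))
```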