Similar Literature
1.
The simulated annealing (SA) algorithm is recognized as a powerful technique for minimizing complicated functions, but a critical disadvantage is its high computational cost. The goal of this paper is therefore to investigate the use of the critical temperature in SA to reduce that cost. The paper presents a systematic study of the critical temperature and its applications in minimizing functions of continuous variables with the SA algorithm. Based on this study, a new algorithm was developed to exploit the unique feature of the critical temperature in SA: it combines SA and local search to determine the global minimum effectively. Extensive tests on a variety of functions demonstrate that the new algorithm provides performance comparable to well-established SA techniques. Furthermore, it also improves the determination of the starting temperature for the SA algorithm. The results of this study are expected to be useful for improving the efficiency of SA algorithms and for facilitating the development of temperature-parallel SA algorithms.
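A minimal sketch of the SA-plus-local-search combination described above, assuming a generic objective `f` over a list of continuous variables; the geometric cooling schedule, the critical-temperature cutoff `t_crit`, and the coordinate-descent refinement are illustrative choices, not the paper's exact method.

```python
import math
import random

def simulated_annealing(f, x0, t_start=10.0, t_crit=0.1, alpha=0.95, steps=200):
    """Minimize f starting from x0; stop cooling near an assumed critical temperature,
    then hand the best point to a local search (illustrative schedule, not the paper's)."""
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t_start
    while t > t_crit:
        for _ in range(steps):
            cand = [xi + random.gauss(0.0, t) for xi in x]            # temperature-scaled move
            fc = f(cand)
            if fc < fx or random.random() < math.exp((fx - fc) / t):  # Metropolis rule
                x, fx = cand, fc
                if fc < fbest:
                    best, fbest = list(cand), fc
        t *= alpha                                                    # geometric cooling
    return local_search(f, best)                                      # refine below t_crit

def local_search(f, x, step=1e-3, iters=1000):
    """Simple coordinate-wise descent used as the final refinement stage."""
    fx = f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            break
    return x, fx
```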

2.
Jaya is a population-based heuristic optimization algorithm proposed for solving constrained and unconstrained optimization problems. The feature distinguishing Jaya from other population-based algorithms is that it updates the positions of the artificial agents in the population by considering both the best and the worst individuals, an important property for balancing exploration and exploitation of the solution space. However, the basic Jaya cannot be applied to binary optimization problems, where the solution space is discrete and the decision variables are elements of the set {0,1}. In this study, we first focus on discretizing Jaya by using a logic operator, exclusive or (xor). The proposed idea is simple but effective: the solution update rule of Jaya is replaced with the xor operator, and when the obtained results are compared with state-of-the-art algorithms, the Jaya-based binary optimization algorithm, JayaX for short, produces better-quality results for the binary optimization problems addressed in the study. The benchmark problems are uncapacitated facility location problems and CEC2015 numeric functions, on which the performance of the algorithms is compared. To improve performance further, a local search module is also integrated into JayaX. The obtained results show that the proposed algorithm outperforms the compared algorithms in terms of solution quality and robustness.
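A sketch of the xor-based update idea for 0/1 vectors, moving toward the best individual and away from the worst; the random masks `r1`/`r2` are assumptions, and the authors' exact JayaX update rule may differ.

```python
import random

def jayax_update(x, best, worst):
    """One xor-style position update for a 0/1 vector (sketch, not the authors' exact rule)."""
    r1 = [random.randint(0, 1) for _ in x]
    r2 = [random.randint(0, 1) for _ in x]
    new = []
    for xi, bi, wi, a, b in zip(x, best, worst, r1, r2):
        toward_best = a & (xi ^ bi)         # randomly flip bits that differ from the best
        away_worst = b & (xi ^ (1 - wi))    # randomly flip bits that agree with the worst
        new.append(xi ^ toward_best ^ away_worst)
    return new
```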

3.
The area under the ROC curve (AUC) provides a good scalar measure of ranking performance without requiring a specific threshold for comparing classifiers. AUC is useful in imprecise environments since it is independent of class distributions and misclassification costs. Direct optimization of the AUC criterion thus becomes a natural choice for binary classifier design. However, a direct formulation based on the AUC criterion requires a high computational cost because the number of input pairs grows drastically. In this paper, we propose an online learning algorithm that circumvents this computational problem for binary classification. Unlike conventional recursive formulations, the proposed formulation involves a pairwise cost function that pairs a newly arrived data point with stored data points of the opposite class. Moreover, by incorporating sparse learning into the online formulation, the computational effort can be significantly reduced. Our empirical results on public databases of three different scales show promising potential in terms of classification AUC, accuracy, and computational efficiency.
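A sketch of the pairwise online idea under a hinge-loss assumption: each new point is paired against buffered points of the opposite class. The buffer handling, loss, and learning rate are illustrative; the paper's cost function and sparse-learning step are not reproduced here.

```python
import numpy as np

def online_auc_step(w, x, y, buffer_pos, buffer_neg, lr=0.01, margin=1.0):
    """One online update with a pairwise hinge loss: the new point x (label y in {+1,-1})
    is paired with stored points of the opposite class (simplified sketch)."""
    opposite = buffer_neg if y > 0 else buffer_pos
    for x_opp in opposite:
        diff = (x - x_opp) if y > 0 else (x_opp - x)   # the positive example should score higher
        if w @ diff < margin:                          # pairwise hinge violation
            w = w + lr * diff
    (buffer_pos if y > 0 else buffer_neg).append(x)    # store the new point for future pairs
    return w
```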

4.
Differential evolution (DE) is an efficient population-based algorithm for solving real-valued optimization problems. It has the advantage of relatively simple and efficient mutation and crossover operators. However, the DE operators are based on floating-point representation only and are difficult to use when solving combinatorial optimization problems. In this paper, a modified binary differential evolution (MBDE) based on a binary bit-string framework with a simple, new binary mutation mechanism is proposed. Two test functions are used to verify the MBDE framework with the new binary mutation mechanism, and four structural topology optimization problems are used to study the performance of the proposed MBDE algorithm. The experimental studies show that MBDE is not only suitable for structural topology optimization but also highly viable for solving numerical optimization problems.
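A hedged illustration of a bit-string mutation in the spirit of MBDE; the donor selection and flip probability `pm` are assumptions, not the paper's exact operator.

```python
import random

def binary_mutation(pop, i, pm=0.2):
    """Produce a mutant for individual i of a bit-string population: copy a randomly
    chosen donor and flip each bit with probability pm (illustrative operator)."""
    donor = random.choice([p for j, p in enumerate(pop) if j != i])
    return [bit ^ 1 if random.random() < pm else bit for bit in donor]
```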

5.
This paper applies a revised configuration-generation mechanism of the simulated annealing (SA) algorithm to minimize total tardiness in job shop scheduling problems. In addition to always generating feasible configurations, the revised mechanism can exclude some cost-non-decreasing configurations in advance. The revised SA method is compared with a more tailored algorithm (MEHA) and two other SA approaches. Computational results indicate that the solution quality of the SA approaches outperforms that of MEHA, and among the three SA approaches the revised SA performs best. Moreover, the SA approaches differ insignificantly in computational time.

6.
Simulated annealing (SA) is a single-solution-based metaheuristic technique modeled on the annealing process in metallurgy. It is one of the best-known metaheuristic algorithms due to its simplicity and good performance. Despite these attractive characteristics, SA suffers from several limitations, such as premature convergence. Japanese swordsmithing, on the other hand, refers to the labor-intensive process of producing high-quality bladed weapons from impure raw metals, during which Japanese smiths fold and reheat pieces of metal multiple times to eliminate impurities and defects. In this paper, an improved version of the SA algorithm is presented. In the new approach, a population of agents is considered, and each agent conducts a search strategy based on a modification of the SA scheme. The proposed algorithm extends the original SA with two new operators, folding and reheating, inspired by the ancient Japanese swordsmithing technique. Under the new approach, folding is conceived as a compression of the search space, while reheating reinitializes the cooling process of the original SA scheme. With this inclusion, the new algorithm maintains the computational structure of the SA method while improving its search capabilities. To evaluate its performance, the proposed algorithm is tested on a set of 28 benchmark functions, which include multimodal, unimodal, composite, and shifted functions, and on 3 real-world optimization problems. The results demonstrate the high performance of the proposed method compared to the original SA and other popular state-of-the-art algorithms.
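One possible reading of the folding and reheating operators, sketched under the interpretation given in the abstract (folding compresses the search bounds around the incumbent, reheating restarts the cooling schedule); the rate and decay parameters are illustrative.

```python
def fold(bounds, best, rate=0.5):
    """Compress each dimension's [lo, hi] interval toward the best solution found so far."""
    return [(b - rate * (b - lo), b + rate * (hi - b)) for b, (lo, hi) in zip(best, bounds)]

def reheat(t_initial, cycle, decay=0.7):
    """Restart the cooling schedule at a progressively lower initial temperature
    after each folding cycle (illustrative decay factor)."""
    return t_initial * decay ** cycle
```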

7.
Feature selection is the basic pre-processing task of eliminating irrelevant or redundant features by investigating the complicated interactions among features in a feature set. Because of its critical role in classification and its computational cost, it has attracted researchers' attention for the last five decades, yet it still remains a challenge. This paper proposes a binary artificial bee colony (ABC) algorithm for feature selection problems, developed by integrating evolution-based similarity search mechanisms into an existing binary ABC variant. The performance of the proposed algorithm is analyzed by comparing it with well-known variants of particle swarm optimization (PSO) and ABC, including standard binary PSO, new-velocity-based binary PSO, quantum-inspired binary PSO, discrete ABC, modification-rate-based ABC, angle-modulated ABC, and genetic algorithms, on 10 benchmark datasets. The results show that the proposed algorithm obtains higher classification performance on both training and test sets and eliminates irrelevant and redundant features more effectively than the other approaches. Note that all the algorithms used in this paper, except standard binary PSO and GA, are employed for the first time in feature selection.

8.
In this article we identify a class of two-dimensional knapsack problems with binary weights, and related three-criteria unconstrained combinatorial optimization problems, that can be solved in polynomial time by greedy algorithms. Starting from the knapsack problem with two equality constraints, we show that this problem can be solved efficiently by using an appropriate partitioning of the items with respect to their binary weights. Based on the results for this problem, we derive an algorithm for the three-criteria unconstrained combinatorial optimization problem with two binary objectives that exploits the connectedness of the set of efficient knapsacks with respect to a combinatorial definition of adjacency. Furthermore, we prove that our approach is asymptotically optimal and provide extensive computational experiments showing that we can solve the three-criteria problem with up to one million items in less than half an hour. Finally, we derive an efficient algorithm for the two-dimensional knapsack problem with binary constraints that relies only on the results obtained for the unconstrained three-criteria problem with binary weights.

9.

Video surveillance cameras capture huge amounts of data 24 hours a day, but most of these videos contain redundant data, which makes browsing and analysis difficult. Substantial research has been done on summarization of recorded video, but such schemes have had little impact on video surveillance applications. Video synopsis, in contrast, is a smart technology that preserves all the activities of every object and projects them concurrently in a condensed time span. The energy minimization module in the video synopsis framework plays a vital role, minimizing the activity loss, the number of collisions, and the temporal consistency cost. In most reported schemes, the simulated annealing (SA) algorithm is employed to solve the energy minimization problem; however, it suffers from a slow convergence rate, resulting in a high computational load. To mitigate this issue, this article presents an improved energy minimization scheme that hybridizes SA with the Teaching-Learning-Based Optimization (TLBO) algorithm. The suggested framework for static surveillance video synopsis generation consists of four computational modules, namely object detection and segmentation, tube formation, optimization, and finally stitching, with the central focus on the optimization module. The present work thus deals with an improved hybrid energy minimization problem aiming at a globally optimal solution with reduced computational time. The motivation behind the hybridization (HSATLBO) is that the TLBO algorithm searches rigorously and reaches the optimum with little computation, whereas SA reaches the global optimum but may wander and miss some critical search points. Exhaustive experiments are carried out and the results are compared with benchmark schemes in terms of minimizing the activity, collision, and temporal consistency costs. All experiments are conducted on five widely used videos taken from standard surveillance video datasets (PETS 2001, MIT Surveillance Dataset, ChangeDetection.Net, PETS 2006, and UMN Dataset) as well as one surveillance video generated from the IIIT Bhubaneswar Surveillance Dataset. For a fair comparison, the performance of the proposed hybrid scheme on the video synopsis optimization problem is additionally compared with other benchmark functions. Experimental evaluation and analysis confirm that the proposed scheme outperforms other state-of-the-art approaches and can be easily and reliably deployed for off-line video synopsis generation.


10.
In this paper we study the bi-objective minimum cost flow (BMCF) problem, a member of the class of multi-objective minimum cost flow problems. In general, exact computation of the efficient frontier is intractable, and there may exist an exponential number of extreme non-dominated objective vectors; it is therefore better to employ an approximate method that computes solutions within reasonable time. We propose a hybrid metaheuristic, a memetic algorithm hybridized with simulated annealing (MA/SA), as an efficient approach for solving this problem. To show the efficiency of the proposed MA/SA, a set of problems has been generated and solved by both MA/SA and an exact method. The evaluation shows that the MA/SA outputs are very close to the exact solutions, and that when the number of arcs and nodes exceeds 30 (large problems) the MA/SA model is preferable because its computational time is much shorter than that of exact methods.

11.
Latent class models with crossed subject-specific and test(rater)-specific random effects have been proposed to estimate the diagnostic accuracy (sensitivity and specificity) of a group of binary tests or binary ratings. However, the computation of these models is hindered by a complicated Monte Carlo Expectation-Maximization (MCEM) algorithm. In this article, a class of pseudo-likelihood functions is developed for conducting statistical inference with crossed random-effects latent class models in diagnostic medicine. Theoretically, the maximum pseudo-likelihood estimator remains consistent and asymptotically normal. Numerically, our results show that the pseudo-likelihood approach not only significantly reduces the computational time but also has efficiency comparable to the MCEM algorithm. In addition, dimension-wise likelihood, one of the proposed pseudo-likelihoods, demonstrates superior performance in estimating sensitivity and specificity.

12.
Computational problems involving large-scale data have been gaining attention recently due to better hardware and, hence, the higher dimensionality of images and data sets acquired in applications. In the last couple of years, non-smooth minimization problems such as total variation minimization have become increasingly important for these tasks. While favorable because of the improved enhancement of images compared to smooth imaging approaches, non-smooth minimization problems typically scale badly with the dimension of the data. Hence, for large imaging problems solved by total variation minimization, domain decomposition algorithms have been proposed, aiming to split one large problem into N>1 smaller problems that can be solved on parallel CPUs. The N subproblems constitute constrained minimization problems, where the constraint forces the support of the minimizer to lie in the respective subdomain. In this paper we discuss a fast computational algorithm for domain decomposition in total variation minimization. In particular, we accelerate the computation of the subproblems by nested Bregman iterations. We propose a Bregmanized Operator Splitting-Split Bregman (BOS-SB) algorithm, which enforces the restriction onto the respective subdomain by a Bregman iteration that is subsequently solved by a Split Bregman strategy. The computational performance of this new approach is discussed for image inpainting and image deblurring. The proposed solution technique turns out to be up to three times faster than the iterative algorithm currently used in domain decomposition methods for total variation minimization.

13.
The problem of minimizing a multiextremal quadratic functional defined on a state space with binary variables is studied. To accelerate the computation of the local field (an analog of the gradient in a continuous space), it is proposed to binarize the matrix from which the functional is constructed, coarsening its entries by sign to the values 0 and ±1. It is shown that the binarization procedure can be performed in an optimal way such that the calculated direction of the local field coincides with considerable probability with its true direction. The procedure is oriented toward problems in configuration spaces of high dimension, since binarizing the matrix considerably reduces both the required amount of main memory and the computational complexity of the algorithm.
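A small sketch of the binarization step as described above: entries are coarsened by sign to {-1, 0, +1} so the local field can be computed with integer arithmetic. The thresholding rule is an assumption; the paper derives an optimal choice.

```python
import numpy as np

def binarize_matrix(J, threshold=0.0):
    """Coarsen matrix entries to {-1, 0, +1} by sign; entries with magnitude at or
    below the (assumed) threshold are set to 0."""
    B = np.sign(J).astype(np.int8)
    B[np.abs(J) <= threshold] = 0
    return B

def local_field(B, s):
    """Local field of a binary state vector s in {-1, +1}^n, computed from the binarized matrix."""
    return B @ s
```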

14.
The Apriori algorithm needs to scan the database many times when mining frequent itemsets, and the resulting frequent I/O operations lead to low efficiency. To improve execution efficiency, the BE-Apriori (binary encoded Apriori) algorithm is proposed. It exploits the advantages of binary numbers over the various data structures of programming languages in memory usage and computation speed: transaction records are binary-encoded and loaded into memory, and equivalent operations on binary numbers replace operations on sets. The performance of the algorithm is analyzed, and BE-Apriori is experimentally validated on the mushroom data from the UCI data set. The results show that BE-Apriori mines frequent itemsets correctly and performs better than the Apriori algorithm.
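A sketch of the binary-encoding idea behind BE-Apriori: each transaction becomes an integer bitmask, and support counting reduces to a bitwise AND. The function names are hypothetical, and the full Apriori candidate-generation loop is omitted.

```python
def encode_transactions(transactions, items):
    """Encode each transaction as an integer bitmask (one bit per item), so that
    itemset containment becomes a bitwise AND."""
    index = {item: i for i, item in enumerate(items)}
    return [sum(1 << index[it] for it in t) for t in transactions]

def support(encoded, itemset_mask):
    """Count transactions containing the itemset: a transaction contains it iff
    AND-ing with the mask leaves the mask unchanged."""
    return sum(1 for t in encoded if t & itemset_mask == itemset_mask)

# Usage sketch: items = ['a', 'b', 'c']
# tx = encode_transactions([{'a', 'b'}, {'b', 'c'}], items)
# support(tx, 0b010)  -> 2 (both transactions contain item 'b')
```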

15.
The particle filter is an important method for tracking moving targets. To address the heavy computational load and insufficient real-time performance of the traditional particle filter in target tracking, a fast particle filter tracking algorithm based on binary mask images is proposed. The algorithm generates a binary mask image at each frame-processing stage of the traditional particle filter and then, combined with a weight-selection method, removes low-weight particles in the background while retaining the important high-weight particles. The proposed algorithm effectively reduces the number of particles involved in the computation and saves computational cost, thereby improving the real-time performance of target tracking. Compared with the traditional particle filter, experimental results show that the proposed algorithm not only effectively improves tracking speed but also enhances the accuracy and robustness of the tracking results.
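A sketch of the mask-based particle pruning described above, assuming `particles` is an N x 2 array of (x, y) positions and `mask` is a binary foreground image; the keep ratio and thresholds are illustrative, not the paper's settings.

```python
import numpy as np

def prune_with_mask(particles, weights, mask, keep_ratio=0.5):
    """Drop particles lying on background pixels of the binary mask, keep only the
    higher-weight remainder, and renormalize the weights."""
    xs = particles[:, 0].astype(int)
    ys = particles[:, 1].astype(int)
    on_fg = mask[ys, xs] > 0                          # particle lies on a foreground pixel
    particles, weights = particles[on_fg], weights[on_fg]
    if len(weights) == 0:                             # nothing left on the foreground
        return particles, weights
    order = np.argsort(weights)[::-1]                 # sort by weight, descending
    keep = order[:max(1, int(keep_ratio * len(order)))]
    kept_w = weights[keep]
    return particles[keep], kept_w / kept_w.sum()     # renormalize remaining weights
```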

16.
A Hybrid SPMD Simulated Annealing Algorithm and Its Application
Because simulated annealing has the attractive mathematical property of converging to the global optimum with probability 1, and because the algorithm itself is independent of the specific problem, it is widely used for all kinds of combinatorial optimization problems. However, simulated annealing also suffers from slow convergence, long execution times, dependence of performance on the initial value, and sensitivity to parameters, which make it inefficient or even infeasible in many applications. This paper proposes a hybrid SPMD simulated annealing algorithm that overcomes the inherent serial nature of classical simulated annealing, further combines it with a downhill (local descent) method, and integrates several optimization techniques. Within a certain range of processor counts it achieves scalability and parallel speedup, significantly improves the convergence rate, and removes the excessive dependence of performance on initial values and parameter choices, improving the algorithm's performance while making it easier to use. The algorithm has been implemented on the THNPSC-1 cluster system and applied to a quantitative electron crystallography problem in materials science, reducing the solution time and improving the solution quality.
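A minimal sketch of the SPMD structure, assuming independent SA chains run in parallel processes and the best result is kept; the downhill refinement, information exchange, and cooling parameters of the paper are not reproduced.

```python
from multiprocessing import Pool
import math
import random

def sa_chain(args):
    """One independent SA chain (the paper additionally combines SA with a downhill
    method and information exchange; those steps are omitted in this sketch)."""
    f, x0, seed = args
    random.seed(seed)
    x, fx, t = list(x0), f(x0), 1.0
    while t > 1e-3:
        cand = [xi + random.gauss(0.0, t) for xi in x]
        fc = f(cand)
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
        t *= 0.99                                      # geometric cooling
    return fx, x

def parallel_sa(f, starts):
    """Run one chain per starting point in parallel (SPMD style) and keep the best.
    f must be a module-level function so multiprocessing can pickle it."""
    with Pool() as pool:
        results = pool.map(sa_chain, [(f, x0, i) for i, x0 in enumerate(starts)])
    return min(results)                                # (best value, best point)
```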

17.
Simulated annealing (SA) is an optimization technique that can handle cost functions with arbitrary degrees of nonlinearity, discontinuity, and stochasticity, as well as arbitrary boundary conditions and constraints imposed on these cost functions. The SA technique is applied here to the problem of robot path planning. Three situations are considered: the path represented as a polyline, as a Bézier curve, and as a spline-interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, which increases the number of accepted solutions; the sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate.
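A sketch of one common way to realize the per-parameter sensitivity idea: scale each parameter's proposal step toward a target acceptance rate. The target rate and scaling factors are assumptions, not the paper's mechanism.

```python
def adapt_step_sizes(steps, accept_counts, trial_counts, target=0.4):
    """Scale each parameter's proposal step so its acceptance rate moves toward a
    target rate (illustrative adaptation rule)."""
    new_steps = []
    for s, a, n in zip(steps, accept_counts, trial_counts):
        rate = a / n if n else target
        new_steps.append(s * (1.2 if rate > target else 0.8))
    return new_steps
```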

18.
In this paper, we study the configuration of machine cells in the presence of alternative routings for part types. The objective is to minimize transportation costs. Limits on cell sizes as well as separation constraints (machines that are not allowed to be placed in the same cell) and co-location constraints (machines that must be placed in the same cell) may be imposed. An efficient tabu search (TS) algorithm is proposed to solve this problem. Extensive computational experiments with large-size problems show that this method outperforms some existing simulated annealing (SA) approaches.

19.
The team orienteering problem (TOP) is known to be NP-complete. A set of locations is given, and a score is collected for visiting each location; the objective is to maximize the total score given a fixed time limit for each available tour. Given the computational complexity of this problem, a multi-start simulated annealing (MSA) algorithm that combines an SA-based metaheuristic with a multi-start hill climbing strategy is proposed to solve TOP. To verify the developed MSA algorithm, computational experiments are performed on well-known benchmark problems with between 42 and 102 locations. The experimental results demonstrate that the multi-start hill climbing strategy significantly improves the performance of traditional single-start SA. The proposed MSA algorithm is also highly effective compared to state-of-the-art metaheuristics on the same benchmark instances, obtaining the best solution for 135 of the 157 benchmark problems, including five new best solutions. In terms of both solution quality and computational expense, this study constructs a high-performance method for solving this challenging problem.
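A minimal sketch of the multi-start wrapper, with `search` standing in for the SA-plus-hill-climbing core and `random_start` for tour construction; both are user-supplied placeholders, and the restart count is illustrative.

```python
def multi_start_search(search, random_start, restarts=10):
    """Run a single-start search from several random initial solutions and keep
    the highest-scoring result (TOP is a maximization problem)."""
    best_score, best_tours = float('-inf'), None
    for _ in range(restarts):
        score, tours = search(random_start())
        if score > best_score:
            best_score, best_tours = score, tours
    return best_score, best_tours
```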

20.
In recent years several authors have investigated binary search trees with minimal internal path length. In this paper we propose relaxing the requirement of inserting all nodes on one level before moving to the next level, which leads to a new class of binary search trees called ISA[k] trees. We investigate the average locate cost per node, average shift cost per node, total insertion cost, and average successful search cost for ISA[k] trees, and present an insertion algorithm with associated predecessor and successor functions. For large binary search trees (over 160 nodes) our results suggest using ISA[2] or ISA[3] trees for best performance.

