Similar Documents
20 similar documents found
1.
A Fast-Converging Evolution Strategy for Unimodal Function Optimization
For the optimization of unimodal functions, this paper gives a fast-converging evolution strategy. First, for this class of problems, a uniformly distributed random variable is used as the mutation operator in place of the Gaussian mutation of traditional evolution strategies, reducing the cost of generating the random population. Second, the best individuals of the current and the previous generation are used to determine a half-space, and the next generation is sampled in the half-space that contains more descent points, so that the algorithm converges quickly. Preliminary numerical results show that the method clearly improves computational efficiency.
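A minimal Python sketch of the sampling scheme this abstract describes — uniform mutation restricted to the half-space defined by the current and previous best individuals. The population size, step-decay rule, and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sphere(x):
    """Unimodal test function (sphere)."""
    return float(np.sum(x ** 2))

def halfspace_es(f, x0, step=1.0, pop_size=20, iters=200, seed=0):
    """Sketch: evolution strategy with uniform mutation and
    half-space sampling, as described in the abstract."""
    rng = np.random.default_rng(seed)
    best = np.asarray(x0, dtype=float)
    prev_best = best + rng.uniform(-step, step, best.shape)
    if f(prev_best) < f(best):
        best, prev_best = prev_best, best
    for _ in range(iters):
        # The direction from the previous best to the current best
        # defines the half-space expected to contain more descent points.
        d = best - prev_best
        offspring = []
        for _ in range(pop_size):
            u = rng.uniform(-step, step, best.shape)  # uniform mutation
            if np.dot(u, d) < 0:       # reflect samples into the half-space
                u = -u
            offspring.append(best + u)
        cand = min(offspring, key=f)
        if f(cand) < f(best):
            prev_best, best = best, cand
        else:
            step *= 0.9                # assumed simple step-size decay
    return best

print(halfspace_es(sphere, np.ones(5)))
```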

2.
Shang  Yi  Li  Longzhuang 《World Wide Web》2002,5(2):159-173
In this paper, we present a general approach for statistically evaluating precision of search engines on the Web. Search engines are evaluated in two steps based on a large number of sample queries: (a) computing relevance scores of hits from each search engine, and (b) ranking the search engines based on statistical comparison of the relevance scores. In computing relevance scores of hits, we study four relevance scoring algorithms. Three of them are variations of algorithms widely used in the traditional information retrieval field. They are cover density ranking, Okapi similarity measurement, and vector space model algorithms. In addition, we develop a new three-level scoring algorithm to mimic commonly used manual approaches. In ranking the search engines in terms of precision, we apply a statistical metric called probability of win. In our experiments, six popular search engines, AltaVista, Fast, Google, Go, iWon, and NorthernLight, were evaluated based on queries from two domains of interest: parallel and distributed processing, and knowledge and data engineering. The first query set contains 1726 queries collected from the index terms of papers published in the IEEE Transactions on Knowledge and Data Engineering. The second set contains 1383 queries collected from the index terms of papers published in the IEEE Transactions on Parallel and Distributed Systems. Search engines were queried and compared in two different search modes: the default search mode and the exact phrase search mode. Our experimental results show that these six search engines performed differently under different search modes and scoring methods. Overall, Google was the best. NorthernLight was mostly second in the default search mode, whereas iWon was mostly second in the exact phrase search mode.
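The abstract does not define the "probability of win" statistic exactly; the sketch below is a hedged stand-in that estimates, by bootstrapping per-query relevance scores, how often one engine's mean score beats another's.

```python
import random

def prob_of_win(scores_a, scores_b, resamples=10000, seed=0):
    """Estimate P(engine A outscores engine B) by bootstrapping the
    per-query relevance scores. A stand-in for the paper's
    'probability of win' metric, whose exact definition is not
    given in the abstract."""
    rng = random.Random(seed)
    n, wins = len(scores_a), 0
    for _ in range(resamples):
        idx = [rng.randrange(n) for _ in range(n)]
        mean_a = sum(scores_a[i] for i in idx) / n
        mean_b = sum(scores_b[i] for i in idx) / n
        wins += mean_a > mean_b
    return wins / resamples

# Toy per-query scores on a 0-1 relevance scale (hypothetical data).
print(prob_of_win([0.8, 0.6, 0.9, 0.7], [0.5, 0.7, 0.6, 0.6]))
```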

3.
Because of changes in the computer market, the problem of efficiently evaluating elementary functions (which had seemed to be solved) has become important again. In this paper, after a brief review of current approaches to this problem, algorithms for finding guaranteed bounds for values of elementary functions are suggested. The evaluation time is reduced by increasing the amount of memory used.
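A minimal illustration of the time/memory trade-off for guaranteed bounds: a larger precomputed table gives tighter enclosures with only a lookup at evaluation time. This is not the paper's algorithm, and a production version would need directed rounding of the table entries.

```python
import math

# Precompute exp on a uniform grid once; more memory => tighter,
# faster bounds, illustrating the trade-off the abstract mentions.
STEP = 1e-3
TABLE = [math.exp(i * STEP) for i in range(int(1.0 / STEP) + 2)]

def exp_bounds(x):
    """Enclosure of exp(x) for x in [0, 1], using only table lookups
    and the monotonicity of exp. Table entries are rounded by
    math.exp; a real implementation would round them outward."""
    assert 0.0 <= x <= 1.0
    i = int(x / STEP)
    return TABLE[i], TABLE[i + 1]   # exp is increasing on each cell

lo, hi = exp_bounds(0.5)
print(lo <= math.exp(0.5) <= hi, hi - lo)
```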

4.
To improve the global search performance of the original harmony search algorithm, a harmony search algorithm based on iterated local search is proposed. While making full use of the information in the harmony memory and improving search efficiency, the algorithm applies an iterated local search based on an improved kick-move strategy to new harmonies that fail the stopping criterion, giving the new algorithm a strong hill-climbing capability. The new algorithm was tested on four benchmark functions and compared with particle swarm optimization and several existing algorithms; the results show that it escapes local optima more reliably, converges faster, and reaches higher accuracy. Finally, the new algorithm was applied to the optimal design of a welded beam, and simulation results confirm that it outperforms the original harmony search algorithm, genetic algorithms, and other methods on the minimum-cost welded beam problem.
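A compact Python sketch of the hybrid described above: harmony search whose rejected new harmonies are refined by a local "kick" move. The parameter values, kick size, and acceptance rule are assumptions for illustration.

```python
import random

def hs_ils(f, dim, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=1):
    """Sketch: harmony search; new harmonies that would be rejected
    are given one iterated-local-search kick before the final test."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    memory.sort(key=f)
    for _ in range(iters):
        new = []
        for j in range(dim):
            if rng.random() < hmcr:                 # draw from memory
                v = rng.choice(memory)[j]
                if rng.random() < par:              # pitch adjustment
                    v += rng.uniform(-0.01, 0.01) * (hi - lo)
            else:
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        if f(new) >= f(memory[-1]):                 # rejected: kick move
            kicked = [min(hi, max(lo, v + rng.gauss(0, 0.1))) for v in new]
            if f(kicked) < f(new):
                new = kicked
        if f(new) < f(memory[-1]):                  # replace worst harmony
            memory[-1] = new
            memory.sort(key=f)
    return memory[0]

sphere = lambda x: sum(v * v for v in x)
print(sphere(hs_ils(sphere, dim=5, bounds=(-5.0, 5.0))))
```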

5.
Local Search (LS) has proven to be an efficient optimisation technique in clustering applications and in the minimisation of stochastic complexity of a data set. In the present paper, we propose two ways of organising LS in these contexts, the Multi-operator Local Search (MOLS) and the Adaptive Multi-Operator Local Search (AMOLS), and compare their performance to the single-operator (random swap) LS method and to repeated GLA (Generalised Lloyd Algorithm). Both of the proposed methods use several different LS operators to solve the problem. MOLS applies the operators cyclically in the same order, whereas AMOLS adapts itself to favour the operators which manage to improve the result more frequently. We use a large database of binary vectors representing strains of bacteria belonging to the family Enterobacteriaceae and a binary image as our test materials. The new techniques turn out to be very promising in these tests.
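A sketch of the adaptive variant (AMOLS): several operators compete, and those that improve the solution more often are chosen more often. The credit rule (success counts with an optimistic floor) is an illustrative assumption; MOLS would simply cycle through the operators instead.

```python
import random

def amols(solution, evaluate, operators, iters=1000, seed=2):
    """Adaptive Multi-Operator Local Search sketch: operator choice
    is weighted by how often each operator has improved the result."""
    rng = random.Random(seed)
    credit = [1.0] * len(operators)          # optimistic initial credit
    best, best_cost = solution, evaluate(solution)
    for _ in range(iters):
        i = rng.choices(range(len(operators)), weights=credit)[0]
        cand = operators[i](best, rng)
        cost = evaluate(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
            credit[i] += 1.0                 # reward the useful operator
    return best, best_cost

# Toy usage: minimise a quadratic with two mutation operators.
small_step = lambda x, rng: [v + rng.uniform(-0.1, 0.1) for v in x]
big_step   = lambda x, rng: [v + rng.uniform(-1.0, 1.0) for v in x]
print(amols([3.0, -2.0], lambda x: sum(v * v for v in x),
            [small_step, big_step]))
```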

6.
Comet is an object-oriented language supporting a constraint-based architecture for local search through declarative and search components. This paper proposes three novel and lightweight control abstractions for the search component, significantly enhancing the compositionality, modularity, and reuse of Comet programs. These abstractions, which include events and checkpoints, rely on first-class closures as the enabling technology. They are especially useful for expressing, in a modular way, heuristics and meta-heuristics, unions of heterogeneous neighborhoods, and sequential composition of neighborhoods.
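To make the closure-based event idea concrete, here is a tiny transposition to Python (not Comet's actual API): handlers are first-class closures, so search components can observe the search without being entangled with it.

```python
class Event:
    """Minimal event abstraction in the spirit of the control
    abstractions described for Comet, transposed to Python."""
    def __init__(self):
        self.handlers = []
    def subscribe(self, fn):
        self.handlers.append(fn)       # fn is a first-class closure
    def fire(self, *args):
        for fn in self.handlers:
            fn(*args)

# Hypothetical usage inside a local search loop.
improved = Event()
best_so_far = []
improved.subscribe(lambda cost: best_so_far.append(cost))
improved.subscribe(lambda cost: print("new best:", cost))

for cost in [10, 8, 8, 5]:
    if not best_so_far or cost < best_so_far[-1]:
        improved.fire(cost)
```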

7.
One of the most appealing features of constraint programming is its rich constraint language for expressing combinatorial optimization problems. This paper demonstrates that traditional combinators from constraint programming have natural counterparts for local search, although their underlying computational model is radically different. In particular, the paper shows that constraint combinators, such as logical and cardinality operators, reification, and first-class expressions, can all be viewed as differentiable objects. These combinators naturally support elegant and efficient modeling, generic search procedures, and partial constraint satisfaction techniques for local search. Experimental results on a variety of applications demonstrate the expressiveness and the practicability of the combinators.
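A minimal Python sketch of what "constraints as differentiable objects" means for local search: each constraint reports a violation degree and can evaluate the effect of a move without committing it, and a logical combinator composes these additively. Class and method names are illustrative assumptions.

```python
class AllDifferent:
    """Constraint as a 'differentiable object': it reports a violation
    degree and evaluates move deltas without committing the move."""
    def __init__(self, xs):
        self.xs = xs
    def violations(self, assign):
        vals = [assign[x] for x in self.xs]
        return len(vals) - len(set(vals))
    def delta(self, assign, var, val):
        trial = dict(assign)
        trial[var] = val
        return self.violations(trial) - self.violations(assign)

class And:
    """Logical combinator: the violation of a conjunction is the sum
    of the violations of its parts, and deltas compose additively."""
    def __init__(self, *cs): self.cs = cs
    def violations(self, a): return sum(c.violations(a) for c in self.cs)
    def delta(self, a, var, val): return sum(c.delta(a, var, val) for c in self.cs)

a = {"x": 1, "y": 1, "z": 2}
c = And(AllDifferent(["x", "y"]), AllDifferent(["y", "z"]))
print(c.violations(a), c.delta(a, "y", 3))   # 1, -1
```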

8.
Let G=(V,E) be a finite graph, and let f be any real-valued function on V. The Local Search problem consists in finding a local minimum of the function f on G, that is, a vertex v such that f(v) is not larger than the value of f on the neighbors of v in G. In this note, we first prove a separation theorem slightly stronger than that of Gilbert, Hutchinson and Tarjan for graphs of constant genus. This result allows us to enhance a previously known deterministic algorithm for Local Search, reducing its query complexity as a function of n, d, and g, where n is the size of G, d is its maximum degree, and g is its genus. We also give a quantum version of our algorithm with a still smaller query complexity. Our deterministic and quantum algorithms have query complexities respectively smaller than those of the Randomized Steepest Descent algorithm of Aldous and the Quantum Steepest Descent algorithm of Aaronson for large classes of graphs, including graphs of bounded genus and planar graphs.
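A plain illustration of the problem being studied (not the genus-based algorithm of the paper): steepest descent finds a local minimum while counting the number of queries to f.

```python
def steepest_descent(graph, f, start):
    """Find a vertex whose f-value is no larger than all its
    neighbours' (a local minimum in the sense defined above),
    counting how many times f is queried."""
    queries = {}
    def q(v):
        if v not in queries:
            queries[v] = f(v)
        return queries[v]
    v = start
    while True:
        better = [u for u in graph[v] if q(u) < q(v)]
        if not better:
            return v, len(queries)
        v = min(better, key=q)   # move to the best neighbour

# 4-cycle with values: vertex 2 is the unique local minimum.
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
vals = {0: 5, 1: 3, 2: 1, 3: 4}
print(steepest_descent(graph, vals.get, 0))   # (2, 4)
```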

9.
A very challenging problem in the genetics domain is to infer haplotypes from genotypes. This process is expected to identify genes affecting health, disease and response to drugs. One of the approaches to haplotype inference aims to minimise the number of different haplotypes used, and is known as haplotype inference by pure parsimony (HIPP). The HIPP problem is computationally difficult, being NP-hard. Recently, a SAT-based method (SHIPs) has been proposed to solve the HIPP problem. This method iteratively considers an increasing number of haplotypes, starting from an initial lower bound. Hence, one important aspect of SHIPs is the lower bounding procedure, which reduces the number of iterations of the basic algorithm, and also indirectly simplifies the resulting SAT model. This paper describes the use of local search to improve existing lower bounding procedures. The new lower bounding procedure is guaranteed to be as tight as the existing procedures. In practice the new procedure is in most cases considerably tighter, allowing significant improvement of performance on challenging problem instances.

10.
11.
12.
This paper investigates the necessary features of an effective clause weighting local search algorithm for propositional satisfiability testing. Using the recent history of clause weighting as evidence, we suggest that the best current algorithms have each discovered the same basic framework, that is, to increase weights on false clauses in local minima and then to periodically normalize these weights using a decay mechanism. Within this framework, we identify two basic classes of algorithm according to whether clause weight updates are performed additively or multiplicatively. Using a state-of-the-art multiplicative algorithm (SAPS) and our own pure additive weighting scheme (PAWS), we constructed an experimental study to isolate the effects of multiplicative in comparison to additive weighting, while controlling other key features of the two approaches, namely, the use of pure versus flat random moves, deterministic versus probabilistic weight smoothing and multiple versus single inclusion of literals in the local search neighbourhood. In addition, we examined the effects of adding a threshold feature to multiplicative weighting that makes it indifferent to similar cost moves. As a result of this investigation, we show that additive weighting can outperform multiplicative weighting on a range of difficult problems, while requiring considerably less effort in terms of parameter tuning. Our examination of the differences between SAPS and PAWS suggests that additive weighting does benefit from the random flat move and deterministic smoothing heuristics, whereas multiplicative weighting would benefit from a deterministic/probabilistic smoothing switch parameter that is set according to the problem instance. We further show that adding a threshold to multiplicative weighting produces a general deterioration in performance, contradicting our earlier conjecture that additive weighting has better performance due to having a larger selection of possible moves. This leads us to explain differences in performance as being mainly caused by the greater emphasis of additive weighting on penalizing clauses with relatively less weight.
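A sketch of the common framework described above, in its additive (PAWS-style) form: in a local minimum, add 1 to the weight of every false clause, and periodically decrease all weights above 1 (deterministic smoothing). The smoothing period and move selection are illustrative assumptions; the real PAWS also uses probabilistic flat moves.

```python
import random

def paws_like(clauses, n_vars, max_flips=100000, smooth_every=40, seed=3):
    """Additive clause weighting local search for SAT (sketch).
    Clauses are lists of signed literals; variables are 1..n_vars."""
    rng = random.Random(seed)
    assign = [rng.choice([False, True]) for _ in range(n_vars + 1)]
    weight = [1] * len(clauses)

    def false_clauses():
        return [i for i, c in enumerate(clauses)
                if not any(assign[abs(l)] == (l > 0) for l in c)]

    def cost_if_flipped(v):
        assign[v] = not assign[v]
        cost = sum(weight[i] for i in false_clauses())
        assign[v] = not assign[v]
        return cost

    minima = 0
    for _ in range(max_flips):
        falses = false_clauses()
        if not falses:
            return assign                     # satisfying assignment
        cur = sum(weight[i] for i in falses)
        vs = {abs(l) for i in falses for l in clauses[i]}
        best = min(vs, key=cost_if_flipped)
        if cost_if_flipped(best) < cur:
            assign[best] = not assign[best]   # weighted improving flip
        else:
            for i in falses:                  # local minimum: add weights
                weight[i] += 1
            minima += 1
            if minima % smooth_every == 0:    # deterministic smoothing
                weight = [max(1, w - 1) for w in weight]
    return None

# (x1 or x2) and (not x1 or x2) and (x1 or not x2)
print(paws_like([[1, 2], [-1, 2], [1, -2]], n_vars=2))
```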

13.
Feature selection is a challenging task that has been the subject of a large amount of research, especially in relation to classification tasks. It eliminates redundant attributes and enhances classification accuracy by keeping only the relevant attributes. In this paper, we propose a hybrid search method based on both the harmony search algorithm (HSA) and stochastic local search (SLS) for feature selection in data classification. A novel probabilistic selection strategy is used in HSA–SLS to select the appropriate solutions to undergo stochastic local refinement, keeping a good compromise between exploration and exploitation. In addition, HSA–SLS is combined with a support vector machine (SVM) classifier with optimized parameters. The proposed HSA–SLS method tries to find a subset of features that maximizes the classification accuracy rate of the SVM. Experimental results show the good performance of our proposed method.
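A sketch of the SLS refinement step such a hybrid might apply to a harmony-memory solution: flip the feature bit that most improves the wrapper fitness, or a random bit with some walk probability. The fitness function (e.g. SVM cross-validation accuracy) is supplied by the caller; wp and the flip budget are assumptions.

```python
import random

def sls_refine(mask, fitness, flips=50, wp=0.2, seed=4):
    """Stochastic local search over a binary feature mask:
    greedy bit flips mixed with random-walk flips."""
    rng = random.Random(seed)
    best, best_fit = mask[:], fitness(mask)
    for _ in range(flips):
        if rng.random() < wp:
            i = rng.randrange(len(best))          # random walk move
        else:                                     # greedy move
            def gain(j):
                best[j] ^= 1
                g = fitness(best)
                best[j] ^= 1
                return g
            i = max(range(len(best)), key=gain)
        best[i] ^= 1
        fit = fitness(best)
        if fit >= best_fit:
            best_fit = fit
        else:
            best[i] ^= 1                          # undo worsening flip
    return best, best_fit

# Toy fitness: reward selecting features 0 and 2, penalise subset size.
toy = lambda m: 2 * (m[0] + m[2]) - sum(m)
print(sls_refine([0, 1, 0, 1], toy))
```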

14.
We extend previous work on the parameterized complexity of local search for the Traveling Salesperson Problem (TSP). So far, its parameterized complexity has been investigated with respect to the distance measures (defining the local search area) “Edge Exchange” and “Max-Shift”. We perform studies with respect to the distance measures “Swap” and “r-Swap”, “Reversal” and “r-Reversal”, and “Edit”, achieving both fixed-parameter tractability and W[1]-hardness results. In particular, from the parameterized reduction showing W[1]-hardness we infer running time lower bounds (based on the exponential time hypothesis) for all corresponding distance measures. Moreover, we provide non-existence results for polynomial-size problem kernels and we show that some in general W[1]-hard problems turn fixed-parameter tractable when restricted to planar graphs.

15.
Constant parallel-time cryptography makes it possible to perform complex cryptographic tasks at an ultimate level of parallelism, namely by local functions, in which each output bit depends on a constant number of input bits. A natural way to obtain local cryptographic constructions is to use random local functions, in which each output bit is computed by applying some fixed d-ary predicate P to a randomly chosen d-size subset of the input bits. In this work, we study the cryptographic hardness of random local functions. In particular, we survey known attacks and hardness results, discuss different flavors of hardness (one-wayness, pseudorandomness, collision resistance, public-key encryption), and mention applications to other problems in cryptography and computational complexity. We also present some open questions in the hope of developing a systematic study of the cryptographic hardness of local functions.
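The construction itself is simple to state in code: output bit i applies the fixed d-ary predicate P to a randomly chosen d-subset of the input bits. The particular predicate below is only an illustration, not one from the survey.

```python
import random

def random_local_function(n, m, d, P, seed=5):
    """Construct a random local function f: {0,1}^n -> {0,1}^m:
    output bit i applies the fixed d-ary predicate P to a randomly
    chosen d-subset of the input bits. Returns f and the (public)
    subset choices."""
    rng = random.Random(seed)
    subsets = [rng.sample(range(n), d) for _ in range(m)]
    def f(x):
        return [P(*(x[j] for j in S)) for S in subsets]
    return f, subsets

# Illustrative 3-ary predicate (an arbitrary choice for the demo).
P = lambda a, b, c: a ^ b ^ (a & c)
f, subsets = random_local_function(n=16, m=24, d=3, P=P)
x = [random.Random(6).randrange(2) for _ in range(16)]
print(f(x))
```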

16.
A Parallel Multi-Objective Genetic Neighborhood Search Algorithm
Although existing multi-objective genetic algorithms are effective for large-scale multi-objective production scheduling problems, they are often very time-consuming and hard to apply in practice. To improve efficiency, a parallel multi-objective genetic neighborhood search algorithm is proposed for computing the Pareto frontier. The algorithm divides the evolution directions of the multi-objective genetic algorithm into several ranges and then applies a multi-objective genetic neighborhood search to every range simultaneously, searching the Pareto frontier within each range in parallel; the subpopulations evolving in the different ranges periodically exchange their results. Parallelizing the multi-objective genetic neighborhood search improves solution accuracy and accelerates convergence without increasing solution time. Simulation results confirm the feasibility and effectiveness of the algorithm.
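A sequential Python simulation of the scheme sketched above: the space of evolution directions (weightings of two objectives) is split into ranges, one subpopulation searches each range, and the subpopulations periodically exchange their best members. Mutation and migration details are illustrative assumptions.

```python
import random

def parallel_mogns(f1, f2, dim, ranges=4, pop=20, gens=50, seed=7):
    """One subpopulation per direction range, with periodic exchange
    of best members (simulated sequentially for clarity)."""
    rng = random.Random(seed)
    def scal(x, w):                     # weighted-sum scalarisation
        return w * f1(x) + (1 - w) * f2(x)
    subs = []
    for r in range(ranges):
        w = (r + 0.5) / ranges          # centre of this direction range
        ind = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(pop)]
        subs.append((w, ind))
    for g in range(gens):
        for w, ind in subs:
            for i in range(pop):        # neighbourhood search / mutation
                cand = [v + rng.gauss(0, 0.1) for v in ind[i]]
                if scal(cand, w) < scal(ind[i], w):
                    ind[i] = cand
        if g % 10 == 9:                 # periodic exchange of best members
            bests = [min(ind, key=lambda x: scal(x, w)) for w, ind in subs]
            for (w, ind), b in zip(subs, reversed(bests)):
                ind[rng.randrange(pop)] = b[:]
    return [min(ind, key=lambda x: scal(x, w)) for w, ind in subs]

f1 = lambda x: sum(v * v for v in x)
f2 = lambda x: sum((v - 1) ** 2 for v in x)
front = parallel_mogns(f1, f2, dim=3)
print([(round(f1(x), 3), round(f2(x), 3)) for x in front])
```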

17.
Recent experiments demonstrated that local search algorithms (e.g. GSAT) are able to find satisfying assignments for many hard Boolean formulas. A wide experimental study of these algorithms demonstrated their good performance on some important classes of formulas as well as poor performance on some other ones. In contrast, theoretical knowledge of their worst-case behavior is very limited. However, many worst-case upper and lower bounds of the form 2^{cn} (where c < 1 is a constant) are known for other SAT algorithms, for example, resolution-like algorithms. In the present paper we prove both upper and lower bounds of this form for local search algorithms. The class of linear-size formulas we consider for the upper bound covers most of the DIMACS benchmarks; the satisfiability problem for this class of formulas is NP-complete.

18.
The #SMT problem is an extension of the SMT problem: it asks for the number of satisfying solutions of a first-order formula F. It has been widely applied in compiler optimization, hardware design, software verification, and automated reasoning. With its growing range of applications, solvers that can handle larger #SMT instances are urgently needed. For this reason, an approximate solver for larger #SMT instances, VolComputeWithLocalSearch, is designed. It adds differential evolution to an existing exact #SMT algorithm and obtains approximate answers by calling the volume-computation tool qhull. The algorithm uses a population-based rule to reduce the number of volume computations, while differential evolution quickly enumerates the satisfiable regions. In addition, it is proved theoretically that VolComputeWithLocalSearch yields a lower bound on the exact answer, so it can be applied in areas such as software testing where only a lower bound is needed. Experimental results show that VolComputeWithLocalSearch is stable, solves instances quickly, and performs well on high-dimensional problems.
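To make the volume-based counting idea concrete: over linear real arithmetic, "counting" solutions means measuring the volume of the region that satisfies all constraints. The Monte Carlo sketch below is only a hedged stand-in for that idea; the paper itself uses exact qhull-based volumes combined with differential evolution.

```python
import random

def volume_estimate(constraints, box, samples=200000, seed=8):
    """Estimate the volume of the region inside `box` satisfying all
    constraints, by uniform sampling. Illustration only; not the
    paper's qhull-based algorithm."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x = [rng.uniform(lo, hi) for lo, hi in box]
        if all(c(x) for c in constraints):
            hits += 1
    box_vol = 1.0
    for lo, hi in box:
        box_vol *= hi - lo
    return box_vol * hits / samples

# Region: x + y <= 1 inside the unit square (true area 0.5).
cs = [lambda x: x[0] + x[1] <= 1.0]
print(volume_estimate(cs, box=[(0.0, 1.0), (0.0, 1.0)]))
```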

19.
The search for p-median vertices on a network (graph) is a classical location problem. The p facilities (medians) must be located so as to minimize the sum of the distances from each demand vertex to its nearest facility. The Capacitated p-Median Problem (CPMP) additionally considers a service capacity for each median: the total service demanded by the vertices in a p-median cluster cannot exceed the cluster's service capacity. Primal-dual based heuristics are very competitive and provide upper and lower bounds on optimal solutions simultaneously. The Lagrangean/surrogate relaxation has been used recently to accelerate subgradient-like methods. The dual lower bound has the same quality as the usual Lagrangean relaxation dual but is obtained in modest computational time. This paper explores improvements of upper bounds by applying local search heuristics to solutions made feasible by the Lagrangean/surrogate optimization process. These heuristics are based on location-allocation procedures that swap medians and vertices inside the clusters, reallocate vertices, and iterate until no improvements occur. Computational results consider instances from the literature and real data obtained using a geographical information system.
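A minimal sketch of the median/vertex swap idea the local search builds on: try exchanging a current median with a non-median, keep the swap if the total assignment cost drops, and repeat until no swap helps. Capacity checks are omitted here for brevity, so this is the uncapacitated core of the move, not the paper's full procedure.

```python
def swap_improve(dist, medians, candidates):
    """First-improvement swap local search for the (uncapacitated)
    p-median core: vertices are assigned to their nearest median."""
    def cost(meds):
        return sum(min(dist[v][m] for m in meds) for v in range(len(dist)))
    meds = set(medians)
    improved = True
    while improved:
        improved = False
        for out in list(meds):
            for inn in candidates:
                if inn in meds:
                    continue
                trial = (meds - {out}) | {inn}
                if cost(trial) < cost(meds):
                    meds, improved = trial, True
                    break                  # restart scan after a swap
            if improved:
                break
    return sorted(meds), cost(meds)

# Tiny symmetric distance matrix over 4 vertices, p = 2.
dist = [[0, 2, 9, 9],
        [2, 0, 9, 9],
        [9, 9, 0, 1],
        [9, 9, 1, 0]]
print(swap_improve(dist, medians=[0, 1], candidates=range(4)))
```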

20.
The maximum clique problem (MCP) is one of the most popular combinatorial optimization problems with various practical applications. An important generalization of MCP is the maximum weight clique problem (MWCP), in which a positive weight is associated with each vertex. In this paper, we present Breakout Local Search (BLS), which can be applied to both the MCP and the MWCP without any particular adaptation. BLS explores the search space by a joint use of local search and adaptive perturbation strategies. Extensive experimental evaluations using the DIMACS and BOSHLIB benchmarks show that the proposed approach competes favourably with current state-of-the-art heuristic methods for the MCP. Moreover, it is able to provide new improved results for a number of MWCP instances. This paper also reports for the first time a detailed landscape analysis, which has been missing in the literature. This analysis not only explains the difficulty of several benchmark instances, but also justifies to some extent the behaviour of the proposed approach and the parameter settings used.
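A sketch of the breakout idea for clique search: intensify with greedy add moves, and when no vertex can be added, apply a perturbation whose strength grows with stagnation. The particular perturbation rule (drop k random members) is an illustrative assumption, not BLS's actual adaptive scheme.

```python
import random

def breakout_clique(adj, iters=5000, seed=9):
    """Local search for a large clique with stagnation-driven
    'breakout' perturbations (sketch)."""
    rng = random.Random(seed)
    vertices = list(adj)
    clique, best = set(), set()
    stagnation = 1
    for _ in range(iters):
        cands = [v for v in vertices
                 if v not in clique and all(u in adj[v] for u in clique)]
        if cands:
            clique.add(rng.choice(cands))       # greedy expansion
            if len(clique) > len(best):
                best, stagnation = set(clique), 1
        else:                                   # local optimum: breakout
            k = min(stagnation, max(1, len(clique) - 1))
            for v in rng.sample(sorted(clique), k):
                clique.discard(v)
            stagnation += 1
    return best

# Toy graph: {0, 1, 2} is the maximum clique.
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print(breakout_clique(adj))
```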
