Similar Literature
20 similar records found (search time: 15 ms)
1.
Many of today’s most successful planners perform a forward heuristic search. The accuracy of the heuristic estimates and the cost of their computation determine the performance of the planner. Thanks to the efforts of researchers in the area of heuristic search planning, modern algorithms are able to generate high-quality estimates. In this paper we propose to learn heuristic functions using artificial neural networks and support vector machines. This approach can be used to learn standalone heuristic functions, but also to improve standard planning heuristics. One of the most famous and successful heuristics for heuristic search planning is the one used by the Fast-Forward (FF) planner. We analyze the performance of standalone learned heuristics based on nature-inspired machine learning techniques and compare them to the standard FF heuristic and other heuristic learning approaches. In the conducted experiments, artificial neural networks and support vector machines were able to produce standalone heuristics of superior accuracy. Moreover, the resulting heuristics are computationally much cheaper to evaluate than related ones.
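Illustration (not the paper's implementation): a minimal sketch of the general idea of learning a heuristic function by regression. It assumes hypothetical training data in which each state is encoded as a binary feature vector and labelled with the cost of a previously found plan; the trained regressor is then queried during search.

```python
# Minimal sketch, assuming hypothetical training data: fit a neural-network
# regressor that maps propositional state features to cost-to-goal values
# collected from previously solved problems, then use it as a heuristic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical data: each row is a binary encoding of a state, each target is
# the cost of the plan that solved the problem from that state.
states = rng.integers(0, 2, size=(500, 40))      # 40 boolean facts
costs = states.sum(axis=1).astype(float)         # stand-in for true plan costs

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(states, costs)

def learned_heuristic(state_features):
    """Heuristic estimate h(s) for use in a forward search such as greedy best-first."""
    return float(model.predict(np.asarray(state_features).reshape(1, -1))[0])

print(learned_heuristic(states[0]))
```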

2.
Functional decomposition is the process of splitting a complex circuit into smaller sub-circuits. There are two major strategies in decomposition, namely serial and parallel decomposition. In serial decomposition, the complex function, represented as a truth table over its support-set variables, is partitioned into free-set and bound-set variables. The minterms corresponding to the bound-set variables are represented as an equivalent function called the predecessor function. Equivalent minterms of the bound-set variables are assigned an output code. The assigned output codes and the free-set variable minterms are represented as the successor function. Serial decomposition is further categorized into disjoint and non-disjoint decomposition, depending on whether the free-set and bound-set variables are disjoint or not. This paper deals with the problem of determining the best free and bound variables (the variable partitioning problem) for disjoint serial decomposition. Variable partitioning is the first step in the decomposition process. An efficient variable partitioning algorithm is one that determines the set of all free- and bound-set variables satisfying the decomposition theorem in minimal time while exploring the search space effectively. This allows the decomposition algorithm to determine the best variable partition of a function, resulting in smaller decomposed functions with the maximum number of don't-cares. Classical approaches to determining the best free and bound sets use exhaustive search, whose time and memory requirements are exponential or super-exponential. A novel heuristic search approach is proposed to determine a set of good variable partitions in minimal time by minimally exploring the search space. Two heuristics are employed in the proposed search approach: (1) an r-admissibility-based heuristic, the pruned breadth-first search (PBFS) approach, and (2) an information-relation-based heuristic, the improved pruned breadth-first search (IPBFS) approach. The r-admissibility-based heuristic is based on the r-partition characteristics of the free- and bound-set variables. The information-relation-and-measure-based heuristic is based on information relationships between free- and bound-set variables expressed as r-partition heuristics. The proposed variable partition search approach has been implemented and tested with MCNC and Espresso benchmarks; the results indicate that its time complexity is comparable to the r-admissible heuristic algorithm and that its solution quality is comparable to an exact variable partitioning algorithm. A comparison of the PBFS and IPBFS heuristics on selected benchmarks is also discussed in this paper.
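Illustration (not the paper's PBFS/IPBFS heuristics): a minimal, exhaustive sketch of the test that variable partitioning serves. For a candidate bound set, the decomposition chart is formed with one column per bound-set assignment and one row per free-set assignment; the number of distinct columns (column multiplicity) determines how many output-code bits the predecessor function needs, so partitions with low multiplicity are the good ones.

```python
from itertools import product, combinations

def column_multiplicity(truth_table, n_vars, bound_set):
    """Number of distinct columns of the decomposition chart for a bound set.

    truth_table maps an input tuple (b0, ..., b_{n-1}) to 0/1.
    Columns are indexed by bound-set assignments, rows by free-set assignments.
    """
    free_set = [v for v in range(n_vars) if v not in bound_set]
    columns = set()
    for bound_assign in product((0, 1), repeat=len(bound_set)):
        col = []
        for free_assign in product((0, 1), repeat=len(free_set)):
            inp = [0] * n_vars
            for v, b in zip(bound_set, bound_assign):
                inp[v] = b
            for v, b in zip(free_set, free_assign):
                inp[v] = b
            col.append(truth_table[tuple(inp)])
        columns.add(tuple(col))
    return len(columns)

# Example: f = (x0 XOR x1) AND x2. The bound set {0, 1} yields only two
# distinct columns, so a disjoint serial decomposition with a 1-bit code exists.
n = 3
tt = {bits: (bits[0] ^ bits[1]) & bits[2] for bits in product((0, 1), repeat=n)}
for bound in combinations(range(n), 2):
    print(bound, column_multiplicity(tt, n, list(bound)))
```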

3.
Planning with preferences involves not only finding a plan that achieves the goal, but finding a preferred plan that achieves the goal, where preferences over plans are specified as part of the planner's input. In this paper we provide a technique for accomplishing this objective. Our technique can deal with a rich class of preferences, including so-called temporally extended preferences (TEPs). Unlike simple preferences, which express desired properties of the final state achieved by a plan, TEPs can express desired properties of the entire sequence of states traversed by a plan, allowing the user to express a much richer set of preferences. Our technique involves converting a planning problem with TEPs into an equivalent planning problem containing only simple preferences. This conversion is accomplished by augmenting the input planning domain with a new set of predicates and actions for updating these predicates. We then provide a collection of new heuristics and a specialized search algorithm that can guide the planner towards preferred plans. Under some fairly general conditions our method is able to find a most preferred plan, i.e., an optimal plan. It can accomplish this without having to resort to admissible heuristics, which often perform poorly in practice. Nor does our technique require an assumption of restricted plan length or makespan. We have implemented our approach in the HPlan-P planning system and used it to compete in the 5th International Planning Competition, where it achieved distinguished performance in the Qualitative Preferences track.

4.
In the field of heuristic search it is usually assumed that admissible heuristics are consistent, implying that consistency is a desirable attribute. The term “inconsistent heuristic” has, at times, been portrayed negatively, as something to be avoided. Part of this is historical: early research discovered that inconsistency can lead to poor performance for A* (nodes might be re-expanded many times). However, the issue has never been fully investigated, and was not re-considered after the invention of IDA*. This paper shows that many of the preconceived notions about inconsistent heuristics are outdated. The worst-case exponential time of inconsistent heuristics is shown to occur only on contrived graphs with edge weights that are exponential in the size of the graph. Furthermore, the paper shows that rather than being something to be avoided, inconsistent heuristics often add a diversity of heuristic values into a search, which can lead to a reduction in the number of node expansions. Inconsistent heuristics are easy to create, contrary to the common perception in the AI literature. To demonstrate this, a number of methods for achieving effective inconsistent heuristics are presented. Pathmax is a way of propagating inconsistent heuristic values in the search from parent to children. This technique is generalized into bidirectional pathmax (BPMX), which propagates values from a parent to a child node and vice versa. BPMX can be integrated into IDA* and A*. When inconsistent heuristics are used with BPMX, experimental results show a large reduction in the search effort required by IDA*. Positive results are also presented for A* searches.
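Illustration: a minimal sketch of the BPMX value propagation during a single node expansion, under the usual assumption of an admissible (possibly inconsistent) heuristic and non-negative edge costs. It is not the paper's full IDA*/A* integration, only the update rules.

```python
# Minimal sketch of bidirectional pathmax (BPMX) at one expansion.
# With an inconsistent but admissible heuristic, values can be raised along
# edges in both directions:
#   h(child)  >= h(parent) - cost(parent, child)
#   h(parent) >= max over children of  h(child) - cost(parent, child)
def expand_with_bpmx(parent_h, children):
    """children: list of (raw_h, edge_cost). Returns (updated_parent_h, child_hs)."""
    # forward pathmax: push the parent's value down to each child
    child_hs = [max(h, parent_h - c) for h, c in children]
    # backward pathmax: lift the parent using the best child value
    parent_h = max([parent_h] + [h - c for h, (_, c) in zip(child_hs, children)])
    # one more forward pass so all children benefit from the lifted parent
    child_hs = [max(h, parent_h - c) for h, (_, c) in zip(child_hs, children)]
    return parent_h, child_hs

# Example with unit edge costs and an inconsistent heuristic:
print(expand_with_bpmx(2, [(6, 1), (0, 1)]))   # -> (5, [6, 4])
```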

5.
6.
Some of the current best conformant probabilistic planners focus on finding a fixed-length plan with maximal probability. While these approaches can find optimal solutions, they often do not scale to large problems or long plan lengths. As has been shown in classical planning, heuristic search outperforms bounded-length search (especially when an appropriate plan length is not given a priori). The problem with applying heuristic search in probabilistic planning is that effective heuristics are as yet lacking. In this work, we apply heuristic search to conformant probabilistic planning by adapting planning graph heuristics developed for non-deterministic planning. We evaluate a straightforward application of these planning graph techniques, which amounts to exactly computing a distribution over many relaxed planning graphs (one planning graph for each joint outcome of uncertain actions at each time step). Computing this distribution is costly, so we apply Sequential Monte Carlo (SMC) to approximate it. One important issue that we explore in this work is how to automatically determine the number of samples required for effective heuristic computation. We empirically demonstrate on several domains how our efficient, but sometimes suboptimal, approach enables our planner to solve much larger problems than an existing optimal bounded-length probabilistic planner while still finding solutions of reasonable quality.

7.
A case-based approach to heuristic planning
Much of the success of heuristic search as an approach to AI planning is due to the careful design of domain-independent heuristics. Although many heuristic planners perform reasonably well, the computational cost of computing the heuristic function in every search node is very high, causing the planner to scale poorly as the size of the planning task increases. To tackle this problem, planners can incorporate additional domain-dependent heuristics in order to improve their performance. Learning-based planners try to acquire these domain-dependent heuristics automatically from previously solved problems. In this work, we present a case-based reasoning approach that learns abstracted state transitions which serve as domain control knowledge for improving the planning process. The recommendations from the retrieved cases are used as guidance for pruning or ordering nodes in different heuristic search algorithms applied to planning tasks. We show that the CBR guidance is appropriate for a considerable number of planning benchmarks.

8.
The dynamic space allocation problem (DSAP) presented in this paper considers the task of assigning items (resources) to locations during a multi-period planning horizon such that the cost of rearranging the items is minimized. Three tabu search heuristics are presented for this problem. The first is a simple basic tabu search heuristic. The second adds diversification and intensification strategies to the first, and the third is a probabilistic tabu search heuristic. To test the performance of the heuristics, a set of test problems from the literature is used in the analysis. The results show that the tabu search heuristics are efficient techniques for solving the DSAP. More importantly, the proposed tabu search heuristic with diversification/intensification strategies found new best solutions, using less computation time, for one-half of all the test problems.
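Illustration (not the paper's DSAP-specific heuristics): a minimal sketch of the basic tabu search loop that the first heuristic builds on, applied to a toy permutation problem. The neighbourhood, tabu tenure, and aspiration criterion are the generic textbook choices.

```python
def tabu_search(init, neighbours, cost, tabu_tenure=7, iters=200):
    """Basic tabu search: move to the best non-tabu neighbour each iteration."""
    current, best = init, init
    tabu = {}                           # move -> iteration until which it is tabu
    for it in range(iters):
        candidates = [(cost(sol), move, sol)
                      for move, sol in neighbours(current)
                      if tabu.get(move, -1) < it or cost(sol) < cost(best)]  # aspiration
        if not candidates:
            break
        c, move, current = min(candidates, key=lambda t: t[0])
        tabu[move] = it + tabu_tenure   # forbid reversing this move for a while
        if c < cost(best):
            best = current
    return best

# Toy example: minimise the number of items placed in the "wrong" location.
target = [0, 1, 2, 3, 4]
def cost(perm): return sum(a != b for a, b in zip(perm, target))
def neighbours(perm):
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            q = list(perm); q[i], q[j] = q[j], q[i]
            yield (i, j), q

print(tabu_search(list(range(4, -1, -1)), neighbours, cost))
```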

9.
Utility Functions for Ceteris Paribus Preferences
Ceteris paribus (all-else-equal) preference statements concisely represent preferences over outcomes or goals in a way natural to human thinking. Although deduction in a logic of such statements can compare the desirability of specific conditions or goals, many decision-making methods require numerical measures of degrees of desirability. To permit ceteris paribus specifications of preferences while providing quantitative comparisons, we present an algorithm that compiles a set of qualitative ceteris paribus preferences into an ordinal utility function. Our algorithm is complete for a finite universe of binary features. Constructing the utility function can, in the worst case, take time exponential in the number of features, but common independence conditions reduce the computational burden. We present heuristics using utility independence and constraint-based search to obtain efficient utility functions.
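Illustration (a drastic simplification of the paper's algorithm): in the easiest case, where every ceteris paribus statement prefers one value of a single binary feature, compiling the statements into an ordinal utility amounts to counting how many preferred values an outcome satisfies. The feature names below are hypothetical.

```python
# Minimal sketch for the simplest case: each statement says "prefer feature f
# to take value v, all else being equal". For single-feature statements over
# binary features, counting satisfied preferences yields an ordinal utility.
preferences = {"heated": 1, "noisy": 0, "windowed": 1}   # hypothetical features

def ordinal_utility(outcome):
    """outcome: dict feature -> 0/1. Larger value means more preferred."""
    return sum(1 for f, v in preferences.items() if outcome.get(f) == v)

a = {"heated": 1, "noisy": 0, "windowed": 0}
b = {"heated": 0, "noisy": 0, "windowed": 0}
print(ordinal_utility(a) > ordinal_utility(b))   # a is preferred to b -> True
```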

10.
Dealing with numerical information is practically important in many real-world planning domains, where the executability of an action can depend on certain numerical conditions and the action effects can consume or renew critical continuous resources, which in PDDL can be represented by numerical fluents. When a planning problem involves numerical fluents, the quality of the solutions can be expressed by an objective function that takes different plan quality criteria into account. We propose an incremental approach to automated planning with numerical fluents and multi-criteria objective functions for PDDL numerical planning problems. The techniques in this paper significantly extend the framework of planning with action graphs and local search implemented in the LPG planner. We define the numerical action graph (NA-graph) representation for numerical plans and we propose several new local search techniques using this representation, including a heuristic search neighborhood for NA-graphs, a heuristic evaluation function based on relaxed numerical plans, and an incremental method for plan quality optimization based on particular search restarts. Moreover, we analyze our approach through an extensive experimental study aimed at evaluating the importance of specific techniques for the performance of the approach, and at analyzing its effectiveness in terms of fast computation of a valid plan and the quality of the best plan that can be generated within a given CPU-time limit. Overall, the results show that our planner performs quite well compared to other state-of-the-art planners handling numerical fluents.

11.
In this paper we present heuristic scheduling algorithms for the National Bureau of Standards/Aspen Inc. Pipelined Image-Processing Engine (PIPE). PIPE is a special-purpose machine for low-level image processing consisting of a linearly connected array of processing stages. A program is specified as a directed acyclic graph (DAG). Our first algorithm schedules planar DAGs. It works top-down through the graph and uses a greedy approach to schedule operations on a stage. It uses several heuristics to control the movement of images between stages. The worst-case length of the schedule generated by the algorithm is O(N) times that of the optimal schedule, where N is the maximum width of the graph. We generalize this algorithm to work on non-planar graphs, using heuristics for repositioning images on the stages of PIPE. The worst-case bound for the more general algorithm is also O(N) times the optimal schedule. Finally, we analyze the problem of optimizing throughput and latency for a sequence of DAGs on PIPE.

12.
This paper presents a new class of heuristics that embed an exact algorithm within the framework of a local search heuristic. This approach was inspired by related heuristics which we developed for a practical problem arising in electronics manufacture. The basic idea of the heuristic is to break the original problem into small subproblems having properties similar to the original problem. These subproblems are then solved using time-intensive heuristic approaches or exact algorithms, and the solution is re-embedded into the original problem. The electronics manufacturing problem where we originally used the embedded local search approach contains the Travelling Salesman Problem (TSP) as a major subproblem. In this paper we further develop our embedded search heuristic, HyperOpt, and investigate its performance for the TSP in comparison to other local-search-based approaches. We introduce an interesting hybrid of HyperOpt and 3-opt for asymmetric TSPs which proves more efficient than HyperOpt or 3-opt alone. Since pure local search seldom yields solutions of high quality, we also investigate the performance of the approaches in an iterated local search framework. We examine iterated approaches of the Large-Step Markov Chain and Variable Neighbourhood Search type and investigate their performance when used in combination with HyperOpt. We report extensive computational results for asymmetric and Euclidean Travelling Salesman Problems. While for the symmetric TSP our approaches yield solutions of quality comparable to the 2-opt heuristic, the hybrid methods proposed for asymmetric problems appear to compensate for the time-intensive embedded heuristic by finding tours of better average quality than iterated 3-opt in many fewer iterations, and they provide the best known heuristic solutions for some instance classes.
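Illustration (not HyperOpt itself): a minimal sketch of the plain 2-opt local search used as a baseline in this line of work, on random Euclidean points. It repeatedly reverses tour segments while the tour length improves.

```python
import math, random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    """Plain 2-opt: keep reversing segments while the tour gets shorter."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, pts) < tour_length(tour, pts) - 1e-12:
                    tour, improved = cand, True
    return tour

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(30)]
tour = list(range(len(pts)))
print(round(tour_length(tour, pts), 3), "->",
      round(tour_length(two_opt(tour, pts), pts), 3))
```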

13.
We address the two-stage assembly scheduling problem, where there are m machines at the first stage and an assembly machine at the second stage. The objective is to schedule the available n jobs so that the total completion time of all n jobs is minimized. Setup times are treated as separate from processing times. The problem is NP-hard, and therefore we present a dominance relation and propose three heuristics. The heuristics are evaluated on randomly generated data. One of the proposed heuristics is known to be the best heuristic for the case of zero setup times, while another heuristic is known to perform well for such problems. A new version of the latter heuristic, which utilizes the dominance relation, is proposed and shown to perform much better than the other two heuristics.
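Illustration (not the paper's dominance-based heuristic): a minimal sketch of how a job sequence is evaluated in this two-stage assembly model, assuming the assembly setup is anticipatory (it may be performed before the first-stage parts arrive), together with a simple sort-based sequencing rule used only as a stand-in.

```python
def total_completion_time(seq, p, a_setup, a_proc):
    """Evaluate a job sequence for the two-stage assembly problem.

    p[j][k]   : processing time of job j on first-stage machine k
    a_setup[j]: (anticipatory) setup time of job j on the assembly machine
    a_proc[j] : assembly time of job j
    All first-stage machines process jobs in the same order as the assembly machine.
    """
    m = len(p[0])
    stage1_free = [0.0] * m            # when each first-stage machine becomes free
    asm_free, total = 0.0, 0.0
    for j in seq:
        parts_ready = 0.0
        for k in range(m):
            stage1_free[k] += p[j][k]
            parts_ready = max(parts_ready, stage1_free[k])
        start = max(asm_free + a_setup[j], parts_ready)   # setup may precede parts
        asm_free = start + a_proc[j]
        total += asm_free              # completion time of job j
    return total

# Hypothetical instance; jobs ordered by total work as a simple stand-in rule.
p       = [[3, 2], [1, 4], [2, 2]]
a_setup = [1, 2, 1]
a_proc  = [2, 1, 3]
order = sorted(range(3), key=lambda j: max(p[j]) + a_setup[j] + a_proc[j])
print(order, total_completion_time(order, p, a_setup, a_proc))
```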

14.
In this study, we consider a tactical problem where a time slot schedule for delivery service over a given planning horizon must be selected in each zone of a geographical area. A heuristic search evaluates each schedule selection by constructing a corresponding tactical routing plan of minimum cost based on demand and service time estimates. At the end, the schedule selection leading to the best tactical routing plan is chosen. The latter can then be used as a blueprint when addressing the operational problem (i.e., when real customer orders are received and operational routes are constructed). We propose two heuristics to address the tactical problem. The first heuristic is a three-phase approach: a periodic vehicle routing problem (PVRP) is solved first, followed by a repair phase and a final improvement phase in which a vehicle routing problem (VRP) with time windows is solved for each period of the planning horizon. The second heuristic tackles the problem as a whole by directly solving a PVRP with time windows. Computational results compare the two heuristics under various settings, based on instances derived from benchmark instances for the VRP with time windows.

15.
Directed model checking is a well-established approach for detecting error states in concurrent systems. A popular variant for finding shortest error traces is to apply the A* search algorithm with distance heuristics that never overestimate the real error distance. An important class of such distance heuristics is the class of pattern database heuristics. Pattern database heuristics are built on abstractions of the system under consideration. In this paper, we propose downward pattern refinement, a systematic approach for the construction of pattern database heuristics for concurrent systems of timed automata. First, we propose a general framework for pattern databases in the context of timed automata and show that desirable theoretical properties hold for the resulting pattern database. Afterward, we formally define a concept for measuring the accuracy of abstractions. Based on this concept, we propose an algorithm for computing succinct abstractions that are still accurate enough to produce informed pattern databases. We evaluate our approach on large and complex industrial problems. The experiments show the practical potential of the resulting pattern database heuristic.
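Illustration (not the paper's timed-automata construction): a minimal sketch of the generic pattern database idea on a toy abstract state space: exhaustively search the abstract space backwards from the abstract goal, store the exact abstract distances, and look them up as admissible estimates for concrete states that map to them.

```python
from collections import deque

def build_pdb(abstract_goal, predecessors):
    """Exact distance-to-goal for every abstract state, by backward BFS
    (unit edge costs). predecessors(s) yields abstract states with an edge into s."""
    dist = {abstract_goal: 0}
    queue = deque([abstract_goal])
    while queue:
        s = queue.popleft()
        for t in predecessors(s):
            if t not in dist:
                dist[t] = dist[s] + 1
                queue.append(t)
    return dist

# Toy abstraction: abstract states are the integers 0..7, an abstract action
# adds or subtracts 1; the abstract goal is 0.
def preds(s):
    return [t for t in (s - 1, s + 1) if 0 <= t <= 7]

pdb = build_pdb(0, preds)

def h(concrete_state, abstraction=lambda s: s % 8):
    """Heuristic: look up the abstract distance of the mapped state."""
    return pdb[abstraction(concrete_state)]

print(h(13))   # concrete state 13 maps to abstract state 5, at distance 5
```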

16.
Additive ensembles of admissible heuristics constitute the most general form of exploiting the individual strengths of numerous admissible heuristics in optimal planning. However, the same set of heuristics can be additively composed in infinitely many ways, and the quality of the resulting heuristic estimate depends directly on the choice of composition. Focusing on abstraction heuristics, we describe a procedure that takes a deterministic planning problem, a forward-search state, and a set of abstraction-based admissible heuristics, and derives an optimal additive composition of these heuristics with respect to the given state. Most importantly, we show that this procedure is polynomial-time for arbitrary sets of all abstraction heuristics with which we are acquainted, including explicit abstractions such as pattern databases (regular or constrained) and merge-and-shrink, and implicit abstractions such as fork-decomposition and abstractions based on tractable constraint optimization over tree-shaped constraint networks.

17.
Graphplan-style planning can be formulated as an incremental propositional CSP in which the (Boolean) variables correspond to operator instantiations (actions) that are or are not scheduled at certain time steps. In this paper we present a framework for solving this class of propositional CSPs using local search in planning graphs. The search space consists of particular subgraphs of a planning graph corresponding to (complete) variable assignments, and representing partial plans. The operators for moving from one search state to the next are graph modifications corresponding to revisions of the current variable assignment (partial plan), or to an extension of the represented CSP. Our techniques are implemented in a planner called LPG using various types of heuristics based on a parametrized objective function, where the parameters weight different constraint violations and are dynamically evaluated using Lagrange multipliers. LPG's basic heuristic was inspired by Walksat, which in Kautz and Selman's Blackbox can be used to solve the SAT encoding of a planning graph. An advantage of LPG is that its heuristics exploit the structure of the planning graph, while Blackbox relies on general heuristics for SAT problems and requires the translation of the planning graph into propositional clauses. Another major difference is that LPG can handle action execution costs to produce good quality plans. This is achieved by an anytime process minimizing an objective function based on the number of constraint violations in a plan and on its overall cost. Experimental results illustrate the efficiency of our approach, showing, in particular, that LPG is significantly faster than Blackbox and other planners based on planning graphs.

18.
In this article, we formulate and study quantum analogues of randomized search heuristics, which make use of Grover search (Proceedings of the 28th Annual ACM Symposium on Theory of Computing, pp. 212–219, ACM, New York, 1996) to accelerate the search for improved offspring. We then specialize this formulation to two specific search heuristics: Random Local Search and the (1+1) Evolutionary Algorithm. We call the resulting quantum versions of these search heuristics Quantum Local Search and the (1+1) Quantum Evolutionary Algorithm. We conduct a rigorous runtime analysis of these quantum search heuristics in the computation model of quantum algorithms, which, besides classical computation steps, also permits those unique to quantum computing devices. To this end, we study six elementary pseudo-Boolean optimization problems: OneMax, LeadingOnes, Discrepancy, Needle, Jump, and TinyTrap. It turns out that the advantage of the respective quantum search heuristic over its classical counterpart varies with the problem structure and ranges from no speedup at all for the problem Discrepancy to exponential speedup for the problem TinyTrap. We show that these runtime behaviors are closely linked to the probabilities of performing successful mutations in the classical algorithms.
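Illustration (classical counterpart only, no quantum part): a minimal sketch of the standard (1+1) Evolutionary Algorithm on OneMax, the baseline that the quantum variants accelerate. Each iteration flips every bit independently with probability 1/n and keeps the offspring if it is at least as fit.

```python
import random

def one_plus_one_ea(n, fitness, max_iters=100_000, seed=0):
    """Classical (1+1) EA: flip each bit with probability 1/n, keep the
    offspring if it is at least as fit as the parent."""
    random.seed(seed)
    x = [random.randint(0, 1) for _ in range(n)]
    for it in range(max_iters):
        if fitness(x) == n:                        # optimum reached (for OneMax)
            return it
        y = [b ^ (random.random() < 1 / n) for b in x]
        if fitness(y) >= fitness(x):
            x = y
    return max_iters

onemax = sum    # OneMax fitness: number of ones in the bit string
print(one_plus_one_ea(64, onemax))   # typically on the order of n * ln(n) steps
```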

19.
Planning graphs have been shown to be a rich source of heuristic information for many kinds of planners. In many cases, planners must compute a planning graph for each element of a set of states, and the naive technique enumerates the graphs individually. This is equivalent to solving a multiple-source shortest path problem by iterating a single-source algorithm over each source. We introduce a data structure, the state agnostic planning graph, that directly solves the multiple-source problem for the relaxation introduced by planning graphs. The technique can also be characterized as exploiting the overlap present in sets of planning graphs. For the purpose of exposition, we first present the technique in deterministic (classical) planning to capture a set of planning graphs used in forward chaining search. A more prominent application of this technique is in conformant and conditional planning (i.e., search in belief state space), where each search node utilizes a set of planning graphs; an optimization that exploits state overlap between belief states collapses the set of sets of planning graphs to a single set. We describe another extension in conformant probabilistic planning that reuses planning graph samples of probabilistic action outcomes across search nodes to curb the inherent prediction cost associated with handling probabilistic actions. Finally, we show how to extract a state agnostic relaxed plan that implicitly solves the relaxed planning problem in each of the planning graphs represented by the state agnostic planning graph and reduces each heuristic evaluation to counting the relevant actions in the state agnostic relaxed plan. Our experimental evaluation (using many existing International Planning Competition problems from classical and non-deterministic conformant tracks) quantifies each of these performance boosts, and demonstrates that heuristic belief-state-space progression planning using our technique is competitive with the state of the art.

20.
In this paper, a hybrid meta-heuristic is proposed which combines GRASP with path relinking and column generation. The key idea of this method is to run a GRASP with path relinking search on a restricted search space, defined by column generation, instead of running the search on the complete search space of the problem. Moreover, column generation is used not only to compute the initial restricted search space but also to modify it throughout the algorithm. The proposed heuristic is used to solve the network load balancing problem: given a capacitated telecommunications network with single-path routing and an estimated traffic demand matrix, the network load balancing problem is to determine a routing path for each traffic commodity such that load balancing is optimized, i.e., the worst link load is minimized; among all such solutions, the second-worst link load is minimized; and so on, until all link loads are minimized. The computational results presented in this paper show that, for the network load balancing problem, the proposed heuristic is effective in obtaining better-quality solutions in shorter running times.
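Illustration (not the paper's algorithm, and without column generation): a minimal sketch of a GRASP loop with a simple path-relinking step, on a toy min-max load assignment problem that stands in for load balancing. The instance and all parameter choices are hypothetical.

```python
import random

def worst_load(assign, demands, k):
    loads = [0.0] * k
    for i, link in enumerate(assign):
        loads[link] += demands[i]
    return max(loads)

def local_search(assign, demands, k):
    """Move one commodity at a time while the worst link load improves."""
    improved = True
    while improved:
        improved = False
        for i in range(len(assign)):
            for link in range(k):
                cand = assign[:]
                cand[i] = link
                if worst_load(cand, demands, k) < worst_load(assign, demands, k):
                    assign, improved = cand, True
    return assign

def grasp_pr(demands, k, iters=30, rcl_size=2, seed=0):
    """GRASP: greedy randomized construction + local search, with a simple
    path-relinking step toward the incumbent solution."""
    random.seed(seed)
    best = None
    order = sorted(range(len(demands)), key=lambda i: -demands[i])
    for _ in range(iters):
        # construction: place each demand on one of the rcl_size least-loaded links
        loads, assign = [0.0] * k, [0] * len(demands)
        for i in order:
            link = random.choice(sorted(range(k), key=lambda l: loads[l])[:rcl_size])
            assign[i] = link
            loads[link] += demands[i]
        assign = local_search(assign, demands, k)
        # path relinking: walk from the new solution toward the incumbent
        if best is not None:
            walk = assign[:]
            for i in range(len(walk)):
                if walk[i] != best[i]:
                    walk[i] = best[i]
                    if worst_load(walk, demands, k) < worst_load(assign, demands, k):
                        assign = walk[:]
        if best is None or worst_load(assign, demands, k) < worst_load(best, demands, k):
            best = assign[:]
    return best

demands = [7, 5, 4, 4, 3, 2, 2, 1]
best = grasp_pr(demands, k=3)
print(best, worst_load(best, demands, 3))
```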
