Similar Articles
Found 20 similar articles (search time: 15 ms)
1.
We introduce the automatic recording constraint (ARC), which can be used to model and solve scheduling problems where tasks may not overlap in time and linearly exhaust some resource. Since achieving generalized arc consistency for the ARC is NP-hard, we develop a filtering algorithm that achieves only approximated consistency. Numerical results show the benefits of the new constraint on three out of four types of benchmark sets for the automatic recording problem. On these instances, run-times are orders of magnitude better than those of the best previous constraint programming approach.

2.
Imen Zghidi. Constraints 22(1):101–102, 2017
In most industrial contexts, decisions are made based on incomplete information, because decision makers cannot be certain of the future behavior of factors that will affect the outcome of the options under consideration. Stochastic constraint satisfaction problems provide a powerful modeling framework for problems in which decisions must be taken under uncertainty. In these stochastic problems, the uncertainty is modeled with discrete random variables that capture uncontrollable factors such as customer demands, machine processing times, house prices, etc. Each random variable takes on a set of possible values, each with an associated probability; the decision maker knows only the probability distribution of these variables, which can be forecast, for instance, from the past behavior of such factors. There are also controllable variables, named decision variables, which model the set of possible choices for the decisions to be made. Finally, such problems comprise chance constraints, which express relationships between random and decision variables that should be satisfied with at least a given probability threshold, since finding decisions that always satisfy the constraints in an uncertain environment is almost impossible.

If the random variables' support set is infinite, the number of scenarios is infinite, and finding a solution in such cases is in general impossible. In this thesis, within the context of an infinite set of scenarios, we propose a novel notion of statistical consistency, which lifts the notion of consistency of deterministic constraints to infinite chance constraints.
The essence of this notion is the ability to make an inference, in the presence of infinitely many scenarios in an uncertain environment, from a restricted finite subset of scenarios with a given confidence level and threshold error. The confidence level is the probability that the inference drawn from the subset of scenarios is correct, whereas the threshold error is the error range tolerated while making the inference. Statistical consistency acknowledges that a perfect inference over an infinite number of scenarios in an uncertain environment is impossible; with its reliance on a limited number of scenarios, a confidence level, and a threshold error, it constitutes a valid and practical route for tackling infinite chance constraints.

We design two novel approaches based on confidence intervals to enforce statistical consistency, as well as a third approach based on hypothesis testing. We analyze the three methods theoretically and experimentally. Our empirical evaluation shows the weaknesses and strengths of each method in making a correct inference from a restricted subset of scenarios. Overall, while the first two methods make a correct inference in most cases, the third is superior, effective, and robust in all cases.
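A confidence-interval approach of the kind the thesis describes can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithm: it samples a finite number of scenarios, estimates the satisfaction probability of a chance constraint, and uses a normal-approximation confidence interval to decide whether the probability threshold is met; the example constraint and demand distribution are invented for demonstration.

```python
import math
import random

def statistically_consistent(constraint, sample_scenario, threshold,
                             n_samples=10000, confidence=0.95, seed=0):
    """Estimate P(constraint holds) from sampled scenarios and compare
    the confidence interval against the satisfaction threshold.
    Returns True/False when the interval is decisive, None otherwise."""
    rng = random.Random(seed)
    hits = sum(constraint(sample_scenario(rng)) for _ in range(n_samples))
    p_hat = hits / n_samples
    z = 1.96 if confidence == 0.95 else 2.576      # normal quantile
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n_samples)
    if p_hat - half_width >= threshold:
        return True        # confidently above the threshold
    if p_hat + half_width < threshold:
        return False       # confidently below the threshold
    return None            # interval straddles the threshold: undecided

# Hypothetical chance constraint: demand <= 120 must hold with
# probability >= 0.7, where demand ~ uniform over [80, 130]
# (the true satisfaction probability is 0.8).
result = statistically_consistent(
    constraint=lambda demand: demand <= 120,
    sample_scenario=lambda rng: rng.uniform(80, 130),
    threshold=0.7)
print(result)
```

With 10,000 samples the interval half-width is under 0.01, so the estimate of 0.8 sits clearly above the 0.7 threshold and the check is decisive.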

3.
Personnel rostering is a challenging combinatorial optimization problem in which shifts are assigned to employees over a scheduling period subject to organizational, legislative, and personal constraints. Academic models for personnel rostering typically abstract away complex real-world problem characteristics. Often only one isolated scheduling period is considered, contradicting common practice, where personnel rostering inherently spans multiple dependent periods. The state of the art offers no systematic approach to this modeling challenge, and consequently few models capture the requirements imposed by practice. The present paper introduces the concepts of local and global consistency in constraint evaluation processes and proposes a general methodology to address these challenges in integer programming approaches. The impact of inconsistent constraint evaluation is analyzed in a case study on rostering nurses in a hospital ward, whose data have been made publicly available. The results demonstrate that the proposed methodology approximates the optimal solution.

4.
Constraint satisfaction problems play a central role in artificial intelligence. A class of network consistency algorithms for eliminating local inconsistencies in such problems has previously been described. We analyze the time complexity of several node, arc and path consistency algorithms and prove that arc consistency is achievable in time linear in the number of binary constraints. The Waltz filtering algorithm is a special case of the arc consistency algorithm. In the edge labelling computational vision application the constraint graph is planar and so the time complexity is linear in the number of variables.
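The arc consistency procedure this abstract refers to is commonly presented in textbooks as AC-3. A minimal sketch for binary constraints follows; the variable names and the ordering constraints in the example are illustrative, not taken from the paper.

```python
from collections import deque

def ac3(domains, constraints):
    """Enforce arc consistency on a binary CSP.
    domains: {var: set(values)}; constraints: {(x, y): predicate(vx, vy)}.
    Returns False if some domain is wiped out, True otherwise."""
    # Consider each constraint's arc in both directions.
    arcs = {(x, y) for (x, y) in constraints}
    arcs |= {(y, x) for (x, y) in constraints}

    def check(x, y, vx, vy):
        if (x, y) in constraints:
            return constraints[(x, y)](vx, vy)
        return constraints[(y, x)](vy, vx)

    queue = deque(arcs)
    while queue:
        x, y = queue.popleft()
        # Remove values of x that have no support in y's domain.
        removed = {vx for vx in domains[x]
                   if not any(check(x, y, vx, vy) for vy in domains[y])}
        if removed:
            domains[x] -= removed
            if not domains[x]:
                return False
            # Revisit arcs pointing at x, since its domain shrank.
            queue.extend((z, x) for (z, w) in arcs if w == x and z != y)
    return True

doms = {'a': {1, 2, 3}, 'b': {1, 2, 3}, 'c': {1, 2, 3}}
cons = {('a', 'b'): lambda u, v: u < v, ('b', 'c'): lambda u, v: u < v}
ac3(doms, cons)
print(doms)   # domains shrink to a: {1}, b: {2}, c: {3}
```

With a < b and b < c over {1, 2, 3}, propagation alone pins every variable, illustrating how arc consistency prunes before any search.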

5.
6.
The AllDifferent constraint is a crucial component of any constraint toolkit, language or solver, since it is very widely used in a variety of constraint models. The literature contains many different versions of this constraint, which trade strength of inference against computational cost. In this paper, we focus on the highest strength of inference, enforcing a property known as generalised arc consistency (GAC). This work is an analytical survey of optimizations of the main algorithm for GAC for the AllDifferent constraint. We evaluate empirically a number of key techniques from the literature. We also report important implementation details of those techniques, which have often not been described in published papers. We pay particular attention to improving incrementality by exploiting the strongly-connected components discovered during the standard propagation process, since this has not been detailed before. Our empirical work represents by far the most extensive set of experiments on variants of GAC algorithms for AllDifferent. Overall, the best combination of optimizations gives a mean speedup of 168 times over the same implementation without the optimizations.
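The pruning condition that GAC enforces for AllDifferent can be shown with a deliberately naive sketch: keep a value for a variable only if some complete assignment of distinct values extends it. The efficient algorithm surveyed in the paper (Régin's, via maximum matching and strongly-connected components) computes the same result far faster; this brute-force version only illustrates what "generalised arc consistency" means for the constraint.

```python
def all_different_gac(domains):
    """Naive GAC filter for AllDifferent: keep value v for variable x only
    if some assignment with x = v gives every variable a distinct value.
    Exponential-time illustration of the pruning condition, not a
    practical propagator."""
    vars_ = list(domains)

    def extendable(assigned, used, i):
        # Can the variables from index i on take distinct unused values?
        if i == len(vars_):
            return True
        x = vars_[i]
        if x in assigned:
            return extendable(assigned, used, i + 1)
        return any(extendable(assigned, used | {v}, i + 1)
                   for v in domains[x] if v not in used)

    for x in vars_:
        domains[x] = {v for v in domains[x] if extendable({x}, {v}, 0)}
    return all(domains[x] for x in vars_)

doms = {'x': {1, 2}, 'y': {1, 2}, 'z': {1, 2, 3}}
all_different_gac(doms)
print(doms['z'])   # {3}: values 1 and 2 are pruned from z
```

Since x and y must between them consume the values 1 and 2, GAC deduces that z can only take 3, a deduction weaker consistency levels miss.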

7.
Convex quadratic fitting (CQF) has recently demonstrated great success in the task of non-rigidly registering a face in a still image using a constrained local model (CLM). A CLM is a commonly used model for non-rigid object registration and contains two components: (i) local patch-experts that model the appearance of each landmark in the object, and (ii) a global shape prior describing how each of these landmarks can vary non-rigidly. Conventional CLMs can be used in non-rigid facial tracking applications through a track-by-detection strategy. However, the registration performance of such a strategy is susceptible to local appearance ambiguity, and since there is no motion continuity constraint between neighboring frames of the same sequence, the resulting object alignment might not be consistent from frame to frame and the motion field is not temporally smooth. In this paper, we extend the CQF fitting method into the spatio-temporal domain by enforcing an appearance consistency constraint on each local patch between neighboring frames. More importantly, we show that, as in the original CQF formulation, the global warp update can be optimized jointly in an efficient manner. Finally, we demonstrate that our approach achieves improved performance for the task of non-rigid facial motion tracking on videos of clinical patients.

8.
We provide a reformulation of the constraint hierarchies (CHs) framework based on the notion of error indicators. Adapting the generalised view of local consistency in semiring-based constraint satisfaction problems, we define constraint hierarchy k-consistency (CH-k-C) and give a CH-2-C enforcement algorithm. We demonstrate how the CH-2-C algorithm can be seamlessly integrated into the ordinary branch-and-bound algorithm to make it a finite domain (FD) CH solver. Experimentation confirms the efficiency and robustness of our proposed solver prototype. Unlike other FD CH solvers, our proposed method works for both local and global comparators. In addition, our solver can support arbitrary error functions.

9.
Constraint satisfaction problems can be solved by network consistency algorithms that eliminate local inconsistencies before constructing global solutions. We describe a new algorithm that is useful when the variable domains can be structured hierarchically into recursive subsets with common properties and common relationships to subsets of the domain values for related variables. The algorithm, HAC, uses a technique known as hierarchical arc consistency. Its performance is analyzed theoretically and the conditions under which it is an improvement are outlined. The use of HAC in a program for understanding sketch maps, Mapsee3, is briefly discussed and experimental results consistent with the theory are reported.

10.
Inductive Logic Programming (ILP) deals with the problem of finding a hypothesis covering positive examples and excluding negative examples, where both hypotheses and examples are expressed in first-order logic. In this paper we employ constraint satisfaction techniques to model and solve a problem known as template ILP consistency, which assumes that the structure of a hypothesis is known and the task is to find a unification of the contained variables. In particular, we present a constraint model with index variables, accompanied by a Boolean model to strengthen inference and hence improve efficiency. The efficiency of the models is demonstrated experimentally.

11.
The AtMostSeqCard constraint is the conjunction of a cardinality constraint on a sequence of n variables and of n−q+1 AtMost u constraints, one on each subsequence of size q. This constraint is useful in car-sequencing and crew-rostering problems. In van Hoeve et al. (Constraints 14(2):273–292, 2009), two algorithms designed for the AmongSeq constraint were adapted to this constraint, with O(2^q·n) and O(n^3) worst-case time complexity, respectively. In Maher et al. (2008), another algorithm, similarly adaptable to filter the AtMostSeqCard constraint with O(n^2) time complexity, was proposed. In this paper, we introduce an algorithm for achieving arc consistency on the AtMostSeqCard constraint with O(n) (hence optimal) worst-case time complexity. Next, we show that this algorithm can be easily modified to achieve arc consistency on some extensions of this constraint. In particular, the conjunction of a set of m AtMostSeqCard constraints sharing the same scope can be filtered in O(nm). We then empirically study the efficiency of our propagator on instances of the car-sequencing and crew-rostering problems.
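The definition above (a cardinality constraint plus a sliding AtMost window) can be made concrete with a checker over 0/1 values. This is only a sketch of the constraint's semantics, with the cardinality read as "exactly d ones"; it is not the paper's linear-time filtering algorithm.

```python
def atmost_seq_card(values, q, u, d):
    """Check AtMostSeqCard on a 0/1 sequence: exactly d ones overall,
    and at most u ones in every window of q consecutive positions."""
    n = len(values)
    if sum(values) != d:          # cardinality part
        return False
    # n - q + 1 sliding AtMost-u constraints.
    return all(sum(values[i:i + q]) <= u for i in range(n - q + 1))

print(atmost_seq_card([1, 0, 1, 0, 1, 0], q=3, u=2, d=3))   # True
print(atmost_seq_card([1, 1, 1, 0, 0, 0], q=3, u=2, d=3))   # False
```

The second sequence fails because its first window of size 3 contains three ones, exceeding u = 2, even though the overall cardinality d = 3 is met.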

12.
To address the difficult problem of establishing physiological point correspondences between 3D facial skin models, and exploiting the advantage of geodesic distance in describing the surface shape of complex geometric bodies, a 3D facial-skin physiological point correspondence method based on deformation and geodesic-distance consistency constraints is proposed. First, the set of facial feature points is calibrated after the Frankfurt coordinate transformation, and TPS deformation is performed using the feature-point correspondences. Then, an initial point-correspondence set is built from the geometric feature vectors of the feature points and pruned with the geodesic-distance consistency constraint to generate a core correspondence set. Finally, the core set is expanded until a correspondence is determined for every vertex on the source model. Experiments show that the method improves point-correspondence accuracy and can effectively establish physiological point correspondences between 3D facial skins.

13.
Because neighboring hyperspectral pixels often belong to the same class with high probability, spatial correlation between pixels has been widely used in hyperspectral image classification. In this paper, a novel joint sparse representation classifier with a spectral consistency constraint (JSRC-SCC) is proposed. Specifically, to efficiently exploit contextual structure information, a locally adaptive weighted average value is reallocated as the central pixel of a window through spatial filtering, and the representation coefficients are then estimated by the joint sparse representation model, with the spectral consistency constraint imposed under l1-minimization. For fast classification, graphics processing units are adopted to accelerate this model. Experimental results on two classical hyperspectral image data sets demonstrate that the proposed method not only produces satisfying classification performance, but also shortens the computational time significantly.

14.
Unwittingly, computing centre personnel may encourage potential burglars by supplying tools which threaten system integrity. Because system support staff and burglars seem to have goals in common, the activities of the former could give the latter a distinct advantage in violating security barriers. Real-life examples will clarify the risks involved in customary computing centre practice. Aspects of an effective security policy will be presented.

15.
Concepts from the algebraic theory of finite automata are carried over to the program-over-monoid setting which underlies Barrington's algebraic characterization of the complexity class NC^1. Sets of languages accepted by polynomial-length programs over finite monoids drawn from a given monoid variety V emerge as fundamental language classes: as V ranges over monoid varieties, these classes capture and indeed refine the usual bounded-depth circuit parametrization of non-uniform NC^1 subclasses. Here it is shown that any two separable such language classes can be separated by a regular language. Basic properties of these language classes are exhibited. New conditions are given under which distinct varieties lead to equal or to distinct language classes, thus sharpening our knowledge of the internal structure of non-uniform NC^1. The paper concludes with the statement of a conjecture whose proof would refine and then resolve most open questions about this internal structure.

16.
Clark's attempt [1] to validate negation as failure in first order logic is shown to contain some fundamental errors. In particular, we show that the motivation for the completed database, the definition of the completed database, and the attempt to validate negation as failure in terms of it are illogical, that the completed database cannot be regarded as the intended meaning of the database, and that the closed world assumption is generally absurd and, in any case, irrelevant. A validation is given using a consistent first order extension of the database and hence in the only terms which appear to make any sense, namely, consistency with the database. However, it seems that the query evaluation process, with negation interpreted as failure, is of no practical use as a theorem prover.

17.
Over the past few years there has been considerable progress in methods to systematically analyse the complexity of constraint satisfaction problems with specified constraint types. One very powerful theoretical development in this area links the complexity of a set of constraints to a corresponding set of algebraic operations, known as polymorphisms.

In this paper we extend the analysis of complexity to the more general framework of combinatorial optimisation problems expressed using various forms of soft constraints. We launch a systematic investigation of the complexity of these problems by extending the notion of a polymorphism to a more general algebraic operation, which we call a multimorphism. We show that many tractable sets of soft constraints, both established and novel, can be characterised by the presence of particular multimorphisms. We also show that a simple set of NP-hard constraints has very restricted multimorphisms. Finally, we use the notion of multimorphism to give a complete classification of complexity for the Boolean case, which extends several earlier classification results for particular special cases.

18.
Constraint diagrams are a diagrammatic notation which may be used to express logical constraints. They generalize Venn diagrams and Euler circles, and include syntax for quantification and navigation of relations. The notation was designed to complement the Unified Modelling Language in the development of software systems.

Since symbols representing quantification in a diagrammatic language can be naturally ordered in multiple ways, some constraint diagrams have more than one intuitive meaning in first-order predicate logic. Any equally expressive notation which is based on Euler diagrams and conveys logical statements using explicit quantification will have to address this problem.

We explicitly augment constraint diagrams with reading trees, which provide a partial ordering for the quantifiers (determining their scope as well as their relative ordering). Alternative approaches using spatial arrangements of components, or alphabetical ordering of symbols, for example, can be seen as implicit representations of a reading tree.

Whether the reading tree accompanies the diagram explicitly (optimizing expressiveness) or implicitly (simplifying diagram syntax), we show how to construct unambiguous semantics for the augmented constraint diagram.

19.
We show that generic viewpoint and lighting assumptions resolve standard visual ambiguities by biasing toward planar surfaces. Our model uses orthographic projection with a two-dimensional affine warp and Lambertian reflectance functions, including cast and attached shadows. We use uniform priors on nuisance variables such as viewpoint direction and the light source. Limitations of using uniform priors on nuisance variables are discussed.

20.
Consistency enforcement is used to prune the search tree of a constraint satisfaction problem (CSP). Arc consistency (AC) is a well-studied consistency level with many implementations. Bounds consistency (BC), a looser consistency level, is known to have time complexity equal to that of AC. To solve a CSP, we either implement an algorithm of our own or employ an existing solver; in any case, at some point we must decide between enforcing AC or BC. As this choice is more or less predefined and currently made without considering the individualities of each CSP, this study attempts to make the decision deterministic and efficient, without trial and error. We find that BC fits better when solving a CSP whose maximum domain size is greater than its number of constrained variables. We study the behavior of maintaining arc or bounds consistency during search, and we show how the overall search method's complexity is affected by the employed consistency level.
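The selection rule stated in this abstract (prefer BC when the maximum domain size exceeds the number of constrained variables) can be written as a small helper. The function name is illustrative, and the rule is the heuristic reported in the abstract, not a guarantee for any particular instance.

```python
def choose_consistency(domains):
    """Pick a consistency level per the study's finding: bounds
    consistency (BC) when the maximum domain size exceeds the number
    of constrained variables, otherwise arc consistency (AC)."""
    max_domain = max(len(d) for d in domains.values())
    return 'BC' if max_domain > len(domains) else 'AC'

# Large domains, few variables -> BC; small domains -> AC.
print(choose_consistency({'x': range(100), 'y': range(50)}))        # BC
print(choose_consistency({'x': [1, 2], 'y': [1, 2], 'z': [1, 2]}))  # AC
```

The intuition is that with very large domains, reasoning only about bounds avoids scanning every value while losing little pruning power.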


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号