Weighted Max-SAT is the optimization version of SAT, and many important problems can be naturally encoded as such. Solving weighted Max-SAT is an important problem from both a theoretical and a practical point of view. In recent years, there has been considerable interest in finding efficient solving techniques. Most of this work focuses on the computation of good-quality lower bounds to be used within a branch-and-bound DPLL-like algorithm. Most often, these lower bounds are described in a procedural way, which makes it difficult to understand the logic behind them. In this paper we introduce an original framework for Max-SAT that stresses the parallelism with classical SAT. Then, we extend the two basic SAT solving techniques: search and inference. We show that many algorithmic tricks used in state-of-the-art Max-SAT solvers are easily expressible in logical terms in a unified manner using our framework. We also introduce an original search algorithm that performs a restricted amount of weighted resolution at each visited node. We empirically compare our algorithm with a variety of solving alternatives on several benchmarks. Our experiments, which constitute, to the best of our knowledge, the most comprehensive Max-SAT evaluation ever reported, demonstrate the practical usability of our approach.
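The branch-and-bound scheme this abstract refers to can be illustrated with a minimal sketch. This is not the authors' algorithm: it minimizes the total weight of falsified clauses and uses only the weakest possible lower bound (the weight of clauses already falsified by the partial assignment), whereas state-of-the-art solvers compute much stronger bounds, e.g. via weighted resolution.

```python
def maxsat_bb(clauses, weights, n_vars):
    """Branch-and-bound for weighted Max-SAT: minimize the total weight of
    falsified clauses. A clause is a list of non-zero ints, +v / -v denoting
    the positive / negative literal of variable v (variables are 1..n_vars)."""
    best = [sum(weights)]  # trivial upper bound: every clause falsified

    def falsified_weight(assign):
        # Weight of clauses whose literals are all assigned and all false.
        w = 0
        for cl, wt in zip(clauses, weights):
            if all(abs(l) in assign and assign[abs(l)] != (l > 0) for l in cl):
                w += wt
        return w

    def branch(assign, v):
        lb = falsified_weight(assign)  # valid lower bound for any extension
        if lb >= best[0]:
            return  # prune: cannot improve on the incumbent
        if v > n_vars:
            best[0] = lb  # complete assignment: update incumbent
            return
        for val in (True, False):
            assign[v] = val
            branch(assign, v + 1)
            del assign[v]

    branch({}, 1)
    return best[0]
```

For example, with the unit clauses x1 (weight 2) and ¬x1 (weight 3), any assignment falsifies exactly one of them, so the optimum cost is 2.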
The proposed European Artificial Intelligence Act (AIA) is the first attempt to elaborate a general legal framework for AI carried out by any major global economy. As such, the AIA is likely to become a point of reference in the larger discourse on how AI systems can (and should) be regulated. In this article, we describe and discuss the two primary enforcement mechanisms proposed in the AIA: the conformity assessments that providers of high-risk AI systems are expected to conduct, and the post-market monitoring plans that providers must establish to document the performance of high-risk AI systems throughout their lifetimes. We argue that the AIA can be interpreted as a proposal to establish a Europe-wide ecosystem for conducting AI auditing, albeit not in those exact terms. Our analysis offers two main contributions. First, by describing the enforcement mechanisms included in the AIA in terminology borrowed from the existing literature on AI auditing, we help providers of AI systems understand how they can prove adherence to the requirements set out in the AIA in practice. Second, by examining the AIA from an auditing perspective, we seek to provide transferable lessons from previous research about how to further refine the regulatory approach outlined in the AIA. We conclude by highlighting seven aspects of the AIA where amendments (or simply clarifications) would be helpful. These include, above all, the need to translate vague concepts into verifiable criteria and to strengthen the institutional safeguards concerning conformity assessments based on internal checks.
Applied Intelligence - Forecasting future heat load in smart district heating networks is a key problem for utility companies that need such predictions for optimizing their operational activities....
Coplanar Al/graphene/Al junctions fabricated on the same graphene sheet deposited on silicon carbide (SiC) show robust Josephson coupling at sub-Kelvin temperatures when the separation between the electrodes is below 400 nm. Remarkably, a hysteretic critical state sets in when ramping an orthogonal magnetic field, with a sudden collapse of the Josephson critical current Ic when turning the field on, and a revival of Ic when inverting the sweep. Similar hysteresis can be found in granular superconducting films, which may undergo the Berezinskii-Kosterlitz-Thouless transition. Here, we give quantitative arguments to show that this odd behavior of the magnetoconductance gives evidence for an incipient Berezinskii-Kosterlitz-Thouless transition with drift and pinning of fluctuating free vortices induced by the current bias.
One of the important obstacles in the image-based analysis of the human face is the 3D nature of the problem and the 2D nature of most imaging systems used for biometric applications. Because of this, accuracy is strongly influenced by the viewpoint of the images, with frontal views being the most thoroughly studied. However, when fully automatic face analysis systems are designed, capturing frontal-view images cannot be guaranteed. Examples of this situation can be found in surveillance systems, car-driver imaging, or whenever architectural constraints prevent placing a camera frontal to the subject. Taking advantage of the fact that most facial features lie approximately on the same plane, we propose the use of projective geometry across different views. An active shape model constructed with frontal-view images can then be directly applied to the segmentation of pictures taken from other viewpoints. The proposed extension proves significantly more invariant to viewpoint than the standard approach. Validation of the method is presented on 360 images from the AV@CAR database, systematically divided into three different rotations (to both sides), as well as upper and lower views due to nodding. The presented tests are among the largest quantitative results reported to date on face segmentation under varying poses.
Multimedia Tools and Applications - This paper describes a 63-participant user study that compares two widely known systems supporting end users in creating trigger-action rules for the Internet of... 相似文献
This paper describes an algorithm to enforce hyper-arc consistency of polynomial constraints defined over finite domains. First, the paper describes the language of so-called polynomial constraints over finite domains, and it introduces a canonical form for such constraints. Then, the canonical form is used to transform the problem of testing the satisfiability of a constraint in a box into the problem of studying the sign of a related polynomial function in the same box, a problem which is effectively solved by using the modified Bernstein form of polynomials. The modified Bernstein form of polynomials is briefly discussed, and the proposed hyper-arc consistency algorithm is finally detailed. The proposed algorithm is a subdivision procedure which, starting from an initial approximation of the domains of variables, removes values from domains to enforce hyper-arc consistency.
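The sign test at the core of such subdivision procedures rests on a classical property of the Bernstein form: if all Bernstein coefficients of a polynomial on a box share a sign, the polynomial has that sign on the whole box; otherwise the box must be subdivided. A minimal univariate sketch on the unit interval (an illustrative simplification, not the paper's modified Bernstein form, which handles multivariate constraints over arbitrary boxes):

```python
from math import comb

def bernstein_coeffs(a):
    """Bernstein coefficients on [0,1] of p(x) = sum_i a[i] * x**i,
    using b_k = sum_{i<=k} (C(k,i)/C(n,i)) * a[i]."""
    n = len(a) - 1
    return [sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
            for k in range(n + 1)]

def sign_on_unit_box(a):
    """Return +1 if p > 0 on [0,1] is certified, -1 if p < 0 is certified,
    and 0 if the Bernstein coefficients have mixed signs (inconclusive:
    a subdivision step would then be needed)."""
    b = bernstein_coeffs(a)
    if all(c > 0 for c in b):
        return 1
    if all(c < 0 for c in b):
        return -1
    return 0
```

For instance, p(x) = 1 - x + x² has Bernstein coefficients [1, 0.5, 1] on [0,1], certifying p > 0 there, while p(x) = x - 0.5 yields the mixed-sign list [-0.5, 0.5], so its sign cannot be decided without subdividing.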
A model of turbulent premixed combustion is formulated based on a recent dispersion model and on earlier flame models. The dispersion model accounts for the effect of velocity correlation on turbulent dispersion without using parameters that are explicitly dependent on time or space. The earlier flame models are used as a basis for formulating an appropriate chemical source term. The resulting model is more general than the earlier models. The proposed model and one of the earlier models are validated against various experimental flames. It is shown that the proposed model is more accurate and requires less calibration. Furthermore, the proposed model is tested successfully against general criteria for premixed combustion modeling formulated in earlier works. It is therefore concluded that the dispersion model is a useful tool for premixed combustion modeling.
We define the notion of rational presentation of a complete metric space, in order to study metric spaces from the algorithmic complexity point of view. In this setting, we study some representations of the space C[0,1] of uniformly continuous real functions over [0,1] with the usual norm ‖f‖∞ = sup{|f(x)| : 0 ≤ x ≤ 1}. This allows us to make a global comparison between the complexity notions attached to these presentations. In particular, we obtain a generalization of Hoover's results concerning the Weierstrass approximation theorem in polynomial time. We also obtain a generalization of previous results on analytic functions computable in polynomial time.
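The Weierstrass approximation theorem invoked above has a classical constructive witness, the Bernstein operator, which can be sketched directly. This is a generic illustration of polynomial approximation on [0,1], not the specific presentations or complexity bounds studied in the paper:

```python
from math import comb

def bernstein_approx(f, n):
    """Return B_n(f), the degree-n Bernstein approximation of f on [0,1]:
    B_n(f)(x) = sum_k f(k/n) * C(n,k) * x^k * (1-x)^(n-k).
    For continuous f, B_n(f) -> f uniformly on [0,1] as n -> infinity,
    which proves the Weierstrass approximation theorem constructively."""
    def p(x):
        return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
                   for k in range(n + 1))
    return p
```

The operator reproduces affine functions exactly, and for f(x) = x² one has the closed form B_n(f)(x) = x² + x(1-x)/n, which makes the O(1/n) convergence rate visible.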