Found 20 similar documents; search time: 15 ms
1.
Li Zhang 《Pattern Recognition》 2009, 42(11): 2961-2978
We present a restoration framework to reduce undesirable distortions in imaged documents. Our framework is based on two components: (1) an image inpainting procedure that can separate non-uniform illumination (and other) artifacts from the printed content and (2) a shape-from-shading (SfS) formulation that can reconstruct the 3D shape of the document's surface. Used either piecewise or in its entirety, this framework can correct a variety of distortions including shading, shadow, ink-bleed, show-through, perspective and geometric distortions, for both camera-imaged and flatbed-imaged documents. Our overall framework is described in detail. In addition, our SfS formulation can be easily modified to target various illumination conditions to suit different real-world applications. Results on images of synthetic and real documents demonstrate the effectiveness of our approach. OCR results are also used to gauge the performance of our approach.
2.
Tijana Ružić, Wilfried Philips 《Pattern Recognition Letters》 2012, 33(3): 309-318
In this paper we propose a novel inference method for maximum a posteriori estimation with a Markov random field prior. The central idea is to integrate a kind of joint "voting" of neighboring labels into a message passing scheme similar to loopy belief propagation (LBP). While LBP operates with many pairwise interactions, we formulate "messages" sent from a neighborhood as a whole; hence the name neighborhood-consensus message passing (NCMP). The practical algorithm is much simpler than LBP and combines the flexibility of iterated conditional modes (ICM) with some ideas of more general message passing. The proposed method is also a generalization of the iterated conditional expectations algorithm (ICE): we revisit ICE and redefine it in a message passing framework in a more general form. We also develop a simplified version of NCMP, called weighted iterated conditional modes (WICM), that is suitable for large neighborhoods. We verify the potential of our methods on four different benchmarks, showing the improvement in quality and/or speed over related inference techniques.
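The abstract describes WICM as ICM extended with a weighted neighborhood "vote". A minimal sketch of one plausible reading, on a 1-D chain with a Potts-style disagreement penalty (the paper's exact weighting scheme may differ; all names and parameters here are illustrative):

```python
import numpy as np

def wicm(labels, unary, weight, n_iter=10):
    """Sketch of weighted ICM on a 1-D chain MRF: each site takes the
    label minimising its unary cost plus a weighted disagreement 'vote'
    from its neighbourhood, updated site-by-site as in plain ICM.

    labels : (n,) int array, initial labelling
    unary  : (n, k) array, unary[i, l] = cost of label l at site i
    weight : penalty for disagreeing with one neighbour
    """
    labels = labels.copy()
    n, k = unary.shape
    for _ in range(n_iter):
        for i in range(n):
            costs = unary[i].copy()
            for j in (i - 1, i + 1):  # chain neighbourhood of site i
                if 0 <= j < n:
                    # pay `weight` for each label that disagrees with
                    # the neighbour's current label
                    costs += weight * (np.arange(k) != labels[j])
            labels[i] = int(np.argmin(costs))
    return labels
```

On a noisy binary signal this smooths isolated flips, which is the qualitative behaviour the paper's benchmarks measure.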
3.
Henrik Lieng 《Computer Graphics Forum》2017,36(7):195-205
We propose a framework for data-driven manipulation and synthesis of component-based vector graphics. Using labelled vector graphical images of a given type of object as input, our processing pipeline produces training data, learns a probabilistic Bayesian network from that training data, and offers various data-driven vector-related tools using synthesis functions. The tools range from data-driven vector design to automatic synthesis of vector graphics. Our tools were well received by designers, our model provides good generalisation performance, even from small data sets, and our method for synthesis produces vector graphics deemed significantly more plausible than those of alternative methods.
4.
Antonio Robles-Kelly, Edwin R. Hancock 《Pattern Recognition》 2004, 37(7): 1387-1405
This paper presents an iterative spectral framework for pairwise clustering and perceptual grouping. Our model is expressed in terms of two sets of parameters. Firstly, there are cluster memberships which represent the affinity of objects to clusters. Secondly, there is a matrix of link weights for pairs of tokens. We adopt a model in which these two sets of variables are governed by a Bernoulli model. We show how the likelihood function resulting from this model may be maximised with respect to both the elements of the link-weight matrix and the cluster membership variables. We establish the link between the maximisation of the log-likelihood function and the eigenvectors of the link-weight matrix. This leads us to an algorithm in which we iteratively update the link-weight matrix by repeatedly refining its modal structure. Each iteration of the algorithm is a three-step process. First, we compute a link-weight matrix for each cluster by taking the outer product of the vectors of current cluster-membership indicators for that cluster. Second, we extract the leading eigenvector from each modal link-weight matrix. Third, we compute a revised link-weight matrix by taking the sum of the outer products of the leading eigenvectors of the modal link-weight matrices.
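The three-step iteration in the abstract is concrete enough to sketch directly. A minimal NumPy version, assuming k clusters over n tokens (the membership re-estimation rule in the final line is a simple placeholder of ours, not the paper's likelihood update):

```python
import numpy as np

def iterative_modal_clustering(memberships, n_iter=10):
    """Sketch of the three-step modal-structure iteration.

    memberships : (k, n) array; row c holds the current
                  cluster-membership indicator vector for cluster c.
    Returns the revised link-weight matrix and the updated memberships.
    """
    S = memberships.astype(float)
    for _ in range(n_iter):
        leading = []
        for s in S:
            # Step 1: per-cluster link-weight matrix = outer product of
            # the cluster-membership indicator vector with itself.
            Wc = np.outer(s, s)
            # Step 2: leading eigenvector of the modal link-weight matrix
            # (for a rank-1 outer product this is s, normalised).
            vals, vecs = np.linalg.eigh(Wc)
            leading.append(vecs[:, np.argmax(vals)])
        # Step 3: revised link-weight matrix = sum of the outer products
        # of the leading eigenvectors.
        W = sum(np.outer(v, v) for v in leading)
        # Re-estimate memberships from the eigenvectors (one simple choice).
        S = np.array([np.abs(v) for v in leading])
    return W, S
```

With two disjoint clusters the revised matrix comes out block-diagonal, reflecting the refined modal structure.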
5.
We present a theoretical basis for supporting subjective and conditional probabilities in deductive databases. We design a language that allows a user greater expressive power than classical logic programming. In particular, a user can express the fact that A is possible (i.e. A has non-zero probability), B is possible, but (A ∧ B) as a whole is impossible. A user can also freely specify probability annotations that may contain variables. The focus of this paper is to study the semantics of programs written in such a language in relation to probability theory. Our model theory, which is founded on the classical one, captures the uncertainty described in a probabilistic program at the level of Herbrand interpretations. Furthermore, we develop a fixpoint theory and a proof procedure for such programs and present soundness and completeness results. Finally, we characterize the relationships between probability theory and the fixpoint, model, and proof theory of our programs.
6.
Robert St-Aubin, Joel Friedman, Alan K. Mackworth 《Annals of Mathematics and Artificial Intelligence》 2006, 47(3-4): 397-425
The development of autonomous agents, such as mobile robots and software agents, has generated considerable research in recent years. Robotic systems, which are usually built from a mixture of continuous (analog) and discrete (digital) components, are often referred to as hybrid dynamical systems. Traditional approaches to real-time hybrid systems usually define behaviors purely in terms of determinism or sometimes non-determinism. However, this is insufficient, as real-time dynamical systems very often exhibit uncertain behavior. To address this issue, we develop a semantic model, Probabilistic Constraint Nets (PCN), for probabilistic hybrid systems. PCN captures the most general structure of dynamic systems, allowing systems with discrete and continuous time/variables, synchronous as well as asynchronous event structures, and uncertain dynamics to be modeled in a unitary framework. Based on a formal mathematical paradigm exploiting abstract algebra, topology and measure theory, PCN provides a rigorous formal programming semantics for the design of hybrid real-time embedded systems exhibiting uncertainty.
7.
A definition for the reliability of inferential sensor predictions is provided. A data-driven Bayesian framework for real-time performance assessment of inferential sensors is proposed. The main focus is on characterizing the effect of operating space on the reliability of inferential sensor predictions. A holistic, quantitative measure of the reliability of the inferential sensor predictions is introduced. A methodology is provided to define objective prior probabilities over plausible classes of reliability based on the total misclassification cost. The real-time performance assessment of multi-model inferential sensors is also discussed. The application of the method does not depend on the identification techniques employed for model development. Furthermore, on-line implementation of the method is computationally efficient. The effectiveness of the method is demonstrated through simulation and industrial case studies.
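The core Bayesian step behind such a real-time assessment can be sketched as a recursive posterior over reliability classes, updated each time a new prediction error is observed. The class names, prior, and likelihood models below are illustrative, not the paper's:

```python
def update_reliability(prior, likelihoods, error):
    """One Bayes step: `prior` maps reliability classes to probabilities,
    `likelihoods` maps each class to a function giving the likelihood of
    the observed prediction `error` under that class. Returns the
    normalised posterior, usable as the prior at the next sample."""
    post = {c: prior[c] * likelihoods[c](error) for c in prior}
    z = sum(post.values())
    return {c: p / z for c, p in post.items()}
```

Run over a stream of errors, the posterior concentrates on the class whose error model fits best, giving an on-line reliability estimate.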
8.
Djemel Ziou, Touati Hamri 《Pattern Recognition》 2009, 42(7): 1511-1519
In this paper, we propose a probabilistic framework for efficient retrieval and indexing of image collections. This framework uncovers the hierarchical structure underlying the collection from image features based on a hybrid model that combines both generative and discriminative learning. We adopt the generalized Dirichlet mixture and maximum likelihood for the generative learning in order to estimate accurately the statistical model of the data. Then, the resulting model is refined by a new discriminative likelihood that enhances the power of relevant features. Consequently, this new model is suitable for modeling high-dimensional data described by both semantic and low-level (visual) features. The semantic features are defined according to a known ontology while visual features represent the visual appearance such as color, shape, and texture. For validation purposes, we propose a new visual feature which has nice invariance properties to image transformations. Experiments on the Microsoft's collection (MSRCID) show clearly the merits of our approach in both retrieval and indexing.
9.
This study presents a probabilistic framework to simulate dam breach and evaluates the impact of using four empirical dam breach prediction methods on breach parameters (i.e., geometry and timing) and outflow hydrograph attributes (i.e., time to peak, hydrograph duration and peak). The methods assessed here are MacDonald and Langridge-Monopolis (1984), Von Thun and Gillette (1990), and Froehlich (1995, 2008). Mean values and percentiles of breach parameters and outflow hydrograph attributes are compared for a hypothetical overtopping failure of Burnett Dam in the state of North Carolina, USA. Furthermore, utilizing the probabilistic framework, the least and most uncertain methods, alongside those giving the most critical value, are identified for these parameters. The multivariate analysis also indicates that lone use of breach parameters is not necessarily sufficient to characterize outflow hydrograph attributes. However, the timing characteristic of the breach is generally a more important driver than its geometric features.
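The probabilistic framework can be sketched as Monte Carlo propagation of input uncertainty through an empirical breach relation, then reading off the mean and percentiles that the study compares. The power-law coefficients and the ±20% input uncertainty below are placeholders, not the calibrated values of any of the four cited methods:

```python
import numpy as np

def breach_mc(v_w, h_b, n=10_000, seed=0):
    """Monte Carlo sketch: sample reservoir volume V_w (m^3) and breach
    height h_b (m) with illustrative +/-20% uncertainty, push them
    through a Froehlich-style power-law width relation (placeholder
    coefficients), and report the statistics of breach width."""
    rng = np.random.default_rng(seed)
    V = v_w * rng.uniform(0.8, 1.2, n)
    H = h_b * rng.uniform(0.8, 1.2, n)
    width = 0.18 * V**0.32 * H**0.19  # placeholder power law, not calibrated
    return {"mean": width.mean(),
            "p5": np.percentile(width, 5),
            "p95": np.percentile(width, 95)}
```

Repeating this per method and comparing the spread of the percentiles is how the least and most uncertain methods would be ranked.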
10.
The latest developments in human computer interfaces aim at greater ease of use, and the exploitation of human communication and interaction skills typical of non-computerised environments. This kind of interaction is continuous rather than purely discrete. Continuous interaction implies a tighter coupling between system and user, and raises complicated synchronisation issues where real-time requirements and intrinsic variation of human behaviour play an essential role. In this paper, we propose a human-centred layered reference model to reduce the design complexity of systems exhibiting continuous interaction. In the context of the layered model, we discuss the role that formal modelling can play in the design of these systems.
Published online: 14 May 2002
11.
The objective of this paper is twofold. First, the problem of generation of real random matrix samples with uniform distribution in structured (spectral) norm bounded sets is studied. This includes an analysis of the distribution of the singular values of uniformly distributed real matrices, and an efficient (i.e. polynomial-time) algorithm for their generation. Second, it is shown how the developed techniques may be used to solve in a probabilistic setting several hard problems involving systems subject to real structured uncertainty.
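For intuition about what "uniform in a spectral-norm bounded set" means, here is a naive rejection sampler, which is exact but not the paper's polynomial-time algorithm (rejection becomes impractical as the matrix grows):

```python
import numpy as np

def uniform_spectral_ball(n, m, rng=None):
    """Draw one real n-by-m matrix uniformly from {A : ||A||_2 <= 1}.

    Since ||A||_2 <= 1 implies every |a_ij| <= 1, the unit spectral-norm
    ball sits inside the hypercube [-1, 1]^(n*m); uniform samples from
    the cube, filtered by the norm test, are therefore uniform on the
    ball. Acceptance probability shrinks rapidly with n*m, which is why
    the paper develops an efficient algorithm instead.
    """
    if rng is None:
        rng = np.random.default_rng()
    while True:
        A = rng.uniform(-1.0, 1.0, size=(n, m))
        if np.linalg.norm(A, 2) <= 1.0:  # spectral norm = largest singular value
            return A
```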
12.
We propose a probabilistic variant of the pi-calculus as a framework to specify randomized security protocols and their intended properties. In order to express and verify the correctness of the protocols, we develop a probabilistic version of the testing semantics. We then illustrate these concepts on an extended example: the Partial Secret Exchange, a protocol which uses a randomized primitive, the Oblivious Transfer, to achieve fairness of information exchange between two parties.
13.
We constructed a probabilistic simulator that allows all the events in population dynamics, such as death, birth, mutation, and suppression/stimulation, to be described by probabilistic rules. The simulator also provides a lattice used for expressing the distribution and diversity (number of distinct strains) of quasispecies. The simulator is used to investigate the diversity threshold in HIV and T-cell interaction. This work was presented in part at the 13th International Symposium on Artificial Life and Robotics, Oita, Japan, January 31–February 2, 2008.
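The rule-based lattice design can be sketched in a few lines: each site holds a strain id (or is empty), and death, birth, and mutation fire with fixed probabilities each generation. All probabilities and the 1-D lattice shape below are illustrative choices, not the paper's parameters:

```python
import random

def step(lattice, p_death=0.05, p_birth=0.2, p_mutation=0.01,
         n_strains=8, rng=None):
    """Advance a 1-D lattice of strain ids (None = empty site) by one
    generation, applying death, birth, and mutation as probabilistic
    rules."""
    if rng is None:
        rng = random.Random()
    new = list(lattice)
    for i, s in enumerate(lattice):
        if s is not None and rng.random() < p_death:
            new[i] = None  # death rule
        elif s is None:
            # birth rule: copy a random occupied neighbour, if any
            nbrs = [lattice[j] for j in (i - 1, i + 1)
                    if 0 <= j < len(lattice) and lattice[j] is not None]
            if nbrs and rng.random() < p_birth:
                child = rng.choice(nbrs)
                if rng.random() < p_mutation:  # mutation to a random strain
                    child = rng.randrange(n_strains)
                new[i] = child
    return new

def diversity(lattice):
    """Number of distinct strains currently on the lattice."""
    return len({s for s in lattice if s is not None})
```

Tracking `diversity` over many generations while varying the mutation rate is the kind of experiment a diversity-threshold study runs.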
14.
Ö. Özgür Tanrıöver, Semih Bilgen 《Computer Standards & Interfaces》 2011, 33(5): 448-464
Conceptual models are used in understanding and communicating the domain of interest during the analysis phase of system development. As they are used in early phases, errors and omissions may propagate to later phases and may be very costly to correct. This paper proposes a framework for evaluating conceptual models represented in a domain-specific language based on UML constructs. The framework describes the main aspects to be considered when conceptual models are represented in a domain-specific language, presents a classification of semantic issues, and gives some evaluation indicators. The indicators can, in principle, identify situations in the models where inconsistencies or incompleteness might occur. Whether these are real concerns may depend on domain semantics; hence these are semantic, not syntactic, checks. The use of the proposed review framework is illustrated in the context of two conceptual models in a domain-specific notation, KAMA. With reviews based on the framework, it is possible to spot semantic issues that are not noticed by CASE tools and to help the analyst identify more information about the domain.
15.
Knowledge patterns, such as association rules, clusters or decision trees, can be defined as concise and relevant information that can be extracted, stored, analyzed, and manipulated by knowledge workers in order to drive and specialize business decision processes. In this paper we deal with data mining patterns. The ability to manipulate different types of patterns under a unified environment is becoming a fundamental issue for any ‘intelligent’ and data-intensive application. However, approaches proposed so far for pattern management usually deal with specific and predefined types of patterns and mainly concern pattern extraction and exchange issues. Issues concerning the integrated, advanced management of heterogeneous patterns are in general not (or marginally) taken into account.
16.
17.
Doron Drusinsky, James Bret Michael, Man-Tak Shing 《Innovations in Systems and Software Engineering》 2008, 4(2): 161-168
This paper presents a framework for augmenting independent validation and verification (IV&V) of software systems with computer-based IV&V techniques. The framework allows an IV&V team to capture its own understanding of the application, as well as the expected behavior of any proposed system for solving the underlying problem, by using an executable system reference model, which uses formal assertions to specify mission- and safety-critical behaviors. The framework uses execution-based model checking to validate the correctness of the assertions and to verify the correctness and adequacy of the system under test.
18.
Recent distributed shared memory (DSM) systems provide increasingly more support for the sharing of objects rather than portions of memory. However, like earlier DSM systems, these distributed shared object (DSO) systems still force developers to use a single protocol, or a small set of given protocols, for the sharing of application objects. This limitation prevents applications from optimizing their communication behaviour and results in unnecessary overhead. A current general trend in software systems development is towards customizable systems: frameworks, reflection, and aspect-oriented programming, for example, all aim to give the developer greater flexibility and control over the functionality and performance of their code. This paper describes a novel object-oriented framework that defines a DSM system in terms of a consistency model and an underlying coherency protocol. Different consistency models and coherency protocols can be used within a single application because they can be customized, by the application programmer, on a per-object basis. This allows application-specific semantics to be exploited at a very fine level of granularity, with a resulting improvement in performance. The framework is implemented in Java, and the speed-up obtained by a number of applications that use the framework is reported. Copyright © 2002 John Wiley & Sons, Ltd.
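The per-object customization idea is essentially the strategy pattern applied to coherency. A minimal single-process sketch (the paper's framework is in Java and actually distributes data; all class names and the two toy protocols here are illustrative):

```python
class ConsistencyProtocol:
    """Strategy interface: each shared object picks its own protocol."""
    def read(self, obj): raise NotImplementedError
    def write(self, obj, value): raise NotImplementedError

class StrongProtocol(ConsistencyProtocol):
    def read(self, obj):
        obj.sync()                 # fetch the latest copy before every read
        return obj.value
    def write(self, obj, value):
        obj.value = value
        obj.broadcast()            # push the update to all replicas eagerly

class LazyProtocol(ConsistencyProtocol):
    def read(self, obj):
        return obj.value           # may be stale; no communication
    def write(self, obj, value):
        obj.value = value
        obj.dirty = True           # propagated later, e.g. at a barrier

class SharedObject:
    def __init__(self, value, protocol):
        self.value, self.dirty = value, False
        self.protocol = protocol   # per-object protocol choice
        self.sync_calls = self.broadcasts = 0
    def sync(self): self.sync_calls += 1          # stand-in for network fetch
    def broadcast(self): self.broadcasts += 1     # stand-in for network push
    def read(self): return self.protocol.read(self)
    def write(self, v): self.protocol.write(self, v)
```

An object updated by one writer and read rarely can use `LazyProtocol` while a hot shared counter uses `StrongProtocol`, which is the fine-grained tuning the abstract credits for the speed-up.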
19.
A data model and algebra for probabilistic complex values  Total citations: 1 (self-citations: 0, other citations: 1)
Thomas Eiter, Thomas Lukasiewicz, Michael Walter 《Annals of Mathematics and Artificial Intelligence》 2001, 33(2-4): 205-252
We present a probabilistic data model for complex values. More precisely, we introduce probabilistic complex value relations, which combine the concept of probabilistic relations with the idea of complex values in a uniform framework. We elaborate a model-theoretic definition of probabilistic combination strategies, which has a rigorous foundation on probability theory. We then define an algebra for querying database instances, which comprises the operations of selection, projection, renaming, join, Cartesian product, union, intersection, and difference. We prove that our data model and algebra for probabilistic complex values generalize the classical relational data model and algebra. Moreover, we show that under certain assumptions, all our algebraic operations are tractable. We finally show that most of the query equivalences of classical relational algebra carry over to our algebra on probabilistic complex value relations. Hence, query optimization techniques for classical relational algebra can easily be applied to optimize queries on probabilistic complex value relations.
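Two of the listed operations, with point probabilities rather than the paper's full model, illustrate why combination strategies are needed: selection never merges tuples, but projection can, and the merged probability depends on an assumed relationship between the merged tuples. This toy version (dict of tuple -> probability, flat rather than complex values) shows the independence and disjointness strategies:

```python
def select(rel, pred):
    """Selection keeps each surviving tuple's probability unchanged."""
    return {t: p for t, p in rel.items() if pred(t)}

def project(rel, idxs, disjoint=False):
    """Projection may merge tuples; how their probabilities combine is a
    'combination strategy'. Shown: independence (1 - prod(1 - p_i)) and
    disjointness (sum of p_i, capped at 1)."""
    out = {}
    for t, p in rel.items():
        key = tuple(t[i] for i in idxs)
        if key not in out:
            out[key] = p
        elif disjoint:
            out[key] = min(1.0, out[key] + p)
        else:
            out[key] = 1.0 - (1.0 - out[key]) * (1.0 - p)
    return out
```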
20.
Masahiko Sato 《Journal of Intelligent Information Systems》2008,31(2):111-125
We propose a logical framework, called Natural Framework (NF), which supports formal reasoning about computation and logic (CAL) on a computer. NF is based on a theory of judgments and derivations. NF is designed by observing how working mathematical theories are created and developed. Our observation is that the notions of judgments and derivations are the two fundamental notions used in any mathematical activity. We have therefore developed a theory of judgments and derivations and designed a framework in which the theory provides a uniform and common playground on which various mathematical theories can be defined as derivation games and can be played; that is, proofs can be written and checked. NF is equipped with a higher-order intuitionistic logic, and derivations (proofs) are described following Gentzen’s natural deduction style. NF is part of an interactive computer environment CAL, and it is also referred to as NF/CAL. CAL is written in Emacs Lisp and is run within a special buffer of the Emacs editor. CAL consists of a user interface, a general-purpose parser, and a checker for checking proofs of NF derivation games. The NF/CAL system has been successfully used as an education system for teaching CAL to undergraduate students for about 8 years. We will give an overview of the NF/CAL system from both theoretical and practical sides.