On the learning of rule uncertainties and their integration into probabilistic knowledge bases

Authors: Beat Wüthrich

Affiliation: (1) ECRC, Arabellastr. 17, 81925 Munich, Germany

Abstract: We present a natural and realistic knowledge acquisition and processing scenario. In the first phase, a domain expert identifies deduction rules that he thinks are good indicators of whether a specific target concept is likely to occur. In a second knowledge acquisition phase, a learning algorithm automatically adjusts, corrects and optimizes the deterministic rule hypotheses given by the domain expert, selecting an appropriate subset of the rules and attaching uncertainties to them. Then, in the running phase of the knowledge base, we can arbitrarily combine the learned rule uncertainties with uncertain factual information. Formally, we introduce the natural class of disjunctive probabilistic concepts and prove that this class is efficiently distribution-free learnable. The distribution-free learning model of probabilistic concepts was introduced by Kearns and Schapire and generalizes Valiant's probably approximately correct learning model. We show how to simulate the learned concepts in probabilistic knowledge bases that satisfy the laws of axiomatic probability theory. Finally, we combine the rule uncertainties with uncertain facts and prove the correctness of the combination under an independence assumption.
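The sketch below illustrates one way such a combination could look. It is a hypothetical illustration only, not the construction proved correct in the paper: it assumes a noisy-or style semantics in which each rule's learned uncertainty is multiplied with the probabilities of its (mutually independent) body facts, and the rules for one target concept are combined disjunctively. The function names (rule_contribution, target_probability) and the numeric values are invented for illustration.

# Illustrative sketch only (see assumptions above): a noisy-or style
# combination of learned rule uncertainties with uncertain facts,
# assuming rules and facts are mutually independent.

def rule_contribution(rule_uncertainty, fact_probs):
    """Probability that one rule derives the target concept: the rule's
    learned uncertainty times the probability that all its body facts hold."""
    body_prob = 1.0
    for p in fact_probs:
        body_prob *= p
    return rule_uncertainty * body_prob

def target_probability(rules):
    """Combine the rules for one target concept disjunctively: the concept
    is missed only if every rule fails to derive it."""
    prob_all_fail = 1.0
    for rule_uncertainty, fact_probs in rules:
        prob_all_fail *= 1.0 - rule_contribution(rule_uncertainty, fact_probs)
    return 1.0 - prob_all_fail

# Hypothetical example: two learned rules for the same target concept.
rules = [
    (0.8, [0.9, 0.7]),  # learned uncertainty 0.8, body facts with prob. 0.9 and 0.7
    (0.6, [0.5]),       # learned uncertainty 0.6, one body fact with prob. 0.5
]
print(target_probability(rules))  # ~0.65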

Keywords: computational learning, probability theory, stratified Datalog, uncertainty