Similar Literature (20 results)
1.
A logic-based system of knowledge representation for natural language discourse has three primary advantages:
•  It has adequate expressive power,
•  it has a well-defined semantics, and
•  it uses simple, sound, general rules of inference.
On the other hand, a standard logic-based system has the following disadvantages:
•  It supports only an exceedingly complex mapping from surface discourse sentences to internal representations (an illustrative example follows this list), and
•  reasoning about the content of semantically complex discourses is difficult because of the incommodious complexity of the internalized formulas.
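As a standard textbook illustration of that mapping complexity (not an example taken from the abstract), even a short surface sentence such as the classical "donkey sentence" expands into a formula with multiple quantifiers and an embedded conditional:

```latex
% "Every farmer who owns a donkey beats it"
\forall x\,\forall y\,\bigl((\mathit{farmer}(x)\land \mathit{donkey}(y)\land \mathit{owns}(x,y))\rightarrow \mathit{beats}(x,y)\bigr)
```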

2.
3.
We consider the problem of efficiently sampling Web search engine query results. In turn, using a small random sample instead of the full set of results leads to efficient approximate algorithms for several applications, such as:
•  Determining the set of categories in a given taxonomy spanned by the search results;
•  Finding the range of metadata values associated with the result set in order to enable “multi-faceted search”;
•  Estimating the size of the result set;
•  Data mining associations to the query terms.
We present and analyze efficient algorithms for obtaining uniform random samples applicable to any search engine that is based on posting lists and document-at-a-time evaluation. (To our knowledge, all popular Web search engines, for example, Google, Yahoo Search, MSN Search, Ask, belong to this class.) Furthermore, our algorithm can be modified to follow the modern object-oriented approach whereby posting lists are viewed as streams equipped with a next method, and the next method for Boolean and other complex queries is built from the next method for primitive terms. In our case we show how to construct a basic sample-next(p) method that samples term posting lists with probability p, and show how to construct sample-next(p) methods for Boolean operators (AND, OR, WAND) from primitive methods. Finally, we test the efficiency and quality of our approach on both synthetic and real-world data. A preliminary version of this work has appeared in [3]. Work performed while A. Anagnostopoulos and A.Z. Broder were at IBM T. J. Watson Research Center.
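To make the composition of sampling methods concrete, the following is a minimal Python sketch under stated assumptions: the names (TermSampler, sample_next, and_sample_next) are illustrative and not the paper's API, and the hash-based coin is an assumed device so that repeated calls agree on which documents are sampled.

```python
import bisect
import hashlib

def kept(doc_id, p, seed=0):
    """Deterministic 'coin flip': keep doc_id with probability about p."""
    h = int(hashlib.sha1(f"{seed}:{doc_id}".encode()).hexdigest(), 16)
    return (h % 1_000_000) / 1_000_000 < p

class TermSampler:
    """A single term's posting list, exposed as a p-sampled stream."""
    def __init__(self, postings, p):
        self.postings = sorted(postings)   # ascending document ids
        self.p = p

    def sample_next(self, doc_id):
        """Smallest sampled posting >= doc_id, or None if exhausted."""
        i = bisect.bisect_left(self.postings, doc_id)
        for d in self.postings[i:]:
            if kept(d, self.p):
                return d
        return None

def and_sample_next(sampled_child, other_postings, doc_id):
    """AND of terms: sample one child's stream with probability p and check
    membership in the other children's full posting lists, so every document
    in the AND result survives with probability p."""
    d = sampled_child.sample_next(doc_id)
    while d is not None:
        if all(d in ps for ps in other_postings):   # naive membership test
            return d
        d = sampled_child.sample_next(d + 1)
    return None

# Usage: sample roughly 30% of the documents matching "quantum" AND "matching".
quantum = TermSampler([2, 3, 5, 8, 13, 21, 34], p=0.3)
matching_lists = [[1, 2, 3, 5, 8, 21, 55]]
doc, sample = 0, []
while (doc := and_sample_next(quantum, matching_lists, doc)) is not None:
    sample.append(doc)
    doc += 1
print(sample)
```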

4.
A formal definition of a combinatorial topology is presented in this paper for the discrete N-dimensional space defined by the An* lattice. The use of this grid instead of the classical ℤⁿ is based on two arguments:
– It is the optimal sampling grid in the sense of Shannon’s sampling theorem in 2 and 3 dimensions,
– It induces the simplest discrete topology definition because its dual is a K-simplex.
The interest of these grids for image processing and for the medical imaging field is illustrated with some examples.
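As a small illustration of the two-dimensional case, where the optimal sampling grid is the hexagonal lattice, here is a hedged Python sketch that enumerates lattice points from a standard hexagonal basis; the basis choice and the function name are illustrative assumptions, not taken from the paper.

```python
import math

def hex_lattice_points(radius):
    """Enumerate hexagonal-lattice points with integer coordinates (i, j)
    in [-radius, radius]^2, using basis e1 = (1, 0), e2 = (1/2, sqrt(3)/2).
    Every point has six nearest neighbours at distance 1."""
    e1 = (1.0, 0.0)
    e2 = (0.5, math.sqrt(3) / 2.0)
    points = []
    for i in range(-radius, radius + 1):
        for j in range(-radius, radius + 1):
            points.append((i * e1[0] + j * e2[0], i * e1[1] + j * e2[1]))
    return points

pts = hex_lattice_points(2)
print(len(pts))   # 25 points for radius 2
print(pts[:3])
```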

5.
We present quantum algorithms for the following matching problems in unweighted and weighted graphs with n vertices and m edges:
•  Finding a maximal matching in general graphs in time .
•  Finding a maximum matching in general graphs in time .
•  Finding a maximum weight matching in bipartite graphs in time , where N is the largest edge weight.
Our quantum algorithms are faster than the best known classical deterministic algorithms for the corresponding problems. In particular, the second result solves an open question stated in a paper by Ambainis and Špalek (Proceedings of STACS’06, pp. 172–183, 2006).

6.
7.
Conclusion. We have provided a theoretical and methodological justification for the complete construction of a universal formalized language of knowledge. We have proved the following:
–  the base term classification ensures unambiguous and deep indexation of documents;
–  the fixed sentence syntax makes it possible to standardize information-retrieval languages and automatic translation between languages;
–  the fixed message semantics provides the following opportunities: measuring the semantic information; rating the intensification of intellectual effort; eliminating unjustified duplication of research and publication; providing the user with timely necessary information in a form suitable for direct processing and use; organizing a national cost-efficient communication technology; solving linguistic problems of artificial intelligence and informatization of society; creating a reliable structural foundation for the development of a common unambiguous language for the entire humanity.
Translated from Kibernetika i Sistemnyi Analiz, No. 4, pp. 154–162, July–August, 1997.

8.
We consider hypotheses about nondeterministic computation that have been studied in different contexts and shown to have interesting consequences:
•  The measure hypothesis: NP does not have p-measure 0.
•  The pseudo-NP hypothesis: there is an NP language that can be distinguished from any DTIME language by an NP refuter.
•  The NP-machine hypothesis: there is an NP machine accepting 0* for which no -time machine can find infinitely many accepting computations.
We show that the NP-machine hypothesis is implied by each of the first two. Previously, no relationships were known among these three hypotheses. Moreover, we unify previous work by showing that several derandomizations and circuit-size lower bounds that are known to follow from the first two hypotheses also follow from the NP-machine hypothesis. In particular, the NP-machine hypothesis becomes the weakest known uniform hardness hypothesis that derandomizes AM. We also consider UP versions of the above hypotheses as well as related immunity and scaled dimension hypotheses.

9.
This special section is devoted to a selection of papers that originally appeared in two thematic sessions on high-level testing of complex systems at IDPT 2002 and 2003, the 6th and 7th World Conference on Integrated Design and Process Technology, which took place in Pasadena, CA in June 2002 and in Austin, TX in December 2003, respectively. This collection of papers spans a wide panoramic view of the development of testing and validation technology along several dimensions. It touches on issues such as:
– Kinds of systems
– Kinds of testing or validation
– Practical difficulties (in the application area)
– Technical difficulties, e.g., state explosion, heterogeneity, etc.
– Particular approaches, i.e., methods and tools, and whether or not they are linked to other areas such as formal verification, simulation, abstract interpretation, etc.
– Current state of advancement and uptake (conceptual, implemented, industrial product, etc.)
All seven papers present methods, tools, and case studies that aim at using diverse formal techniques for testing complex systems.

10.
11.
This paper summarizes our recent activities to support people to communicate with each other using public computer network systems. Unlike conventional teleconferencing systems, which are mainly for business meetings, we focus on informal communication in open organizations. So far, three different systems have been developed and actually tested.
•  In Socia, we introduced vision agents that act on behalf of their users in a network. To enable a meeting to be scheduled at a mutually acceptable time, we proposed a scheme called non-committed scheduling.
•  FreeWalk supports casual meetings among more than a few people. For this purpose, we provide a 3-D virtual space called community common, where participants can behave just as in real life.
•  In the ICMAS’96 Mobile Assistant Project, on the other hand, we conducted an experiment at an actual international conference using 100 personal digital assistants and wireless phones. Various services were provided to increase interaction among the participants of the conference.
Based on these experiences, we are now moving towards community-ware to support people in forming communities based on computer network technologies. Toru Ishida, Dr. Eng.: He received the B.E., M.Eng. and D.Eng. degrees from Kyoto University, Kyoto, Japan, in 1976, 1978 and 1989, respectively. He is currently a professor in the Department of Information Science, Kyoto University. He has been working on “Parallel, Distributed and Multiagent Production Systems (Springer, 1994)” since 1983. He first proposed parallel rule firing and extended it to distributed rule firing. Organizational self-design was then introduced into distributed production systems to increase adaptiveness. From 1990, he started working on “Real-time Search for Learning Autonomous Agents (Kluwer Academic Publishers, 1997).” Again, organizational adaptation became a central issue in controlling multiple problem-solving agents. He recently initiated the study of “Communityware: Towards Global Collaboration (John Wiley and Sons, 1998)” with his colleagues.

12.
We consider distribution-free property-testing of graph connectivity. In this setting of property testing, the distance between functions is measured with respect to a fixed but unknown distribution D on the domain, and the testing algorithm has an oracle access to random sampling from the domain according to this distribution D. This notion of distribution-free testing was previously defined, and testers were shown for very few properties. However, no distribution-free property testing algorithm was known for any graph property. We present the first distribution-free testing algorithms for one of the central properties in this area—graph connectivity (specifically, the problem is mainly interesting in the case of sparse graphs). We introduce three testing models for sparse graphs:
•  A model for bounded-degree graphs,
•  A model for graphs with a bound on the total number of edges (both models were already considered in the context of uniform distribution testing), and
•  A model which is a combination of the two previous testing models; i.e., bounded-degree graphs with a bound on the total number of edges.
We prove that connectivity can be tested in each of these testing models, in a distribution-free manner, using a number of queries that is independent of the size of the graph. This is done by providing a new analysis of previously known connectivity testers (from “standard”, uniform-distribution property testing) and by introducing some new testers. An extended abstract of this work appeared in the proceedings of RANDOM-APPROX 2004.
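For intuition, here is a hedged Python sketch in the spirit of the standard uniform-distribution connectivity tester for bounded-degree graphs that this work revisits; the constants, the function name, and the uniform choice of start vertices are illustrative (the distribution-free testers in the paper differ in how vertices are drawn).

```python
import random

def connectivity_tester(graph, n, d, eps):
    """Sketch of a connectivity tester for bounded-degree graphs.

    graph: dict mapping vertex -> list of neighbours (degree <= d)
    Accepts connected graphs; rejects, with good probability, graphs that are
    eps-far from connected, because such graphs contain many small connected
    components. Constants below are illustrative, not tight.
    """
    threshold = int(8 / (eps * d)) + 1        # "small component" size bound
    trials = int(16 / (eps * d)) + 1          # number of start vertices
    for _ in range(trials):
        start = random.randrange(n)
        seen = {start}
        frontier = [start]
        while frontier and len(seen) < threshold:
            v = frontier.pop()
            for u in graph.get(v, []):
                if u not in seen:
                    seen.add(u)
                    frontier.append(u)
        if not frontier and len(seen) < threshold:
            return False   # found a small connected component -> reject
    return True            # no evidence of disconnection -> accept

# Example: a disconnected graph on 6 vertices (vertex 5 is isolated) is rejected.
g = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3], 5: []}
print(connectivity_tester(g, n=6, d=2, eps=0.5))
```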

13.
Garibaldo, F., Rebecchi, E. AI & Society, 2004, 18(1): 44–67
In this paper the authors, starting from the experience described and commented on in earlier work by Mancini and Sbordone, deal with the three main epistemological problems that the research group they participated in had to face:
–  The conflicting and ambiguous relationship between psychoanalysis and social research
–  The classical epistemological problem of the relationship between the subject and object of research within the perspective of action research
–  The problem arising from their experience, i.e., the risk of manipulation, and the way to deal with it from an epistemic perspective
The three problems are dealt with one at a time, but from a common perspective, i.e., the attempt to integrate the richness and variety of human subjectivity in social research. As to the relationship between psychoanalysis and social research, a special section is devoted to the implications of an integrated or convergent methodology on team-working in organisations.

14.
Web-based bid invitation platforms and reverse auctions are increasingly used by consumers for the procurement of goods and services. An empirical examination shows that in the B2C setting these procurement methods generate considerable benefits for the consumer:
–  Reverse auctions and bid invitation platforms generate high consumer surplus in the procurement of general and crafts services.
–  The level of this consumer surplus is affected by the number of bidders. The duration of the auction and the starting price are less important.
–  In the painting business, prices are considerably lower than with traditional procurement channels.
–  On bid invitation platforms, in most cases (>55%) the bids with the lowest price are chosen.

15.
Consider an information network with threats called attackers; each attacker uses a probability distribution to choose a node of the network to damage. Opponent to the attackers is a protector entity called defender; the defender scans and cleans from attacks some part of the network (in particular, a link), which it chooses independently using its own probability distribution. Each attacker wishes to maximize the probability of escaping its cleaning by the defender; towards a conflicting objective, the defender aims at maximizing the expected number of attackers it catches. We model this network security scenario as a non-cooperative strategic game on graphs. We are interested in its associated Nash equilibria, where no network entity can unilaterally increase its local objective. We obtain the following results:
•  We obtain an algebraic characterization of (mixed) Nash equilibria.
•  No (non-trivial) instance of the graph-theoretic game has a pure Nash equilibrium. This is an immediate consequence of some covering properties we prove for the supports of the players in all (mixed) Nash equilibria.
•  We coin a natural subclass of mixed Nash equilibria, which we call Matching Nash equilibria, for this graph-theoretic game. Matching Nash equilibria are defined by enriching the necessary covering properties we proved with some additional conditions involving other structural parameters of graphs, such as Independent Sets.
–  We derive a characterization of graphs admitting Matching Nash equilibria. All such graphs have an Expanding Independent Set. The characterization enables a non-deterministic, polynomial time algorithm to compute a Matching Nash equilibrium for any such graph.
–  Bipartite graphs are shown to satisfy the characterization. So, using a polynomial time algorithm to compute a Maximum Matching for a bipartite graph, we obtain, as our main result, a deterministic, polynomial time algorithm to compute a Matching Nash equilibrium for any instance of the game with a bipartite graph.
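As context for this main result, the following is a minimal sketch of the bipartite maximum-matching step it builds on, using Kuhn's augmenting-path algorithm rather than any particular routine from the paper; the function name and graph encoding are illustrative assumptions.

```python
def max_bipartite_matching(adj, n_left, n_right):
    """Kuhn's augmenting-path algorithm for bipartite maximum matching.

    adj[u] lists the right-side vertices adjacent to left vertex u.
    Returns (matching size, match_right), where match_right[v] is the left
    vertex matched to right vertex v, or -1 if v is unmatched.
    """
    match_right = [-1] * n_right

    def try_augment(u, visited):
        for v in adj[u]:
            if v in visited:
                continue
            visited.add(v)
            # v is free, or its current partner can be re-matched elsewhere
            if match_right[v] == -1 or try_augment(match_right[v], visited):
                match_right[v] = u
                return True
        return False

    matching_size = 0
    for u in range(n_left):
        if try_augment(u, set()):
            matching_size += 1
    return matching_size, match_right

# Example: a bipartite graph with 3 left and 3 right vertices.
adj = {0: [0, 1], 1: [0], 2: [1, 2]}
print(max_bipartite_matching(adj, 3, 3))   # (3, [1, 0, 2])
```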
A preliminary version of this work appeared in the Proceedings of the 16th Annual International Symposium on Algorithms and Computation, X. Deng and D. Du, eds., Lecture Notes in Computer Science, vol. 3827, pp. 288–297, Springer, December 2005. This work has been partially supported by the IST Program of the European Union under contract 001907, and by research funds at University of Cyprus.

16.
In this paper we classify several algorithmic problems in group theory in the complexity classes PZK and SZK (problems with perfect/statistical zero-knowledge proofs respectively). Prior to this, these problems were known to be in . As , we have a tighter upper bound for these problems. Specifically:
•  We show that the permutation group problems Coset Intersection, Double Coset Membership, Group Conjugacy are in PZK. Further, the complements of these problems also have perfect zero knowledge proofs (in the liberal sense). We also show that permutation group isomorphism for solvable groups is in PZK. As an ingredient of this protocol, we design a randomized algorithm for sampling short presentations of solvable permutation groups.
•  We show that the complement of all the above problems have concurrent zero knowledge proofs.
•  We prove that the above problems for black-box groups are in SZK.
•  Finally, we also show that some of the problems have SZK protocols with efficient provers in the sense of Micciancio and Vadhan (Lecture Notes in Comput. Sci. 2729, 282–298, 2003).

17.
The article presents a model of a device for reading (visualizing) hidden magnetic information from holograms combined with a magneto-optical layer. Methods are proposed for forming magnetic images on protected documents and for reading them by magneto-optical means. Using the meridional magneto-optical Kerr effect, the reading head makes the “blinking effect” of the hologram with the hidden magnetic layer directly observable. A mathematical analysis of the magneto-optical Kerr and Faraday effects was carried out. The hidden magnetic image is based on:
–  a hard magnetic layer based on Tb-Fe with perpendicular anisotropy,
–  soft magnetic layers based on permalloy.
Advantages of the device:
–  non-contact reading of the magnetic information,
–  the difficulty of reproducing the magnetic-image formation technology.
The optical scheme of the device contains a light source, a polarizer, an analyzer, the hologram with magneto-optical layers, and a permanent magnet. The hologram is placed between the polarizer and the analyzer. The text was submitted by the authors in English.
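For reference, a standard textbook relation (not a formula quoted from the paper): the Faraday effect mentioned above rotates the polarization plane by an angle proportional to the magnetic field and the path length through the layer.

```latex
% Faraday rotation angle: V is the Verdet constant of the medium,
% B the magnetic flux density along the propagation direction,
% d the thickness of the magneto-optical layer.
\beta = V \, B \, d
```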

18.
19.
Alan Bundy. AI & Society, 2007, 21(4): 659–668
This paper is a modified version of my acceptance lecture for the 1986 SPL-Insight Award. It turned into something of a personal credo, describing my view of:
–  the nature of AI,
–  the potential social benefit of applied AI,
–  the importance of basic AI research,
–  the role of logic and the methodology of rational construction,
–  the interplay of applied and basic AI research, and
–  the importance of funding basic AI.
These points are knitted together by an analogy between AI and structural engineering: in particular, between building expert systems and building bridges.

20.
Two experiments examined the judged quality of videoconferencing as a function of three measures of IP network performance: bandwidth, latency, and packet loss. The experiments were realized in a laboratory using a network emulator and a commercial videoconferencing system. Experiment 1 used a fractional factorial design and all three parameters:
•  Bandwidth: 128 kbits/s, 384 kbits/s, 768 kbits/s
•  Latency: 0, 150, 300 ms one-way
•  Bursty packet loss: 0, 2%, 4% (using the “Gilbert–Elliott” method).
Experiment 2 was designed (a) to use random packet loss rather than bursty packet loss, and (b) so that a statistical interaction between bandwidth and packet loss could be detected, if present. Bandwidth levels were the same as in Experiment 1, but packet loss was set to 0, 1 and 2%. The experimental design was full factorial. In both experiments, pairs of non-expert judges held five-minute videoconferences for each combination of parameters, then rated the quality of system performance immediately after each videoconference. In both experiments statistical analysis showed that packet loss was the most important network performance parameter in predicting subjective quality of videoconferencing. These results agree with results on VoIP quality. Bandwidth and latency also affected the judges’ ratings, but to a smaller extent. In Experiment 2, an interaction between packet loss and bandwidth was detected: At lower bandwidths and greater packet loss, the subjective ratings were not as low as would be expected, contrary to the idea that quality would degrade catastrophically when both bandwidth and packet loss were simultaneously unfavorable.
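To make the bursty-loss condition concrete, here is a hedged Python sketch of the two-state Gilbert–Elliott loss model referred to above; the transition probabilities are illustrative values chosen to give roughly the 2% average loss of one Experiment 1 condition, not parameters reported by the authors.

```python
import random

def gilbert_elliott_losses(n_packets, p_good_to_bad, p_bad_to_good,
                           loss_in_bad=1.0, loss_in_good=0.0):
    """Simulate packet loss with the two-state Gilbert-Elliott model.

    The channel alternates between a 'good' state (little or no loss) and a
    'bad' state (heavy loss), producing bursty loss patterns.
    Returns a list of booleans: True means the packet was lost.
    """
    state_bad = False
    losses = []
    for _ in range(n_packets):
        if state_bad:
            if random.random() < p_bad_to_good:
                state_bad = False
        else:
            if random.random() < p_good_to_bad:
                state_bad = True
        loss_prob = loss_in_bad if state_bad else loss_in_good
        losses.append(random.random() < loss_prob)
    return losses

# Illustrative parameters: short bad bursts, roughly 2% average loss.
trace = gilbert_elliott_losses(100_000, p_good_to_bad=0.01, p_bad_to_good=0.5)
print(f"average loss: {sum(trace) / len(trace):.3%}")
```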
