Similar Documents
20 similar documents found (search time: 0 ms)
1.
Results of Schlipf (J Comput Syst Sci 51:64–86, 1995) and Fitting (Theor Comput Sci 278:25–51, 2001) show that the well-founded semantics of a finite predicate logic program can be quite complex. In this paper, we show that there is a close connection between the construction of the perfect kernel of a $\Pi^0_1$ class via the iteration of the Cantor–Bendixson derivative through the ordinals and the construction of the well-founded semantics for finite predicate logic programs via Van Gelder's alternating fixpoint construction. This connection allows us to transfer known complexity results for the perfect kernel of $\Pi^0_1$ classes to give new complexity results for various questions about the well-founded semantics ${\mathit{wfs}}(P)$ of a finite predicate logic program P.
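To make the alternating fixpoint concrete, here is a minimal sketch for the propositional case (the paper itself treats finite predicate logic programs; the rule encoding and all names below are illustrative, not the authors' code). Gamma(I) computes the least model of the reduct of the program with respect to an assumed set I of true atoms; iterating Gamma twice from the empty set yields the well-founded true atoms, and one further application of Gamma bounds the possibly-true atoms.

```python
def gamma(rules, assumed):
    """Least model of the reduct of `rules` w.r.t. `assumed`: a rule
    (head, pos, neg) fires when every atom in `pos` is already in the
    model and no atom in `neg` belongs to `assumed`."""
    model = set()
    changed = True
    while changed:
        changed = False
        for head, pos, neg in rules:
            if head not in model and pos <= model and not (neg & assumed):
                model.add(head)
                changed = True
    return model

def well_founded(rules):
    """Van Gelder-style alternating fixpoint for propositional rules.
    Returns (true atoms, false atoms, undefined atoms)."""
    atoms = {h for h, _, _ in rules}
    for _, pos, neg in rules:
        atoms |= pos | neg
    true_set = set()
    while True:  # least fixpoint of Gamma∘Gamma, starting from the empty set
        nxt = gamma(rules, gamma(rules, true_set))
        if nxt == true_set:
            break
        true_set = nxt
    possibly_true = gamma(rules, true_set)  # the gfp side of the alternation
    return true_set, atoms - possibly_true, possibly_true - true_set
```

For example, for the program {p :- not q. q :- not p. s.} the sketch leaves p and q undefined and makes s true, matching the three-valued well-founded model.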

2.
It is shown that the decision problem for the temporal logic with the strict until operator over general linear time is PSPACE-complete. This shows that it is no harder to reason with arbitrary linear orderings than with discrete linear time temporal logics. New techniques are used to give a PSPACE procedure for the logic.

3.
In the paper, a “truly concurrent” and nondeterministic semantics is defined in terms of branching processes of discrete-time Petri nets (DTPNs). These nets may involve an infinite number of transitions and places, an infinite number of tokens in places, and (maximal) steps of concurrent transitions, which allows us to consider this class of DTPNs to be the most powerful class of Petri nets. It is proved that the unfolding (maximal branching process) of a DTPN is the greatest element of a complete lattice constructed on branching processes of DTPNs with step semantics. Moreover, it is shown that this result also holds in the case of maximal transition steps if additional restrictions are imposed on the structure and behavior of the DTPN.
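A small token-game sketch can make the notion of a "step" (a multiset of transitions fired simultaneously) concrete; the net encoding below is an illustrative assumption, not taken from the paper.

```python
from collections import Counter

def enabled(marking, pre, step):
    """A step (multiset of transitions, transition -> multiplicity) is
    enabled when the marking covers the combined preconditions of all
    its transition occurrences."""
    need = Counter()
    for t, k in step.items():
        for p, w in pre[t].items():
            need[p] += k * w
    return all(marking.get(p, 0) >= n for p, n in need.items())

def fire(marking, pre, post, step):
    """Fire an enabled step: consume all preconditions, then produce
    all postconditions, for every transition occurrence in the step."""
    m = Counter(marking)
    for t, k in step.items():
        for p, w in pre[t].items():
            m[p] -= k * w
        for p, w in post[t].items():
            m[p] += k * w
    return +m  # drop places with zero tokens
```

Note that this only checks enabledness of a given step; the maximality condition of maximal-step semantics (no further transition could be added to the step) is not modeled here.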

4.
Modern computerized stock trading systems (mechanical trading systems) are based on the simulation of the decision-making process and generate advice for traders to buy or sell stocks or other financial instruments by taking into account the price history, technical analysis indicators, accepted rules of trading and so on. Two stock trading simulation systems based on trading rules defined using fuzzy logic are developed and compared. The first is based on the so-called “Logic-Motivated Fuzzy Logic Operators” (LMFL) approach and aims to avoid certain disadvantages of the classical Mamdani’s method, which was developed for use in fuzzy logic controllers and not for solving the decision-making problems of stock trading. The LMFL approach is based on a modified mathematical representation of the t-norm and Yager’s implication rule. The second trading system combines the tools of fuzzy logic and Dempster–Shafer Theory (DST) to represent the features of the decision-making process more transparently. The fuzzy representation of trading rules based on the theory of technical analysis is used in these expert systems. Since the theory of technical analysis is based on the indicators used by experts to predict stock price movements, the method maps these indicators into new inputs that can be used in a fuzzy logic system. The only inputs required to calculate these indicators are past sequences (histories) of stock prices. The method relies on fuzzy logic to choose an appropriate decision when certain price movements or certain price formations occur. An optimization procedure based on historical (training) data is used, as it significantly improves the performance of such expert systems. The efficiency of the developed expert systems is measured by comparing their outputs against actual stock price movements.
The results obtained using real NYSE data allow us to say that the developed expert system based on the synthesis of fuzzy logic and DST provides better results and is more reliable. Moreover, such a conjunction of fuzzy logic, DST and technical analysis makes it possible to make a profit even when trading against a dominating trend.
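As a rough illustration of how a fuzzy trading rule of this kind can be encoded (the actual systems in the paper use LMFL operators and Dempster–Shafer combination; the triangular membership function and the 0–100 indicator scale below are hypothetical examples):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def buy_degree(indicator):
    """Degree to which the rule 'IF indicator is LOW THEN buy' fires,
    for an oscillator-style indicator on an assumed 0-100 scale."""
    return tri(indicator, 0, 20, 40)  # 'LOW' fuzzy set (illustrative)
```

A full system would combine many such rule activations (e.g., over several indicators) with fuzzy operators or DST belief combination before producing a buy/sell decision.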

5.
This paper describes a partial evaluation system specifically designed to be used as an automatic compilation tool for metaprograms in a KBMS (EPSILON) based on Prolog. EPSILON’s main underlying concepts are the extension of Prolog with theories (“multiple worlds”) and the use of metaprogramming as the basic technique for defining new inference engines and tools. Our partial evaluator is oriented towards theory and metainterpreter specialization. Being designed as an automatic compiler, it does not require declarations from the user to control the unfolding process. It handles full Prolog and also provides an elegant solution to the problem of partially evaluating incomplete and self-modifying programs, by exploiting the multiple-worlds feature added to Prolog. The EPSILON partial evaluation system has turned out to be a very useful and powerful tool for combining the low cost and flexibility of metaprogramming with the performance requirements of a practical knowledge-based system.

6.
The positive and negative effects of social media in crises are currently receiving an increased amount of scholarly attention. This study focuses on Twitter users in the context of a crisis in the Netherlands on January 29, 2015. After having made a bomb threat, an armed man managed to get access to the national news broadcasting station around 8 pm, where he demanded airplay to share “an important message” with Dutch citizens. Three weeks after the terrorist attack on Charlie Hebdo in Paris, approximately 1.5 million viewers were anxious that a similar attack was taking place in the television studio. The crisis, also followed by social media users, reached a climax when armed policemen arrested the man, which was later shown on national TV. We analyzed 58,931 tweets, posted in the six hours after the incident. By examining shared facts and rumors during the gunman crisis, we identified an “echo-effect”: the dissemination of older tweets continued after the posting of new facts by the same source. Moreover, we found that two rumors were based on misinterpreted humor in Twitter messages. The study adds insight into the self-correcting mechanism of social media communities when verifying and dispelling online rumors during crises.

7.
Work practices usually differ fundamentally from the way organizations describe their operations in manuals, training programs, etc. This paper focuses on the way certain work practices are supported at Xerox, and the conclusions of this effort are related to complementary investigations of learning and innovation. Here we propose that the combination of work, learning and innovation should be reconsidered within the framework of informal “communities-of-practice.” Information technology tends to be used to reinforce the old work and study paradigms. This paper suggests a different use of IT, one especially well suited to intranets and the Internet, with the aim of supporting informal structures rather than formal procedures. The case of Xerox Corporation is used as an example.

8.
The success of cities increasingly relies on their capacity to capitalize on their knowledge base, but also on their potential to anchor external knowledge and the strategies of knowledge-based firms. In this paper we analyze how a “born global” start-up firm is linked to different types of places, and how it explores and exploits different territorial innovation potentials. Our case company, Living PlanIT, develops, tests and sells smart city software that processes real-time information collected through sensors embedded in a city’s buildings and infrastructure, targeting energy savings and manifold efficiency gains. The paper illustrates how interaction with different places and knowledge-based cities provides unique resources for technology development, search, experimentation, market formation and societal legitimation. Beyond focusing on a place’s fixed knowledge assets, the paper empirically assesses the innovation functions of different types of knowledge cities and temporary “non-places” such as international high-level events.

9.
10.
The presence of unknown mutual coupling between array elements is known to significantly degrade the performance of most high-resolution direction-of-arrival (DOA) estimation algorithms. In this paper, a robust subspace-based DOA estimation and array auto-calibration algorithm is proposed for the uniform linear array (ULA) when array mutual coupling is present. Based on a banded symmetric Toeplitz matrix model for the mutual coupling of the ULA, the algorithm provides an accurate and high-resolution DOA estimate without any knowledge of the array mutual couplings. Moreover, a favorable estimate of the mutual coupling matrix can also be achieved simultaneously for array auto-calibration. The algorithm is realized via only a one-dimensional search or polynomial rooting, with no multidimensional nonlinear search or convergence burden involved. The problems of parameter ambiguity and the statistical consistency and efficiency of the new estimator are also analyzed. Monte Carlo simulation results are also provided to demonstrate the …
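The banded symmetric Toeplitz coupling model referred to above can be sketched as follows; the coupling coefficients and the half-wavelength element spacing are illustrative assumptions, not values from the paper.

```python
import numpy as np

def coupling_matrix(M, c):
    """Banded symmetric Toeplitz mutual-coupling matrix for an M-element
    ULA: C[i, j] = c[|i - j|] if |i - j| < len(c), else 0 (coupling is
    assumed negligible beyond len(c) - 1 element spacings)."""
    C = np.zeros((M, M), dtype=complex)
    for i in range(M):
        for j in range(M):
            if abs(i - j) < len(c):
                C[i, j] = c[abs(i - j)]
    return C

def steering_vector(M, theta, d_over_lambda=0.5):
    """Ideal ULA steering vector for DOA theta (radians from broadside),
    with element spacing d expressed as a fraction of the wavelength."""
    n = np.arange(M)
    return np.exp(2j * np.pi * d_over_lambda * n * np.sin(theta))
```

The perturbed (actual) array response is then C @ steering_vector(M, theta); calibration algorithms of the kind described estimate the few distinct entries of c jointly with the DOAs.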

11.
Recently, the building industry has shown great interest in building information modeling (BIM) due to the many benefits BIM provides. During the last decade, the government of Dubai has been working toward a BIM environment in which any new building project of ten floors or more must be submitted in BIM format. The need for BIM creates more job opportunities for technicians with BIM skills, which inflates the need to give related field engineers a BIM background before graduation. This report analyzes the implementation of BIM in a construction course that currently uses AutoCAD, by splitting the laboratory skills. Through this implementation, we test whether adding BIM based on Revit to an existing course improves the students’ motivation, performance and satisfaction. The authors use the results of the Architectural Engineering (AE) students at the United Arab Emirates University (UAEU) in construction courses to study performance before and after implementing BIM. The authors found some issues that can decrease the students’ performance, with 97% certainty. The students’ motivation and satisfaction were tested using a pre-test/post-test quasi-experiment design. The tests showed that BIM based on Revit reduces student performance time while increasing student motivation and satisfaction. The causes behind the last finding were analyzed through interviews with the students concerned.

12.
How is fuzzy logic usually formalized? There are many seemingly reasonable requirements that such a logic should satisfy: e.g., since A & B and B & A mean the same, the corresponding and-operation should be commutative. Similarly, since A & A means the same as A, we should expect the and-operation to satisfy this property (idempotence), etc. It turns out to be impossible to satisfy all these seemingly natural requirements, so usually some requirements are picked as absolutely true (like commutativity or associativity), and others are ignored if they contradict the picked ones. This idea leads to a neat mathematical theory, but the analysis of real-life expert reasoning shows that all the requirements are only approximately satisfied. Hence, we should require all of these requirements to be satisfied only to some extent. In this paper, we show preliminary results of analyzing such operations. In particular, we show that non-associative operations explain the empirical 7 ± 2 law in psychology, according to which a person can normally distinguish between no more than 7 ± 2 classes.
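A small sketch contrasting standard and-operations (t-norms) with a deliberately non-associative alternative; the convex-mix operation and its weight are illustrative examples, not the operations analyzed in the paper.

```python
def t_min(a, b):
    # Idempotent and associative: t_min(a, a) == a.
    return min(a, b)

def t_prod(a, b):
    # Associative but not idempotent: a * a != a for 0 < a < 1.
    return a * b

def and_mixed(a, b, w=0.5):
    """A convex mix of min and product (w is a hypothetical weight):
    commutative, but in general neither idempotent nor associative."""
    return w * min(a, b) + (1 - w) * a * b
```

Evaluating and_mixed(and_mixed(0.9, 0.8), 0.1) and and_mixed(0.9, and_mixed(0.8, 0.1)) gives different values, so grouping matters: how a person aggregates items changes the result, which is the kind of effect the paper connects to the 7 ± 2 law.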

13.
OBJECTIVE: Participants performed a tracking task and system monitoring task while aided by diagnostic automation. The goal of the study was to examine operator compliance and reliance as affected by automation failures and to clarify claims regarding independence of these two constructs. BACKGROUND: Background data revealed a trend toward nonindependence of the compliance-reliance constructs. METHOD: Thirty-two undergraduate students performed the simulation that presented the visual display while dependent measures were collected. RESULTS: False alarm-prone automation hurt overall performance more than miss-prone automation. False alarm-prone automation also clearly affected both operator compliance and reliance, whereas miss-prone automation appeared to affect only operator reliance. CONCLUSION: Compliance and reliance do not appear to be entirely independent of each other. APPLICATION: False alarms appear to be more damaging to overall performance than misses, and designers must take the compliance-reliance constructs into consideration.

14.
In their joint paper “The Replication of the Hard Problem of Consciousness in AI and Bio-AI: An Early Conceptual Framework” (2008), Nicholas and Piotr Boltuc suggest that machines could be equipped with phenomenal consciousness, that is, subjective consciousness that satisfies Chalmers’s hard problem (we abbreviate the hard problem of consciousness as “H-consciousness”). The claim is that if we knew the inner workings of phenomenal consciousness and could understand its precise operation, we could instantiate such consciousness in a machine. This claim, called the extra-strong AI thesis, is important because, if true, it would demystify the privileged-access problem of first-person consciousness and cast it as an empirical problem of science rather than a fundamental question of philosophy. A core assumption of the extra-strong AI thesis is that there is no logical argument that precludes the implementation of H-consciousness in an organic or inorganic machine, provided we understand its algorithm. Another way of framing this conclusion is that there is nothing special about H-consciousness as compared to any other process. That is, in the same way that we do not preclude a machine from implementing photosynthesis, we also do not preclude a machine from implementing H-consciousness. While one may be more difficult in practice, each is a problem of science and engineering, and no longer a philosophical question. I propose that Boltuc’s conclusion, while plausible and convincing, comes at a very high price: the argument given for his conclusion does not exclude any conceivable process from machine implementation. In short, if we make some assumptions about the equivalence of a rough notion of algorithm and then tie this to human understanding, all logical preconditions vanish and the argument grants that any process can be implemented in a machine.
The purpose of this paper is to comment on the argument for his conclusion and to offer additional properties of H-consciousness that can be used to make the conclusion falsifiable through scientific investigation rather than relying on the limits of human understanding.

15.
With the development of the Internet, more people are creating personal websites and blogs, and they expect to be able to use their own name or surname in the domain name. The aim of this article is to identify and analyse the existing problems in the use and protection of personal names in the domain space, as well as the development of legal approaches that can be used to resolve disputes that arise. In the course of the analysis, the authors conclude that the existing legal regulations at both the international and national levels are not sufficient to protect personal names effectively. Current approaches in this area are scattered and not systematic. The conclusions and suggestions made in this paper can be used to form a specialised procedure to resolve domain name disputes relating to personal names, as well as in the course of organising and updating the national legislations of the states of Eastern Europe.

16.
Following Derrida (1995), our article explores the relationship between archival practices and archival documents on the assumption that “archivization produces as much as it records the event” (Derrida 1995, 17). On this approach, archival practices are understood as non-innocent practices that, in the act of “preservation,” help make specific “memories” at the expense of others (Barad 2007; Derrida 1995; Foucault 1972). We take up this issue in relation to the curation of social science quantitative research data and argue that the ontological identity of data is constituted through historically- and culturally-specific data curation practices including data cleaning, data anonymization, and metadata preparation.

17.
In a key article (Walsham & Sahay, 2005) outlining research on information systems in developing countries and suggesting potential areas for future research, a notable omission was the issue of gender and gender relations. In this article, we draw on the substantial gender and development literature to demonstrate the centrality of gender to our understanding of information systems (IS) in developing countries. In particular, we consider the relationship among gender, information and communication technologies (ICTs), and globalization to illustrate how changes in the global economy both impact on and are influenced by changing gender identities and roles. © 2008 Wiley Periodicals, Inc.

18.
Sharon Daniel 《AI & Society》2000,14(2):196-213
This paper will discuss interactive on-line artworks modelled on cellular automata that employ various types of agents, both algorithmic and human, to assist in the evolution of their databases. These works constitute what will here be referred to as Collaborative Systems: systems that evolve through the practice of inter-authorship.

19.

This article examines global and indigenous knowledge sharing with a focus on electronic information exchange in Nepal's development sector. Drawing on lessons from experience based on two local examples, a framework is presented of a strategy for realising the potential of Information and Communications Technology (ICT) in countries where knowledge sharing and access is constrained in a variety of ways.

The “iCAPACITY framework” outlined for the South Asian context integrates the inter‐dependent themes of Content, Access, and Partnership, highlighting the critical components that require consideration when building the capacity for ICT usage and knowledge sharing in a developing country context. Practical initial steps are put forward that recognise the primary concern for holistically addressing economic, social and environmental issues, with the overall priority of alleviating poverty through broad‐based participation.

The paper concludes that developing countries such as Nepal currently occupy what may be metaphorically referred to as “the thin air of cyberspace”, where the essential knowledge needing to be shared locally or globally is not yet widely available or accessible. In this context, particular care has to be taken in formulating localised strategies and models that can improve the quality of this “air” and lead to a situation where development efforts can truly be enhanced by the IT revolution.

20.
In this paper, an attempt is made on the basis of the works of G. Ling (USA) and E. Bauer (USSR) to describe and analyze from a unified theoretical and experimental position the neurocomputer model of the “living state” of matter, the physical nature and the mechanisms of physiological resting-state and activity. These are considered as a precondition of understanding anticipatory phenomena in biosystems.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号