Similar Documents
20 similar documents found (search time: 31 ms).
1.
Aspect-oriented programming (AOP) is a novel programming paradigm that aims at modularizing complex software. It embraces several mechanisms, including (1) pointcuts and advice as well as (2) refinements and collaborations. Although all these mechanisms deal with crosscutting concerns, i.e., a class of design and implementation problems that challenge traditional programming paradigms, they do so in different ways. In this article we explore their relationship and their impact on modularity, which is an important prerequisite for reliable and maintainable software. Our exploration helps researchers and practitioners understand the differences and shows which mechanism is best suited to which problem.
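To make the pointcut-and-advice mechanism mentioned above concrete, here is a minimal Python sketch using decorators. It is an illustrative analogue of AspectJ-style advice rather than code from the article, and all names (log_advice, Account) are hypothetical.

```python
# Minimal sketch of the pointcut-and-advice idea using Python decorators.
# This is an illustrative analogue, not the AspectJ-style syntax the AOP
# literature usually assumes; names (log_advice, Account) are hypothetical.

import functools

def log_advice(func):
    """'Advice' that runs before and after every matched join point."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"before {func.__name__}")
        result = func(*args, **kwargs)
        print(f"after {func.__name__}")
        return result
    return wrapper

class Account:
    def __init__(self, balance=0):
        self.balance = balance

    @log_advice            # the 'pointcut' is expressed by decorating the method
    def deposit(self, amount):
        self.balance += amount
        return self.balance

Account(10).deposit(5)     # prints the before/after advice around the call
```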

2.
Liu, Simon; Cheng, Bruce. IT Professional, 2009, 11(3): 14-21
Enterprises rely extensively on computerized information systems and electronic data in cyberspace to perform their daily activities and business. Today, virtually all public and private organizations connect to and live in cyberspace. As computers, information systems, and networking have become more ubiquitous, cybersecurity has become more critical for the continuity of business operations. To better understand cybersecurity, this article discusses the four "W's" of cyberattacks: it starts with an overview of the causes of cybersecurity problems, analyzes the associated challenges, outlines the cyberattacker profile, discusses cyberattack patterns, and finally summarizes recent cyberattack trends.

3.
4.
Coolstreaming: Design, Theory, and Practice   (Cited: 6; self-citations: 0; citations by others: 6)
Peer-to-peer (P2P) technology has found much success in applications like file distribution and VoIP, yet its adoption in live video streaming remains an elusive goal. Our recent success with the Coolstreaming system brings promise in this direction; however, it also reveals that many practical engineering problems remain in real live streaming systems over the Internet. Our focus in this paper is on a non-optimal, real working system, in which we illustrate a set of existing practical problems and how they can be handled. We believe this is essential for a basic understanding of P2P streaming systems. This paper uses a set of real traces and attempts to develop a theoretical basis to demonstrate that random peer-partnership selection with a hybrid pull-push scheme has the potential to scale. Specifically, first, we describe the fundamental system design tradeoffs and key changes in the design of the Coolstreaming system, including substreaming, buffer management, scheduling, and the adoption of a hybrid pull-push mechanism over the original pull-based content delivery approach; second, we examine the overlay topology and its convergence; third, using a combination of real traces and analysis, we quantitatively provide insights on how the buffering technique resolves the problems associated with dynamics and heterogeneity; fourth, we show how substream and path diversity can help alleviate the impact of congestion and churn; fifth, we discuss the system's scalability and limitations.
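As a rough illustration of the pull scheduling over buffer maps that the abstract refers to, the sketch below assigns each locally missing segment to a partner that advertises it. The greedy policy and all names are hypothetical simplifications; the deployed Coolstreaming scheduler adds substreams, push, and much richer heuristics.

```python
# Hedged sketch of pull-based scheduling over buffer maps, the mechanism the
# Coolstreaming abstract refers to. Names and the greedy policy are hypothetical.

def pick_requests(my_buffer, partner_buffers, window):
    """For each segment missing locally, request it from the partner that is
    advertising it and currently has the fewest assigned requests."""
    assignments = {p: [] for p in partner_buffers}
    for seg in window:
        if seg in my_buffer:
            continue
        candidates = [p for p, segs in partner_buffers.items() if seg in segs]
        if not candidates:
            continue                      # nobody has it yet; try next round
        best = min(candidates, key=lambda p: len(assignments[p]))
        assignments[best].append(seg)
    return assignments

# Example: segments 3 and 5 are missing; they get spread across two partners.
print(pick_requests({1, 2, 4},
                    {"peerA": {3, 4, 5}, "peerB": {3, 5}},
                    window=range(1, 7)))
```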

5.
In this chapter, we discuss the origin and development of holography, and what lies beyond it. We show that there are basically two types of holography: Leith's transmission type and Denisyuk's reflection type. Nevertheless, the successful development of holography is due to the discovery of the laser; without a strong coherent source, holography might not have happened at all. Although holography was originally developed to produce true three-dimensional imaging, its applications now extend far beyond that legacy.

6.
Research in Artificial Intelligence (AI) and the Law has maintained an emphasis on knowledge representation and formal reasoning during a period when statistical, data-driven approaches have ascended to dominance within AI as a whole. Electronic discovery is a legal application area, with substantial commercial and research interest, where there are compelling arguments in favor of both empirical and knowledge-based approaches. We discuss the cases for both perspectives, as well as the opportunities for beneficial synergies.

7.
Directory services facilitate access to information organized under a variety of frameworks and applications. The Lightweight Directory Access Protocol (LDAP) is a promising technology that provides access to directory information using a data structure similar to that of the X.500 protocol. IBM Tivoli, Novell, Sun, Oracle, Microsoft, and many other vendors feature LDAP-based implementations. The technology's increasing popularity is due both to its flexibility and to its compatibility with existing applications.
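A minimal sketch of an LDAP bind and subtree search, assuming the third-party Python ldap3 package (not mentioned in the article); the server address, bind DN, and base DN are hypothetical placeholders.

```python
# Hedged sketch of an LDAP search using the third-party ldap3 package.
# The server address, bind DN, and base DN are hypothetical placeholders.

from ldap3 import Server, Connection, ALL

server = Server("ldap://ldap.example.com", get_info=ALL)
conn = Connection(server,
                  user="cn=reader,dc=example,dc=com",
                  password="secret")

if conn.bind():
    # Entries live in the hierarchical directory tree inherited from X.500;
    # the search scopes a subtree under the base DN and filters by objectClass.
    conn.search("dc=example,dc=com",
                "(objectClass=person)",
                attributes=["cn", "mail"])
    for entry in conn.entries:
        print(entry.cn, entry.mail)
    conn.unbind()
```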

8.
9.
Similar to TCP and UDP, the stream control transmission protocol (SCTP) is a transport protocol providing end-to-end communication. SCTP was originally designed within the IETF Signaling Transport (SIGTRAN) working group to address TCP's shortcomings relating to telephony signaling over IP networks. SCTP has since evolved into a general-purpose IETF transport protocol with kernel implementations on various platforms. Similar to TCP, SCTP provides a connection-oriented, reliable, full- duplex, congestion and flow-controlled layer 4 channel. Unlike both TCP and UDP, however, SCTP offers new delivery options that better match diverse applications' needs. Here, we introduce SCTP, discuss its innovative services, and outline ongoing SCTP-related research and standardization activities.  相似文献   
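A minimal sketch of opening an SCTP socket from Python's standard socket module. This assumes a kernel with SCTP support (socket.IPPROTO_SCTP is unavailable on many platforms) and only shows the TCP-like one-to-one socket style, not SCTP's multi-streaming delivery options.

```python
# Hedged sketch: opening an SCTP one-to-one (TCP-style) socket with Python's
# standard socket module. Assumes a kernel with SCTP support (e.g., Linux with
# the sctp module loaded); socket.IPPROTO_SCTP is absent on many platforms.

import socket

# SOCK_STREAM + IPPROTO_SCTP selects SCTP's one-to-one socket style, which
# behaves much like TCP but rides on an SCTP association underneath.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_SCTP)
sock.bind(("127.0.0.1", 9999))
sock.listen(1)
print("listening for SCTP associations on port 9999")
sock.close()
```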

10.
WWW: past, present, and future   (Cited: 1; self-citations: 0; citations by others: 1)
Berners-Lee, T. Computer, 1996, 29(10): 69-77
The World Wide Web is simply defined as the universe of global network-accessible information. It is an abstract space within which people can interact, and it is chiefly populated by interlinked pages of text, images, and animations, with occasional sounds, videos, and three-dimensional worlds. The Web marks the end of an era of frustrating and debilitating incompatibility between computer systems. It has created an explosion of accessibility, with many potential social and economic impacts. The Web was designed to be a space within which people could work on a project. This was a powerful concept, in that: people who build a hypertext document of their shared understanding can refer to it at all times; people who join a project team can have access to a history of the team's activities, decisions, and so on; the work of people who leave a team can be captured for future reference; and a team's operations, if placed on the Web, can be machine-analyzed in a way that could not be done otherwise. The Web was originally supposed to be a personal information system and a tool for groups of all sizes, from a team of two to the entire world. People have rapidly developed new features for the Web because of its tremendous commercial potential. This has made the maintenance of global Web interoperability a continuous task. It has also created a number of areas into which research must continue.

11.
Connectionism: past, present, and future   (Cited: 1; self-citations: 1; citations by others: 0)
Research efforts to study computation and cognitive modeling on neurally-inspired mechanisms have come to be called Connectionism. Rather than being brand new, it is actually the rebirth of a research programme which thrived from the 40s through the 60s and then was severely retrenched in the 70s. Connectionism is often posed as a paradigmatic competitor to the Symbolic Processing tradition of Artificial Intelligence (Dreyfus & Dreyfus, 1988), and, indeed, the counterpoint in the timing of their intellectual and commercial fortunes may lead one to believe that research in cognition is merely a zero-sum game. This paper surveys the history of the field, often in relation to AI, discusses its current successes and failures, and makes some predictions for where it might lead in the future.

12.
This article discusses the computational structure of the most effective methods for factoring integers and the computer architectures—existing and in use, proposed, and under construction—that efficiently perform the computations of these various methods. New developments in technology and in the pricing of computers are making it possible to build powerful parallel machines, at relatively low cost, which can substantially outperform standard computers on specific types of computations. The intent of this article is to use factoring, and computers for factoring, to provoke general thought about this matching of computer architectures to algorithms and computations. The author's research at Louisiana State University was supported in part by the National Science Foundation and the National Security Agency under grants NSF DCR 83-115-80 and NSA MDA904-85-H-0006.
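For flavor, the sketch below implements Pollard's rho, a much simpler factoring method than the sieve-based algorithms the article analyzes; it is only meant to show the kind of bulk modular arithmetic such methods repeat, which is what specialized architectures can parallelize.

```python
# Illustrative sketch: Pollard's rho factoring method. Far simpler than the
# quadratic-sieve/number-field-sieve style algorithms the article surveys,
# but it shows the kind of repeated modular arithmetic those methods scale up.

import math
import random

def pollard_rho(n):
    """Return a non-trivial factor of a composite n (may retry internally)."""
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n)
        y, c, d = x, random.randrange(1, n), 1
        while d == 1:
            x = (x * x + c) % n            # pseudo-random walk modulo n
            y = (y * y + c) % n
            y = (y * y + c) % n            # y moves twice as fast (Floyd cycle finding)
            d = math.gcd(abs(x - y), n)
        if d != n:                         # d == n means the walk degenerated; retry
            return d

print(pollard_rho(8051))   # 8051 = 83 * 97; prints one of the two factors
```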

13.
Technologists act as if the "REST vs. RPC" debate is purely technical, but technology choices are never really quite so black and white. This column examines the non-intuitive theories and evidence behind Clayton Christensen's "innovator's dilemma" to explore technology life cycles, discuss what makes different customers choose different technologies, and consider how different types of innovation affect the evolution of integration products.

14.
15.
Space, time, and computation: Trends and problems   (Cited: 1; self-citations: 0; citations by others: 1)
At the International Joint Conference on Artificial Intelligence (IJCAI) in Chambéry, France, the authors organized and ran a Workshop on Spatial and Temporal Reasoning with the purpose of both presenting current research and development in these areas and fostering an interchange of ideas among attendees of differing interests. In particular, discussion was focussed on the interfaces between three separate concerns: spatial reasoning in AI, temporal reasoning in AI, and temporal methods for concurrent systems. The authors reflect on the outcome of the workshop as well as introduce the extended papers selected for this special issue. Research goals for the immediate future are presented.

16.
Contemporary philosophy of technology after the empirical turn has surprisingly little to say on the relation between language and technology. This essay describes this gap, offers a preliminary discussion of how language and technology may be related to show that there is a rich conceptual space to be gained, and begins to explore some ways in which the gap could be bridged by starting from within specific philosophical subfields and traditions. One route starts from philosophy of language (both "analytic" and "continental": Searle and Heidegger) and discusses some potential implications for thinking about technology; another starts from artefact-oriented approaches in philosophy of technology and STS and shows that these approaches might helpfully be extended by theorizing relationships between language and technological artefacts. The essay concludes by suggesting a research agenda, which invites more work on the relation between language and technology.

17.
The design symposium 'Creative Connections' discusses designers' tools in the conceptual phase. Over the past few decades, many considerations that hitherto occurred before or after conceptualizing have become an integrated part of concept development; examples are studies of users and contexts, and expressive new materials. Design tools are also becoming increasingly, almost exclusively, computer-based. But current computer tools lack the fluency, directness, and bodily involvement of traditional paper tools, properties that are essential in the creative activity of conceptualizing. The symposium, and its four attached bazaar papers, deal with new tools being developed, and old tools that are evolving, to help designers cope with this complexity of factors. This paper is part of the 3AD design colloquium Creative Connections.

18.
Even if software developers don't fully understand the faults or know their location in the code, software rejuvenation can help avoid failures in the presence of aging-related bugs. This is good news because reproducing and isolating an aging-related bug can be quite involved, similar to other Mandelbugs. Moreover, monitoring for signs of software aging can even help detect software faults that were missed during the development and testing phases. If, on the other hand, a developer can detect a specific aging-related bug in the code, fixing it and distributing a software update might be worthwhile. In the case of the Patriot missile-defense system, a modified version of the software was indeed prepared and deployed to users. It arrived at Dhahran on 26 February 1991, a day after the fatal incident.
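A minimal sketch of the rejuvenation idea, assuming a Unix-like system: periodically check the process's own resource usage and proactively re-exec before aging-related bloat causes a failure. The threshold and the re-exec policy are illustrative assumptions, not recommendations from the article.

```python
# Hedged sketch of a software-rejuvenation check: restart the process before
# aging-related resource bloat leads to failure. The 500 MB threshold and the
# full process re-exec are illustrative choices. Unix-only (resource module);
# note that ru_maxrss is reported in kilobytes on Linux.

import os
import resource
import sys

RSS_LIMIT_KB = 500 * 1024   # hypothetical rejuvenation threshold

def maybe_rejuvenate():
    usage = resource.getrusage(resource.RUSAGE_SELF)
    if usage.ru_maxrss > RSS_LIMIT_KB:
        # Proactively restart with the same arguments before the aging bug bites.
        os.execv(sys.executable, [sys.executable] + sys.argv)

# Call maybe_rejuvenate() periodically (e.g., between work items) in a long-running service.
maybe_rejuvenate()
```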

19.
Household technology adoption, use, and impacts: Past, present, and future   (Cited: 1; self-citations: 1; citations by others: 0)
Since the 1980s, researchers have been studying the phenomenon associated with technology being diffused to the household. In this paper, three themes in that stream of research, specifically adoption, use, and impacts, are explored. Key studies from prior research within each theme are discussed and directions for future research are offered. The directions for future research range from investigating adoption issues associated with the digital divide to understanding the impacts of new technology and social networking sites on individuals and families. The evolving nature of the technology continues to offer interesting research directions and challenges, with the study of unintended consequences of technology use presenting, perhaps, the greatest opportunities.
Susan A. Brown

20.
Message-logging protocols are an integral part of a popular technique for implementing processes that can recover from crash failures. All message-logging protocols require that, when recovery is complete, there be no orphan processes, which are surviving processes whose states are inconsistent with the recovered state of a crashed process. We give a precise specification of the consistency property "no orphan processes". From this specification, we describe how different existing classes of message-logging protocols (namely optimistic, pessimistic, and a class that we call causal) implement this property. We then propose a set of metrics to evaluate the performance of message-logging protocols, and characterize the protocols that are optimal with respect to these metrics. Finally, starting from a protocol that relies on causal delivery order, we show how to derive optimal causal protocols that tolerate f overlapping failures and recoveries for a parameter f (1 ≤ f ≤ n).
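A minimal sketch of the pessimistic logging discipline mentioned above: every message is forced to stable storage (a plain file here) before it is delivered, so replaying the log on recovery leaves no orphan processes. The class and file names are hypothetical, and real protocols handle nondeterminism, ordering, and distributed recovery far more carefully.

```python
# Hedged sketch of pessimistic message logging: log every message to "stable
# storage" *before* delivery, so a recovering process can replay its log and
# no surviving process becomes an orphan. The file-based log and single-process
# framing are illustrative simplifications.

import json

class PessimisticLogger:
    def __init__(self, log_path):
        self.log_path = log_path

    def deliver(self, message, handler):
        # Log synchronously first (the pessimistic part), then deliver.
        with open(self.log_path, "a") as log:
            log.write(json.dumps(message) + "\n")
            log.flush()
        handler(message)

    def replay(self, handler):
        # On recovery, re-deliver logged messages in their original order.
        with open(self.log_path) as log:
            for line in log:
                handler(json.loads(line))

logger = PessimisticLogger("process42.log")            # hypothetical log file
logger.deliver({"from": "p1", "seq": 7, "payload": "x"}, print)
logger.replay(print)
```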
