Similar Documents
20 similar documents found (search time: 0 ms)
1.
The last few decades have seen a phenomenal increase in the quality, diversity and pervasiveness of computer games. The worldwide computer games market is estimated to be worth around USD 21bn annually, and is predicted to continue to grow rapidly. This paper reviews some of the recent developments in applying computational intelligence (CI) methods to games, points out some of the potential pitfalls, and suggests some fruitful directions for future research.

2.
Computational models of ethical reasoning are in their infancy in the field of artificial intelligence. Ethical reasoning is a particularly challenging area of human behavior for AI scientists and engineers because of its reliance on abstract principles, philosophical theories not easily rendered computational, and deep-seated, even religious, beliefs. A further issue is this endeavor's ethical dimension: Is it even appropriate for scientists to try to imbue computers with ethical-reasoning powers? A look at attempts to build computational models of ethical reasoning illustrates this task's challenges. In particular, the Truth-Teller and SIROCCO programs incorporate AI computational models of ethical reasoning, both of which model the ethical approach known as casuistry. Truth-Teller compares pairs of truth-telling cases; SIROCCO retrieves relevant past cases and principles when presented with a new ethical dilemma. The computational model underlying Truth-Teller could serve as the basis for an intelligent tutor for ethics. This article is part of a special issue on Machine Ethics.

3.
New generation sequencing systems are changing how molecular biology is practiced. The widely promoted $1000 genome will be a reality with attendant changes for healthcare, including personalized medicine. More broadly the genomes of many new organisms with large samplings from populations will be commonplace. What is less appreciated is the explosive demands on computation, both for CPU cycles and storage as well as the need for new computational methods. In this article we will survey some of these develo...

4.
Collective intelligence has been an important research topic in many AI communities. With the big data phenomenon, we face many research problems concerning how to integrate big data with collective intelligence. This special issue has selected 9 high-quality papers covering various research issues.

5.
This article addresses how the functionalities of the cellular machinery of a bacterium might have constrained the genomic arrangement of its genes during evolution and how we can study such problems using computational approaches, taking full advantage of the rapidly increasing pool of the sequenced bacterial genomes, potentially leading to a much improved understanding of why a bacterial genome is organized in the way it is. This article discusses a number of challenging computational problems in elucidat...

6.
Ebert, Christof; Jones, Capers. Computer, 2009, 42(4): 42-52
Due to the complex system context of embedded-software applications, defects can cause life-threatening situations, delays can create huge costs, and insufficient productivity can impact entire economies. Providing better estimates, setting objectives, and identifying critical hot spots in embedded-software engineering requires adequate benchmarking data.

7.
The computer is one of the most complex artifacts ever built. Given its complexity, it can be described from many different points of view. The aim of this paper is to investigate the representational structure and multifunctionality of a particular subset of computers, namely personal devices (PCs, laptops, smartphones, tablets) from a user-centred perspective. The paper also discusses the concept of “cognitive task”, as recently employed in some definitions of cognitive artifacts, and investigates the metaphysical properties of such artifacts. From a representational point of view, the article introduces the concepts of artifactual meta-representation and of semi-transparency, two features that personal devices share with some cognitive and non-cognitive artifacts. Recognising the meta-representational nature of personal devices and of other cognitive artifacts, thus overcoming semi-transparency, is important for the understanding of why different artifacts offer us different cognitive affordances as well as different cognitive advantages. In this sense, it is not simply a theoretical achievement, but has some important practical consequences. In our highly technological world we can use different kinds of computers and artifacts for solving the same tasks, and we need to understand why some artifacts are better suited for some tasks than others. The ultimate characterisation of personal devices that emerges from this work is that of a sort of super-artifact. This special status is given to personal devices because of their distinctive features. They are in fact intrinsically multifunctional and meta-representational artifacts, with extremely variable structures. As super-artifacts, personal devices are characterised by macro-functionality and can be easily used as both cognitive artifacts and tools for other functions, depending on the kind of representations they instantiate.

8.
Reasoning about facts and reasoning about arguments regarding facts are distinct activities, and automated reasoning systems should be able to treat them accordingly. In this work, we discuss one precise sense in which this distinction can be envisaged and suggest the use of Annotated Logics to characterise it.
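The fact/argument distinction this abstract describes can be pictured with a minimal annotated-knowledge-base sketch. This is an illustration of the general idea behind annotated logics, not the paper's actual formalism: the `fact`/`argued` annotation lattice and the `KB` class are invented for this example. Each atom carries an annotation, so the same proposition can be recorded as merely argued-for or as an established fact, and the two are never conflated.

```python
# Illustrative sketch (not the paper's system): atoms carry annotations
# drawn from a small lattice, ordered unknown < argued < fact.
from dataclasses import dataclass, field

ORDER = {"unknown": 0, "argued": 1, "fact": 2}

def join(a: str, b: str) -> str:
    """Least upper bound in the annotation lattice: evidence only accumulates."""
    return a if ORDER[a] >= ORDER[b] else b

@dataclass
class KB:
    db: dict = field(default_factory=dict)

    def tell(self, atom: str, label: str) -> None:
        # Combine the new annotation with whatever is already known.
        self.db[atom] = join(self.db.get(atom, "unknown"), label)

    def ask(self, atom: str) -> str:
        return self.db.get(atom, "unknown")

kb = KB()
kb.tell("guilty(smith)", "argued")   # an argument supports the claim
print(kb.ask("guilty(smith)"))       # still only "argued", not a fact
kb.tell("guilty(smith)", "fact")     # direct evidence arrives
print(kb.ask("guilty(smith)"))       # now "fact"
```

The design point the abstract makes is visible here: a query answers with the *status* of a proposition, not a bare truth value, so reasoning about arguments never silently upgrades into reasoning about facts.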

9.
Mass spectrometry is an analytical technique for determining the composition of a sample. Recently it has become a primary tool in proteomics research for protein identification and quantification and for characterizing post-translational modifications. Both the size and the complexity of the data produced by this experimental technique impose great computational challenges on the data analysis. This article reviews some of these challenges and serves as an entry point for those who want to study the area in ...

10.
IEEE Software, 2007, 24(6): 26-27
This article compares building architecture with software architecture. In software, cost estimation is extremely immature, and estimates become more accurate only iteration by iteration.

11.
With the rapid development of next-generation sequencing technologies, bacterial identification becomes a very important and essential step in processing genomic data, especially for metagenomic data. Many computational methods have been developed and some of them are widely used to address the problems in bacterial identification. In this article we review the algorithms of these methods, discuss their drawbacks, and propose future computational methods that use genomic data to characterize bacteria. In addition, we tackle two specific computational problems in bacterial identification, namely, the detection of host-specific bacteria and the detection of disease-associated bacteria, by offering potential solutions as a starting point for those who are interested in the area.

12.
To address the problem that existing imaging models, devices, and computational methods in computer vision, graphics, signal processing, digital image processing, and applied optics cannot capture sufficient information about a target scene, computational photography proposes new imaging mechanisms and corresponding computational reconstruction methods. Taking a new path in optical-signal observation, it innovatively moves visual information processing and computation forward into the imaging process itself, greatly increasing the degrees of freedom for information optimization and enabling qualitative breakthroughs in dimensionality, scale, and resolution, so that scene information that traditional imaging systems render unclear or invisible can be observed. Following three threads — the ideas, methods, and goals of computational photography — this paper analyzes and surveys the state of research at home and abroad, in the hope of helping readers understand and enter the field more quickly.

13.
With the development of information technology, complex systems increasingly exhibit intertwined social, physical, and informational characteristics. Because these systems involve human and social factors, their design, analysis, management, control, and synthesis face unprecedented challenges. Against this background, computational experiments have emerged: by rendering "counterfactuals" algorithmic, they provide a digital, computational approach to the quantitative analysis of complex systems. This paper comprehensively reviews the current state and future challenges of computational-experiment methods. It first introduces their conceptual origins and application characteristics; it then elaborates the methodological framework and key steps; next it presents typical applications, including phenomenon explanation, trend prediction, and strategy optimization; finally it discusses key open problems and challenges. The aim is to lay out the technical framework of computational experiments and support their rapid development and cross-disciplinary application.

14.
Sensor Networks: Concepts, Applications and Challenges
YAO Kung. Acta Automatica Sinica (《自动化学报》), 2006, 32(6): 839-845
Sensor networks have attracted explosive worldwide interest in recent years. They combine modern microelectronic sensors, embedded computational processing systems, and modern computer and wireless networking methodologies. In this overview paper, we first provide some rationales for the growth of sensor networking. Then we discuss various basic concepts and hardware issues. Four basic application cases from the U.S. National Science Foundation-funded Center for Embedded Networked Sensing program at UCLA are presented. Finally, six challenging issues in sensor networks are discussed. Numerous references, including relevant papers, books, and conferences that have appeared in recent years, are given.

15.
DAI Mengyuan. Intelligent Security (《智能安全》), 2024, 3(1): 98-105
As the U.S. strategic focus has shifted in recent years toward great-power competition, the United States has proposed "Mosaic Warfare", a new operational concept based on the "Third Offset Strategy", seeking to exploit its lead in artificial intelligence and related technologies to gain an asymmetric operational advantage and "fight a war the adversary cannot understand". The winning mechanism of Mosaic Warfare is to counter certainty with complexity: through the distribution and aggregation of mosaic nodes, it maintains the combat capability of the overall system while degrading and invalidating the adversary's countermeasures. Although Mosaic Warfare faces challenges in both technology and human factors, the United States has long explored this direction and has systematically launched a large number of cutting-edge projects in it, whose results will continue to drive the intelligent transformation of the U.S. military.

16.
In delay-sensitive scenarios such as military operations, cloud computing cannot meet users' real-time requirements, which has given rise to dispersed computing. It provides services using computing resources worldwide — smartphones, tablets, connected vehicles, and IoT terminals — and treats cloud data centers as ordinary computing nodes, thoroughly eliminating centralization and dispersing computational resources. Dispersed computing connects all devices with computing capability into a networked organism in which every node serves users cooperatively and through sharing. Unlike the localized processing of fog and edge computing, this paradigm exploits idle computing resources across the network, bypassing the limits of local computing capacity, and has attracted broad attention. This paper first introduces the research background of dispersed computing and gives its definition; it then describes three core technologies of dispersed computing in detail; next, it instantiates the concept through concrete application scenarios to better analyze the advantages of dispersed computing in the Internet-of-Everything era; finally, it discusses future research directions and the challenges the paradigm faces.

17.
As the Internet of Everything deepens, the number of end devices such as smartphones and smart glasses keeps growing, so that data volume is increasing far faster than network bandwidth; at the same time, new applications such as augmented reality and autonomous driving impose stricter latency requirements. Edge computing combines the computing, networking, and storage resources at the network edge into a unified platform that serves users, so that data can be processed promptly and effectively near its source. Unlike cloud computing, which transfers all data to data centers, this model bypasses the bandwidth and latency bottlenecks of the network and has attracted wide attention. This paper first introduces the concept of edge computing and gives a definition; it then compares three representative edge-computing platforms and analyzes, through application examples, the advantages of edge computing for mobile and IoT applications; finally, it describes the challenges edge computing currently faces.

18.
If we were to have a Grid infrastructure for visualization, what technologies would be needed to build such an infrastructure, what kind of applications would benefit from it, and what challenges are we facing in order to accomplish this goal? In this survey paper, we make use of the term ‘visual supercomputing’ to encapsulate a subject domain concerning the infrastructural technology for visualization. We consider a broad range of scientific and technological advances in computer graphics and visualization, which are relevant to visual supercomputing. We identify the state‐of‐the‐art technologies that have prepared us for building such an infrastructure. We examine a collection of applications that would benefit enormously from such an infrastructure, and discuss their technical requirements. We propose a set of challenges that may guide our strategic efforts in the coming years.

19.
In this paper I introduce a formalism for natural language understanding based on a computational implementation of Discourse Representation Theory. The formalism covers a wide variety of semantic phenomena (including scope and lexical ambiguities, anaphora and presupposition), is computationally attractive, and has a genuine inference component. It combines a well-established linguistic formalism (DRT) with advanced techniques to deal with ambiguity (underspecification), and is innovative in the use of first-order theorem-proving techniques. The architecture of the formalism for natural language understanding that I advocate consists of three levels of processing: underspecification, resolution, and inference. Each of these levels has a distinct function and therefore employs a different kind of semantic representation. The mappings between these different representations define the interfaces between the levels. I show how underspecified semantic representations can be built in a compositional way (for a fragment of English grammar) using standard techniques borrowed from the λ-calculus, how inferences can be carried out on discourse representations using a translation to first-order logic, and how existing research prototypes (discourse-processing and spoken-dialogue systems) implement the formalism.
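The compositional construction this abstract mentions can be illustrated with a toy λ-calculus fragment. This is a sketch of the general technique, not the paper's actual grammar: the lexicon entries below are invented, and Python lambdas stand in for λ-terms. Each word denotes a function, and function application assembles a first-order formula.

```python
# Toy compositional semantics: determiners take a noun denotation and a
# verb-phrase denotation and return a first-order formula (as a string).
every = lambda noun: lambda vp: f"forall x.({noun('x')} -> {vp('x')})"
a     = lambda noun: lambda vp: f"exists x.({noun('x')} & {vp('x')})"

# Content words denote one-place predicates over a variable name.
man   = lambda v: f"man({v})"
walks = lambda v: f"walk({v})"

# "Every man walks": apply the determiner to the noun, then to the VP.
print(every(man)(walks))   # forall x.(man(x) -> walk(x))
print(a(man)(walks))       # exists x.(man(x) & walk(x))
```

The resulting first-order strings correspond to the inference level of the architecture: once a discourse representation is translated into first-order logic like this, an off-the-shelf theorem prover can operate on it.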

20.
This paper presents a literature review in the field of summarizing software artifacts, focusing on bug reports, source code, mailing lists, and developer discussions. From Jan. 2010 to Apr. 2016, numerous summarization techniques, approaches, and tools were proposed to satisfy the ongoing demand for improving software performance and quality and for helping developers understand the problems at hand. Since the aforementioned artifacts contain both structured and unstructured data, researchers have applied different machine learning and data mining techniques to generate summaries. This paper therefore first provides a general perspective on the state of the art, describing the types of artifacts, the approaches for summarization, and the common portions of the experimental procedures shared among these artifacts. Moreover, we discuss the applications of summarization, i.e., what tasks have been accomplished through summarization. Next, the paper presents tools that were built for summarization tasks or are employed during them. In addition, we present the summarization evaluation methods employed in the selected studies, as well as other important factors used to evaluate generated summaries, such as adequacy and quality. We also briefly present modern communication channels and the commonalities and complementarities among different software artifacts. Finally, we discuss challenges applicable to the existing studies in general as well as future research directions. This survey will give future researchers a broad and useful background on the main aspects of this research field.

