Similar Documents
 Found 20 similar documents (search time: 15 ms)
1.
Ubiquitous computing technologies are slowly finding their way into commercial information systems, which are often built at a considerably larger scale than is possible in research demonstrators. Furthermore, several issues are mostly outside the scope of current ubiquitous computing research yet play a critical role in industrial deployments: the lengthy and costly preparation or upgrade of existing infrastructures; training of employees and users in new ways of working; controlled introduction of new functionality, features, and services to manage risk; unexpected behaviors arising from the wider variety of possible real-world situations; an incremental approach to systems development so as to better identify successful aspects; regard for the economics of systems as a core requirement; and the choice between open and closed systems. In this paper we review two case studies of fully operational Radio Frequency Identification-based systems: the Oyster card ticketing system used on the London Underground in the UK, and retail applications deployed at the Mitsukoshi department stores in Tokyo, Japan. We examine each case in terms of its technologies, user interactions, and business and organizational context, and make several observations on each. We conclude by drawing general lessons about ubiquitous computing in the real world and identifying challenges for future ubiquitous computing research.

2.
3.
Large interactive displays for supporting workgroup collaboration comprise a growing area of ubiquitous computing research, and many such systems have been designed and deployed in laboratory studies and research settings. Such displays face difficulties in real-world deployments, as they are often supplemental technologies rather than primary tools for work activities. In this work, we investigate the integration and uptake of the NASA MERBoards, shared interactive displays deployed to support science tasks in the Mars Exploration Rover (MER) missions. We examine the hurdles to adoption imposed specifically by the real-world circumstances of the deployment, external to the design of the system itself, and explain how these concerns apply to the deployment of shared ubicomp technologies in the real world generally.

4.
Without adding hardware to the system, this paper adopts a serially controlled multiplexing technique to achieve real-time measurement and control of a large number of measurement points, improving the reliability of the system.
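As an illustration of the serially controlled multiplexing idea, the sketch below polls many measurement channels over a single serial link. The port name, channel count, and ASCII command set are invented for illustration and are not from the paper.

# A minimal sketch of serially controlled multiplexing, assuming a
# hypothetical instrument that accepts an ASCII channel-select command
# followed by a read command. Requires the pyserial package.
import serial

PORT = "/dev/ttyUSB0"   # assumed port name
N_CHANNELS = 128        # assumed number of measurement points

def scan_all_channels():
    with serial.Serial(PORT, baudrate=9600, timeout=1.0) as link:
        readings = {}
        for ch in range(N_CHANNELS):
            link.write(f"SEL {ch}\n".encode())   # switch the multiplexer
            link.write(b"READ\n")                # request one sample
            line = link.readline().decode().strip()
            readings[ch] = float(line) if line else None
        return readings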

5.
A code clone is a code portion in source files that is identical or similar to another. Since code clones are believed to reduce the maintainability of software, several code clone detection techniques and tools have been proposed. This paper proposes a new clone detection technique consisting of a transformation of the input source text followed by a token-by-token comparison. We have implemented this technique, together with several useful optimizations, in a tool named CCFinder (Code Clone Finder), which extracts code clones from C, C++, Java, COBOL, and other source files. In addition, we have developed metrics for the code clones. To evaluate the usefulness of CCFinder and the metrics, we conducted several case studies, applying the tool to the source code of JDK, FreeBSD, NetBSD, Linux, and many other systems. CCFinder found clones effectively, and the metrics were able to identify the characteristics of the systems. We also compare the proposed technique with other clone detection techniques.
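To illustrate the flavor of token-based clone detection (this is a simplified sketch, not CCFinder's algorithm), the following normalizes identifiers and literals to placeholder tokens and reports repeated token windows as clone candidates.

# A minimal sketch of token-based clone detection in the spirit of the
# technique described above; the keyword set and window size are
# illustrative choices.
import re
from collections import defaultdict

KEYWORDS = {"if", "else", "for", "while", "return", "int", "void"}

def normalize(tok):
    # Parameterize identifiers and literals so renamed copies still match.
    if tok in KEYWORDS:
        return tok
    if tok[0].isalpha() or tok[0] == "_":
        return "$id"
    if tok[0].isdigit():
        return "$num"
    return tok  # operators and punctuation stay as-is

def clone_candidates(source, window=20):
    toks = [normalize(t) for t in re.findall(r"[A-Za-z_]\w*|\d+|\S", source)]
    seen = defaultdict(list)
    for i in range(len(toks) - window + 1):
        seen[tuple(toks[i:i + window])].append(i)  # hash each token window
    # Windows occurring at more than one position are clone candidates.
    return {k: v for k, v in seen.items() if len(v) > 1}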

6.
On the value of static analysis for fault detection in software
No single software fault-detection technique is capable of addressing all fault-detection concerns. Like software reviews and testing, static analysis tools (automated static analysis) can be used to remove defects prior to the release of a software product. To determine to what extent automated static analysis can help in the economic production of a high-quality product, we analyzed static analysis faults and test and customer-reported failures for three large-scale industrial software systems developed at Nortel Networks. The data indicate that automated static analysis is an affordable means of software fault detection. Using the orthogonal defect classification scheme, we found that automated static analysis is effective at identifying assignment and checking faults, allowing the later software production phases to focus on more complex, functional, and algorithmic faults. A majority of the defects found by automated static analysis appear to be produced by a few key types of programmer errors, and some of these types have the potential to cause security vulnerabilities. Statistical analysis results indicate that the number of automated static analysis faults can be effective for identifying problem modules. Our results indicate that static analysis tools are complementary to other fault-detection techniques for the economic production of a high-quality software product.

7.
Interacting with the real world: design principles for intelligent systems
The last two decades in the field of artificial intelligence have clearly shown that true intelligence always requires the interaction of an agent with a real physical and social environment. The concept of embodiment, introduced to designate the modern approach to designing intelligence, has far-reaching implications. Rather than studying computation alone, we must consider the interplay between morphology, materials, brain (control), and the environment. A number of case studies are presented, and it is demonstrated how artificial evolution and morphogenesis can be used to systematically investigate this interplay. Taking these ideas into account requires entirely novel ways of thinking, and often leads to surprising results. This work was presented, in part, at the 9th International Symposium on Artificial Life and Robotics, Oita, Japan, January 28–30, 2004.

8.
Tracking systems are important in computer vision, with applications in surveillance, human-computer interaction, and other areas. Consumer graphics processing units (GPUs) have experienced an extraordinary evolution in both computing performance and programmability, leading to greater use of the GPU for non-rendering applications. In this work we propose a real-time object tracking algorithm based on the hybridization of particle filtering (PF) and a multi-scale local search (MSLS) algorithm, presented for both CPU and GPU architectures. The developed system provides successful results in the precise tracking of single and multiple targets in monocular video, operating in real time at 70 frames per second at 640 × 480 video resolution on the GPU, up to 1,100% faster than the CPU version of the algorithm.
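The particle-filtering component of such a tracker can be sketched as a generic predict-weight-resample step, as below; this assumes a random-walk motion model and a user-supplied likelihood function, and does not reproduce the paper's PF+MSLS hybrid or its GPU implementation.

# A minimal NumPy sketch of one particle-filter update for 2-D tracking.
import numpy as np

def pf_step(particles, weights, observation, likelihood, motion_std=5.0):
    # Predict: diffuse particles with a random-walk motion model.
    particles = particles + np.random.normal(0, motion_std, particles.shape)
    # Update: reweight each particle by how well it explains the observation.
    weights = weights * likelihood(particles, observation)
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = np.random.choice(len(particles), len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights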

9.
Large-scale semantic concept detection from large video databases suffers from large variations among different semantic concepts as well as among their corresponding effective low-level features. In this paper, we propose a novel framework to deal with this obstacle. The proposed framework consists of four major components: feature pool construction, pre-filtering, modeling, and classification. First, a large low-level feature pool is constructed, from which a specific set of features is selected for the later steps, automatically or semi-automatically. Then, to deal with the imbalance problem in the training set, a pre-filtering classifier is generated, with the aim of achieving a high recall rate and a precision rate of nearly 50% for a given concept. Thereafter, an SVM classifier is built from the pre-filtered training samples, based on the selected features in the feature pool. Finally, the SVM classifier is applied to classify the semantic concept. This framework is flexible and extensible in terms of adding new features to the feature pool, introducing human interaction in selecting features, building models for new concepts, and adopting active learning.
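A minimal sketch of the pre-filter-then-SVM stages follows, assuming scikit-learn; the pre-filter model, the probability threshold, and the SVM kernel are illustrative choices, not the paper's configuration.

# Stage 1: a high-recall pre-filter against class imbalance.
# Stage 2: an SVM trained only on samples that survive the pre-filter.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def train_concept_detector(X, y, threshold=0.1):
    pre = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X, y)
    keep = pre.predict_proba(X)[:, 1] >= threshold  # low threshold -> high recall
    svm = SVC(kernel="rbf").fit(X[keep], y[keep])
    return pre, svm

def detect(pre, svm, X, threshold=0.1):
    keep = pre.predict_proba(X)[:, 1] >= threshold
    out = np.zeros(len(X), dtype=int)   # samples rejected by the pre-filter
    out[keep] = svm.predict(X[keep])    # final decision on the survivors
    return out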

10.
It is argued that the backpropagation learning algorithm is unsuited to tackling real-world problems such as sensory-motor coordination learning or the encoding of large amounts of background knowledge in neural networks. One difficulty in the real world, the unavailability of ‘teachers’ who already know the solution to a problem, may be overcome by using reinforcement learning algorithms in place of backpropagation. It is suggested that the complexity of the search space in real-world neural network learning problems may be reduced if learning is divided into two components. One component is concerned with abstracting structure from the environment and hence with developing representations of stimuli. The other involves associating and refining these representations on the basis of feedback from the environment. Time-dependent learning problems are also considered in this hybrid framework. Finally, an ‘open systems’ approach, in which subsets of a network may adapt independently on the basis of spatio-temporal patterns, is briefly discussed.
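As a sketch of learning from reward alone, without a teacher supplying target outputs, the following implements a REINFORCE-style update for a single stochastic unit; it is purely illustrative and not a model from the paper.

# The environment returns only a scalar reward, never the correct answer.
import numpy as np

rng = np.random.default_rng(0)

def reinforce_step(w, x, reward_fn, lr=0.1):
    p = 1.0 / (1.0 + np.exp(-w @ x))   # probability that the unit fires
    a = rng.random() < p               # stochastic action; no target given
    r = reward_fn(a)                   # environment supplies only a reward
    # REINFORCE: move weights along r * d(log pi)/dw = r * (a - p) * x.
    return w + lr * r * (float(a) - p) * x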

11.
12.
In this research, we compare malware detection techniques based on static, dynamic, and hybrid analysis. Specifically, we train Hidden Markov Models (HMMs) on both static and dynamic feature sets and compare the resulting detection rates over a substantial number of malware families. We also consider hybrid cases, where dynamic analysis is used in the training phase, with static techniques used in the detection phase, and vice versa. In our experiments, a fully dynamic approach generally yields the best detection rates. We discuss the implications of this research for malware detection based on hybrid techniques.
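A sketch of the static side of such a setup, scoring opcode sequences against a per-family HMM, might look as follows; it assumes the hmmlearn library and integer-encoded opcodes, neither of which is specified by the paper.

# Train one HMM per malware family, then threshold the per-opcode
# log-likelihood of a new sample. The class is CategoricalHMM in recent
# hmmlearn releases (MultinomialHMM in older ones).
import numpy as np
from hmmlearn import hmm

def train_family_model(sequences, n_states=2):
    X = np.concatenate(sequences).reshape(-1, 1)
    lengths = [len(s) for s in sequences]
    model = hmm.CategoricalHMM(n_components=n_states, n_iter=100)
    model.fit(X, lengths)
    return model

def score(model, seq):
    # Log-likelihood per opcode; flag as malware if above a threshold.
    return model.score(np.asarray(seq).reshape(-1, 1)) / len(seq)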

13.
Based on the electric pitch systems of large-scale horizontal-axis wind turbines, the blade pitch loads, arising mainly from centrifugal force, aerodynamic force, and gravity, are analyzed, and calculation models for them are established in this paper. For illustration, a 1.2 MW wind turbine is introduced as a practical example, and its blade pitch loads from centrifugal force, aerodynamic force, and gravity are calculated and analyzed separately and synthetically. The research results showed that in the process of ro...
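For the centrifugal and gravity contributions, a blade-element calculation reduces to simple formulas (F_c = m·ω²·r for the centrifugal term); the sketch below uses illustrative inputs, not data from the paper's 1.2 MW example.

# Centrifugal and gravity force components on one blade element.
import math

def element_loads(m_kg, r_m, rotor_rpm, azimuth_deg):
    omega = rotor_rpm * 2.0 * math.pi / 60.0        # rotor speed, rad/s
    f_centrifugal = m_kg * omega ** 2 * r_m         # N, along the blade axis
    # Edgewise gravity component varies with blade azimuth over a revolution.
    f_gravity_edge = m_kg * 9.81 * math.sin(math.radians(azimuth_deg))
    return f_centrifugal, f_gravity_edge

# Example: a 100 kg element at 20 m radius, 20 rpm, blade horizontal.
# element_loads(100, 20.0, 20, 90) -> (about 8773 N, 981 N)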

14.
Social fMRI: Investigating and shaping social mechanisms in the real world
We introduce the Friends and Family study, a longitudinal living laboratory in a residential community. In this study, we employ a ubiquitous computing approach, Social Functional Mechanism-design and Relationship Imaging, or Social fMRI, that combines extremely rich data collection with the ability to conduct targeted experimental interventions with study populations. We present our mobile-phone-based social and behavioral sensing system, deployed in the wild for over 15 months. Finally, we present three investigations performed during the study, looking into the connection between individuals’ social behavior and their financial status, network effects in decision making, and a novel intervention aimed at increasing physical activity in the subject population. Results demonstrate the value of social factors for choice, motivation, and adherence, and make it possible to quantify the contribution of different incentive mechanisms.

15.
16.
Mobile robot navigation under controlled laboratory conditions is, by now, state of the art and reliably achievable. Transferring navigation mechanisms used in such small-scale environments to applications in untreated, large environments, however, is not trivial, and typically requires modifications to the original navigation mechanism: scaling up is hard. In this paper, we discuss the difficulties of mobile robot navigation in general, the various options for achieving navigation in large environments, and experiments with Manchester’s FortyTwo that investigate how the scaling up of navigational competencies can be achieved. We were particularly interested in autonomous mobile robot navigation in unmodified, large, and varied environments, without the aid of pre-installed maps or supplied CAD models of the environment, and this paper presents a general approach to achieving this. FortyTwo regularly travels the corridors of the Department of Computer Science at Manchester University, using topological maps, landmarks, low-level “enabling behaviours”, and active exploitation of features of the environment. Experimental results obtained in these environments are given in this paper.
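The topological-map aspect can be sketched as graph search over landmark nodes; the map and node names below are invented for illustration and are unrelated to FortyTwo's actual environment.

# A topological map is just a graph of landmark places; route planning
# is breadth-first search, and each edge would trigger a low-level
# "enabling behaviour" (e.g. corridor following) on a real robot.
from collections import deque

TOPO_MAP = {
    "lab": ["corridor_a"],
    "corridor_a": ["lab", "junction"],
    "junction": ["corridor_a", "corridor_b"],
    "corridor_b": ["junction", "office"],
    "office": ["corridor_b"],
}

def plan(start, goal):
    frontier, parents = deque([start]), {start: None}
    while frontier:
        node = frontier.popleft()
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        for nxt in TOPO_MAP[node]:
            if nxt not in parents:
                parents[nxt] = node
                frontier.append(nxt)
    return None

# plan("lab", "office") -> ['lab', 'corridor_a', 'junction', 'corridor_b', 'office']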

17.
The increased complexity and scale of high-performance computing and future extreme-scale systems have made resilience a key issue, since future systems are expected to experience various faults during critical operations. Current solutions for resiliency, which rely mainly on checkpointing in hardware and applications, are also expected to become infeasible because of the unacceptable recovery times of checkpointing and restarting. In this paper, we present innovative concepts for anomaly detection and identification based on analyzing the durations of pattern-transition sequences within an execution window. We use a three-dimensional array of features to capture spatial and temporal variability, which an anomaly analysis system uses to immediately generate an alert and identify the source of faults when an abnormal behavior pattern is captured, indicating some kind of software or hardware failure. The main contributions of this paper are the analysis methodology and the feature selection used to detect and identify anomalous behavior. Evaluating the effectiveness of this approach on asynchronously injected faults shows a detection rate above 99.9% with no false alarms over a wide range of scenarios, and an accuracy rate of 100% with short root-cause analysis times.
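A much-simplified sketch of flagging anomalous transition durations with a sliding window follows; the z-score test and window size are illustrative, and the paper's three-dimensional feature array and root-cause analysis are not reproduced here.

# Flag a time step whose transition duration deviates strongly from the
# recent execution window.
import numpy as np

def detect_anomalies(durations, window=50, z_threshold=4.0):
    durations = np.asarray(durations, dtype=float)
    alerts = []
    for t in range(window, len(durations)):
        ref = durations[t - window:t]           # recent execution window
        mu, sigma = ref.mean(), ref.std() + 1e-9
        if abs(durations[t] - mu) / sigma > z_threshold:
            alerts.append(t)                    # raise an alert immediately
    return alerts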

18.
The objective of this paper is to explain our approach, called the “Work Flow Methodology for Analysis and Conceptual Data Base Design of Large Scale Computer Based Information System”. Through the different steps of the methodology, and in the light of the definition of a dynamic adaptive system, the user fills in a number of forms which relate the topological dimension to the time dimension for each application of a given system. In addition, we obtain the “Unit Subschema”, which defines the responsibilities for issuing information and the authorization to receive it at the proper time. Finally, we apply our methodology to the Registration System at Kuwait University.

19.
The Coordinating Centre (CC) of the Gruppo Italiano per lo Studio della Sopravvivenza nell’Infarto miocardico (GISSI) used telecommunication technology to develop a computerized network system for the data management of the GISSI studies. Through a personal computer (PC), a communication program, a modem, and a telephone line, the investigator at each participating centre can connect to a micro-computer at the CC to recruit/randomize patients and to download reports on the progress of the trial. In the first case, the investigator answers a set of predefined questions, whereupon the system automatically checks the eligibility criteria and randomly assigns the patient to a treatment arm. In the second case, once the investigator has chosen from a list of standard reports and the corresponding query has run on the CC central database, the generation, formatting, and transfer of the selected report to the PC are executed automatically on line. The main advantages of this system are a reduction in the number of mistakes in data completion and in the human and economic resources required, as well as real-time updating of the participating centres. The system was successfully adopted in the GISSI-3 trial (200 Coronary Care Units and 19 394 patients enrolled) and in the European arm of the CORE trial, and it is currently being used in the GISSI-Prevenzione trial.
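The recruit/randomize interaction reduces to an eligibility check followed by random allocation; the sketch below uses invented eligibility questions and a simple 1:1 scheme, not the GISSI trials' actual criteria or allocation procedure.

# Check predefined eligibility answers, then randomly assign an arm.
import random

ARMS = ["treatment", "control"]   # placeholder arm names

def randomize(answers):
    # answers: dict mapping predefined question ids to the responses given.
    eligible = (answers.get("age_years", 0) >= 18
                and answers.get("consent") == "yes"
                and answers.get("contraindication") == "no")
    if not eligible:
        return None               # patient cannot be enrolled
    return random.choice(ARMS)    # simple 1:1 random allocation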

20.
Mobile device manufacturers are rapidly producing miscellaneous Android versions worldwide. Simultaneously, cyber criminals are executing malicious actions such as tracking user activities, stealing personal data, and committing bank fraud. These criminals gain numerous benefits because so many people use Android for their daily routines, including important communications. With this in mind, security practitioners have conducted static and dynamic analyses to identify malware. This study used static analysis because of its overall code coverage, low resource consumption, and rapid processing. However, static analysis requires a minimal number of features to classify malware efficiently. Therefore, we used genetic search (GS), a search based on a genetic algorithm (GA), to select features from among 106 strings. To evaluate the best features determined by GS, we used five machine learning classifiers, namely Naïve Bayes (NB), functional trees (FT), J48, random forest (RF), and multilayer perceptron (MLP). Among these classifiers, FT gave the highest accuracy (95%) and true positive rate (TPR) (96.7%) using only six features.
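A compact sketch of genetic-search feature selection with a classifier-based fitness function follows; the population size, operators, and fitness classifier are illustrative choices rather than the study's configuration.

# Evolve boolean feature masks; fitness is cross-validated accuracy of a
# classifier trained on the selected columns.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def fitness(mask, X, y):
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def genetic_search(X, y, pop=20, gens=30, p_mut=0.02, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    population = rng.random((pop, n)) < 0.5       # random initial masks
    for _ in range(gens):
        scores = np.array([fitness(ind, X, y) for ind in population])
        parents = population[np.argsort(scores)[::-1][:pop // 2]]  # truncation selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)
            child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
            child ^= rng.random(n) < p_mut              # bit-flip mutation
            children.append(child)
        population = np.vstack([parents, children])
    return max(population, key=lambda ind: fitness(ind, X, y))  # best mask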
