20 similar records found (search time: 0 ms)
1.
Roy Ladner, Frederick Petry, Kalyan Moy Gupta, Elizabeth Warner, Philip Moore, David W. Aha 《Soft Computing - A Fusion of Foundations, Methodologies and Applications》2008,12(11):1089-1098
To enhance and improve the interoperability of meteorological Web Services, we are currently developing an Integrated Web Services Brokering System (IWB). IWB uses a case-based classifier to automatically discover Web Services. In this paper, we explore the use of rough set techniques for selecting features prior to classification. We demonstrate the effectiveness of this feature selection technique by comparing it with a leading non-rough set feature selection technique, Information Gain.
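The Information Gain baseline the abstract compares against can be sketched in a few lines. The toy feature names and class labels below are hypothetical illustrations, not data from the paper:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG = H(labels) - sum over values v of p(v) * H(labels | feature = v)."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# Hypothetical Web-Service descriptions: one binary feature is perfectly
# predictive of the service class, the other is uninformative.
labels    = ["met", "met", "other", "other"]
f_weather = [1, 1, 0, 0]   # perfectly predictive
f_port    = [1, 0, 1, 0]   # uninformative
print(information_gain(f_weather, labels))  # 1.0
print(information_gain(f_port, labels))     # 0.0
```

Features would then be ranked by this score and the top-k kept before training the classifier.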
2.
《Computers in human behavior》2006,22(4):557-587
Human perceptual and cognitive abilities are limited resources. Attention is the mechanism used to allocate such resources in the most effective way. Current technologies, in addition to allowing fast access to information and people, should be designed to support the human attentional processes on which they impose further strain. This paper analyses the issues related to the design of systems capable of such support: attention aware systems. We introduce the research aimed at understanding and modelling human attentional processes, including perceptual and cognitive processes as studied in cognitive psychology, as well as rhetorical, aesthetic, and social aspects related to attentional mechanisms. We analyse current approaches to the design of attention aware systems along three major features: detection of the user's current attentional state, detection and evaluation of possible alternative attentional states, and strategies for focus switching or maintenance. Finally, we discuss the most promising research directions for the development of systems capable of supporting human attentional mechanisms.
3.
Theories, tools and research methods in program comprehension: past, present and future
Margaret-Anne Storey 《Software Quality Journal》2006,14(3):187-208
Program comprehension research can be characterized both by the theories that provide rich explanations about how programmers understand software and by the tools that are used to assist in comprehension tasks. In this paper, I review some of the key cognitive theories of program comprehension that have emerged over the past thirty years. Using these theories as a canvas, I then explore how tools that are commonly used today have evolved to support program comprehension. Specifically, I discuss how the theories and tools are related and reflect on the research methods that were used to construct the theories and evaluate the tools. The reviewed theories and tools are distinguished according to human characteristics, program characteristics, and the context for the various comprehension tasks. Finally, I predict how these characteristics will change in the future and speculate on how a number of important research directions could lead to improvements in program comprehension tool development and research methods.
Dr. Margaret-Anne Storey is an associate professor of computer science at the University of Victoria, a Visiting Scientist at the IBM Centre for Advanced Studies in Toronto, and a Canada Research Chair in Human Computer Interaction for Software Engineering. Her research passion is to understand how technology can help people explore, understand, and share complex information and knowledge. She applies and evaluates techniques from knowledge engineering and visual interface design to applications such as reverse engineering of legacy software, medical ontology development, digital image management, and learning in web-based environments. She is also an educator and enjoys the challenges of teaching programming to novice programmers.
7.
Healthcare is of particular importance in everyone's life, and keeping its advancement at a good pace is a priority for any country, as it strongly influences the overall well-being of its citizens. Each government strives to build a modern, intelligent medical system that provides maximum population coverage with high-quality medical services. The development of Information and Communication Technologies (ICT) significantly improves the accessibility and effectiveness of the healthcare system by forming the eHealth environment, thus providing an opportunity to enhance the quality of patient care, significantly speed up the work of medical experts, and reduce the costs of medical services. Shifting medical services to digital and remote operations requires substantial computational capabilities, so implementing new computing paradigms is prominent: remote services face new requirements due to increasing data volumes and the demand for new computing solutions. Computing paradigms such as Cloud, Edge, and Mobile Edge Computing, among others, are used to process the collected medical data, improving the quality of patient healthcare. This paper focuses on computing solutions for medical use cases by offering a comprehensive survey of standardization aspects, use cases, applicable computing paradigms, security limitations, and design considerations for ICT usage in medical applications. Finally, it outlines the most critical integration challenges and solutions from the literature.
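The choice among the computing paradigms the survey covers can be illustrated with a toy placement rule. The function, its thresholds, and the workloads below are hypothetical, not taken from the survey:

```python
def choose_paradigm(latency_req_ms, privacy_sensitive, cloud_latency_ms=120):
    """Hypothetical placement rule for an eHealth workload: keep
    privacy-sensitive processing at the edge, and use the cloud only
    when the application tolerates its round-trip latency."""
    if privacy_sensitive:
        return "edge"
    return "cloud" if latency_req_ms >= cloud_latency_ms else "edge"

# A live vital-sign alarm stays at the edge; overnight batch analytics
# on anonymized records can go to the cloud.
print(choose_paradigm(50, privacy_sensitive=True))     # edge
print(choose_paradigm(5000, privacy_sensitive=False))  # cloud
print(choose_paradigm(50, privacy_sensitive=False))    # edge
```

Real systems weigh many more factors (bandwidth, energy, regulation), which is precisely why the survey treats design considerations and security limitations together.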
8.
Jiří Wiedermann 《Natural computing》2012,11(1):59-63
Amorphous computing presents a novel computational paradigm. The respective computational models have recently been introduced and studied in a series of works by J. Wiedermann and his Ph.D. student L. Petrů. From a computational viewpoint, amorphous computing systems differ from classical ones in almost every aspect: they consist of a set of tiny, independent, and self-powered processors or robots that can communicate wirelessly over a limited distance. The processors are randomly placed in a closed area or volume and form an ad-hoc network; in some applications they can move, either actively or passively (e.g., in a bloodstream). Assuming that exponential progress in all sciences results in our ability to produce amorphous computing systems with myriads of processors, an unmatched application potential is expected to profoundly change all areas of science and life. But before this state of affairs is reached, theoretical and practical studies of the computational properties and efficiency of amorphous computing systems must be performed. It is expected that an indispensable part of computer science will be affected by this trend.
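The model described above (randomly scattered processors that link up only within wireless range) can be simulated directly. The node count, range, and unit-square area below are arbitrary illustration choices, not parameters from the cited works:

```python
import math
import random

def amorphous_network(n=200, radius=0.12, seed=1):
    """Scatter n processors uniformly in the unit square and link every
    pair within wireless range `radius`, forming the ad-hoc network."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    adj = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= radius:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def reachable_from(adj, start=0):
    """BFS: how many processors `start` can reach by multi-hop relaying."""
    seen, frontier = {start}, [start]
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(seen)

adj = amorphous_network()
print(reachable_from(adj))  # size of node 0's connected component
```

Varying `n` and `radius` in such a simulation shows the connectivity threshold behaviour that makes the theoretical study of these systems non-trivial.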
9.
Model driven engineering (MDE) is a suitable approach for the construction of software systems, in particular in the Web application domain. There are different types of Web applications depending on their purpose (e.g., document-centric, interactive, transactional, workflow/business process-based, collaborative, etc.). This work focuses on business process-based Web applications in order to understand business processes in a broad sense, from the lightweight business processes already addressed by existing proposals to long-running asynchronous processes. This work presents an MDE method for the construction of systems of this type. The method has been designed in two steps following the MDE principles. In the first step, the system is represented by means of models in a technology-independent manner. These models capture the different aspects of Web-based systems (behaviour, structure, navigation, and presentation). In the second step, model transformations (both model-to-model and model-to-text) are applied in order to obtain the final system in terms of a specific technology. In addition, a set of Eclipse-based tools has been developed to provide automation in the application of the proposed method and to validate the proposal.
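A minimal sketch of the second step, a model-to-text transformation: a technology-independent page model rendered into HTML. The model shape, field names, and template are invented for illustration and do not come from the authors' Eclipse tooling:

```python
# A page captured technology-independently (the "first step" of the method).
page_model = {
    "name": "OrderStatus",
    "title": "Order status",
    "fields": [("order_id", "text"), ("status", "text")],
}

def to_html(model):
    """Model-to-text transformation: one template fragment per model element."""
    rows = "\n".join(
        f'  <label>{name}</label><input type="{kind}" name="{name}">'
        for name, kind in model["fields"]
    )
    return (f'<form id="{model["name"]}">\n'
            f'  <h1>{model["title"]}</h1>\n'
            f'{rows}\n'
            f'</form>')

print(to_html(page_model))
```

Retargeting the system to another technology then means swapping this transformation while the models stay untouched, which is the central payoff of the MDE approach.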
10.
Recent advances in physiological computing have been made due to Artificial Intelligence and Machine Learning, which have profoundly begun to influence occupational health and safety (OHS) in construction. Acknowledging the current and future use of physiological computing, we address the following research question in this paper: What developments in physiological computing can be used to improve OHS in construction? Using a narrative systematic review, we examine studies that have used physiological computing in construction to monitor people's OHS. Our review indicates that physiological computing systems need to be: (1) more accurate; (2) portable and easier to use; (3) generalizable across varying work tasks; and (4) accepted by users, with their benefits realized. Considering our observations derived from the prevailing literature and practice, we suggest that future research should aim to mitigate OHS risks by focusing on: (1) the development of high-quality databases; (2) feature engineering and extraction using an array of machine learning techniques; and (3) understanding the context and enacting intervention strategies. The upshot of performing such a review is to provide a signpost for future research on the physiological computing of OHS in construction.
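The feature-extraction step mentioned above can be illustrated on a window of heart-rate samples. The features, the threshold, and the sample values are hypothetical; the reviewed systems would use trained, person-calibrated classifiers rather than a fixed rule:

```python
import statistics

def strain_features(heart_rate_bpm):
    """Two simple features extracted from a window of heart-rate samples."""
    return {"mean_hr": statistics.fmean(heart_rate_bpm),
            "hr_sd": statistics.pstdev(heart_rate_bpm)}

def flag_overexertion(features, hr_limit=115.0):
    """Hypothetical rule-based intervention trigger on the mean heart rate."""
    return features["mean_hr"] > hr_limit

calm   = strain_features([72, 75, 71, 74, 73])
strain = strain_features([118, 124, 121, 127, 120])
print(flag_overexertion(calm), flag_overexertion(strain))  # False True
```

The review's calls for accuracy and generalizability translate, in this picture, into richer feature sets and models that hold up across workers and work tasks.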
11.
B. John Oommen, Rebekka Olsson Omslandseter, Lei Jiao 《Pattern Analysis & Applications》2023,26(3):917-928
Partitioning, in and of itself, is an NP-hard problem. Prior to the Artificial Intelligence (AI)-based solutions, it was solved in the 1970s by...
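The NP-hardness noted in this snippet is easy to demonstrate next to a simple approximation. The largest-first greedy rule below is a generic classical heuristic for two-way partitioning, not necessarily the 1970s method the truncated abstract refers to:

```python
def greedy_partition(weights):
    """Largest-first heuristic for two-way partitioning: always place the
    next-largest item on the lighter side. Fast but only approximate;
    finding a perfectly balanced split is NP-hard in general."""
    a, b = [], []
    for w in sorted(weights, reverse=True):
        (a if sum(a) <= sum(b) else b).append(w)
    return a, b

a, b = greedy_partition([8, 7, 6, 5, 4])
print(a, b, abs(sum(a) - sum(b)))  # [8, 5, 4] [7, 6] 4
```

Note that the optimal split here is {8, 7} vs {6, 5, 4} with difference 0, which the greedy rule misses: exactly the gap that motivates more sophisticated (including AI-based) approaches.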
12.
The Journal of Supercomputing - Wireless sensor networks (WSNs) have been considered a fine research area in recent years because of their vital role in numerous applications. To process the...
18.
Massive multiple-input multiple-output (MIMO) systems combined with beamforming antenna array technologies are expected to play a key role in next-generation wireless communication systems (5G), which will be deployed in 2020 and beyond. The main objective of this review paper is to discuss the state-of-the-art research on the most favourable types of beamforming techniques that can be deployed in massive MIMO systems and to clarify the importance of beamforming techniques in massive MIMO systems for eliminating and resolving the many technical hitches that massive MIMO system implementation faces. Classifications of optimal beamforming techniques that are used in wireless communication systems are reviewed in detail to determine which techniques are more suitable for deployment in massive MIMO systems to improve system throughput and reduce intra- and inter-cell interference. To overcome the limitations in the literature, we have suggested an optimal beamforming technique that can provide the highest performance in massive MIMO systems, satisfying the requirements of next-generation wireless communication systems.
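Zero-forcing is one of the linear beamforming techniques such reviews typically classify. This toy 2-antenna, 2-user sketch uses an invented channel matrix and is not the optimal technique the paper itself proposes:

```python
def conj_t(m):
    """Conjugate transpose of a complex matrix given as nested lists."""
    return [[m[j][i].conjugate() for j in range(len(m))] for i in range(len(m[0]))]

def matmul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

def mat2_inv(m):
    """Inverse of a 2x2 complex matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Zero-forcing precoder W = H^H (H H^H)^{-1}: the effective channel H @ W
# becomes the identity, so each user's stream arrives interference-free.
H = [[1 + 0.3j, 0.2 - 0.1j],
     [0.1 + 0.2j, 0.9 - 0.4j]]          # illustrative 2-user, 2-antenna channel
W = matmul(conj_t(H), mat2_inv(matmul(H, conj_t(H))))
HW = matmul(H, W)
print(abs(HW[0][1]), abs(HW[1][0]))     # cross-user (interference) terms, near 0
```

This is the interference-nulling trade-off the review weighs against throughput: zero-forcing removes inter-user interference at the cost of transmit power when the channel is ill-conditioned.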
19.
Sukhpal Singh Gill, Adarsh Kumar, Harvinder Singh, Manmeet Singh, Kamalpreet Kaur, Muhammad Usman, Rajkumar Buyya 《Software》2022,52(1):66-114
Quantum computing (QC) is an emerging paradigm with the potential to offer significant computational advantage over conventional classical computing by exploiting quantum-mechanical principles such as entanglement and superposition. It is anticipated that this computational advantage of QC will help to solve many complex and computationally intractable problems in several application domains such as drug design, data science, clean energy, finance, industrial chemical development, secure communications, and quantum chemistry. In recent years, tremendous progress in both quantum hardware development and quantum software/algorithms has brought QC much closer to reality. Indeed, the demonstration of quantum supremacy marks a significant milestone in the Noisy Intermediate Scale Quantum (NISQ) era; the next logical step is the quantum advantage, whereby quantum computers solve a real-world problem much more efficiently than classical computing. As quantum devices are expected to steadily scale up in the next few years, quantum decoherence and qubit interconnectivity are two of the major challenges to achieving quantum advantage in the NISQ era. QC is a highly topical and fast-moving field of research with significant ongoing progress in all facets. A systematic review of the existing literature on QC is invaluable for understanding the state of the art of this emerging field and identifying open challenges for the QC community to address in the coming years. This article presents a comprehensive review of QC literature and proposes a taxonomy of QC. The proposed taxonomy is used to map various related studies to identify the research gaps. A detailed overview of quantum software tools and technologies, post-quantum cryptography, and quantum computer hardware development captures the current state of the art in the respective areas. The article identifies and highlights various open challenges and promising future directions for research and innovation in QC.
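The superposition and entanglement this abstract invokes can be shown with a classical state-vector simulation of the canonical Bell state; the example is a generic textbook illustration, not tied to any tool the review surveys:

```python
import random

# Two-qubit state vector in the basis |00>, |01>, |10>, |11>.
# Applying H to qubit 0 and then CNOT turns |00> into the Bell state
# (|00> + |11>) / sqrt(2): superposition plus entanglement in one state.
s = 0.5 ** 0.5
bell = [s, 0.0, 0.0, s]

def measure(state, rng):
    """Sample a basis state with probability |amplitude|^2 (Born rule)."""
    r, acc = rng.random(), 0.0
    for i, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return format(i, "02b")
    return format(len(state) - 1, "02b")

rng = random.Random(7)
outcomes = [measure(bell, rng) for _ in range(1000)]
print(set(outcomes))  # only '00' and '11' occur: the qubits are perfectly correlated
```

The exponential cost of this kind of simulation as qubit counts grow (the state vector doubles per qubit) is exactly the classical bottleneck behind the quantum-supremacy milestone the abstract mentions.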
Quantum computing (QC) is an emerging paradigm with the potential to offer significant computational advantage over conventional classical computing by exploiting quantum-mechanical principles such as entanglement and superposition. It is anticipated that this computational advantage of QC will help to solve many complex and computationally intractable problems in several application domains such as drug design, data science, clean energy, finance, industrial chemical development, secure communications, and quantum chemistry. In recent years, tremendous progress in both quantum hardware development and quantum software/algorithm has brought QC much closer to reality. Indeed, the demonstration of quantum supremacy marks a significant milestone in the Noisy Intermediate Scale Quantum (NISQ) era—the next logical step being the quantum advantage whereby quantum computers solve a real-world problem much more efficiently than classical computing. As the quantum devices are expected to steadily scale up in the next few years, quantum decoherence and qubit interconnectivity are two of the major challenges to achieve quantum advantage in the NISQ era. QC is a highly topical and fast-moving field of research with significant ongoing progress in all facets. A systematic review of the existing literature on QC will be invaluable to understand the state-of-the-art of this emerging field and identify open challenges for the QC community to address in the coming years. This article presents a comprehensive review of QC literature and proposes taxonomy of QC. The proposed taxonomy is used to map various related studies to identify the research gaps. A detailed overview of quantum software tools and technologies, post-quantum cryptography, and quantum computer hardware development captures the current state-of-the-art in the respective areas. The article identifies and highlights various open challenges and promising future directions for research and innovation in QC. 相似文献