Similar documents
 20 similar documents found (search time: 15 ms)
1.
Performance considerations, particularly network delays, for integrated voice and data networks are reviewed. The nature of the delay problem is discussed, followed by a review of concepts, objectives and advances in enhanced circuit, packet and hybrid switching techniques, including fast circuit switching (FCS), virtual circuit switching (VCS), buffered speech interpolation (SI), packetized virtual circuit (PVC), cut-through switching (CTS), composite packets and various frame-management strategies for hybrid switching. In particular, the concept of introducing delay to resolve contention in SI is emphasized and, when applied to both voice talkspurts and data messages, this forms a basis for a relatively new approach to network design called transparent message switching (TMS). This approach and its potential performance advantages are reviewed in terms of various architectural aspects of integrated services networks, such as packet structure, multiplexing scheme, server structure and queuing performance, network topology and network protocols. A number of traffic-management strategies and their grade-of-service implications for voice service are discussed. These strategies include voice call and data session blocking, voice talkspurt and data message buffering, speech loss and data integrity and speech processing techniques, including variable quality, rate, speed and entropy coding. Emphasis is placed on the impact of variable delays on voice traffic, especially the importance of generating and preserving appropriate length speech talkspurts in order to mitigate the effects of variable network delay.
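The queuing-performance aspects mentioned above can be illustrated with the standard M/M/1 delay formulas. This is a generic sketch, not taken from the paper; the arrival and service rates used below are hypothetical.

```python
def mm1_delays(lam: float, mu: float):
    """Mean delays in an M/M/1 queue with arrival rate lam and service rate mu.

    Returns (mean time in system W, mean waiting time in queue Wq).
    Requires lam < mu for stability.
    """
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = lam / mu                 # server utilization
    w = 1.0 / (mu - lam)           # total sojourn time (waiting + service)
    wq = rho / (mu - lam)          # waiting time before service begins
    return w, wq

# With hypothetical rates of 0.5 arrivals/ms and 1 service/ms, a message
# spends 2 ms in the system on average, 1 ms of which is spent queued.
```

The formulas show why variable delay grows sharply as utilization approaches one, the regime the traffic-management strategies above are meant to avoid.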

2.
Recent advances in data collection and operations analysis techniques have facilitated the process of designing, analyzing, planning, and controlling engineering processes. Mathematical tools such as graphical models, scheduling techniques, operations research, and simulation have enabled engineers to create models that represent activities, resources, and the environment under which a project is taking place. Traditionally, most simulation paradigms use static or historical data to create computer-interpretable representations of real engineering systems. The suitability of this approach for modeling construction operations, however, has always been a challenge, since most construction projects are unique in nature: every project is different in design, specifications, methods, and standards. Due to the dynamic nature and complexity of most construction operations, there is a significant need for a methodology that combines the capabilities of traditional modeling of engineering systems and real-time field data collection. This paper presents the requirements and applicability of a data-driven modeling framework capable of collecting and manipulating real-time field data from construction equipment, creating dynamic 3D visualizations of ongoing engineering activities, and updating the contents of a discrete event simulation model representing the real engineering system. The developed framework can be adopted for use by project decision-makers for short-term project planning and control, since the resulting simulation and visualization are completely based on the latest status of project entities.

3.
The NameVoyager, a Web-based visualization of historical trends in baby naming, has proven remarkably popular. We describe design decisions behind the application and lessons learned in creating an application that makes do-it-yourself data mining popular. The prime lesson, it is hypothesized, is that an information visualization tool may be fruitfully viewed not as a tool but as part of an online social environment. In other words, to design a successful exploratory data analysis tool, one good strategy is to create a system that enables "social" data analysis. We end by discussing the design of an extension of the NameVoyager to a more complex data set, in which the principles of social data analysis played a guiding role.

4.
After a brief overview, tools representing a broad spectrum of management features are described in separate presentations. DSCC and Configuration Management Assistant offer a user-definable development framework in which other tools and systems can be included. Amplify Control, EAST IPSE, and Epos are integrated project-management software-engineering systems that provide specific functions in such development frameworks. Aisle is also such a system, but with the even more specific purpose of aiding Ada developers. The Integrated Test Tool System is a utility system that can be integrated with the output of an integrated programming environment to test and validate the results of development efforts in Ada, C, Cobol, Fortran, or Pascal. CASE-PM is an example of a learning tool being used by software-engineering students.

5.
We introduce FieldBook and GeoDatabase, two new and effective tools for geologic field data acquisition and analysis. FieldBook is an application for Apple's Newton MessagePad. Geological data collected at the outcrop, including notes and drawings, can be entered directly and on-site. The formalization of the multiparameter information leads directly to a consistent database. This procedure results in a complete, up-to-date database where all information collected by different researchers in a project is available anytime, and no data are lost. GeoDatabase is an application based on FileMaker™ Pro, representing the FieldBook interface on PC/Macintosh. GeoDatabase provides extensive search possibilities and strong export features that are needed for field-data analysis, either in the field or in the office. It can be used as a central database within a local network with several users on either a PC or a Macintosh. FieldBook and GeoDatabase are both simple to use, yet they satisfy the demands of field campaigns involving numerous scientists. Applications to field projects in the crystalline basement of the Salalah area and the Masirah ophiolite are given.

6.

Health services research provides a multi-disciplinary area of scientific exploration in relation to financial systems, social factors, organizational processes, and health technologies. With the help of big data technologies, huge amounts of data can be stored and handled effectively for diagnosis, and the treatment of diseases can be properly monitored. In recent years, diabetes mellitus, a non-transmittable illness, has become a matter of concern in many developing countries. This paper proposes a statistical assessment model for a healthcare information system for diabetes analysis employing big data. Performance metrics such as accuracy and F-measure for the proposed statistical assessment model are evaluated on the Hadoop framework; the results are higher than those of existing methods.

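The accuracy and F-measure metrics named in the abstract above are standard classification metrics. As a hedged illustration (the confusion-matrix counts below are invented, not results from the paper), they can be computed as follows:

```python
def accuracy_and_f_measure(tp: int, fp: int, fn: int, tn: int):
    """Compute accuracy and F-measure (F1) from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)          # fraction of positive predictions that were correct
    recall = tp / (tp + fn)             # fraction of actual positives that were found
    f_measure = 2 * precision * recall / (precision + recall)
    return accuracy, f_measure
```

In a Hadoop setting these counts would typically be accumulated by a map-reduce job over the predictions; the formulas themselves are unchanged.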

7.
Under the "housing is for living in, not for speculation" policy, choosing and buying a home has become a major concern for many urban residents. This paper introduces big data analysis techniques into housing price analysis: online housing price data for Guangzhou are crawled with the Scrapy framework, then cleaned and visualized, so that the factors influencing housing prices are presented in visual form. Compared with traditional methods, big data analysis techniques show clear advantages in data collection and visual analysis.
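The paper crawls listing data with Scrapy and then cleans it before visualization. A minimal stdlib sketch of the cleaning-and-aggregation step is shown below; the field layout and price format are hypothetical, not taken from the paper.

```python
import re
from collections import defaultdict
from statistics import mean

def clean_price(raw: str) -> float:
    """Strip currency symbols, thousands separators and unit text from a scraped price string."""
    digits = re.sub(r"[^\d.]", "", raw)
    return float(digits)

def mean_price_by_district(records):
    """Average cleaned unit prices per district from (district, raw_price) pairs."""
    buckets = defaultdict(list)
    for district, raw in records:
        buckets[district].append(clean_price(raw))
    return {d: mean(v) for d, v in buckets.items()}
```

The resulting per-district averages are the kind of aggregate a visualization layer (e.g. a bar chart of districts) would consume.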

8.
Road accidents cause a great loss to human lives and assets. Most accidents occur due to human errors, such as bad awareness, distraction, drowsiness, low training, and fatigue. An advanced driver assistance system (ADAS) can reduce human errors by keeping an eye on the driving environment and warning the driver of upcoming danger. However, these systems come only with modern luxury cars because of their high cost and complexity due to the several sensors employed. Therefore, camera-based ADAS are becoming an option due to their lower cost, higher availability, numerous applications and ability to combine with other systems. Targeting the design of a camera-based ADAS, we have conducted an ethnographic study of drivers to learn what information about the driving environment would be useful in preventing accidents. It turned out that information on speed, distance, relative position, direction, and size and type of the nearby objects would be useful and sufficient for implementing most ADAS functions. Several camera-based techniques are available for capturing the required information. We propose a novel design of an integrated camera-based ADAS that puts technologies (such as five ordinary CMOS image sensors, a digital image processor, and a thin display) into a smart system to offer a dozen advanced driver assistance functions. A basic prototype is also implemented using MATLAB. Our design and the prototype testify that all the required technologies are now available for implementing a full-fledged camera-based ADAS.

9.
Mobile sensing and mapping applications are becoming more prevalent because sensing hardware is becoming more portable and more affordable. However, most of the hardware uses small numbers of fixed sensors that report and share multiple sets of environmental data, which raises privacy concerns. Instead, these systems can be decentralized and managed by individuals in their public and private spaces. This paper describes a robust system called MobGeoSens which enables individuals to monitor their local environment (e.g. pollution and temperature) and their private spaces (e.g. activities and health) by using mobile phones in their day-to-day life. MobGeoSens is a combination of software components that uses the phone's internal sensing devices (e.g. microphone and camera) and external wireless sensors (e.g. data loggers and GPS receivers) for data collection. It also adds a new dimension of spatial localization to the data collection process and provides the user with both textual and spatial cartographic displays. While collecting the data, individuals can interactively add annotations and photos which are automatically added and integrated in the visualization file/log. This makes it easy to visualize the data, photos and annotations on a spatial and temporal visualization tool. In addition, the paper presents ways in which mobile phones can be used as noise sensors using an on-device microphone. Finally, we present our experiences with school children using the above-mentioned system to measure their exposure to environmental pollution.

10.
The attack on September 11, 2001 set off numerous efforts to counter terrorism and insurgencies. Central to these efforts has been the drive to improve data collection and analysis. Section 1 summarizes some of the more notable improvements among U.S. government agencies as they strive to develop their capabilities. Although progress has been made, daunting challenges remain. Section 2 reviews the basic challenges to data collection and analysis, focusing in some depth on the difficulties of data integration. Three general approaches to data integration are identified: discipline-centric, place-centric and virtual. A summary of the major challenges in data integration confronting field operators in Iraq and Afghanistan illustrates the work that lies ahead. Section 3 shifts gears to focus on the future and introduces the discipline of Visual Analytics, an emerging field dedicated to improving data collection and analysis through the use of computer-mediated visualization techniques and tools. The purpose of Visual Analytics is to maximize human capability to perceive, understand, reason, make judgments and work collaboratively with multidimensional, conflicting, and dynamic data. The paper concludes with two excellent examples of analytic software platforms that have been developed for the intelligence community, Palantir and ORA. They signal the progress made in the field of Visual Analytics to date and illustrate the opportunities that await other IS researchers interested in applying their knowledge and skills to the tracking and disrupting of dark networks.
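Platforms like ORA compute network measures over link data to surface key actors. As a generic sketch (the graph and the choice of measure are illustrative only, not taken from either product), normalized degree centrality over an adjacency list looks like this:

```python
def degree_centrality(adj):
    """Normalized degree centrality: degree / (n - 1) for each node in an undirected graph."""
    n = len(adj)
    return {node: len(neighbors) / (n - 1) for node, neighbors in adj.items()}

# A small hypothetical network: "A" is connected to everyone else.
network = {
    "A": {"B", "C", "D"},
    "B": {"A"},
    "C": {"A"},
    "D": {"A"},
}
centrality = degree_centrality(network)
```

A hub like "A" receives the maximum score of 1.0, which is the kind of signal analysts use as a starting point when mapping dark networks.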

11.
A software package, OMNILAB, has been written for the IBM PC, XT and AT computers to be a general purpose system for data collection, analysis, and display. The program supports collection of data from a variety of absorbance detectors, from pH meters and from other instruments that output time-varying analog voltages in the ranges of millivolts or volts. The program includes capabilities for averaging of data, baseline subtraction, integration of curves, and versatile formatting for both video and hardcopy display of data.

12.
Symbolic data analysis tools for recommendation systems (total citations: 3; self-citations: 2; citations by others: 1)
Recommender systems have become an important tool to cope with the information overload problem by acquiring data about user behavior. After tracing the user's behavior, through actions or rates, computational recommender systems use information-filtering techniques to recommend items. In order to recommend new items, one of three major approaches is generally adopted: content-based filtering, collaborative filtering, or hybrid filtering. This paper presents three information-filtering methods, each of them based on one of these approaches. In our methods, the user profile is built up through symbolic data structures and the user and item correlations are computed through dissimilarity functions adapted from the symbolic data analysis (SDA) domain. The use of SDA tools has improved the performance of recommender systems, particularly concerning the "find good items" task measured by the half-life utility metric, when there is not much information about the user.
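SDA commonly represents features as intervals rather than single values, and a standard dissimilarity between interval-valued descriptions is the Hausdorff distance summed over features. The sketch below is a minimal illustration of that idea; the profile encoding is hypothetical, and the paper's actual dissimilarity functions may differ.

```python
def hausdorff_interval(a, b):
    """Hausdorff distance between two intervals a = (lo, hi) and b = (lo, hi)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def profile_dissimilarity(u, v):
    """Sum of per-feature Hausdorff distances between two interval-valued profiles."""
    return sum(hausdorff_interval(a, b) for a, b in zip(u, v))
```

A user profile here is simply a list of intervals (one per feature, e.g. a rating range per genre), and items described the same way can be ranked by ascending dissimilarity.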

13.
This paper proposes a generic approach for designing vulnerability testing tools for web services, which includes the definition of the testing procedure and the tool components. Based on the proposed approach, we present the design of three innovative testing tools that implement three complementary techniques (improved penetration testing, attack signatures and interface monitoring, and runtime anomaly detection) for detecting injection vulnerabilities, thus offering an extensive support for different scenarios. A case study has been designed to demonstrate the tools for the particular case of SQL Injection vulnerabilities. The experimental evaluation demonstrates that the tools can effectively be used in different scenarios and that they outperform well-known commercial tools by achieving higher detection coverage and lower false-positive rates.
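One of the three techniques named above combines attack signatures with interface monitoring. A naive sketch of the signature side is shown below; the payloads and error signatures are illustrative only (real tools use far larger, curated sets), and this is not the paper's implementation.

```python
# Hypothetical injection payloads a tester might send to a web-service parameter.
ATTACK_PAYLOADS = ["' OR '1'='1", "'; DROP TABLE users; --"]

# Database error fragments whose presence in a response suggests the
# payload reached the SQL layer unsanitized.
ERROR_SIGNATURES = [
    "you have an error in your sql syntax",
    "unclosed quotation mark",
    "sqlite3.operationalerror",
]

def response_suggests_injection(body: str) -> bool:
    """Flag a response whose body contains a known SQL error signature."""
    lowered = body.lower()
    return any(sig in lowered for sig in ERROR_SIGNATURES)
```

In a full tool this check would run over every response produced by every payload, and a positive match would be cross-checked by the monitoring side to reduce false positives.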

14.
Ergonomics, 2012, 55(6): 423-435
The collection of in-depth road crash data is considered from three perspectives. (1) The problems of sampling are illustrated by reference to several recent studies. (2) The techniques of collecting statements from witnesses are reviewed and the problem of when to stop collecting details is illustrated by consideration of the importance of post-traumatic psychological sequelae of road crashes. (3) The training and support systems needed for personnel working on in-depth projects are considered. A simple contributing factor classification technique for the analysis of road crash data is described and its usefulness illustrated by the brief presentation of some results of in-depth studies at the Traffic Accident Research Unit including pedestrian, pedal and motorcycle crashes.

15.
Context-sensitive systems (CSS) are computer systems that use context to provide more relevant services or information to support users performing their tasks, where context is any information that can be used to characterize the situation in which something exists or occurs. CSS demand that designers consider new aspects and challenges in comparison to traditional applications. In a preliminary experiment, we observed that developers find it difficult to include the concept of context in their applications. However, only a few approaches offer integrated domain-independent support for developing CSS. This paper presents an integrated approach to assist the design of CSS. The originality of this work lies in the proposed way of thinking about context, in the proposed context metamodel and in the specification of a process for designing CSS. The metamodel supports building context models by making explicit the concepts related to context manipulation and by separating the context structure model from the CSS behavior model. The design process details the main activities related to context specification and the design of CSS, providing a systematic way to execute these tasks. This work also advances the state of the art related to the understanding of the concept of context (and its associated concepts). Three experimental studies were conducted to evaluate the proposal: its instantiation in the design of a context-sensitive Expert Recommender System, its usage by distinct designers on their CSS projects, and a qualitative evaluation of the overall proposal by experienced CSS developers. These studies showed a good acceptance of our approach and indicated the feasibility of using it on real projects.

16.

The Advanced Very High Resolution Radiometer (AVHRR) is currently the only operational remote sensing system capable of providing global daily data which can be used for vegetation monitoring. These data are available with resolution cell sizes ranging from around one to 20 km on a side, though the temporal and spatial extent of cover at each resolution is variable. In this paper, Normalized Difference Vegetation Index (NDVI) temporal curves derived from AVHRR at different resolutions are compared over both agricultural and natural tropical vegetation types. For the agricultural regions the length of growing season and major breaks of slope associated with key crop development events are equally well shown at coarse and fine resolution. Detailed examination of the curves reveals differences thought to result from temporal changes in landscape structure. Temporal curves derived from AVHRR data at different spatial resolutions show that the spatial organization of both agricultural and natural landscapes, tropical forest in this case, changes throughout a single season. Transitions across major ecological zones are detected across a range of resolutions, though the undersampling employed in the generation of the coarser resolution products is found to exert some limitations on the spatial representativeness of these data; this varies both with geographical area and time. These observations highlight the importance of a consideration of scale when using AVHRR data for vegetation monitoring, and emphasize the need for different scales of observation (both in temporal and spatial terms) for different problems and at different times of the year.
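The NDVI used throughout the abstract above is a standard index computed per pixel from near-infrared and red reflectance; the reflectance values in the sketch below are invented for illustration.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from near-infrared and red reflectance.

    Values near +1 indicate dense green vegetation; values near 0 or below
    indicate bare soil, water, or senescent cover.
    """
    return (nir - red) / (nir + red)
```

A temporal NDVI curve of the kind the paper compares is simply this value computed for the same pixel (or resolution cell) across successive acquisition dates.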

17.
A software package is described that collects, graphs, performs peak sensing, and prints data from a spectrophotometer interfaced to an Apple II microcomputer with an ADALAB analog-to-digital converter card. Data collection and storage are performed by an interpreted BASIC program with machine language subroutines. Recalling, graphing, and printout of the data is accomplished by a separate compiled BASIC program. Both of these programs, as well as assisting utility programs, are under control of a menu program. The programs take advantage of all available memory in the 64-K Apple II computer to allow for storage of up to 12 600 data points (7 hours of data collection at a sampling rate of 1 sample every 2 s). As the data collection program provides peak sensing, and the capability of annotating various time points, the programs are ideally suited for use in column chromatography.

18.
This paper comprehensively discusses the concepts, current state, key technologies, and applications of big data analysis, including the basic concepts and characteristics of big data, the background in which big data analysis emerged, its key technologies, and the changes and challenges big data brings. Among the key technologies, big data cleaning and fusion, big data processing frameworks, and big data modeling and analysis are introduced in detail.

19.
In the legal domain, it is rare to find solutions to problems by simply applying algorithms or invoking deductive rules in some knowledge-based program. Instead, expert practitioners often supplement domain-specific knowledge with field experience. This type of expertise is often applied in the form of an analogy. This research proposes to combine both reasoning with precedents and reasoning with statutes and regulations in a way that will enhance the statutory interpretation task. This is being attempted through the integration of database and expert system technologies. Case-based reasoning is being used to model legal precedents while rule-based reasoning modules are being used to model the legislation and other types of causal knowledge. It is hoped to generalise these findings and to develop a formal methodology for integrating case-based databases with rule-based expert systems in the legal domain.

20.
We introduce from first principles an analysis of the information content of multivariate distributions as information sources. Specifically, we generalize a balance equation and a visualization device, the Entropy Triangle, for multivariate distributions and find notable differences with similar analyses done on joint distributions as models of information channels. As an example application, we extend a framework for the analysis of classifiers to also encompass the analysis of data sets. With such tools we analyze a handful of UCI machine learning tasks to start addressing the question of how well datasets convey the information they are supposed to capture about the phenomena they stand for.
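The balance analysis above rests on standard Shannon quantities. As a sketch (the joint distribution below is invented, and this is the textbook bivariate identity rather than the paper's multivariate generalization), the balance H(X) + H(Y) = H(X,Y) + I(X;Y) can be checked numerically:

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a probability mapping {outcome: p}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginals(joint):
    """Marginal distributions of a joint {(x, y): p} mapping."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py

# A perfectly correlated pair of fair bits: knowing X determines Y.
joint = {(0, 0): 0.5, (1, 1): 0.5}
px, py = marginals(joint)
mutual_info = entropy(px) + entropy(py) - entropy(joint)
```

For this distribution all the entropy of either variable is shared, so the mutual information equals one full bit; the Entropy Triangle visualizes how such budgets split between shared, private, and residual components.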


Copyright©北京勤云科技发展有限公司  京ICP备09084417号