Similar Articles
Found 20 similar articles (search time: 15 ms)
1.
Scientific datasets are often difficult to analyse or visualize, due to their large size and high dimensionality. A multistep approach to address this problem is proposed. Data management techniques are used to identify areas of interest within the dataset. This allows the reduction of a dataset's size and dimensionality, and the estimation of missing values or correction of erroneous entries. The results are displayed using visualization techniques based on perceptual rules. The visualization tools are designed to exploit the power of the low-level human visual system. The result is a set of displays that allow users to perform rapid and accurate exploratory data analysis. In order to demonstrate the techniques, an environmental dataset being used to model salmon growth and migration patterns was visualized. Data mining was used to identify significant attributes and to provide accurate estimates of plankton density. Colour and texture were used to visualize the significant attributes and estimated plankton densities for each month for the years 1956-1964. Experiments run in the laboratory showed that the chosen colours and textures support rapid and accurate element identification, boundary detection, region tracking and estimation. The result is a visualization tool that allows users to quickly locate specific plankton densities and the boundaries they form. Users can compare plankton densities to other environmental conditions like sea surface temperature and current strength. Finally, users can track changes in any of the dataset's attributes on a monthly or yearly basis.

2.


3.
Any large organizations that first came online in the late 1990s are now facing the decision whether to upgrade their Web systems or to start anew. Given the speed with which new technologies are introduced in the Web environment, system deployment life cycles have shrunk significantly-but so have system life spans. After only a few years, an organization's Internet infrastructure is likely to need a major overhaul. In late 2001, the systems architecture team to which I belong took on these issues for an organization that wanted to rebuild its Web infrastructure. The existing infrastructure contained multiple single points of failure, could not scale to expected usage patterns, was built on proprietary systems, and had a high management overhead. The legacy infrastructure had grown organically over the previous five years as administrators added unplanned features and functionality, and usage had grown 100-fold since the specifications were initially developed. Because of the age and condition of the legacy systems, we decided to redesign the solution from scratch to overcome the inherent limitations. This case study describes the process our systems architecture team followed for designing and deploying the new architecture. I detail the component selection rationale, with implementation details where allowed. Ours is just one successful approach to deploying a multisite, fully redundant Web-based system for a large organization; other reasonable and viable ways to build such a system also exist.

4.
A low cost, high-speed, general-purpose digital signal processing system was constructed using the TMS32010 digital signal processor. The system was designed with simplicity, compactness, flexibility and expandability in mind. A parallel processing architecture was adopted to achieve real-time performance. Four processors were used in the prototype system, but this can be expanded easily. Interprocessor data transfer and communications with the host computer are facilitated via a single common bus and a bank of shared memory. A one-dimensional digital FIR filter and a real-time FFT program were used to evaluate the performance of the system. In addition, a real-time spectrogram was implemented as an application example.
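The FIR filter used as a benchmark in this abstract computes y[n] = Σ_k h[k]·x[n−k]. As a minimal illustration of that workload (the 5-tap moving-average coefficients here are an invented example, not the paper's actual filter):

```python
import numpy as np

def fir_filter(x, h):
    """Direct-form FIR filter: convolve input x with taps h, trimmed to len(x)."""
    return np.convolve(x, h, mode="full")[:len(x)]

taps = np.ones(5) / 5.0              # 5-tap moving-average filter (illustrative)
signal = np.arange(10, dtype=float)  # simple ramp as a test input
out = fir_filter(signal, taps)       # out[4] = (0+1+2+3+4)/5 = 2.0
```

On real DSP hardware such as the TMS32010, the same multiply-accumulate loop would be unrolled across the parallel processors described above.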

5.
Building and using a scalable display wall system (cited by: 4; self-citations: 0; other citations: 4)
Princeton's scalable display wall project explores building and using a large-format display with commodity components. The prototype system has been operational since March 1998. Our goal is to construct a collaborative space that fully exploits a large-format display system with immersive sound and natural user interfaces. Our prototype system is built with low-cost commodity components: a cluster of PCs, PC graphics accelerators, consumer video and sound equipment, and portable presentation projectors. This approach has the advantages of low cost and of tracking technology trends well, as high-volume commodity components typically have better price-performance ratios and improve at faster rates than special-purpose hardware. We report our early experiences in building and using the display wall system. In particular, we describe our approach to research challenges in several specific research areas, including seamless tiling, parallel rendering, parallel data visualization, parallel MPEG decoding, layered multiresolution video input, multichannel immersive sound, user interfaces, application tools, and content creation.

6.
The use of mechanical trading systems allows managing a huge amount of data related to the factors affecting investment performance (macroeconomic variables, company information, industrial indicators, market variables, etc.) while avoiding the psychological reactions of traders when they invest in financial markets. When trading is executed at an intra-daily frequency instead of a daily frequency, mechanical trading systems need to be supported by very powerful engines, since the amount of data to deal with grows while the response time required to support trades gets shorter. Numerous studies document the use of genetic algorithms (GAs) as the engine driving mechanical trading systems. The empirical insights provided in this paper demonstrate that the combined use of a GA together with a GPU-CPU architecture enormously speeds up the power and search capacity of the GA for this kind of financial application. Moreover, the parallelization allows us to implement and test previous GA approximations. Regarding the investment results, we can report a profit of 870% for the S&P 500 companies over a 10-year period (1996–2006), when the average profit of the S&P 500 in the same period was 273%.
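To make the GA-driven trading idea concrete, here is a toy sketch in which each chromosome encodes two moving-average window lengths for a crossover rule, and fitness is the rule's profit on a synthetic price series. All names, the fitness function, and the price series are illustrative assumptions, not the paper's actual setup:

```python
import random

random.seed(0)
# Synthetic price series with a mild trend and periodic swings (invented data).
prices = [100 + i + 5 * ((i // 7) % 2) for i in range(200)]

def sma(p, n, t):
    """Simple moving average of the n prices before time t."""
    return sum(p[t - n:t]) / n

def fitness(chrom):
    """Profit of a moving-average crossover rule (short, long) on `prices`."""
    short, long_ = chrom
    if short >= long_:
        return float("-inf")           # invalid chromosome
    cash, pos = 0.0, 0
    for t in range(long_, len(prices)):
        if sma(prices, short, t) > sma(prices, long_, t) and pos == 0:
            pos, cash = 1, cash - prices[t]   # buy one unit
        elif sma(prices, short, t) < sma(prices, long_, t) and pos == 1:
            pos, cash = 0, cash + prices[t]   # sell
    return cash + pos * prices[-1]            # mark open position to market

# Tiny GA loop: truncation selection plus uniform crossover (no mutation).
pop = [(random.randint(2, 10), random.randint(11, 50)) for _ in range(20)]
for _ in range(15):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [(random.choice(parents)[0],
                      random.choice(parents)[1]) for _ in range(10)]
best = max(pop, key=fitness)
```

The GPU-CPU split the paper describes would evaluate `fitness` for the whole population in parallel on the GPU while the CPU handles selection and crossover.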

7.
8.
The mathematician-architect Christopher Alexander has devised a theory of objective architectural design. He believes that all architectural forms can be described as interacting patterns, all possible relationships of which are governed by generative rules. These form a ‘pattern language’ capable of generating forms appropriate for a given environmental context. The complexity of interaction among these rules leads to difficulties in their representation by conventional methods. This paper presents a Prolog-based expert system which implements Alexander's design methodology to produce perspective views of partially and fully differentiated 3-dimensional architectural forms.

9.
Rapide is an event-based, concurrent, object-oriented language specifically designed for prototyping system architectures. Two principal design goals are: (1) to provide constructs for defining executable prototypes of architectures and (2) to adopt an execution model in which the concurrency, synchronization, dataflow, and timing properties of a prototype are explicitly represented. This paper describes the partially ordered event set (poset) execution model and outlines with examples some of the event-based features for defining communication architectures and relationships between architectures. Various features of Rapide are illustrated by excerpts from a prototype of the X/Open distributed transaction processing reference architecture.
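The poset execution model can be sketched in a few lines: each event records its direct causal predecessors, and two events are ordered only if one is reachable from the other; unordered events are concurrent. The class and function names below are illustrative, not Rapide's actual API:

```python
class Event:
    """An event in a partially ordered event set (poset)."""
    def __init__(self, name, causes=()):
        self.name = name
        self.causes = set(causes)   # direct causal predecessors

def precedes(a, b):
    """True if event a causally precedes event b (transitive closure over causes)."""
    frontier = set(b.causes)
    while frontier:
        if a in frontier:
            return True
        frontier = set().union(*(e.causes for e in frontier))
    return False

# A send causally precedes the matching receive; an unrelated log event
# is concurrent with both (neither precedes the other).
send = Event("send")
recv = Event("recv", causes=[send])
log = Event("log")
```

Representing executions as posets rather than total orders is what lets the model expose genuine concurrency instead of one arbitrary interleaving.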

10.
Traditional interactive system architectures such as MVC [Goldberg, A., 1984. Smalltalk-80: The Interactive Programming Environment, Addison-Wesley Publ.] and PAC [Coutaz, J., 1987. PAC, an implementation model for dialog design. In: Interact’87, Stuttgart, September 1987, pp. 431-436; Coutaz, J., 1990. Architecture models for interactive software: failures and trends. In: Cockton, G. (Ed.), Engineering for Human-Computer Interaction, Elsevier Science Publ., pp. 137-153.] decompose the system into subsystems that are relatively independent, thereby allowing the design work to be partitioned between the user interfaces and underlying functionalities. Such architectures extend the independence assumption to usability, approaching the design of the user interface as a subsystem that can be designed and tested independently from the underlying functionality. This Cartesian dichotomy can be fallacious, as functionalities buried in the application’s logic can sometimes affect the usability of the system. Our investigations model the relationships between internal software attributes and externally visible usability factors. We propose a pattern-based approach for dealing with these relationships. We conclude by discussing how these patterns can lead to a methodological framework for improving interactive system architectures, and how these patterns can support the integration of usability in the software design process.

11.
This paper will show how architecture design and analysis techniques rest on a small number of foundational principles. We will show how those principles have been instantiated as a core set of techniques. These techniques, combined together, have resulted in several highly successful architecture analysis and design methods. Finally, we will show how these foundations, and the techniques that instantiate them, can be re-combined for new purposes addressing problems of ever-increasing scale, specifically: addressing the highly complex problems of analyzing software-intensive ecosystems.

12.
13.
The IP packet forwarding of the current Internet is mainly destination based. In the forwarding process, the source IP address is not checked in most cases. This causes serious security, management and accounting problems. Based on the drastically increased IPv6 address space, a "source address validation architecture" (SAVA) is proposed in this paper, which can guarantee that every packet received and forwarded holds an authenticated source IP address. The design goals of the architecture are lightweight, loose coupling, "multi-fence support" and incremental deployment. This paper discusses the design and implementation of the architecture, including inter-AS, intra-AS and local subnet levels. The performance and scalability of SAVA are described. This architecture is deployed in the CNGI-CERNET2 infrastructure, a large-scale native IPv6 backbone network of the China Next Generation Internet project. We believe that SAVA will help the transition to a new, more secure and dependable Internet.
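At the local-subnet level, the core of SAVA-style filtering is a binding check: a packet is forwarded only if its source address falls inside the prefix bound to the interface it arrived on. The sketch below illustrates that check with an invented binding table (the interface names and 2001:db8 documentation prefixes are examples, not CNGI-CERNET2 data):

```python
import ipaddress

# Hypothetical per-interface source-prefix bindings (illustrative only).
binding_table = {
    "eth0": ipaddress.ip_network("2001:db8:1::/48"),
    "eth1": ipaddress.ip_network("2001:db8:2::/48"),
}

def validate_source(iface, src):
    """Return True if src is a legitimate source address for packets on iface."""
    prefix = binding_table.get(iface)
    return prefix is not None and ipaddress.ip_address(src) in prefix
```

The inter-AS and intra-AS levels described in the paper generalize this idea from interface-to-prefix bindings to bindings between neighboring routers and autonomous systems.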

14.
Over the past few years, the Internet of Things has gone from theoretical concept to everyday living experience. The explosive growth of sensor streams has also led to a new paradigm of edge computing. In surveillance systems, edge-based automation is crucial for obtaining fast responses in data analytics among connected devices. In this paper, we propose an automated surveillance system to improve robustness and intelligence. Our scalable architecture is an alternative way of reducing server resource usage and wireless network limitations.

15.
Multimedia Tools and Applications - Object detection in computer vision has been a significant research area for the past decade. Identifying objects with multiple classes from an image has...

16.
In recent years, network applications and hardware technology have developed at an impressive speed. Consequently, a large-scale network switching system is needed to satisfy the demands of network service providers and users for services such as data, voice, image, video on demand, videoconferencing, telecommunications, remote control and teaching, etc. A general large-scale network switching system cannot fulfill various application needs, such as the correctness of data transmission and the capacity to extend the input/output ports of the switching system. In this paper, we propose a nested ring-based architecture to build a very large-scale network switching system. In order to satisfy these various network application needs, the nested ring-based architecture is designed as a switching element. It can quickly switch input data to the destined output port, and the input/output ports of the switching system can easily be extended up to 100 or 1000 ports to construct a very large-scale network switching system. The simulation results show that better performance can be achieved in data transmission, delay, throughput and extensibility for the proposed system.

17.
Depth from focus using a pyramid architecture (cited by: 1; self-citations: 0; other citations: 1)
A method is presented for depth recovery through the analysis of scene sharpness across changing focus position. Modeling a defocused image as the application of a low pass filter on a properly focused image of the same scene, we can compare the high spatial frequency content of regions in each image and determine the correct focus position. Recovering depth in this manner is inherently a local operation, and can be done efficiently using a pipelined image processor. Laplacian and Gaussian pyramids are used to calculate sharpness maps which are collected and compared to find the focus position that maximizes high spatial frequencies for each region.
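The core step of depth from focus can be sketched with a single-scale sharpness measure: for each focus position, score the image by its high-frequency energy (here a discrete Laplacian), and pick the position with the highest score. This is a simplification for illustration; the paper uses full Laplacian and Gaussian pyramids rather than one scale:

```python
import numpy as np

def sharpness(img):
    """High-frequency energy: sum of absolute discrete Laplacian responses."""
    lap = (-4 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return np.abs(lap).sum()

# Simulate a two-position focal stack: a sharp vertical edge versus a
# box-blurred copy of the same edge (synthetic data for illustration).
sharp = np.zeros((8, 8))
sharp[:, 4:] = 1.0
blurred = (sharp + np.roll(sharp, 1, axis=1) + np.roll(sharp, -1, axis=1)) / 3.0
stack = [blurred, sharp]

# The in-focus image maximizes the sharpness score.
best_focus = max(range(len(stack)), key=lambda i: sharpness(stack[i]))
```

Applying the same argmax per region of a full focal stack, and mapping each winning focus position to its calibrated distance, yields the depth map.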

18.
A software architecture to engineer complex process control applications must combine into the same paradigm efficient reactive and real-time functionalities and mechanisms to capture dynamic time-pressured intelligent behaviors, and must provide convenient high-level tools to free the programmer from having to think at an inappropriate level of detail. We implement such characteristics in a blackboard framework that builds the basic abstract elements of reactive behavior and the blackboard computational model on top of low-level real-time operating system functions. Under this approach, the engineer gets a powerful and flexible high-level medium to map a complex system design that requires artificial intelligence techniques, like intelligent monitoring, and reactive planning and execution, with full support for real-time programming. The paper also reviews other alternatives which have been explored in recent years for implementing complex reactive planning and execution systems.

19.
Neural Computing and Applications - Nowadays, lecture-recording systems play a vital role in collecting spoken discourse for e-learning. However, in view of the growing development of e-learning,...

20.
Issues are discussed that relate to designing a state automated scientometric information system. A conceptual model is developed. The model, as well as its software analogues, is used to analyze the experience in the design of scientometric information systems in Russia and abroad. An original scheme is proposed for the implementation of the program architecture. The main feature of the scheme is the ability to collect all the necessary data and provide functionality to end users through the import and export of web services.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号