The JISC-funded Focus on Access to Institutional Resources (FAIR) Programme ran from 2002 to 2005. The 14 projects within the programme investigated the cultural, organisational, legal and technical factors involved in providing places where the growing volume of institutional digital content could be stored and, where appropriate, shared with others in the Higher and Further Education communities. The primary technology enabling such sharing is the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), a lightweight protocol based on sharing metadata about the digital content available. The technical issues were at times overshadowed by the cultural, organisational and legal issues that had to be addressed. The experience of the Programme as a whole provides valuable insight into the issues involved in sharing content and a good starting point for other institutions wishing to investigate this capability. A Synthesis of the Programme was commissioned in late 2004 to capture this experience and the tangible outputs produced. A website providing a comprehensive listing of all project outputs was produced, and a printed brochure was published in late 2005 as an introduction to the Programme and its findings. This article summarises the findings of the FAIR Synthesis and provides a range of pointers to further information for subsequent investigation.
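To make the metadata-sharing mechanism concrete, the following is a minimal sketch of an OAI-PMH ListRecords harvest in Python. The verb and metadataPrefix parameters and the response namespaces are part of the published protocol; the repository URL is hypothetical.

```python
# Minimal OAI-PMH harvesting sketch; the repository endpoint below is hypothetical.
from urllib.request import urlopen
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def harvest_titles(base_url: str) -> list[str]:
    """Issue a single ListRecords request and return the Dublin Core titles."""
    query = urlencode({"verb": "ListRecords", "metadataPrefix": "oai_dc"})
    with urlopen(f"{base_url}?{query}") as response:
        tree = ET.parse(response)
    # Each <record> carries an oai_dc metadata block; pull out the <dc:title> elements.
    return [title.text
            for record in tree.iter(f"{OAI_NS}record")
            for title in record.iter(f"{DC_NS}title")]

if __name__ == "__main__":
    # Hypothetical institutional repository endpoint.
    for title in harvest_titles("https://repository.example.ac.uk/oai"):
        print(title)
```

A full harvester would also follow resumptionToken elements to page through large result sets, which this sketch omits.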
A near-resonant, sway-induced sloshing flow in a rectangular tank is used to compare homogeneous and inhomogeneous multiphase treatments of fluid density and viscosity in a commercial CFD code. Dimensional analysis of the relative motion between the phases suggests the use of an inhomogeneous multiphase model, whereas previously published work has used the computationally cheaper homogeneous (average property) approach. The comparison between the computational and experimental results shows that the homogeneous model tends to underestimate the experimental peak pressures by up to 50%. The inhomogeneous multiphase model gives good agreement with the experimental pressure data. Examination of the relative velocity at the fluid interface confirms that the inhomogeneous model is the appropriate choice for simulating a violent sloshing flow.
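The distinction the abstract draws can be illustrated with a small sketch, assuming nothing about the commercial code's internals: the homogeneous approach shares a single velocity field and mixes fluid properties by volume fraction, while the inhomogeneous approach keeps a velocity field per phase, so a relative (slip) velocity can develop at the interface. All values below are illustrative.

```python
def homogeneous_properties(alpha_liquid, rho_liquid, rho_gas, mu_liquid, mu_gas):
    """Volume-fraction-weighted (average property) density and viscosity,
    as used by a homogeneous multiphase model."""
    rho_mix = alpha_liquid * rho_liquid + (1.0 - alpha_liquid) * rho_gas
    mu_mix = alpha_liquid * mu_liquid + (1.0 - alpha_liquid) * mu_gas
    return rho_mix, mu_mix

def relative_velocity(u_liquid, u_gas):
    """Interfacial slip velocity, resolved only by an inhomogeneous
    (per-phase velocity field) model."""
    return u_gas - u_liquid

# Example: water/air properties at a cell that is 80% liquid.
print(homogeneous_properties(0.8, 998.0, 1.2, 1.0e-3, 1.8e-5))
```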
The environmental and societal impacts of tropical cyclones could be reduced using a range of management initiatives. Remote sensing can be a cost-effective and accurate tool for mapping the multiple impacts caused by tropical cyclones, using high-to-moderate spatial resolution (5–30 m) satellite imagery to provide data essential for evacuation, relief, and the management of natural resources. This study developed and evaluated an approach for assessing the impacts of tropical cyclones through object-based image analysis and moderate spatial resolution imagery. Pre- and post-cyclone maps of artificial and natural features, which can be acquired by mapping specific land-cover types, are required for assessing the overall impacts on the landscape. We used the object-based approach to map land-cover types in pre- and post-cyclone Satellite Pour l'Observation de la Terre (SPOT) 5 image data, and the post-classification comparison technique to identify changes in particular features in the landscape. Cyclone Sidr (2007) was used to test the applicability of this approach in Sarankhola Upazila, Bangladesh. The object-based approach classified features from the pre- and post-cyclone satellite images with overall accuracies of 95.43% and 93.27%, respectively. The mapped changes identified the extent, type, and form of cyclone-induced impacts. Our results indicate that 63.15% of the study area was significantly affected by cyclone Sidr. The majority of mapped damage was found in vegetation, cropped lands, settlements, and infrastructure. The damage results were verified against high spatial resolution satellite imagery, reports, and photographs taken after the cyclone. The methods developed may be used in future to assess the multiple impacts caused by tropical cyclones in Bangladesh and similar environments, for the purposes of tropical cyclone disaster management.
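A minimal sketch of the post-classification comparison step is given below, assuming the object-based classification of the pre- and post-cyclone scenes has already produced two labelled rasters. The class codes and array sizes are hypothetical.

```python
# Post-classification comparison change detection on two classified rasters.
import numpy as np

def change_matrix(pre: np.ndarray, post: np.ndarray, n_classes: int) -> np.ndarray:
    """Count pixels moving from each pre-cyclone class to each post-cyclone class."""
    assert pre.shape == post.shape
    matrix = np.zeros((n_classes, n_classes), dtype=np.int64)
    np.add.at(matrix, (pre.ravel(), post.ravel()), 1)
    return matrix

def changed_fraction(pre: np.ndarray, post: np.ndarray) -> float:
    """Fraction of the scene whose class label changed between the two dates."""
    return float(np.mean(pre != post))

# Hypothetical 4-class maps (0 = water, 1 = vegetation, 2 = cropland, 3 = settlement).
rng = np.random.default_rng(0)
pre = rng.integers(0, 4, size=(100, 100))
post = rng.integers(0, 4, size=(100, 100))
print(change_matrix(pre, post, 4))
print(f"{changed_fraction(pre, post):.2%} of pixels changed class")
```

The off-diagonal cells of the change matrix indicate the type and extent of class transitions, which is the basis for the damage mapping described above.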
Large datasets typically contain coarse features composed of finer sub-features. Even if the shapes of the small structures are evident in a 3D display, the aggregate shapes they suggest may not be easily inferred. Previous studies in shape perception have not made clear whether physically-based illumination confers any advantage over local illumination for understanding scenes that arise in visualization of large datasets containing features at two distinct scales. In this paper we show that physically-based illumination can improve perception of some static scenes of complex 3D geometry derived from flow fields. We perform human-subjects experiments to quantify the effect of physically-based illumination on participant performance for two tasks: selecting the closer of two streamtubes from a field of tubes, and identifying the shape of the domain of a flow field over different densities of tubes. We find that physically-based illumination influences participant performance as strongly as perspective projection, suggesting that physically-based illumination is indeed a strong cue to the layout of complex scenes. We also find that increasing the density of tubes for the shape identification task improved participant performance under physically-based illumination but not under the traditional hardware-accelerated illumination model.
Interactive history tools, ranging from basic undo and redo to branching timelines of user actions, facilitate iterative forms of interaction. In this paper, we investigate the design of history mechanisms for information visualization. We present a design space analysis of both architectural and interface issues, identifying design decisions and associated trade-offs. Based on this analysis, we contribute a design study of graphical history tools for Tableau, a database visualization system. These tools record and visualize interaction histories, support data analysis and the communication of findings, and contribute novel mechanisms for presenting, managing, and exporting histories. Furthermore, we have analyzed aggregated collections of history sessions to evaluate Tableau usage. We describe additional tools for analyzing users' history logs and how they have been applied to study usage patterns in Tableau.
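As a minimal sketch of the kind of branching history model covered by such a design space analysis (an illustration only, not Tableau's actual implementation), a history can be kept as a tree of states in which undoing and then acting creates a new branch rather than discarding the redo path:

```python
# Illustrative branching undo/redo history; node payloads could be state
# snapshots or commands, depending on the architectural choice.
from dataclasses import dataclass, field

@dataclass
class HistoryNode:
    state: object
    parent: "HistoryNode | None" = None
    children: list["HistoryNode"] = field(default_factory=list)

class BranchingHistory:
    """Undo/redo over a tree of states instead of a linear stack."""
    def __init__(self, initial_state):
        self.root = HistoryNode(initial_state)
        self.current = self.root

    def record(self, new_state):
        node = HistoryNode(new_state, parent=self.current)
        self.current.children.append(node)   # branch; earlier redo paths are preserved
        self.current = node

    def undo(self):
        if self.current.parent is not None:
            self.current = self.current.parent
        return self.current.state

    def redo(self, branch: int = -1):
        if self.current.children:
            self.current = self.current.children[branch]  # most recent branch by default
        return self.current.state
```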
In many applications, volumetric data sets are examined by displaying isosurfaces, surfaces where the data, or some function of the data, takes on a given value. Interactive applications typically use local lighting models to render such surfaces. This work introduces a method to precompute or lazily compute global illumination to improve interactive isosurface renderings. The precomputed illumination resides in a separate volume and includes direct light, shadows, and interreflections. Using this volume, interactive globally illuminated renderings of isosurfaces become feasible while still allowing dynamic manipulation of lighting, viewpoint, and isovalue.
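A minimal sketch of the idea follows: global illumination is precomputed into a separate volume, and at render time a surface point is shaded by sampling that volume rather than re-solving light transport. The data layout here is a simplification for illustration, not the paper's exact scheme.

```python
import numpy as np

def sample_illumination(illum_volume: np.ndarray, point: np.ndarray) -> float:
    """Nearest-voxel lookup into the precomputed illumination volume
    (domain assumed to be the unit cube)."""
    max_idx = np.array(illum_volume.shape) - 1
    idx = np.clip((point * max_idx).round().astype(int), 0, max_idx)
    return float(illum_volume[tuple(idx)])

def shade(point, normal, light_dir, albedo, illum_volume):
    """Combine a cheap local diffuse term with the stored global term."""
    local = max(float(np.dot(normal, light_dir)), 0.0)
    global_term = sample_illumination(illum_volume, point)
    return albedo * local * global_term

# Hypothetical precomputed illumination volume and a single surface sample.
illum = np.full((64, 64, 64), 0.7)
p = np.array([0.5, 0.5, 0.5])
n = np.array([0.0, 0.0, 1.0])
L = np.array([0.0, 0.0, 1.0])
print(shade(p, n, L, albedo=0.8, illum_volume=illum))
```

In an interactive renderer the lookup would typically use trilinear interpolation on the GPU; the nearest-voxel lookup here only keeps the sketch short.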
Many important science and engineering applications, such as regulating the temperature distribution over a semiconductor wafer and controlling the noise from a photocopy machine, require interpreting distributed data and designing decentralized controllers for spatially distributed systems. Developing effective computational techniques for representing and reasoning about these systems, which are usually modeled with partial differential equations (PDEs), is one of the major challenge problems for qualitative and spatial reasoning research.
This paper introduces a novel approach to decentralized control design, called influence-based model decomposition, and applies it in the context of thermal regulation. Influence-based model decomposition uses a decentralized model, called an influence graph, as a key data abstraction representing the influences of controls on distributed physical fields. It serves as the basis for novel algorithms for control placement and parameter design for distributed systems with large numbers of coupled variables. These algorithms exploit physical knowledge of locality, linear superposability, and continuity, encapsulated in influence graphs representing dependencies of field nodes on control nodes. The control placement design algorithms use influence graphs to decompose a problem domain so as to decouple the resulting regions. The decentralized control parameter optimization algorithms use influence graphs to efficiently evaluate thermal fields and to explicitly trade off computation, communication, and control quality. By leveraging the physical knowledge encapsulated in influence graphs, these control design algorithms are more efficient than standard techniques, and produce designs explainable in terms of problem structures.
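A minimal sketch of an influence graph for a linear, superposable field is given below; it is an illustration of the abstraction rather than the paper's algorithms. Entry G[i, j] holds the steady-state influence of control j on field node i, so the field can be evaluated by superposition and the domain can be decomposed by assigning each field node to its dominant control. The numbers and shapes are hypothetical.

```python
import numpy as np

def evaluate_field(influence: np.ndarray, control_values: np.ndarray) -> np.ndarray:
    """Thermal field at every node as a linear superposition of control contributions."""
    return influence @ control_values

def decompose_by_dominant_control(influence: np.ndarray) -> np.ndarray:
    """Assign each field node to the control that influences it most strongly,
    yielding approximately decoupled regions."""
    return np.argmax(np.abs(influence), axis=1)

# 6 field nodes, 2 candidate control placements (hypothetical influences).
G = np.array([[0.9, 0.1],
              [0.8, 0.2],
              [0.6, 0.4],
              [0.3, 0.7],
              [0.2, 0.8],
              [0.1, 0.9]])
u = np.array([40.0, 60.0])               # control inputs, e.g. heater set points
print(evaluate_field(G, u))
print(decompose_by_dominant_control(G))  # -> regions [0, 0, 0, 1, 1, 1]
```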
This study extended client-focused research by using the nearest neighbor (NN) approach, a client-specific sampling and prediction strategy derived from research on alpine avalanches. Psychotherapy clients (N = 203) seen in routine practice settings in the United Kingdom completed a battery of intake measures and then rated symptom intensity before each session. Forecasts of each client's rate of change and session-by-session variability were computed on the basis of that client's NNs (n = 10-50 in different comparisons). Forecasts based on linear or log-linear slopes were compared with an alternative prediction strategy. Results showed that the NN approach was superior to the alternative model in predicting rate of change, though the advantage was less clear for predicting variability.
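The core of the NN forecasting idea can be sketched as follows, assuming each prior client's intake profile and fitted symptom-change slope are available; this is an illustration of the approach, not the authors' exact implementation.

```python
import numpy as np

def nn_forecast(intake_new: np.ndarray,
                intake_prior: np.ndarray,
                slopes_prior: np.ndarray,
                k: int = 10) -> float:
    """Expected rate of change: the mean fitted slope of the k prior clients whose
    intake profiles are closest (Euclidean distance) to the new client's profile.
    Session-by-session variability could be forecast analogously from the
    neighbours' within-client variability estimates."""
    distances = np.linalg.norm(intake_prior - intake_new, axis=1)
    neighbours = np.argsort(distances)[:k]
    return float(slopes_prior[neighbours].mean())

# Hypothetical data: 200 prior clients, 5 intake measures, one fitted slope each.
rng = np.random.default_rng(1)
intake_prior = rng.normal(size=(200, 5))
slopes_prior = rng.normal(loc=-0.3, scale=0.2, size=200)  # symptom change per session
intake_new = rng.normal(size=5)
print(nn_forecast(intake_new, intake_prior, slopes_prior, k=20))
```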