Similar Documents
20 similar documents found (search time: 15 ms)
1.
Wildfires occur annually in UK moorland environments, especially in drought years. They can be severely damaging to the ecosystem when they burn deep into the peat, killing ground-nesting birds and releasing CO2 into the atmosphere. Synthetic aperture radar (SAR) was evaluated for detecting the 18 April 2003 Bleaklow wildfire scar (7.4 km2). SAR’s ability to penetrate cloud is advantageous in this inherently overcast area. SAR can provide fire scar boundary information which is otherwise labour intensive to collect in the field using a global positioning system (GPS). This article evaluates the potential of SAR intensity and InSAR coherence to detect a large peat moorland wildfire scar in the Peak District of northern England. A time-series of pre-fire and post-fire ERS-2 and advanced synthetic aperture radar (ASAR) Single Look Complex (SLC) data were pre-processed using SARScape 4.2 to produce georeferenced greyscale images. SAR intensity and InSAR coherence values were analysed against Coordination of Information on the Environment (CORINE) land‐cover classes and precipitation data. SAR intensity detected burnt peat well after a precipitation event and for previous fire events within the CORINE peat bog class. For the 18 April 2003 fire event, intensity increased to 0.84 dB post-fire inside the fire scar for the peat bog class. InSAR coherence peaked post-fire for moors and heathland and natural grassland classes inside the fire scar, but peat bog exposed from previous fires was less responsive. Overall, SAR was found to be effective for detecting the Bleaklow moorland wildfire scar and monitoring wildfire scar persistence in a degraded peat landscape up to 71 days later. Heavy precipitation amplified the SAR fire scar signal, with precipitation after wildfires being typical in UK moorlands.
Further work is required to disentangle the effects of fire size, topography, and less generalized land‐cover classes on SAR intensity and InSAR coherence for detecting fire scars in degraded peat moorlands.
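The InSAR coherence analysed in this abstract is, in essence, the normalized cross-correlation of two co-registered complex SAR acquisitions. The sketch below shows the standard sample-coherence estimator on synthetic patches (an illustration only, not the article's SARScape processing chain):

```python
import numpy as np

def coherence(s1, s2):
    """Sample coherence of two co-registered complex SAR patches (0..1)."""
    num = np.abs(np.sum(s1 * np.conj(s2)))
    den = np.sqrt(np.sum(np.abs(s1) ** 2) * np.sum(np.abs(s2) ** 2))
    return float(num / den)

rng = np.random.default_rng(0)
stable = rng.normal(size=(32, 32)) + 1j * rng.normal(size=(32, 32))
noise = 0.1 * (rng.normal(size=(32, 32)) + 1j * rng.normal(size=(32, 32)))

# Unchanged ground: second acquisition is the first plus a little noise.
g_stable = coherence(stable, stable + noise)
# Burnt/changed ground: scattering decorrelates between acquisitions.
g_changed = coherence(stable, rng.normal(size=(32, 32)) + 1j * rng.normal(size=(32, 32)))
print(g_stable, g_changed)
```

Surface change between acquisitions (such as a fresh burn scar) drives the estimate toward zero, which is what makes coherence maps useful for scar detection.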

2.
Quantum coherence is the most fundamental feature of quantum mechanics. Its usual definition depends on the choice of basis; that is, the coherence of the same quantum state differs between reference frames. To reveal all the potential coherence, we present total quantum coherence measures constructed in two different ways. One optimizes the maximal basis-dependent coherence over all potential bases; the other quantifies the distance between the state and the set of incoherent states. Interestingly, the coherence measures based on relative entropy and the l2 norm take the same form under both methods. In particular, we show that the measures based on the non-contractive l2 norm also constitute a good measure, distinct from basis-dependent coherence. In addition, we show that all the measures are analytically calculable and have all the desired properties. Experimental schemes for detecting these coherence measures are also proposed, using multiple copies of quantum states instead of reconstructing the full density matrix. By studying one type of quantum probing scheme, we find that both the normalized trace in the scheme of deterministic quantum computation with one qubit and the overlap of two states in quantum overlap measurement schemes are well described by the change in total coherence of the probing qubit. Hence nontrivial probing always leads to a change in the total coherence.
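The basis-dependent quantity this paper generalizes is the relative entropy of coherence, C(ρ) = S(ρ_diag) − S(ρ). A minimal qubit illustration (of the standard basis-dependent measure, not the paper's basis-optimized total coherence):

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

def rel_entropy_coherence(rho):
    """Relative entropy of coherence: S(diag(rho)) - S(rho)."""
    diag = np.diag(np.diag(rho))
    return vn_entropy(diag) - vn_entropy(rho)

plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|, maximally coherent qubit
mixed = np.eye(2) / 2                        # maximally mixed, incoherent
print(rel_entropy_coherence(plus), rel_entropy_coherence(mixed))
```

The |+⟩ state is maximally coherent in the computational basis (C = 1 bit) but incoherent in the Hadamard basis, which is precisely the basis dependence the paper's total-coherence measures are designed to remove.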

3.
We use changes in the interferometric coherence to map earthquake damage that occurred in the city of Bam during the Bam earthquake on 26 December 2003. The approach presented here defines a coherence change index that can be interpreted quantitatively in terms of damage during a catastrophic event like a major earthquake. Using five differential interferometric synthetic aperture radar images, we compute maps of interferometric coherence. Three coherence change images are computed from these based on different interferograms. These three damage assessments yield very similar results despite the range of times and interferometric baselines spanned. This coherence‐based damage assessment also agrees closely with independent damage maps derived from other types of imagery. Using existing or planned synthetic aperture radar (SAR) satellites, coherence‐based damage assessments can be obtained within days after a catastrophic event, provided the necessary reference images are prepared ahead of time. As SAR sensors can operate independently of weather conditions and daylight, this may present a reliable and robust means of damage assessment.
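The abstract does not give the index's exact definition; one common formulation of a coherence change index is the relative drop from a pre-event interferogram to one spanning the event, sketched here under that assumption:

```python
import numpy as np

def coherence_change_index(gamma_pre, gamma_co, eps=1e-6):
    """Damage proxy: relative coherence drop between a pre-event
    interferogram and a co-event interferogram, clipped to [0, 1].
    (An assumed formulation; the article's definition may differ.)"""
    return np.clip((gamma_pre - gamma_co) / np.maximum(gamma_pre, eps), 0.0, 1.0)

gamma_pre = np.array([0.80, 0.70, 0.90])   # intact blocks keep coherence
gamma_co  = np.array([0.75, 0.20, 0.10])   # collapsed blocks lose it
cci = coherence_change_index(gamma_pre, gamma_co)
print(cci)
```

Normalizing by the pre-event coherence separates genuine damage from areas (e.g. vegetation) that are incoherent even without an earthquake.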

4.
This article proposes and demonstrates a technique enabling polygon-based scanline hidden-surface algorithms to be used in applications that require a moderate degree of user interaction. Interactive speeds have been achieved through the use of screen-area coherence, a derivative of frame-to-frame coherence and object coherence. This coherence takes advantage of the fact that most of the area of the screen does not change from one frame to the next in applications that have constant viewing positions for a number of frames and in which a majority of the image remains the same. One such application, the user interface of constructive solid geometry (CSG) based modelers, allows a user to modify a model by adding, deleting, repositioning, and performing volumetric Boolean operations on solid geometric primitives. Other possible applications include robot simulation, NC verification, facility layout, surface modeling, and some types of animation. In this article, screen-area coherence is used as the rationale for recalculating only those portions of an image that correspond to a geometric change. More specifically, this article describes a scanline hidden-surface removal procedure that uses screen-area coherence to achieve interactive speeds. A display algorithm using screen-area coherence within a CSG-based scanline hidden-surface algorithm was implemented and tested. Screen-area coherence reduced the average frame update time to about one quarter of the original time for three test sequences of CSG modeling operations.
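The core of screen-area coherence, re-rendering only the screen regions touched by a geometric change, can be sketched as a dirty-tile computation (tile size and names are illustrative, not from the article):

```python
def dirty_tiles(changed_boxes, tile=64, screen=(640, 480)):
    """Return the set of (col, row) screen tiles overlapped by the
    bounding box of any changed primitive; only these tiles need a
    new scanline hidden-surface pass."""
    tiles = set()
    for (x0, y0, x1, y1) in changed_boxes:
        for cx in range(max(0, x0 // tile), min(screen[0] // tile, x1 // tile) + 1):
            for cy in range(max(0, y0 // tile), min(screen[1] // tile, y1 // tile) + 1):
                tiles.add((cx, cy))
    return tiles

# Repositioning one small primitive (merged old + new bounding box)
# dirties a handful of tiles out of the whole screen.
moved = [(100, 100, 180, 150)]
print(len(dirty_tiles(moved)), (640 // 64) * (480 // 64))
```

When the viewpoint is fixed, everything outside the dirty tiles is unchanged from the previous frame, which is where the reported four-fold speedup comes from.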

5.
6.
Direct Rendering of 3-D Data Fields by a Block Projection Method
Direct volume rendering offers great potential for displaying the rich information of a three-dimensional data field within a single image. However, generating such images is extremely expensive, and high-quality rendering remains far from interactive. Cell projection methods have attracted great interest because they fully exploit the spatial coherence of cells during processing. This paper presents a more efficient volume rendering method, the Block Projection Method. The algorithm exploits not only the spatial coherence of cells but also the distribution of function values across the data field…

7.
BIOPRESS – Linking pan‐European land cover change to pressures on biodiversity – is a European Community Framework 5 project, which aims to develop a standardised product that will link quantified historical (1950–2000) land cover change to pressures on biodiversity. It exploits archived historic and recent aerial photographs (a data source that has remained consistent over the last 60 years) to assess land cover change around Natura 2000 sites within 30×30 km windows and 15×2 km transects. The CORINE (Coordination of Information on the Environment) land cover mapping methodology has been adapted for use with aerial photographs. Sample sites are mapped to CORINE Land Cover (CLC) classes, and then backdated to assess change. Results from eight UK transects (and associated windows) are presented. Changes in land cover classes are interpreted as pressures: urbanisation, intensification, abandonment, afforestation, deforestation and drainage. Urbanisation was the major pressure in all but two transects (both in the uplands), and intensification was of similar importance in most transects. Afforestation was a significant pressure in two transects. In six out of the eight transects, annual change was greater in the 1990–2000 period than in the 1950–1990 period. The methodology has been demonstrated to provide quantitative results of long‐term land cover change in the UK rural landscape at a spatial scale that is relevant to management decisions. The methods are transferable and applicable to a wide range of landscape studies.

8.
The SHARC framework for data quality in Web archiving
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality of Web archives and for tuning capture strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit-revisit crawls. Single-visit crawls download every page of a site exactly once, in an order that aims to minimize the "blur" in capturing the site. Visit-revisit strategies revisit pages after their initial downloads to check for intermediate changes. The revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion reflecting the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure counting the number of unchanged, and thus coherently captured, pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change prediction.
All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
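The deterministic coherence measure, the number of pages that did not change over the crawl interval, can be sketched directly (a simplification of SHARC's formalization; names and timestamps are illustrative):

```python
from datetime import datetime, timedelta

def capture_coherence(download_times, change_logs):
    """Count pages whose captured copy is consistent with a single
    instant: pages with no recorded change between the start and the
    end of the crawl. change_logs maps page -> change timestamps."""
    start = min(download_times.values())
    end = max(download_times.values())
    coherent = 0
    for page in download_times:
        changes = change_logs.get(page, [])
        if not any(start <= t <= end for t in changes):
            coherent += 1
    return coherent

t0 = datetime(2024, 1, 1)
downloads = {"/": t0, "/a": t0 + timedelta(hours=1), "/b": t0 + timedelta(hours=2)}
changes = {"/a": [t0 + timedelta(minutes=90)]}   # /a changed mid-crawl
print(capture_coherence(downloads, changes))
```

A visit-revisit strategy would re-download "/a" after the crawl to confirm or repair its capture, which is exactly what maximizing coherence amounts to.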

9.
As the nuclear power plants within the UK age, there is an increased requirement for condition monitoring to ensure that the plants are still able to operate safely. This paper describes the novel application of Intelligent Systems (IS) techniques to provide decision support to the condition monitoring of Nuclear Power Plant (NPP) reactor cores within the UK. The resulting system, BETA (British Energy Trace Analysis), is deployed within the UK’s nuclear operator and provides automated decision support for the analysis of refuelling data, a lead indicator of the health of AGR (Advanced Gas-cooled Reactor) nuclear power plant cores. The key contribution of this work is the improvement of existing manual, labour-intensive analysis through the application of IS techniques to provide decision support to NPP reactor core condition monitoring. This enables an existing source of condition monitoring data to be analysed in a rapid and repeatable manner, providing additional information relating to core health on a more regular basis than routine inspection data allows. The application of IS techniques addresses two issues with the existing manual interpretation of the data, namely the limited availability of expertise and the variability of assessment between different experts. Decision support is provided by four applications of intelligent systems techniques. Two instances of a rule-based expert system are deployed: the first automatically identifies key features within the refuelling data, and the second classifies specific types of anomaly. Clustering techniques are applied to support the definition of benchmark behaviour, which is used to detect the presence of anomalies within the refuelling data. Finally, data mining techniques are used to track the evolution of the normal benchmark behaviour over time. This results in a system that not only provides support for analysing new refuelling events but also provides the platform to allow future events to be analysed.
The BETA system has been deployed within the nuclear operator in the UK and is used at both the engineering offices and on station to support the analysis of refuelling events from two AGR stations, with a view to expanding it to the rest of the fleet in the near future.

10.
The now familiar and longstanding discussion on the status of the field of management information systems (MIS) consists of at least two themes – the lack of coherence in MIS and the question of rigour vs. relevance (academic vs. practical concerns). The research questions we pose here ask: What themes or ideas represent the centre of MIS or its zones of coherence – or are diversity and fragmentation the rule? Will the centre or zones change over time? Within MIS research, is there evidence of theory building that contributes to a cumulative research tradition? Using a co‐word analysis approach – analysing the patterns in discourse by measuring the association strengths of terms representative of relevant publications – the researchers found 62 specific centres of coherence. The data documented a high degree of change in centres of coherence over time. Evidence of theory building was extremely weak. A cumulative research tradition remains elusive. MIS centres of coherence change over time – we think, partly in response to practical pressures. We suggest that MIS opens a richer and more difficult debate on its theory, practice, and identity as a discipline in the 21st century university.
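The abstract does not state which association-strength formula was used; a standard choice in co-word analysis is the equivalence index, sketched here as an assumption:

```python
def association_strength(c_ij, c_i, c_j):
    """Equivalence index commonly used in co-word analysis:
    E = c_ij^2 / (c_i * c_j), where c_ij is the number of documents
    in which terms i and j co-occur and c_i, c_j their individual
    frequencies. E is 1 when the terms always appear together and
    near 0 when they rarely do."""
    return (c_ij ** 2) / (c_i * c_j)

# Hypothetical counts: two terms co-occur in 40 documents, appearing
# 50 and 80 times overall.
print(association_strength(40, 50, 80))   # strong link
print(association_strength(2, 50, 80))    # weak link
```

Clusters of terms with mutually high association strengths are what the paper calls centres of coherence.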

11.
The focus of our work is to design and build a dynamic data distribution system that is coherence-preserving, i.e. the delivered data must preserve associated coherence requirements (the user-specified bound on tolerable imprecision), and resilient to failures. To this end, we consider a system in which a set of repositories cooperate with each other and the sources, forming a peer-to-peer network. In this system, necessary changes are pushed to the users so that they are automatically informed about changes of interest. We present techniques 1) to determine when to push an update from one repository to another for coherence maintenance, 2) to construct an efficient dissemination tree for propagating changes from sources to cooperating repositories, and 3) to make the system resilient to failures. An experimental evaluation using real-world traces of dynamically changing data demonstrates that 1) careful dissemination of updates through a network of cooperating repositories can substantially lower the cost of coherence maintenance, 2) unless designed carefully, even push-based systems experience considerable loss in fidelity due to message delays and processing costs, 3) the computational and communication cost of achieving resiliency can be made low, and 4) surprisingly, adding resiliency actually improves fidelity even in the absence of failures.
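The coherence requirement, a user-specified bound on tolerable imprecision, implies a simple per-update push test (a minimal sketch; the paper's dissemination-tree construction and resiliency machinery are not shown):

```python
def needs_push(last_pushed, current, coherence_bound):
    """Push an update downstream only when the source value has
    drifted from the last disseminated value by more than the
    user's coherence requirement (tolerable imprecision)."""
    return abs(current - last_pushed) > coherence_bound

# A repository serving a price feed with a 0.5 coherence requirement:
suppress = needs_push(100.0, 100.3, 0.5)   # within tolerance
push = needs_push(100.0, 100.6, 0.5)       # bound violated
print(suppress, push)
```

Filtering updates this way at each hop is what lets a cooperating repository network cut coherence-maintenance cost without violating any user's bound.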

12.
Distributed caches are widely used to relieve the performance bottlenecks of traditional relational databases, but when third-party applications that are unaware of the distributed cache update the back-end database directly, the cached data becomes inconsistent, creating a stale-cache problem. This paper proposes a distributed cache consistency strategy based on change data capture (CDC). It integrates trigger-based and log-based CDC mechanisms to capture back-end database updates in real time, implements automatic data-model conversion and an SQL translation engine, and updates the cache in real time, thereby guaranteeing distributed cache consistency. Experiments simulating the key operations of the TPC-W benchmark show that log-based CDC achieves better database performance and cache consistency than trigger-based CDC.
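The log-based CDC path can be sketched as replaying database change-log entries against the cache, so that third-party writes that bypass the cache are still reflected (keys and log format are hypothetical; the paper's system additionally performs data-model conversion and SQL translation):

```python
# In-memory stand-in for the distributed cache.
cache = {"user:1": {"name": "Ann"}, "user:2": {"name": "Bob"}}

# Entries read from the database's change log (hypothetical format).
change_log = [
    {"op": "UPDATE", "key": "user:1", "row": {"name": "Anna"}},
    {"op": "DELETE", "key": "user:2", "row": None},
]

def apply_log(cache, log):
    """Replay captured changes so the cache tracks the database."""
    for entry in log:
        if entry["op"] in ("INSERT", "UPDATE"):
            cache[entry["key"]] = entry["row"]   # refresh stale copy
        elif entry["op"] == "DELETE":
            cache.pop(entry["key"], None)        # invalidate
    return cache

apply_log(cache, change_log)
print(cache)
```

Reading the log avoids the per-statement overhead that triggers impose on the database, which is consistent with the experimental result reported above.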

13.
Ergonomics, 2012, 55(9): 851–871
The aim of this study was to gather and collate information from the major researchers and consultancies in the UK regarding the performance of Information Technology (IT) and the role of human and organizational factors. The findings are based on the experience of 45 of the leading experts in the UK, drawing on a collective sample of approximately 14,000 organizations, covering all major sectors of economic activity and a comprehensive range of information technologies. The main findings are that 80–90% of IT investments do not meet their performance objectives and the reasons for this are rarely purely technical in origin. The context of technical change, the ways in which IT is developed and implemented, a range of human and organizational factors, and the roles of managers and end-users, are identified as critical areas affecting performance. A major implication is that the poor performance of IT systems is the result of a complex set of interacting forces that will be difficult to change. The study reports ideas concerning ‘best practice’ within companies, along with some suggestions for what needs to be done on a national scale to improve performance and practice in this area. A key goal is that action on these human and organizational issues becomes embedded in practice, part of the natural way of managing organizational and technical change.

14.
15.
For low-resolution video images, a fast license plate localization algorithm based on corner detection and colour coherence analysis is proposed. Since license plates exhibit a fixed colour coherence, an analysis mask is first derived from the plate's background and character colours; corners within the mask are then computed using a small threshold and a minimum corner spacing; finally, the corners are colour-weighted and the candidate region with the largest weight is selected. The method is highly robust to image noise and well suited to low-resolution video data. Experiments show that, compared with other methods, it achieves higher plate localization accuracy and a shorter average processing time.

16.
Biclustering algorithms have become popular tools for gene expression data analysis. They can identify local patterns defined by subsets of genes and subsets of samples, which cannot be detected by traditional clustering algorithms. Although useful, biclustering is an NP-hard problem. Therefore, the majority of biclustering algorithms look for biclusters optimizing a pre-established coherence measure. Many heuristics and validation measures have been proposed for biclustering over the last 20 years. However, there is a lack of an extensive comparison of bicluster coherence measures in practical scenarios. To address this gap, this paper experimentally analyzes 17 bicluster coherence measures and external measures calculated from information obtained in the gene ontologies. In this analysis, results were produced by 10 algorithms from the literature on 19 gene expression datasets. According to the experimental results, a few pairs of strongly correlated coherence measures could be identified, which suggests redundancy. Moreover, the pairs of strongly correlated measures might change when dealing with normalized or non-normalized data and biclusters enriched by different ontologies. Finally, there was no clear relation between coherence measures and assessment using information from gene ontology.
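The abstract does not name the 17 measures compared; Cheng and Church's mean squared residue is the classic bicluster coherence measure and illustrates what such measures quantify:

```python
import numpy as np

def mean_squared_residue(b):
    """Mean squared residue of a bicluster submatrix b.
    Zero for a perfect additive pattern (each entry equals
    row effect + column effect); larger values mean less coherence."""
    row_means = b.mean(axis=1, keepdims=True)
    col_means = b.mean(axis=0, keepdims=True)
    all_mean = b.mean()
    residue = b - row_means - col_means + all_mean
    return float((residue ** 2).mean())

additive = np.array([[1.0, 2.0, 3.0],
                     [2.0, 3.0, 4.0]])      # perfect additive pattern
noisy = np.array([[1.0, 5.0, 2.0],
                  [4.0, 1.0, 6.0]])
print(mean_squared_residue(additive), mean_squared_residue(noisy))
```

Algorithms that optimize this measure grow gene/sample subsets while keeping the residue below a threshold, which is the general scheme the compared heuristics follow.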

17.
18.
Because synthetic aperture radar (SAR) can penetrate the forest canopy and interact with the primary stem-volume components of trees (trunk and branches), SAR data are widely used for forest stem volume estimation. This paper investigated the correlation between SAR data and forest stem volume in Xunke, Heilongjiang, using stand-wise forest inventory data from 2003 and ALOS PALSAR data from five dates in 2007. The influences of season and polarization on the relationship between stem volume and SAR data were studied by analysing scatterplots, followed by interpretation of the mechanisms based primarily on a forest radar backscattering model, the water cloud model. The results showed that the relationship between HV-polarization backscatter and stem volume is better than for HH polarization, and that SAR data acquired in dry summer conditions correlate with stem volume more strongly than data acquired in other conditions. The interferometric coherence with a 46-day temporal baseline is negatively correlated with stem volume. The correlation coefficients from winter coherence are higher than those from summer coherence and backscatter. The study results suggest that interferometric coherence in winter is the best choice for forest stem volume estimation with L-band SAR data.
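The water cloud model named above expresses total backscatter as a vegetation term and a ground term weighted by two-way canopy attenuation, which is why backscatter saturates at high stem volume. A sketch with illustrative parameter values (not fitted values from the paper):

```python
import numpy as np

def water_cloud(V, A=0.1, B=0.004, sigma_ground=0.02):
    """Water cloud model: sigma0 = A*(1 - exp(-2*B*V)) +
    sigma_ground*exp(-2*B*V), with V the stem volume.
    A, B and sigma_ground are illustrative parameters."""
    att = np.exp(-2.0 * B * V)          # two-way canopy attenuation
    return A * (1.0 - att) + sigma_ground * att

V = np.array([0.0, 100.0, 300.0, 1000.0])   # stem volume, m^3/ha
s = water_cloud(V)
print(np.round(s, 4))
```

Backscatter rises monotonically with volume but flattens toward the vegetation asymptote A, which explains why correlation with stem volume weakens at high volumes and why winter coherence can outperform backscatter.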

19.
The conflicting evidence in the literature on energy feedback as a driver for energy behaviour change has led to the realization that it is a complex problem and that interventions must be proposed and evaluated in the context of a tangled web of individual and societal factors. We put forward an integrated agent-based computational model of energy consumption behaviour change interventions based on personal values and energy literacy, informed by research in persuasive technologies, environmental, educational and cognitive psychology, sociology, and energy education. Our objectives are: (i) to build a framework to accommodate a rich variety of models that might impact consumption decisions, and (ii) to use the simulation as a means to evaluate persuasive technologies in silico prior to deployment. The model's novelty lies in its capacity to connect the determinants of energy-related behaviour (values, energy literacy and social practices) and several generic design strategies proposed in the area of persuasive technologies within one framework. We validate the framework using survey data and personal value and energy consumption data extracted from a 2-year field study in Exeter, UK. The preliminary evaluation results demonstrate that the model can predict energy-saving behaviour much better than a random model and can correctly estimate the effect of persuasive technologies. The model can be embedded into an adaptive decision-making system for energy behaviour change.

20.
Model Checking Data Consistency for Cache Coherence Protocols
A method for automatic verification of cache coherence protocols is presented, in which cache coherence protocols are modeled as concurrent value-passing processes, and control and data consistency requirements are described as formulas in first-order μ-calculus. A model checker is employed to check whether the protocol under investigation satisfies the required properties. Using this method, a data consistency error has been revealed in a well-known cache coherence protocol. The error has been corrected, and the revised protocol has been shown free of data consistency errors for any data domain size, by appealing to the data independence technique.
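The verification idea, exhaustively exploring reachable protocol states and checking a consistency invariant in each, can be sketched on a toy two-cache MSI protocol (an illustration of explicit-state model checking, not the paper's value-passing processes or μ-calculus formulation):

```python
def step(state):
    """Yield successor states of a tuple of per-cache MSI states:
    a cache may read (go to S, downgrading any M copy elsewhere)
    or write (go to M, invalidating all other copies)."""
    for i in range(len(state)):
        # read: cache i -> S; an M copy elsewhere is downgraded to S
        yield tuple('S' if j == i else ('S' if s == 'M' else s)
                    for j, s in enumerate(state))
        # write: cache i -> M; every other cache is invalidated
        yield tuple('M' if j == i else 'I' for j, _ in enumerate(state))

def invariant(state):
    """Coherence: at most one M copy, and never M alongside S."""
    m = state.count('M')
    return m <= 1 and not (m == 1 and 'S' in state)

# Breadth-first exploration from the all-invalid initial state,
# checking the invariant on every reachable state.
frontier, seen = [('I', 'I')], {('I', 'I')}
while frontier:
    s = frontier.pop()
    assert invariant(s), f"coherence violated in {s}"
    for nxt in step(s):
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)
print(sorted(seen))
```

A real checker additionally tracks data values (which is how the paper's data consistency error was found) and uses data independence to generalize the result to any domain size.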

