20 similar records found (search time: 15 ms)
1.
F. Maselli, M. Chiesi, M. Bindi. International Journal of Remote Sensing, 2013, 34(19): 3929-3941
The estimation of transpiration fluxes over wide vegetated land surfaces is of great importance for the proper planning and management of environmental resources, particularly in areas where water is a main limiting factor during at least part of the growing cycle. While remotely sensed techniques cannot directly measure these fluxes, they can provide useful information on vegetation variables such as Leaf Area Index (LAI), which are functionally related to these processes. The aims of the present work were: (a) to illustrate the use of multi-temporal LAI profiles derived from National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (NOAA-AVHRR) Normalized Difference Vegetation Index (NDVI) data as input for a biogeochemical model (Forest-BGC) which simulates the main processes of forest vegetation (transpiration and photosynthesis); and (b) to analyse the sensitivity of the calibrated model to its main driving variables (meteorological data and NDVI-derived LAI profiles) in order to assess their relative importance for operational transpiration monitoring. In particular, the model was applied to two oak stands in the Tuscany Region (central Italy), which are representative of Mediterranean forests and for which a calibration phase had already been performed. Simulations were carried out for a 15-year period (1986–2000) using daily meteorological data and NDVI-derived monthly LAI profiles as inputs. The sensitivity of the model to both input types was then assessed through additional model runs with fixed values of the two variables. The results of these experiments indicated that the remotely sensed LAI estimates are the main determinant of simulated transpiration, especially during the Mediterranean arid season (summer), when water resources are the primary limiting factor for vegetation development.
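A toy illustration (not Forest-BGC itself) of the sensitivity test described above: daily transpiration is driven once by a seasonally varying, NDVI-derived LAI profile and once by a fixed mean LAI, and the two runs are compared over the arid summer. Every relation and constant below (the NDVI values, the NDVI-LAI scaling, the PET and soil-water curves) is an assumption made only for this sketch.

```python
# Toy sensitivity check, NOT Forest-BGC: compares daily canopy transpiration
# driven by a seasonally varying, NDVI-derived LAI profile against a run with
# LAI held fixed at its annual mean. All relations and constants are
# illustrative assumptions.
import numpy as np

days = np.arange(365)
# Hypothetical monthly NDVI for a Mediterranean oak stand, interpolated to days
ndvi_monthly = np.array([0.45, 0.48, 0.55, 0.65, 0.72, 0.70,
                         0.60, 0.55, 0.58, 0.62, 0.55, 0.48])
ndvi = np.interp(days, np.linspace(0, 364, 12), ndvi_monthly)
lai = 6.0 * (ndvi - 0.2) / (0.8 - 0.2)          # assumed linear NDVI-LAI scaling

# Assumed potential evapotranspiration (mm/day), peaking in summer
pet = 2.0 + 3.5 * np.sin(np.pi * days / 365.0)
# Assumed soil-water limitation, strongest in the arid summer
water_limit = np.clip(1.2 - 0.8 * np.sin(np.pi * days / 365.0), 0.2, 1.0)

def transpiration(lai_profile):
    """Beer's-law style canopy interception of PET, scaled by water limitation."""
    return pet * (1.0 - np.exp(-0.5 * lai_profile)) * water_limit

t_variable = transpiration(lai)
t_fixed = transpiration(np.full_like(lai, lai.mean()))

summer = (days >= 152) & (days < 244)           # June to August
print(f"annual difference: {abs(t_variable.sum() - t_fixed.sum()):.1f} mm")
print(f"summer difference: {abs(t_variable[summer].sum() - t_fixed[summer].sum()):.1f} mm")
```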
2.
Environmental Modelling & Software, 2002, 17(2): 135-144
Quantification of soil loss is one of the greatest challenges in natural resources and environmental planning. Computer simulation models are becoming increasingly popular in predicting soil loss for various land use and management practices. In this study, three soil erosion prediction models, the Water Erosion Prediction Project (WEPP), the Erosion Productivity Impact Calculator (EPIC), and the Areal Nonpoint Source Watershed Environment Response Simulation (ANSWERS), were used for simulating soil loss and testing the capability of the models in predicting soil losses for three different tillage systems (ridge-till, chisel-plow, and no-till). For each model, the most sensitive model parameters were calibrated using measured soil erosion data. After calibration, the models were run and predicted soil loss values were compared with the measured soil loss values. The measured soil erosion data were collected from an erosion experiment field of Kansas State University at Ottawa (Kansas), USA. Field experiments were conducted from 1995 to 1997 on small plots to measure runoff and soil losses under all three tillage systems. All three models were evaluated on the basis of individual event, total yearly, and mean event-based soil loss predictions. Results showed that all three models performed reasonably well and the predicted soil losses were within the range of measured values. For the ridge-till and chisel-plow systems, WEPP and ANSWERS gave better predictions than the EPIC model. For the no-till system, WEPP and EPIC predictions were better than those by ANSWERS. The overall results indicate that WEPP predictions were better than those of the other two models in most cases, and it can be used with a reasonable degree of confidence for soil loss quantification for all three tillage systems.
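A minimal sketch of the kind of model evaluation described above, comparing predicted against measured event soil losses with two common goodness-of-fit measures (RMSE and Nash-Sutcliffe efficiency). The event values are invented; the paper's data and exact evaluation criteria are not reproduced.

```python
# Illustrative comparison of predicted vs. measured event soil losses using two
# common goodness-of-fit measures (RMSE and Nash-Sutcliffe efficiency). The
# numbers below are made up; the paper's actual data are not reproduced here.
import numpy as np

measured = np.array([1.2, 0.4, 2.8, 0.9, 3.5, 0.2])      # t/ha, hypothetical events

predictions = {
    "WEPP":    np.array([1.0, 0.5, 2.5, 1.1, 3.2, 0.3]),
    "EPIC":    np.array([0.8, 0.7, 2.0, 1.4, 2.6, 0.5]),
    "ANSWERS": np.array([1.4, 0.3, 3.1, 0.7, 3.9, 0.1]),
}

def rmse(obs, sim):
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def nash_sutcliffe(obs, sim):
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

for name, sim in predictions.items():
    print(f"{name:8s} RMSE={rmse(measured, sim):.2f} t/ha  NSE={nash_sutcliffe(measured, sim):.2f}")
```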
3.
Positional accuracy of spatial data can be assessed by means of line-based methods. In this work we analyse the following four methods: Hausdorff Distance, Mean Distance, Single Buffer Overlay and Double Buffer Overlay, using a set of 12 synthetic cases. The synthetic cases incorporate specific shape features for bias, random errors and outliers which correspond to simplified versions of real-world possibilities. The use of synthetic cases helps us to understand the basic behavioural differences between the methods. Numerical results for the positional accuracy estimations differ between methods and cases due to the different concepts of distance involved and the specific configurations of each case. When a method results in a function, patterns related to different types of errors can be detected in this function. The length-inclusion level of each method emerges as the base criterion for comparison. The Single Buffer Overlay method offers the most general solution because its results encompass those of the other methods.
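A minimal sketch of two of the four line-based measures, assuming the shapely package: the Hausdorff distance between a reference and a tested line, and a Single Buffer Overlay statistic taken here as the share of the tested line's length falling inside a buffer of half-width w around the reference. The lines and the buffer width are hypothetical, and the paper's exact estimators may differ.

```python
# Minimal sketch of two line-based positional accuracy measures: the Hausdorff
# distance and a Single Buffer Overlay statistic (share of the tested line's
# length that falls inside a buffer of half-width w around the reference line).
from shapely.geometry import LineString

reference = LineString([(0, 0), (10, 0), (20, 5)])
tested    = LineString([(0, 0.8), (10, -0.5), (20, 5.6)])   # displaced version

# Hausdorff distance: worst-case separation between the two lines
print("Hausdorff distance:", reference.hausdorff_distance(tested))

# Single Buffer Overlay: buffer the reference and measure length inclusion
w = 1.0
buffer_zone = reference.buffer(w)
inside = tested.intersection(buffer_zone).length
print("length inclusion at w=1.0:", inside / tested.length)
```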
4.
Rodrigo Lersch. International Journal of Remote Sensing, 2013, 34(12): 3211-3221
In this study we investigate a new approach to applying concepts from the theory of evidence to remote sensing digital image classification. In the proposed approach, auxiliary variables are structured as layers in a Geographical Information System (GIS)-like format to produce layers of belief and plausibility. Thresholds are applied to the layers of belief and plausibility to detect errors of commission and omission, respectively, in the thematic image. The thresholds are estimated as functions of the user's and producer's accuracy. Preliminary tests were performed over an area covered by natural forest with Araucaria, showing promising results.
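A toy Dempster-Shafer example in the spirit of the approach above: belief and plausibility of the hypothesis "pixel belongs to Araucaria forest" are computed from a basic probability assignment over a two-class frame, and low belief or plausibility flags possible commission or omission errors. The mass values and thresholds are hypothetical, not those derived from user's and producer's accuracy in the paper.

```python
# Toy Dempster-Shafer example: belief and plausibility of the hypothesis
# "pixel belongs to Araucaria forest" computed from a basic probability
# assignment (BPA) over a two-class frame of discernment. The BPA values and
# thresholds are hypothetical, not taken from the paper.
FRAME = frozenset({"araucaria", "other"})

# mass assigned by the auxiliary evidence layers (must sum to 1)
bpa = {
    frozenset({"araucaria"}): 0.55,
    frozenset({"other"}): 0.15,
    FRAME: 0.30,                     # mass left on "don't know"
}

def belief(hypothesis):
    return sum(m for s, m in bpa.items() if s <= hypothesis)

def plausibility(hypothesis):
    return sum(m for s, m in bpa.items() if s & hypothesis)

h = frozenset({"araucaria"})
bel, pl = belief(h), plausibility(h)
print(f"belief={bel:.2f}  plausibility={pl:.2f}")

# Hypothetical decision rule in the spirit of the paper: low belief flags a
# possible commission error, low plausibility a possible omission error.
print("commission suspect:", bel < 0.60, " omission suspect:", pl < 0.70)
```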
5.
6.
Musicians have long been interested in using iterative processes to aid the composition of musical forms (macrostructure) and to synthesize sounds (microstructure). This paper introduces a new sound synthesis method exploring the non-linear behaviour of two iterative cross-coupled digital oscillators. It begins with a brief introduction to iterative systems, followed by background information on previous attempts at using them to synthesize sounds (e.g. feedback frequency and amplitude modulation). Next, it introduces our synthesis method and briefly explains how it has been implemented in a system for real-time composition and performance. The paper concludes with a discussion of how the system has been put into practice to compose and perform a number of works.
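One plausible reading of "two iterative cross-coupled digital oscillators", sketched in Python: each oscillator's instantaneous frequency is modulated by the other's previous output sample. The base frequencies and coupling depths are arbitrary, and the authors' exact update equations are not given here.

```python
# A minimal sketch of two cross-coupled sine oscillators, where each
# oscillator's instantaneous frequency is modulated by the other's previous
# output sample. This is only one plausible reading of the technique; the
# authors' exact update equations are not reproduced here.
import numpy as np

sr = 44100                      # sample rate (Hz)
n = int(sr * 1.0)               # one second of audio

f1, f2 = 220.0, 330.0           # base frequencies (Hz), arbitrary
k1, k2 = 180.0, 140.0           # cross-coupling depths (Hz), arbitrary

phase1 = phase2 = 0.0
y1 = y2 = 0.0
out = np.zeros(n)

for i in range(n):
    # each oscillator's frequency is pushed around by the other's last sample
    phase1 += 2 * np.pi * (f1 + k1 * y2) / sr
    phase2 += 2 * np.pi * (f2 + k2 * y1) / sr
    y1, y2 = np.sin(phase1), np.sin(phase2)
    out[i] = 0.5 * (y1 + y2)

# `out` now holds one second of audio in [-1, 1]; it can be written out with,
# e.g., scipy.io.wavfile.write("crosscoupled.wav", sr, out.astype(np.float32))
```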
7.
F. Maselli, A. Rodolfi, L. Bottai, S. Romanelli, C. Conese. International Journal of Remote Sensing, 2013, 34(17): 3303-3313
Mediterranean vegetation is strongly subject to the risk of wildfires, which can become a major cause of land degradation. Knowledge of the spatial variation of this risk is therefore essential for forest resource management. Relying on the fact that different vegetation types can be associated with different risk levels, a classification approach based on the use of Landsat Thematic Mapper (TM) scenes is proposed for the generation of maps related to fire risk. Hard and fuzzy classifications were tested for this purpose on Elba island (central Italy), taking into account the effects of using scenes from different periods (spring and summer) and of ancillary data. The fire risk images obtained were evaluated by comparison with the fire events that occurred on the island during the last decade. The results show that, while the acquisition period has only minor effects, classification accuracy is strongly dependent on the inclusion of ancillary data. Moreover, the fuzzy approach better exploits the information of the integrated datasets, producing maps which are temporally stable and highly indicative of the fire risk in the study area.
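A toy contrast between the hard and fuzzy treatments described above: per-pixel class membership grades are either collapsed to a single winning class (hard) or used to weight assumed per-class fire-risk scores (fuzzy). The class set, membership values and risk scores are invented for illustration.

```python
# Toy contrast between a hard and a fuzzy (membership-weighted) fire-risk map
# derived from per-pixel class membership grades. Class set, memberships and
# risk scores are invented for illustration only.
import numpy as np

classes = ["conifer", "shrubland", "broadleaf", "bare"]
risk = np.array([0.9, 0.7, 0.4, 0.1])       # assumed fire-risk score per class

# hypothetical per-pixel class membership grades (rows sum to 1)
memberships = np.array([
    [0.70, 0.20, 0.05, 0.05],
    [0.30, 0.40, 0.25, 0.05],
    [0.10, 0.15, 0.70, 0.05],
])

# hard classification: winner takes all, pixel gets its class's risk score
hard_labels = memberships.argmax(axis=1)
hard_risk = risk[hard_labels]

# fuzzy classification: risk is the membership-weighted average of class risks
fuzzy_risk = memberships @ risk

for i, (h, f) in enumerate(zip(hard_risk, fuzzy_risk)):
    print(f"pixel {i}: hard risk={h:.2f}  fuzzy risk={f:.2f}")
```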
8.
Iain Brown, Simon Jude, Sotiris Koukoulas, Robert Nicholls, Mark Dickson, Mike Walkden. Computers, Environment and Urban Systems, 2006, 30(6): 840
A key requirement for effective coastal zone management is good knowledge and prediction of land erosion rates due to encroachment of the sea. However, in addition to demarcation of the hazard through modelling and mapping, a policy of risk mitigation requires that significant attention also be paid to communicating the transient behaviour of the predictions and their associated uncertainty. With climate change and sea level rise implying that historical rates of change may not be a reliable guide for the future, enhanced visualisation of the evolving coastline has the potential to improve awareness of this changing risk. This visual content is developed by linking scientific modelling with the transformation of digital elevation models, and then using GIS to integrate other spatiotemporal content. The resulting high-resolution visualisations may meet demands from decision-makers for tools to communicate scientific results more effectively, due to their realism and apparent authenticity. Nevertheless, they can also produce a tension with the underlying scientific content because of the necessary extrapolation of extra detail and the lack of established procedures for communicating the resulting uncertainty in the visualisation. Coastal managers also have concerns about releasing the visualisations to the general public. These issues are explored through analysis of future cliff erosion in Norfolk on the eastern coast of Great Britain.
9.
Supporting safe and resilient authentication and integrity of digital images is of critical importance at a time of enormous creation and sharing of such content. This paper presents an improved digital image watermarking model based on a coefficient quantization technique that intelligently encodes the owner's information for each color channel to improve the imperceptibility and robustness of the hidden information. Concretely, a novel color channel selection mechanism automatically selects the optimal HL4 and LH4 wavelet coefficient blocks for embedding binary bits by adjusting block differences, calculated between the LH and HL coefficients of the host image. The channel selection aims to minimize the visual difference between the original image and the embedded image. On the other hand, the strength of the watermark is controlled by a factor to achieve an acceptable tradeoff between robustness and imperceptibility. The arrangement of the watermark pixels before shuffling, and the channel into which each pixel is embedded, is ciphered in an associated key. This key is strictly required to recover the original watermark, which is extracted through an adaptive clustering thresholding mechanism based on Otsu's algorithm. Benchmark results show that the model supports imperceptible watermarking as well as high robustness against common attacks in image processing, including geometric and non-geometric transformations and lossy JPEG compression. The proposed method improves watermarked image quality by more than 4 dB and significantly reduces the bit error rate in comparison with state-of-the-art approaches.
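A minimal sketch of difference-based bit embedding in level-4 wavelet detail coefficients, loosely in the spirit of the HL4/LH4 quantization described above; it is not the paper's algorithm. The wavelet, the strength factor T, the adjustment rule and the mapping of pywt's cH/cV sub-bands onto LH/HL are all simplifying assumptions.

```python
# A minimal sketch of difference-based bit embedding in level-4 wavelet detail
# coefficients (NOT the paper's exact algorithm; sub-band naming, the strength
# factor T and the adjustment rule below are simplifying assumptions).
import numpy as np
import pywt

def embed_bits(channel, bits, T=8.0, wavelet="haar"):
    coeffs = pywt.wavedec2(channel.astype(float), wavelet, level=4)
    cH4, cV4, cD4 = coeffs[1]                 # the two level-4 detail bands used
    h, v = cH4.ravel().copy(), cV4.ravel().copy()
    for i, bit in enumerate(bits):
        d = h[i] - v[i]
        if bit == 1 and d < T:                # push difference up to at least +T
            shift = (T - d) / 2.0
            h[i] += shift; v[i] -= shift
        elif bit == 0 and d > -T:             # push difference down to at most -T
            shift = (d + T) / 2.0
            h[i] -= shift; v[i] += shift
    coeffs[1] = (h.reshape(cH4.shape), v.reshape(cV4.shape), cD4)
    return pywt.waverec2(coeffs, wavelet)

def extract_bits(channel, n_bits, wavelet="haar"):
    coeffs = pywt.wavedec2(channel.astype(float), wavelet, level=4)
    cH4, cV4, _ = coeffs[1]
    d = cH4.ravel()[:n_bits] - cV4.ravel()[:n_bits]
    return (d > 0).astype(int)

rng = np.random.default_rng(0)
host = rng.integers(0, 256, size=(256, 256))          # stand-in for one colour channel
bits = rng.integers(0, 2, size=64)
marked = embed_bits(host, bits)
print("bit errors:", int(np.sum(extract_bits(marked, 64) != bits)))
```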
10.
Remote Sensing of Environment, 1987, 21(2): 201-213
The information content of Landsat TM and MSS data was examined to assess the ability to digitally differentiate urban and near-urban land covers around Miami, FL. This examination included comparisons of unsupervised signature extractions for various cover types, training site statistics for intraclass and interclass separability, and band and band combination selection from an 11-band multisensor data set. The principal analytical tool used in this study was transformed divergence calculations. The TM digital data are typically more useful than the MSS data in the homogeneous near-urban land covers and less useful in the heterogeneous urban areas.
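A sketch of the transformed divergence measure named above, using the standard formulation for two Gaussian class signatures (mean vector and covariance matrix per class); the three-band training statistics are hypothetical, not values from the study.

```python
# Sketch of the transformed divergence separability measure between two class
# signatures (mean vector and covariance matrix per class), using the standard
# formulation. The example signatures are hypothetical 3-band training
# statistics, not values from the study.
import numpy as np

def transformed_divergence(m1, C1, m2, C2):
    C1i, C2i = np.linalg.inv(C1), np.linalg.inv(C2)
    dm = (m1 - m2).reshape(-1, 1)
    D = 0.5 * np.trace((C1 - C2) @ (C2i - C1i)) \
      + 0.5 * np.trace((C1i + C2i) @ dm @ dm.T)
    return 2000.0 * (1.0 - np.exp(-D / 8.0))   # saturates at 2000 (full separability)

# hypothetical training statistics for two cover types in three bands
m_urban = np.array([80.0, 95.0, 110.0])
C_urban = np.diag([60.0, 70.0, 90.0])
m_grass = np.array([60.0, 100.0, 70.0])
C_grass = np.diag([40.0, 55.0, 65.0])

print("TD(urban, grass) =", round(transformed_divergence(m_urban, C_urban, m_grass, C_grass), 1))
```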
11.
Journal of Microcomputer Applications, 1987, 10(1): 83-87
This paper deals with the introduction of the microcomputer in digital electronics courses as a demonstration tool for simulating the working principles of basic digital electronic circuits. The microcomputer is also used in laboratory work, where it is introduced as a circuit debugger.
12.
Mathematics and Computers in Simulation, 2002, 59(5): 431-436
A PERT-COST type project with random activity durations is considered. The project comprises several essential parameters which together define the quality of the project as a whole (a minimal Monte Carlo sketch of the reliability estimate follows the list below):
- the budget assigned to the project;
- the project's due date;
- the project's reliability, i.e. the probability of completing the project by its due date.
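A minimal Monte Carlo sketch of the reliability parameter, assuming a small hypothetical activity network with triangular duration distributions and an arbitrary due date; the paper's project model is not reproduced.

```python
# Hypothetical Monte Carlo sketch of project reliability: the probability that
# a small activity network with random (here, triangular) activity durations
# finishes by its due date. The network, distributions and due date are all
# invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# triangular(low, mode, high) durations for five activities, in days
a = rng.triangular(2, 4, 8, N)
b = rng.triangular(3, 5, 9, N)
c = rng.triangular(1, 2, 4, N)
d = rng.triangular(4, 6, 10, N)
e = rng.triangular(2, 3, 5, N)

# assumed precedence structure: (a -> b -> e) in parallel with (a -> c -> d -> e)
completion = a + np.maximum(b, c + d) + e

due_date = 18.0
reliability = np.mean(completion <= due_date)
print(f"P(completion <= {due_date} days) = {reliability:.3f}")
```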
13.
Stress and strain during manual tool handling depend not only on factors such as the weight to be handled, but are also determined by the design of the man-machine interface. In this study, three different handles of electric hedge-clippers were analysed; the results of a comparative investigation into the physiological cost demanded by the use of the different handles are discussed. Muscular strain was measured via surface electromyography in laboratory experiments with nine male subjects. The results showed significant differences in physiological cost depending on both work height and the handles' shape. Systematic differences in muscular strain between the tools were found, despite the fact that all clippers were compensated with respect to weight and location of the centre of gravity. One of the handle designs enabled working under varying conditions (work height and direction) at a reduced level of muscular strain in the right arm. Results from the physiological evaluation were partly supported by the workers' own subjective assessments. The results of this investigation show that further ergonomic tool and handle design is necessary.
14.
S. Eckert, T. Kellenberger, K. Itten. International Journal of Remote Sensing, 2013, 34(9): 1943-1957
The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) aboard the Terra satellite was designed to generate along-track stereo images. The data are available at low cost, providing a feasible opportunity for generating digital elevation models (DEMs) in areas where little or no elevation data are yet available. This study evaluates the accuracy of DEMs extracted from ASTER data covering mountainous terrain. For an assessment of the accuracies achieved in the Andean study site, comparisons were made under similar topographical conditions in Switzerland, where reference data were available. All raw DEMs were filtered and interpolated with the post-processing tools included in PCI Geomatica, the software package used. After carefully checking the DEM quality, further post-processing was undertaken to eliminate obvious artefacts such as peaks and sinks. Accuracy was tested by comparing the DEMs in the Swiss Alps to three reference models. The results achieved for the generated DEMs are promising, considering the extreme terrain. Given accurate and well-distributed ground control points (GCPs), it is possible to generate DEMs with a root mean square (RMS) error between 15 m and 20 m in hilly terrain and about 30 m in mountainous terrain. The DEMs are very accurate in nearly flat regions and on smooth slopes with southern exposure: errors are generally within ±10 m in those cases. Larger errors do appear in forested, snow-covered or shady areas and at steep cliffs and deep valleys, with extreme errors of a few hundred metres. The evaluation showed that the quality of the DEMs is sufficient to enable atmospheric, topographic and geometric correction of various satellite datasets and to derive additional products.
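A minimal sketch of the kind of accuracy assessment described above: RMS error and bias of a generated DEM against a reference DEM, reported separately for gentle and steep terrain. The elevation grids are synthetic stand-ins, not the Swiss reference models.

```python
# Minimal sketch of a DEM accuracy assessment: RMS error and bias of a
# generated DEM against a reference DEM, split by slope class. The grids below
# are synthetic stand-ins, not the Swiss reference models.
import numpy as np

rng = np.random.default_rng(1)
y, x = np.mgrid[0:200, 0:200]
reference = 1500 + 900 * np.sin(x / 25.0) * np.cos(y / 35.0)     # smooth synthetic terrain (m)
generated = reference + rng.normal(0, 20, size=reference.shape)  # DEM with ~20 m noise

gy, gx = np.gradient(reference, 30.0)          # assumed 30 m grid spacing
slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))

def report(mask, label):
    err = (generated - reference)[mask]
    print(f"{label}: RMSE={np.sqrt(np.mean(err**2)):.1f} m  bias={err.mean():.1f} m")

report(slope_deg < 20, "hilly terrain (<20 deg)")
report(slope_deg >= 20, "mountainous terrain (>=20 deg)")
```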
15.
16.
Twitter data has recently been considered for a large variety of advanced analyses. Analysis of Twitter data imposes new challenges because the data distribution is intrinsically sparse, due to the large number of messages posted every day using a wide vocabulary. To address this issue, generalized itemsets, i.e. sets of items at different abstraction levels, can be effectively mined and used to discover interesting multiple-level correlations among data supplied with taxonomies. Each generalized itemset is characterized by a correlation type (positive, negative, or null) according to the strength of the correlation among its items. This paper presents a novel data mining approach to supporting different and interesting targeted analyses (topic trend analysis, context-aware service profiling) by analyzing Twitter posts. We aim at discovering contrasting situations by means of generalized itemsets. Specifically, we focus on comparing itemsets discovered at different abstraction levels and we select large subsets of specific (descendant) itemsets that show correlation type changes with respect to their common ancestor. To this aim, a novel kind of pattern, namely the Strong Flipping Generalized Itemset (SFGI), is extracted from Twitter messages and contextual information supplied with taxonomy hierarchies. Each SFGI consists of a frequent generalized itemset X and the set of its descendants showing a correlation type change with respect to X. Experiments performed on both real and synthetic datasets demonstrate the effectiveness of the proposed approach in discovering interesting and hidden knowledge from Twitter data.
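An illustrative sketch of the correlation-type test that SFGIs rely on, assuming a lift-style comparison of observed against expected support; the toy transactions, taxonomy and thresholds are invented, and the paper's exact interestingness measure may differ.

```python
# Toy illustration of correlation-type flipping between a generalized itemset
# and its descendants. The correlation type is assigned with a lift-style test
# against independence; transactions, taxonomy and thresholds are assumptions,
# not the paper's data or exact measure.
transactions = [
    {"city:Turin", "topic:traffic"},
    {"city:Turin", "topic:weather"},
    {"city:Milan", "topic:traffic"},
    {"city:Turin", "topic:traffic"},
    {"city:Milan", "topic:weather"},
    {"city:Turin", "topic:traffic"},
]
# toy taxonomy: leaf item -> generalized (ancestor) item
taxonomy = {"city:Turin": "city:ANY", "city:Milan": "city:ANY"}

def covers(item, transaction):
    """An item covers a transaction if present directly or via a descendant leaf."""
    if item in transaction:
        return True
    return any(taxonomy.get(leaf) == item for leaf in transaction)

def support(itemset):
    return sum(all(covers(i, t) for i in itemset) for t in transactions) / len(transactions)

def correlation_type(itemset, low=0.9, high=1.1):
    expected = 1.0
    for i in itemset:
        expected *= support((i,))
    lift = support(itemset) / expected if expected else 0.0
    if lift > high:
        return "positive"
    if lift < low:
        return "negative"
    return "null"

ancestor = ("city:ANY", "topic:traffic")
descendants = [("city:Turin", "topic:traffic"), ("city:Milan", "topic:traffic")]

print("ancestor:", correlation_type(ancestor))
for d in descendants:
    flip = " (flip)" if correlation_type(d) != correlation_type(ancestor) else ""
    print(d, "->", correlation_type(d) + flip)
```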
17.
S. V. Zhurin, A. N. Krylov, A. L. Kusov, O. V. Shtyrkov, V. A. Ushkov. Mathematical Models and Computer Simulations, 2017, 9(3): 349-358
An upper-atmosphere probe is considered. A method is constructed for reconstructing the atmospheric parameters from the density data of a vacuum gauge located behind a wire screen. Within the framework of the direct simulation Monte Carlo method, we propose a simplified algorithm for simulating the interactions of a molecule with the wire screen, treated as a semipermeable membrane. To obtain a relationship between the sensor reading and the parameters of the undisturbed atmosphere, we numerically simulate the flow over the sensor.
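A heavily simplified sketch of the semipermeable-membrane treatment of the wire screen: each incident test molecule is transmitted with probability equal to an assumed open-area fraction and otherwise re-emitted diffusely at the screen temperature. The transparency, temperature and velocity model are assumptions, not the paper's parameters.

```python
# Heavily simplified sketch of a DSMC-style semipermeable-membrane rule for the
# wire screen: each incident test molecule is transmitted with probability
# equal to the screen's open-area fraction, otherwise re-emitted diffusely at
# the screen temperature. All numbers below are assumptions.
import numpy as np

rng = np.random.default_rng(7)

transparency = 0.6          # assumed open-area fraction of the wire screen
n_molecules = 200_000
t_wall = 300.0              # K, assumed screen temperature
k_b, mass = 1.380649e-23, 4.65e-26   # Boltzmann constant, N2 molecular mass (kg)

# incident free-stream molecules moving toward the screen (negative x velocity)
vx_in = -np.abs(rng.normal(300.0, 150.0, n_molecules))

transmitted = rng.random(n_molecules) < transparency

# reflected molecules leave the screen with a diffuse (Maxwellian) normal speed at t_wall
n_refl = int((~transmitted).sum())
vx_refl = np.sqrt(2.0 * k_b * t_wall / mass) * np.sqrt(-np.log1p(-rng.random(n_refl)))

print(f"mean incident speed: {(-vx_in).mean():.0f} m/s")
print(f"transmitted fraction: {transmitted.mean():.3f} (target {transparency})")
print(f"mean speed of diffusely reflected molecules: {vx_refl.mean():.0f} m/s")
```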
18.
Crescencio Bravo, Miguel A. Redondo, Manuel Ortega, M. Felisa Verdejo. Journal of Network and Computer Applications, 2006, 29(4): 321-342
The Simulation discipline has to face new challenges such as the incorporation of Collaborative Technologies for professional use as well as for teaching purposes. This integration permits the creation of new kinds of support for collaborative learning processes. In this paper, we explore the potential of this synergy with DomoSim-TPC, a synchronous distributed collaborative environment for the teaching and learning of Domotics. The system supports an active, simulation-based and problem-based approach for learning house automation design. Using this learning environment, teachers propose and organize problem solving activities and the students carry out, in a collaborative way, the construction of artefacts (designs) using modelling and simulation tools.
19.
An assessment of shuttle radar topography mission digital elevation data for studies of volcano morphology
Robert Wright, Harold Garbeil, Peter J. Mouginis-Mark. Remote Sensing of Environment, 2006, 105(1): 41-53
The Shuttle Radar Topography Mission has provided high spatial resolution digital topographic data for most of Earth's volcanoes. Although these data were acquired with a nominal spatial resolution of 30 m, such data are only available for volcanoes located within the U.S.A. and its Territories. For the overwhelming majority of Earth's volcanoes not contained within this subset, DEMs are available in the form of a re-sampled 90 m product. This has prompted us to assess the extent to which the volcano-morphologic information present in the raw 30 m SRTM product is retained in the degraded 90 m product. To this end, we have (a) applied a simple metric, the so-called dissection index (di), to summarize the shapes of volcanic edifices as encoded in a DEM and (b) used this metric to evaluate the extent to which this topographic information is lost as the spatial resolution of the data is reduced. Calculating di as a function of elevation (a di profile) allows us to quantitatively summarize the morphology of a volcano. Our results indicate that although the re-sampling of the 30 m SRTM data obviously results in a loss of morphological information, this loss is not catastrophic. Analysis of a group of six Alaskan volcanoes indicates that differences in di profiles calculated from the 30 m SRTM product are largely preserved in the 90 m product. This analysis of resolution effects on the preservation of topographic information has implications for research that relies on understanding volcanoes through the analysis of topographic datasets of similar spatial resolutions produced by other remote sensing techniques (e.g., repeat-pass interferometric SAR; optical stereometry).
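A hedged sketch of a di profile computation, assuming one common compactness-style formulation in which di at a given elevation is the perimeter of the area enclosed above that elevation divided by the circumference of an equal-area circle; the paper's exact definition of di may differ. A synthetic cone stands in for a volcanic edifice, and the 90 m product is mimicked by 3x3 block averaging.

```python
# Hedged sketch of a dissection-index profile. Here di(z) is taken as the
# perimeter of the area above elevation z divided by the circumference of a
# circle of equal area (an assumed, compactness-style definition; the paper's
# exact formulation may differ). A synthetic cone stands in for an edifice.
import numpy as np

def block_mean(a, f):
    h, w = (a.shape[0] // f) * f, (a.shape[1] // f) * f
    return a[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def di_profile(dem, cell, levels):
    out = []
    for z in levels:
        mask = dem >= z
        area = mask.sum() * cell * cell
        if area == 0:
            out.append(np.nan)
            continue
        # perimeter estimated from horizontal + vertical transitions in the mask
        per = (np.abs(np.diff(mask.astype(int), axis=0)).sum()
               + np.abs(np.diff(mask.astype(int), axis=1)).sum()) * cell
        out.append(per / (2.0 * np.sqrt(np.pi * area)))
    return np.array(out)

# synthetic cone-shaped edifice on a 30 m grid; 90 m mimicked by 3x3 averaging
y, x = np.mgrid[-150:150, -150:150] * 30.0
dem30 = np.clip(2500.0 - 0.4 * np.hypot(x, y), 0.0, None)
dem90 = block_mean(dem30, 3)

levels = np.arange(200.0, 2400.0, 200.0)
print(np.round(di_profile(dem30, 30.0, levels), 2))
print(np.round(di_profile(dem90, 90.0, levels), 2))
```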
20.
Ahmed Musa Shamseddin, Ali Mohamed Adeeb. International Journal of Remote Sensing, 2013, 34(12): 3798-3815
Rainfed agriculture is dominant in Sudan. The current methods of crop yield estimation are based on taking random cutting samples during harvest time. This is inefficient in terms of both the cost of information and time. The general objective of this study is to highlight the potential role of remote-sensing techniques in upgrading methods of monitoring rainfed agricultural performance. The specific objective is to develop a relationship between satellite-derived crop data and the yield of rainfed sorghum. The normalized difference vegetation index (NDVI), rainfall, air temperature (AT) and soil moisture (SM) are used as independent variables and yield as the dependent variable. To determine the uncertainty associated with the independent variables, a sensitivity analysis (SA) is conducted. Multiple models are developed using different combinations of data sets. The temporal images taken during sorghum's mid-season growth stage give a better prediction than those taken during its development growth stage. Among the predictor variables, SM is associated with the highest uncertainty.
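A minimal sketch of the kind of empirical model described above: ordinary least-squares regression of yield on NDVI, rainfall, AT and SM, followed by a simple one-at-a-time sensitivity check. The data are synthetic and the paper's SA method is not reproduced.

```python
# Minimal sketch of an empirical yield model: OLS regression of sorghum yield
# on NDVI, rainfall, air temperature and soil moisture, with a one-at-a-time
# sensitivity check. All numbers are synthetic assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 120
ndvi = rng.uniform(0.2, 0.6, n)
rain = rng.uniform(250, 650, n)          # mm per season
temp = rng.uniform(28, 38, n)            # deg C
sm   = rng.uniform(0.05, 0.30, n)        # volumetric soil moisture

# synthetic "true" yield (kg/ha) with soil moisture as the dominant driver
yield_kg = 300 + 900 * ndvi + 0.8 * rain - 15 * temp + 4000 * sm + rng.normal(0, 80, n)

X = np.column_stack([ndvi, rain, temp, sm])
model = LinearRegression().fit(X, yield_kg)

# one-at-a-time sensitivity: change in predicted yield for a +10% change
base = X.mean(axis=0)
base_pred = model.predict(base.reshape(1, -1))[0]
for j, name in enumerate(["NDVI", "rainfall", "AT", "SM"]):
    pert = base.copy()
    pert[j] *= 1.10
    delta = model.predict(pert.reshape(1, -1))[0] - base_pred
    print(f"+10% {name}: {delta:+.0f} kg/ha")
```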