Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
The combined multi‐view photogrammetric retrieval of cloud‐top height (CTH) from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Multi‐angle Imaging SpectroRadiometer (MISR) is discussed. Although ASTER was designed mainly for land applications, the synergistic use of MISR and ASTER is shown to be valuable for 3D cloud analysis. A new cloud‐adapted matching algorithm based on least‐squares matching (LSM) was used for the photogrammetric processing of both MISR and ASTER. The methods were applied to an ASTER scene over Zürich‐Kloten, Switzerland, in April 2002, which was acquired on‐demand. This case study, with coincident ASTER, MISR and Meteosat‐6 10‐minute Rapid Scans, is treated in detail. As a matching validation option it is shown that, by chance, the cloud motion error for the MISR An‐Aa and ASTER stereo CTHs is approximately the same, independent of the actual cloud height and cloud motion. It was therefore possible to evaluate the accuracy of the MISR An‐Aa matching versus the ASTER matching, independent of artefacts due to the subsequent wind correction. The results were also compared to the operational MISR L2TC stereo CTH results. The results obtained by each of these methods yield consistent values for CTH (uncorrected for wind motion).
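The geometric core of stereo CTH retrieval can be conveyed in a few lines: for two views of the same cloud feature, the along-track parallax divided by the difference of the view-angle tangents gives the height. This is a simplified sketch (flat Earth, stationary cloud, no wind correction), not the paper's LSM matcher; the angles and the 2 km parallax below are assumed example values.

```python
import math

def cth_from_parallax(parallax_m, view_angle1_deg, view_angle2_deg):
    """Cloud-top height from along-track stereo parallax between two
    view angles, assuming a stationary cloud and a flat Earth."""
    t1 = math.tan(math.radians(view_angle1_deg))
    t2 = math.tan(math.radians(view_angle2_deg))
    return parallax_m / abs(t1 - t2)

# Illustrative MISR-like pair: An (0 deg nadir) vs Aa (26.1 deg aft),
# with an assumed 2 km along-track parallax.
h = cth_from_parallax(2000.0, 0.0, 26.1)
```

A real retrieval would first obtain the parallax via image matching and then remove the cloud-motion contribution with a wind correction.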

2.
A coastal cumulus cloud‐line formation along the east coast of the USA was observed on a National Oceanic and Atmospheric Administration (NOAA) Polar Orbiting Environmental Satellite (POES) Advanced Very High Resolution Radiometer (AVHRR) satellite image from 17 August 2001. The cloud line starts to form at about 16:00 UTC (local 12:00 noon) and follows the coastline from Florida to North Carolina. The length and width of the cloud line are about 850 km and 8.5 km, respectively. A 15‐min interval sequence of NOAA Geostationary Operational Environmental Satellite (GOES) images shows that the cloud line maintains the shape of the coastline and penetrates inland for more than 20 km over the next 6‐h timespan. Model simulation with actual atmospheric conditions as inputs shows that the cloud line is formed near the land–sea surface temperature (SST) gradient. The synoptic flow at all model levels is in the offshore direction prior to 16:00 UTC whereas low‐level winds (below 980 hPa) reverse direction to blow inland after 16:00 UTC. This reversal is due to the fact that local diurnal heating over the land takes place on shorter time‐scales than over the ocean. The vertical wind at these levels becomes stronger as the land–SST difference increases during the summer afternoon, and the leading edge of the head of the inland wind ascends from 920 hPa to about 850 hPa in the 3 h after 16:00 UTC. Model simulation and satellite observations show that the cloud line becomes very weak after 21:00 UTC when the diurnal heating decreases.

3.
Remotely sensed data are the best and perhaps the only possible way for monitoring large‐scale, human‐induced land occupation and biosphere‐atmosphere processes in regions such as the Brazilian tropical savanna (Cerrado). Landsat imagery has been intensively employed for these studies because of its long‐term data coverage (>30 years), suitable spatial and temporal resolutions, and ability to discriminate different land‐use and land‐cover classes. However, cloud cover is the most obvious constraint for obtaining optical remote sensing data in tropical regions, and cloud cover analysis of remotely sensed data is a requisite step for any optical remote sensing study. This study addresses the extent to which cloudiness can restrict the monitoring of the Brazilian Cerrado from Landsat‐like sensors. Percent cloud cover from more than 35 500 Landsat quick‐looks was estimated by the K‐means unsupervised classification technique. The data were examined by month, season, and El Niño Southern Oscillation event. Monthly observations of any part of the biome are highly unlikely during the wet season (October–March), but very possible during the dry season, especially in July and August. Research involving seasonality is feasible in some parts of the Cerrado at the temporal satellite sampling frequency of Landsat sensors. There are several limitations at the northern limit of the Cerrado, especially in the transitional area with the Amazon. During the 1997 El Niño event, the cloudiness over the Cerrado decreased to a measurable but small degree (5% less, on average). These results set the framework and limitations of future studies of land use/land cover and ecological dynamics using Landsat‐like satellite sensors.
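As a rough illustration of the quick-look processing, percent cloud cover can be estimated with a one-dimensional K-means on pixel brightness, treating the brightest cluster as cloud. This is a minimal stand-in for the study's unsupervised classification; the synthetic image and the two-cluster setup are assumptions for illustration.

```python
import numpy as np

def percent_cloud_kmeans(gray, k=2, iters=20, seed=0):
    """Estimate percent cloud cover of a grayscale quick-look by
    K-means on brightness; the brightest cluster is taken as cloud."""
    rng = np.random.default_rng(seed)
    x = gray.ravel().astype(float)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        # assign each pixel to the nearest cluster centre
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    cloud = labels == np.argmax(centers)
    return 100.0 * cloud.mean()

# synthetic quick-look: dark land with a bright cloud band (25% of pixels)
img = np.full((40, 40), 30.0)
img[:10, :] = 220.0
pc = percent_cloud_kmeans(img)
```

On real quick-looks the brightness histogram is far messier, but the same cluster-then-count idea applies.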

4.
The extreme learning machine (ELM), a supervised classifier based on a single-hidden-layer neural network, is used for remote sensing classifications. In comparison to the backpropagation neural network, which requires the setting of several user‐defined parameters and may produce local minima, the ELM requires the setting of only one parameter and produces a unique solution for a set of randomly assigned weights. Two datasets, one multispectral and another hyperspectral, were used for classification. Accuracies of 89.0% and 91.1% are achieved with this classifier using multispectral and hyperspectral data, respectively. Results suggest that the ELM provides a classification accuracy comparable to a backpropagation neural network with both datasets. The computational cost using the ELM classifier (1.25 s with Enhanced Thematic Mapper (ETM+) and 0.675 s with Digital Airborne Imaging Spectrometer (DAIS) data) is very small in comparison to the backpropagation neural network.
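A minimal ELM can be written in a few lines of NumPy: the input weights are drawn at random, and only the output weights are solved, in closed form, by least squares, which is why training is so fast and why the number of hidden nodes is the single user-defined parameter. The sketch below is a generic textbook ELM, not the study's exact implementation; the toy data and the hidden-layer size are illustrative.

```python
import numpy as np

def elm_train(X, y, n_hidden=50, seed=1):
    """Extreme learning machine: random input weights and biases,
    output weights solved in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden activations
    T = np.eye(y.max() + 1)[y]               # one-hot class targets
    beta = np.linalg.pinv(H) @ T             # closed-form output weights
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)

# toy two-class data: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = elm_train(X, y)
acc = (elm_predict(X, model) == y).mean()
```

Unlike backpropagation, there is no iterative weight update and hence no risk of local minima: the only optimisation is a single pseudo-inverse.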

5.
Cloud computing is a recent trend in IT that has attracted considerable attention. In cloud computing, service reliability and service performance are two important issues. To improve cloud service reliability, fault tolerance techniques such as fault recovery may be used, which in turn affect cloud service performance; this impact deserves detailed study. Although some research exists on cloud/grid service reliability and performance, very little of it addresses fault recovery and its impact on service performance. In this paper, we conduct a detailed performance evaluation of cloud services that accounts for fault recovery, considering recovery on both processing nodes and communication links. The commonly adopted assumption of Poisson arrivals of users' service requests is relaxed, so the interarrival times of service requests can take an arbitrary probability distribution. The precedence constraints of subtasks are also considered. The probability distribution of service response time is derived, and a numerical example is presented. The proposed cloud performance evaluation models and methods yield realistic results, and are thus of practical value for related decision-making in cloud computing.
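The flavour of such an evaluation can be conveyed by a small Monte Carlo sketch: each request pays a random base service time plus, with some probability, a fault-recovery delay. The distributions and parameters below are assumed for illustration only and are far simpler than the paper's analytical model, which handles arbitrary interarrival distributions and subtask precedence.

```python
import random

def simulate_response_times(n, base=1.0, fail_prob=0.1, recovery=5.0, seed=42):
    """Monte Carlo sketch of service response time under fault recovery:
    each request occasionally hits a node/link fault and pays an extra
    recovery delay before completing (illustrative, not the paper's model)."""
    rng = random.Random(seed)
    times = []
    for _ in range(n):
        t = rng.expovariate(1.0 / base)      # base processing time (mean=base)
        if rng.random() < fail_prob:
            t += recovery                    # fault-recovery overhead
        times.append(t)
    return times

ts = simulate_response_times(100_000)
mean_rt = sum(ts) / len(ts)
# expected mean response time: base + fail_prob * recovery = 1.5
```

Simulation like this is a handy cross-check for an analytical response-time distribution: histogramming `ts` should reproduce the derived curve.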

6.
Since the service level agreement (SLA) is essentially used to maintain reliable quality of service between cloud providers and clients in the cloud environment, there has been a growing effort to reduce power consumption while complying with the SLA by maximizing physical machine (PM)-level utilization and by load balancing techniques in infrastructure as a service. However, with the recent introduction of container as a service by cloud providers, containers are increasingly popular and will become the major deployment model in the cloud environment, specifically in platform as a service. Therefore, reducing power consumption while complying with the SLA at the virtual machine (VM) level becomes essential. In this context, we exploit a container consolidation scheme with usage prediction to achieve the above objectives. To obtain a reliable characterization of overutilized and underutilized PMs, our scheme jointly exploits the current and predicted CPU utilization, based on the local history of the considered PMs, in the container consolidation process. We demonstrate our solution through simulations on real workloads. The experimental results show that the container consolidation scheme with usage prediction reduces power consumption, the number of container migrations, and the average number of active VMs while complying with the SLA.
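The joint current-plus-predicted utilization test can be sketched as follows. The linear one-step extrapolation and the 0.8/0.2 thresholds are illustrative assumptions, not the paper's actual predictor.

```python
def predicted_cpu(history):
    """One-step-ahead CPU prediction by linear extrapolation of a PM's
    local utilization history (a deliberately simple stand-in for the
    paper's predictor)."""
    if len(history) < 2:
        return history[-1]
    return history[-1] + (history[-1] - history[-2])

def classify_pm(history, upper=0.8, lower=0.2):
    """A PM is flagged overutilized only if both current and predicted
    CPU exceed the upper threshold; underutilized if both fall below
    the lower one. Otherwise it is left alone."""
    cur, pred = history[-1], predicted_cpu(history)
    if cur > upper and pred > upper:
        return "overutilized"
    if cur < lower and pred < lower:
        return "underutilized"
    return "normal"

state = classify_pm([0.70, 0.78, 0.85])   # utilization rising toward saturation
```

Requiring agreement between the current and predicted values avoids migrating containers in response to a momentary spike that the trend does not confirm.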

7.
Cloud migration allows organizations to benefit from reduced operational costs, improved flexibility, and greater scalability, and enables them to focus on core business goals. However, it also has the flip side of reduced visibility. Enterprises considering migration of their IT systems to the cloud only have a black box view of the offered infrastructure. While information about server pricing and specification is publicly available, there is limited information about cloud infrastructure performance. Comparison of alternative cloud infrastructure offerings based only on price and server specification is difficult because cloud vendors use heterogeneous hardware resources, offer different server configurations, apply different pricing models and use different virtualization techniques to provision them. Benchmarking the performance of software systems deployed on top of the black box cloud infrastructure offers one way to evaluate the performance of available cloud server alternatives. However, this process can be complex, time-consuming and expensive, and cloud consumers can greatly benefit from tools that can automate it. Smart CloudBench is a generic framework and system that offers automated, on-demand, real-time and customized benchmarking of software systems deployed on cloud infrastructure. It provides greater visibility and insight into the run-time behavior of cloud infrastructure, helping consumers to compare and contrast available offerings during the initial cloud selection phase, and monitor performance for service quality assurance during the subsequent cloud consumption phase. In this paper, we first discuss the rationale behind our approach for benchmarking the black box cloud infrastructure. Then, we propose a generic architecture for benchmarking representative applications on the heterogeneous cloud infrastructure and describe the Smart CloudBench benchmarking workflow. We also present simple use case scenarios that highlight the need for tools such as Smart CloudBench.

8.
Cloud storage has seen a steady rise in demand and diffusion. Consequently, the cloud storage market is becoming an increasingly commoditised market: homogeneous products are offered at equal prices, which makes it more difficult for cloud storage providers to generate revenue and differentiate themselves from their competitors. Therefore, it is vital for providers to precisely understand customer preferences so that these can be targeted with appropriate services. To examine these preferences, we conduct a choice experiment and analyse choice decisions gathered from 340 German students by means of a conjoint analysis. We perform an individual-level analysis of preferences, which reveals significant differences and heterogeneity within the sample. By using a subsequent cluster analysis, we identify three distinct customer segments that also show significant differences in, for example, the perceptions of information privacy and risks. Our findings contribute to the literature by uncovering the preference structure and trade-offs that users make in their choice of storage services when employed for the purpose of archiving. We conclude the study with a discussion of practical implications that can aid cloud storage providers in service design decisions, and highlight the limitations associated with our research approach and drawn sample.

9.
Chong, L.S.K., Hui, S.C., Yeo, C.K., & Foo, S. (1998). World Wide Web, 1(4), 209–219.
This paper describes a WWW-assisted fax system (WAX) that is developed to provide reliable and enhanced Internet fax-to-fax communication. It integrates the easy-to-use WWW interface with conventional faxing procedures, resulting in an Internet fax system which not only circumvents the cost of long-distance fax charges but also adds enhanced functionality not otherwise possible. The WAX system comprises two gateways, namely the Fax-In and Fax-Out Gateways. The Fax-In Gateway accepts fax messages over the Public Switched Telephone Network (PSTN) and stores them in a transit database. The system interfaces with the user over the WWW to provide access to his stored faxes, with the basic ability to send them out over the Internet to recipients. The Fax-Out Gateway receives fax files from the Fax-In Gateway through the Internet and transmits them to the intended recipients via the local PSTN. WAX users do not require any additional hardware except a fax machine and a personal computer with Internet connectivity to gain access to WAX via any WWW browser. In addition, WAX provides a host of other enhanced features, such as the ability to construct mini-faxes from a single incoming fax as well as to dynamically attach cover notes to outgoing faxes.

10.
A cloud workflow system is a type of platform service which facilitates the automation of distributed applications based on the novel cloud infrastructure. One of the most important aspects which differentiate a cloud workflow system from its other counterparts is the market-oriented business model. This is a significant innovation which brings many challenges to conventional workflow scheduling strategies. To investigate such an issue, this paper proposes a market-oriented hierarchical scheduling strategy in cloud workflow systems. Specifically, the service-level scheduling deals with the Task-to-Service assignment, where tasks of individual workflow instances are mapped to cloud services in the global cloud markets based on their functional and non-functional QoS requirements; the task-level scheduling deals with the optimisation of the Task-to-VM (virtual machine) assignment in local cloud data centres, where the overall running cost of cloud workflow systems is minimised subject to the satisfaction of QoS constraints for individual tasks. Based on our hierarchical scheduling strategy, a package-based random scheduling algorithm is presented as the candidate service-level scheduling algorithm, and three representative metaheuristic-based scheduling algorithms, including genetic algorithm (GA), ant colony optimisation (ACO), and particle swarm optimisation (PSO), are adapted, implemented and analysed as the candidate task-level scheduling algorithms. The hierarchical scheduling strategy has been implemented in our SwinDeW-C cloud workflow system and is demonstrating satisfactory performance. Meanwhile, the experimental results show that the overall performance of the ACO-based scheduling algorithm is better than that of the others on three basic measures: the optimisation rate on makespan, the optimisation rate on cost, and the CPU time.

11.
Traceability is a key issue to ensure consistency among software artifacts of subsequent phases of the development cycle. However, few works have so far addressed the theme of tracing object oriented (OO) design into its implementation and evolving it. This paper presents an approach to checking the compliance of an OO design with respect to source code and to supporting its evolution. The process works on design artifacts expressed in the OMT (Object Modeling Technique) notation and accepts C++ source code. It recovers an "as is" design from the code, compares the recovered design with the actual design and helps the user to deal with inconsistencies. The recovery process exploits the edit distance computation and the maximum match algorithm to determine traceability links between design and code. The output is a similarity measure associated with design‐code class pairs, which can be classified as matched and unmatched by means of a maximum likelihood threshold. A graphic display of the design, with different green levels associated with different levels of match and red for the unmatched classes, is provided as a support to update the design and improve its traceability to the code.
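The edit-distance backbone of such traceability recovery can be sketched directly: a dynamic-programming Levenshtein distance turned into a normalised similarity between design and code class names. The normalisation and the example identifiers below are assumptions for illustration, not the paper's exact formulation.

```python
def edit_distance(a, b):
    """Classic Levenshtein distance by dynamic programming."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,       # deletion
                          d[i][j - 1] + 1,       # insertion
                          d[i - 1][j - 1] + cost)  # substitution/match
    return d[m][n]

def name_similarity(design_name, code_name):
    """Similarity in [0, 1]: 1 means identical identifiers."""
    dist = edit_distance(design_name.lower(), code_name.lower())
    return 1.0 - dist / max(len(design_name), len(code_name), 1)

# a design class name vs. a (misspelled) code class name
s = name_similarity("CustomerAccount", "CustomerAcount")
```

In a full pipeline, similarities like `s` for every design-code class pair would feed a maximum-match assignment, and pairs above a threshold would be declared traceability links.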

12.

In this self‐study, the author gained in‐depth understanding of how to plan and implement problem‐based learning (PBL), a student‐centred approach to teaching and learning that is driven by messy, open‐ended problems. This paper focuses primarily on the issues and concerns that arose as she developed and implemented a modified form of traditional PBL (Barrows, 1996) in large, pre‐service science‐teacher education classes. To view the research from many perspectives, a variety of data collection methods and sources were used, including field notes, semi‐structured interviews, student‐generated documents, and student journals. The outcomes of this study describe challenges (problem development, facilitation of groups, and assessment) encountered by the author as she planned for and implemented PBL. Furthermore, changes in the author's classroom practice, the connection between these changes and constructivist learning principles, and implications for science‐teacher education are addressed.

13.
This paper addresses the cross‐calibration of the infrared channels 4 (3.9 µm), 9 (10.8 µm) and 10 (12.0 µm) of the Spinning Enhanced Visible and Infra‐Red Imager (SEVIRI) onboard the Meteosat Second Generation 1 (MSG1) satellite with the channels of the MODerate resolution Imaging Spectroradiometer (MODIS) onboard Terra. The cross‐calibrations, including the Ray‐Matching (RM) method and the Radiative Transfer Modelling (RTM) method, were developed and implemented over a tropical area using SEVIRI and MODIS measurements of July 2005 and July 2006 with absolute view zenith angle differences (|ΔVZA|)<0.5°, absolute view azimuth angle differences (|ΔVAA|)<0.5° and absolute time differences (|ΔTime|)<10 min. The results obtained by the RM and RTM methods revealed calibration discrepancies between the two sensors. The results obtained by the RM method were consistent with previously published results. The results obtained by the RTM method were consistent with the results obtained by the RM method if the temperature differences caused by the spectral differences between the two sensors were taken into account. From the cross‐calibration results obtained by the two methods, the use of the results obtained by the RTM method to recalibrate the SEVIRI data is recommended. The recalibrations remove the overestimation of the Land Surface Temperature (LST) retrieved from the SEVIRI data by a split‐window method.

14.
Adoption of cloud infrastructure promises enterprises numerous benefits, such as faster time-to-market and improved scalability enabled by on-demand provisioning of pooled and shared computing resources. In particular, hybrid clouds, by combining the private in-house capacity with the on-demand capacity of public clouds, promise to achieve both increased utilization rate of the in-house infrastructure and limited use of the more expensive public cloud, thereby lowering the total costs for a cloud user organization. In this paper, an analytical model of hybrid cloud costs is introduced, wherein the costs of computing and data communication are taken into account. Using this model, a cost-efficient division of the computing capacity between the private and the public portion of a hybrid cloud can be identified. By analyzing the model, it can be shown that, given fixed prices for private and public capacity, a hybrid cloud incurs the minimum costs. Furthermore, it is shown that, as the volume of data transferred to/from the public cloud increases, a greater portion of the capacity should be allocated to the private cloud. Finally, the paper illustrates analytically that, when the unit price of capacity declines with the volume of acquired capacity, a hybrid cloud may become more expensive than a private or a public cloud.
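The cost trade-off can be illustrated with a toy version of such a model: private capacity is paid for whether used or not, while demand spilling over to the public cloud pays both compute and data-transfer charges. All prices below are invented for illustration; the paper's analytical model is more general.

```python
def hybrid_cost(private_cap, demand, p_priv=0.05, p_pub=0.08, p_net=0.01,
                data_per_unit=1.0):
    """Hourly cost sketch of a hybrid cloud: private capacity incurs a
    fixed unit price regardless of use; overflow to the public cloud
    pays compute plus data-communication charges (illustrative prices)."""
    overflow = max(demand - private_cap, 0.0)
    return (private_cap * p_priv            # owned private capacity
            + overflow * p_pub              # public compute for the overflow
            + overflow * data_per_unit * p_net)  # data transfer for the overflow

# sweep the private/public split for a fixed demand of 100 units
costs = {c: hybrid_cost(c, demand=100.0) for c in range(0, 151, 10)}
best_cap = min(costs, key=costs.get)
```

With these particular prices the private unit price is below the combined public compute-plus-transfer price, so the sweep bottoms out where private capacity exactly covers demand; raising `p_net` (more data movement) pushes the optimum even harder toward the private side, in line with the paper's observation.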

15.
The extraction of texture features from high‐resolution remote sensing imagery provides a complementary source of data for those applications in which the spectral information is not sufficient for identification or classification of spectrally similar landscape features. This study presents the results of grey‐level co‐occurrence matrix (GLCM) and wavelet transform (WT) texture analysis for differentiating forest and non‐forest vegetation types in QuickBird imagery. Using semivariogram fitting, the optimal GLCM windows for the land cover classes within the scene were determined. These optimal window sizes were then applied to eight GLCM texture measures (mean, variance, homogeneity, dissimilarity, contrast, entropy, angular second moment, and correlation) for the scene classification. Using wavelet transformation, up to five levels of macro‐texture were computed and tested in the classification process. Comparing the classification results, (1) the spectral‐only bands classification gave an overall accuracy of 58.69%; (2) the statistically derived 21×21 optimal mean texture combined with spectral information gave the best results among the GLCM optimal windows, with an accuracy of 73.70%; and (3) the combined optimal WT‐texture levels 4 and 5 gave an accuracy of 63.56%. The combined classification of these three optimal results gave an overall accuracy of 77.93%. The results indicate that, even though vegetation texture was generally measured better by the GLCM‐mean texture (micro‐texture) than by WT‐derived texture (macro‐texture), the micro–macro texture combination improves the differentiation and classification of the overall vegetation types. Overall, the results suggest that computer‐assisted classification of high‐spatial‐resolution remotely sensed imagery has good potential to augment present ground‐based forest inventory methods.
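A GLCM and two of the listed texture measures (contrast and homogeneity) can be computed from first principles in a few lines. The tiny synthetic images and the single horizontal pixel offset are illustrative assumptions; a real workflow would slide a window (e.g. the study's 21×21) over the image and average over offsets.

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Grey-level co-occurrence matrix for one pixel offset,
    normalised to co-occurrence probabilities."""
    g = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            g[img[y, x], img[y + dy, x + dx]] += 1
    return g / g.sum()

def glcm_contrast(p):
    i, j = np.indices(p.shape)
    return float(np.sum(p * (i - j) ** 2))

def glcm_homogeneity(p):
    i, j = np.indices(p.shape)
    return float(np.sum(p / (1.0 + np.abs(i - j))))

flat = np.zeros((8, 8), dtype=int)      # perfectly uniform texture
stripes = np.tile([0, 3], (8, 4))       # alternating grey levels 0 and 3
c_flat = glcm_contrast(glcm(flat))
c_stripes = glcm_contrast(glcm(stripes))
```

The uniform patch scores zero contrast and maximal homogeneity, while the striped patch, whose horizontal neighbours always differ by three grey levels, scores a contrast of 9, which is exactly the kind of separation that lets texture distinguish spectrally similar classes.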

16.
In this letter, a modification to a phase-correlation (PC)-based supervised classification method for hyperspectral data is proposed. An adaptive approach using different numbers of multiple class representatives (CRs), extracted using PC-based k-means clustering for each class, is compared with selecting a small, pre‐determined number of dissimilar CRs. PC is used as a distance measure in k-means clustering to determine the spectral similarity between each pixel and cluster centre. The number of representatives for each class is chosen adaptively, depending on the number of training samples in each class. Classification is performed for each pixel according to the maximum value of PC obtained between test samples and the CRs. Experimental results show that the adaptive method gave the highest classification accuracy (CA). Experiments on the effect of reducing the size of the feature vectors found that CA increased as the feature-vector size decreased.
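Phase correlation itself is straightforward to sketch with an FFT: the peak of the normalised cross-power spectrum of two identical signals is sharp and close to 1, so the peak height can serve as a similarity measure between spectra. The 1-D signals below are assumed examples, not hyperspectral data, and this is a generic PC formulation rather than the letter's exact variant.

```python
import numpy as np

def phase_correlation(a, b):
    """Peak of the normalised cross-power spectrum of two signals;
    identical signals give a sharp peak near 1 at zero shift."""
    A, B = np.fft.fft(a), np.fft.fft(b)
    cps = A * np.conj(B)
    cps /= np.abs(cps) + 1e-12          # keep phase, discard magnitude
    return np.real(np.fft.ifft(cps)).max()

sig = np.sin(np.linspace(0, 4 * np.pi, 64))
pc_same = phase_correlation(sig, sig)
pc_diff = phase_correlation(sig, np.random.default_rng(3).standard_normal(64))
```

Because only the spectral phase is kept, the measure is insensitive to overall magnitude differences between the two inputs, which is one motivation for using PC rather than plain correlation as a spectral similarity.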

17.
Cloud computing is an innovative computing paradigm designed to provide a flexible and low-cost way to deliver information technology services on demand over the Internet. Proper scheduling and load balancing of the resources are required for efficient operation in the distributed cloud environment. Since cloud computing is growing rapidly and customers are demanding better performance and more services, scheduling and load balancing of cloud resources have become an important area of research. As more and more consumers assign their tasks to the cloud, service-level agreements (SLAs) between consumers and providers are emerging as an important aspect. The proposed prediction model is based on the past usage pattern and aims to provide optimal resource management without violations of the agreed service-level conditions in cloud data centers. It considers the SLA in both the initial scheduling stage and the load balancing stage, and it pursues several objectives: the minimum makespan, the minimum degree of imbalance, and the minimum number of SLA violations. The experimental results show the effectiveness of the proposed system compared with other state-of-the-art algorithms.
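A classic baseline for the makespan and imbalance objectives is the longest-processing-time-first greedy rule: sort tasks by decreasing duration and assign each to the currently least-loaded VM. This sketch is a generic heuristic for comparison purposes, not the paper's prediction-based scheduler; the task durations are assumed values.

```python
import heapq

def greedy_makespan(task_times, n_vms):
    """LPT greedy scheduling: assign each task (longest first) to the
    least-loaded VM, tracked with a min-heap of (load, vm) pairs."""
    loads = [(0.0, vm) for vm in range(n_vms)]
    heapq.heapify(loads)
    assignment = {}
    for t, dur in sorted(enumerate(task_times), key=lambda p: -p[1]):
        load, vm = heapq.heappop(loads)   # least-loaded VM so far
        assignment[t] = vm
        heapq.heappush(loads, (load + dur, vm))
    makespan = max(load for load, _ in loads)
    return makespan, assignment

ms, assign = greedy_makespan([7, 5, 4, 3, 1], n_vms=2)
```

For these five tasks on two VMs the greedy rule happens to reach the perfect split (total work 20, makespan 10); an SLA-aware scheduler would additionally veto assignments that the predicted load says would breach agreed response times.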

18.
One of the applications of crop simulation models is to estimate crop yield during the current growing season. Several studies have tried to integrate crop simulation models with remotely sensed data through data‐assimilation methods. This approach has the advantage of allowing reinitialization of model parameters with remotely sensed observations to improve model performance. In this study, the Cropping System Model‐CERES‐Maize was integrated with the Moderate Resolution Imaging Spectroradiometer (MODIS) leaf area index (LAI) products for estimating corn yield in the state of Indiana, USA. This procedure, inversion of crop simulation model, facilitates several different user input modes and outputs a series of agronomic and biophysical parameters, including crop yield. The estimated corn yield in 2000 compared reasonably well with the US Department of Agriculture National Agricultural Statistics Service statistics for most counties. Using the seasonal LAI in the optimization procedure produced the best results compared with only the green‐up LAIs or the highest LAI values. Planting, emergence and maturation dates, and N fertilizer application rates were also estimated at a regional level. Further studies will include investigating model uncertainties and using other MODIS products, such as the enhanced vegetation index.
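The inversion idea, re-initialising model inputs so that simulated LAI best fits the observed series, can be illustrated with a toy LAI curve standing in for CERES-Maize. The Gaussian-shaped LAI model, the candidate planting dates, and the noise-free "observations" are all assumptions for illustration.

```python
import math

def modeled_lai(doy, planting_doy, peak=5.0):
    """Toy LAI curve: zero before planting, then a smooth rise and fall
    peaking ~60 days after planting (a stand-in for a crop model)."""
    t = doy - planting_doy
    if t <= 0:
        return 0.0
    return peak * math.exp(-((t - 60) / 30.0) ** 2)

def invert_planting_date(obs, candidates=range(90, 151)):
    """Re-initialise the model input by choosing the planting date whose
    simulated LAI best fits the observed (e.g. MODIS) LAI series."""
    def sse(p):
        return sum((modeled_lai(d, p) - lai) ** 2 for d, lai in obs)
    return min(candidates, key=sse)

truth = 120  # 'unknown' planting day-of-year used to generate observations
obs = [(d, modeled_lai(d, truth)) for d in range(130, 260, 16)]
est = invert_planting_date(obs)
```

A real assimilation would fit several parameters at once (planting date, emergence, N rate) against noisy 8- or 16-day LAI composites, typically with a proper optimiser rather than a grid search.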

19.
The relevance vector machine (RVM), a Bayesian extension of the support vector machine (SVM), has considerable potential for the analysis of remotely sensed data. Here, the RVM is introduced and used to derive a multi‐class classification of land cover with an accuracy of 91.25%, a level comparable to that achieved by a suite of popular image classifiers including the SVM. Critically, however, the output of the RVM includes an estimate of the posterior probability of class membership. This output may be used to illustrate the uncertainty of the class allocations on a per‐case basis and help to identify possible routes to further enhance classification accuracy.

20.
This paper explores the potential of an artificial immune‐based supervised classification algorithm for land‐cover classification. This classifier is inspired by the human immune system and possesses properties similar to nonlinear classification, self/non‐self identification, and negative selection. Landsat ETM+ data of an area lying in Eastern England near the town of Littleport are used to study the performance of the artificial immune‐based classifier. A univariate decision tree and a maximum likelihood classifier were used to compare its performance in terms of classification accuracy and computational cost. Results suggest that the artificial immune‐based classifier works well in comparison with the maximum likelihood and decision‐tree classifiers in terms of classification accuracy. The computational cost of the artificial immune‐based classifier is higher than that of the decision tree but lower than that of the maximum likelihood classifier. A second data set, from an area in Spain, is also used to compare the performance of the immune‐based supervised classifier with the maximum likelihood and decision‐tree classification algorithms. Results suggest an improved performance with the immune‐based classifier in terms of classification accuracy with this data set, too. The design of an artificial immune‐based supervised classifier requires several user‐defined parameters to be set, so this work is extended to study the effect of varying the values of six parameters on classification accuracy. Finally, a comparison with a backpropagation neural network suggests that the neural network classifier provides higher classification accuracies with both data sets, but the results are not statistically significant.
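The negative-selection property mentioned above can be sketched generically: candidate detectors are kept only if they fail to match every "self" sample, and anything later matched by a surviving detector is flagged as non-self. The Euclidean matching rule, the radius, and the 2-D samples are illustrative assumptions, not the paper's classifier.

```python
import random

def dist(a, b):
    """Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train_detectors(self_set, n_detectors, radius, dim=2, seed=7):
    """Negative selection: keep a random candidate detector only if it
    lies farther than `radius` from every 'self' sample."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = [rng.uniform(0, 1) for _ in range(dim)]
        if all(dist(cand, s) > radius for s in self_set):
            detectors.append(cand)
    return detectors

def is_nonself(x, detectors, radius):
    """A sample matched by any detector is classified as non-self."""
    return any(dist(x, d) <= radius for d in detectors)

self_set = [[0.10, 0.10], [0.15, 0.12], [0.12, 0.18]]  # 'self' class samples
det = train_detectors(self_set, n_detectors=50, radius=0.2)
far_sample = is_nonself([0.9, 0.9], det, radius=0.2)   # likely flagged
near_sample = is_nonself([0.12, 0.13], det, radius=0.2)
```

By construction no detector overlaps the self region, so the detector set covers (most of) the complement of self, which is the immune-system analogy the classifier family builds on.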


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.). 京ICP备09084417号