Similar Documents
20 similar documents found.
1.
Empirical relationships between lake chlorophyll a and total phosphorus concentrations are widely used to develop predictive models. These models are often estimated using sample averages as implicit surrogates for unknown lake-wide means, a practice that can result in biased parameter estimation and inaccurate predictive uncertainty. We develop a Bayesian network model based on empirical chlorophyll-phosphorus relationships for Saginaw Bay, an embayment on Lake Huron. The model treats the means as unknown parameters and includes structure to accommodate the observation error associated with estimating those means. Compared with results from an analogous simple model using sample averages, the observation error model has lower predictive uncertainty and predicts lower chlorophyll and phosphorus concentrations under contemporary lake conditions. These models will be useful in guiding pending decision-making pursuant to the 2012 Great Lakes Water Quality Agreement.
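A minimal NumPy sketch of the bias the abstract refers to: when sample averages stand in for unknown lake-wide means, the fitted slope is attenuated toward zero, and a regression-calibration correction using the known observation-error variance recovers it. All numbers, and the correction itself, are illustrative assumptions rather than the authors' full Bayesian network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the paper's Saginaw Bay data): true lake-wide
# log-scale means and an assumed linear chlorophyll-phosphorus relationship.
n_lakes, n_samples, sample_sd = 50, 8, 0.5
true_logP = rng.normal(1.0, 0.4, n_lakes)        # unknown lake-wide means
true_logChl = 0.5 + 1.1 * true_logP              # assumed "true" relationship

# Sample averages are noisy surrogates for the unknown means.
obs_logP = true_logP + rng.normal(0, sample_sd, (n_samples, n_lakes)).mean(0)
obs_logChl = true_logChl + rng.normal(0, sample_sd, (n_samples, n_lakes)).mean(0)

# Naive regression on sample averages: slope attenuated by observation error.
slope_naive = np.polyfit(obs_logP, obs_logChl, 1)[0]

# Regression-calibration correction using the known variance of a sample mean.
err_var = sample_sd**2 / n_samples
reliability = (np.var(obs_logP) - err_var) / np.var(obs_logP)
slope_corrected = slope_naive / reliability

print(f"naive: {slope_naive:.3f}  corrected: {slope_corrected:.3f}  true: 1.100")
```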

2.
In this paper, the problem of estimating uncertainty regions for identified models is considered. A typical approach in this context is to resort to the asymptotic theory of Prediction Error Methods for system identification, by means of which ellipsoidal uncertainty regions can be constructed for the uncertain parameters. We show that the uncertainty regions worked out through the asymptotic theory can be unreliable in certain situations, precisely characterized in the paper. Then, we critically analyze the theoretical conditions for the validity of the asymptotic theory, and prove that the asymptotic theory also applies under new assumptions which are less restrictive than those usually required. Thanks to this result, we single out the classes of models among standard ones (ARX, ARMAX, Box-Jenkins, etc.) where the asymptotic theory can be safely used in practical applications to assess the quality of the identified model. These results are of interest in many applications, including iterative controller design schemes.
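As a hedged illustration of the standard construction the abstract starts from (not the paper's new validity conditions), the following sketch identifies a first-order ARX model by least squares and builds the asymptotic ellipsoidal confidence region from the estimated parameter covariance; the system and noise levels are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ARX(1,1) system: y[t] = a*y[t-1] + b*u[t-1] + e[t]
a_true, b_true, sigma = 0.7, 1.5, 0.2
N = 500
u = rng.normal(size=N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = a_true * y[t-1] + b_true * u[t-1] + sigma * rng.normal()

# Least-squares (prediction error) estimate of the parameters.
Phi = np.column_stack([y[:-1], u[:-1]])          # regressor matrix
Y = y[1:]
theta_hat, *_ = np.linalg.lstsq(Phi, Y, rcond=None)

# Asymptotic covariance of the estimate: sigma_hat^2 * (Phi' Phi)^(-1).
resid = Y - Phi @ theta_hat
sigma2_hat = resid @ resid / (len(Y) - 2)
P = sigma2_hat * np.linalg.inv(Phi.T @ Phi)

# Ellipsoidal region: (theta - theta_hat)' P^{-1} (theta - theta_hat) <= chi2.
chi2_95_2dof = 5.99                              # 95% quantile, 2 parameters
d = np.array([a_true, b_true]) - theta_hat
inside = d @ np.linalg.inv(P) @ d <= chi2_95_2dof
print(theta_hat, "true params inside 95% ellipsoid:", bool(inside))
```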

3.
The land surface albedo is a key parameter influencing the climate near the ground. Therefore, it must be determined with sufficient accuracy. In this paper, a statistical inversion method is presented in support of the application of kernel-based Bi-directional Reflectance Distribution Function (BRDF) models for the calculation of the surface albedo. The method provides the best linear unbiased estimates (BLUE) of the BRDF model coefficients for an arbitrary number of available angular measurements. When the number of measurements exceeds the number of estimated coefficients, the QR decomposition method is proposed to mitigate the ill-conditioning of the inversion matrix. In other cases, the singular value decomposition (SVD) method is suggested. The proposed inversion method is innovative in that it provides confidence intervals for each of the BRDF model coefficients with a prescribed significance expressed by a probability level. Five candidate kernel-driven BRDF models were used in the present simulation study: Li-Sparse, Roujean, Li-Sparse-Wanner, Li-Dense and Walthall. A ground-based reflectance measurement data set including 11 surface types forms the background for the inversion experiments. The results show a strong dependence on the solar zenith angle (SZA) and on the land cover type (LCT) for all candidate models. Owing to this, no model could be recommended in a general manner. The Li-Sparse and Li-Sparse-Wanner models performed best for the grass and wheat LCTs, while the Roujean model was preferable for the pine and deciduous forests. The implementation of the confidence interval technique shows that the BRDF model coefficients can be retrieved with an uncertainty of 20-30%, and somewhat greater in the case of forest. The measured angular reflectance curves lie, as a rule, within the uncertainty bands related to the 5% significance level (95% probability). The corresponding albedo estimates can be characterized by an absolute uncertainty of 1-2% in the visible band and 5-10% in the near-infrared band, or by 10-30% in relative terms. Reflectance measurements at low SZA values are preferable for BRDF model inversion for grassland and crops, while a medium range of SZA seems to provide more information on forest features. For the majority of LCTs, the results of BRDF model inversion seem to be less reliable when considering multi-angular measurements at various SZAs than at a single SZA.
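The inversion described here is linear in the kernel coefficients, so a small sketch can show the mechanics: solve an overdetermined kernel model by QR decomposition and attach confidence intervals from the coefficient covariance. The kernel values below are synthetic placeholders (real kernels of the Roujean or Li-Sparse type depend on sun and view geometry), so everything numeric is assumed.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical kernel values for m angular measurements.
m = 12
K = np.column_stack([np.ones(m),                 # isotropic kernel
                     rng.uniform(-0.5, 0.5, m),  # volumetric kernel (synthetic)
                     rng.uniform(-1.0, 0.0, m)]) # geometric kernel (synthetic)
f_true = np.array([0.3, 0.15, 0.05])             # iso, vol, geo coefficients
R = K @ f_true + rng.normal(0, 0.01, m)          # noisy reflectances

# Overdetermined case (m > 3): QR decomposition improves conditioning.
Q, Rmat = np.linalg.qr(K)
f_hat = np.linalg.solve(Rmat, Q.T @ R)

# Confidence intervals from the estimated covariance of the BLUE.
resid = R - K @ f_hat
s2 = resid @ resid / (m - 3)
cov = s2 * np.linalg.inv(K.T @ K)
ci = 1.96 * np.sqrt(np.diag(cov))                # ~95% intervals per coefficient
print(f_hat, ci)
```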

4.
Spatial Decision Support Systems (SDSSs) often include models that can be used to assess the impact of possible decisions. These models usually simulate complex spatio-temporal phenomena, with input variables and parameters that are often hard to measure. The resulting model uncertainty is, however, rarely communicated to the user, so that current SDSSs yield clear but sometimes deceptively precise outputs. Inclusion of uncertainty in SDSSs requires modeling methods to calculate uncertainty and tools to visualize indicators of uncertainty that can be understood by users with mostly limited knowledge of spatial statistics. This research makes an important step towards a solution of this issue. It illustrates the construction of the PCRaster Land Use Change model (PLUC), which integrates simulation, uncertainty analysis and visualization. It uses the PCRaster Python framework, which comprises both a spatio-temporal modeling framework and a Monte Carlo analysis framework that together produce stochastic maps, which can be visualized with the Aguila software included in the PCRaster Python distribution package. This is illustrated by a case study for Mozambique that evaluates where bioenergy crops can be cultivated without endangering nature areas and food production, now and in the near future, when population and food intake per capita will increase and arable land and pasture areas are thus likely to expand. It is shown how the uncertainty of the input variables and model parameters affects the model outcomes. Evaluation of spatio-temporal uncertainty patterns has provided new insights into the modeled land use system, for example regarding the shape of concentric rings around cities. In addition, the visualization modes convey uncertainty information in a comprehensible way to users without specialist knowledge of statistics, for example by means of confidence intervals for potential bioenergy crop yields. The coupling of spatio-temporal uncertainty analysis to the simulation model is considered a major step forward in the exposure of uncertainty in SDSSs.
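A dependency-free sketch of the Monte Carlo pattern the abstract describes (the real model is built on the PCRaster Python framework, not used here): sample uncertain inputs per realization, run the map computation, and summarize per-cell confidence intervals. Map size, distributions and the yield model are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

rows, cols, n_runs = 50, 50, 200
suitability = rng.uniform(0.2, 1.0, (rows, cols))  # hypothetical input map

yields = np.empty((n_runs, rows, cols))
for i in range(n_runs):
    # Sample uncertain inputs/parameters for each Monte Carlo realization.
    max_yield = rng.normal(10.0, 1.5)               # t/ha, assumed distribution
    noise = rng.normal(0.0, 0.05, (rows, cols))     # spatially varying error
    yields[i] = max_yield * np.clip(suitability + noise, 0.0, 1.0)

# Per-cell 95% confidence interval for potential bioenergy crop yield,
# the kind of stochastic map a viewer such as Aguila would display.
lower = np.percentile(yields, 2.5, axis=0)
upper = np.percentile(yields, 97.5, axis=0)
print(lower.mean(), upper.mean())
```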

5.
Measured data often incorporate some amount of uncertainty, which is generally modeled as a distribution of possible samples. In this paper, we consider second-order symmetric tensors with uncertainty. In the 3D case, this means the tensor data consist of 6 coefficients; uncertainty, however, is encoded by 21 coefficients, assuming a multivariate Gaussian distribution as the model. The high dimension makes the direct visualization of tensor data with uncertainty a difficult problem, which was until now unsolved. The contribution of this paper is the design of glyphs for uncertain second-order symmetric tensors in 2D and 3D. The construction consists of a standard glyph for the mean tensor that is augmented by a scalar field representing uncertainty. We show that this scalar field, and therefore the displayed glyph, encodes the uncertainty comprehensively, i.e., there exists a bijective map between the glyph and the parameters of the distribution. Our approach can extend several classes of existing glyphs for symmetric tensors to additionally encode uncertainty, and therefore provides a possible foundation for further uncertain tensor glyph design. For demonstration, we choose the well-known superquadric glyphs, and we show that the uncertainty visualization satisfies all their design constraints.
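A 2D sketch of the ingredients involved, with assumed numbers and only loosely related to the paper's actual glyph construction: the mean tensor's 3 coefficients give the standard glyph via eigendecomposition, and because the quadratic form d'Td is linear in the coefficients, a directional variance field can be derived from the 3x3 coefficient covariance.

```python
import numpy as np

# 2D case: a symmetric tensor has 3 coefficients (txx, txy, tyy); its
# uncertainty is a 3x3 covariance matrix (6 coefficients). Made-up values.
mean_coeffs = np.array([2.0, 0.5, 1.0])           # txx, txy, tyy
cov = np.diag([0.10, 0.05, 0.20])                 # assumed Gaussian model

# Standard glyph of the mean tensor: principal axes from eigendecomposition.
T = np.array([[mean_coeffs[0], mean_coeffs[1]],
              [mean_coeffs[1], mean_coeffs[2]]])
eigvals, eigvecs = np.linalg.eigh(T)

# Scalar uncertainty field over directions: d' T d is linear in the
# coefficients, so its variance follows directly from the covariance.
theta = np.linspace(0, 2 * np.pi, 180)
d = np.stack([np.cos(theta), np.sin(theta)])
# Coefficient weights of d' T d = txx*dx^2 + 2*txy*dx*dy + tyy*dy^2
w = np.stack([d[0]**2, 2 * d[0] * d[1], d[1]**2])  # shape (3, 180)
var_field = np.einsum('it,ij,jt->t', w, cov, w)    # directional variance
print(eigvals, var_field.max())
```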

6.
This paper investigates the impact of decision criteria on incentives in a project management setting, where a project manager operates a project consisting of two tasks performed sequentially by two different subcontractors. The completion time is characterized as an uncertain variable, which depends on the subcontractor's unobservable effort for each task. Within the framework of uncertainty theory, four classes of uncertain principal-agent models are presented under the expected value criterion and the critical value criterion. According to their structural characteristics, each model can be decomposed into two equivalent sub-models, which can be solved to obtain the optimal deadline-based incentive contracts via a two-step optimization method. Further, the interconnections among these contracts are discussed. It is demonstrated that the optimal deadline-based incentive contract depends on the confidence level of the party (either the project manager or the subcontractor) who adopts the critical value criterion. Moreover, the more conservative the subcontractor is, the higher the incentive coefficients will be, while the more conservative the project manager is, the lower they will be. Finally, given some special confidence levels, it is interesting to find that the four models become equivalent.

7.
A model is a representation of a system that can be used to answer questions about the system. In many situations in which models are used, there exists no set of universally accepted modeling assumptions. The term model uncertainty commonly refers to uncertainty about a model's structure, as distinguished from uncertainty about parameters. This paper presents alternative formal approaches to treating model uncertainty, discusses methods for using data to reduce model uncertainty, presents approaches for diagnosing inadequate models, and discusses appropriate use of models that are subject to model uncertainty.

8.
9.
On-line tool wear monitoring in turning using neural networks
The on-line supervision of tool wear is the most difficult task in the context of tool monitoring. Based on in-process acquisition of signals with multi-sensor systems, it is possible to estimate or classify wear parameters by means of neural networks. This article demonstrates that solutions can be improved significantly by using available secondary information about physical models of the cutting process and about the temporal development of wear. Process models describing the influence of process parameters are used for a dedicated pre-processing of the sensor signals. The essential signal behaviour in a certain time window is described by means of polynomial coefficients. These coefficients are used as inputs for feedforward networks that consider the temporal development of wear (multilayer perceptrons with a sliding-window technique and time-delay neural networks). With a combination of the proposed measures, remarkable improvements in both tool wear estimation and classification can be obtained.
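A sketch of the pre-processing step the abstract highlights, under invented signal and wear assumptions: each sliding window of a sensor signal is summarized by polynomial-fit coefficients, which would then feed a feedforward network; a plain least-squares map stands in for the network here to keep the example dependency-free.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical force-sensor signal whose level drifts with tool wear.
t = np.linspace(0, 1, 1000)
wear = 0.3 * t                                    # assumed wear progression
signal = 1.0 + 2.0 * wear + 0.05 * rng.normal(size=t.size)

# Pre-processing as in the abstract: describe each time window by the
# coefficients of a low-order polynomial fit.
win, order = 100, 2
features, targets = [], []
for start in range(0, t.size - win, win):
    sl = slice(start, start + win)
    coeffs = np.polyfit(t[sl], signal[sl], order)  # window descriptor
    features.append(coeffs)
    targets.append(wear[sl].mean())
X, y = np.array(features), np.array(targets)

# The paper feeds such coefficients to MLPs with a sliding window or
# time-delay networks; a linear least-squares map is used here instead.
w_hat, *_ = np.linalg.lstsq(np.column_stack([X, np.ones(len(X))]), y,
                            rcond=None)
print(w_hat)
```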

10.
A method for assessing health care technology that models the demand on the clinician's attention exerted by patients' data (diagnostic and therapeutic) can provide a means for simultaneously reducing the cost and improving the quality of health care. The attentional demand exerted by patients' data can be measured by the amount of uncertainty in the data. Uncertainty can be expressed mathematically by the concept of entropy in information theory.
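The entropy referred to is the standard Shannon measure, H = -sum(p * log2 p); a minimal sketch with made-up distributions:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy H = -sum(p * log2 p) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

# Hypothetical example: a monitored variable whose readings are spread over
# more states exerts a higher attentional demand than a stable one.
stable_vital = [0.9, 0.05, 0.05]        # mostly one state -> low uncertainty
erratic_vital = [0.25, 0.25, 0.25, 0.25]
print(entropy_bits(stable_vital), entropy_bits(erratic_vital))  # ~0.57 vs 2.0
```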

11.
The problem of regression analysis in a fuzzy setting is discussed. A general linear regression model for studying the dependence of an LR fuzzy response variable on a set of crisp explanatory variables, along with a suitable iterative least squares estimation procedure, is introduced. This model is then framed within a wider strategy of analysis, capable of managing various types of uncertainty. These include the imprecision of the regression coefficients and the choice of a specific parametric model within a given class of models. The first source of uncertainty is dealt with by exploiting the implicit fuzzy arithmetic relationships between the spreads of the regression coefficients and the spreads of the response variable. Concerning the second kind of uncertainty, a suitable selection procedure is illustrated. This consists in maximizing an appropriately defined goodness-of-fit index within the given class of parametric models. The above strategy is illustrated in detail with reference to an application to real data collected in the framework of an environmental study. In the final remarks, some critical points are underlined, along with a few indications for future research in this field.
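A simplified sketch of the two components being modeled (centers and spreads of a symmetric fuzzy response). The paper's actual procedure is iterative and couples the two through fuzzy arithmetic; this decoupled least-squares fit, on invented data, only illustrates the structure.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical data: crisp explanatory variable x, fuzzy response given by
# a center m and a spread s (symmetric triangular case for simplicity).
n = 40
x = rng.uniform(0, 10, n)
m = 2.0 + 1.5 * x + rng.normal(0, 0.5, n)         # observed centers
s = 0.5 + 0.1 * x + rng.normal(0, 0.05, n)        # observed spreads
s = np.clip(s, 1e-6, None)                        # spreads must stay positive

X = np.column_stack([np.ones(n), x])

# One least-squares pass for the centers and one for the spreads.
beta_center, *_ = np.linalg.lstsq(X, m, rcond=None)
beta_spread, *_ = np.linalg.lstsq(X, s, rcond=None)

# A simple goodness-of-fit index on the centers: 1 - SSE/SST.
sse = np.sum((m - X @ beta_center) ** 2)
sst = np.sum((m - m.mean()) ** 2)
print(beta_center, beta_spread, 1 - sse / sst)
```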

12.
In this paper we show how Bayesian network models can be used to perform a sensitivity analysis using symbolic, as opposed to numeric, computations. An example of damage assessment of concrete structures of buildings is used for illustrative purposes. Initially, normal or Gaussian Bayesian network models are described, together with an algorithm for numerical propagation of uncertainty in an incremental form. Next, the algorithm is implemented symbolically, in Mathematica code, and applied to answer some queries related to the damage assessment of concrete structures of buildings. Finally, the conditional means and variances of the nodes given the evidence are shown to be rational functions of the parameters, thus revealing their parametric structure, which can be used efficiently in sensitivity analysis.
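The paper works in Mathematica; the same point can be sketched with SymPy on a hypothetical two-node Gaussian network X -> Y: after conditioning on evidence, the posterior mean and variance come out as rational functions of the network parameters, ready for symbolic sensitivity analysis.

```python
import sympy as sp

# Two-node Gaussian Bayesian network: Y = b*X + noise, with X ~ N(mu_x, v_x)
# and noise variance v_y. All symbols are free parameters.
mu_x, b, v_x, v_y, y_obs = sp.symbols('mu_x b v_x v_y y_obs')

# Joint Gaussian implied by the network.
mean_y = b * mu_x
var_y = b**2 * v_x + v_y
cov_xy = b * v_x

# Conditioning on the evidence Y = y_obs (standard Gaussian conditioning).
mu_x_given_y = mu_x + cov_xy / var_y * (y_obs - mean_y)
var_x_given_y = v_x - cov_xy**2 / var_y

print(sp.simplify(mu_x_given_y))    # rational in b, v_x, v_y and y_obs
print(sp.simplify(var_x_given_y))   # rational in b, v_x, v_y
```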

13.
The effect that the resolution of spatial data has on uncertainty is important to many areas of research. In order to understand this better, the effect of changing resolution is considered for a range of data. An estimate is presented for how the average uncertainty of each grid value varies with grid size, which is shown to be in good agreement with observed uncertainties. The effect of bilinear interpolation is also investigated and is observed to provide no reduction in uncertainty relative to uninterpolated data. Finally, the effects of combining aggregated spatial data are found to obey standard properties of error propagation, which means that the presented estimate of uncertainty can be used to estimate resolution-related uncertainty in spatial model results, relative to the input data. The study quantitatively demonstrates the important role of the spatial autocorrelation of data in uncertainties associated with the resolution of spatial data.
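A small sketch of the aggregation experiment the abstract describes, on a synthetic autocorrelated field (grid sizes and the smoothing scheme are assumptions): the average within-cell standard deviation, a proxy for the uncertainty of each aggregated grid value, grows with grid size, and independently aggregated errors combine in quadrature per standard error propagation.

```python
import numpy as np

rng = np.random.default_rng(6)

# Spatially autocorrelated field (moving-average smoothing of white noise);
# autocorrelation is what drives resolution-related uncertainty here.
fine = rng.normal(size=(256, 256))
kernel = np.ones(9) / 9
fine = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 0, fine)
fine = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 1, fine)

def block_sd(field, size):
    """Average within-cell standard deviation when aggregating to `size`."""
    h, w = field.shape
    blocks = field[:h - h % size, :w - w % size] \
        .reshape(h // size, size, w // size, size)
    return blocks.std(axis=(1, 3)).mean()

# Uncertainty of each aggregated grid value grows with grid size.
for size in (2, 4, 8, 16, 32):
    print(size, round(float(block_sd(fine, size)), 4))

# Combining two independently aggregated datasets: errors add in quadrature,
# sigma_total = sqrt(sigma_a**2 + sigma_b**2).
```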

14.
15.
Applied Soft Computing, 2007, 7(1): 425-440
Uncertainty management has been considered essential for real world applications, and spatial data and geographic information systems in particular require some means for managing uncertainty and vagueness. Rough sets have been shown to be an effective tool for data mining and uncertainty management in databases. The 9-intersection, region connection calculus (RCC) and egg–yolk methods have proven useful for modeling topological relations in spatial data. In this paper, we apply rough set definitions for topological relationships based on the 9-intersection, RCC and egg–yolk models for objects with broad boundaries. We show that rough sets can be used to express and improve on topological relationships and concepts defined with these models.
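A toy sketch of the rough-set view of a region with a broad boundary, in the spirit of the egg-yolk model (grid, regions and cell size are all invented): the lower approximation collects cells certainly inside, the upper approximation cells possibly inside, and coarse topological tests can then distinguish "certainly" from "possibly" overlapping.

```python
import numpy as np

def rough_region(mask, cell=4):
    """Lower/upper approximation of a region over `cell`-sized grid blocks.
    Lower = blocks entirely inside (the 'yolk'); upper = blocks that
    intersect the region at all (the 'egg')."""
    h, w = mask.shape
    blocks = mask[:h - h % cell, :w - w % cell] \
        .reshape(h // cell, cell, w // cell, cell)
    lower = blocks.all(axis=(1, 3))
    upper = blocks.any(axis=(1, 3))
    return lower, upper

# Two hypothetical regions: a disk and an offset disk.
yy, xx = np.mgrid[0:64, 0:64]
a = (yy - 30) ** 2 + (xx - 28) ** 2 < 15 ** 2
b = (yy - 34) ** 2 + (xx - 44) ** 2 < 12 ** 2

la, ua = rough_region(a)
lb, ub = rough_region(b)

# Coarse tests with broad boundaries: 'certainly overlap' needs the yolks
# to meet; 'possibly overlap' only needs the eggs to meet.
print("certainly overlap:", bool((la & lb).any()))
print("possibly overlap: ", bool((ua & ub).any()))
```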

16.
17.
Traditional formulations of reliability optimization problems assume that the model coefficients are known fixed quantities, so the reliability design problem is treated as a deterministic optimization problem. Because the optimal design of system reliability is resolved at the same stage as the overall system design, the model coefficients are highly uncertain and imprecise during the design phase, and it is usually very difficult to determine precise values for them. However, these coefficients can be roughly given as intervals of confidence.

In this paper, we formulate the reliability optimization problem as a nonlinear goal programming problem with interval coefficients and develop a genetic algorithm to solve it. The key point is how to evaluate each solution with interval data. We give a new definition of deviation variables that takes the interval relation into account. A numerical example is given to demonstrate the efficiency of the proposed approach.
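One plausible reading of a deviation variable against an interval goal (the paper's exact definition may differ) is the distance to the nearest endpoint, zero inside the interval; a GA fitness would then penalize the total deviation across goals. A small sketch with invented goal values:

```python
def interval_deviation(x, lo, hi):
    """Deviation of a value x from an interval goal [lo, hi]: zero when x
    lies inside the interval, otherwise the distance to the nearest
    endpoint. Illustrative, not the paper's exact definition."""
    if x < lo:
        return lo - x
    if x > hi:
        return x - hi
    return 0.0

# Hypothetical goals: system reliability in [0.90, 0.99], cost in [100, 110].
goals = [(0.95, (0.90, 0.99)), (120.0, (100.0, 110.0))]  # (value, interval)
total_dev = sum(interval_deviation(v, *iv) for v, iv in goals)
print(total_dev)   # 0.0 + 10.0 -> a GA would minimize this as fitness
```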


18.
For many optimization applications a complicated computational simulation is replaced with a simpler response surface model. These models are built by fitting a limited number of evaluations of the full simulation with a simple function that captures the trends in the evaluated data. In many cases the values of the data at the evaluation points have some uncertainty. This paper uses Bayesian model selection to derive two objective metrics that can be used to determine which response surface model provides the most appropriate representation of the evaluated data given the associated uncertainty. These metrics are shown to be consistent with modelling intuition based on Occam's principle. The uncertainty may be due to numerical error, approximations, uncertain input conditions, or to higher-order effects in the simulation that do not need to be fit by the response surface. Two metrics, Q and G, are derived in this paper. The metric Q assumes that a good estimate of the simulation uncertainty is available. The metric G assumes the uncertainty, although present, is unknown. Applications of these metrics in one and two dimensions are demonstrated.
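The paper's Q and G metrics are derived from Bayesian model selection; the sketch below is not those metrics, only a generic stand-in with the same flavour for the known-noise case: a chi-squared fit term plus a per-parameter Occam penalty, on invented data, which favours the simplest surface that explains the evaluations within their uncertainty.

```python
import numpy as np

rng = np.random.default_rng(7)

# Noisy evaluations of a 'simulation' (noise sd assumed known in this toy).
x = np.linspace(0, 1, 15)
sd = 0.05
y = 1.0 + 0.5 * x - 1.2 * x**2 + rng.normal(0, sd, x.size)

def score_known_noise(degree):
    """Chi-squared fit quality plus an Occam penalty per parameter -- a
    generic stand-in for a metric using a known noise estimate (the
    paper's Q metric is derived differently)."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    chi2 = np.sum((resid / sd) ** 2)
    return chi2 + (degree + 1) * np.log(x.size)

for degree in (1, 2, 3, 5, 8):
    print(degree, round(score_known_noise(degree), 2))
# The quadratic should score best: higher degrees fit the noise and are
# penalized, matching the Occam's-principle behaviour the abstract notes.
```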

19.
20.
Nowadays, most of the mathematical models used in predictive microbiology are deterministic, i.e., their output is a single value for the microbial load at a certain time instant. For more advanced exploitation of predictive microbiology in the context of hazard analysis and critical control points (HACCP) and risk analysis studies, stochastic models should be developed. Such models predict a probability mass function for the microbial load at a certain time instant. An excellent method for dealing with stochastic variables is Monte Carlo analysis. In this research, the sensitivity of microbial growth model parameter distributions with respect to data quality and quantity is investigated using Monte Carlo analysis. The proposed approach is illustrated with experimental growth data. There appears to be a linear relation between data quality (expressed by means of the standard deviation of the normal distribution assumed on experimental data) and model parameter uncertainty (expressed by means of the standard deviation of the model parameter distribution). The quantity of data (expressed by means of the number of experimental data points) as well as the positioning of these data in time has a substantial influence on model parameter uncertainty. This has implications for optimal experiment design.
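A sketch of the Monte Carlo experiment the abstract describes, with an assumed exponential growth model in log10 counts (real studies typically use richer growth models) and invented sampling times and noise levels: refitting on repeatedly perturbed data shows the parameter standard deviation scaling roughly linearly with the data standard deviation.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(8)

def log_growth(t, n0, mu):
    """Log10 cell count under simple exponential growth (illustrative)."""
    return n0 + mu * t

t = np.linspace(0, 10, 12)            # assumed sampling times (h)
true = (3.0, 0.4)                     # assumed log10 N0 and growth rate

# Monte Carlo analysis: refit on data perturbed with the standard deviation
# assumed for the experimental error, and record the parameter spread.
for data_sd in (0.05, 0.10, 0.20):
    mus = []
    for _ in range(300):
        y = log_growth(t, *true) + rng.normal(0, data_sd, t.size)
        popt, _ = curve_fit(log_growth, t, y, p0=(2.5, 0.3))
        mus.append(popt[1])
    print(data_sd, round(float(np.std(mus)), 4))
# Parameter sd grows roughly linearly with data sd, matching the relation
# between data quality and parameter uncertainty stated in the abstract.
```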
