Similar Documents
Found 20 similar documents.
1.
The present work develops a set of tools for incorporating various control measures (best management practices) into an analytical probabilistic modeling approach for urban storm-water total maximum daily load (TMDL) estimation. Control measures are divided into two major groups, upstream and downstream, each requiring separate modeling principles elaborated in this paper. Applying Monte Carlo simulation to the developed set of expressions allows modeling the "end-of-pipe" parameters of urban storm-water discharges (runoff volume, discharge rate, and pollutant load) on an event-average basis, as well as the stream parameters downstream of a storm-water discharge outlet. Model application is illustrated for a catchment regulated with an extended detention dry pond. Representative model results are presented, and a range of potential model applications is discussed. The capability to model the behavior of an urban storm-water system under various control measures is the key precondition for designing an optimal water-protective strategy.
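The event-based Monte Carlo workflow described above can be sketched as follows. This is a minimal illustration, not the authors' model: the exponential rainfall-depth distribution is a common assumption in analytical probabilistic storm-water methods, and all catchment and pond parameters (area, runoff coefficient, event mean concentration, capture volume, removal efficiency) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical catchment parameters (illustrative, not from the paper)
area_ha = 50.0           # catchment area (ha)
runoff_coeff = 0.6       # volumetric runoff coefficient
depr_storage_mm = 2.0    # initial abstraction (mm)
emc_mg_l = 120.0         # event mean concentration of TSS (mg/L)

# Rainfall event depth modeled as exponential, a common assumption
# in analytical probabilistic storm-water methods
depth_mm = rng.exponential(scale=10.0, size=100_000)

# Event runoff volume (m^3): 1 mm over 1 ha = 10 m^3
runoff_m3 = runoff_coeff * np.maximum(depth_mm - depr_storage_mm, 0.0) * area_ha * 10.0

# Event pollutant load (kg): mg/L * m^3 * 1e-3
load_kg = emc_mg_l * runoff_m3 * 1e-3

# A downstream extended detention pond treats up to v_cap per event
v_cap = 1500.0       # m^3, assumed capture volume
removal_eff = 0.6    # assumed TSS removal for the treated fraction
treated = np.minimum(runoff_m3, v_cap)
load_after_kg = load_kg - removal_eff * emc_mg_l * treated * 1e-3
```

Event-average statistics of `load_kg` and `load_after_kg` then quantify the effect of the downstream control measure.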

2.
A real-valued random field {Zi,j} with piecewise constant samples, defined on a lattice L in ℝ², is developed to characterize two-dimensional metallic polycrystals. The subsets defined by constant values of {Zi,j} are virtual grains, and the values of {Zi,j} give Euler angles at the nodes of L. The field {Zi,j} is completely defined by its marginal distribution and the conditional probabilities associated with a nearest-neighbor model. The defining probabilities of {Zi,j} need to be estimated from measurements of atomic lattice orientation. Random fields {Zi,j} calibrated to measurements of crystallographic texture in two AA7075 aluminum plates have been used to generate virtual polycrystals. Virtual and actual polycrystals are similar.
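A toy version of such a lattice field with piecewise constant samples can be generated by sequential sampling with a nearest-neighbor conditional rule. This is only a sketch of the idea: the copy probability, lattice size, and uniform marginal for the Euler angle are hypothetical, not calibrated values from the paper.

```python
import numpy as np

rng = np.random.default_rng(9)
n, p_same = 64, 0.8   # lattice size and neighbor-copy probability (assumed)

def new_angle():
    # Marginal distribution of the Euler angle (uniform, for illustration)
    return rng.uniform(0.0, 2.0 * np.pi)

Z = np.empty((n, n))
Z[0, 0] = new_angle()
# First row/column: condition on the single previous neighbor
for j in range(1, n):
    Z[0, j] = Z[0, j - 1] if rng.random() < p_same else new_angle()
for i in range(1, n):
    Z[i, 0] = Z[i - 1, 0] if rng.random() < p_same else new_angle()
    # Interior nodes: condition on a randomly chosen nearest neighbor
    for j in range(1, n):
        ref = Z[i - 1, j] if rng.random() < 0.5 else Z[i, j - 1]
        Z[i, j] = ref if rng.random() < p_same else new_angle()

# Constant-valued patches of Z play the role of virtual grains
frac_same = np.mean(Z[:, 1:] == Z[:, :-1])
```

Higher `p_same` produces larger virtual grains; calibrating such probabilities to orientation measurements is the estimation problem the abstract refers to.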

3.
This study introduces a new probabilistic project-control concept intended to assure an acceptable forecast of final project performance, in terms of not exceeding planned budget and schedule risk levels. The concept consists of implementing performance control limit curves for both actual cost and elapsed time, obtained with a probabilistic approach and a graphical representation referred to as stochastic S-curves (SS curves). To facilitate the project control process, the control limit curves can be used to display and evaluate actual project performance status without the need to update at-completion performance forecasts. Three different approaches (quality, benchmarking, and incremental variance) are proposed in this paper for obtaining the performance control limit curves. These approaches are tested on an example project to identify the control limit curve definition with the most conservative acceptable performance variations. The recommended approach offers a further managerial advantage, as it allows monitoring the use of both cost and schedule contingencies throughout project execution.
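The stochastic S-curve idea can be illustrated with a small Monte Carlo sketch: simulate activity durations and costs, accumulate them along the schedule, and take percentile envelopes as control limit curves. The serial network, triangular distributions, and 10th/90th-percentile limits are assumptions for illustration, not the paper's three proposed approaches.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim, n_act = 5_000, 6

# Hypothetical activities: triangular durations (days) and costs (k$)
dur = rng.triangular(4, 6, 10, size=(n_sim, n_act))
cost = rng.triangular(40, 50, 70, size=(n_sim, n_act))

# Serial schedule for simplicity: cumulative elapsed time and actual cost
cum_t = np.cumsum(dur, axis=1)
cum_c = np.cumsum(cost, axis=1)

# Stochastic S-curve control limits: percentile envelopes of cumulative
# cost at each milestone (10/90 chosen arbitrarily here)
lo = np.percentile(cum_c, 10, axis=0)
hi = np.percentile(cum_c, 90, axis=0)
```

An observed cumulative cost falling outside the `[lo, hi]` band at a milestone would flag unacceptable performance without recomputing an at-completion forecast.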

4.
The physical uncertainty associated with fiber-reinforced polymer composites has to be quantified and dealt with for their widespread use to be reliable. Developing probabilistic models based on experimental studies forms an important part of this task. In the present paper, such models are developed for glass fiber-reinforced polymer (GFRP) composites based on an experimental study of panels obtained from Mondial House, a 32-year-old building in London demolished in 2006. With an average size of 1.5 m × 1.7 m and made of chopped-strand-mat composites, these panels had been exposed to varying ambient conditions, protected only by a fire-retardant gel coat for self-cleaning. Tensile and compressive tests are performed to quantify the variability in the stiffness and strength properties of these panels. Intra- and interpanel effects and correlations between random variables are studied using statistical methods. A range of probability distributions is tested, and suggestions are made regarding their suitability for modeling different mechanical and geometric properties.

5.
In this paper, a numerical procedure for probabilistic slope stability analysis is presented. The procedure extends the traditional limit equilibrium method of slices to a probabilistic approach that accounts for the uncertainties and spatial variation of the soil strength parameters. Two-dimensional random fields were generated using a Karhunen-Loève expansion in a fashion consistent with a specified marginal distribution function and autocorrelation function. Monte Carlo simulation was then used to determine the statistical response based on the generated random fields. This approach makes no assumption about the critical failure surface; rather, the critical failure surface corresponding to the input random fields of soil properties is searched for during the analysis. A series of analyses was performed to verify the applicability of the proposed method and to study the effects of spatial heterogeneity on slope stability. The results show that the proposed method can efficiently account for the various failure mechanisms caused by the spatial variability of soil properties in probabilistic slope stability assessment.
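A discrete Karhunen-Loève expansion can be sketched for a one-dimensional field: eigendecompose the covariance matrix on the grid and synthesize the field from independent standard normal coefficients. The exponential autocovariance and the strength statistics below are illustrative assumptions, and the sketch is 1D rather than the paper's 2D fields.

```python
import numpy as np

rng = np.random.default_rng(2)

# Discretized random field of, say, undrained shear strength along a line
n = 200
x = np.linspace(0.0, 20.0, n)            # metres
corr_len, sigma, mu = 4.0, 5.0, 30.0     # assumed statistics (kPa)

# Exponential autocovariance, discretized on the grid
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Discrete Karhunen-Loeve: eigendecomposition of the covariance matrix,
# sorted by decreasing eigenvalue
w, phi = np.linalg.eigh(C)
order = np.argsort(w)[::-1]
w, phi = w[order], phi[:, order]

m = 20                          # truncation order
xi = rng.standard_normal(m)     # independent standard normal coefficients
field = mu + phi[:, :m] @ (np.sqrt(w[:m]) * xi)
```

Each draw of `xi` gives one realization of the field; feeding many such realizations into a slope stability solver is the Monte Carlo step the abstract describes.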

6.
Probabilistic Analysis of Soil-Water Characteristic Curves
Direct measurement of the soil-water characteristic curve (SWCC) is costly and time consuming. A first-order estimate based on statistical generalization of experimental data for soils with similar textural and structural properties is therefore useful. A simple approach is to fit the data with a nonlinear function and to construct an appropriate probability model of the curve-fitting parameters. This approach is illustrated using sandy clay loam, loam, loamy sand, clay, and silty clay data from the Unsaturated Soil Database. The paper demonstrates that a lognormal random vector is suitable for modeling the curve-fitting parameters of the SWCC. Alternative probability models using normal, gamma, Johnson, and other distributions do not provide a better fit than the proposed lognormal model. The engineering impact of adopting a probabilistic SWCC is briefly discussed by studying the uncertainty in unsaturated shear strength induced by the uncertainty in the SWCC.
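A lognormal random-vector model for curve-fitting parameters can be sketched as follows: estimate the mean and covariance of the log-parameters from a set of sample fits, then draw correlated lognormal pairs. The parameter names (a, n), their statistics, and the synthetic "fitted" sample below are all hypothetical stand-ins for the database fits.

```python
import numpy as np

rng = np.random.default_rng(3)

# Suppose SWCC curve-fitting parameters (a, n) were obtained for many
# soil samples; stand-in: synthetic log-parameters with assumed statistics
log_params = rng.multivariate_normal([np.log(0.05), np.log(1.6)],
                                     [[0.20, 0.05], [0.05, 0.02]], size=500)

# Fit the lognormal vector model: mean and covariance of the logs
mu_hat = log_params.mean(axis=0)
cov_hat = np.cov(log_params, rowvar=False)

# Generate new correlated lognormal (a, n) pairs from the fitted model
L = np.linalg.cholesky(cov_hat)
z = rng.standard_normal((1000, 2))
samples = np.exp(mu_hat + z @ L.T)
```

The exponential mapping guarantees positive parameters and preserves the correlation structure of the logs, which is the appeal of the lognormal vector model.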

7.
A refined full-order method is presented for estimating the extreme wind load effects of rigid structures at given mean recurrence intervals (MRIs) by combining the distributions of annual maximum wind speed and extreme load coefficient. The refined method can deal with any type of asymptotic extreme value distribution. With this full-order method, predictions of wind load effects based on the distributions of annual maximum wind velocity pressure and of wind speed are compared, providing information on the sensitivity of the predictions to the upper tail of the wind speed distribution. The efficacy of the first-order method is examined, and the influences of the type of distribution and of the variations of annual maximum wind speed and extreme load coefficient on the predictions are quantified. Finally, the first- and full-order methods are extended to the wind load effects of dynamically sensitive structures, which facilitates a comprehensive probabilistic analysis compared with the Monte Carlo simulation schemes used in the literature. It is pointed out that the 78% fractile of the extreme load coefficient can be used to define the characteristic load effects of both rigid and dynamically sensitive structures. The wind load factor is insensitive to the variation of the extreme load coefficient; it can be approximately estimated from the wind speed factor and the growth rate of the extreme wind load effect with increasing wind speed. This result concerning the wind load factor justifies the advantage of specifying design wind speeds at various MRIs for reducing the uncertainties of design wind loading.
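A Monte Carlo sketch of the full-order versus first-order comparison: simulate annual maximum wind speed and extreme load coefficient jointly, and compare the 50-year load effect against a first-order estimate that freezes the load coefficient at its 78% fractile (the fractile named in the abstract). The Gumbel parameters and air density are assumed for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
rho = 1.25   # air density (kg/m^3)

# Annual maximum wind speed: Gumbel with assumed location/scale (m/s)
v = rng.gumbel(loc=22.0, scale=2.5, size=n)
# Extreme load coefficient: Gumbel with assumed parameters
c = rng.gumbel(loc=1.0, scale=0.08, size=n)

# Full-order: joint simulation of the annual-maximum load effect,
# 98th percentile ~ 50-year MRI
s_full = np.quantile(c * 0.5 * rho * v**2, 0.98)

# First-order: fix the load coefficient at its 78% fractile and combine
# with the 50-year velocity pressure
c78 = np.quantile(c, 0.78)
s_first = c78 * 0.5 * rho * np.quantile(v**2, 0.98)
```

With a load-coefficient spread much smaller than the velocity-pressure spread, the two estimates land close together, which is the kind of agreement the first-order method relies on.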

8.
In a previous paper in this journal, a "hybrid method" was proposed for the joint propagation of probability distributions (expressing variability) and possibility distributions (i.e., fuzzy numbers, expressing imprecision or partial ignorance) in the computation of risk. To compare the result of the hybrid computation (a random fuzzy set) with a tolerance threshold (a tolerable level of risk), a postprocessing method was proposed. Recent work has highlighted a shortcoming of this postprocessing step, which yields overly conservative results. A postprocessing method based on Shafer's theory of evidence provides a rigorous answer to the problem of comparing a random fuzzy set with a threshold. The principles behind the new postprocessing scheme are presented and illustrated with a synthetic example.

9.
Raising the Bar for Civil Engineering Education: Systems Thinking Approach
The civil engineering profession has been undergoing an identity search. With the advent of information technology and the global market, competition from engineering offices elsewhere and from other local professions is unprecedented. Technical engineering knowledge is no longer a guarantee of career success; rather, a combination of numerous professional skills is required. The growing unease of civil engineers about their undefined role in the knowledge economy has led many to question civil engineering education. Although there is a push to enhance the humanistic and business aspects of the curriculum, there is a shove in the opposite direction to strengthen the technical content and keep abreast of technological change. Discussion of this socioeconomic problem within the ASCE forum has often relied on the linear, deterministic thinking characteristic of technical problems. Social and economic systems are usually more complex and harder to understand than technological systems; if we make new policies to address the problems of the profession based on fuzzy, incomplete, and imprecise mental models, we may end up with counterintuitive results. This paper proposes a systems thinking approach to the reform of civil engineering education based on System Dynamics modeling, a feedback-based, object-oriented modeling paradigm. Such a tool can capture the dynamic nature of complex systems and the nonlinear feedback loops that are often responsible for the counterintuitive results of policy making.

10.
A method is proposed for generating samples of irregular masonry walls that capture the essential statistics of a given population. The method entails, first, characterizing the geometry of scaled star-like inclusions by means of a non-Gaussian random field model and, second, packing these inclusions together to form a virtual material specimen. The model used in the first step is a nonlinear memoryless mapping of a sum of harmonic functions with Gaussian coefficients, while the model proposed for the second step transforms Poisson fields into a domain of inclusions whose sieving curve matches that of the sample specimen. The two random field models are used to develop Monte Carlo algorithms that produce virtual material specimens with two levels of probabilistic characterization: a first level correlated to the inclusion geometry, and a second dictated by the global morphology of the sample material specimen.
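The first-step model (a memoryless nonlinear mapping of a sum of harmonics with Gaussian coefficients) can be sketched for a single star-like inclusion boundary, with the radius expressed as a function of the polar angle. The number of harmonics, the exponential mapping, and the scale factors below are illustrative assumptions, not calibrated values.

```python
import numpy as np

rng = np.random.default_rng(10)

# Polar angle around the inclusion boundary
theta = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)

# Gaussian field on the circle: sum of harmonics with Gaussian coefficients
K = 8
a = rng.standard_normal(K)
b = rng.standard_normal(K)
g = sum(a[k] * np.cos((k + 1) * theta) + b[k] * np.sin((k + 1) * theta)
        for k in range(K)) / np.sqrt(K)

# Memoryless nonlinear mapping to a positive radius (exponential mapping
# chosen here to give a lognormal-type marginal; scale factors assumed)
r_mean, cv = 1.0, 0.2
r = r_mean * np.exp(cv * g / np.std(g))
```

Each draw of the coefficients `(a, b)` yields one star-like inclusion; packing many such inclusions is the second step of the method.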

11.
Recently, Valiantzas proposed a new two-parameter vertical infiltration equation that can be transformed into a linearized form, which essentially states that cumulative infiltration data, when presented as (i²/t) versus i, plot as a straight line. In this paper, fitting numerical data to the Valiantzas linearized-form equation is proposed as an additional criterion for easily and rapidly detecting possible errors in numerical solutions and, ultimately, for choosing the best spatial discretization for a simulated infiltration event used as a setup parameter in numerical infiltration models. Numerical data and analytical solutions were used to validate the proposed method.
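The linearity criterion is easy to demonstrate. Writing the two-parameter equation in the form i²/t = S² + 2A·i (the linearized form described above), exact cumulative-infiltration data must plot as a straight line with slope 2A and intercept S²; deviations of a numerical solution from this line would flag discretization errors. The sorptivity S and parameter A values below are assumed for illustration.

```python
import numpy as np

# Assumed parameters: sorptivity S (cm/h^0.5) and A (cm/h)
S, A = 0.8, 0.05
t = np.linspace(0.1, 24.0, 60)   # hours

# Cumulative infiltration i satisfies i^2 - 2At*i - S^2*t = 0;
# take the positive root of the quadratic
i = A * t + np.sqrt((A * t)**2 + S**2 * t)

# Linearized form: (i^2/t) versus i should be exactly linear
y = i**2 / t
slope, intercept = np.polyfit(i, y, 1)
```

For error-free data the fitted slope recovers 2A and the intercept recovers S²; scatter about the fitted line is the proposed diagnostic.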

12.
In this paper, probabilistic models for structural analysis are put forward, with particular emphasis on model uncertainty. Context is provided by the finite-element method and the need for probabilistic prediction of structural performance in contemporary engineering. Sources of model uncertainty are identified and modeled, and a Bayesian approach is suggested for assessing the new model parameters within the element formulations. The expressions are formulated by means of numerical "sensors" that influence the model uncertainty, such as element distortion and degree of nonlinearity. An assessment procedure is proposed to identify the sensors that are most suitable for capturing model uncertainty. The paper presents the general methodology and specific implementations for a general-purpose structural element. Two numerical examples demonstrate the methodology and its implications for probabilistic prediction of structural response.

13.
A two-part probabilistic model for polycrystalline microstructures is described. The model uses a Poisson-Voronoi tessellation for the grain geometry and a vector random field model for the crystallographic orientation. The grain geometry model is calibrated to experimental data through the intensity of the Poisson point field underlying the tessellation, and the orientation random field is calibrated to experimental data through its marginal distributions and second-moment properties. Realizations of the random microstructure are generated by translation methods and are used, with simplified mechanical models, to investigate the problem of intergranular fracture. It is found that intergranular cracks exhibit some statistical properties of a scaled Brownian motion process.
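The grain-geometry part of such a model can be sketched with a discrete Poisson-Voronoi tessellation: seed points form a Poisson field, each pixel is labeled by its nearest seed, and each grain receives an orientation drawn from a marginal distribution. The intensity, grid resolution, and uniform orientation marginal below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(5)

# Poisson point field on the unit square: seed count ~ Poisson(intensity)
intensity = 50.0
n_seeds = rng.poisson(intensity)
seeds = rng.random((n_seeds, 2))

# Discrete Voronoi tessellation: label each pixel with its nearest seed
res = 128
gx, gy = np.meshgrid(np.linspace(0, 1, res), np.linspace(0, 1, res))
pts = np.stack([gx.ravel(), gy.ravel()], axis=1)
d2 = ((pts[:, None, :] - seeds[None, :, :])**2).sum(axis=2)
labels = d2.argmin(axis=1).reshape(res, res)

# Assign each grain an orientation from a marginal distribution
# (uniform here, for illustration)
angles = rng.uniform(0.0, 2.0 * np.pi, size=n_seeds)
orientation = angles[labels]
```

Calibrating the intensity to observed grain sizes, and replacing the uniform marginal with measured texture statistics, would connect this sketch to the calibration steps the abstract describes.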

14.
This study examined the effects of uncertain model boundary conditions on dissolved oxygen (DO) predictions for the lower Truckee River, Nevada, using an augmented version of the EPA's Water Quality Analysis Simulation Program Version 5 (WASP5) that includes periphyton (attached algae) in the eutrophication kinetics. Uncertainty analyses were performed on selected organic nitrogen (ON) and carbonaceous biochemical oxygen demand boundary conditions using Monte Carlo techniques. The stochastic model was run with boundary concentrations drawn from observed probability distributions, and the ranges of simulated values were used to construct confidence intervals whose magnitudes indicate the uncertainty associated with the model predictions. Uncertainty in agricultural ditch return concentrations had minimal effect on in-stream model predictions: predicted values of daily minimum and maximum DO, daily average ON, and periphyton biomass all failed to show significant variability as a result of ditch concentration uncertainty. This result indicates that, while ditch return nutrient loads are not trivial, their exact concentrations are not needed to make relatively accurate predictions of in-stream DO. However, uncertainty in the upstream ON boundary did produce significant uncertainty during summer months in the in-stream model predictions of ON, periphyton biomass, and DO. The model is clearly more sensitive to changes in this boundary than to changes in agricultural ditch concentrations.

15.
Surface soil contamination is often regulated by using guidance values that specify the maximum amount of a pollutant that can be present without prompting a regulatory response. In the United States, there are at least 88 value sets, and another 35 worldwide, that provide guidance for at least one chlorinated ethene. Trichloroethene is the most commonly regulated chlorinated ethene (118 values) and may be the most commonly regulated synthetic organic surface soil contaminant. Cis- and trans-1,2-dichloroethene are the least regulated chlorinated ethenes. Overall, there are 617 guidance values for specific chlorinated ethenes plus another 32 for mixed isomers of dichloroethene. This analysis explores the origin, magnitude, and form of the variability of these values. Results indicate that the values span 4.9 to 6.6 orders of magnitude and follow distributions similar to lognormal random variables. However, the distributions include clusters of values similar to those advocated by the U.S. Environmental Protection Agency (USEPA) or the Canadian Council of Ministers of the Environment (CCME). Although only 9.5% of the regulatory guidance values (RGVs) are identical to USEPA or CCME values, 55% fall within the uncertainty bounds estimated for USEPA risk models. The results suggest that stronger national leadership and reduced risk-model uncertainty could be effective in reducing the RGV variability of chlorinated ethenes.

16.
Current models of the modulus of elasticity, E, of concrete recommended by the American Concrete Institute and the American Association of State Highway and Transportation Officials were derived for normally vibrated concrete (NVC). Because self-consolidating concrete (SCC) mixtures differ from NVC in the quantities and types of constituent materials, supplementary cementing materials, and chemical admixtures, the current models may not capture the complexity of SCC and thus may predict the E of SCC inaccurately. Although some authors recommend specific models to predict the E of SCC, these include only a single variable of assumed importance, namely the design compressive strength of concrete, fc′. However, other parameters may need to be accounted for when developing a prediction model for the E of SCC. In this paper, a Bayesian variable selection method is used to identify the significant parameters for predicting the E of SCC, and more accurate models for E are generated using these variables. The models have a parsimonious parametrization for ease of use in practice and properly account for the prevailing uncertainties.

17.
Project managers implement the concept of time contingency to account for uncertainty in duration estimates and prevent project completion delays. Some project managers also build a distribution of the project time contingency into the project activities to create a more manageable schedule. Generally, both the estimation and the distribution of the project time contingency are conducted using subjective approaches. Because schedule feasibility depends mainly on the variable behavior of the project activities, the estimate of the project time contingency and its allocation at the activity level should be obtained by considering the performance variability of each activity rather than relying on human judgment. In this paper, the stochastic allocation of project allowances method, based on Monte Carlo simulation, is proposed to estimate the project time contingency and allocate it among the project activities. Application of this method to a three-span bridge project results in a fair allocation of the project time contingency and provides practical means to control time contingencies at the activity level.
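A minimal Monte Carlo sketch of estimating a time contingency and allocating it to activities: simulate triangular activity durations, take the contingency as the gap between a target percentile of the total duration and the most-likely baseline, and split it in proportion to each activity's variance. The activities, distributions, target percentile, and the variance-proportional rule are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical serial activities: (min, most likely, max) durations in days
acts = np.array([[8, 10, 15], [4, 5, 9], [10, 12, 20], [3, 4, 6]])
n = 20_000
samples = np.stack([rng.triangular(a, m, b, size=n) for a, m, b in acts],
                   axis=1)

total = samples.sum(axis=1)
baseline = acts[:, 1].sum()     # deterministic most-likely schedule

# Project time contingency at a 90% confidence target (assumed level)
contingency = np.quantile(total, 0.90) - baseline

# Allocate the contingency to activities in proportion to their
# contribution to schedule variance (one simple allocation rule)
var = samples.var(axis=0)
alloc = contingency * var / var.sum()
```

Allocating by variability rather than evenly gives the more uncertain activities the larger share of the buffer, which is the intent of performance-based allocation.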

18.
A new approach for achieving guaranteed reliable results within the context of finite-element approximation of mechanical systems is developed. A reliable analysis requires that all sources of uncertainty and error be accommodated; the appropriateness of a partial differential equation for a given physical problem is beyond the scope of this work. Parameter uncertainty is treated with intervals, and guaranteed bounds on the "unknown" true solutions are obtained. This paper introduces an element-by-element, penalty-based interval finite-element analysis for linear elastic structural mechanics and solid mechanics problems. Material and load uncertainties are handled simultaneously. Numerical examples illustrate the ability of the method to maintain very sharp solution enclosures even as the number of interval parameters or the size of the problem increases.
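The flavor of interval analysis can be shown on a one-degree-of-freedom case: two parallel springs under an uncertain load, with u = F/(k1 + k2). Because u is monotone in each parameter and each interval quantity appears only once, endpoint evaluation gives the sharp enclosure; the element-by-element formulation in the paper is aimed at retaining such sharpness in full finite-element models, where repeated interval parameters otherwise cause overestimation. The numerical bounds below are hypothetical.

```python
# Interval parameters as (lower, upper) pairs -- assumed values
k1 = (180.0, 220.0)   # spring stiffness bounds (kN/m)
k2 = (90.0, 110.0)    # spring stiffness bounds (kN/m)
F = (9.0, 11.0)       # load bounds (kN)

# u = F / (k1 + k2) is increasing in F and decreasing in k1, k2,
# so the guaranteed enclosure comes from the parameter endpoints:
u_lo = F[0] / (k1[1] + k2[1])   # smallest load, stiffest structure
u_hi = F[1] / (k1[0] + k2[0])   # largest load, most flexible structure
```

Any true displacement consistent with the parameter bounds is guaranteed to lie in `[u_lo, u_hi]`, which is the "guaranteed bounds" notion in the abstract.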

19.
In the present paper, a simple method is proposed for predicting the extreme response of uncertain structures subjected to stochastic excitation. Many current approaches to extreme response prediction are based on the asymptotic generalized extreme value distribution, whose parameters are estimated from the observed data. In most practical situations, however, it is not easy to ascertain whether the given response time series contains data above a level high enough to be truly asymptotic, and hence the parameter values obtained by the adopted estimation methods, which determine the appropriate extreme value distribution, may be unreliable. In this paper, the extreme value statistics are instead predicted by exploiting the regularity of the tail of the mean upcrossing rate function. The method can handle combined uncertainties associated with nonergodic processes (system uncertainties) as well as ergodic ones (stochastic loading). For the specific applications considered, it can be assumed that the time series has an extreme value distribution with the Gumbel distribution as its asymptotic limit. The method is numerically illustrated through applications to a beam with spatially varying random properties and to wind turbines subjected to stochastic loading.
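The tail-regularity idea can be sketched for a stationary Gaussian process, where Rice's formula makes ln ν(η) linear in η²: estimate the empirical mean upcrossing rate at moderate levels, fit the regular tail, and extrapolate to a rare return level. The simulated process, the chosen levels, and the return period below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Unit-variance stationary Gaussian process as a sum of random cosines
n_comp = 40
t = np.linspace(0.0, 2000.0, 100_000)               # seconds
freqs = rng.uniform(0.5, 1.5, n_comp)               # Hz
phases = rng.uniform(0.0, 2.0 * np.pi, n_comp)
x = np.sqrt(2.0 / n_comp) * np.cos(
    2.0 * np.pi * np.outer(t, freqs) + phases).sum(axis=1)

# Empirical mean upcrossing rate at moderate levels
levels = np.linspace(0.5, 2.5, 9)
T = t[-1] - t[0]
nu = np.array([np.sum((x[:-1] < a) & (x[1:] >= a)) / T for a in levels])

# Regular tail: for a Gaussian process, ln(nu) is linear in level**2
coef = np.polyfit(levels**2, np.log(nu), 1)

# Extrapolate to the level upcrossed on average once per T_e seconds
T_e = 3600.0
eta = np.sqrt((np.log(1.0 / T_e) - coef[1]) / coef[0])
```

The extrapolated level `eta` lies well beyond the fitted range, which is exactly the point: the regular tail is estimated where data are plentiful and then carried into the extreme region.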

20.
The compression index is an important soil property that is essential to many geotechnical designs. Over the decades, a number of empirical correlations have been proposed to relate compressibility to other soil index properties, such as the liquid limit, plasticity index, in situ water content, void ratio, and specific gravity. The reliability, and thus the predictive value, of these correlations is often questioned. Moreover, selecting between simple and complicated models is a difficult task that often depends on subjective judgment: a more complicated model obviously provides a better fit to the data but does not necessarily offer an acceptable degree of robustness to measurement noise and modeling error. In the present study, the Bayesian probabilistic approach to model class selection is used to revisit the empirical multivariate linear regression formula for the compression index. The criterion for selecting the formula structure is the plausibility of a class of formulas conditional on the measurements, rather than the likelihood alone. The plausibility balances data-fitting capability against sensitivity to measurement and modeling error, which is quantified by the Ockham factor. The Bayesian method is applied to a data set of 795 records, comprising the compression index and other well-known geotechnical index properties of marine clay samples collected from various sites in South Korea. The correlation formula linking the compression index to the initial void ratio and liquid limit turns out to possess the highest plausibility among 18 candidate classes of formulas. The physical significance of this most plausible correlation is addressed; it is consistent with previous studies, and the Bayesian method thus provides confirmation from another angle.
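Bayesian model class selection can be approximated with the Bayesian information criterion, which, like the Ockham factor, trades data fit against model complexity. The sketch below compares candidate regression structures for a synthetic compression-index data set; the variables, coefficients, and noise level are invented for illustration and are not the paper's Korean marine clay data.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 300

# Synthetic "records": initial void ratio e0, liquid limit LL, water content w
e0 = rng.uniform(0.8, 2.5, n)
LL = rng.uniform(30.0, 90.0, n)
w = rng.uniform(20.0, 80.0, n)
# Assumed true model: Cc depends on e0 and LL only
cc = 0.25 * e0 + 0.004 * LL + rng.normal(0.0, 0.03, n)

def bic(X, y):
    """BIC of an ordinary least-squares fit; lower = more plausible class."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    s2 = resid @ resid / len(y)
    k = X1.shape[1] + 1          # coefficients plus the noise variance
    return len(y) * np.log(s2) + k * np.log(len(y))

candidates = {
    "e0": e0[:, None],
    "LL": LL[:, None],
    "e0+LL": np.column_stack([e0, LL]),
    "e0+LL+w": np.column_stack([e0, LL, w]),
}
scores = {name: bic(X, cc) for name, X in candidates.items()}
best = min(scores, key=scores.get)
```

The complexity penalty makes the parsimonious (e0, LL) class beat the over-parametrized one even though the latter fits slightly better, mirroring the Ockham-factor trade-off the abstract describes.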


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号