20 similar documents found.
1.
2.
Wilks DS 《Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences》2008,366(1875):2477-2490
Conceptual climate models are very simple mathematical representations of climate processes, which are especially useful because their workings can be readily understood. The usual procedure of representing effects of unresolved processes in such models using functions of the prognostic variables (parametrizations) that include no randomness generally results in these models exhibiting substantially less variability than do the phenomena they are intended to simulate. A viable yet still simple alternative is to replace the conventional deterministic parametrizations with stochastic parametrizations, which can be justified theoretically through the central limit theorem. The result is that the model equations are stochastic differential equations. In addition to greatly increasing the magnitude of variability exhibited by these models, and their qualitative fidelity to the corresponding real climate system, representation of unresolved influences by random processes can allow these models to exhibit surprisingly rich new behaviours of which their deterministic counterparts are incapable.
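The abstract above describes replacing deterministic parametrizations with stochastic ones so that the model equations become stochastic differential equations. As a minimal, purely illustrative sketch (not taken from the paper), the toy double-well model below adds white-noise forcing to a deterministic drift and is integrated with the Euler-Maruyama scheme; all parameter values are assumptions chosen for illustration.

```python
# Illustrative sketch: a deterministic ODE dx/dt = x - x**3 becomes the SDE
# dx = (x - x**3) dt + sigma dW when unresolved influences are represented by
# white noise. Euler-Maruyama integration; parameters are arbitrary.
import numpy as np

def simulate(n_steps=20000, dt=0.01, sigma=0.3, seed=0):
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = 1.0
    for n in range(n_steps - 1):
        drift = x[n] - x[n] ** 3
        x[n + 1] = x[n] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

print("variance with noise:   ", simulate().var())
print("variance without noise:", simulate(sigma=0.0).var())
```

With sigma = 0 the trajectory relaxes to a fixed point and its variance collapses; with noise the model exhibits sustained variability and occasional transitions between the two wells, echoing the richer behaviour the abstract attributes to stochastic parametrization.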
3.
S.K. Aytulun 《International Journal of Production Research》2013,51(10):2743-2764
Modelling and analysis of business processes are critical for identifying current business processes and for understanding what new processes contribute to the system. The quality of the results obtained by modelling and analysis significantly influences the success of business process reengineering (BPR). Therefore, continual development of the techniques used in business process modelling (BPM) and business process analysis (BPA) is necessary. However, when the proposed techniques are analysed, it becomes apparent that most repeat the same basic approach, although a few offer different visions. In BPM development studies, the use of time-activity scheduling is often considered secondary, or even neglected, because process modelling can be treated as project management and left under that label. Given the maturity and simplicity of project management techniques, many organizations could use them to manage their daily activities; these techniques also enable the modelling of stochastic situations, which is otherwise not possible with any BPM method. In this study, an existing business process with network properties is analysed using project scheduling techniques. Business processes are thus described as networks, modelled and timed using network properties, and stochastically analysed with GERT, a project-based process scheduling method. Finally, the results obtained with GERT are examined using the PERT-path approach.
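As a hypothetical illustration of the PERT-style analysis mentioned at the end of the abstract (the paper's actual GERT network is not reproduced here), the sketch below estimates the completion time of a tiny activity network with stochastic task durations by Monte Carlo sampling; the tasks, durations and network structure are invented.

```python
# Hypothetical three-task network: A and B run in parallel, C follows both.
# Task durations are random; the completion time is the longest path.
import numpy as np

rng = np.random.default_rng(1)

def duration(mean, sd):
    return max(rng.normal(mean, sd), 0.0)

def completion_time():
    a = duration(3.0, 0.5)   # task A
    b = duration(5.0, 1.0)   # task B, parallel to A
    c = duration(2.0, 0.3)   # task C, after A and B
    return max(a, b) + c

samples = np.array([completion_time() for _ in range(10_000)])
print(f"mean: {samples.mean():.2f}  95th percentile: {np.percentile(samples, 95):.2f}")
```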
4.
Holger Luczak Christopher Schlick Alexander Kuenzer Frank Ohmann 《Theoretical Issues in Ergonomics Science》2013,14(2):97-123
An approach to user modelling with discrete stochastic processes is presented, which aims at the dynamic individualization of user interfaces on the syntactic layer. The state of the art of syntactic user modelling is surveyed. The mathematical background of simple Markov chains and the 'classic' Hidden Markov Model is presented. Furthermore, dynamic Bayesian networks, which generalize these Markovian models, are introduced. A case study of the simulation experiments uses a multimodal user interface for supervisory control of advanced manufacturing cells. A corresponding simulation model is created and exploited to generate interaction cases, which form the empirical basis for the evaluation. Six topologies of dynamic Bayesian networks are evaluated for 100 interaction cases and 50 replications each: (1) Markov chain of order 1, (2) Hidden Markov Model, (3) autoregressive Hidden Markov Model, (4) factorial Hidden Markov Model, (5) simple hierarchical Hidden Markov Model, and (6) tree-structured Hidden Markov Model. The dependent variable is the prediction accuracy for a single prediction lead. In a first step, a one-way analysis of variance in conjunction with Tukey's post-hoc test demonstrated a significant superiority of the simple hierarchical Hidden Markov Model. In a second step, an additional two-way analysis of variance also indicated a significantly better prediction accuracy of the simple hierarchical Hidden Markov Model compared to the Hidden Markov Model, but the number of interaction cases also had a significant effect. Hence, the modeller has to take both factors, model topology and number of interaction cases, into account when designing syntactic user models with stochastic processes.
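For orientation only, the sketch below implements the simplest of the six topologies compared above, a first-order Markov chain over user actions fitted by transition counting and used for a one-step-ahead prediction; the action sequence and smoothing choice are assumptions, and the study's better-performing hierarchical Hidden Markov Model is not reproduced.

```python
# First-order Markov chain predictor for the next user action (prediction lead = 1).
# The interaction sequence below is a made-up stand-in for generated cases.
import numpy as np

actions = ["open", "select", "zoom", "select", "open", "select", "zoom", "select"]
states = sorted(set(actions))
idx = {s: i for i, s in enumerate(states)}

counts = np.ones((len(states), len(states)))      # add-one smoothing
for prev, nxt in zip(actions, actions[1:]):
    counts[idx[prev], idx[nxt]] += 1
P = counts / counts.sum(axis=1, keepdims=True)    # row-stochastic transition matrix

def predict_next(action):
    return states[int(np.argmax(P[idx[action]]))]

print(predict_next("open"))   # most probable next action after 'open'
```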
5.
This paper presents the measurement and a statistical analysis of the resultant force system, consisting of an axial force and torque, in BTA deep hole machining. The measurements were performed using a specially designed two-component piezoelectric dynamometer and adopting the rotating cutting tool-stationary workpiece procedure. The dynamometer was calibrated for static and dynamic outputs, and techniques were employed for increasing the measuring accuracy and reducing the cross-interference by obtaining the elements of the system transfer function. Experiments were carried out to measure the mean values and the dynamic fluctuations of the axial force and torque. The recorded data were processed and analysed to establish all major statistical properties of the axial force and torque. Results show that the dynamic fluctuations of the axial force and torque in BTA deep hole machining can be represented by a stationary wideband process with a Gaussian density distribution function. Such a mathematical model is essential for evaluating the dynamic response of the machine-workpiece system as well as the true motion of the cutting tool tip, and for establishing the reliability of the machining process.
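As a loose illustration of the kind of model the abstract arrives at (a stationary wideband Gaussian process), the sketch below generates band-limited Gaussian noise and checks its mean, standard deviation and approximate normality; the bandwidth, filter order and sample count are assumptions, not values from the paper.

```python
# Stand-in for the measured force fluctuations: stationary wideband Gaussian
# noise obtained by low-pass filtering white noise, followed by simple checks.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import normaltest

rng = np.random.default_rng(42)
white = rng.standard_normal(200_000)

# Keep a broad band (up to 0.4 of the Nyquist frequency) with a 4th-order Butterworth filter.
b, a = butter(4, 0.4)
x = filtfilt(b, a, white)

print("mean:", round(x.mean(), 4), "std:", round(x.std(), 4))
print("normality test p-value:", normaltest(x).pvalue)
```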
6.
Wesley W. Ingwersen Ahjond S. Garmestani Michael A. Gonzalez Joshua J. Templeton 《Clean Technologies and Environmental Policy》2014,16(4):719-730
The science of climate change integrates many scientific fields to explain and predict the complex effects of greenhouse gas concentrations on the planet's energy balance, weather patterns, and ecosystems as well as economic and social systems. A changing climate requires responses to curtail climate forcing as well as to adapt to impending changes. Responses can be categorized into mitigation and adaptation: the former involving efforts to reduce greenhouse gas emissions, and the latter involving strategies to adapt to predicted changes. These responses must be of significant scale and extent to be effective, but significant tradeoffs and unintended effects must be avoided. Concepts and science based on systems theory are needed to reduce the risk of unintended consequences from potential responses to climate change. We propose expanding on a conventional risk-based approach to include additional ways of analyzing risks and benefits, such as considering potential cascading ecological effects, full life cycle environmental impacts, and unintended consequences, as well as considering possible co-benefits of responses. Selected responses to climate change are assessed with this expanded set of criteria, and we find that mitigation measures that reduce greenhouse gas emissions while providing corollary benefits are likely to have less negative indirect impacts than large-scale solar radiation management approaches. However, because effects of climate change are unavoidable in the near and medium term, adaptation strategies that will make societies more resilient in the face of impending change are essential to sustainability.
7.
Concrete is a highly heterogeneous material, because of its composite structure, but also because of the physical phenomena that take place during hardening (initial stresses, drying shrinkage, heat exchanges). This heterogeneity can explain some aspects of the complex mechanical behaviour of concrete, particularly the transition from uniform to localized cracking and the important size effect. A numerical procedure taking the statistical aspects of this heterogeneity into account has been developed and implemented. It permits us to reproduce and explain the principal experimental results for the behaviour of concrete under tension.
8.
Palmer TN Williams PD 《Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences》2008,366(1875):2421-2427
Finite computing resources limit the spatial resolution of state-of-the-art global climate simulations to hundreds of kilometres. In neither the atmosphere nor the ocean are small-scale processes such as convection, clouds and ocean eddies properly represented. Climate simulations are known to depend, sometimes quite strongly, on the resulting bulk-formula representation of unresolved processes. Stochastic physics schemes within weather and climate models have the potential to represent the dynamical effects of unresolved scales in ways that conventional bulk-formula representations cannot. The application of stochastic physics to climate modelling is a rapidly advancing, important and innovative topic. The latest research findings are gathered together in the Theme Issue for which this paper serves as the introduction.
9.
The fibre component of many fibre reinforced composites can be modelled by a system of non-overlapping straight cylinders. In this paper we discuss a model based on a random sequential adsorption (RSA) process. Geometric characteristics of the fibre system such as the fibre volume fraction or the fibre direction distribution are estimated from tomographic images of composite samples. Using this information, we fit RSA models to samples of a glass fibre reinforced polymer and a fibre reinforced ultra high performance concrete.
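To illustrate only the core RSA idea referred to above (not the paper's cylinder model or its fitting to tomographic data), the sketch below places non-overlapping discs in a unit square by sequential random insertion; the radius and attempt budget are arbitrary.

```python
# Random sequential adsorption (RSA) of equal discs in 2D: propose uniformly
# random centres and accept only those that do not overlap accepted discs.
import numpy as np

rng = np.random.default_rng(0)
radius, box = 0.02, 1.0
centres = []

for _ in range(50_000):                      # fixed attempt budget
    c = rng.uniform(radius, box - radius, size=2)
    if all(np.linalg.norm(c - p) >= 2 * radius for p in centres):
        centres.append(c)

area_fraction = len(centres) * np.pi * radius**2 / box**2
print(f"placed {len(centres)} discs, area fraction = {area_fraction:.3f}")
```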
10.
H. Bargmann 《Acta Mechanica》1997,125(1-4):63-71
11.
In a production flow line operating in a stochastic environment, variability affects system performance. The stochastic nature of real-world processes is commonly classified into three types: arrival, service and departure process variability. So far, only service process (task time) variation has been considered in assembly line (AL) balancing studies. In this study, both service and flow process variations are modelled together with the AL balancing problem. The best task assignment to stations is sought to achieve maximal production. A novel approach combining queueing networks and constraint programming (CP) has been developed. First, the theoretical basis for using queueing models to evaluate AL performance is established. In this context, a diffusion approximation is used to evaluate the performance of the line and to model the variability relations between the work stations. Subsequently, the CP approach is employed to obtain the optimal task assignments to the stations. To assess the effectiveness of the proposed procedure, the results are compared with simulation. The results show that the procedure is an effective solution method for measuring the performance of stochastic ALs and achieving the optimal balance.
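The paper's diffusion approximation and CP model are not reproduced here; as a rough, generic illustration of how two-moment information about variability enters a station-level performance estimate, the sketch below uses Kingman's GI/G/1 approximation for mean waiting time with assumed rates and coefficients of variation.

```python
# Kingman's approximation for the mean waiting time at a single GI/G/1 station.
# ca2 and cs2 are squared coefficients of variation of inter-arrival and
# service times; utilisation rho must be below 1. Values are illustrative.

def kingman_wait(arrival_rate, service_rate, ca2, cs2):
    rho = arrival_rate / service_rate
    service_time = 1.0 / service_rate
    return (rho / (1.0 - rho)) * ((ca2 + cs2) / 2.0) * service_time

# Station at 90% utilisation with exponential arrivals and moderately variable service.
print(f"mean wait ~ {kingman_wait(0.9, 1.0, ca2=1.0, cs2=0.5):.2f} time units")
```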
12.
Hamid Teimouri Thao N. Nguyen Anatoly B. Kolomeisky 《Journal of the Royal Society Interface》2021,18(182)
Antimicrobial peptides (AMPs) produced by multicellular organisms as their immune system's defence against microbes are actively considered as natural alternatives to conventional antibiotics. Although substantial progress has been achieved in studying AMPs, the microscopic mechanisms of their functioning remain poorly understood. Here, we develop a new theoretical framework to investigate how AMPs are able to neutralize bacteria efficiently. In our minimal theoretical model, the most relevant processes, the entry of AMPs into a single bacterial cell and the subsequent inhibition of that cell, are described stochastically. Using complementary master-equation approaches, all relevant features of bacterial clearance dynamics by AMPs, such as the probability of inhibition and the mean times to clearance, are explicitly evaluated. It is found that both processes, entry and inhibition, are equally important for the efficient functioning of AMPs. Our theoretical method naturally explains the wide spectrum of efficiencies of existing AMPs and their heterogeneity at the single-cell level. Theoretical calculations are also consistent with existing single-cell measurements. Thus, the presented theoretical approach clarifies some microscopic aspects of the action of AMPs on bacteria.
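As a crude Monte Carlo caricature of the two stochastic steps named in the abstract (entry followed by inhibition), the sketch below estimates an inhibition probability and a mean time to inhibition; the competing-rate structure and all rate values are assumptions, not the paper's master-equation model.

```python
# Two-step caricature: an AMP enters the cell at rate k_in (competing with loss
# at rate k_loss), then inhibits it at rate k_kill (competing with expulsion at
# rate k_out). Times to the next event are exponential; rates are made up.
import numpy as np

rng = np.random.default_rng(3)
k_in, k_loss = 1.0, 0.5
k_kill, k_out = 0.8, 0.2

def one_cell():
    t = rng.exponential(1.0 / (k_in + k_loss))
    if rng.random() > k_in / (k_in + k_loss):
        return False, t                              # AMP lost before entry
    t += rng.exponential(1.0 / (k_kill + k_out))
    return rng.random() < k_kill / (k_kill + k_out), t

results = [one_cell() for _ in range(100_000)]
times_inhibited = [t for ok, t in results if ok]
print("inhibition probability ~", len(times_inhibited) / len(results))
print("mean time to inhibition ~", np.mean(times_inhibited))
```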
13.
Annan JD Hargreaves JC 《Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences》2007,365(1857):2077-2088
In this paper, we review progress towards efficiently estimating parameters in climate models. Since the general problem is inherently intractable, a range of approximations and heuristic methods have been proposed. Simple Monte Carlo sampling methods, although easy to implement and very flexible, are rather inefficient, making implementation possible only in the very simplest models. More sophisticated methods based on random walks and gradient-descent methods can provide more efficient solutions, but it is often unclear how to extract probabilistic information from such methods, and the computational costs are still generally too high for their application to state-of-the-art general circulation models (GCMs). The ensemble Kalman filter is an efficient Monte Carlo approximation which is optimal for linear problems, but we show here how its accuracy can degrade in nonlinear applications. Methods based on particle filtering may provide a solution to this problem but have yet to be studied in any detail in the realm of climate models. Statistical emulators show great promise for future research, and their computational speed would eliminate much of the need for efficient sampling techniques. However, emulation of a full GCM has yet to be achieved, and constructing one represents a substantial computational task in itself.
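For reference only, the sketch below shows a single analysis step of a stochastic (perturbed-observation) ensemble Kalman filter for a directly observed scalar; the ensemble size, prior and observation error are invented, and nothing here reflects the paper's climate-model experiments.

```python
# One stochastic EnKF analysis step for a scalar state observed directly (H = 1).
import numpy as np

rng = np.random.default_rng(7)
n_ens = 100
prior = rng.normal(loc=2.0, scale=1.0, size=n_ens)   # prior ensemble
obs, obs_err = 3.0, 0.5                               # observation and its error std

P = prior.var(ddof=1)                                 # ensemble prior variance
K = P / (P + obs_err**2)                              # Kalman gain

perturbed_obs = obs + obs_err * rng.standard_normal(n_ens)
posterior = prior + K * (perturbed_obs - prior)       # member-wise update

print("posterior mean:", posterior.mean(), "posterior std:", posterior.std(ddof=1))
```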
14.
Considering that traditional image filtering algorithms weaken image features while removing noise, and exploiting the inherent self-similarity of imaging systems together with the completeness and stability of the empirical mode decomposition (EMD) algorithm, an improved EMD image denoising algorithm based on stochastic differentials is proposed. The algorithm first decomposes the image with EMD to obtain several intrinsic mode function (IMF) images and a residual image; different stochastic-differential filtering strategies are then applied to the IMF images and the residual image to obtain the filtered result for each layer; finally, the layers are recombined to form the denoised image. Matlab simulations show that the algorithm preserves image features while removing noise.
15.
Quasi-static material tests using specimens cut from a generic cast component are performed to study the behaviour of the high-pressure die-cast magnesium alloy AM60 under different stress states. The experimental data set is applied to establish a validated probabilistic methodology for finite element modelling of thin-walled die-castings subjected to quasi-static loading. The test specimens are modelled in the explicit finite element (FE) code LS-DYNA using shell elements. The cast magnesium alloy AM60 is modelled using an elasto-plastic constitutive model including a high-exponent, isotropic yield criterion, the associated flow law and isotropic hardening. To simulate fracture, the Cockcroft-Latham fracture criterion is adopted, and the fracture parameter is chosen to follow a modified weakest-link Weibull distribution. Comparison between the experimental and predicted behaviour of the cast magnesium specimens gives very promising results.
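A simplified stand-in for the probabilistic fracture treatment described above (the paper uses a modified weakest-link Weibull distribution; the sketch below uses a plain two-parameter Weibull and a plain weakest-link rule, with all numbers hypothetical):

```python
# Draw a critical fracture-parameter value for every element from a Weibull
# distribution and let the weakest element govern failure of the specimen.
import numpy as np

rng = np.random.default_rng(11)
n_elements = 5000
shape, scale = 6.0, 80.0                       # hypothetical Weibull parameters
W_c = scale * rng.weibull(shape, size=n_elements)

# Hypothetical loading: each element's fracture measure grows as load * severity.
severity = rng.uniform(0.5, 1.0, size=n_elements)
load_at_failure = (W_c / severity).min()       # weakest-link rule

print("predicted failure load factor:", round(load_at_failure, 2))
```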
16.
Neelin JD Peters O Lin JW Hales K Holloway CE 《Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences》2008,366(1875):2581-2604
Convective quasi-equilibrium (QE) has for several decades stood as a key postulate for parametrization of the impacts of moist convection at small scales upon the large-scale flow. Departures from QE have motivated stochastic convective parametrization, which in its early stages may be viewed as a sensitivity study. Introducing plausible stochastic terms to modify the existing convective parametrizations can have substantial impact, but, as for so many aspects of convective parametrization, the results are sensitive to details of the assumed processes. We present observational results aimed at helping to constrain convection schemes, with implications for conventional, stochastic and 'superparametrization' schemes alike. The original vision of QE due to Arakawa fares well as a leading approximation, but with a number of updates. Some, like the imperfect connection between the boundary layer and the free troposphere, and the importance of free-tropospheric moisture to buoyancy, are quantitatively important but lie within the framework of ensemble-average convection slaved to the large scale. Observations of critical phenomena associated with a continuous phase transition for precipitation as a function of water vapour and temperature suggest a more substantial revision. While the system's attraction to the critical point is predicted by QE, several fundamental properties of the transition, including high precipitation variance in the critical region, need to be added to the theory. Long-range correlations imply that this variance does not reduce quickly under spatial averaging; scaling associated with this spatial averaging has potential implications for superparametrization. Long tails of the distribution of water vapour create relatively frequent excursions above criticality with associated strong precipitation events.
17.
18.
A. M. Maniatty P. R. Dawson Y.-S. Lee 《International Journal for Numerical Methods in Engineering》1992,35(8):1565-1588
An algorithm for integrating the constitutive equations for an elasto-viscoplastic cubic crystal is presented, which is shown to be easily employed in a polycrystalline analysis. Anisotropic elastic behaviour is incorporated into the standard constitutive equations for ductile single crystals. The algorithm is shown to be efficient, robust and general. The primary advantage of this algorithm is that it provides an implicit integration of the plastic deformation gradient while including the elastic response. This permits taking large time steps while maintaining accuracy and stability. Several polycrystalline examples are presented to demonstrate the effect of the time step on the solution. Examples are also presented comparing the algorithm described herein with an algorithm which neglects the elastic part of the deformation. In addition, the effect of the anisotropic component of the elasticity is investigated by comparing the results with those obtained assuming isotropic elasticity.
19.
The extensive use of FRP composite materials in a wide range of industries, and their inherent variability, have prompted many researchers to assess their performance from a probabilistic perspective. This paper attempts to quantify the uncertainty in FRP composites and to summarise the different stochastic modelling approaches suggested in the literature. Researchers have considered uncertainties starting at the constituent (fibre/matrix) level, at the ply level, or at the coupon or component level. The constituent-based approach can be further classified into a random-variable-based stochastic computational mechanics approach (whose usage is comparatively limited owing to complex test-data requirements and possible uncertainty-propagation errors) and the more widely used morphology-based random composite modelling, which has been recommended for exploring local damage and failure characteristics. Ply-level analysis using either stiffness/strength or fracture mechanics based models is suggested when the ply characteristics influence the composite properties significantly, or as a way to check the propagation of uncertainties across length scales. On the other hand, coupon- or component-level uncertainty modelling is suggested when global response characteristics govern the design objectives. Though relatively unexplored, appropriate cross-fertilisation between these approaches in a multi-scale modelling framework seems a promising avenue for stochastic analysis of composite structures. It is hoped that this review paper can facilitate and strengthen this process.
20.
Kwasniok F 《Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences》2012,370(1962):1061-1086
A new approach for data-based stochastic parametrization of unresolved scales and processes in numerical weather and climate prediction models is introduced. The subgrid-scale model is conditional on the state of the resolved scales, consisting of a collection of local models. A clustering algorithm in the space of the resolved variables is combined with statistical modelling of the impact of the unresolved variables. The clusters and the parameters of the associated subgrid models are estimated simultaneously from data. The method is implemented and explored in the framework of the Lorenz '96 model using discrete Markov processes as local statistical models. Performance of the cluster-weighted Markov chain scheme is investigated for long-term simulations as well as ensemble prediction. It clearly outperforms simple parametrization schemes and compares favourably with another recently proposed subgrid modelling scheme also based on conditional Markov chains.
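For context, the sketch below integrates the single-level Lorenz '96 system, the standard testbed named in the abstract, with a fourth-order Runge-Kutta scheme; the forcing F = 8 and the other settings are the usual choices, and the paper's cluster-weighted Markov chain scheme itself is not reproduced.

```python
# Single-level Lorenz '96: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F,
# with cyclic indices; F = 8 gives the standard chaotic regime.
import numpy as np

def lorenz96_rhs(x, F=8.0):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(n_vars=40, dt=0.01, n_steps=5000, seed=0):
    rng = np.random.default_rng(seed)
    x = 8.0 + 0.01 * rng.standard_normal(n_vars)      # perturbed rest state
    traj = np.empty((n_steps, n_vars))
    for n in range(n_steps):
        k1 = lorenz96_rhs(x)                           # classical RK4 step
        k2 = lorenz96_rhs(x + 0.5 * dt * k1)
        k3 = lorenz96_rhs(x + 0.5 * dt * k2)
        k4 = lorenz96_rhs(x + dt * k3)
        x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[n] = x
    return traj

traj = integrate()
print("climatological mean:", traj.mean(), "std:", traj.std())
```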