Similar Documents
20 similar records found (search time: 15 ms)
1.
Some speculative proposals are made for extending current stochastic sub-gridscale parametrization methods using techniques adopted from computer graphics and flow visualization. The idea is to emulate sub-filter-scale physical process organization and time evolution on a fine grid and couple the implied coarse-grained tendencies with a forecast model. A two-way interaction is envisaged, so that fine-grid physics (e.g. deep convective clouds) responds to the forecast model fields. The fine-grid model may be as simple as a two-dimensional cellular automaton or as computationally demanding as a cloud-resolving model, the latter similar to the coupling strategy envisaged in 'super-parametrization'. Computer codes used in computer games and visualization software illustrate the potential for cheap but realistic simulation where emphasis is placed on algorithmic stability and visual realism rather than pointwise accuracy in a predictive sense. In an ensemble prediction context a computationally cheap technique would be essential, and some possibilities are outlined. An idealized proof-of-concept simulation is described, which highlights technical problems such as the nature of the coupling.
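As a toy version of this idea (the update rule, grid sizes and occupation probability below are invented for illustration, not taken from the paper), a cheap cellular automaton can be run on a fine grid and block-averaged into coarse-grained tendencies that a forecast model could couple to:

```python
import random

def step_ca(grid):
    """One update of a simple 2-D cellular automaton (Conway-style rule on a
    periodic grid), standing in for organized sub-grid convective activity."""
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            live = sum(grid[(i + di) % n][(j + dj) % n]
                       for di in (-1, 0, 1) for dj in (-1, 0, 1)
                       if (di, dj) != (0, 0))
            new[i][j] = 1 if (live == 3 or (grid[i][j] and live == 2)) else 0
    return new

def coarse_grain(grid, block):
    """Block-average the fine CA field into a coarse field of tendencies."""
    n = len(grid)
    m = n // block
    return [[sum(grid[bi * block + i][bj * block + j]
                 for i in range(block) for j in range(block)) / block ** 2
             for bj in range(m)] for bi in range(m)]

random.seed(1)
fine = [[1 if random.random() < 0.3 else 0 for _ in range(16)] for _ in range(16)]
fine = step_ca(fine)
tendency = coarse_grain(fine, 4)   # 4x4 coarse field to feed back to the model
```

In a two-way coupling, the forecast-model fields would in turn modulate the CA rule or its occupation probability.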

2.
Conceptual climate models are very simple mathematical representations of climate processes, which are especially useful because their workings can be readily understood. The usual procedure of representing effects of unresolved processes in such models using functions of the prognostic variables (parametrizations) that include no randomness generally results in these models exhibiting substantially less variability than do the phenomena they are intended to simulate. A viable yet still simple alternative is to replace the conventional deterministic parametrizations with stochastic parametrizations, which can be justified theoretically through the central limit theorem. The result is that the model equations are stochastic differential equations. In addition to greatly increasing the magnitude of variability exhibited by these models, and their qualitative fidelity to the corresponding real climate system, representation of unresolved influences by random processes can allow these models to exhibit surprisingly rich new behaviours of which their deterministic counterparts are incapable.
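As a concrete illustration of the variability point, the sketch below integrates a toy stochastic parametrization, an Ornstein-Uhlenbeck process, with the Euler-Maruyama scheme; all coefficients are hypothetical, and setting sigma = 0 recovers the deterministic counterpart:

```python
import math
import random

def simulate(sigma, n_steps=10000, dt=0.01, theta=1.0, seed=0):
    """Euler-Maruyama integration of dx = -theta*x dt + sigma dW,
    a toy stochastic parametrization; sigma = 0 is the deterministic model."""
    rng = random.Random(seed)
    x, path = 1.0, []
    for _ in range(n_steps):
        x += -theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

det = simulate(sigma=0.0)   # decays to the fixed point, little variability
sto = simulate(sigma=0.5)   # fluctuates with stationary variance sigma^2/(2*theta)
```

The stochastic run exhibits far more variability than the deterministic one, which is precisely the behaviour the abstract describes.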

3.
Concrete is a highly heterogeneous material, not only because of its composite structure but also because of the physical phenomena that take place during hardening (initial stresses, drying shrinkage, heat exchanges). This heterogeneity can explain some aspects of the complex mechanical behaviour of concrete, particularly the transition from uniform to localized cracking and the pronounced size effect. A numerical procedure taking the statistical aspects of this heterogeneity into account has been developed and implemented. It makes it possible to reproduce and explain the principal experimental results for the behaviour of concrete under tension.
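The statistical size effect mentioned above can be reproduced with a weakest-link toy model: strength is the minimum of independently drawn local strengths, so larger specimens are weaker on average. The Gaussian local-strength parameters below are hypothetical:

```python
import random

def specimen_strength(n_elements, rng):
    """Weakest-link sketch: the specimen fails at its weakest element, so
    strength is the minimum of i.i.d. local tensile strengths (hypothetical
    Gaussian distribution, mean 3.0 MPa, s.d. 0.4 MPa)."""
    return min(rng.gauss(3.0, 0.4) for _ in range(n_elements))

rng = random.Random(11)
small = [specimen_strength(10, rng) for _ in range(300)]    # small specimens
large = [specimen_strength(100, rng) for _ in range(300)]   # 10x larger specimens
mean_small = sum(small) / len(small)
mean_large = sum(large) / len(large)
```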

4.
Dynamic recrystallisation (DRX) governs the plastic flow behaviour and the final microstructure of many crystalline materials during thermomechanical processing. Understanding the recrystallisation process is the key to linking dislocation activities at the mesoscopic scale to mechanical properties at the macroscopic scale. A modelling methodology coupling fundamental metallurgical principles with the cellular automaton (CA) technique is derived here to simulate the dynamic recrystallisation process. Experimental findings for a titanium alloy are considered for comparison with theory. The model takes practical experimental parameters into account and predicts the nucleation and growth kinetics of dynamically recrystallised grains. Hence it can simulate the different stages of microstructural evolution during thermomechanical processing. The effects of hot-working temperature and strain rate on microstructure were studied, and the results compared with experimental findings.
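A minimal CA caricature of the nucleation-and-growth mechanism (the rule and the nucleation probability are invented, and far simpler than the paper's metallurgically calibrated model): unrecrystallised cells may nucleate at random, and cells bordering a recrystallised grain are consumed by boundary migration, giving a sigmoidal recrystallised fraction:

```python
import random

def drx_step(state, p_nuc, rng):
    """One CA step: unrecrystallised cells (0) nucleate with probability
    p_nuc, and cells adjacent to a recrystallised cell (1) are consumed
    by grain-boundary migration (periodic boundaries)."""
    n = len(state)
    new = [row[:] for row in state]
    for i in range(n):
        for j in range(n):
            if state[i][j] == 0:
                neigh = any(state[(i + di) % n][(j + dj) % n]
                            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                if neigh or rng.random() < p_nuc:
                    new[i][j] = 1
    return new

rng = random.Random(7)
state = [[0] * 20 for _ in range(20)]
fractions = []                          # recrystallised fraction vs. CA step
for _ in range(10):
    state = drx_step(state, p_nuc=0.02, rng=rng)
    fractions.append(sum(map(sum, state)) / 400)
```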

5.
The Workbench is an adaptable and efficient tool for performing thermodynamic and kinetic calculations. This computer program uses various artificial intelligence techniques to provide a more versatile modelling system than has been possible with conventional programs. The structure and operation of the program are described, and a simplified example is worked through to illustrate the Workbench environment. Practical applications of the Workbench are given for predicting magnetic microstructures in Nd-Fe-B permanent magnets and for studying the long-term structural stability of Co-Pd multilayers.

6.
Quasi-static material tests using specimens cut from a generic cast component are performed to study the behaviour of the high-pressure die-cast magnesium alloy AM60 under different stress states. The experimental data set is applied to establish a validated probabilistic methodology for finite element modelling of thin-walled die-castings subjected to quasi-static loading. The test specimens are modelled in the explicit finite element (FE) code LS-DYNA using shell elements. The cast magnesium alloy AM60 is modelled using an elasto-plastic constitutive model including a high-exponent, isotropic yield criterion, the associated flow law and isotropic hardening. To simulate fracture, the Cockcroft-Latham fracture criterion is adopted, and the fracture parameter is chosen to follow a modified weakest-link Weibull distribution. Comparison between the experimental and predicted behaviour of the cast magnesium specimens gives very promising results.
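To illustrate the fracture treatment described above (not the paper's calibrated model), the sketch below accumulates the Cockcroft-Latham damage measure W = ∫ max(σ1, 0) dε_p under a hypothetical hardening law and samples the fracture threshold per specimen from a Weibull distribution, mimicking the weakest-link scatter:

```python
import math
import random

def weibull_sample(u, scale, shape):
    """Inverse-CDF sample of a Weibull-distributed Cockcroft-Latham
    fracture threshold W_c (weakest-link assumption)."""
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

def fracture_strain(stress, wc, d_eps=1e-3):
    """Integrate W = sum of max(sigma1, 0) * d_eps until W reaches the
    sampled threshold wc; returns the strain at fracture."""
    w, eps = 0.0, 0.0
    while w < wc:
        eps += d_eps
        w += max(stress(eps), 0.0) * d_eps
    return eps

rng = random.Random(42)
sigma1 = lambda e: 180.0 * (0.01 + e) ** 0.1   # hypothetical hardening law (MPa)
strains = [fracture_strain(sigma1,
                           weibull_sample(rng.random(), scale=20.0, shape=4.0))
           for _ in range(50)]                 # scatter of 50 virtual specimens
```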

7.
We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single-column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared with deterministic ensembles describing initial condition uncertainty and also with combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating transitions between active and suppressed periods of tropical convection.

8.
Optimization and Engineering - Designing inspection frequency to efficiently track stochastic dynamics is a fundamental engineering problem. In particular, tracking environmental variables like water...

9.
10.
Microbiological safety of food relies on microbial examination of raw materials and final products, coupled with monitoring of process parameters and hygiene standards. The concept of predictive microbiology was developed to evaluate the effect of processing, distribution and storage operations on food safety. The objective of this paper is to review the approaches proposed by researchers to quantify the effect of competitiveness or fluctuating conditions on bacterial behaviour. The main microbial models that quantify the effects of various hurdles on microbial kinetics are presented. To provide complementary information for microbial models, three areas have to be considered: process engineering, which characterises and models mass and heat transfer; microbiology, which characterises and models bacterial behaviour and metabolite production; and applied thermodynamics, which characterises and models the physico-chemical properties of a food product. Global modelling approaches, developed by integrating the previous models, are illustrated with recent results.
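As an example of the model classes being reviewed, here is a textbook primary/secondary model pair: log-linear growth after a lag phase, with the maximum growth rate given by a Ratkowsky-type square-root model of temperature. All coefficients are illustrative, not from the paper:

```python
def ratkowsky_rate(T, b=0.02, T_min=2.0):
    """Secondary (square-root) model: sqrt(mu_max) = b * (T - T_min).
    Returns the maximum specific growth rate (1/h) at temperature T (degC);
    no growth at or below T_min."""
    if T <= T_min:
        return 0.0
    return (b * (T - T_min)) ** 2

def log_count(N0_log, T, hours, lag=0.0):
    """Primary model: log-linear growth after a lag phase.
    N0_log is the initial count in log10 CFU/g."""
    mu = ratkowsky_rate(T)
    t = max(hours - lag, 0.0)
    return N0_log + mu * t / 2.303   # convert the ln-based rate to log10 units
```

A process-engineering heat-transfer model would supply the temperature history T(t) that drives such kinetics.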

11.
Y. Narahari & N. Viswanadham, Sadhana, 1987, 11(1-2): 187-208
The fault-tolerant multiprocessor (FTMP) is a bus-based multiprocessor architecture with real-time and fault-tolerance features and is used in critical aerospace applications. A preliminary performance evaluation is of crucial importance in the design of such systems. In this paper, we review stochastic Petri nets (SPNs) and develop SPN-based performance models for FTMP. These performance models enable efficient computation of important performance measures such as processing power, bus contention, bus utilization, and waiting times.
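A rough, stdlib-only illustration of the kind of measure such SPN models compute (this is a generic toy net with exponential race semantics, not the paper's FTMP models): several processors contend for one shared bus, and the simulation estimates bus utilization:

```python
import random

def spn_bus_utilization(n_proc, rate_req, rate_srv, horizon, seed=0):
    """Race-semantics simulation of a toy stochastic Petri net: tokens for
    n_proc processors alternate between a 'computing' place (exponential
    firing rate rate_req each) and a single shared bus that serves one
    request at a time (rate rate_srv). Returns estimated bus utilization."""
    rng = random.Random(seed)
    t = busy = 0.0
    computing, waiting, serving = n_proc, 0, 0
    while t < horizon:
        rates = []
        if computing:
            rates.append(("req", computing * rate_req))
        if serving:
            rates.append(("done", rate_srv))
        total = sum(r for _, r in rates)
        dt = rng.expovariate(total)          # time to the next firing
        t += dt
        busy += serving * dt
        u = rng.random() * total             # pick the winning transition
        event, acc = rates[0][0], 0.0
        for name, r in rates:
            acc += r
            if u <= acc:
                event = name
                break
        if event == "req":
            computing -= 1
            waiting += 1
        else:
            serving = 0
            computing += 1
        if waiting and not serving:          # bus grabs the next queued request
            waiting -= 1
            serving = 1
    return busy / horizon

util = spn_bus_utilization(n_proc=3, rate_req=1.0, rate_srv=5.0, horizon=20000.0)
```

For these rates the model is a small machine-repairman chain whose exact utilization is about 0.47, so the estimate can be checked analytically.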

12.
13.
System modelling and the assessment of system performance are the two key phases of any reliability or risk analysis study. However, it is well known that most of the methods devoted to this goal are unsuitable when the physical behaviour of the system cannot be decoupled from its probabilistic behaviour, as is the case for dynamic process systems. To overcome this limitation, some alternative methods have recently appeared, but they are not very accessible and may, for this reason, be ignored by most practitioners. Among these methods, simulation-based approaches deserve particular attention. From this point of view, the authors propose a straightforward approach to this problem, applying stochastic Petri nets to a simple and well-known dynamic system from the literature. In addition, to test this approach, the authors have applied it to a system with periodically tested components. The results obtained are compared with those already published, the limits of the method are underlined, and its efficiency in solving this kind of problem is examined.
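As a sketch of the kind of Monte Carlo evaluation such simulation methods perform (the rates and intervals below are made up, and the full Petri-net machinery is omitted), consider a single periodically tested component whose failures stay latent until the next test:

```python
import math
import random

def unavailability(lam, test_interval, horizon, seed=0):
    """Monte Carlo estimate of mean unavailability for a periodically tested
    component: failures (exponential rate lam) stay latent until the next
    scheduled test, where the fault is found and repaired instantly."""
    rng = random.Random(seed)
    t = down = 0.0
    while t < horizon:
        fail = t + rng.expovariate(lam)
        if fail >= horizon:
            break
        next_test = math.ceil(fail / test_interval) * test_interval
        down += min(next_test, horizon) - fail   # latent dead time
        t = next_test                            # repaired; new life begins
    return down / horizon

# failure rate 1e-3 per hour, tested every 100 h: the mean latent dead time is
# roughly half a test interval per failure, so unavailability is near 0.05
q = unavailability(lam=1e-3, test_interval=100.0, horizon=1_000_000.0)
```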

14.
15.
Modelling and analysis of business processes is critical for identifying current business processes and understanding the contribution of new processes to the system. The quality of the results obtained by modelling and analysis significantly influences the success of business process reengineering (BPR). Constant development of the techniques used in business process modelling (BPM) and business process analysis (BPA) is therefore necessary. However, when the proposed techniques are analysed it becomes obvious that they repeat the same basic approach, although a few offer different visions. In BPM development studies, the use of time-activity scheduling is often considered secondary, or even neglected, because process modelling can be treated as project management and remain under this label. Given the maturity and simplicity of project management techniques, many organizations could use them to manage their daily activities; such techniques also enable the modelling of stochastic situations, which is not otherwise possible with any BPM method. In this study an existing business process with network properties is analysed using project scheduling techniques. Business processes are described as networks, modelled and timed by network properties, and stochastically analysed using GERT, a project-based process scheduling method. Finally, the results obtained by GERT are examined using the PERT-path approach.
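The GERT analysis itself is beyond a short sketch, but the PERT-path idea the abstract ends with can be shown directly: each activity gets a three-point estimate, and means and variances sum along a serial path. The three activities below are hypothetical:

```python
def pert_estimate(a, m, b):
    """Classic PERT three-point estimate for one activity:
    mean (a + 4m + b)/6 and variance ((b - a)/6)^2."""
    return (a + 4.0 * m + b) / 6.0, ((b - a) / 6.0) ** 2

def path_stats(activities):
    """Expected duration and variance of a serial path: the sums of the
    per-activity PERT means and variances."""
    means, variances = zip(*(pert_estimate(*act) for act in activities))
    return sum(means), sum(variances)

# hypothetical three-activity path: (optimistic, most likely, pessimistic)
mean, var = path_stats([(2, 4, 6), (1, 2, 9), (3, 3, 3)])
```

GERT generalises this by allowing probabilistic branching and loops in the network, which plain PERT cannot represent.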

16.
A major challenge for crash failure analysis of laminated composites is to find a modelling approach which is both sufficiently accurate (for example, able to capture delaminations) and computationally efficient enough to allow full-scale vehicle crash simulations. Addressing this challenge, we propose a methodology based on an equivalent single-layer shell formulation which is adaptively refined through the thickness to capture initiating and propagating delaminations. To be specific, single shell elements through the laminate thickness are locally and adaptively enriched using the extended finite element method, such that delaminations can be explicitly modelled without having to be represented by separate elements. Furthermore, the shell formulation is combined with a stress recovery technique which increases the accuracy of predicting delamination initiation. The paper focuses on the parameters associated with identifying, introducing and extending the enrichment areas, especially on the impact of these parameters on the resulting structural deformation behaviour. We conclude that the delamination enrichment must be large enough to allow the fracture process to be accurately resolved, and we propose a suitable approach to achieve this. The proposed methodology for adaptive delamination modelling shows potential for being computationally efficient, and thereby it has the potential to enable efficient and accurate full-vehicle crash simulations of laminated composites. Copyright © 2017 John Wiley & Sons, Ltd.
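The adaptive-enrichment bookkeeping can be caricatured in a few lines; the stress criterion, the 80% threshold and the one-dimensional element numbering below are illustrative assumptions, not the authors' implementation:

```python
def mark_for_enrichment(stresses, strength, margin=0.8):
    """Toy adaptive criterion: flag elements whose recovered interlaminar
    stress exceeds a fraction `margin` of the interface strength, then grow
    the set by one neighbour on each side (1-D element row assumed) so the
    delamination enrichment stays ahead of the crack front."""
    flagged = {i for i, s in enumerate(stresses) if s >= margin * strength}
    grown = set(flagged)
    for i in flagged:
        grown.update({max(i - 1, 0), min(i + 1, len(stresses) - 1)})
    return sorted(grown)
```

Only the returned elements would receive the XFEM through-thickness enrichment; the rest stay as cheap equivalent single-layer shells.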

17.
Bayesian forecasting models provide distributional estimates for random parameters, and relative to classical schemes, have the advantage that they can rapidly capture changes in nonstationary systems using limited historical data. Unlike deterministic optimization, stochastic programs explicitly incorporate distributions for random parameters in the model formulation, and thus have the advantage that the resulting solutions more fully hedge against future contingencies. In this paper, we exploit the strengths of Bayesian prediction and stochastic programming in a rolling-horizon approach that can be applied to solve real-world problems. We illustrate the methodology on an employee production scheduling problem with uncertain up-times of manufacturing equipment and uncertain production rates. Computational results indicate the value of our approach.
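A compressed sketch of the rolling-horizon idea (conjugate normal updating of a demand belief plus scenario-based cost minimization; the numbers and the simple over/under cost structure are hypothetical, far simpler than the paper's scheduling problem):

```python
import random
import statistics

def posterior_update(mu0, tau0_sq, obs, sigma_sq):
    """Conjugate normal update of the belief about mean demand."""
    prec = 1.0 / tau0_sq + 1.0 / sigma_sq
    mu = (mu0 / tau0_sq + obs / sigma_sq) / prec
    return mu, 1.0 / prec

def stage_decision(mu, tau_sq, sigma_sq, c_over=1.0, c_under=3.0,
                   n_scen=2000, seed=0):
    """Choose a production level minimizing expected over/under-production
    cost over scenarios sampled from the Bayesian predictive distribution."""
    rng = random.Random(seed)
    sd = (tau_sq + sigma_sq) ** 0.5
    scen = [rng.gauss(mu, sd) for _ in range(n_scen)]
    candidates = [mu + 0.5 * k for k in range(-10, 11)]
    def cost(q):
        return statistics.fmean(c_over * max(q - d, 0.0) +
                                c_under * max(d - q, 0.0) for d in scen)
    return min(candidates, key=cost)

mu, tau_sq = 10.0, 4.0                  # prior belief about mean demand
for demand in (12.0, 13.0, 12.5):       # rolling horizon: plan, observe, update
    q = stage_decision(mu, tau_sq, sigma_sq=1.0)
    mu, tau_sq = posterior_update(mu, tau_sq, demand, sigma_sq=1.0)
```

Because under-production costs more than over-production here, the scenario solution hedges by planning above the predicted mean, which is the point of combining the two methods.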

18.
Blindfolded or disoriented people tend to walk in circles rather than in a straight line, even when they intend to walk straight. Here, we use a minimalistic walking model to examine this phenomenon. The bipedal spring-loaded inverted pendulum exhibits asymptotically stable gaits with centre-of-mass (CoM) dynamics and ground reaction forces similar to human walking in the sagittal plane. We extend this model into three dimensions, and show that stable walking patterns persist if the leg is aligned with respect to the body (here: the CoM velocity) instead of a world reference frame. Further, we demonstrate that asymmetric leg configurations, which are common in humans, will typically lead to walking in circles. The diameter of these circles depends strongly on the parameter configuration, but is in line with empirical data from human walkers. Simulation results suggest that the walking radius and especially the direction of rotation are highly dependent on leg configuration and walking velocity, which explains the inconsistent veering behaviour in repeated trials of human data. Finally, we discuss the relation between the findings in the model and implications for human walking.
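The geometric core of the veering effect can be shown with a purely kinematic sketch (this replaces the paper's spring-loaded inverted pendulum dynamics with a fixed per-step heading bias; all parameter values are illustrative): an asymmetric turn per left/right step bends the path into a circle whose radius follows from the chord geometry:

```python
import math

def walk(n_steps, step_len, turn_left, turn_right):
    """Kinematic veering sketch: the heading changes by turn_left after left
    steps and turn_right after right steps; any asymmetry between the two
    bends the path into a circle."""
    x = y = heading = 0.0
    pts = []
    for k in range(n_steps):
        heading += turn_left if k % 2 == 0 else turn_right
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        pts.append((x, y))
    return pts

def circle_radius(step_len, net_turn_per_step):
    """Each step is a chord subtending the net turn: R = L / (2*sin(d/2))."""
    return step_len / (2.0 * math.sin(abs(net_turn_per_step) / 2.0))

# net turn (0.03 - 0.01)/2 = +0.01 rad per step with a 0.7 m step
pts = walk(628, step_len=0.7, turn_left=0.03, turn_right=-0.01)
R = circle_radius(0.7, 0.01)                      # emergent circle radius
far = max(math.hypot(px, py) for px, py in pts)   # roughly the circle diameter
```

Small changes in the per-step bias change R dramatically, consistent with the strong parameter dependence the abstract reports.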

19.
The use of quality function deployment (QFD) to aid decision making in product planning has gained extensive international attention, but current QFD approaches are unable to cope with complex product planning (CPP), which involves multiple engineering characteristics (ECs) associated with significant uncertainty. To tackle this difficulty, fuzzy set theory is embedded into a QFD framework in this paper, and a novel fuzzy QFD program modelling approach to CPP is proposed to optimize the values of ECs by taking design uncertainty and financial considerations into account. In the proposed methodology, fuzzy set theory is used to account for design uncertainty, and the method of imprecision (MoI) is employed to perform multiple-attribute synthesis, generating a family of synthesis strategies by varying the value of s, which indicates different compensation levels among the ECs. The proposed methodology allows QFD practitioners to control the distribution of their development budget by presetting the value of s to determine the compensation levels among the ECs. An illustrative example of improving the design quality of a motor car is provided to demonstrate the application and performance of the modelling approach.
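The role of the compensation parameter s can be illustrated with a weighted power mean, one common way to realise MoI-style aggregation (the scores and weights below are invented, and this is only a sketch of the aggregation step, not the paper's full fuzzy QFD programme):

```python
def weighted_power_mean(values, weights, s):
    """MoI-style aggregation: a weighted power mean whose exponent s sets the
    compensation level among engineering characteristics (large negative s
    approaches the non-compensating worst case; s = 1 is the fully
    compensating weighted average)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    if s == 0:                       # geometric-mean limit
        prod = 1.0
        for v, w in zip(values, weights):
            prod *= v ** w
        return prod
    return sum(w * v ** s for v, w in zip(values, weights)) ** (1.0 / s)

scores = [0.9, 0.4, 0.7]            # hypothetical normalized EC satisfaction levels
w = [0.5, 0.3, 0.2]
strict = weighted_power_mean(scores, w, s=-10)    # near worst-case aggregation
balanced = weighted_power_mean(scores, w, s=1)    # weighted average
```

Lowering s pulls the aggregate toward the weakest EC, which is how presetting s steers where the development budget must go.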

20.
In order to perform a fatigue-life analysis of structures, the parameters of the structure's loading spectra must be assessed. If the load time series are counted using a two-parametric rainflow counting method, the structure's loading spectrum provides the probability of occurrence of a load-cycle with given amplitude and mean values. It is beneficial for the prediction of fatigue life to describe the loading spectrum by a continuous function. We have previously found that mixtures of Gaussian probability density functions can be used to model the loading spectra. The main problems of this approach that have not been satisfactorily resolved before are related to the estimation of the number of components in the applied mixture models, and to the modelling of load-cycle distributions with relatively heavy tails. In this article, we describe a method for estimating the parameters of mixture models which allows automatic determination of the number of components in a mixture model. The presented method is applied to modelling simulated and measured loading spectra using mixtures of multivariate Gaussian or t probability density functions. We also show that a mixture of t probability density functions sometimes describes the loading spectra better than a mixture of Gaussian probability density functions.
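A stdlib-only sketch of the component-count question: fit one-dimensional Gaussian mixtures by EM and let BIC choose the number of components. This is a generic stand-in for the article's (unspecified) estimation method, and the data are synthetic:

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def em_gmm_1d(data, k, iters=100):
    """EM fit of a k-component 1-D Gaussian mixture (quantile initialisation).
    Returns (weights, means, variances, log-likelihood)."""
    data = sorted(data)
    n = len(data)
    mu = [data[(2 * i + 1) * n // (2 * k)] for i in range(k)]
    var = [max(1e-3, (data[-1] - data[0]) ** 2 / (4.0 * k * k))] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        resp = []                       # E-step: responsibilities
        for x in data:
            p = [w[j] * normal_pdf(x, mu[j], var[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        for j in range(k):              # M-step: refit each component
            nj = sum(r[j] for r in resp)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = max(1e-3, sum(r[j] * (x - mu[j]) ** 2
                                   for r, x in zip(resp, data)) / nj)
            w[j] = nj / n
    ll = sum(math.log(sum(w[j] * normal_pdf(x, mu[j], var[j]) for j in range(k)))
             for x in data)
    return w, mu, var, ll

def bic(ll, k, n):
    """BIC for a 1-D k-component mixture: 3k - 1 free parameters, lower is better."""
    return (3 * k - 1) * math.log(n) - 2.0 * ll

rng = random.Random(3)
data = ([rng.gauss(0.0, 1.0) for _ in range(150)] +
        [rng.gauss(6.0, 1.0) for _ in range(150)])   # a true two-component spectrum
_, mu2, _, ll2 = em_gmm_1d(data, 2)
_, _, _, ll1 = em_gmm_1d(data, 1)
best_k = min((1, 2, 3), key=lambda k: bic(em_gmm_1d(data, k)[3], k, len(data)))
```

Replacing the Gaussian component density with a t density (heavier tails, one extra degrees-of-freedom parameter per component) gives the heavy-tailed variant the abstract compares against.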


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号