Similar Literature
20 similar records found
1.
This paper provides a general treatment of statistical inference for the reliability in copula-based stress-strength models. Most of the current literature is either focused on specific models that yield clean formulas or restricted to estimation and engineering aspects without addressing statistical inference. We present two general frameworks, one parametric, one nonparametric, for the estimation of the reliability. The parametric methodology is presented under the general framework of estimating equations, mostly as a combination of existing methodologies from the fields of multivariate analysis, reliability, and econometrics, with some new results. The nonparametric methodology is a novel application based on an existing bivariate kernel method combined with Monte Carlo estimation of the reliability without specification of the copula or the margins. We present results from a small simulation study designed to assess the robustness of the methods discussed in terms of model misspecification. We used geotechnical data and data from the Brazilian Household Survey to illustrate the proposed methodologies in the estimation of factors of safety and financial fragility.
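As a concrete illustration of the Monte Carlo idea, the sketch below estimates R = P(X < Y) for dependent stress X and strength Y. Everything here is a stand-in assumption, not taken from the paper: a Gaussian copula with a hypothetical correlation and hypothetical lognormal/Weibull margins.

```python
# Minimal sketch: Monte Carlo estimate of R = P(X < Y) under a Gaussian copula.
# The copula correlation and both margins are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rho = 0.4                       # hypothetical copula correlation
n = 100_000

# Sample the Gaussian copula: correlated normals -> uniforms.
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)

# Hypothetical margins: lognormal stress, Weibull strength.
x = stats.lognorm.ppf(u[:, 0], s=0.3, scale=1.0)       # stress
y = stats.weibull_min.ppf(u[:, 1], c=2.5, scale=2.0)   # strength

R_hat = np.mean(x < y)          # reliability estimate
se = np.sqrt(R_hat * (1 - R_hat) / n)
print(f"R ≈ {R_hat:.4f} ± {2 * se:.4f}")
```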

2.
The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which “experimental” data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport “universe”, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new “experiments” within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies.
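A minimal sketch of the workflow's shape, with a one-dimensional stand-in universe: the `truth` and `model` functions, noise levels, and kernel choices below are all illustrative assumptions; a Gaussian process (one of the two UQ methods named in the abstract) is fit to the experiment-model discrepancy and then asked to predict a new "experiment".

```python
# Minimal manufactured-universe sketch: truth -> noisy "experiments" ->
# imperfect model -> GP on the discrepancy -> assess a new prediction.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
truth = lambda s: np.sin(3 * s) + 0.5 * s      # manufactured reality (assumed)
model = lambda s: 0.5 * s                      # deliberately imperfect model

s_train = np.linspace(0, 2, 12)[:, None]
y_exp = truth(s_train.ravel()) + rng.normal(0, 0.05, 12)  # "experimental" data

# Learn the discrepancy between experiment and imperfect model.
gp = GaussianProcessRegressor(RBF(0.5) + WhiteKernel(0.05**2), normalize_y=True)
gp.fit(s_train, y_exp - model(s_train.ravel()))

# Predict a new "experiment" with an uncertainty band, then check coverage.
s_new = np.array([[1.3]])
d_mean, d_std = gp.predict(s_new, return_std=True)
pred = model(s_new.ravel()) + d_mean
print(f"predicted {pred[0]:.3f} ± {2 * d_std[0]:.3f}, truth {truth(1.3):.3f}")
```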

3.
In this paper two multivariate statistical methodologies are compared in order to estimate a multi‐input multi‐output transfer function model in an industrial polymerization process. In such contexts, process variables are usually autocorrelated (i.e. there is time-dependence between observations), posing problems for classical linear regression models. The two methodologies compared are both related to the analysis of multivariate time series: the Box‐Jenkins methodology and partial least squares time series. They are compared on several criteria: the simplicity of the process modeling (i.e. the steps of identification, estimation and validation of the model), the usefulness of the graphical tools, the goodness of fit, and the parsimony of the estimated models. Real data from a polymerization process are used to illustrate the performance of the methodologies under study.
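For readers unfamiliar with the Box‐Jenkins side, here is a minimal identify-estimate-validate loop on synthetic autocorrelated data, a stand-in for the process variables; the AR(1) structure and parameter values are illustrative, not from the paper.

```python
# Minimal Box-Jenkins sketch with statsmodels: estimate an AR(1) model on
# synthetic autocorrelated data, then validate on the residuals.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(2)
e = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):             # AR(1) process with phi = 0.7
    y[t] = 0.7 * y[t - 1] + e[t]

fit = ARIMA(y, order=(1, 0, 0)).fit()   # estimation step
print(fit.params)                       # ≈ [const, 0.7, sigma2]
# Validation step: residuals should be white noise (high Ljung-Box p-values).
print(acorr_ljungbox(fit.resid, lags=[10]))
```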

4.
A large metabolomics study was performed on 600 plasma samples taken at four time points before and after a single intake of a high-fat test meal by obese and lean subjects. All samples were analyzed by a liquid chromatography-mass spectrometry (LC-MS) lipidomic method for metabolic profiling. A pragmatic approach combining several well-established statistical methods was developed for processing this large data set in order to detect small differences in metabolic profiles against a background of large biological variation. Such metabolomics studies require a careful analytical and statistical protocol. The strategy included data preprocessing, data analysis, and validation of statistical models. After several data preprocessing steps, partial least-squares discriminant analysis (PLS-DA) was used for finding biomarkers. To validate the biomarkers statistically, the PLS-DA models were validated by means of a permutation test, biomarker models, and noninformative models. Univariate plots of potential biomarkers were used to obtain insight into up- or downregulation. The strategy proposed proved applicable for dealing with large-scale human metabolomics studies.
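A minimal sketch of PLS-DA with a permutation test, in the spirit of the validation strategy described rather than the authors' exact pipeline; the feature matrix, group sizes, number of components, and effect size are all stand-in assumptions.

```python
# Minimal PLS-DA + permutation test: compare the cross-validated Q2 of the
# real labels against a null distribution built from shuffled labels.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 200))              # stand-in LC-MS feature matrix
y = np.repeat([0, 1], 30).astype(float)     # e.g. lean vs obese labels
X[y == 1, :5] += 0.8                        # a few weak "biomarkers"

def q2(X, y):
    """Cross-validated predictive ability of a 2-component PLS-DA model."""
    y_hat = cross_val_predict(PLSRegression(n_components=2), X, y, cv=6).ravel()
    return 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

q2_obs = q2(X, y)
# Permutation test: refit with shuffled labels to build the null distribution.
q2_null = np.array([q2(X, rng.permutation(y)) for _ in range(100)])
p = (1 + np.sum(q2_null >= q2_obs)) / 101
print(f"Q2 = {q2_obs:.3f}, permutation p ≈ {p:.3f}")
```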

5.
Pesticides are ubiquitous in the environments of many rural communities due to drift from agricultural applications and home/garden use. Studies of childhood leukaemia have predominantly relied on retrospective pesticide exposure assessment and parental recall of use or of proximity to fields or pesticide applications. Sample size requirements mostly preclude the prospective collection of individual-level exposure information, biomarkers or environmental measurements of pesticides in cohorts. Yet such measures can be used in nested case-control approaches or for validating exposure models that can be applied to large populations. Recently developed models incorporate geographic information system technology and environmental databases of pesticide and/or crop data to assess exposure. Models developed in California, which estimate residential exposures by linking addresses to agricultural pesticide application data and land-use maps, are presented. Results from exposure validation and simulation studies and exposure measurement error issues are discussed.

6.
Application of metabonomics to nutritional sciences, also termed nutrimetabonomics, offers the possibility of measuring metabolic responses associated with the consumption of specific nutrients and foods. As dietary differences generally lead only to subtle metabolic changes, measuring diet-associated metabolic phenotypes is a challenge, and also an opportunity to develop and test new chemometric strategies that can highlight metabolic information related to different dietary habits. While multivariate statistical techniques have long been used to analyse dietary data from diet records and questionnaires, to date no attempt had been made to link dietary patterns with metabolic profiles. Using a three-step strategy, it was possible to merge 1H NMR plasma metabolic profile data with specific dietary patterns as assessed by Principal Component Analysis (PCA) and Partial Least Squares-Discriminant Analysis (PLS-DA). Applying PCA to the food frequency questionnaire data yielded five dietary patterns (energy intake, plant- versus animal-based diet, “traditional” versus sugar-rich diet, “traditional” versus “modern” diets, and consumption of skim versus whole dairy products) that together explained 50% of the variation. Metabolic phenotypes associated with these dietary patterns were obtained by PLS-DA and were mainly based on differences in lipid and amino acid profiles in plasma. This new approach to assessing relationships between dietary intake and metabolic profiling data will allow greater strides in merging nutritional epidemiology with metabonomics.
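The first step of such a strategy, extracting dietary patterns from food-frequency-questionnaire data by PCA, looks roughly like the sketch below; the subject count, item count, and data distribution are stand-in assumptions.

```python
# Minimal sketch of dietary-pattern extraction: standardize the FFQ matrix,
# run PCA, and inspect how much variation the leading components explain.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
ffq = rng.poisson(3, size=(150, 40)).astype(float)  # 150 subjects x 40 food items

pca = PCA(n_components=5).fit(StandardScaler().fit_transform(ffq))
print(pca.explained_variance_ratio_.sum())  # fraction of variation explained
# Each row of pca.components_ is a candidate dietary pattern; the subjects'
# pattern scores would then be linked to 1H NMR profiles via PLS-DA.
```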

7.
Rate‐dependent models require creep or mechanical tests at various strain rates in order to be identified and validated. Different geometries coexist for creep and static tests (normative geometry) and for dynamic tests. Therefore, owing to these geometrical differences between samples, experimental results can be inconsistent for identification or validation purposes, inducing, for example, differences in the shear modulus due solely to the change of geometry. The objective of this work is to present an improved sample geometry that allows consistent mechanical test results to be obtained at various strain rates, highlighting the rate dependencies of laminates. In particular, a complete mechanical validation of the sample geometry for dynamic tests is successfully performed in order to avoid inconsistency. Results of static and dynamic tests on the validated geometry are analysed, and the rate dependency of the elastic properties of the UD T700GC/M21 mesoscopic ply is highlighted over a wide strain rate range (10⁻³ to 10² s⁻¹). Finally, the identification of a non‐linear viscoelastic model is performed on dynamic and creep test results in order to obtain a representative model for dynamic, static and creep loadings, and to demonstrate the importance of the improved geometry for the dynamic tests.

8.
Material models for steels, used widely in numerical simulations of manufacturing chains, require identification of their coefficients on the basis of measurements obtained from laboratory tests. The precision of this identification strongly influences modelling reliability. This is especially visible for phase transformation models, which are crucial in predicting the properties of modern Advanced High Strength Steels (AHSS) after heat treatment. However, identification of phase transformation models for steels based on dilatometric tests presents serious difficulties. Two problems are investigated in the paper: (i) the efficiency of the inverse algorithms used for identification of phase transformation models, and (ii) the final reliability of the identified models in numerical simulations of manufacturing processes. Two phase transformation models were selected as examples. The first was a modified JMAK (Johnson-Mehl-Avrami-Kolmogorov) equation. The second was an upgrade of the Leblond equation, in which a second-order derivative with respect to time was introduced. The identification was performed by coupling the selected model with nature-inspired optimization techniques and performing inverse analysis on the experimental data. Dilatometric tests performed at various cooling rates supplied the data for the inverse analysis. Finally, validation of the identified models against industrial data is presented.
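To make the identification idea concrete, the sketch below fits the classical (unmodified) JMAK equation X(t) = 1 − exp(−k tⁿ) to transformed-fraction data by inverse analysis; the synthetic data, noise level, and true coefficients stand in for the dilatometric measurements and are not the paper's values.

```python
# Minimal inverse-analysis sketch: least-squares fit of the JMAK kinetics
# X(t) = 1 - exp(-k * t**n) to noisy transformed-fraction "measurements".
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.1, 50, 40)
k_true, n_true = 0.002, 2.2                       # hypothetical true values
X_meas = 1 - np.exp(-k_true * t**n_true)
X_meas += np.random.default_rng(5).normal(0, 0.01, t.size)  # measurement noise

def residuals(p):
    k, n = p
    return (1 - np.exp(-k * t**n)) - X_meas

sol = least_squares(residuals, x0=[0.01, 1.5], bounds=([1e-6, 0.5], [1.0, 4.0]))
print(sol.x)   # recovers ≈ [0.002, 2.2]
```

The paper couples such a model with nature-inspired optimizers instead of gradient-based least squares, which helps when the objective has multiple local minima.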

9.
One of the major outcomes of the human genome project is the investigation of the molecular mechanisms of disease. Identification of the genes involved in a specific pathological process, and characterization of their interactions, is of fundamental importance for supporting drug design. Discovery of targets and their experimental validation is a critical step in the development of new drugs. New experimental methods for gene expression analysis, such as microarray technology, allow the concurrent evaluation of the expression of multiple genes. The outcome of these methods requires subsequent validation of gene function using in vitro or in vivo models. In the last decade, one of the most promising methodologies for the investigation of gene function has relied upon antisense oligonucleotides (ASO). The crucial step in antisense experiment design is the characterization of the nucleotide domains that can be efficiently targeted by this kind of synthetic molecule. At present, no standardized procedures for target selection are available. In this paper, we propose an integrative approach to ASO target selection: the proposed tool Automatic Gene Walk (AgeWa) combines a neural filter with database mining for the prediction of the optimal target for antisense action.

10.
This paper illustrates inverse reliability design techniques for exponential and Weibull stress-strength models. Expressions for the factor of safety, the mean strength and its variability have been derived for a given range of target reliability and external load conditions. In the case of the exponential model a global solution has been found, whereas in the case of the Weibull model only some specific cases have been considered. The difficulty of obtaining a global solution for the Weibull model is discussed, together with possible methods to attain it.
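For orientation, the exponential case admits the standard closed form below (a textbook result consistent with, but not quoted from, the paper); it shows why the required factor of safety grows rapidly with the target reliability R:

```latex
% Independent exponential stress X with mean \mu_x and strength Y with mean \mu_y:
R \;=\; P(Y > X)
  \;=\; \int_0^\infty e^{-x/\mu_y}\,\frac{1}{\mu_x}\,e^{-x/\mu_x}\,dx
  \;=\; \frac{\mu_y}{\mu_x + \mu_y},
\qquad
n \;=\; \frac{\mu_y}{\mu_x} \;=\; \frac{R}{1-R}.
```

For example, a target reliability of R = 0.99 requires a mean strength 99 times the mean load in this model.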

11.
Radioactive waste can cause environmental hazards after disposal. It is therefore important that safety assessment methodologies be developed and established to study and estimate the possible hazards and to prevent their evolution. Spent sealed sources are a specific type of radioactive waste. According to the IAEA definition, spent sealed sources are sources no longer used because of activity decay, damage, misuse, loss, or theft. Accidental exposure of humans to spent sealed sources can occur from the moment they become spent until their disposal. For this reason, safety assessment methodologies were tailored to suit the management of spent sealed sources. To provide understanding and confidence in this study, a validation analysis was undertaken by considering an accident that occurred in Egypt in June 2000 (the Meet-Halfa accident, involving an iridium-192 source). This work covers considerations related to the safety assessment of spent sealed sources: the assessment context, the processes that lead an active source to become spent, accident scenarios, mathematical models for dose calculations, radiological consequences, and regulatory criteria. It also includes a validation study, carried out by evaluating a theoretical scenario against the real Meet-Halfa accident scenario, based on the clinical assessment of the affected individuals.

12.
A historical and technical overview of a paradigm for automating research procedures in the area of constitutive identification of composite materials is presented. Computationally controlled, multiple degree-of-freedom robotic mechatronic systems are used to accelerate data-collecting experiments along loading paths defined in multidimensional loading spaces. The collected data are used for the inexpensive, data-driven determination of bulk material non-linear constitutive behavior models under generalized loading, through parameter identification/estimation methodologies based on the inverse approach. The concept of dissipated energy density is utilized as the representative encapsulation of the non-linear part of the constitutive response, which is responsible for the irreversible character of the overall behavior. Demonstrations of this paradigm are given for polymer matrix composite material systems. Finally, this computational and mechatronic infrastructure is used to create conceptual, analytical and computational models for describing and predicting material and structural performance.

13.
Feature-based process planning has been popular in academia and industry owing to its ability to rigorously integrate design and manufacturing. In spite of this benefit, feature-based process planning has been difficult to implement because it is not easy to extract even simple features from design data automatically and efficiently. Furthermore, it is even more difficult to construct the temporal precedence relationships among the extracted features with regard to plan alternatives and sequence options. The objective here is to present elaboration and validation methodologies for AND/OR graph-based non-linear process plans that help users construct, validate or modify a controller-friendly process plan easily and efficiently. An elaborated process plan created by a CAPP system or an experienced process planner should be validated with respect to various criteria, such as the shape of the finished part, feature interaction, feature interference, feature manufacturability and flexibility for shop floor control. For each validation criterion, the invalid indicator matrix proposed here can quantify the degree of invalidity of the process plan, helping the process planner rapidly fix and refine the elaborated plan. The proposed methodologies will save cost and time in producing a controller-friendly process plan; a basic consistency check is sketched below. Initial results are promising.
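The invalid indicator matrix is the paper's own construct; as a generic stand-in, the sketch below shows the most basic validity requirement on temporal precedence relationships, namely that they admit at least one feasible machining sequence. The feature names and precedences are hypothetical.

```python
# Minimal validity check on a feature precedence graph: a plan is infeasible
# if the precedence relation contains a cycle (no sequence can satisfy it).
from graphlib import TopologicalSorter, CycleError

precedence = {           # hypothetical precedences: values must come before key
    "pocket": {"face"},
    "hole":   {"pocket"},
    "slot":   {"face"},
}
try:
    order = list(TopologicalSorter(precedence).static_order())
    print("a valid machining sequence:", order)
except CycleError as e:
    print("invalid plan, cyclic precedence:", e.args[1])
```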

14.
15.
Detection and inhibition of bacteria are universally required in clinics and in daily life for health care. Developing a dual‐functional material for both defined bioanalysis and targeted biotoxicity is challenging and in demand. Herein, magnetic silver nanoshells are designed as a multifunctional platform for the detection and inhibition of bacteria. The optimized magnetic silver nanoshells enable direct laser desorption/ionization mass spectrometry based metabolic analysis of bacteria (≈10 µL⁻¹) in complex biofluids. The serum infection process (0–10 h) is monitored statistically toward clinical classification. Moreover, magnetic silver nanoshells adhere to bacterial surfaces owing to nanoscale surface roughness and thus display long‐term antibacterial effects. Bacterial metabolism is studied through metabolic biomarkers (e.g., malate and lysine) identified during inhibition, revealing cell membrane destruction and dysfunctional protein synthesis mechanisms. This work not only guides the design of material-based approaches for bioanalysis and biotoxicity, but also contributes to bacteria-related diagnosis by using specific metabolic biomarkers for sensitive detection, and offers new insights by monitoring the metabolomic changes of bacteria for antibacterial applications.

16.
This paper presents a hybrid methodology for conceptual design of large systems with the goal of enhancing system reliability. It integrates the features of several design methodologies and maintenance planning concepts with the traditional reliability analysis. The methodology considers the temporal quality characteristic “reliability” as the main objective and determines the optimal system design. Key ideas from several design methodologies, namely axiomatic design, robust design, and the theory of inventive problem solving, have been integrated with the functional prioritization framework provided by reliability-centered maintenance. A case study of the conceptual design of a multiphase pumping station for crude oil production is presented. The methodology provides a new design tool for determining system configurations with enhanced reliability taking into account maintenance resources and variability.

17.
A fault diagnosis method based on reliability theory is proposed, aimed at accurate identification of faults. The reliability theory used includes life data analysis, the probability distribution interference model, and system reliability models. Mathematical models for applying reliability theory to fault diagnosis are given, together with an application example on a turbo-generator set.
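The probability-distribution interference (stress-strength) model mentioned above has a well-known closed form in the normal-normal case; here is a minimal sketch with hypothetical load and strength parameters, not taken from the paper.

```python
# Minimal stress-strength interference sketch for normal load L and strength S:
# P(failure) = P(L > S) = Phi(-beta), beta = (mu_S - mu_L) / sqrt(sd_S^2 + sd_L^2).
from scipy.stats import norm

mu_S, sd_S = 500.0, 40.0    # hypothetical strength mean / std (MPa)
mu_L, sd_L = 350.0, 50.0    # hypothetical load mean / std (MPa)

beta = (mu_S - mu_L) / (sd_S**2 + sd_L**2) ** 0.5   # reliability index
print(f"reliability R = {norm.cdf(beta):.5f}, P(failure) = {norm.sf(beta):.2e}")
```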

18.
Structural dynamic modeling errors, which at times are difficult to eliminate in a structural FE model, can affect the accuracy and reliability of the vibro-acoustic FE models used for NVH design of cavities. A large number of methods have been proposed for structural finite element model updating. However, most studies have focused on structural dynamic applications, and no work has been reported on vibro-acoustic systems. The objective of this paper is to compare, through a simulated study, two recently proposed methodologies for vibro-acoustic FE model updating of cavities with weak acoustic coupling, in order to address structural dynamic modeling errors. These methodologies utilize a direct and an iterative method of model updating developed for purely structural systems. A simulated example of a 2D rectangular cavity with a flexible surface is presented, and cases of incomplete and noisy data are considered. The comparison is made on the basis of the accuracy of prediction of vibro-acoustic natural frequencies and responses both inside and outside the frequency range of interest. It is concluded that both methodologies give an accurate prediction of the vibro-acoustic natural frequencies and the response inside the updating frequency range; beyond this range, however, the predictions based on the direct updated vibro-acoustic models are not accurate. It is noted that the success of updating using IESM depends on correct knowledge of the modeling inaccuracies, uncertainties or approximations, and also on the choice of suitable updating parameters, which can be very challenging for complex cavities. Vibro-acoustic FE model updating using the direct method can be handy in situations where the iterative methods are difficult to apply effectively.

19.
Research on reliability modeling based on reliability block diagrams
To address the tedious modeling and difficult programming of traditional reliability simulation models, a reliability modeling workflow based on ExtendSim was designed, taking the reliability block diagram as its foundation. Using a two-unit parallel repairable system as an example, simulation models under the "repair upon fatal (system) failure" and "repair immediately upon unit failure" strategies were built and verified against analytical models. The simulation results show that the reliability models are credible, and the modeling approach is easy for engineers to master and has practical value for wider adoption.
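To make the simulation-versus-analytical comparison concrete, here is a minimal stand-in (plain Python rather than ExtendSim): a Monte Carlo simulation of the two-unit parallel repairable system under the repair-immediately-upon-failure policy with independent repair channels, checked against the analytical steady-state availability. The failure and repair rates are hypothetical.

```python
# Minimal CTMC simulation of a two-unit parallel repairable system.
# State = number of failed units in {0, 1, 2}; system is down only in state 2.
import numpy as np

rng = np.random.default_rng(6)
lam, mu = 0.01, 0.1        # hypothetical failure / repair rates (per hour)

def simulate(T=1_000_000.0):
    """Return simulated availability over horizon T."""
    t, n_failed, down_time = 0.0, 0, 0.0
    while t < T:
        rate = (2 - n_failed) * lam + n_failed * mu   # total event rate
        dt = rng.exponential(1.0 / rate)
        if n_failed == 2:
            down_time += dt
        t += dt
        # Next event is a failure with prob (2-n)*lam/rate, else a repair.
        if rng.random() < (2 - n_failed) * lam / rate:
            n_failed += 1
        else:
            n_failed -= 1
    return 1.0 - down_time / T

# Analytical check: independent units, each with availability mu / (lam + mu).
A_unit = mu / (lam + mu)
print("analytical availability:", 1 - (1 - A_unit) ** 2)
print("simulated availability :", simulate())
```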

20.
Validation of reliability computational models using Bayes networks
This paper proposes a methodology based on Bayesian statistics to assess the validity of reliability computational models when full-scale testing is not possible. Sub-module validation results are used to derive a validation measure for the overall reliability estimate. Bayes networks are used for the propagation and updating of validation information from the sub-modules to the overall model prediction. The methodology includes uncertainty in the experimental measurement, and the posterior and prior distributions of the model output are used to compute a validation metric based on Bayesian hypothesis testing. Validation of a reliability prediction model for an engine blade under high-cycle fatigue is illustrated using the proposed methodology.
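The full methodology propagates sub-module validation evidence through a Bayes network; the sketch below shows only the leaf-level ingredient, a Bayes-factor validation metric for a single prediction. The distributions and numbers are hypothetical, and the diffuse alternative is one common choice, not necessarily the paper's.

```python
# Minimal Bayes-factor sketch: H0 = "the observation is drawn from the model's
# predictive distribution", H1 = a diffuse alternative. B > 1 favors validity.
from scipy.stats import norm

mu_pred, sd_pred = 1.20, 0.05   # hypothetical model prediction (mean, std)
sd_exp = 0.03                   # experimental measurement uncertainty
y_obs = 1.23                    # observed value

sd0 = (sd_pred**2 + sd_exp**2) ** 0.5
B = norm.pdf(y_obs, mu_pred, sd0) / norm.pdf(y_obs, mu_pred, 10 * sd0)
print(f"Bayes factor = {B:.2f}")   # equals posterior odds under equal priors
```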
