Similar Documents
 A total of 20 similar documents were found (search time: 31 ms)
1.
This paper proposes a non-stationary random response analysis method for structures with uncertain parameters. The structural physical parameters and the input parameters are treated as random variables or interval variables. Using the pseudo-excitation method and the direct differentiation method (DDM), analytical expressions for the time-varying power spectrum and the time-varying variance of the structural response are obtained within a first-order perturbation framework. In addition, analytical expressions for the first-order and second-order partial derivatives (i.e., time-varying sensitivity coefficients) of the time-varying power spectrum and variance with respect to the uncertain parameters are also derived. On this basis, and using the perturbation technique, probabilistic and non-probabilistic methods are proposed for calculating the upper and lower bounds of the time-varying variance of the structural response. Finally, the effectiveness of the proposed method is demonstrated by numerical examples compared against Monte Carlo and vertex solutions.
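A minimal sketch of the bounding idea in this abstract, assuming a toy closed-form response variance in place of the pseudo-excitation result (the function and all numbers below are hypothetical): first-order perturbation bounds are built from sensitivity coefficients over the parameter intervals and checked against the vertex method.

```python
import itertools

def response_variance(k, c):
    # Hypothetical closed-form response variance of a toy system;
    # stands in for the time-varying variance from the pseudo-excitation method.
    return 1.0 / (k * c)

def perturbation_bounds(f, center, radii, h=1e-6):
    """First-order bounds: f(center) +/- sum_i |df/dtheta_i| * radius_i."""
    f0 = f(*center)
    spread = 0.0
    for i, r in enumerate(radii):
        up = list(center); up[i] += h
        dn = list(center); dn[i] -= h
        dfi = (f(*up) - f(*dn)) / (2 * h)   # central-difference sensitivity
        spread += abs(dfi) * r
    return f0 - spread, f0 + spread

def vertex_bounds(f, center, radii):
    """Evaluate f at every interval vertex (reference solution)."""
    corners = itertools.product(*[(c - r, c + r) for c, r in zip(center, radii)])
    vals = [f(*v) for v in corners]
    return min(vals), max(vals)

center, radii = [2.0, 0.5], [0.1, 0.02]
print(perturbation_bounds(response_variance, center, radii))
print(vertex_bounds(response_variance, center, radii))
```

For narrow intervals the first-order bounds track the vertex solution closely, which is the comparison the abstract's numerical examples make.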

2.
Decision-making under uncertainty describes most environmental remediation and waste management problems. Inherent limitations in knowledge concerning contaminants, environmental fate and transport, remedies, and risks force decision-makers to select a course of action based on uncertain and incomplete information. Because uncertainties can be reduced by collecting additional data, uncertainty and sensitivity analysis techniques have received considerable attention. When the costs associated with reducing uncertainty are considered in a decision problem, the objective changes: rather than determining what data to collect to reduce overall uncertainty, the goal is to determine what data to collect to best differentiate between possible courses of action or decision alternatives. Environmental restoration and waste management require cost-effective methods for characterization and monitoring, and these methods must also satisfy regulatory requirements. Characterization and monitoring activities imply that, sooner or later, a decision must be made about collecting new field data. Limited fiscal resources for data collection should be committed only to those data that have the most impact on the decision at the lowest possible cost. Applying influence diagrams in combination with data worth analysis produces a method that not only satisfies these requirements but also gives rise to an intuitive representation of complex structures not possible in the more traditional decision tree representation. This paper demonstrates the use of influence diagrams in data worth analysis by applying them to a monitor-and-treat problem often encountered in environmental decision problems.
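The data-worth logic can be sketched with a two-action, two-state monitor-or-treat toy problem (all costs and probabilities below are hypothetical, not from the paper): the expected value of perfect information (EVPI) caps what any data collection program is worth to the decision.

```python
# Costs (hypothetical, in $k) of each action in each plume state.
costs = {
    "monitor": {"migrates": 500.0, "stable": 50.0},
    "treat":   {"migrates": 200.0, "stable": 200.0},
}
p_migrates = 0.3   # prior probability the plume migrates

def expected_cost(action, p):
    return p * costs[action]["migrates"] + (1 - p) * costs[action]["stable"]

# Decision without new data: pick the action with the lower expected cost.
prior_costs = {a: expected_cost(a, p_migrates) for a in costs}
best_prior = min(prior_costs, key=prior_costs.get)

# With perfect information we would pick the best action in each state.
cost_perfect = (p_migrates * min(c["migrates"] for c in costs.values())
                + (1 - p_migrates) * min(c["stable"] for c in costs.values()))

# EVPI: the most that resolving the uncertainty could possibly be worth.
evpi = prior_costs[best_prior] - cost_perfect
print(best_prior, prior_costs[best_prior], evpi)
```

Field data costing more than the EVPI cannot change the decision profitably, which is the screening role data worth analysis plays in the influence-diagram framework.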

3.
2-(4-Morpholinyl)benzothiazole (24MoBT) exists in automobile tire rubber as an impurity of a vulcanization accelerator and has been proposed as a potential molecular marker of street runoff (Spies, R. B.; Andresen, B. D.; Rice, D. W., Jr. Nature 1987, 327, 697-699). The present paper describes an analytical method for 24MoBT in environmental samples (e.g., street dusts and river sediments) by gas chromatography. The method relies upon extraction with a 6:4 (v/v) mixture of benzene and methanol, purification by acid extraction and adsorption column chromatography, followed by determination using capillary GC equipped with a sulfur-selective detector (i.e., FPD). The recovery of 24MoBT for the entire procedure was 85%, and the relative standard deviation for four replicate analyses was 1.5%. The detection limit was 0.08 ng of injected 24MoBT, corresponding to 0.20 ng/g of dry sample. The selectivity and sensitivity of the present method permit the determination of 24MoBT at the trace levels (~ng/g) encountered in environmental samples. 24MoBT concentrations in various environmental samples are also reported.

4.
Genetically engineered bioreporters are an excellent complement to traditional methods of chemical analysis. The application of fluorescence flow cytometry to detection of bioreporter response enables rapid and efficient characterization of bacterial bioreporter population response on a single-cell basis. In the present study, intrapopulation response variability was used to obtain higher analytical sensitivity and precision. We have analyzed flow cytometric data for an arsenic-sensitive bacterial bioreporter using an artificial neural network-based adaptive clustering approach (a single-layer perceptron model). Results for this approach are far superior to other methods that we have applied to this fluorescent bioreporter (e.g., the arsenic detection limit is 0.01 microM, substantially lower than for other detection methods/algorithms). The approach is highly efficient computationally and can be implemented on a real-time basis, thus having potential for future development of high-throughput screening applications.
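A minimal sketch of the single-layer perceptron idea on synthetic data (this is not the authors' algorithm or data; the cluster centers, spreads, and learning rate are all invented for illustration): the perceptron learns a threshold separating induced from uninduced cell fluorescence.

```python
import random
random.seed(0)

# Synthetic single-cell log-fluorescence values: uninduced cells (label 0)
# cluster near 1.0, arsenic-induced cells (label 1) near 3.0 -- hypothetical.
data = ([(random.gauss(1.0, 0.4), 0) for _ in range(200)]
        + [(random.gauss(3.0, 0.4), 1) for _ in range(200)])
random.shuffle(data)

# Single-layer perceptron with one input: predict 1 if w*x + b > 0.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(20):                      # epochs
    for x, y in data:
        pred = 1 if w * x + b > 0 else 0
        w += lr * (y - pred) * x         # classic perceptron update rule
        b += lr * (y - pred)

accuracy = sum((1 if w * x + b > 0 else 0) == y for x, y in data) / len(data)
print(round(accuracy, 2))
```

On well-separated populations like these the perceptron converges to a near-perfect decision boundary; the cited study's gain comes from exploiting the full intrapopulation response distribution rather than a single summary statistic.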

5.
A rapid, sensitive, and specific assay for the detection and quantitation of (p-chlorophenyl)aniline (CPA) in biological samples was developed. The assay is based on rapid electrochemical oxidation of CPA to a dimerized product (1.0 V vs Pd) combined with the enhanced detection sensitivity of an electrospray mass spectrometer (ES/MS). A "head-to-tail" dimer ([M + H]+ at m/z 217) was the predominant species after electrochemical conversion of CPA. Optimal detection sensitivity and specificity for the CPA dimer present in a biological matrix (e.g., rat urine) were achieved through on-line electrochemistry (EC) coupled with high-performance liquid chromatography tandem mass spectrometry. No matrix-associated ion suppression was observed. The limit of detection (S/N approximately 6) was 20 ng/mL, and the limit of quantitation was 50 ng/mL. The calibration curve was quadratic over the range 50-2000 ng/mL with r2 > 0.99 in various biological matrixes. The assay was validated and used to study the biotransformation of p-chlorophenyl isocyanate (CPIC) to CPA in rats administered CPIC intraperitoneally (50 mg/kg). The present LC/EC/MS/MS assay of CPA brings important technical advantages to the risk assessment of new chemical entities that have the potential to produce anilines via biotransformation.

6.
Systems that are intelligent have the ability to sense their surroundings, analyze, and respond accordingly. In nature, many biological systems are considered intelligent (e.g., humans, animals, and cells). For man‐made systems, artificial intelligence is achieved by massively sophisticated electronic machines (e.g., computers and robots operated by advanced algorithms). On the other hand, freestanding materials (i.e., not tethered to a power supply) are usually passive and static. Hence, herein, the question is asked: can materials be fabricated so that they are intelligent? One promising approach is to use stimuli‐responsive materials; these "smart" materials use the energy supplied by a stimulus available from the surroundings to perform a corresponding action. After decades of research, many interesting stimuli‐responsive materials that can sense and perform smart functions have been developed. Classes of functions discussed include practical functions (e.g., targeting and motion), regulatory functions (e.g., self‐regulation and amplification), and analytical processing functions (e.g., memory and computing). The pathway toward creating truly intelligent materials can involve incorporating a combination of these different types of functions into a single integrated system by using stimuli‐responsive materials as the basic building blocks.

7.
8.
9.
It is well known that clutter (spectral interference) from atmospheric constituents can be a severe limit for spectroscopic point sensors, especially where high sensitivity and specificity are required. In this paper, we show for submillimeter/terahertz (SMM/THz) sensors that use cw electronic techniques that the clutter limit for the detection of common target gases with absolute specificity (probability of false alarm < 10^-10) is in the ppt (1 part in 10^12) range or lower. This is because the most abundant atmospheric gases are either transparent to SMM/THz radiation (e.g., CO2) or have spectra that are very sparse relative to the ~10^6 Doppler-limited resolution elements available (e.g., H2O). Moreover, the low clutter limit demonstrated for cw electronic systems in the SMM/THz is independent of system size and complexity.
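The sparseness argument can be made concrete with a back-of-envelope overlap calculation, assuming interferer lines fall independently and uniformly over the resolution elements (the 10^6-element and 1000-line figures are illustrative assumptions, not values from the paper):

```python
def collision_probability(n_elements, n_lines):
    """Probability that at least one of n_lines randomly placed interferer
    lines lands on the single target resolution element, assuming
    independent uniform placement over n_elements channels."""
    return 1.0 - (1.0 - 1.0 / n_elements) ** n_lines

# ~10^6 Doppler-limited channels, 1000 atmospheric interferer lines.
p = collision_probability(10**6, 1000)
print(f"{p:.2e}")
```

Even a thousand interfering lines leave the target channel clear with probability ~0.999, which is why sparse SMM/THz spectra support such low clutter-limited detection thresholds.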

10.
Environmental regulatory policy states a goal of "sound science." The practice of good science is founded on the systematic identification and management of uncertainties; i.e., knowledge gaps that compromise our ability to make accurate predictions. Predicting the consequences of decisions about risk and risk reduction at contaminated sites requires an accurate model of the nature and extent of site contamination, which in turn requires measuring contaminant concentrations in complex environmental matrices. Perfecting analytical tests to perform those measurements has consumed tremendous regulatory attention for the past 20-30 years. Yet, despite great improvements in environmental analytical capability, complaints about inadequate data quality still abound. This paper argues that the first generation data quality model that equated environmental data quality with analytical quality was a useful starting point, but it is insufficient because it is blind to the repercussions of multifaceted issues collectively termed "representativeness." To achieve policy goals of "sound science" in environmental restoration projects, the environmental data quality model must be updated to recognize and manage the uncertainties involved in generating representative data from heterogeneous environmental matrices.

11.
Addressing long-term potential human exposures to, and health risks from, contaminants in the subsurface environment requires the use of models. Because these models must project contaminant behavior into the future, and make use of highly variable landscape properties, there is uncertainty associated with predictions of long-term exposure. Many parameters used in both subsurface contaminant transport simulation and health risk assessment have variance owing to uncertainty and/or variability. These parameters are best represented by ranges or probability distributions rather than single values. Based on a case study with information from an actual site contaminated with trichloroethylene (TCE), we demonstrate the propagation of variance in the simulation of risk using a complex subsurface contaminant transport simulation model integrated with a multi-pathway human health risk model. Ranges of subsurface contaminant concentrations are calculated with the subsurface transport simulator T2VOC (using the associated code ITOUGH2 for uncertainty analysis) for a three-dimensional system in which TCE migrates in both the vadose and saturated zones over extended distances and time scales. The subsurface TCE concentration distributions are passed to CalTOX, a multimedia, multi-pathway exposure model, which is used to calculate risk through multiple exposure pathways based on inhalation, ingestion, and dermal contact. Monte Carlo and linear methods are used for the propagation of uncertainty owing to parameter variance. We demonstrate how rank correlation can be used to evaluate contributions to overall uncertainty from each model system. In this sample TCE case study, we find that although exposure model uncertainties are significant, subsurface transport uncertainties are dominant.
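The rank-correlation attribution step can be sketched with a toy two-factor risk model, stdlib only (the lognormal spreads are invented; in the study the "concentration" factor would come from T2VOC/ITOUGH2 and the "intake" factor from CalTOX):

```python
import random
random.seed(1)

def rank(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(xs, ys):
    """Spearman rank correlation (no ties assumed for float samples)."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    m = (n - 1) / 2                      # mean of ranks 0..n-1
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)  # identical for any permutation
    return cov / var

# Toy risk model: risk = concentration * intake; the wider lognormal
# (concentration, sigma=1.0) should dominate the output variance.
n = 2000
conc = [random.lognormvariate(0, 1.0) for _ in range(n)]    # transport model
intake = [random.lognormvariate(0, 0.2) for _ in range(n)]  # exposure model
risk = [c * i for c, i in zip(conc, intake)]

print(round(spearman(conc, risk), 2), round(spearman(intake, risk), 2))
```

The much larger rank correlation for the concentration factor mirrors the paper's finding that subsurface transport uncertainty dominates the exposure-model uncertainty.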

12.
A Bayesian approach to diagnosis and prognosis using built-in test
Accounting for the effects of test uncertainty is a significant problem in test and diagnosis, especially within the context of built-in test. Of interest here is how one assesses the level of uncertainty and then uses that assessment to improve diagnostics. One approach, based on measurement science, is to treat the probability of a false indication [e.g., a built-in-test (BIT) false alarm or missed detection] as the measure of uncertainty. Given the ability to determine such probabilities, a Bayesian approach to diagnosis, and by extension prognosis, suggests itself. In the following, we present a mathematical derivation for false indication and apply it to the specification of Bayesian diagnosis. We draw from measurement science, reliability theory, signal detection theory, and Bayesian decision theory to provide an end-to-end probabilistic treatment of the fault diagnosis and prognosis problem.
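The core Bayes update is easy to sketch (the prior, false-alarm, and missed-detection probabilities below are hypothetical, not from the paper): given the BIT's false-indication rates, each alarm or non-alarm updates the fault probability.

```python
def posterior_fault(prior, p_false_alarm, p_missed_detection, alarm):
    """Bayes update of the fault probability from one BIT indication.

    p_false_alarm:      P(alarm | no fault)
    p_missed_detection: P(no alarm | fault)
    """
    p_detect = 1.0 - p_missed_detection
    if alarm:
        num = p_detect * prior
        den = num + p_false_alarm * (1.0 - prior)
    else:
        num = p_missed_detection * prior
        den = num + (1.0 - p_false_alarm) * (1.0 - prior)
    return num / den

# Two consecutive BIT alarms against a 1% prior fault probability.
p = 0.01
for alarm in (True, True):
    p = posterior_fault(p, p_false_alarm=0.05, p_missed_detection=0.10,
                        alarm=alarm)
print(round(p, 3))
```

A single alarm from a BIT with a 5% false-alarm rate is far from conclusive at a 1% prior; it is the accumulation of indications, each weighted by its false-indication probabilities, that drives the diagnosis, which is the point of treating false indication as the uncertainty measure.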

13.
Kim Y, Amemiya S. Analytical Chemistry 2008, 80(15): 6056-6065
A highly sensitive analytical method is required for the assessment of nanomolar perchlorate contamination in drinking water, an emerging environmental problem. We developed a novel approach based on a voltammetric ion-selective electrode that enables the electrochemical detection of "redox-inactive" perchlorate at nanomolar levels without its electrolysis. The perchlorate-selective electrode is based on a submicrometer-thick plasticized poly(vinyl chloride) membrane spin-coated on a poly(3-octylthiophene)-modified gold electrode. The liquid membrane serves as the first thin-layer cell for ion-transfer stripping voltammetry, giving low detection limits of 0.2-0.5 nM perchlorate in deionized water, commercial bottled water, and tap water under a rotating-electrode configuration. These detection limits are not only much lower than the action limit (approximately 246 nM) set by the U.S. Environmental Protection Agency but also comparable to those of the most sensitive analytical methods for perchlorate, i.e., ion chromatography coupled with a suppressed conductivity detector (0.55 nM) or electrospray ionization mass spectrometry (0.20-0.25 nM). The mass transfer of perchlorate in the thin-layer liquid membrane and the aqueous sample, as well as its transfer at the interface between the two phases, was studied experimentally and theoretically to achieve the low detection limits. The advantages of ion-transfer stripping voltammetry with a thin-layer liquid membrane over traditional ion-selective potentiometry are demonstrated in terms of detection limit, response time, and selectivity.

14.
Abstract

Stability-indicating analytical methods are developed to monitor the stability of pharmaceutical dosage forms during the investigational phase of drug development and, once the drug is marketed, for the ongoing stability studies that must be conducted. The development of these methods for pharmaceutical dosage forms can be approached from several avenues. Methods can be developed that measure the amount of drug remaining, the amount of drug lost (or the appearance of degradation products), or both.

Traditionally, the analytical methods used to monitor the stability of dosage forms have involved a generally non-specific spectrophotometric or titrimetric procedure for the assay of the active coupled with thin layer chromatography for the estimation of impurities and degradation products. In the last five years, this approach has changed dramatically. Currently, the method of choice for the quantitation of the active and degradation products is rapidly becoming high performance liquid chromatography. This method has obvious advantages since it both separates and measures and it lends itself well to automation. The disadvantages are that, in the absence of automation, the technique can be time-consuming, it is by no means universal, and it is relatively expensive. Recent advances in column technology have reduced some separation times to seconds and, in the next few years, this technique may find even greater utility.

HPLC, however, is not the only way to go. Other chromatographic methods still find a place, particularly gas chromatography when the stability of the component of interest does not pose a problem, and thin layer chromatography for the rapid determination of degradation products. Other methods may also be used, including electrometric methods (e.g., polarography) and spectroscopic methods (e.g., fluorimetry or NMR). The choice of an appropriate method must depend on both a scientific and a practical evaluation of the drug and its dosage form.

Once an analytical method is chosen, the most important aspect of the development of a stability-indicating procedure is method validation. Validation should include evaluation of the following parameters: specificity, linearity, precision, accuracy, sensitivity, and ruggedness.

There are many other aspects to stability that could also be considered, e.g., the stability of the bulk drug and physical and organoleptic changes in a dosage form. These should be part of a separate discussion. It very often happens that, during the course of product development, analytical methods evolve. As more is learned about the drug and its dosage form, methods can be refined and revised.
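The validation parameters named above (linearity, sensitivity) can be illustrated with a small calibration computation, stdlib only (the concentration/response data are invented, and the 3.3·σ/slope detection-limit rule is the common ICH-style convention, not something stated in this abstract):

```python
# Hypothetical calibration data: concentration (ug/mL) vs detector response.
conc = [0.0, 10.0, 20.0, 40.0, 80.0]
resp = [0.02, 1.05, 1.98, 4.03, 7.96]

n = len(conc)
mx, my = sum(conc) / n, sum(resp) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, resp))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

resid = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5   # residual std error
lod = 3.3 * s_res / slope          # ICH-style limit of detection estimate
r2 = 1 - sum(r * r for r in resid) / sum((y - my) ** 2 for y in resp)

print(round(slope, 4), round(r2, 4), round(lod, 2))
```

A linearity check (r² here is ~0.9999) and a sensitivity estimate of this kind are the quantitative core of the specificity/linearity/precision/accuracy/sensitivity/ruggedness checklist the abstract prescribes.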

15.

16.
A new reliability methodology and tools have been created for setting reliability requirements. At the heart of the new methodology are reliability requirements based on specified minimum failure-free operating (MFFOP) intervals and a maximum acceptable level of the probability of premature failure. These types of requirements are suitable for industries where the consequences of failure and the cost of intervention for maintenance are very high (e.g., the deepwater offshore oil and gas industry). The proposed methodology includes models and tools for: (i) setting reliability requirements to limit the risk of premature failure below an acceptable level; (ii) setting reliability requirements to minimize the total losses; and (iii) setting reliability requirements to guarantee a set of MFFOP intervals. An advantage of the MFFOP approach is that it directly links the reliability requirements with health, safety, environmental, and business risks. Another advantage is that the MFFOP requirements are suitable for non-constant hazard rates, where the mean time to failure (MTTF) reliability measure is often misleading. A solution to the important problem of determining the maximum hazard rate that guarantees, with a required probability, the existence of a specified set of MFFOP intervals has also been found. The proposed reliability tools also permit the extraction of useful information from data sets containing a given number of random failures in cases where the failure times are unknown. Copyright © 2003 John Wiley & Sons, Ltd.
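For the constant-hazard special case the maximum-hazard-rate problem has a closed form (the general non-constant-hazard solution in the paper is more involved; the 5-year/95% numbers below are illustrative assumptions): survival over the MFFOP interval is exp(-λT), so requiring it to be at least the specified confidence bounds λ from above.

```python
import math

def max_hazard_rate(mffop, confidence):
    """Largest constant hazard rate lambda such that
    P(no failure within the MFFOP interval T) = exp(-lambda * T) >= confidence."""
    return -math.log(confidence) / mffop

# Require a 5-year failure-free interval with 95% probability.
lam = max_hazard_rate(mffop=5.0, confidence=0.95)
mtbf = 1.0 / lam   # corresponding mean time between failures, in years
print(round(lam, 4), round(mtbf, 1))
```

Note how the requirement translates into an MTBF (~97 years) far longer than the 5-year interval itself; this is the sense in which a bare MTTF figure is misleading as a reliability requirement.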

17.
Sun G, Yang K, Zhao Z, Guan S, Han X, Gross RW. Analytical Chemistry 2007, 79(17): 6629-6640
A shotgun metabolomics approach using MALDI-TOF/TOF mass spectrometry was developed for the rapid analysis of negatively charged water-soluble cellular metabolites. Through the use of neutral organic solvents to inactivate endogenous enzyme activities (i.e., methanol/chloroform/H2O extraction), in conjunction with a matrix having minimal background noise (9-aminoacridine), a set of multiplexed conditions was developed that allowed identification of 285 peaks corresponding to negatively charged metabolites from mouse heart extracts. Identification of metabolite peaks was based on mass accuracy and was confirmed by tandem mass spectrometry for 90 of the identified metabolite peaks. Through multiplexing ionization conditions, new suites of metabolites could be ionized, and "spectrometric isolation" of closely neighboring peaks for subsequent tandem mass spectrometric interrogation could be achieved. Moreover, assignments of ions from isomeric metabolites and quantitation of their relative abundances were achieved in many cases through tandem mass spectrometry by identification of diagnostic fragmentation ions (e.g., discrimination of ATP from dGTP). The high sensitivity of this approach facilitated the detection of extremely low abundance metabolites, including important signaling metabolites such as IP3, cAMP, and cGMP. Collectively, these results identify a multiplexed MALDI-TOF/TOF MS approach for the analysis of negatively charged metabolites in mammalian tissues.
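The mass-accuracy identification step amounts to matching observed m/z values against a library within a ppm tolerance; a minimal sketch (the four-entry library uses standard [M-H]- monoisotopic values, but real work would use a full metabolite database, and the 10 ppm tolerance is an illustrative choice):

```python
# Tiny illustrative library of [M-H]- m/z values for nucleotide metabolites.
library = {
    "AMP": 346.0558,
    "ADP": 426.0221,
    "ATP": 505.9885,
    "cAMP": 328.0452,
}

def match_peak(mz, tol_ppm=10.0):
    """Return (name, ppm error) for every library entry within tolerance."""
    hits = []
    for name, ref in library.items():
        ppm = abs(mz - ref) / ref * 1e6
        if ppm <= tol_ppm:
            hits.append((name, round(ppm, 1)))
    return hits

print(match_peak(505.9889))   # an observed peak ~0.8 ppm from ATP
```

Mass accuracy alone cannot separate isobaric or isomeric species (e.g., ATP vs dGTP differ by only ~0.02 Da), which is exactly why the paper falls back on diagnostic tandem-MS fragment ions for those cases.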

18.
Implementation of a Quality Systems approach to making defensible environmental program decisions depends upon multiple, interrelated components. Often, these components are developed independently and implemented at various facility and program levels in an attempt to achieve consistency and cost savings. The U.S. Department of Energy, Office of Environmental Management (DOE-EM) focuses on three primary system components to achieve effective environmental data collection and use: (1) Quality System guidance, which establishes the management framework to plan, implement, and assess work performed; (2) a Standardized Statement of Work for analytical services, which defines data generation and reporting requirements consistent with user needs; and (3) a laboratory assessment program to evaluate adherence of work performed to defined needs, e.g., documentation and confidence. This paper describes how DOE-EM fulfills these requirements and realizes cost savings through participation in interagency working groups and integration of system elements as they evolve.

19.
This paper investigates an extraction and analysis method for chlorobenzene in water and wastewater. Chlorobenzene was separated by gas chromatography on a wide-bore capillary column and detected by ECD, yielding good separation and high sensitivity; the method detection limit reaches 0.01 mg/L. The method fully satisfies the requirements of environmental sample analysis.

20.
In this paper, we examine a single-product, discrete-time, non-stationary, inventory replenishment problem with both supply and demand uncertainty, capacity limits on replenishment quantities, and service level requirements. A scenario-based stochastic program for the static, finite-horizon problem is presented to determine replenishment orders over the horizon. We propose a heuristic that is based on the first two moments of the random variables and a normal approximation, whose solution is compared with the optimal from a simulation-based optimization method. Computational experiments show that the heuristic performs very well (within 0.25% of optimal, on average) even when the uncertainty is non-normal or when there are periods without any supply. We also present insights obtained from sensitivity analyses on the effects of supply parameters, shortage penalty costs, capacity limits, and demand variance. A rolling-horizon implementation is illustrated.
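The flavor of the two-moment normal-approximation heuristic can be sketched for a single period (the paper's heuristic spans a multi-period horizon; this one-period simplification, and all the numbers, are illustrative): order up to the level that covers demand with the required in-stock probability, truncated at the replenishment capacity.

```python
from statistics import NormalDist

def order_quantity(mean_demand, sd_demand, on_hand, service_level, capacity):
    """Two-moment heuristic for one period: order up to the base-stock level
    mean + z*sd implied by the service level, capped by capacity."""
    z = NormalDist().inv_cdf(service_level)        # normal safety factor
    target = mean_demand + z * sd_demand           # base-stock level
    return max(0.0, min(capacity, target - on_hand))

q = order_quantity(mean_demand=100, sd_demand=20, on_hand=30,
                   service_level=0.95, capacity=120)
print(round(q, 1))
```

When the capacity cap binds (e.g., zero on-hand stock in this example), the realized service level drops below the target, which is why the capacitated multi-period problem in the paper needs a stochastic program rather than a period-by-period newsvendor rule.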


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号