Similar Documents
20 similar documents found (search time: 78 ms)
1.
Online surveillance of disease has become an important issue in public health. In particular, the space-time monitoring of disease plays an important part in any syndromic system. However, methodology for these systems is generally lacking. One approach to space-time monitoring of health data is to treat the space-time model parameters as the focus and to monitor their changes as multivariate time series (Lawson AB. Some considerations in spatial-temporal analysis of public health surveillance data. In Brookmeyer R, Stroup DF, eds. Monitoring the Health of Populations. Oxford University Press, 2004; Vidal Rodeiro CL, Lawson AB. Monitoring changes in spatio-temporal maps of disease. Biometrical Journal 2006; to appear). However, with complex space-time models this becomes very time-consuming. Some simplifications may be necessary, and these can be made in a number of ways. In this article, the focus is on particle filters, which can be used to resample the history of the process and thereby reduce computation time. This article describes a particular class of particle filter, the resample-move algorithm, proposed by Gilks and Berzuini (Gilks WR, Berzuini C. Following a moving target--Monte Carlo inference for dynamic Bayesian models. Journal of the Royal Statistical Society, Series B 2001; 63: 127-46), in the context of disease map surveillance. This is followed by an application to a real data set in which the use of Markov chain Monte Carlo methods and the resample-move algorithm are compared.
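The resample-move idea can be illustrated with a minimal sketch (not the authors' implementation): a bootstrap particle filter for a scalar latent random walk, where each multinomial resampling step is followed by a single Metropolis-Hastings "move" step to restore particle diversity. All model parameters here are hypothetical, and the move step targets only the current-time likelihood for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

def resample_move_filter(obs, n_particles=500, state_sd=0.3, obs_sd=0.5):
    """Bootstrap particle filter with an MH 'move' after each resampling,
    in the spirit of Gilks & Berzuini (2001). Illustrative only."""
    particles = rng.normal(0.0, 1.0, n_particles)
    for y in obs:
        # propagate: latent random walk
        particles = particles + rng.normal(0.0, state_sd, n_particles)
        # weight by the Gaussian observation likelihood
        logw = -0.5 * ((y - particles) / obs_sd) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # resample (multinomial)
        particles = rng.choice(particles, size=n_particles, p=w)
        # move: one MH step; here it targets only the current likelihood,
        # a simplification used for illustration
        prop = particles + rng.normal(0.0, 0.1, n_particles)
        log_acc = (-0.5 * ((y - prop) / obs_sd) ** 2
                   + 0.5 * ((y - particles) / obs_sd) ** 2)
        accept = np.log(rng.uniform(size=n_particles)) < log_acc
        particles = np.where(accept, prop, particles)
    return particles

obs = np.array([0.2, 0.4, 0.9, 1.1, 1.0])
post = resample_move_filter(obs)
```

After the final observation near 1.0, the particle cloud approximates the filtering posterior there; the move step mitigates the sample impoverishment that plain resampling causes.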

2.
Environmental metric software can be used to evaluate the sustainability of a chemical based upon data from the chemical process used to manufacture it. An obstacle to the development of environmental metric software for use in chemical process modeling software has been the inability to obtain information about the process directly from the model. There have been past attempts to develop environmental metrics that make use of the process models, but there has not been an integrated, standardized approach to obtaining the process information required for calculating metrics. As a result, environmental evaluation packages are largely limited to use in a single simulation package, further limiting the development and adoption of these tools. This paper proposes a standardized mechanism for obtaining process information directly from a process model using a strongly integrated interface set, called flowsheet monitoring. The flowsheet monitoring interface provides read-only access to the unit operations and streams within the process model, and can be used to obtain the material flow data from the process streams. This material flow data can then be used to calculate process-based environmental metrics. The flowsheet monitoring interface has been proposed as an extension of the CAPE-OPEN chemical process simulation interface set. To demonstrate the capability of the flowsheet monitoring interfaces, the US Environmental Protection Agency (USEPA) WAste Reduction (WAR) algorithm is demonstrated in AmsterCHEM's COFE (CAPE-OPEN Flowsheeting Environment). The WAR add-in accesses the material flows and unit operations directly from the process simulator and uses flow data to calculate the potential environmental impact (PEI) score for the process. The WAR algorithm add-in is included in the latest release of the COCO Simulation Environment, available from http://www.cocosimulator.org/.

3.
Image analysis provides a set of powerful methods for obtaining additional information about multiphase processes. It enables the development of advanced applications for process monitoring and optimization, so-called soft sensors. However, integrating advanced smart sensor systems based on image analysis into the process control system is a complex task. To address this challenge, a modular automation concept offers a standardized interface for integrating modules. This paper presents an integration profile, as a service specification, that allows plug-and-measure integration of smart visual sensors into modular plants. To verify the concept, we applied it to three different use cases. Finally, we discuss open challenges in integrating complex analysis systems with multidimensional data streams into modular plants.

4.
Abstract

The aim of sequential surveillance is on-line detection of an important change in an underlying process as soon as possible after the change has occurred. Statistical methods suitable for surveillance differ from hypothesis testing methods. In addition, the criteria for optimality differ from those used in hypothesis testing. The need for sequential surveillance in industry, economics, and medicine, and for environmental purposes is described. Even though the methods have been developed under different scientific cultures, inferential similarities can be identified. Applications contain complexities such as autocorrelations, complex distributions, complex types of changes, and spatial as well as other multivariate settings. Approaches to handling these complexities are discussed. Expressing methods for surveillance through likelihood functions makes it possible to link the methods to various optimality criteria. This approach also facilitates the choice of an optimal surveillance method for each specific application and provides some directions for improving earlier suggested methods.
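A likelihood-based surveillance scheme of the kind discussed here can be sketched with a one-sided CUSUM, whose statistic accumulates the log-likelihood ratio of a post-change versus an in-control distribution and signals when it crosses a threshold. All distributions and parameters below are illustrative, not taken from the article.

```python
import numpy as np

def cusum_alarm(x, mu0=0.0, mu1=1.0, sd=1.0, threshold=8.0):
    """One-sided CUSUM: cumulative log-likelihood ratio for a mean shift
    from mu0 to mu1, reset at zero; returns the 0-based alarm time or None."""
    s = 0.0
    for t, xt in enumerate(x):
        # log LR of N(mu1, sd^2) vs N(mu0, sd^2) for one observation
        llr = ((xt - mu0) ** 2 - (xt - mu1) ** 2) / (2 * sd ** 2)
        s = max(0.0, s + llr)
        if s > threshold:
            return t
    return None

# 50 in-control observations, then a mean shift at time 50
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.5, 1, 30)])
alarm_t = cusum_alarm(x)
```

The resetting at zero is what distinguishes sequential surveillance from a fixed-sample test: the statistic stays small while the process is in control and drifts upward quickly after the change.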

5.
Randomized trials may be designed and interpreted as single experiments, or they may be seen in the context of other similar or relevant evidence. The amount and complexity of available randomized evidence vary for different topics. Systematic reviews may be useful in identifying gaps in the existing randomized evidence, pointing to discrepancies between trials, and planning future trials. A new, promising, but also much-debated extension of systematic reviews, mixed treatment comparison (MTC) meta-analysis, has recently become increasingly popular. MTC meta-analysis may have value in interpreting the available randomized evidence from networks of trials and can rank many different treatments, going beyond simple pairwise comparisons. Nevertheless, the evaluation of networks also presents special challenges and caveats. In this article, we review the statistical methodology for MTC meta-analysis. We discuss the concept of inconsistency and the methods that have been proposed to evaluate it, as well as the methodological gaps that remain. We introduce the concepts of network geometry and asymmetry, and propose metrics for evaluating asymmetry. Finally, we discuss the implications of inconsistency, network geometry, and asymmetry for planning future trials.

6.
Investigating the subcellular organization of biomolecules is important for understanding their biological functions. Over the past decade, proximity-dependent labeling methods have emerged as powerful tools for mapping biomolecules in their native context. These methods often capitalize on the in-situ generation of highly reactive intermediates for covalently tagging biomolecules located within nanometers to sub-micrometers of the source of labeling. Among these, photocatalytic proximity labeling methods achieve precise spatial and temporal control of labeling with visible light illumination. In this review, we summarize the mechanisms and applications of existing photocatalytic proximity labeling methods and discuss future opportunities for improving the method.

7.
The influence of several solvents (anhydrous ethanol, white spirit, alkylbenzene AB9, diesel) on the physicochemical parameters of gasoline was studied according to ASTM international standard methods. The parameters investigated (distillation curves, density, Reid vapor pressure) showed differentiated behavior, depending on the class of the solvent (oxygenated, light and heavy aliphatic, aromatic) and the quantity added to the gasoline. The azeotropic mixtures formed by ethanol and hydrocarbons showed a strong influence on the behavior of the distillation curves and the location of the point of a sudden change in temperature was shown to be a possible way to detect adulterations and determine the quantity of solvent added to the gasoline.
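The "sudden change in temperature" criterion can be illustrated numerically: on a synthetic distillation curve (hypothetical numbers, not the study's data), the breakpoint is located as the largest jump between consecutive temperature readings.

```python
import numpy as np

# Hypothetical distillation curve: temperature (C) vs. % evaporated.
# An ethanol-hydrocarbon azeotrope holds the temperature low early on;
# once the azeotrope is exhausted, the temperature jumps sharply.
frac = np.arange(0, 100, 5)                       # % evaporated
temp = np.where(frac < 40, 60 + 0.25 * frac,      # azeotropic plateau
                105 + 0.5 * frac)                 # hydrocarbon boiling range

# locate the sudden change as the largest jump between consecutive points
jump = np.diff(temp)
breakpoint_frac = frac[np.argmax(jump) + 1]
```

The location of `breakpoint_frac` (here the evaporated fraction at which the jump occurs) is the kind of feature the abstract describes as an indicator of the quantity of added solvent.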

8.
Current methods to detect and monitor pathogens in biological systems are largely limited by the tradeoffs between spatial context and temporal detail. A new generation of molecular tracking that provides both information simultaneously involves in situ detection coupled with non-invasive imaging. An example is antisense imaging that uses antisense oligonucleotide probes complementary to a target nucleotide sequence. In this study, we explored the potential of repurposing antisense oligonucleotides initially developed as antiviral therapeutics as molecular probes for imaging of viral infections in vitro and in vivo. We employed nuclease-resistant phosphorodiamidate synthetic oligonucleotides conjugated with cell-penetrating peptides (i.e., PPMOs) previously established as antivirals for dengue virus serotype-2 (DENV2). As proof of concept, and before further development for preclinical testing, we evaluated its validity as in situ molecular imaging probe for tracking cellular DENV2 infection using live-cell fluorescence imaging. Although the PPMO was designed to specifically target the DENV2 genome, it was unsuitable as in situ molecular imaging probe. This study details our evaluation of the PPMOs to assess specific and sensitive molecular imaging of DENV2 infection and tells a cautionary tale for those exploring antisense oligonucleotides as probes for non-invasive imaging and monitoring of pathogen infections in experimental animal models.

9.
This study developed a Real-Time Risk Monitoring System that supports fault detection in a chemical process using information from plant information systems. Principal component analysis (PCA), a multivariate statistical method, was used for fault detection. The fundamental notion is to extract combinations of variables, i.e., principal components, that capture the dominant trends in the operating data. Unlike classical statistical process control, PCA reduces the dimensionality of the monitored variables, making it well suited to the high-dimensional data generated by process monitoring. The developed Real-Time Risk Monitoring System can analyze and manage plant information on-line and diagnose the causes of abnormal behavior, helping to prevent major accidents. It allows operators to handle numerous process faults efficiently.
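The PCA monitoring idea can be sketched as follows (a generic illustration with synthetic data, not the system described above): fit PCA to normal operating data, then score new samples with Hotelling's T² for variation inside the model and the squared prediction error (SPE/Q) for variation outside it.

```python
import numpy as np

def pca_monitor(X_train, X_new, n_comp=1):
    """Fit PCA on normal operating data; score new samples with
    Hotelling's T^2 (in-model variation) and SPE/Q (residual variation)."""
    mu, sd = X_train.mean(0), X_train.std(0)
    Z = (X_train - mu) / sd
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                        # loadings
    lam = (S[:n_comp] ** 2) / (len(Z) - 1)   # component variances
    Znew = (X_new - mu) / sd
    T = Znew @ P                             # scores
    t2 = np.sum(T ** 2 / lam, axis=1)        # Hotelling's T^2
    resid = Znew - T @ P.T
    spe = np.sum(resid ** 2, axis=1)         # squared prediction error
    return t2, spe

# normal operation: four sensors driven by one common factor
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X_train = base @ np.ones((1, 4)) + 0.1 * rng.normal(size=(200, 4))

X_new = np.array([[1.0, 1.0, 1.0, 1.0],     # consistent with training correlation
                  [1.0, 1.0, -2.0, 1.0]])   # sensor fault breaks the correlation
t2, spe = pca_monitor(X_train, X_new)
```

A sample that breaks the correlation structure learned from normal data produces a large SPE even if each individual sensor reading looks unremarkable, which is exactly why PCA suits multivariate fault detection.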

10.
Three process sensors are discussed, one for the determination of surface tension and two for the determination of liquid density. All three methods are on-line, compatible with digital data acquisition systems, and capable of monitoring flowing process streams. The instruments are described and calibration data are given. Two mathematical models of the bubble tensiometer are developed. Data showing the application of these instruments to the monitoring of batch and continuous emulsion polymerization are given.

11.
The hydrolysis of nucleotides is of paramount importance as an energy source for cellular processes. In addition, the transfer of phosphates from nucleotides onto proteins is important as a post-translational protein modification. Monitoring the enzymatic turnover of nucleotides therefore offers great potential as a tool to follow enzymatic activity. While a number of fluorescence sensors are known, so far, there are no methods available for the real-time monitoring of ATP hydrolysis inside live cells. We present the synthesis and application of a novel fluorogenic adenosine 5′-tetraphosphate (Ap4) analog suited for this task. Upon enzymatic hydrolysis, the molecule displays an increase in fluorescence intensity, which provides a readout of its turnover. We demonstrate how this can be used for monitoring cellular processes involving Ap4 hydrolysis. To this end, we visualized the enzymatic activity in live cells using confocal fluorescence microscopy of the Ap4 analog. Our results demonstrate that the Ap4 analog is hydrolyzed in lysosomes. We show that this approach is suited to visualize the lysosome distribution profiles within the live cell and discuss how it can be employed to gather information regarding autophagic flux.

12.
Current methods of artificial intelligence often prove ineffective in the process industry, usually because of insufficient data availability. In this contribution, we investigate how data standards can help fulfill the data-availability requirements of machine learning methods. We give an overview of AI use cases relevant to the process industry, name the related requirements, and discuss known standards in the context of implicit vs. explicit data. We conclude with a roadmap sketching how to bring the results of this contribution into practical application.

13.
The design of sustainable supply chains, which recently emerged as an active area of research in process systems engineering, is vital to ensure sustainable development. Despite past and ongoing efforts, the available methods often overlook impacts beyond climate change or incorporate them via standard life cycle assessment metrics that are hard to interpret from an absolute sustainability viewpoint. We here address the design of biomass supply chains considering critical ecological limits of the Earth—planetary boundaries—which should never be surpassed by anthropogenic activities. Our method relies on a mixed-integer linear program that incorporates a planetary boundaries-based damage model to quantify absolute sustainability precisely. We apply this approach to the sugarcane-to-ethanol industry in Argentina, identifying the optimal combination of technologies and network layout that minimize the impact on these ecological boundaries. Our framework can find applications in a wide range of supply chain problems related to chemicals and fuels production, energy systems, and agriculture planning.
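As a toy stand-in for the mixed-integer program (all numbers hypothetical, and brute force in place of a MILP solver), the core trade-off of minimizing cost subject to a planetary-boundary impact budget can be shown with a search over the split between two conversion technologies:

```python
# Hypothetical per-unit cost and boundary impact for two ethanol
# conversion technologies; none of these numbers come from the article.
cost = {"A": 1, "B": 2}      # relative cost per unit of ethanol
impact = {"A": 4, "B": 1}    # impact units charged against a boundary
demand, budget = 100, 250    # units required; allowed impact budget

# brute-force the cheapest split meeting demand within the impact cap
best = None
for xa in range(demand + 1):
    xb = demand - xa
    if xa * impact["A"] + xb * impact["B"] <= budget:
        c = xa * cost["A"] + xb * cost["B"]
        if best is None or c < best[0]:
            best = (c, xa, xb)
```

The binding boundary constraint forces the design away from the cheapest technology alone, which is the qualitative behavior the planetary-boundaries damage model induces in the full MILP.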

14.
The effects of natural disasters, pandemic-induced lockdowns, and other disruptions often cascade across networks. In this work, we use minimum cost of resilience (MCOR) and operation-based resilience metrics to quantify network performance against single-connectivity failures and identify critical connections in interconnected networks. MCOR corresponds to the minimum additional infrastructure investment that is required to achieve a certain level of resilience. To guarantee MCOR, we incorporate the metrics in a multi-scenario mixed-integer linear program (MILP) that accounts for resilience in the design phase of interconnected networks. The goal is to obtain optimal generation and transportation capacities with flexible operation under all single-connectivity disruption scenarios. We demonstrate the applicability of our resilience-aware framework on a water-energy nexus (WEN) example focusing on grassroots design and retrofitting. We further apply the framework to analyze a regional WEN and observe that it is possible to achieve “full” resilience at the expense of additional regional investments.

15.
Economic, energy, and sustainability metrics are key performance indicators for process operations. The relative importance of these metrics varies from plant to plant, and often some metrics are in conflict with each other (sustainability vs. profitability). In this paper we discuss the current plant environment and how various metrics can be aligned by focusing on energy efficiency. Power-steam systems are the major energy drivers for most plants, and we discuss possible operational changes that might improve energy efficiency, as well as the role of process control. Managing the interplay of real-time optimization and regulatory control is a challenge for the future, as well as interfacing with the implementation of smart power grids by the utility industry. Combined heat and power along with energy storage presents interesting control and optimization opportunities to maximize energy efficiency.

16.
Robust statistics is an extension of classical parametric statistics that specifically takes into account the fact that the parametric models assumed by researchers are only approximate. In this article, we review and outline how robust inferential procedures may routinely be applied in practice in biomedical research. Numerical illustrations are given for the t-test, regression models, logistic regression, survival analysis, and ROC curves, showing that robust methods are often more appropriate than standard procedures.
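The contrast between classical and robust location estimates can be sketched with a Huber M-estimator computed by iteratively reweighted averaging (a standard construction, not code from the article; the data are invented and assumed to have nonzero spread so the MAD scale is positive):

```python
import numpy as np

def huber_mean(x, k=1.345, tol=1e-6, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted means;
    k = 1.345 gives ~95% efficiency at the normal model."""
    mu = np.median(x)
    # robust scale: median absolute deviation, rescaled for normality
    s = np.median(np.abs(x - np.median(x))) / 0.6745
    for _ in range(max_iter):
        r = (x - mu) / s
        # full weight inside [-k, k]; downweight outliers beyond it
        w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

x = np.array([2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 15.0])  # one gross outlier
mu_robust = huber_mean(x)
```

Here the sample mean is dragged to nearly 3.9 by the single outlier, while the Huber estimate stays near 2.0 — the behavior the abstract's comparison of standard versus robust procedures refers to.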

17.
In this article, we review the mathematical foundations of convolutional neural nets (CNNs) with the goals of: (i) highlighting connections with techniques from statistics, signal processing, linear algebra, differential equations, and optimization, (ii) demystifying underlying computations, and (iii) identifying new types of applications. CNNs are powerful machine learning models that highlight features from grid data to make predictions (regression and classification). The grid data object can be represented as vectors (in 1D), matrices (in 2D), or tensors (in 3D or higher dimensions) and can incorporate multiple channels (thus providing high flexibility in the input data representation). CNNs highlight features from the grid data by performing convolution operations with different types of operators. The operators highlight different types of features (e.g., patterns, gradients, geometrical features) and are learned by using optimization techniques. In other words, CNNs seek to identify optimal operators that best map the input data to the output data. A common misconception is that CNNs are only capable of processing image or video data but their application scope is much wider; specifically, datasets encountered in diverse applications can be expressed as grid data. Here, we show how to apply CNNs to new types of applications such as optimal control, flow cytometry, multivariate process monitoring, and molecular simulations.
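The convolution operation at the heart of a CNN layer can be written directly for 2-D grid data; the sketch below (illustrative, not from the article) applies a Sobel operator, a classic gradient-highlighting kernel, to a synthetic step-edge grid:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation of a single-channel grid with a
    kernel, as computed inside a CNN layer (no padding, stride 1)."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # elementwise product of the kernel with each local window
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# a vertical-edge operator (Sobel) highlights horizontal gradients
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)

img = np.zeros((5, 6))
img[:, 3:] = 1.0          # step edge between columns 2 and 3
resp = conv2d(img, sobel_x)
```

The response is zero over flat regions and peaks where the window straddles the edge, illustrating how a fixed operator "highlights" a geometric feature; in a CNN the kernel entries are learned rather than hand-specified.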

18.
19.
Recently, techniques available for identifying clusters of individuals, or boundaries between clusters, using genetic data from natural populations have expanded rapidly. Consequently, there is a need to evaluate these different techniques. We used spatially explicit simulation models to compare three spatial Bayesian clustering programs and two edge detection methods. Spatially structured populations were simulated in which a continuous population was subdivided by barriers. We evaluated the ability of each method to correctly identify boundary locations while varying: (i) time after divergence, (ii) strength of isolation by distance, (iii) level of genetic diversity, and (iv) amount of gene flow across barriers. To further evaluate the methods' effectiveness at detecting genetic clusters in natural populations, we used previously published data on North American pumas and a European shrub. Our results show that, with both simulated and empirical data, the Bayesian spatial clustering algorithms outperformed direct edge detection methods. All methods incorrectly detected boundaries in the presence of strong patterns of isolation by distance. Based on this finding, we support the application of Bayesian spatial clustering algorithms for boundary detection in empirical datasets, with the necessary tests for the influence of isolation by distance.

20.
Abstract

Multivariate exponentially weighted moving average (MEWMA) charts are popular, handy, and effective procedures for detecting distributional changes in a stream of multivariate data. For appropriate performance analysis, the steady-state behavior of the MEWMA statistic must be dealt with. Going beyond early papers, we derive quite accurate approximations of the respective steady-state densities of the MEWMA statistic. It turns out that these densities can be rewritten as the product of two functions, each depending on one argument only, which allows feasible calculation. To prove the related statements, a representation of the noncentral chi-square density in terms of the confluent hypergeometric limit function is applied. Using the new methods, we found that for large dimensions the steady-state behavior differs from what one might expect from the univariate monitoring field. Based on the integral-equation-driven methods, steady-state and worst-case average run lengths are calculated with higher accuracy than before. Finally, optimal MEWMA smoothing constants are derived for all considered measures.
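The MEWMA statistic itself is straightforward to compute; the sketch below (illustrative parameters and data, not the paper's) smooths the observation vector and scores it against the exact time-dependent covariance of the smoothed vector:

```python
import numpy as np

def mewma_stats(X, Sigma, lam=0.1):
    """MEWMA statistic T_t^2 = Z_t' Sigma_Z^{-1} Z_t, where
    Z_t = lam*X_t + (1-lam)*Z_{t-1} and Sigma_Z uses the exact
    (time-dependent) covariance; an alarm fires when T_t^2 > h."""
    p = X.shape[1]
    z = np.zeros(p)
    out = []
    for t, x in enumerate(X, start=1):
        z = lam * x + (1 - lam) * z
        # exact covariance factor of Z_t under independence
        c = lam / (2 - lam) * (1 - (1 - lam) ** (2 * t))
        out.append(z @ np.linalg.solve(c * Sigma, z))
    return np.array(out)

# 40 in-control trivariate observations, then a mean shift
rng = np.random.default_rng(2)
Sigma = np.eye(3)
X = np.vstack([rng.multivariate_normal(np.zeros(3), Sigma, 40),
               rng.multivariate_normal(np.full(3, 0.8), Sigma, 20)])
T2 = mewma_stats(X, Sigma)
```

The smoothing constant `lam` controls the memory of the chart; smaller values give the head-start-like steady-state effects whose densities the paper approximates.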


Copyright©北京勤云科技发展有限公司  京ICP备09084417号