Similar Documents
20 similar documents found (search time: 0 ms)
1.
A statistically more reliable approach than traditional visual inspection of peptide maps for identifying a drug compound is to generate a set of reference standards from a designed experiment that incorporates the many factors that can affect variation in peptide mapping. In practice, this experiment can double as a ruggedness study within a high-performance liquid chromatography (HPLC) method validation. Once ruggedness is established, the articles from the experiment form a set of reference standards against which future articles can be compared to prove identity. A quantitative analysis of the ruggedness study can be carried out with a chemometrics approach, principal component analysis (PCA), which reduces the many channels of a peptide map to a few manageable dimensions. The scores projected onto the reduced dimensions are then used to test the factor effects of the ruggedness study. As a by-product, the analysis supports visual inspection of the set of articles for outliers and anomalies.
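The score-projection and outlier-screening idea described above can be sketched as follows. This is a minimal illustration on synthetic chromatogram-like data; the peak shapes, run count, and 2-sigma score cutoff are assumptions for the sketch, not settings from the paper:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic stand-in for peptide maps: 12 runs x 500 retention-time channels
base = np.exp(-0.5 * ((np.arange(500) - 250) / 20.0) ** 2)
maps = base + 0.01 * rng.standard_normal((12, 500))
maps[11] += 0.05 * np.roll(base, 40)   # one run with a spurious extra peak

# Reduce the 500 channels to a few manageable score dimensions
pca = PCA(n_components=2)
scores = pca.fit_transform(maps)       # shape (12, 2)

# Flag runs whose score distance from the centroid is unusually large
dist = np.linalg.norm(scores - scores.mean(axis=0), axis=1)
outliers = np.where(dist > dist.mean() + 2 * dist.std())[0]
print(outliers)
```

The factor effects of a ruggedness design would then be tested on the score columns (e.g., by ANOVA) rather than on the raw chromatograms.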

2.
Peptide mapping is a key analytical method for studying the primary structure of proteins. The sensitivity of the peptide map to even the smallest change in the covalent structure of the protein makes it a valuable “fingerprint” for identity testing and process monitoring. We recently conducted a full method validation study of an optimized reverse-phase high-performance liquid chromatography (RP-HPLC) tryptic map of a therapeutic anti-CD4 monoclonal antibody and have used the method routinely for over a year to test production lots for clinical trials and to support bioprocess development. One difficulty in validating a peptide mapping method is the lack of proper quantitative measures of its reproducibility. A reproducibility study may include method and system precision, ruggedness, and robustness studies. In this paper, we discuss the use of principal component analysis (PCA) to quantitate peptide maps through their projected scores on the reduced dimensions. This approach allowed us not only to summarize the reproducibility study properly, but also to use the method as a diagnostic tool to investigate problems in the reproducibility validation process.

4.
A chemometrics approach, multivariate calibration in particular, was used to determine the polymorphism of a drug compound from Fourier transform infrared (FTIR) spectroscopy. Partial least-squares projection to latent structures makes use of all of the data, and the latent variables it creates exploit hidden or partially separated peaks for quantitation. This paper illustrates the usefulness of partial least-squares multivariate calibration as an efficient tool for determining the polymorphism of a drug. The analysis also suggests using information from the model as a diagnostic tool to gain more insight from the data; in particular, such diagnostics allow an analyst to assess the design characteristics, and any shortcomings, of a calibration experiment for the polymorphism of a drug compound.

5.
Various conflicting proposals for the degrees of freedom associated with the residuals of a principal component analysis have been published in the chemometrics-oriented literature. Here, a detailed derivation is given of the ‘standard’ formula from statistics. This derivation is intended to be more accessible to chemometricians than, for example, the impeccable but condensed proof published by John Mandel in a relatively unknown paper (J. Res. Nat. Bur. Stand., 74B (1970) 149–154). The derivation is presented as a two-stage recipe that also appears to apply to more complex multiway models, such as those considered by Ceulemans and Kiers (Br. J. Math. Stat. Psych., 59 (2006) 133–150).
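As context for the derivation: the 'standard' count for an uncentered I × J matrix fit by k components is ν = (I − k)(J − k), obtained by subtracting the k(I + J − k) free parameters of a rank-k model from the IJ matrix entries. A small helper makes the bookkeeping explicit; the column-centered adjustment shown is one common convention, not a detail taken from this paper:

```python
def pca_residual_df(i_rows, j_cols, k, centered=False):
    """Residual degrees of freedom of a rank-k PCA model of an i x j matrix.

    A rank-k bilinear model has k*(i + j - k) free parameters (after
    accounting for rotational indeterminacy), so the residual count is
    i*j - k*(i + j - k) = (i - k)*(j - k).  With column mean-centering,
    one row-degree is commonly spent first.
    """
    i_eff = i_rows - 1 if centered else i_rows
    return (i_eff - k) * (j_cols - k)

print(pca_residual_df(20, 10, 2))                 # (20-2)*(10-2) = 144
print(pca_residual_df(20, 10, 2, centered=True))  # (19-2)*(10-2) = 136
```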

6.
A shape grammar is a production system that can be used to create new product designs. Traditionally, a product shape grammar’s rules are created by a skilled person who understands the language of the design. In this paper, the results of a principal component analysis of vehicles are used to create a vehicle shape grammar by basing the rules on the determined shape relationships. The advantages are that rules can be created according to the results of a statistical analysis rather than a designer’s subjective observations; class-specific vehicles can be created with fewer rule applications; and those rule applications encourage divergent designs. Using the principal-component-analysis-based shape grammar, unique vehicles are created to demonstrate the potential of statistically based concept creation for the generation of product forms.

7.
This report provides a simple graphical procedure for obtaining the slope and intercept of the straight line of best fit to a set of points in two dimensions. The solution is obtained in the dual space (coordinate system) by use of mapping. Although this procedure is useful in itself for two-dimensional problems, it may be even more useful as a teaching aid for illustrating some simple properties of mapping, dual spaces, the geometric meaning of an inverse, and the basic properties of curve fitting.
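For comparison, the quantity the graphical dual-space construction approximates is the ordinary least-squares slope and intercept. A minimal numerical check, with points invented for illustration:

```python
import numpy as np

# Points to fit (hypothetical example data)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# In the dual space each point (x_i, y_i) maps to the line b = y_i - m*x_i;
# the least-squares (m, b) plays the role of a best common "intersection"
# of those lines.  Numerically it is just a degree-1 polynomial fit:
m, b = np.polyfit(x, y, 1)
print(round(float(m), 3), round(float(b), 3))
```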

9.
The use of response surface methods is well established in the global optimization of expensive functions, the response surface acting as a surrogate for the expensive objective function. In structural design, however, the objective may vary little between models: it is more often the constraints that change between models of varying fidelity. Here, approaches are described whereby the coarse-model constraints are mapped so that the mapped constraints more faithfully approximate the fine-model constraints. The shape optimization of a simple structure demonstrates the approach.

10.
In this paper, a new method to approximate a data set by another data set with a constrained covariance matrix is proposed. The method is termed Approximation of a DIstribution for a given COVariance (ADICOV). The approximation is solved in any projection subspace, including those of Principal Component Analysis (PCA) and Partial Least Squares (PLS). Given the direct relationship between covariance matrices and projection models, ADICOV is useful for testing whether a data set satisfies the covariance structure of a projection model, an idea broadly applicable in chemometrics. ADICOV can also be used to simulate data with a specific covariance structure and data distribution. Some applications are illustrated in an industrial case study.
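ADICOV itself is not reproduced here, but its simulation use-case rests on a standard building block: transforming white noise by a Cholesky factor so the result has a prescribed population covariance. A sketch with an invented target matrix (ADICOV additionally constrains the approximation within a chosen projection subspace, which this sketch does not do):

```python
import numpy as np

rng = np.random.default_rng(2)

# Target covariance structure (e.g., taken from a fitted projection model)
target_cov = np.array([[4.0, 1.2, 0.5],
                       [1.2, 2.0, 0.3],
                       [0.5, 0.3, 1.0]])

# Cholesky trick: if white noise w has identity covariance, then L @ w
# has covariance L @ L.T = target_cov.
L = np.linalg.cholesky(target_cov)
white = rng.standard_normal((100_000, 3))
simulated = white @ L.T

print(np.round(np.cov(simulated, rowvar=False), 1))  # close to target_cov
```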

11.
Data normalization plays a crucial role in metabolomics, accounting for the inevitable variation in sample concentration and in the efficiency of the sample preparation procedure. Conventional methods such as constant sum normalization (CSN) and probabilistic quotient normalization (PQN) are widely used, but both have shortcomings. In the current study, a new data normalization method called group aggregating normalization (GAN) is proposed, by which samples are normalized so that they aggregate close to their group centers in a principal component analysis (PCA) subspace. This contrasts with CSN and PQN, which rely on a constant reference for all samples. Evaluation of the GAN method using both simulated and experimental metabolomic data demonstrated that GAN produces a more robust model in the subsequent multivariate data analysis than either CSN or PQN. The study also demonstrated that some of the differential metabolites identified using the CSN or PQN method could be false positives caused by improper data normalization.
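The two conventional baselines are well defined and compact in code. A minimal sketch of CSN and PQN, using the median spectrum as the PQN reference, which is one common choice rather than a universal rule (GAN itself is the paper's novel contribution and is not reproduced here):

```python
import numpy as np

def csn(X, total=100.0):
    """Constant sum normalization: scale each sample (row) to a fixed total."""
    return X * (total / X.sum(axis=1, keepdims=True))

def pqn(X):
    """Probabilistic quotient normalization: divide each sample by the
    median quotient against a reference spectrum (here, the median sample)."""
    reference = np.median(X, axis=0)
    quotients = X / reference
    factors = np.median(quotients, axis=1, keepdims=True)
    return X / factors

# Two identical samples at different dilutions: both methods should
# bring them onto a common scale.
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
print(csn(X))
print(pqn(X))
```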

13.
An analytical survey of 20 paper and board (P&B) materials intended for food use was carried out with the aim of identifying chemicals with a potential to migrate into foods. Representative materials covering a range of uses (primary and secondary packaging and articles for take-away foods) were obtained from distributors. A screening approach was applied by means of solvent extraction with subsequent analysis by gas chromatography/mass spectrometry. A large number of analytes were detected, and a chemometric approach was used to explore the data. Principal component analysis was used to identify and select compounds as markers for sample classification. In the corrugated and printed packaging, it is worth emphasizing the presence of residual solvents, probably coming from printing inks, as well as hydrocarbons and aromatic compounds, mainly toluene, together with plasticizers linked to the recycled pulp content such as diisobutyl phthalate or diisopropylnaphthalenes; in the plastic-laminated samples, triacetin was identified as the prevailing compound. A literature search for safety data or legislative restrictions on the identified substances was performed. Additionally, semi-quantification of the compounds in the packaging allowed a worst-case estimation of food contamination by means of the infinite total migration model; occasionally, the migration estimates exceeded the specific migration limits. The chosen analytical methods coupled with a chemometric approach proved to be an effective way to describe the data; it may be concluded that only the simultaneous consideration of several chemicals with a multivariate approach allowed the investigated packaging materials to be distinguished.

14.
The successful execution and management of Offshore Software Maintenance Outsourcing (OSMO) can be very beneficial for both OSMO vendors and OSMO clients. Although much research on software outsourcing is ongoing, most of the existing literature on offshore outsourcing deals only with the outsourcing of software development. Several frameworks have been developed to guide software system managers in offshore software outsourcing, but none of these studies delivers comprehensive guidelines for managing the whole OSMO process, and there is a considerable lack of research on managing OSMO from a vendor’s perspective. Therefore, to find the best practices for managing an OSMO process, it is necessary to investigate this complex and multifaceted phenomenon further from the vendor’s perspective. This study validated the preliminary OSMO process model via a case study research approach. The results showed that the OSMO process model is applicable in an industrial setting with few changes. The industrial data collected during the case study enabled this paper to extend the preliminary OSMO process model. The refined version of the OSMO process model has four major phases: (i) Project Assessment, (ii) SLA, (iii) Execution, and (iv) Risk.

15.
19-Nor-1α,25-dihydroxyvitamin D2, an analog of vitamin D2, is a nonpolar compound with limited solubility in water. An injectable solution was formulated using a cosolvent system consisting of water, ethanol, and propylene glycol. A statistical response surface approach was used to evaluate the effect of these three solvents on the solubility of the drug (at 25°C) in the ternary cosolvent system. Data generated from five selected formulations were used to develop a multiple linear regression model that quantitatively defines the solubility of the drug as a function of the cosolvent composition. Close agreement was found between the experimental data and values calculated from the model. The capability of this model to predict drug solubility in cosolvent systems with various combinations of the three solvents was also verified.
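A multiple linear regression of (log) solubility on cosolvent composition can be sketched as follows. The formulation fractions and solubility values below are invented for illustration and are not the paper's data; modeling log solubility as linear in cosolvent fraction follows the classical log-linear cosolvency assumption:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical formulations: volume fractions of ethanol and propylene
# glycol (water is the remainder), with a measured log10 solubility each.
X = np.array([[0.10, 0.20],
              [0.20, 0.20],
              [0.10, 0.40],
              [0.30, 0.30],
              [0.20, 0.40]])
log_s = np.array([-3.1, -2.6, -2.4, -1.8, -1.9])

# Fit log_s = b0 + b1*ethanol + b2*propylene_glycol
model = LinearRegression().fit(X, log_s)

# Predict log solubility for a new cosolvent composition
new_mix = np.array([[0.25, 0.35]])
print(model.predict(new_mix))
```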

16.
Supply chain management, a field that developed from business practice and research, is undergoing a major transformation. It is changing from tactical in nature (where the major focus is on cost and delivery) to a field that is strategic in nature. However, the future issues and challenges facing managers and executives are just now becoming understood. This paper reports these issues by drawing on the findings generated by a three-phase study consisting of a literature review, a two-round Delphi study, and a workshop. Unique in this Delphi study is that it brings together leading practitioners in supply chain management with leading supply chain management researchers. The findings show that while the focus of the current tactical supply chain view is relatively limited to issues of delivery, risk, and leadership, the supply chain view of the future (i.e., five years from now) is more complex and demanding. The findings also show that there is generally no difference between researchers and practitioners in terms of how they view the issues. Finally, the study uncovers major obstacles that must be resolved before the strategic potential of future-state supply chains can be realised.

17.
Since its beginning, lean manufacturing has built a worldwide reputation based on results related to production improvement and cost reduction in several companies. This management philosophy focuses on customer value creation through the elimination of production wastes. Lean methods and techniques have spread their scope from the automotive industry to a wide range of industries and services. This article presents a case study that describes the use of the lean tool value stream mapping in the production process of automotive parts for a major automotive company. At the beginning of the project, relevant data from the process were collected and analysed. Subsequently, the initial process was mapped, the related wastes were identified, and then future processes were mapped and financial results were estimated. The proposals were presented at kaizen meetings, the action plan was discussed and the decision regarding which option to choose was taken. Consequently, the cycle time and the workforce level were reduced, the process was improved and savings were obtained.

18.
For aerospace components there is undoubtedly a critical need to detect incipient damage in the structure, as any microscopic crack or defect can potentially lead to catastrophic failure and loss of human life. This paper investigates the scattering of an ultrasonic guided wave in a hollow cylinder-like structure under both damaged and undamaged conditions. Hollow cylinder structures are widely used not only in aerospace components but also in other engineering applications. The wave was sequentially transmitted and captured by means of a real-time data-acquisition system combined with integrated disc-shaped piezoceramic transducers; together, the tested structure and the transducers formed a structural health monitoring system. Wave responses were recorded under both structural conditions for the purpose of damage identification using a novelty detection method called 'outlier analysis'. The principal component analysis method of reducing the dimensionality of the feature space is also presented, with the main aim of visualising how the data sets behave as a function of the structural condition.
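Outlier analysis for novelty detection is commonly based on the Mahalanobis squared distance of a test feature vector from the undamaged baseline, with a threshold derived from the baseline itself. A sketch on synthetic two-dimensional features (the feature distribution and the 99th-percentile threshold are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Baseline features from the undamaged structure (e.g., guided-wave
# response features after PCA dimensionality reduction)
baseline = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 0.5]], 200)

mean = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def mahalanobis_sq(x):
    """Squared Mahalanobis distance of a feature vector from the baseline."""
    d = x - mean
    return float(d @ cov_inv @ d)

# Novelty threshold from the baseline distances (99th percentile here)
threshold = np.percentile([mahalanobis_sq(x) for x in baseline], 99)

undamaged_like = np.array([0.1, -0.2])
damaged_like = np.array([4.0, 3.0])
print(mahalanobis_sq(undamaged_like) <= threshold,
      mahalanobis_sq(damaged_like) > threshold)
```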

19.
Sixteen priority polycyclic aromatic hydrocarbons (PAHs) in PM(2.5) and PM(2.5-10) samples collected from 20 sites in Beijing, China in December 2005 and January 2006 were analyzed to determine their composition, spatial distribution and sources. Total PAHs ranged from 5.2 to 1062.2 ng m(-3) in PM(2.5) and from 7.6 to 759.7 ng m(-3) in PM(2.5-10), indicating relatively heavy pollution. Among the five kinds of functional zones examined, the industrial center, commercial area and village were heavily polluted. The mean concentration of PAHs in PM(2.5), 407 ng m(-3), was 1.67 times that in PM(2.5-10) and relatively high compared with previous studies (winters of 2001 and 2002). The most evident change was the increase of Flu, BbkF and InP, which are believed to be less harmful and related to the increasing use of clean energy. The pollution distribution, however, was spatially heterogeneous within the city, with the most polluted sites located in the southeast. Unlike in previous studies, fluoranthene was the most abundant component quantified, which could be associated with the increasing use of natural gas as a clean energy source. Compositional analysis and principal component analysis (PCA) suggested that various kinds of combustion were the main sources of the PAHs in PM, though the contribution of coal was still evident.
