1.
This paper demonstrates the use of two-dimensional (2D) correlation spectroscopy in conjunction with alternating least squares (ALS) based self-modeling curve resolution (SMCR) analysis of spectral data sets. This iterative regression technique utilizes non-negativity constraints on spectral intensity and concentration. ALS-based SMCR analysis assisted with 2D correlation was applied to Fourier transform infrared (FT-IR) spectra of a polystyrene/methyl ethyl ketone/deuterated toluene (PS/MEK/d-toluene) solution mixture during solvent evaporation to obtain the pure component spectra and then the time-dependent concentration profiles of the three components during the evaporation process. We focus on the use of asynchronous 2D correlation peaks to identify the pure variables needed for the initial estimates of the ALS process. Choosing the most distinct bands via the positions of asynchronous 2D peaks is a viable starting point for the ALS iteration. Once the pure variables are selected, ALS regression can be used to obtain the concentration profiles and pure component spectra. The pure component spectra obtained for MEK, d-toluene, and PS matched known spectra well, and the resulting concentration profiles were physically reasonable.
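The core of the method, ALS iteration under non-negativity constraints, can be sketched in a few lines of NumPy. This is a generic MCR-ALS loop under assumed matrix conventions (rows = evaporation times, columns = wavenumbers), not the authors' exact implementation; the initial estimate `C0` stands in for the intensities at the pure variables picked from asynchronous 2D peaks.

```python
import numpy as np

def mcr_als(D, C0, n_iter=50):
    """Alternating least squares curve resolution with non-negativity.

    D  : mixture spectra, shape (times, wavenumbers), modeled as D ~ C @ S.T
    C0 : initial concentration estimates, shape (times, components), e.g.
         intensities at pure variables picked from asynchronous 2D peaks.
    """
    C = C0.copy()
    for _ in range(n_iter):
        # Least-squares solve for the pure spectra, then enforce non-negativity.
        S = np.linalg.lstsq(C, D, rcond=None)[0].T
        S = np.clip(S, 0.0, None)
        # Least-squares solve for the concentrations, same constraint.
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T
        C = np.clip(C, 0.0, None)
    return C, S
```

With a good initial estimate and noiseless low-rank data, the loop reproduces the mixture matrix essentially exactly; with real spectra, the usual rotational-ambiguity caveats of SMCR apply.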
2.
3.
A sequence of multiple parts is processed on a multi-position transfer line of conveyor type. The sequence consists of identical subsequences (batches). The sets of operations executed for each part at each position are given, and these sets may intersect for different parts. Some operations executed at one position can be aggregated into blocks of operations, each executed at a uniform rate (in particular, feed per minute) by a common drive unit. The set of potentially feasible blocks is specified. We consider the situation where the operation sets of different blocks do not intersect and each potential block is executed either completely aggregated (as one block) or completely disaggregated (each operation individually). Aggregation reduces investment costs but can increase tool consumption, since rates can no longer be selected individually for the aggregated operations. The chosen aggregation option and the operation rates remain fixed while the line is operating. The problem is to select the aggregation option and the rates of all operations that minimise the total batch processing cost while ensuring the required line productivity. A mathematical model of the problem and a two-level decomposition method for its solution are proposed. The problem statement and solution results are illustrated on a real industrial example. The developed model and method can be applied to similar problems arising in other domains.
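The decision the abstract describes, choosing for each potential block between aggregated and disaggregated execution subject to a productivity constraint, can be illustrated with a brute-force sketch on toy data. The cost and time fields and the use of the maximum position time as the cycle time are illustrative assumptions; the paper itself proposes a two-level decomposition method rather than enumeration.

```python
from itertools import product

def optimize_aggregation(blocks, t_max):
    """Pick aggregate (1) or disaggregate (0) for each potential block to
    minimise total cost while the cycle time stays within t_max.
    blocks: list of dicts with hypothetical 'cost_agg'/'cost_sep' and
    'time_agg'/'time_sep' fields; not the paper's data model."""
    best = None
    for choice in product((0, 1), repeat=len(blocks)):
        cost = sum(b['cost_agg'] if c else b['cost_sep']
                   for b, c in zip(blocks, choice))
        # Cycle time taken as the slowest position (illustrative assumption).
        cycle = max(b['time_agg'] if c else b['time_sep']
                    for b, c in zip(blocks, choice))
        if cycle <= t_max and (best is None or cost < best[0]):
            best = (cost, choice)
    return best
```

For two blocks where aggregating the first is cheap and fast enough, the search returns that option; enumeration is exponential in the number of blocks, which is exactly why a decomposition method is needed at industrial scale.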
4.
Ghassan S Kassab 《Journal of the Royal Society Interface》2006,3(11):719-740
Biomechanics relates the function of a physiological system to its structure. The objective of biomechanics is to deduce the function of a system from its geometry, material properties and boundary conditions based on the balance laws of mechanics (e.g. conservation of mass, momentum and energy). In the present review, we shall outline the general approach of biomechanics. As this is an enormously broad field, we shall consider a detailed biomechanical analysis of the aorta as an illustration. Specifically, we will consider the geometry and material properties of the aorta in conjunction with appropriate boundary conditions to formulate and solve several well-posed boundary value problems. Among other issues, we shall consider the effect of longitudinal pre-stretch and surrounding tissue on the mechanical status of the vessel wall. The solutions of the boundary value problems predict the presence of mechanical homeostasis in the vessel wall. The implications of mechanical homeostasis on growth, remodelling and postnatal development of the aorta are considered.
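As one concrete instance of the kind of boundary-value estimate such an analysis builds on, the mean circumferential (hoop) stress of a thin-walled pressurized tube follows Laplace's law. The formula is classical, but the aortic numbers below are hypothetical, and a thin-walled estimate is only a first-order baseline next to the thick-walled, pre-stretched models the review treats.

```python
def laplace_hoop_stress(p, r, h):
    """Mean circumferential (hoop) stress in a thin-walled pressurized
    tube: sigma = p * r / h (Laplace's law)."""
    return p * r / h

# Hypothetical aortic numbers: ~100 mmHg (13.3 kPa) pressure,
# 10 mm inner radius, 1.5 mm wall thickness.
sigma = laplace_hoop_stress(13.3e3, 0.010, 0.0015)
```

For these illustrative dimensions the mean wall stress comes out at roughly 89 kPa; residual stress and surrounding tissue, as discussed in the review, redistribute this value across the wall thickness.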
5.
6.
Delphine Jouan-Rimbaud D.L. Massart C.A. Saby C. Puel 《Chemometrics and Intelligent Laboratory Systems》1998,40(2):1039-144
An approach for the investigation and comparison of data structure in multidimensional space is proposed. It is based on three properties, namely the direction of the data sets, the variance–covariance of the data points, and the location of the data sets' centroids. A number of tests have been studied and are presented. It is shown that the combined use of these parameters allows a satisfactory assessment of how representative one data set is of another. Simulated data as well as real case studies are presented.
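The three properties listed (direction, variance-covariance, centroid location) can each be reduced to a simple numeric comparison. The sketch below uses raw distances and a principal-direction cosine as stand-ins; the paper's actual procedure applies formal statistical tests rather than these summaries.

```python
import numpy as np

def compare_datasets(X1, X2):
    """Crude numeric comparison of two data sets (rows = objects) on
    centroid location, variance-covariance, and dominant direction."""
    d_centroid = np.linalg.norm(X1.mean(axis=0) - X2.mean(axis=0))
    d_cov = np.linalg.norm(np.cov(X1.T) - np.cov(X2.T))
    # First right singular vectors of the centered data = main directions;
    # |cosine| near 1 means the two clouds point the same way.
    v1 = np.linalg.svd(X1 - X1.mean(axis=0))[2][0]
    v2 = np.linalg.svd(X2 - X2.mean(axis=0))[2][0]
    cos_dir = abs(float(v1 @ v2))
    return d_centroid, d_cov, cos_dir
```

Two identical data sets give zero centroid and covariance distances and a direction cosine of 1; growing values on any of the three flags the kind of non-representativity the abstract is concerned with.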
7.
In this paper we use parametric variational inequality problems to describe entire solution sets of generalized Nash games with shared constraints. We prove two theoretical results and introduce a computational method that practitioners can implement in applied problems modeled as generalized Nash games, under assumptions present in the current literature. We also give illustrative examples of how our computational technique derives solution sets of known generalized Nash games previously unsolved by existing techniques. We close with an applied problem formulated as a generalized Nash game, namely a model of a joint-implementation environmental accord between countries, and discuss the possible advantages of modeling it within a generalized Nash game framework.
8.
Scientists, especially environmental scientists, often encounter trace-level concentrations that are typically reported as less than a certain limit of detection, L. Type I left-censored data arise when low values below L are ignored or unknown because they cannot be measured accurately. In many environmental quality assurance and quality control (QA/QC) and groundwater monitoring applications of the United States Environmental Protection Agency (USEPA), values smaller than L are not required to be reported. However, practitioners still need reliable estimates of the population mean μ and standard deviation (S.D.) σ. The problem becomes more complex when a few high concentrations are observed alongside a substantial number of concentrations below the detection limit: the high-outlying values contaminate the censored sample, leading to distorted estimates of μ and σ. The USEPA, through the National Exposure Research Laboratory-Las Vegas (NERL-LV), under the Office of Research and Development (ORD), has research interests in developing statistically rigorous robust estimation procedures for contaminated left-censored data sets. Robust estimation procedures based upon a proposed (PROP) influence function are shown to yield reliable estimates of the population mean and S.D. from contaminated left-censored samples. The robust estimates obtained with or without the outliers are in close agreement with the corresponding classical estimates after removal of the outliers. Several classical and robust methods for estimating μ and σ from left-censored (truncated) data sets with potential outliers are reviewed and evaluated.
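As a point of reference for the censored-data setting, a plain maximum-likelihood estimate of (μ, σ) for a Type I left-censored normal sample can be written with only the standard library. This is not the PROP robust estimator the abstract proposes (it has no protection against outliers), and the crude nested pattern search is for illustration only.

```python
import math

def censored_normal_mle(obs, n_cens, L):
    """MLE of (mu, sigma) for a normal sample in which n_cens values fell
    below the detection limit L (Type I left censoring). Plain MLE sketch,
    not the robust PROP estimator described in the abstract."""
    def phi(z):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def negll(mu, s):
        # Observed values contribute the normal density; censored values
        # contribute the probability mass below L.
        ll = sum(-0.5 * ((x - mu) / s) ** 2 - math.log(s) for x in obs)
        if n_cens:
            ll += n_cens * math.log(max(phi((L - mu) / s), 1e-300))
        return -ll

    # Crude pattern search over (mu, sigma) with step halving.
    mu, s, step = sum(obs) / len(obs), 1.0, 1.0
    for _ in range(60):
        best = (negll(mu, s), mu, s)
        for dm in (-step, 0.0, step):
            for ds in (-step, 0.0, step):
                if s + ds > 1e-6:
                    v = negll(mu + dm, s + ds)
                    if v < best[0]:
                        best = (v, mu + dm, s + ds)
        if (best[1], best[2]) == (mu, s):
            step /= 2.0
        mu, s = best[1], best[2]
    return mu, s
```

On a clean censored sample this recovers (μ, σ) well; the abstract's point is that a handful of high outliers would distort exactly this kind of classical estimate, motivating the robust PROP alternative.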
9.
In conformity with experimental results on the density, heat conductivity, specific heat, and viscosity of water, it is established that 1) the fluid thermal activity coefficient can be expressed in a form analogous to the equation of state, and 2) the fluid thermal activity coefficient depends linearly on both the heat conductivity coefficient and the viscosity.
10.
11.
Two experiments were conducted to study how age affects street-crossing decisions in an estimation task, with particular emphasis on how oncoming vehicle speed and a time constraint influence the time gap deemed acceptable for crossing. Experiment 1 showed that when there was a time constraint, all age groups selected a shorter time gap for the higher speed. This was associated with a large number of missed opportunities for the low speed and many unsafe decisions for the high speed. In the second experiment, which had no time constraint, young pedestrians operated in a constant-time mode regardless of speed, whereas older pedestrians accepted shorter and shorter time gaps as speed increased. The results seem to indicate that the effect of speed is due to a mixed operating mode of participants, whose decisions may be based on either time or vehicle distance, depending on the task requirements and on the participant's own ability to meet those requirements.
12.
The development process of an expert system for the automated interpretation of large EPMA data sets
《Chemometrics and Intelligent Laboratory Systems》1988,4(2):147-161
The applicability of the artificial intelligence language OPS5 to represent chemical knowledge in the field of X-ray analysis is evaluated. The problem studied here involves the automated interpretation of large numbers of X-ray spectra obtained by electron probe microanalysis of single particles. The algorithm used during the data reduction phase is outlined and the expert system's data and knowledge base are discussed. Special attention has been paid to the incremental growth process of the knowledge base of the system. Starting from a limited prototype system, which was based on the increase/decrease of probability values of chemical elements, a more powerful expert system was constructed by adding chemical knowledge to its rule base, enabling the system to deal with more complex X-ray spectra. Evaluation of the system's performance shows that it is able to interpret 80–90% of the spectra of a complex data set correctly.
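A rule base of the kind described can be caricatured in a few lines: rules fire on the elements detected in a particle's spectrum and assign a class. The rules and class names below are hypothetical; the real OPS5 system instead adjusted probability values for chemical elements and holds a far richer knowledge base.

```python
def classify_particle(elements):
    """Classify a single particle from the chemical elements detected in
    its X-ray spectrum. Hypothetical toy rules, not the paper's rule base."""
    rules = [
        ({'Fe', 'S'}, 'pyrite-like'),
        ({'Si', 'Al'}, 'aluminosilicate'),
        ({'Ca', 'S'}, 'gypsum-like'),
        ({'Na', 'Cl'}, 'sea salt'),
    ]
    detected = set(elements)
    for required, label in rules:   # first matching rule fires
        if required <= detected:
            return label
    return 'unclassified'
```

Incremental growth of the knowledge base, the aspect the abstract emphasizes, corresponds here to appending new `(required_elements, label)` rules as more complex spectra are encountered.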
13.
14.
Amid the digital transformation of film production, the role of the DIT (Digital Imaging Technician), responsible for the digital image workflow and for the technical quality control of the images, has emerged. This paper reviews the duties and management methods of the DIT on present-day conventional film shoots, describes the image-data and on-set color-grading management requirements of the film 《金刚川》 under its particular production circumstances and how they were met, and concludes with a summary and outlook on managing image data and on-set grading during principal photography.
15.
A snake crawling on a horizontal surface between two parallel walls exhibits a unique wave-like shape, different from the shape of a snake crawling without constraints. We propose that this intriguing system is analogous to a buckled beam under two lateral constraints. A new theoretical model of beam buckling, verified by numerical simulation, is first developed to account for the special boundary conditions. Within this model, the effect of geometrical parameters on the deformed shape, such as the wall spacing and the snake's length and radius, is examined. The buckling-beam model is then applied to explain qualitatively the wave-like shape of the snake.
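The unconstrained baseline that such a constrained model extends is classical Euler buckling of a pinned-pinned beam, whose critical load has a closed form. This is textbook mechanics, not the paper's new model with lateral walls.

```python
import math

def euler_buckling_load(E, I, L):
    """Critical axial load of a pinned-pinned elastic beam,
    P_cr = pi^2 * E * I / L^2 (classical Euler buckling)."""
    return math.pi ** 2 * E * I / L ** 2
```

Above this load the straight configuration is unstable and the beam adopts a sinusoidal mode shape; adding lateral walls, as in the paper, forces the higher, flattened wave-like modes that resemble the constrained snake.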
16.
Liu J Erassov A Halina P Canete M Nguyen DV Chung C Cagney G Ignatchenko A Fong V Emili A 《Analytical chemistry》2008,80(20):7846-7854
Tandem mass spectrometry is the prevailing approach for large-scale peptide sequencing in high-throughput proteomic profiling studies. Effective database search engines have been developed to identify peptide sequences from MS/MS fragmentation spectra. Since proteins are polymorphic and subject to post-translational modifications (PTM), however, computational methods for detecting unanticipated variants are also needed to achieve true proteome-wide coverage. Unlike existing "unrestrictive" search tools, the novel algorithm we present, termed SIMS (Sequential Motif Interval Search), interprets pairs of product-ion peaks, representing potential amino acid residues or "intervals", as a means of mapping PTMs or substitutions in a blind database search mode. An effective heuristic software program was likewise developed to evaluate, rank, and filter optimal combinations of relevant intervals to identify candidate sequences, and any associated PTM or polymorphism, from large collections of MS/MS spectra. The prediction performance of SIMS was benchmarked extensively against annotated reference spectral data sets and compared favorably with, and was complementary to, current state-of-the-art methods. An exhaustive discovery screen using SIMS also revealed thousands of previously overlooked putative PTMs in a compendium of yeast protein complexes and in a proteome-wide map of adult mouse cardiomyocytes. We demonstrate that SIMS, freely accessible for academic research use, addresses gaps in current proteomic data interpretation pipelines, improving overall detection coverage and facilitating comprehensive investigations of the fundamental multiplicity of the expressed proteome.
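The "interval" idea at the heart of SIMS, reading the m/z gap between two product-ion peaks as a candidate amino-acid residue, can be illustrated with a toy matcher. The residue-mass table holds standard monoisotopic values, but the matching below is a naive scan over consecutive peaks, not the SIMS ranking and filtering algorithm.

```python
# Monoisotopic residue masses in Da for a few amino acids (standard values).
RESIDUE_MASS = {'G': 57.02146, 'A': 71.03711, 'S': 87.03203,
                'P': 97.05276, 'V': 99.06841, 'L': 113.08406}

def peak_intervals(peaks, tol=0.01):
    """Read the m/z gap between consecutive product-ion peaks as a
    candidate amino-acid residue. Naive pairwise scan, not SIMS itself."""
    hits = []
    for lo, hi in zip(peaks, peaks[1:]):
        for aa, mass in RESIDUE_MASS.items():
            if abs((hi - lo) - mass) <= tol:
                hits.append((lo, hi, aa))
    return hits
```

A gap that matches no residue mass within tolerance is exactly where a blind search would hypothesize a PTM or substitution shifting the expected interval.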
17.
Arneberg R Rajalahti T Flikka K Berven FS Kroksveen AC Berle M Myhr KM Vedeler CA Ulvik RJ Kvalheim OM 《Analytical chemistry》2007,79(18):7014-7026
18.
Desmet G Clicq D Nguyen DT Guillarme D Rudaz S Veuthey JL Vervoort N Torok G Cabooter D Gzil P 《Analytical chemistry》2006,78(7):2150-2162
It is demonstrated that the kinetic plot representation of experimental plate height data can also account for practical constraints on the column length, the peak width, the viscous heating, and the mobile-phase velocity without needing any iterative solution routine. This implies that the best possible kinetic performance to be expected from a given tested support under any set of practical optimization constraints can always be found using a directly responding calculation spreadsheet template. To show how the resulting constrained kinetic plots can be used as a powerful design and selection tool, the method has been applied to a series of plate height measurements performed on a number of different commercial columns for the same component (butyl-parabene) and mobile-phase composition. The method, for example, accounts for the fact that the advantageous solutions displayed by the silica monolith and 5 μm particle columns in the large plate number range of the free kinetic plot are no longer accessible once a maximal column length constraint of Lmax = 30 cm is applied. In the plate number range that remains accessible, the investigated sub-2 μm particle columns perform (at least for the parabene separation considered here) better than the 3.5 μm particle columns or the silica monolith, especially at system pressures exceeding 400 bar. The constrained kinetic plot method can also be used to select the best-suited column length from an available product range to perform a separation with a preset number of plates. One optimization result obtained in this case is that a significant gain in analysis time can sometimes be achieved by selecting a longer column, which yields the desired plate number at a larger velocity than a shorter column would.
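The underlying kinetic-plot transformation converts each measured (velocity, plate height) pair into a (plate count, dead time) point at the maximum allowed pressure. The sketch below uses the commonly published form of this rescaling with assumed symbol names (`dp_max`, `eta`, `Kv`); the constrained variants discussed in the abstract additionally filter points, e.g. keeping only those with N·H ≤ Lmax.

```python
def kinetic_plot(u_H_pairs, dp_max, eta, Kv):
    """Rescale measured (velocity u, plate height H) data to the plate
    count N and dead time t0 reachable at the maximum pressure dp_max.
    eta = mobile-phase viscosity, Kv = column permeability (assumed names).
    """
    pts = []
    for u, H in u_H_pairs:
        N = (dp_max / eta) * (Kv / (u * H))   # plate count at dp_max
        t0 = (dp_max / eta) * (Kv / u ** 2)   # dead time at dp_max
        # A length constraint would drop points with N * H > L_max here.
        pts.append((N, t0))
    return pts
```

Because every measured point maps directly to one (N, t0) point, the whole constrained comparison can indeed live in a directly responding spreadsheet, as the abstract states.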
19.
Statistical heterospectroscopy, an approach to the integrated analysis of NMR and UPLC-MS data sets: application in metabonomic toxicology studies
Crockford DJ Holmes E Lindon JC Plumb RS Zirah S Bruce SJ Rainville P Stumpf CL Nicholson JK 《Analytical chemistry》2006,78(2):363-371
Statistical heterospectroscopy (SHY) is a new statistical paradigm for the coanalysis of multispectroscopic data sets acquired on multiple samples. The method operates through the analysis of the intrinsic covariance between signal intensities in the same and related molecules measured by different techniques across cohorts of samples. The potential of SHY is illustrated using both 600-MHz 1H NMR and UPLC-TOFMS data obtained from control rat urine samples (n = 54) and from a corresponding hydrazine-treated group (n = 58). We show that direct cross-correlation of spectral parameters, viz. chemical shifts from NMR and m/z data from MS, is readily achievable for a variety of metabolites, which improves the efficiency of molecular biomarker identification. Beyond structural information, higher-level biological information on metabolic pathway activity and connectivity can be obtained by examining different levels of the NMR-to-MS correlation and anticorrelation matrices. The SHY approach is generally applicable to complex mixture analysis whenever two or more independent spectroscopic data sets are available for a sample cohort. The biological applications of SHY demonstrated here show promise as a new systems biology tool for biomarker recovery.
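The covariance computation at the core of SHY amounts to correlating every NMR variable with every MS variable across the common sample cohort. A minimal NumPy sketch, assuming rows are samples and no variable has constant intensity (which would divide by zero):

```python
import numpy as np

def shy_correlation(X_nmr, X_ms):
    """Pearson correlation of every NMR variable with every MS variable
    across a shared sample cohort (rows = samples). Minimal SHY-style
    cross-correlation sketch."""
    A = X_nmr - X_nmr.mean(axis=0)
    B = X_ms - X_ms.mean(axis=0)
    A /= np.linalg.norm(A, axis=0)
    B /= np.linalg.norm(B, axis=0)
    return A.T @ B   # shape: (n_nmr_vars, n_ms_vars)
```

Entries near +1 link an NMR chemical shift and an MS m/z value that rise and fall together across samples, i.e. likely the same or a related molecule; anticorrelated entries are the pathway-level connectivities the abstract mentions.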