Similar Documents
20 similar documents found (search time: 31 ms).
1.
Eto, Hajime. Scientometrics, 2000, 47(1): 25-42
Authorship and citation patterns in major journals in operational research (OR) are analysed. As a forerunner of the interdisciplinary specialities that apply mathematical or quantitative methods to social problems, OR has recently faced severe competition from new challengers with respect to applicable methods and real-world implementation. Through these analyses of authorship and citation patterns, this paper discusses the behaviour of journal editors and contributors with regard to this competition and to the reform policy of OR journals.

2.
Over the last decades, quantitative research based on Operations Research and Management Science (OR/MS) approaches has become one of the leading research paradigms in marketing. The aim of this article is to give the reader of this special issue an overview of recent publications in OR/MS-based marketing research. Its basis is a literature review of quantitative marketing publications with an OR/MS orientation in the leading journals of marketing and management. The review reveals general differences in the quantity of publications and the domain of research between journals published in English and those published in German. It also provides an overview of research publications from recent years and points to possible future trends in quantitative marketing research.

3.
Ligand binding assays (LBAs) are widely used for therapeutic monoclonal antibody (mAb) quantification in biological samples. Major limitations are long method development times, reagent procurement, and matrix effects. LC-MS/MS methods using signature peptides are emerging as an alternative approach, which typically use a stable isotope labeled signature peptide as the internal standard (IS). However, a new IS has to be generated for every candidate, and the IS may not correct for variations at all processing steps. We have developed a general LC-MS/MS method approach employing a uniformly heavy-isotope labeled common whole mAb IS and a common immunocapture for sample processing. The method was streamlined with automation for consistency and throughput. Method qualification of four IgG(2) and four IgG(1) mAbs showed sensitivity of 0.1 μg/mL and linearity of 0.1-15 μg/mL. Quality control (QC) data of these eight mAbs were accurate and precise. The QC performance of the whole molecule labeled IS was better than those of synthetic labeled IS peptides tested. The pharmacokinetic results of two mAbs (an IgG(2) and IgG(1) candidate) dosed in rats were comparable to those of LBA. The general LC-MS/MS method approach overcomes the limitations of current methods to reduce time and resources required for preclinical studies.

4.
A novel approach for on-line introduction of an internal standard (IS) for quantitative analysis using LC-MS/MS has been developed. In this approach, analyte and IS are introduced into the sample injection loop in separate steps. The analyte is introduced into the injection loop using a conventional autosampler (injector) needle pickup from a sample vial. The IS is introduced into the injection loop on-line from a microreservoir containing the IS solution, using the autosampler. As a result, both analyte and IS are contained in the sample loop prior to injection onto the column. The methodology reliably introduced the IS and demonstrated injection accuracy and precision comparable to those obtained with off-line IS introduction (i.e., IS and analyte premixed before injection) while maintaining chromatographic parameters (i.e., analyte and IS elution time and peak width). This new technique was applied to the direct analysis of model compounds in rat plasma using on-line solid-phase extraction (SPE) LC-MS/MS quantification. In combination with on-line SPE, the IS serves as a surrogate IS and compensates for signal variations attributed to sample preparation and instrumentation factors, including signal suppression. The assays yielded accuracy (85-119%), precision (2-16%), and analyte recovery comparable to those obtained using off-line IS introduction. Furthermore, on-line IS introduction allows for nonvolumetric sample (plasma) collection and direct analysis without the need to measure and aliquot a fixed sample volume prior to the on-line SPE LC-MS/MS analysis. This methodology therefore enables direct sample (plasma) analysis without any sample manipulation or preparation.

5.
Various mathematical programming models are applied to evaluate operational (static, one-period) and dynamic investment (multi-period) policies of regional solid waste management (SWM). Special attention is paid to the use of mixed-integer programming (MIP) models for dynamic investment policies.

The common objective of minimizing the present value of overall investment and/or management costs is extended to deal explicitly with land-use policies by evaluating the sequencing of landfilling operations. The question of abandonment or upgrading of facilities is introduced into the mathematical framework because of its importance in the light of more restrictive standards. The decision as to which process should be installed at which location and at what time, as well as the decision as to which landfill should be operated and when, is captured in the MIP model.

Considering the limitations of the database and the model formalism, the question of the usefulness of relying on a single optimal solution from a model is discussed. The insensitivity of the ordinary present-value criterion is displayed by analyzing substantially different policies whose criterion-function values differ only slightly. Attempts at introducing the notion of regret into policy selection are presented in the context of Paretian environmental analysis.
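The siting decisions described above can be sketched as a toy facility-location problem. The facility names and costs below are invented for illustration, and a brute-force enumeration of the binary open/close decisions stands in for a real MIP solver:

```python
from itertools import combinations

# Hypothetical data: fixed cost of opening each facility and the cost of
# hauling each region's waste to each facility (all figures invented).
fixed_cost = {"A": 100, "B": 80}
haul_cost = {("r1", "A"): 10, ("r1", "B"): 40,
             ("r2", "A"): 35, ("r2", "B"): 15}
regions = ["r1", "r2"]

def total_cost(open_set):
    """Fixed costs of the opened facilities plus cheapest feasible hauling."""
    if not open_set:
        return float("inf")
    cost = sum(fixed_cost[f] for f in open_set)
    for r in regions:
        cost += min(haul_cost[(r, f)] for f in open_set)
    return cost

# Enumerate every nonempty subset of candidate facilities.
best = min((frozenset(s) for k in range(1, len(fixed_cost) + 1)
            for s in combinations(fixed_cost, k)), key=total_cost)
```

At toy scale the enumeration is fine; the papers discussed above hand these same binary decisions, plus timing and sequencing variables, to a MIP solver.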

6.
In the last few decades, research into the weld characteristic index of materials has advanced rapidly alongside the development of novel materials. At present, evaluating the optimum combination of input factors for a material still involves ill-defined (experimental) information, which calls for new mathematical models and robustly designed decision support systems that can handle it effectively. In the present work, a multiobjective optimization problem in the metal inert gas (MIG) welding process is solved for MS plate (Grade: IS 2062) specimens. The goal is to find the optimum setting of the input factors (welding current, open-circuit voltage, and plate thickness) with respect to weld strength and bead-geometry quality characteristics (tensile strength, bead width, reinforcement, penetration, and dilution). Taguchi's L9 orthogonal array (OA) design was used to conduct the experiments on the MS plate (Grade: IS 2062) specimens. The multiple objectives were then transformed into a single response via grey relational analysis (GRA) and principal component analysis (PCA) to determine the optimum setting of the input factors. The signal-to-noise ratio (S/N ratio) and analysis of variance (ANOVA) were then used to determine the priority weights of the input factors and to identify the significant ones. The main contribution of this report is a robustly designed decision support system that can assist readers and researchers in solving such problems.
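The larger-the-better S/N ratio used in Taguchi analysis of responses such as tensile strength is a standard formula and can be computed directly; the replicate values below are hypothetical:

```python
import math

def sn_larger_is_better(ys):
    """Taguchi larger-the-better S/N ratio (dB):
    S/N = -10 * log10( (1/n) * sum(1/y_i^2) )."""
    return -10 * math.log10(sum(1 / y**2 for y in ys) / len(ys))

# Hypothetical tensile-strength replicates (MPa) for one L9 trial:
print(round(sn_larger_is_better([400.0, 410.0, 395.0]), 2))  # -> 52.07
```

Each of the nine OA trials yields one such S/N value, and ANOVA on those values apportions the factor contributions.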

7.
Abstract

Engineering management (EM) and management of technology (MOT) M.S. degree programs throughout the U.S. are entrusted with educating future leaders of industry and technology. However, little research has been conducted on the nature of the methods and tactics taught.

This research used the six schools of management thought developed by Harold Koontz (1961) to classify various EM and MOT programs. Combining similar schools of management thought reduced them to three: management process/empirical, human behavior/social systems, and mathematical/decision theory. This study indicates heavy use of the mathematical/decision theory and management process/empirical schools by most EM and MOT programs.

8.
Information systems permeate every business function, thereby requiring holistic Information Systems (IS) approaches. Much academic research is still discipline specific. More interdisciplinary research is needed to inform both industry and academe. Interdisciplinary research has been positively associated with increased levels of innovation, productivity and impact. IS research contributes to the knowledge creation and innovation within IS and other College of Business (COB) disciplines. This research defines the intellectual structures within IS and between IS and other COB disciplines. We use a large scale, diachronic bibliometric analysis of COB journals to assess reciprocal knowledge exchange and also to identify potential intra- and interdisciplinary publication outlets. Our findings show an increase in IS knowledge contributions to other COB disciplines, which supports the discussion that IS is a reference discipline. Our research also visually depicts the intellectual structures within IS and between IS and other COB disciplines. Anyone exploring research in IS and allied COB disciplines can peruse the proximity maps to identify groups of similar journals. The findings from this research inform decisions related to which journals to read, target as publication outlets, and include on promotion and tenure lists.

9.
A mathematical model for the growth of two coupled mathematical specialties, differential geometry and topology, is analyzed. The key variable is the number of theorems in use in each specialty. Obsolescence of theorems-in-use due to replacement by more general theorems introduces non-linear terms into the differential equations. The stability of stationary solutions is investigated. The phase portrait shows that the number of theorems in low-dimensional topology is increasing relative to those in differential geometry. The model is qualitatively consistent with the growth of publications in these two specialties, but does not give quantitative predictions, partly because we do not use an explicit solution as a function of time and partly because only two specialties are used. The methods of analysis and some of the concepts can be extended to the development of more general and realistic models for the growth of specialties. Supported in part by grant IST 78-16629. The authors would like to express their thanks to Derek de Solla Price for very valuable comments that helped us to improve this paper.
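The abstract does not reproduce the paper's actual equations; a minimal sketch of a coupled model of this general shape, with purely illustrative symbols, would be:

```latex
% Illustrative sketch only -- not the authors' actual system.
% x(t), y(t): theorems in use in differential geometry and topology.
\begin{aligned}
\dot{x} &= a_1 x - b_1\, x y, \\
\dot{y} &= a_2 y - b_2\, x y .
\end{aligned}
```

The linear terms model production of new theorems, while the bilinear terms are the non-linear obsolescence coupling the abstract mentions: results in one specialty being replaced by more general theorems from the other. Stationary solutions and their stability can then be examined via the Jacobian at the fixed points.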

10.
One contribution of this paper is an efficient algorithm for deciding membership in a subgroup H of an Abelian group G when G and H are in a special form. Our approach is particularly fast because calculations are postponed until needed, and because some decisions can be made based on the existence of certain objects without actually calculating them. This mathematical problem arises naturally in machine learning, and is particularly relevant to concept modeling. We use genetic algorithms as a nontrivial example of how concept formation may correspond to subgroup formation and to illustrate that analysis may reveal abstract concepts determined by group membership which are not initially apparent. This example forms another contribution of this paper; genetic search has not previously been framed within an algebraic context. This research was supported by the National Science Foundation (IRI-8917545).
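As a toy illustration of deciding membership without enumerating the subgroup (the paper's algorithm handles a more general special form; this sketch is not it): in the cyclic group Z_n, the subgroup generated by g consists exactly of the multiples of gcd(g, n), so membership reduces to a single divisibility test rather than a search:

```python
from math import gcd

def in_cyclic_subgroup(x, g, n):
    """Is x in the subgroup of Z_n generated by g?
    <g> = <gcd(g, n)>, so membership is one divisibility check -- the
    subgroup's elements are never materialized."""
    d = gcd(g, n)
    return (x % n) % d == 0
```

For example, in Z_10 the subgroup generated by 4 is {0, 2, 4, 6, 8}, so 6 is a member and 5 is not, and the test settles this without listing those elements.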

11.
Ischemic stroke (IS) is one of the major causes of death and disability worldwide. However, the specific mechanisms of gene interplay and biological function in IS are not clear, so further research into IS is necessary. Dataset GSE110993, comprising 20 ischemic stroke (IS) and 20 control specimens, was used to establish the two groups, and the raw RNA-seq data were analysed. Weighted gene co-expression network analysis (WGCNA) was used to screen the key micro-RNA modules. The centrality of key genes was determined by module membership (MM) and gene significance (GS). Key pathways were identified by enrichment analysis with the Kyoto Encyclopedia of Genes and Genomes (KEGG), and key genes were validated against a protein-protein interaction network. Results: 1185 up- and down-regulated genes were gathered and distributed into three modules according to their degree of correlation with clinical traits of IS; among these, the turquoise module showed a trait correlation of 0.77. The top 140 genes were further identified by GS and MM. KEGG analysis showed that two pathways may be involved in the progression of IS. Discussion: CXCL12 and EIF2a may be important biomarkers for accurate diagnosis and treatment of IS.

12.
Faced with the challenges associated with sustainably feeding the world’s growing population, the food industry is increasingly relying on operations research (OR) techniques to achieve economic, environmental and social sustainability. It is therefore important to understand the context-specific model-oriented applications of OR techniques in the sustainable food supply chain (SFSC) domain. While existing food supply chain reviews provide an excellent basis for this process, the explicit consideration of sustainability from a model-oriented perspective along with a structured outline of relevant SFSC research techniques are missing in extant literature. We attempt to fill this gap by reviewing 83 related scientific journal publications that utilise mathematical modelling techniques to address issues in SFSC. To this end, we first identify the salient dimensions that include economic, environmental and social issues in SFSC. We then review the models and methods that use these dimensions to solve issues that arise in SFSC. We identify some of the main challenges in analytical modelling of SFSC as well as future research directions.

13.
Ju S, Yeo WS. Nanotechnology, 2012, 23(13): 135701
Protein-coated nanoparticles have been used in many studies, including those related to drug delivery, disease diagnosis, therapeutics, and bioassays. The number and density of proteins on the particles' surface are important parameters that need to be calculable in most applications. While quantification methods for two-dimensional surface-bound proteins are commonly found, only a few methods for the quantification of proteins on three-dimensional surfaces such as nanoparticles have been reported. In this paper, we report on a new method of quantifying proteins on nanoparticles using matrix assisted laser desorption/ionization time of flight (MALDI-TOF) mass spectrometry (MS). In this method, the nanoparticle-bound proteins are digested by trypsin and the resulting peptide fragments are analyzed by MALDI-TOF MS after the addition of an isotope-labeled internal standard (IS) which has the same sequence as a reference peptide of the surface-bound protein. Comparing the mass intensities between the reference peptide and the IS allows the absolute quantification of proteins on nanoparticles, because they have the same molecular milieu. As a model system, gold nanoparticles were examined using bovine serum albumin (BSA) as a coating protein. We believe that our strategy will be a useful tool that can provide researchers with quantitative information about the proteins on surfaces of three-dimensional materials.
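The intensity-ratio comparison described above is the core of isotope-dilution quantification: the analyte amount is the known spiked IS amount scaled by the observed peak-intensity ratio. A minimal sketch, with invented intensities and IS amount:

```python
# Isotope-dilution quantification: reference peptide and labeled IS share a
# sequence (and ionization behavior), so amounts scale with peak intensities.
def peptide_amount(ref_intensity, is_intensity, is_amount_pmol):
    """Absolute analyte amount (pmol) from the reference/IS intensity ratio."""
    return is_amount_pmol * ref_intensity / is_intensity

# Hypothetical spectrum: reference peak twice as intense as 5 pmol of IS.
print(peptide_amount(2.0e4, 1.0e4, 5.0))  # -> 10.0
```

Dividing that amount by the number of particles in the digest would then give the proteins-per-nanoparticle figure the paper is after.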

14.
High-throughput solid phase extraction tandem mass spectrometry (HT-SPE/MS) is a fully automated system that integrates sample preparation using ultrafast online solid phase extraction (SPE) with mass spectrometry detection. HT-SPE/MS is capable of conducting analysis at a speed of 5-10 s per sample, which is several fold faster than chromatographically based liquid chromatography-mass spectrometry (LC-MS). Its existing applications mostly involve in vitro studies such as high-throughput therapeutic target screening, CYP450 inhibition, and transporter evaluations. In the current work, the feasibility of utilizing HT-SPE/MS for analysis of in vivo preclinical and clinical samples was evaluated for the first time. Critical bioanalytical parameters, such as ionization suppression and carry-over, were systematically investigated for structurally diverse compounds using generic SPE operating conditions. Quantitation data obtained from HT-SPE/MS were compared with those from LC-MS analysis to evaluate its performance. Ionization suppression was prevalent for the test compounds, but it could be effectively managed by using a stable isotope labeled internal standard (IS). A structural analogue IS also generated data comparable to the LC-MS system for a test compound, indicating matrix effects were also compensated for to some extent. Carry-over was found to be minimal for some compounds and variable for others and could generally be overcome by inserting matrix blanks without sacrificing assay efficiency, owing to the ultrafast analysis speed. Quantitation data for test compounds obtained from HT-SPE/MS were found to correlate well with those from conventional LC-MS. Comparable accuracy, precision, linearity, and sensitivity were achieved with analysis speeds 20-30-fold higher. The presence of a stable metabolite in the samples showed no impact on parent quantitation for a test compound. In comparison, labile metabolites could potentially cause overestimation of the parent concentration if the ion source conditions are not optimized to minimize in-source breakdown. However, with the use of conditions that minimized in-source conversion, accurate measurement of the parent was achieved. Overall, HT-SPE/MS exhibited significant potential for high-throughput in vivo bioanalysis.

15.
Abstract

The major objective of in vitro–in vivo correlations is to be able to use in vitro data to predict in vivo performance, serving as a surrogate for an in vivo bioavailability test and supporting biowaivers. The aims of this review are therefore: (i) to clarify the factors involved in bio-predictive dissolution method development; and (ii) to examine the elements that may affect the mathematical analysis, in order to exploit all available information. This article covers the basic aspects of dissolution media and apparatus used in the development of in vivo predictive dissolution methods, including the latest proposals in this field, as well as a summary of the mathematical methods for establishing the in vitro–in vivo relationship and their scope and limitations. Incorporating physiologically relevant factors into the in vitro dissolution method is essential for accurate in vivo predictions. Standard quality-control dissolution methods do not necessarily reflect in vivo behavior, so they are rarely useful for predicting in vivo performance. The combination of physiologically based dissolution methods with physiologically based pharmacokinetic models incorporating gastrointestinal variables will lead to robust tools for drug and formulation development; nevertheless, their regulatory use for biowaiver applications still requires harmonization of the proposed mathematical methods and more detailed recommendations on the procedures for setting dissolution specifications.

16.
Although LC-MS methods are increasingly used for the absolute quantification of proteins, the lack of an appropriate internal standard (IS) hinders the development of rapid and standardized analytical methods for both in vitro and in vivo studies. Here, we have developed a novel method for the absolute quantification of a therapeutic protein, a monoclonal antibody (mAb). The method combines liquid chromatography tandem mass spectrometry (LC-MS/MS) and protein cleavage isotope dilution mass spectrometry, with an isotope-labeled mAb as IS. The latter was identical to the analyzed mAb except that each threonine contains four 13C atoms and one 15N atom. Serum samples were spiked with IS prior to overnight trypsin digestion and subsequent sample cleanup. Sample extracts were analyzed on a C18 ACE column (150 mm x 4.6 mm) using an LC gradient time of 11 min. Endogenous mAb concentrations were determined by calculating the peak height ratio of the signature peptide to the corresponding isotope-labeled peptide. The linear dynamic range was established between 5.00 and 1000 microg/mL mAb, with accuracy and precision within +/-15% at all concentrations and within +/-20% at the LLOQ (lower limit of quantification). The overall method recovery in terms of mAb was 14%. The losses due to sample preparation (digestion and purification) were 72%, of which about 32% was due to the first step of the method, the sample digestion. This large loss during sample preparation strongly emphasizes the necessity of employing an IS right from the beginning. Our method was successfully applied to mAb quantification in marmoset serum study samples, and the precision obtained on duplicate samples was, in most cases, below 20%. Comparison with an enzyme-linked immunosorbent assay (ELISA) showed higher exposure in terms of AUC and Cmax with the LC-MS/MS method; possible reasons for this discrepancy are discussed in this study. The results indicate that our LC-MS/MS method is a simple, rapid, and precise approach to therapeutic mAb quantification in support of preclinical and clinical studies.

17.
The purpose of this research is to furnish the OR/MS research community with an updated assessment of the discipline’s journal set, with refinements that also highlight the various characteristics of OR/MS journals. More specifically, we apply a refined PageRank method initially proposed by Xu et al. (2011) to evaluate the top 31 OR/MS journals for 2010, and report our findings. We also report the shifts in the rankings over the 5 years from 2006 to 2010. We observe that Manufacturing and Service Operations Management, indexed by the SCI only in 2008, is a specialized journal that is consistently highly regarded within the discipline. The rankings also suggest that Management Science is more established as a generalized journal, as it has more external impact. In general, our ranking results correlate with expert opinions, and we also observe, report and discuss some interesting patterns that have emerged over the 5 years from 2006 to 2010.

18.
A novel technique is presented that addresses the issue of how to apply an internal standard (IS) to dried matrix spot (DMS) samples in a way that allows the IS to integrate with the sample prior to extraction. The TouchSpray, a piezoelectric spray system from The Technology Partnership (TTP), was used to apply methanol containing IS to dried blood spot (DBS) samples. This method of IS application is shown to have the potential to work in practice for the quantitative determination of circulating exposures of pharmaceuticals in toxicokinetic and pharmacokinetic studies. Three methods of IS application were compared: addition of IS to control blood prior to DBS sample preparation (control 1), incorporation into the extraction solvent (control 2), and the novel use of TouchSpray technology (test). No significant difference in accuracy and precision was found among the three techniques, using either manual extraction or direct elution.

19.
A mathematical treatment is given for the family of scientometric laws (usually referred to as the Zipf-Pareto law) that have been described by Price and do not conform to the usual Gaussian view of empirical distributions. An analysis of the Zipf-Pareto law in relation to stable non-Gaussian distributions reveals, in particular, that the truncated Cauchy distribution asymptotically coincides with Lotka's law, the best-known frequency form of the Zipf-Pareto law. The mathematical theory of stable non-Gaussian distributions, as applied to the analysis of the Zipf-Pareto law, leads to several conclusions on the mechanism of their genesis, the specific methods of processing empirical data, etc. The use of non-Gaussian processes in scientometric models suggests that this approach may result in a general mathematical theory describing the distribution of science-related variables.
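The asymptotic coincidence mentioned above can be made concrete. Lotka's law gives the number of authors producing n papers as an inverse power, classically with exponent 2, and the Cauchy density has exactly that tail exponent:

```latex
% Lotka's law (classical exponent \alpha = 2):
f(n) = \frac{C}{n^{\alpha}}, \qquad \alpha \approx 2,
% Cauchy density, whose tail decays with the same exponent:
p(x) = \frac{1}{\pi\,(1 + x^{2})} \sim x^{-2} \quad (x \to \infty).
```

Truncating the Cauchy density to positive values therefore reproduces Lotka's frequency form in the tail, which is the correspondence the abstract draws on.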

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司). ICP licence: 京ICP备09084417号