Similar Literature
20 similar records found.
1.
Abstract

Current methods of data representation for electron backscatter diffraction (EBSD) measurements are reviewed. Obtaining diffraction data from microstructures using EBSD has become a relatively straightforward procedure, and EBSD software packages are used to represent these data as qualitative statistics in the form of ideal orientations, pole figures, inverse pole figures, Euler space, and Rodrigues–Frank space. Quantitative statistics in the form of secondary computations allow full microtextural analysis. Additionally, the power of EBSD is demonstrated through positional information representation. Through experimental examples, the conversion of EBSD data to statistical information to facilitate interpretation of results is demonstrated.

MST/3678

2.
This paper scrutinizes the attempts of 19th-century Belgian and Dutch physician-editors to convey foreign knowledge in new ways. By analysing their editorial practices and strategies—including the duplication (reprinting in full), reduction, modification, and translation of texts—it shows that the introduction of periodical publishing in medicine involved experimentation not only with the style of professional debate, but also with the format of medical knowledge. My analysis of reprinting reveals how editors, in cooperation with publishers, succeeded in broadening the readership of scientific medical texts by including private practitioners. The model for these publishing experiments was not so much the (polemical) newspaper, as was the case in the natural sciences, but rather the encyclopaedia and the handbook. If science was to interest practicing doctors, medical journals had to present scientific texts in a useful and easily consultable form. The core idea of the “reprint journal” was to continuously assemble and render insightful a growing body of international knowledge. This exercise, however, was fraught with tensions: reprint journals were reproached for not having a sufficiently international selection of articles, but at the same time for not being sufficiently “national.” As the framework of the nation-state gained strength around 1850 and an awareness of authors' rights took a stronger hold, reprinting in medical journals lost its popularity.

3.
Sensitive, high-resolution chromatography-driven metabonomics studies have experienced major growth with the aid of new analytical technologies and bioinformatics software packages. Hence, data collection by LC-MS and data analysis by multivariate statistical methods are by far the most straightforward steps, and the detection of biomarker candidates can easily be achieved. However, the unequivocal identification of the detected metabolite candidates, including isomer elucidation, remains a crux of current metabonomics studies. Here we present a comprehensive analytical strategy for elucidating the molecular structure of metabolite biomarkers detected in a metabonomics study, exemplified by analyzing spot urine from a cohort of healthy, insulin-sensitive subjects and clinically well-characterized prediabetic, insulin-resistant individuals. An integrated approach of LC-MS fingerprinting, multivariate statistical analysis, LC-MSn experiments, micropreparation, FTICR-MS, GC retention index determination, database searching, and generation of an isotope-labeled standard was applied. Overall, we demonstrate the efficiency of this analytical approach through the unambiguous elucidation of the molecular structure of an isomeric biomarker candidate detected in a complex human biofluid. The proposed strategy is a powerful new analytical tool that will allow the definite identification of physiologically important molecules in metabonomics studies, from basic biochemistry to clinical biomarker discovery.
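The multivariate step in a workflow like this is often a principal-component projection of the LC-MS feature table, on which sample groups separate. Below is a minimal, illustrative sketch using NumPy's SVD; the intensity matrix and the two sample groups are invented for demonstration and are not the study's data.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto the leading principal components via SVD."""
    Xc = X - X.mean(axis=0)            # mean-centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T    # scores = centred data times loadings

# Illustrative intensity matrix: 6 urine samples x 4 LC-MS features,
# with the last 3 samples elevated in feature 0 (a candidate biomarker).
X = np.array([[1.0, 5.0, 2.0, 3.0],
              [1.1, 5.2, 2.1, 2.9],
              [0.9, 4.8, 1.9, 3.1],
              [3.0, 5.1, 2.0, 3.0],
              [3.2, 4.9, 2.2, 2.8],
              [2.9, 5.0, 1.8, 3.2]])
scores = pca_scores(X)
# The first PC separates the two groups (the sign of a PC is arbitrary).
print(scores[:, 0])
```

In a real study the matrix would have thousands of features, and the loadings of the separating component point back to the biomarker candidates.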

4.
Internet technology is an indispensable tool in scientific research. Prior research confirms the importance of professional activities, professional networks, scientific collaboration and the internet among scientists, academics and researchers. In other words, professional activities, networks and collaboration are relevant epistemic strategies in both the short- and long-term objectives of knowledge production. Variations in these strategies are possible across different categories such as race and gender. Involving academics and scientists (n = 204) from sampled institutions in post-apartheid South Africa, this study examines how the use of technology by people in different racial categories influences their epistemic strategies of professional activities, networks and scientific collaboration.

5.
Summary: We have constructed an original database of the full text of the Japanese Patent Gazette published since 1994. The database includes not only the front page but also the body text of more than 880,000 granted Japanese patents. By reading the full texts of a sample of 1,500 patents, we found that some inventors cite many academic papers, in addition to earlier patents, in the body texts of their Japanese patents. Using the manually extracted academic paper citations and patent citations as "right" answers, we fine-tuned a search algorithm that automatically retrieves cited scientific papers and patents from the entire texts of all the Japanese patents in the database. An academic paper citation in a patent text indicates that the inventor used scientific knowledge from the cited paper when inventing the idea codified in the citing patent. The degree of science linkage, as measured by the number of research papers cited in patent documents, is particularly strong in biotechnology. Among other types of technology, those related to photographic-sensitized materials, cryptography, optical computing, and speech recognition also show strong science linkage. This suggests that the degree of dependence on scientific knowledge differs from technology to technology, and therefore different forms of university-industry collaboration are necessary for different technology fields.
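The retrieval step can be caricatured with a small pattern-matching sketch over a patent body text. The citation format and patent-number patterns below are simplified assumptions invented for illustration, not the authors' fine-tuned algorithm.

```python
import re

# A simplified pattern for one paper-citation style that might appear in a
# patent body text: "Author et al., Journal, vol. N, pp. X-Y (Year)".
PAPER_CITATION = re.compile(
    r'([A-Z][a-z]+ et al\.,\s*[A-Za-z. ]+,\s*vol\.\s*\d+,\s*pp\.\s*\d+-\d+\s*\(\d{4}\))'
)
# Toy patent-number pattern; real gazette numbering is more varied.
PATENT_CITATION = re.compile(r'(JP\s?\d{7}|US\s?\d{7})')

def extract_citations(text):
    """Return (paper citations, patent citations) found in a body text."""
    return PAPER_CITATION.findall(text), PATENT_CITATION.findall(text)

body = ("The sensor follows Tanaka et al., Appl. Phys. Lett., vol. 72, "
        "pp. 1034-1036 (1998), improving on JP 2345678 and US 5678901.")
papers, patents = extract_citations(body)
print(papers)
print(patents)  # ['JP 2345678', 'US 5678901']
```

Counting the paper hits per patent yields exactly the science-linkage measure described above.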

6.
Cardiac modelling is the area of physiome modelling where the available simulation software is perhaps most mature, and it therefore provides an excellent starting point for considering the software requirements of the wider physiome community. In this paper, we begin by introducing some of the most advanced existing software packages for simulating cardiac electrical activity. We consider the software development methods used in producing codes of this type, and discuss their use of numerical algorithms, relative computational efficiency, usability, robustness and extensibility. We then describe a class of software development methodologies known as test-driven agile methods and argue that such methods are more suitable for scientific software development than traditional academic approaches. As a case study we present a project of our own, the Cancer, Heart and Soft Tissue Environment, a library of computational biology software that began as an experiment in the use of agile programming methods. We review our progress thus far, focusing on the advantages and disadvantages of this new approach compared with the development methods used in some existing packages. We conclude by considering whether the likely wider needs of the cardiac modelling community are currently being met, and suggest that, in order to respond effectively to changing requirements, it is essential that these codes become more malleable. Such codes will allow for reliable extensions to include both detailed mathematical models (of the heart and other organs) and more efficient numerical techniques that are currently being developed by many research groups worldwide.
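The test-driven discipline referred to above can be illustrated in miniature: the test encodes the intended behaviour and is written before (or alongside) the implementation. `rc_decay` is a hypothetical helper invented for this sketch, not part of the library discussed in the paper.

```python
import math

def rc_decay(v0, tau, t):
    """Exponential relaxation of a voltage toward 0 with time constant tau."""
    return v0 * math.exp(-t / tau)

def test_rc_decay():
    # The tests pin down the behaviour the implementation must satisfy.
    assert rc_decay(100.0, 1.0, 0.0) == 100.0                # no time elapsed
    assert abs(rc_decay(100.0, 1.0, 1.0) - 36.7879) < 1e-3   # one time constant
    assert rc_decay(100.0, 1.0, 50.0) < 1e-10                # long-time limit

test_rc_decay()
print("all tests pass")
```

In an agile workflow a suite of such tests is run on every change, which is what makes large scientific codes safe to refactor and extend.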

7.
8.
9.
Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling, and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and state-of-the-art model identification strategies where atomistic properties are calibrated by quantum calculations.
We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space–time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements, and packages the information in a form suitable for UQ at various scales, needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed. The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.
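A minimal illustration of the UQ idea is forward Monte Carlo propagation of parameter scatter through a response model. The creep law, parameter values, and uncertainty levels below are invented placeholders standing in for the project's far more elaborate multiscale models.

```python
import math
import random

def creep_strain(stress, temp_K, t_years, A=1e-12, Q=8000.0):
    """Toy steady-state creep law: strain = A * stress * exp(-Q/T) * t."""
    return A * stress * math.exp(-Q / temp_K) * t_years

random.seed(0)
# Propagate 5% (1-sigma) uncertainty in stress and temperature
# through the model over a 20-year mission.
samples = []
for _ in range(10000):
    stress = random.gauss(50e6, 0.05 * 50e6)   # Pa
    temp = random.gauss(350.0, 0.05 * 350.0)   # K
    samples.append(creep_strain(stress, temp, 20.0))

mean = sum(samples) / len(samples)
std = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))
print(mean, std)  # predicted strain and its uncertainty
```

The output is not a single strain value but a distribution, which is exactly the "quantified uncertainty" the prediction of long-term reliability requires.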

10.
Improving performance in terms of delivery reliability is increasingly important for make-to-order (MTO) companies. Detecting improvement opportunities requires a structured diagnosis of the current performance. General problem-solving literature provides structures for diagnosis processes in general, but – depending on the performance problem to be diagnosed – a theoretical framework based on domain-specific scientific knowledge is required. This paper presents a framework for diagnosing delivery reliability performance in MTO companies. The framework consists of a diagnosis tree that structures the diagnosis process, enabling one to navigate from the achieved performance to the underlying causes related to production planning and control (PPC). A theoretical foundation, enabling the possible causes of unreliable deliveries to be structured, is based on recent scientific developments in PPC literature. Three case studies exemplify the use of the framework. The developed framework shows its particular strengths in (1) selecting the right problem areas, (2) providing the right diagnosis instruments, and (3) detecting causes related to PPC decisions. It also supports diagnosis from quantitative data available in standard ERP software packages and enables diagnosis triangulation using qualitative data from the underlying decision processes.
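A diagnosis tree of this kind can be represented as nested question nodes that route from an observed symptom down to a PPC-related cause. The questions and causes below are illustrative stand-ins, not the paper's actual tree.

```python
# Each internal node asks a yes/no question about observed performance;
# leaves name a candidate PPC cause. Labels are invented for illustration.
TREE = {
    "question": "Are orders released to the shop floor on time?",
    "no": {"cause": "order release: pool delays exceed planned slack"},
    "yes": {
        "question": "Is throughput time variance high?",
        "yes": {"cause": "dispatching: sequencing ignores due dates"},
        "no": {"cause": "due-date setting: quoted lead times too tight"},
    },
}

def diagnose(node, answers):
    """Walk the tree using a dict of question -> 'yes'/'no' answers."""
    while "cause" not in node:
        node = node[answers[node["question"]]]
    return node["cause"]

answers = {
    "Are orders released to the shop floor on time?": "yes",
    "Is throughput time variance high?": "yes",
}
print(diagnose(TREE, answers))  # dispatching: sequencing ignores due dates
```

In practice the answers at each node would be derived from quantitative ERP data, which is what lets the tree navigate from performance figures to decision-level causes.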

11.
In electron backscatter diffraction (EBSD) software packages there are many user choices both in data acquisition and in data processing and display. In order to extract maximum scientific value from an inquiry, it is helpful to have some guidelines for best practice in conducting an EBSD investigation. The purpose of this article therefore is to address selected topics of EBSD practice, in a tutorial manner. The topics covered are a brief summary on the principles of EBSD, specimen preparation, calibration of an EBSD system, experiment design, speed of data acquisition, data clean-up, microstructure characterisation (including grain size) and grain boundary characterisation. This list is not meant to cover exhaustively all areas where EBSD is used, but rather to provide a resource consisting of some useful strategies for novice EBSD users.

12.
Abstract

Companies pursuing product leadership continually push products into innovative technology areas and new unknown markets. As companies continue to strive for continuous innovation, often leapfrogging even their own technology, new product development (NPD) processes play an increasingly important role in defining the success or failure of many new innovations. In addition, increased competitive rivalry is driving companies to commercialize their new products much more quickly. To meet these pressures, new strategies are being used to supplement the conventional new product development process that consists of strategy formulation, idea generation, screening and evaluation, development, testing, and launch. The primary objective of each of these innovation strategies is to attain sustainable competitive advantage for the company and achieve higher overall performance.

Our research examined the product and service innovation strategies of six projects, half of which were considered successful and half failures. Using several emerging innovation strategies, including process-driven, speed-to-market, quantitative, market-driven, technology-driven, and learning-driven approaches, to classify these projects, we evaluated the innovation strategies employed in an attempt to determine overall NPD strategy effectiveness. In addition, we attempted to identify relevant critical success factors and associated activities in order to construct an ideal innovation strategy model.

In the projects we studied, we found that no single strategy leads to successful innovation. While evaluating areas of uncertainty that impact project success, we determined that a new dimension, process uncertainty, plays as important a role as the market and technical uncertainty previously examined in the emerging scholarship. Furthermore, the insights gained by comparing the different innovation strategies led us to formulate the technical-market-process (TMP) uncertainty model. The TMP model serves as a predictor for identifying the appropriate innovation strategies that can be brought together to drive project success. We conclude that the combination of identifying the appropriate innovation strategies and proficiently executing them is the key to successful new product development.

13.
Bibliometric mapping of scientific articles based on keywords and technical terms in abstracts is now frequently used to chart scientific fields. In contrast, no significant mapping has been applied to the full texts of non-specialist documents. Editorials in Nature and Science are such non-specialist documents, reflecting the views of the two most read scientific journals on science, technology and policy issues. We use the VOSviewer mapping software to chart the topics of these editorials. A term map and a document map are constructed and clusters are distinguished in both of them. The validity of the document clustering is verified by a manual analysis of a sample of the editorials. This analysis confirms the homogeneity of the clusters obtained by mapping and augments the latter with further detail. As a result, the analysis provides reliable information on the distribution of the editorials over topics, and on differences between the journals. The most striking difference is that Nature devotes more attention to internal science policy issues and Science more to the political influence of scientists. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s11192-010-0205-9) contains supplementary material, which is available to authorized users.
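The raw input to a term map of this kind is a term co-occurrence matrix over the document set: terms that appear in the same documents are pulled together in the map. A toy sketch, with invented editorial term lists standing in for the Nature and Science corpora:

```python
from collections import Counter
from itertools import combinations

# Toy "editorials": bags of noun-phrase terms, invented for illustration.
editorials = [
    ["funding", "peer review", "funding"],
    ["funding", "peer review"],
    ["climate", "policy", "climate"],
    ["climate", "policy"],
]

# Count how often each pair of distinct terms co-occurs in a document;
# these counts are what a VOSviewer-style layout is computed from.
cooc = Counter()
for terms in editorials:
    for a, b in combinations(sorted(set(terms)), 2):
        cooc[(a, b)] += 1

for pair, n in sorted(cooc.items()):
    print(pair, n)
```

Here the two strong pairs already form the two "clusters" a mapping tool would display; on real full texts the matrix has thousands of terms and the clustering is done numerically.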

14.
Cagliero, Luca; Garza, Paolo; Kavoosifar, Mohammad Reza; Baralis, Elena. Scientometrics 2018, 116(2): 1273–1301.

Identifying the most relevant scientific publications on a given topic is a well-known research problem. The Author-Topic Model (ATM) is a generative model that represents the relationships between research topics and publication authors. It allows us to identify the most influential authors on a particular topic. However, since most research works are co-authored by many researchers, the information provided by ATM can be complemented by the study of the most fruitful collaborations among multiple authors. This paper addresses the discovery of research collaborations among multiple authors on single or multiple topics. Specifically, it exploits an exploratory data mining technique, i.e., weighted association rule mining, to analyze publication data and to discover correlations between ATM topics and combinations of authors. The mined rules characterize groups of researchers with fairly high scientific productivity by indicating (1) the research topics covered by their most cited publications and the relevance of their scientific production separately for each topic, (2) the nature of the collaboration (topic-specific or cross-topic), (3) the names of the external authors who have (occasionally) collaborated with the group either on a specific topic or on multiple topics, and (4) the underlying correlations between the addressed topics. The applicability of the proposed approach was validated on real data acquired from the Online Mendelian Inheritance in Man catalog of genetic disorders and from the PubMed digital library. The results confirm the effectiveness of the proposed strategy.
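The rule-mining step can be sketched as mining author-pair-to-topic rules under support and confidence thresholds; the paper's weighting scheme is omitted here, and the authors, topics, and thresholds below are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Toy publication records: each is the set of author IDs plus an ATM
# topic label. All names and topics are invented.
transactions = [
    {"A", "B", "topic:genetics"},
    {"A", "B", "topic:genetics"},
    {"A", "C", "topic:genetics"},
    {"A", "B", "topic:text-mining"},
    {"B", "C", "topic:text-mining"},
]

def rules(transactions, min_support=0.4, min_confidence=0.6):
    """Mine (author pair) -> topic rules passing both thresholds."""
    n = len(transactions)
    pairs, joint, out = Counter(), Counter(), []
    for t in transactions:
        authors = sorted(a for a in t if not a.startswith("topic:"))
        topics = [a for a in t if a.startswith("topic:")]
        for pair in combinations(authors, 2):
            pairs[pair] += 1
            for topic in topics:
                joint[(pair, topic)] += 1
    for (pair, topic), k in joint.items():
        support, confidence = k / n, k / pairs[pair]
        if support >= min_support and confidence >= min_confidence:
            out.append((pair, topic, support, confidence))
    return out

for pair, topic, s, c in rules(transactions):
    print(pair, "->", topic, f"support={s:.2f} confidence={c:.2f}")
```

On this toy data only the rule (A, B) -> genetics survives both thresholds, i.e. the pair A-B is characterized as a topic-specific collaboration.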


15.
This paper reports on the practices of bioinformatics research in South Africa using bibliometric techniques. The search strategy was designed to cover the common concepts in biological data organisation, retrieval and analysis; the development and application of tools and methodologies in biological computation; and related subjects in genomics and structural bioinformatics. The South African literature in bioinformatics grew by 66.5% between 2001 and 2006. However, its share of world production is not on par with that of the comparator countries Brazil, India and Australia.

16.
We present results of a benchmark test evaluating the resource allocation capabilities of the project management software packages Acos Plus.1 8.2, CA SuperProject 5.0a, CS Project Professional 3.0, MS Project 2000, and Scitor Project Scheduler 8.0.1. The tests are based on 1560 instances of precedence- and resource-constrained project scheduling problems. For different complexity scenarios, we analyze the deviation of the makespan obtained by the software packages from the best feasible makespan known. Among the tested software packages, Acos Plus.1 and Project Scheduler show the best resource allocation performance. Moreover, our numerical analysis reveals a considerable performance gap between the implemented methods and state-of-the-art project scheduling algorithms, especially for large-sized problems. Thus, there is still significant potential for improving solutions to resource allocation problems in practice.
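The evaluation metric in such a benchmark is the relative deviation of each package's makespan from the best feasible makespan known for the instance. A sketch with invented numbers, not the benchmark's actual results:

```python
def makespan_deviation(package_makespan, best_known):
    """Relative deviation (%) of a schedule's makespan from the best known."""
    return 100.0 * (package_makespan - best_known) / best_known

# Illustrative makespans for one scheduling instance (invented values).
best = 120
results = {"PackageA": 126, "PackageB": 132, "PackageC": 120}
for name, ms in sorted(results.items()):
    print(f"{name}: +{makespan_deviation(ms, best):.1f}%")
```

Averaging this deviation over all 1560 instances, per complexity scenario, gives the comparison figures the study reports.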

17.
Summary: This study demonstrates that the choice of search strategy for article identification has an impact on evaluation and policy analysis of research areas. We have assessed the scientific production in two areas at one research institution during a ten-year period. We explore the recall and precision of three article identification strategies: journal classifications, keywords and authors. Our results show that the different search strategies have varying recall (0.38–1.00) and precision (0.50–1.00). In conclusion, uncritical analysis based on rudimentary article identification strategies may lead to misinterpretation of the development of research areas, and thus provide incorrect data for decision-making.
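Recall and precision for an article-identification strategy are computed against a gold-standard set of the institution's articles in the research area. A minimal sketch with invented article IDs:

```python
def recall_precision(retrieved, relevant):
    """Recall and precision of a search strategy against a gold-standard set."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    recall = len(hits) / len(relevant)        # share of the area found
    precision = len(hits) / len(retrieved)    # share of results that belong
    return recall, precision

# Invented article IDs for illustration.
relevant = {1, 2, 3, 4, 5, 6, 7, 8}   # true output of the research area
by_keywords = {1, 2, 3, 9, 10}        # keyword-based strategy
by_authors = {1, 2, 3, 4, 5, 6}       # author-based strategy

print(recall_precision(by_keywords, relevant))  # (0.375, 0.6)
print(recall_precision(by_authors, relevant))   # (0.75, 1.0)
```

A strategy with high precision but low recall (the keyword strategy here) understates an area's output, which is exactly the misinterpretation risk the study warns about.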

18.
Olmi, L.; Bolli, P. Applied Optics 2007, 46(19): 4092–4101.
The performance of telescope systems working at microwave or visible-IR wavelengths is typically described in terms of different parameters according to the wavelength range. Most commercial ray-tracing packages have been specifically designed for use with visible-IR systems and thus, though very flexible and sophisticated, do not provide the appropriate parameters to fully describe microwave antennas and to compare with specifications. We demonstrate that the Strehl ratio is equal to the phase efficiency when the apodization factor is taken into account. The phase efficiency is the most critical contribution to the aperture efficiency of an antenna and the most difficult parameter to optimize during telescope design. The equivalence between the Strehl ratio and the phase efficiency gives the designer or user of the telescope the opportunity to use faster commercial ray-tracing software to optimize the design. We also discuss the results of several tests performed to check the validity of this relationship, carried out using the ray-tracing software ZEMAX and the full physical-optics software GRASP9.3, applied to three different telescope designs that span a factor of approximately 10 in D/lambda. The maximum measured discrepancy between phase efficiency and Strehl ratio varies between approximately 0.4% and 1.9% up to an offset angle of >40 beams, depending on the optical configuration, but it is always less than 0.5% where the Strehl ratio is >0.95.
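For a uniformly illuminated aperture (no apodization), the on-axis Strehl ratio is the squared magnitude of the mean unit phasor over the aperture, which a Monte Carlo sketch can compare against the standard Marechal approximation exp(-sigma^2). The phase-error statistics below are invented, and the apodization correction discussed in the paper is ignored.

```python
import cmath
import math
import random

def strehl_ratio(phase_errors_rad):
    """On-axis Strehl ratio of an aperture with the given phase errors,
    assuming uniform illumination (no apodization)."""
    n = len(phase_errors_rad)
    mean_phasor = sum(cmath.exp(1j * p) for p in phase_errors_rad) / n
    return abs(mean_phasor) ** 2

random.seed(1)
# Small Gaussian phase errors sampled across the aperture (illustrative).
phases = [random.gauss(0.0, 0.22) for _ in range(100000)]
s = strehl_ratio(phases)
rms = math.sqrt(sum(p * p for p in phases) / len(phases))
# Marechal approximation: S ~ exp(-rms^2) for small errors.
print(s, math.exp(-rms ** 2))
```

For small errors the two numbers agree closely, which is why rms phase error (readily available from a ray tracer) is such a useful proxy for phase efficiency.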

19.
Hicks, Diana; Melkers, Julia; Isett, Kimberley R. Scientometrics 2019, 119(2): 827–843.

The publishing industry is a vast system whose elements form a metaphorical ecosystem with knowledge flowing through connections between heterogeneous elements. In this paper we seek a more robust understanding of different types of literature, and whether and how they support one another in the diffusion of knowledge. We analyze a corpus comprising professional electronic media in US dentistry and its relation to the peer reviewed journal literature. Our corpus includes full text from magazines, news sites and blogs that provide information to clinicians. We find links to research are made through several mechanisms: articles describing new clinical guidelines, referencing, summaries of recently published journal articles and crossover authoring. There is little to no apparent time lag in the diffusion of information from research literature to professional media.


20.
Bioinformatics is an emerging and rapidly evolving discipline, and its literature is growing exponentially. This paper provides an integrated bibliometric study of the knowledge base of the Chinese research community, based on bibliometric information in the field of bioinformatics from the SCI-Expanded database for the period 2000–2005. China is found to be productive in bioinformatics as far as publication activity in international journals is concerned. For comparative purposes, the results are benchmarked against findings from five other major nations in the field: the USA, UK, Germany, Japan and India. In terms of collaboration profile, the findings imply that the collaborative scope of China has gradually transcended the boundaries of organizations, regions and nations. Finally, further analyses of citation share and several surrogate scientometric indicators show that the publications of Chinese authors have the lowest international visibility among the six countries. Strikingly, Japan has achieved the most remarkable publication impact relative to the research effort it devotes to bioinformatics. The policy implication is that the Chinese scientific community needs to work on improving its research impact and to pay more attention to strengthening academic linkages between China and other nations, particularly scientifically advanced countries.
