Paid full text   38 articles
Free full text   4 articles
Electrical Engineering   1 article
Chemical Industry   18 articles
Metalworking   1 article
Machinery & Instrumentation   1 article
Radio & Electronics   2 articles
General Industrial Technology   7 articles
Metallurgical Industry   5 articles
Automation Technology   7 articles
2021   3 articles
2019   2 articles
2018   2 articles
2017   1 article
2015   2 articles
2014   1 article
2013   1 article
2011   1 article
2010   3 articles
2008   8 articles
2007   1 article
2006   2 articles
2005   2 articles
2004   2 articles
2003   2 articles
2001   3 articles
1998   1 article
1997   3 articles
1994   1 article
1985   1 article
Sort by:   42 query results in total (search time: 125 ms)
1.
2.
3.
Perceiving and memorizing faces swiftly and correctly are important social competencies. The organization of these interpersonal abilities and how they change across the life span are still poorly understood. We investigated changes in the mean and covariance structure of face cognition abilities across the adult life span. A sample of 448 participants, with age ranging from 18 to 88 years, completed a battery of 15 face cognition tasks. After establishing a measurement model of face cognition that distinguishes between face perception, face memory, and the speed of face cognition, we used multiple group models and age-weighted measurement models to explore age-related changes. The modeling showed that the loadings and intercepts of all measures are age invariant. The factor means showed substantial decrements with increasing age. Age-related decrements in performance were strongest for the speed of face cognition but were also salient for face perception and face memory. The onset of age decrements is apparent in the 60s for face perception, in the late 40s for face memory, and in the early 30s for speed of face cognition. Implications of these findings at a theoretical and methodological level are discussed, and potential consequences for applied settings are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
4.
Gläser, Jochen; Laudel, Grit. Scientometrics, 2001, 52(3): 411-434.
This article discusses the methodological problems of integrating scientometric methods into a qualitative study. Integrative attempts of this kind are poorly supported by the methodologies of both the sociology of science and scientometrics. Therefore it was necessary to develop a project-specific methodological approach that linked scientometric methods to theoretical considerations. The methodological approach is presented and used to discuss general methodological problems concerning the relation between (qualitative) theory and scientometric methods. This discussion enables some conclusions to be drawn as to the relations that exist between scientometrics and the sociology of science. This revised version was published online in June 2006 with corrections to the Cover Date.
5.
Herein, we report the formation of α-amylase-containing polyelectrolyte complexes (PECs). The method for the encapsulation of α-amylase is based on interactions between two oppositely charged polyelectrolytes, poly(acrylic acid) (PAA) and polyethylenimine (PEI). We could show that electrostatic interactions ensure the incorporation of the enzyme into the formed polyelectrolyte complexes. The encapsulation has no negative effect on enzyme activity and protects against denaturation of the enzyme initiated by low pH values. The resulting PECs are 150-250 nm in size with a narrow size distribution, appear in a spherical shape and are colloidally stable. The complexation of both polyelectrolytes and the immobilization of α-amylase are investigated using fractionating techniques, mainly analytical ultracentrifugation and asymmetrical-flow field-flow fractionation. The formation of PECs represents a simple method for the encapsulation of α-amylase without the use of organic solvents and requires no additional purification steps. This one-step approach, yielding high encapsulation efficiencies, shows potential as a drug delivery system for sensitive hydrophilic actives in the future. α-amylase is immobilized in polyelectrolyte complexes made of polyethylenimine and poly(acrylic acid). Optimized encapsulation conditions and the resulting polyelectrolyte complexes are investigated via determination of the IEP, α-amylase activity assays, nanoDSC measurements, zeta potential values, dynamic light scattering, microscopy, and fractionating techniques. The encapsulated enzyme is protected against denaturation initiated by low pH values. © 2017 Wiley Periodicals, Inc. J. Appl. Polym. Sci. 2017, 134, 45036.
6.
Laudel, Grit. Scientometrics, 2003, 57(2): 215-237.
Today, science policy makers in many countries worry about a brain drain, i.e., about permanently losing their best scientists to other countries. However, such a brain drain has proven to be difficult to measure. This article reports a test of bibliometric methods that could possibly be used to study the brain drain on the micro-level. An investigation of elite mobility must solve the three methodological problems of delineating a specialty, identifying a specialty's elite and identifying international mobility and migration. The first two problems were preliminarily solved by combining participant lists from elite conferences (Gordon conferences) and citation data. Mobility was measured by using the address information of publication databases. The delineation of specialties has been identified as the crucial problem in studying elite mobility on the micro-level. Policy concerns of a brain drain were confirmed by measuring the mobility of the biomedical Angiotensin specialty. This revised version was published online in August 2006 with corrections to the Cover Date.
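The core measurement step described in this abstract, reading international moves off the country field of an author's publication addresses, can be sketched in a few lines. This is an illustrative sketch only, not the article's actual pipeline: the records, the country codes and the simple rule "change of affiliation country = candidate move" are assumptions made for the example.

# Minimal sketch: infer candidate international moves of one researcher from the
# country codes attached to their publications, ordered by publication year.
from itertools import groupby

# (year, country) pairs for one hypothetical author, already extracted from a
# publication database and sorted chronologically (invented data)
records = [
    (1995, "DE"), (1996, "DE"), (1997, "DE"),
    (1998, "US"), (1999, "US"), (2000, "US"),
    (2001, "AU"), (2002, "AU"),
]

def detect_moves(records):
    # Collapse consecutive publications from the same country into "stays",
    # then report each change of affiliation country as a candidate move.
    stays = [(country, [year for year, _ in group])
             for country, group in groupby(records, key=lambda rec: rec[1])]
    moves = []
    for (c1, years1), (c2, years2) in zip(stays, stays[1:]):
        moves.append((c1, c2, years1[-1], years2[0]))
    return moves

for src, dst, last_seen, first_seen in detect_moves(records):
    print(f"candidate move {src} -> {dst} between {last_seen} and {first_seen}")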
7.
Metal nanostructures are promising novel labels for microarray-based biomolecular detection. Additional silver deposition on the surface-bound labels strongly enhances the sensitivity of the system and can lead to continuous metal areas, which enable an electrical readout, especially for simple and robust point-of-care analyses. In this paper, atomic force microscopy (AFM) was used to study different routes of metal deposition on labelled DNA-DNA duplexes in electrode gaps. Besides the well-established metal-induced silver enhancement, a recently introduced enzymatic silver deposition was applied and proved highly specific. The in situ characterization was especially focused on the nanostructure percolation: the moment at which the nanoparticulate film becomes continuous and electrically conducting. The formation of conducting paths, continuous from one electrode to the other, was followed by complementary electrical measurements. Thereby, a percolation threshold was determined for the surface coverage with metal structures, i.e. the required metallized area to achieve conductance. Complementary graphic simulations of the growth process and graphic 'conductance measurements' were developed and proved suitable to model the metal deposition and electrical detection. This may help to design electrode arrays and identify optimum enhancement parameters (required seed concentration and shell growth) as well as draw quantitative conclusions on the existing label (i.e. analyte) concentration.
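As a rough illustration of the percolation idea (not the authors' simulation code), the following toy model metallizes the sites of a square grid at random and declares "conductance" once a connected metal cluster bridges the left and right edges, which stand in for the two electrodes. The grid size, the four-neighbour rule and the sampling parameters are assumptions for illustration only.

# Toy site-percolation model of silver growth in an electrode gap.
import random
from collections import deque

def conducts(grid):
    # Breadth-first search from the left edge; True if any metallized
    # cluster reaches the right edge.
    n = len(grid)
    queue = deque((r, 0) for r in range(n) if grid[r][0])
    seen = set(queue)
    while queue:
        r, c = queue.popleft()
        if c == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

def spanning_fraction(p, n=40, trials=200):
    # Fraction of random grids with metal coverage p that conduct.
    hits = 0
    for _ in range(trials):
        grid = [[random.random() < p for _ in range(n)] for _ in range(n)]
        hits += conducts(grid)
    return hits / trials

for p in (0.50, 0.55, 0.60, 0.65, 0.70):
    print(f"coverage {p:.2f}: spanning fraction {spanning_fraction(p):.2f}")

Sweeping the coverage p makes the threshold visible as a sharp rise in the spanning fraction (near p ≈ 0.59 for this lattice and neighbour rule), which is the kind of coverage-versus-conductance relation the abstract describes.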
8.
We describe the development and operation of a two-laser, large-field hyperspectral scanner for analysis of multicolor genotyping microarrays. In contrast to confocal microarray scanners, in which wavelength selectivity is obtained by positioning band-pass filters in front of a photomultiplier detector, hyperspectral microarray scanners collect the complete visible emission spectrum from the labeled microarrays. Hyperspectral scanning permits discrimination of multiple spectrally overlapping fluorescent labels with minimal use of optical filters, thus offering important advantages over standard filter-based multicolor microarray scanners. The scanner uses two-sided oblique line illumination of microarrays. Two lasers are used for the excitation of dyes in the visible and near-infrared spectral regions. The hyperspectral scanner was evaluated with commercially available two-color calibration slides and with in-house-printed four-color microarrays containing dyes with spectral properties similar to their commercial genotyping array counterparts.
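A hedged sketch of the general principle behind such a readout, not the instrument's actual software: once the full emission spectrum is recorded at each spot, spectrally overlapping dyes can be separated by linear unmixing against reference spectra. The Gaussian reference spectra, the four-dye setup and the abundances below are invented for illustration.

# Linear spectral unmixing of overlapping fluorescent labels at one spot.
import numpy as np

wavelengths = np.arange(500, 800, 2.0)            # nm, assumed detection range

def gaussian(center, width=25.0):
    # Idealized emission spectrum of one dye (illustrative shape only)
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Reference emission spectra of four spectrally overlapping labels (columns)
references = np.column_stack([gaussian(c) for c in (555, 580, 670, 700)])

# Simulate one measured spot: a mixture of the four dyes plus detector noise
true_abundances = np.array([0.8, 0.3, 0.0, 0.5])
measured = references @ true_abundances + 0.01 * np.random.randn(wavelengths.size)

# Ordinary least-squares unmixing; non-negative least squares
# (scipy.optimize.nnls) would be more physical, but plain lstsq keeps the
# sketch dependency-light.
estimated, *_ = np.linalg.lstsq(references, measured, rcond=None)
print(np.round(estimated, 2))   # should recover approximately [0.8, 0.3, 0.0, 0.5]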
9.
This paper discusses our efforts in implementing a divide-and-conquer algorithm (adaptive quadrature) on the HEP computer system. The one-PEM HEP system performs in a MIMD fashion by pipelining the execution of instructions from different processes. Unlike most divide-and-conquer approaches, our strategy ensures that the program will never deadlock due to memory expansion or spawning too many processes. Within this constraint we develop and analyse two different implementations: one using a static number of processes and the other a dynamic number of processes. Our results examine the relative performance of these two schemes. In addition, we briefly discuss some of our impressions concerning some ‘myths of parallel programming’.
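A minimal sketch of the bounded-parallelism idea, under stated assumptions and in modern Python rather than HEP code: adaptive quadrature is driven by a shared work queue and a fixed pool of workers, so refinement pushes interval halves back onto the queue instead of spawning new processes. The integrand, the tolerance handling and the pool size are illustrative choices, not the paper's implementation.

# Adaptive trapezoid quadrature with a fixed worker pool and a shared work queue.
import math
import queue
import threading

def trapezoid(f, a, b):
    return 0.5 * (b - a) * (f(a) + f(b))

def parallel_adaptive_quad(f, a, b, tol=1e-6, workers=4):
    work = queue.Queue()
    work.put((a, b, tol))
    total = 0.0
    lock = threading.Lock()

    def worker():
        nonlocal total
        while True:
            item = work.get()
            if item is None:                      # sentinel: shut down
                work.task_done()
                return
            lo, hi, eps = item
            mid = 0.5 * (lo + hi)
            coarse = trapezoid(f, lo, hi)
            fine = trapezoid(f, lo, mid) + trapezoid(f, mid, hi)
            if abs(fine - coarse) < 3.0 * eps:    # accept this subinterval
                with lock:
                    total += fine
            else:                                  # refine: enqueue both halves
                work.put((lo, mid, 0.5 * eps))
                work.put((mid, hi, 0.5 * eps))
            work.task_done()

    pool = [threading.Thread(target=worker) for _ in range(workers)]
    for t in pool:
        t.start()
    work.join()                                   # all intervals processed
    for _ in pool:
        work.put(None)
    for t in pool:
        t.join()
    return total

print(parallel_adaptive_quad(math.sin, 0.0, math.pi))   # ~2.0

The fixed pool loosely corresponds to a "static number of processes" variant; a dynamic variant could instead grow the pool on demand up to a hard cap, which is what keeps process creation bounded.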
10.