Sort order: 3,214 results found; search took 18 ms
71.
We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) on fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the DNA extracts were returned. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits AmpFlSTR Identifiler, SGM Plus, and Yfiler (Applied Biosystems, Foster City, CA), and GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler, reducing manual work while increasing quality and throughput.
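The plate-setup step described above can be illustrated with a minimal sketch (in Python, not the Tecan worklist format; the column-major fill order and the sample naming are assumptions for illustration) of how 96 samples might be assigned to wells of a 96-well microtiter plate:

```python
# Column-major mapping of 96 sample indices to microtiter-plate wells.
ROWS = "ABCDEFGH"

def well_name(i):
    """Map sample index 0..95 to a well label, filling columns first."""
    col, row = divmod(i, len(ROWS))
    return f"{ROWS[row]}{col + 1}"

# Hypothetical worklist: sample i receives extract plus master mix in well i.
worklist = {f"sample_{i:02d}": well_name(i) for i in range(96)}
```

Column-major filling (A1, B1, ..., H1, A2, ...) is common on liquid handlers because the pipetting head processes one column per stroke, but the actual order depends on the instrument configuration.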
72.
Maintaining an awareness of the working context of fellow co-workers is crucial to successful cooperation in a workplace. For mobile, non-co-located workers, however, such workplace awareness is hard to maintain. This paper investigates how context-aware computing can be used to facilitate workplace awareness. In particular, we present the concept of Context-Based Workplace Awareness, which is derived from years of in-depth studies of hospital work and the design of computer supported cooperative work technologies to support the distributed collaboration and coordination of clinical work within large hospitals. This empirical background has revealed that awareness of the social, spatial, temporal, and activity context in particular plays a crucial role in the coordination of work in hospitals. The paper then presents and discusses technologies designed to support context-based workplace awareness, namely the AWARE architecture and the AwarePhone and AwareMedia applications. Based on almost two years' deployment of the technologies in a large hospital, the paper discusses how the four dimensions of context-based workplace awareness play out in the coordination of clinical work.
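The four context dimensions can be made concrete with a small publish/subscribe sketch; the class and method names below are hypothetical illustrations, not the actual AWARE architecture API:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ContextEvent:
    """One awareness update carrying the four context dimensions."""
    person: str          # social: who is acting
    location: str        # spatial: where they are
    timestamp: datetime  # temporal: when
    activity: str        # activity: what they are doing

class AwarenessService:
    """Minimal broker: co-workers subscribe to each other's context."""

    def __init__(self):
        self._subscribers = {}  # person -> list of callbacks

    def subscribe(self, person, callback):
        self._subscribers.setdefault(person, []).append(callback)

    def publish(self, event):
        for callback in self._subscribers.get(event.person, []):
            callback(event)

# A colleague subscribes to a clinician's context updates.
service = AwarenessService()
seen = []
service.subscribe("Dr. A", seen.append)
service.publish(ContextEvent("Dr. A", "OR-3", datetime(2007, 1, 1, 9, 0), "surgery"))
```

In a deployment like AwarePhone or AwareMedia, the callbacks would update each co-worker's display of colleagues' status rather than append to a list.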
73.
Service Composition Issues in Pervasive Computing
Providing new services by combining existing ones—or service composition—is an idea pervading pervasive computing. Pervasive computing technologies seek to concurrently exhibit context awareness, manage contingencies, leverage device heterogeneity, and empower users. These four goals prompt service-composition-mechanism design requirements that are unique to pervasive computing. This article catalogs service composition mechanisms and describes their variation points, which indicate how well the resulting compositions meet the four goals.
74.
75.
76.
Regularly updated land cover information at continental or national scales is a requirement for various land management applications as well as for biogeochemical and climate modeling exercises. However, monitoring or updating of map products with sufficient spatial detail is currently not widely practiced due to inadequate time-series coverage for most regions of the Earth. Classifications of coarser spatial resolution data can be automatically generated on an annual or finer time scale. However, discrete land cover classifications of such data cannot sufficiently quantify land surface heterogeneity or change. This study presents a methodology for continuous and discrete land cover mapping using moderate spatial resolution time series data sets. The method automatically selects sample data from higher spatial resolution maps and generates multiple decision trees. The leaves of the decision trees are interpreted considering the sample distribution of all classes, yielding class membership maps that can be used as estimates for the diversity of classes in a coarse resolution cell. Results are demonstrated for the heterogeneous, small-patch landscape of Germany and the bio-climatically varying landscape of South Africa, with overall classification accuracies of 80%. A sensitivity analysis of individual modules of the classification process indicates the importance of appropriately chosen features, sample data balanced among classes, and an appropriate method to combine individual classifications. The comparison of classification results over several years not only indicates the method's consistency, but also its potential to detect land cover changes.
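The core idea—multiple decision trees whose leaf class distributions yield soft class memberships—can be sketched as follows, assuming scikit-learn's `DecisionTreeClassifier` in place of the authors' implementation and synthetic data standing in for samples drawn from a higher-resolution map:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical training data: 4 spectral features per pixel, labels
# sampled from a higher-resolution map (0=forest, 1=crop, 2=urban).
X = rng.normal(size=(600, 4)) + np.repeat(np.eye(3, 4) * 3, 200, axis=0)
y = np.repeat([0, 1, 2], 200)

# Multiple trees trained on bootstrap samples; each leaf's class
# proportions (predict_proba) serve as soft class memberships.
trees = []
for seed in range(5):
    idx = rng.integers(0, len(X), len(X))
    tree = DecisionTreeClassifier(max_depth=5, random_state=seed)
    trees.append(tree.fit(X[idx], y[idx]))

def membership(pixels):
    """Average the leaf class distributions over the ensemble."""
    return np.mean([t.predict_proba(pixels) for t in trees], axis=0)

probs = membership(X[:5])  # per-pixel class membership fractions
```

Averaged over the pixels of a coarse resolution cell, such membership fractions estimate the mix of classes within the cell, which a single discrete label cannot express.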
77.
An analytical expression relating the mass and position of a particle attached to a cantilever to the resulting change in the cantilever's resonant frequency is derived. Theoretically, the position and mass of the attached particle can be deduced by combining the measured resonant frequencies of several bending modes. This finding is verified experimentally using a microscale cantilever with and without an attached gold bead. The resonant frequencies of several bending modes are measured as a function of the bead position. The bead mass and position calculated from the measured resonant frequencies are in good agreement with the expected mass and the measured position.
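The abstract does not reproduce the analytical expression itself; the sketch below instead uses the standard small-mass Rayleigh approximation for a clamped-free beam, df_n/f_n ≈ -(dm/2)·phi_n(xi)^2 with dm the particle mass in units of the beam mass, to show how mass and position can be recovered jointly from several bending modes:

```python
import numpy as np

# Clamped-free beam eigenvalues (lambda_n) for the first three bending modes.
LAM = np.array([1.8751, 4.6941, 7.8548])

def mode_shape(xi, lam):
    """Cantilever mode shape at relative position xi in [0, 1]."""
    sigma = (np.sinh(lam) - np.sin(lam)) / (np.cosh(lam) + np.cos(lam))
    return (np.cosh(lam * xi) - np.cos(lam * xi)
            - sigma * (np.sinh(lam * xi) - np.sin(lam * xi)))

def rel_shifts(dm, xi):
    """Small-mass approximation: df_n/f_n ≈ -(dm / 2) * phi_n(xi)^2."""
    return -0.5 * dm * mode_shape(xi, LAM) ** 2

def infer(shifts, dms, xis):
    """Exhaustive search over (mass, position) candidates."""
    phi2 = np.array([mode_shape(xi, LAM) ** 2 for xi in xis])  # (nxi, 3)
    pred = -0.5 * dms[:, None, None] * phi2[None, :, :]        # (ndm, nxi, 3)
    err = np.sum((pred - shifts) ** 2, axis=-1)
    i, j = np.unravel_index(np.argmin(err), err.shape)
    return dms[i], xis[j]

dms = np.linspace(0.001, 0.05, 200)
xis = np.linspace(0.05, 1.0, 200)
# Synthetic measurement: particle of 1% beam mass at 70% of the length.
dm_est, xi_est = infer(rel_shifts(0.01, 0.70), dms, xis)
```

A single mode only constrains the product dm·phi_n(xi)^2; the ratios of shifts between modes are what pin down the position, after which the mass follows.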
78.
Indications for shoulder arthroplasty are numerous, mainly owing to glenohumeral osteoarthritis, rheumatoid arthritis, or fracture of the proximal humerus. However, the anatomy and the biomechanics of the shoulder are complex and shoulder arthroplasty has evolved significantly over the past 30 years. This paper presents the main recent evolutions in shoulder replacement, the questions not answered yet, and the main future areas of research. The review focuses firstly on the design, positioning, and fixation of the humeral component, secondly on the design, positioning, and fixation of the glenoid implant, and thirdly on other concepts of shoulder arthroplasty such as the reversed prosthesis, the cementless surface replacement arthroplasty, and the bipolar arthroplasty. This review demonstrates that more research is needed. Although, in the long term, large randomized trials are needed to settle the fundamental questions of what type of replacement and which kind of fixation should be used, biomechanical research in the laboratory should be focused primarily on the comprehension of glenoid loosening, which is a major cause of total shoulder arthroplasty failure, and the significance of radiolucent lines which are often seen but with no clear understanding about their relation with failure.
79.
We present a new algorithm for maximum likelihood convolutive independent component analysis (ICA) in which components are unmixed using stable autoregressive filters determined implicitly by estimating a convolutive model of the mixing process. By introducing a convolutive mixing model for the components, we show how the order of the filters in the model can be correctly detected using Bayesian model selection. We demonstrate a framework for deconvolving a subspace of independent components in electroencephalography (EEG). Initial results suggest that in some cases, convolutive mixing may be a more realistic model for EEG signals than the instantaneous ICA model.
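The filter-order detection step can be illustrated in simplified form with BIC-based order selection for an ordinary autoregressive model; BIC is a standard approximation to Bayesian model selection, and this scalar AR example is a stand-in for the authors' convolutive framework, not their algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic component with known AR(2) dynamics (coefficients 0.75, -0.5).
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + rng.normal()

def ar_bic(x, p):
    """Least-squares AR(p) fit; return the BIC score of the fit."""
    y = x[p:]
    X = np.column_stack([x[p - k:len(x) - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ a) ** 2)
    return len(y) * np.log(sigma2) + p * np.log(len(y))

orders = list(range(1, 9))
best_p = orders[int(np.argmin([ar_bic(x, p) for p in orders]))]
```

The log-likelihood always improves with higher order; the penalty term p·log(T) is what lets the criterion recover the true order rather than the largest one tried.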
80.
We present a vectorized version of the MatLab (MathWorks Inc.) package tweezercalib for precision calibration of optical tweezers. The calibration is based on the power spectrum of the Brownian motion of a dielectric bead trapped in the tweezers. Precision is achieved by accounting for a number of factors that affect this power spectrum, as described in version 1 of the package [I.M. Tolić-Nørrelykke, K. Berg-Sørensen, H. Flyvbjerg, MatLab program for precision calibration of optical tweezers, Comput. Phys. Comm. 159 (2004) 225-240]. The graphical user interface allows the user to include or leave out each of these factors. Several “health tests” are applied to the experimental data during calibration, and the test results are displayed graphically. Thus, the user can easily see whether the data comply with the theory used for their interpretation. Final calibration results are given with statistical errors and a covariance matrix.

New version program summary

Title of program: tweezercalib
Catalogue identifier: ADTV_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTV_v2_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Reference in CPC to previous version: I.M. Tolić-Nørrelykke, K. Berg-Sørensen, H. Flyvbjerg, Comput. Phys. Comm. 159 (2004) 225
Catalogue identifier of previous version: ADTV
Does the new version supersede the original program: Yes
Computer for which the program is designed and others on which it has been tested: General computer running MatLab (MathWorks Inc.)
Operating systems under which the program has been tested: Windows 2000, Windows XP, Linux
Programming language used: MatLab (MathWorks Inc.), standard license
Memory required to execute with typical data: Of order four times the size of the data file
High speed storage required: None
No. of lines in distributed program, including test data, etc.: 135 989
No. of bytes in distributed program, including test data, etc.: 1 527 611
Distribution format: tar.gz
Nature of physical problem: Calibrate optical tweezers with precision by fitting theory to the experimental power spectrum of the position of a bead doing Brownian motion in an incompressible fluid, possibly near the microscope cover slip, while trapped in optical tweezers. Thereby determine the spring constant of the optical trap and the arbitrary-units-to-nanometers conversion factor for the detection system.
Method of solution: Elimination of cross-talk between the quadrant photodiode's output channels for positions (optional). Check that the distribution of recorded positions agrees with the Boltzmann distribution of a bead in a harmonic trap. Data compression and noise reduction by the blocking method applied to the power spectrum. Full accounting for hydrodynamic effects: frequency-dependent drag force and interaction with a nearby cover slip (optional). Full accounting for electronic filters (optional) and for “virtual filtering” caused by the detection system (optional). Full accounting for aliasing caused by the finite sampling rate (optional). Standard non-linear least-squares fitting. Statistical support for the fit is given, with several plots facilitating inspection of the consistency and quality of the data and the fit.
Summary of revisions: A faster fitting routine, adapted from [J. Nocedal, Y.X. Yuan, Combining trust region and line search techniques, Technical Report OTC 98/04, Optimization Technology Center, 1998; W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes. The Art of Scientific Computing, Cambridge University Press, Cambridge, 1986], is applied. It uses fewer function evaluations, and the remaining function evaluations have been vectorized. Calls to routines in toolboxes not included with a standard MatLab license have been replaced by calls to routines included in the present package. Fitting parameters are rescaled so that they are all of roughly the same size (of order 1) while being fitted. Generally, the program package has been updated to comply with MatLab version 7.0 and optimized for speed.
Restrictions on the complexity of the problem: Data should be positions of a bead doing Brownian motion while held by optical tweezers. For high precision in the final results, the data should be a time series measured over a long time with a sufficiently high experimental sampling rate: the sampling rate should be well above the characteristic frequency of the trap, the so-called corner frequency, so the sampling frequency should typically be larger than 10 kHz. The Fast Fourier Transform used works optimally when the time series contains 2^n data points, and long measurement time is obtained with n > 12-15. Finally, the optics should be set to ensure a harmonic trapping potential in the range of positions visited by the bead. The fitting procedure checks for a harmonic potential.
Typical running time: Seconds
Unusual features of the program: None
References: The theoretical underpinnings of the procedure are found in [K. Berg-Sørensen, H. Flyvbjerg, Power spectrum analysis for optical tweezers, Rev. Sci. Instrum. 75 (2004) 594-612].
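The core of the calibration—fitting a Lorentzian to the blocked power spectrum of a trapped bead—can be sketched as follows. This is a simplified Python illustration, not the MatLab package: the parameters are made up, and the hydrodynamic, filter, and aliasing corrections that tweezercalib accounts for are omitted:

```python
import numpy as np
from scipy.signal import lfilter
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Illustrative parameters: corner frequency, diffusion constant, sampling.
fc_true, D_true = 500.0, 0.1   # [Hz], [a.u.^2 / s]
fs, n = 20_000.0, 2 ** 18      # sampling rate [Hz], number of samples
dt = 1.0 / fs
tau = 1.0 / (2.0 * np.pi * fc_true)

# Exact discrete-time update of the Ornstein-Uhlenbeck process
# describing a bead in a harmonic trap.
c = np.exp(-dt / tau)
sd = np.sqrt(D_true * tau * (1.0 - c ** 2))
x = lfilter([sd], [1.0, -c], rng.normal(size=n))

# One-sided periodogram, compressed by block averaging ("blocking").
freqs = np.fft.rfftfreq(n, dt)[1:]
psd = (2.0 * dt / n) * np.abs(np.fft.rfft(x)[1:]) ** 2
nb = 500                                  # frequencies per block
m = (len(freqs) // nb) * nb
f_block = freqs[:m].reshape(-1, nb).mean(axis=1)
p_block = psd[:m].reshape(-1, nb).mean(axis=1)

# Fit the Lorentzian P(f) = D / (pi^2 (fc^2 + f^2)).
def lorentzian(f, fc, D):
    return D / (np.pi ** 2 * (fc ** 2 + f ** 2))

(fc_fit, D_fit), _ = curve_fit(lorentzian, f_block, p_block, p0=[300.0, 0.05])
```

Given the drag coefficient gamma of the bead, the trap stiffness then follows as k = 2·pi·gamma·fc, and the fitted D fixes the arbitrary-units-to-nanometers conversion of the detection system.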
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号