5516 query results found (search time: 265 ms)
111.
Derivation and verification of a solidification law for round and square billets   (cited by: 7; self-citations: 0; citations by others: 7)
The classical square-root law of solidification is generally unsuited to describing shell growth during continuous casting of round and square billets. This paper derives a solidification equation for shell growth in round billets, k^2 t = r^2 (ln(r/r0) - 1/2) + r0^2/2, where r0 is the billet radius and r the radius of the solidification front. The equation also applies to shell growth in square billets, and may be called the round/square-billet solidification law. Verification against measured data shows that the law represents the solidification of continuously cast strands with good accuracy.
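The reconstructed law can be sanity-checked numerically: t must vanish when the front is at the surface (r = r0), and the total solidification time must approach r0^2/(2 k^2) as the front reaches the centre. The values of k and r0 below are illustrative, not taken from the paper:

```python
# Numeric check of the round-billet solidification law,
# k^2 t = r^2 (ln(r/r0) - 1/2) + r0^2 / 2,
# with assumed (illustrative) solidification coefficient and billet radius.
import math

k = 2.5e-3   # solidification coefficient, assumed value
r0 = 0.1     # billet radius in metres, assumed value

def time_at_front(r):
    """Time for the solidification front to reach radius r (0 < r <= r0)."""
    return (r * r * (math.log(r / r0) - 0.5) + r0 * r0 / 2) / k**2

t_surface = time_at_front(r0)    # front at the surface: should be 0
t_center = time_at_front(1e-9)   # front near the centre: full solidification
```

The limit t_center recovers the familiar r0^2/(2 k^2) total solidification time, consistent with the square-root law's behaviour for a slab of half-thickness r0.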
112.
Application of the 0.618 (golden-section) method in one-step forming simulation   (cited by: 1; self-citations: 0; citations by others: 1)
A well-chosen blank shape for a stamped part improves both the formability of the sheet and material utilization. This paper studies how the relaxation factor in the one-step forming simulation method affects convergence stability and efficiency, proposes a relaxation-factor optimization algorithm based on the 0.618 (golden-section) method, and implements it in a computer program. Four examples (a spherical cap, an L-shaped part, a trunk-lid outer panel, and a T-shaped part) are compared against results obtained with a series of fixed, unoptimized relaxation factors, verifying the effectiveness of the method.
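The 0.618 method named in the title is golden-section search. A minimal sketch follows, applied to a stand-in quadratic objective; the real objective, the solver's convergence cost as a function of the relaxation factor, is not reproduced here:

```python
# Golden-section (0.618) search over a unimodal objective. The toy
# objective below stands in for the convergence cost of the one-step
# forming solver as a function of the relaxation factor omega.
PHI = 0.6180339887498949  # golden-ratio conjugate, the "0.618" of the method

def golden_section_min(f, lo, hi, tol=1e-5):
    """Locate the minimizer of a unimodal f on [lo, hi]."""
    a, b = lo, hi
    x1 = b - PHI * (b - a)
    x2 = a + PHI * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:              # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - PHI * (b - a)
            f1 = f(x1)
        else:                    # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + PHI * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Toy convergence-cost curve with its minimum at omega = 1.2
omega_opt = golden_section_min(lambda w: (w - 1.2) ** 2, 0.0, 2.0)
```

Each iteration reuses one of the two previous function evaluations, so only one new objective evaluation (one solver run, in the paper's setting) is needed per step.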
113.
Multiversion databases store both current and historical data. Rows are typically annotated with timestamps representing the period when the row is/was valid. We develop novel techniques to reduce index maintenance in multiversion databases, so that indexes can be used effectively for analytical queries over current data without being a heavy burden on transaction throughput. To achieve this end, we re-design persistent index data structures in the storage hierarchy to employ an extra level of indirection. The indirection level is stored on solid-state disks that can support very fast random I/Os, so that traversing the extra level of indirection incurs a relatively small overhead. The extra level of indirection dramatically reduces the number of magnetic disk I/Os that are needed for index updates and localizes maintenance to indexes on updated attributes. Additionally, we batch insertions within the indirection layer in order to reduce physical disk I/Os for indexing new records. In this work, we further exploit SSDs by introducing novel DeltaBlock techniques for storing the recent changes to data on SSDs. Using our DeltaBlock, we propose an efficient method to periodically flush the recently changed data from SSDs to HDDs such that, on the one hand, we keep track of every change (or delta) for every record, and, on the other hand, we avoid redundantly storing the unchanged portion of updated records. By reducing the index maintenance overhead on transactions, we enable operational data stores to create more indexes to support queries. We have developed a prototype of our indirection proposal by extending the widely used generalized search tree open-source project, which is also employed in PostgreSQL. Our working implementation demonstrates that we can significantly reduce index maintenance and/or query processing cost by a factor of 3. For the insertion of new records, our novel batching technique can save up to 90% of the insertion time. For updates, our prototype demonstrates that we can significantly reduce the database size by up to 80% even with a modest space allocated for DeltaBlocks on SSDs.
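The indirection idea can be sketched in miniature: secondary indexes store logical record IDs (LIDs), and a separate LID-to-PID table (SSD-resident in the paper) absorbs version movement, so updating a row never touches the indexes. The class and field names below are hypothetical, and the storage tiers are reduced to in-memory dictionaries:

```python
# Hypothetical sketch of index indirection in a multiversion store.
# Secondary indexes map key -> LID; the indirection table maps
# LID -> PID (physical location of the newest version).
class IndirectionStore:
    def __init__(self):
        self.lid_to_pid = {}   # SSD-resident indirection table (in the paper)
        self.heap = {}         # PID -> row version (magnetic disk)
        self.index = {}        # secondary index: key -> LID
        self._next_pid = 0

    def _new_pid(self, row):
        pid = self._next_pid
        self._next_pid += 1
        self.heap[pid] = row
        return pid

    def insert(self, lid, key, row):
        self.lid_to_pid[lid] = self._new_pid(row)
        self.index[key] = lid           # index written once, at insert time

    def update(self, lid, row):
        # New version goes to a new physical location; only the
        # indirection entry changes, the index entries stay untouched.
        self.lid_to_pid[lid] = self._new_pid(row)

    def lookup(self, key):
        # One extra hop (LID -> PID), cheap when the table lives on SSD.
        return self.heap[self.lid_to_pid[self.index[key]]]

store = IndirectionStore()
store.insert(lid=1, key="alice", row={"balance": 10})
store.update(lid=1, row={"balance": 25})
```

The payoff grows with the number of secondary indexes: an update that would otherwise dirty every index on the table now dirties only the single indirection entry.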
114.
The purpose of this paper is to introduce an effective and structured methodology for carrying out a biometric system sensitivity analysis. The goal of sensitivity analysis is to provide the researcher/developer with insight and understanding of the key factors—algorithmic, subject-based, procedural, image quality, environmental, among others—that affect the matching performance of the biometric system under study. This proposed methodology consists of two steps: (1) the design and execution of orthogonal fractional factorial experiment designs which allow the scientist to efficiently investigate the effect of a large number of factors—and interactions—simultaneously, and (2) the use of a select set of statistical data analysis graphical procedures which are fine-tuned to unambiguously highlight important factors, important interactions, and locally-optimal settings. We illustrate this methodology by application to a study of VASIR (Video-based Automated System for Iris Recognition), a NIST iris-based biometric system. In particular, we investigated k = 8 algorithmic factors from the VASIR system by constructing a 2^(6-1) × 3^1 × 4^1 orthogonal fractional factorial design, generating the corresponding performance data, and applying an appropriate set of analysis graphics to determine the relative importance of the eight factors, the relative importance of the 28 two-term interactions, and the local best settings of the eight factors. The results showed that VASIR's performance was primarily driven by six factors out of the eight, along with four two-term interactions. A virtue of our two-step methodology is that it is systematic and general, and hence may be applied with equal rigor and effectiveness to other biometric systems, such as fingerprints, face, voice, and DNA.
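The design arithmetic can be sketched as follows. The defining relation F = ABCDE used for the half fraction below is an assumption (the paper does not state its generator); the interaction count C(8, 2) = 28 matches the number of two-term interactions the study examines:

```python
# Sketch of the 2^(6-1) x 3^1 x 4^1 design's structure, with the
# half fraction generated by the assumed defining relation F = ABCDE.
from itertools import combinations, product

# 2^(6-1): enumerate all +/-1 settings of factors A..E, then derive F.
half_fraction = [
    levels + (levels[0] * levels[1] * levels[2] * levels[3] * levels[4],)
    for levels in product((-1, 1), repeat=5)
]

# Cross the half fraction with the 3-level and 4-level factors.
runs = [(row, x7, x8)
        for row in half_fraction
        for x7 in range(3)     # the 3^1 factor
        for x8 in range(4)]    # the 4^1 factor

# Two-term interactions among k = 8 factors: C(8, 2).
n_two_term = len(list(combinations(range(8), 2)))
```

The half fraction needs 32 runs instead of the full factorial's 64, and the complete crossed design has 32 × 3 × 4 = 384 runs, which is the economy fractional designs buy.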
115.
Since 2002, the Royal Air Force (RAF) has been working towards developing role-related physical tests for use as an operational fitness test (OFT). The purpose of this study was to establish reliability of the OFT (comprising four tests), investigate gym-based tests as predictors of performance, and establish performance standards. Fifty-eight RAF personnel performed the OFT on three occasions. A separate cohort carried out fitness and anthropometric tests before performing the OFT, by way of establishing performance predictors. Documented evidence and views of an expert panel were used to determine OFT standards. Reliability ranged from moderate to good for three tests, with one test (Dig) showing poor reliability. The 95% limits of agreement for the prediction models ranged from good to poor (6.7-34.2%). The prediction models were not sufficiently accurate to confidently estimate OFT performance, but could be used as a guide to quantify likely outcome and training needs.
116.
Aboveground dry biomass was estimated for the 1.3 M km² forested area south of the treeline in the eastern Canadian province of Québec by combining data from airborne and spaceborne LiDAR, a Landsat ETM+ land cover map, a Shuttle Radar Topographic Mission (SRTM) digital elevation model, ground inventory plots, and vegetation zone maps. Plot-level biomass was calculated using allometric relationships between tree attributes and biomass. A small-footprint portable laser profiler then flew over these inventory plots to develop a generic airborne LiDAR-based biomass equation (R² = 0.65, n = 207). The same airborne LiDAR system flew along four portions of orbits of the ICESat Geoscience Laser Altimeter System (GLAS). A square-root-transformed equation was developed to predict airborne profiling LiDAR estimates of aboveground dry biomass from GLAS waveform parameters combined with an SRTM slope index (R² = 0.59, n = 1325). Using the 104,044 quality-filtered GLAS pulses obtained during autumn 2003 from 97 orbits over the study area, we then predicted aboveground dry biomass for the main vegetation areas of Québec as well as for the entire Province south of the treeline. Including cover type covariances both within and between GLAS orbits increased standard errors of the estimates by two to five times at the vegetation zone level and as much as threefold at the provincial level. Aboveground biomass for the whole study area averaged 39.0 ± 2.2 (standard error) Mg ha⁻¹ and totalled 4.9 ± 0.3 Pg. Biomass distributions were 12.6% northern hardwoods, 12.6% northern mixedwood, 38.4% commercial boreal, 13% non-commercial boreal, 14.2% taiga, and 9.2% treed tundra. Non-commercial forests represented 36% of the estimated aboveground biomass, thus highlighting the importance of remote northern forests to C sequestration. This study has shown that space-based forest inventories of northern forests could be an efficient way of estimating the amount, distribution, and uncertainty of aboveground biomass and carbon stocks at large spatial scales.
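The square-root-transformed regression step can be sketched minimally, using synthetic predictors in place of the paper's GLAS waveform parameters and SRTM slope index:

```python
# Sketch of a square-root-transformed biomass regression on synthetic
# data. The "height" and "slope" predictors stand in for the GLAS
# waveform metrics and SRTM slope index used in the paper.
import numpy as np

rng = np.random.default_rng(0)
height = rng.uniform(2, 25, 500)   # stand-in LiDAR height metric
slope = rng.uniform(0, 20, 500)    # stand-in terrain slope index

# Synthetic truth: biomass is quadratic in the predictors, so its
# square root is linear in them, which is the point of the transform.
biomass = (1.0 + 0.3 * height - 0.05 * slope
           + rng.normal(0, 0.1, 500)) ** 2

# Fit sqrt(biomass) = b0 + b1*height + b2*slope by least squares;
# the transform stabilizes the variance of the skewed response.
X = np.column_stack([np.ones_like(height), height, slope])
coef, *_ = np.linalg.lstsq(X, np.sqrt(biomass), rcond=None)
predicted = (X @ coef) ** 2        # back-transform to biomass units
```

Back-transforming the fitted values by squaring is a simplification; careful applications add a bias correction when returning to the original scale.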
117.
Given the prevalence of computers in education today, it is critical to understand teachers' perspectives regarding computer integration in their classrooms. The current study surveyed a random sample of a heterogeneous group of 185 elementary and 204 secondary teachers in order to provide a comprehensive summary of teacher characteristics and variables that best discriminate between teachers who integrate computers and those who do not. Discriminant Function Analysis indicated seven variables for elementary teachers and six for secondary teachers (accounting for 74% and 68% of the variance, respectively) that discriminated between high and low integrators. Variables included positive teaching experiences with computers; teacher's comfort with computers; beliefs supporting the use of computers as an instructional tool; training; motivation; support; and teaching efficacy. Implications for support of computer integration in the classroom are discussed.
118.
This paper evaluates new heuristic solution procedures for the location of cross-docks and distribution centers in supply chain network design. The model is characterized by multiple product families, a central manufacturing plant site, multiple cross-docking and distribution center sites, and retail outlets which demand multiple units of several commodities. This paper describes two heuristics that generate globally feasible, near-optimal distribution system design and utilization strategies using the simulated annealing (SA) methodology. This study makes two important contributions. First, we continue the study of location planning for the cross-dock and distribution center supply chain network design problem. Second, we systematically evaluate the computational performance of this network design location model under more sophisticated heuristic control parameter settings to better understand interaction effects among the various factors comprising our experimental design, and present convergence results. The central idea of the paper is to evaluate the impact of a geometric control mechanism vis-à-vis more sophisticated ones on solution time, quality, and convergence for two new heuristics. Our results suggest that integrating traditional simulated annealing with tabu search is recommended for this supply chain network design and location problem.
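Simulated annealing with a geometric control mechanism can be sketched on a miniature facility location problem. The cost data and single-flip neighborhood below are invented for illustration and omit the model's multiple product families and capacity structure:

```python
# Toy simulated annealing with geometric cooling for choosing which
# cross-dock sites to open. Costs are invented for illustration.
import math, random

random.seed(42)
open_cost = [40, 60, 55, 70, 30]          # fixed cost to open each site
serve = [[10, 25, 30, 40, 18],            # retailer-by-site service cost
         [28, 12, 22, 35, 26],
         [35, 30, 11, 20, 33],
         [22, 27, 29, 14, 24]]

def total_cost(open_sites):
    if not any(open_sites):
        return float("inf")               # infeasible: nothing open
    fixed = sum(c for c, o in zip(open_cost, open_sites) if o)
    var = sum(min(row[j] for j in range(5) if open_sites[j])
              for row in serve)           # each retailer uses cheapest site
    return fixed + var

state = [True] * 5
best, best_cost = state[:], total_cost(state)
T = 50.0
while T > 0.01:
    cand = state[:]
    cand[random.randrange(5)] ^= True     # flip one site open/closed
    delta = total_cost(cand) - total_cost(state)
    if delta < 0 or random.random() < math.exp(-delta / T):
        state = cand
        if total_cost(state) < best_cost:
            best, best_cost = state[:], total_cost(state)
    T *= 0.95                             # geometric control mechanism
```

The geometric schedule T ← 0.95·T is the baseline the paper compares against more sophisticated control mechanisms; hybridizing with tabu search would additionally forbid recently flipped sites for a few iterations.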
119.
Water-body extraction for typical inland freshwater wetlands based on TM imagery   (cited by: 11; self-citations: 1; citations by others: 10)
Water is the decisive factor maintaining the stability and health of wetland ecosystems, and rapid, accurate extraction of water-body information from satellite remote-sensing imagery has become an important tool for wetland survey, research, and conservation. Given its relatively high spatial and spectral resolution, rich information content, good positioning accuracy, and comparatively low cost, Landsat TM imagery is bound to remain one of the principal data sources for such work in the near term. Using TM imagery, this study tested several methods for extracting water bodies in typical inland freshwater wetlands. Evaluated on three criteria (area accuracy, extraction accuracy, and visual quality), spectral classification performed best, followed by single-band thresholding and the vegetation-index method, while the multi-band spectral-relationship method and the water-index method performed worst. The main source of error was incomplete extraction of wetland water bodies, caused by the image resolution and the special hydrological conditions of wetlands; pixel unmixing and multi-source remote-sensing data fusion will therefore be important means of improving extraction accuracy.
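One of the compared approaches, the water-index method, can be sketched with McFeeters' NDWI, computed from TM band 2 (green) and band 4 (near-infrared). The band values and the zero threshold below are illustrative, on a synthetic 3×3 scene:

```python
# NDWI water-index sketch: NDWI = (Green - NIR) / (Green + NIR),
# thresholded at zero. Water absorbs NIR strongly, so it yields
# positive NDWI; vegetation and soil yield negative values.
import numpy as np

green = np.array([[60, 55, 20],    # synthetic TM band 2 values
                  [58, 52, 18],
                  [15, 14, 12]], dtype=float)
nir = np.array([[10, 12, 45],      # synthetic TM band 4 values
                [11, 13, 50],
                [40, 42, 44]], dtype=float)

ndwi = (green - nir) / (green + nir)
water_mask = ndwi > 0              # positive NDWI flags open water
```

The study's finding that index methods underperform spectral classification in wetlands is plausible here too: mixed pixels along vegetated shorelines sit near the threshold, which is exactly where a hard cut-off fails.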
120.
RATIONALE AND OBJECTIVES: Traditionally, multireader receiver operating characteristic (ROC) studies have used a "paired-case, paired-reader" design. The statistical power of such a design for inferences about the relative accuracies of the tests was assessed and compared with alternative designs. METHODS: The noncentrality parameter of an F statistic was used to compute power as a function of the reader and patient sample sizes and the variability and correlation between readings. RESULTS: For a fixed power and Type I error rate, the traditional design reduces the number of verified cases required. A hybrid design, in which each reader interprets a different sample of patients, reduces the number of readers, total readings, and readings required per reader. The drawback is a substantial increase in the number of verified cases. CONCLUSION: The ultimate choice of study design depends on the nature of the tests being compared, limiting resources, a priori knowledge of the magnitude of the correlations and variability, and logistic complexity.
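The power computation described in METHODS amounts to evaluating the upper tail of a noncentral F distribution at the central-F critical value. The degrees of freedom and noncentrality parameter below are illustrative, not values from the paper, where the noncentrality would be derived from the reader and case variance components and correlations:

```python
# Power of an F test via the noncentral F distribution: reject when
# F exceeds the central-F critical value; power is the probability of
# that event under the noncentral alternative.
from scipy.stats import f, ncf

df1, df2 = 1, 40     # numerator / denominator degrees of freedom (assumed)
alpha = 0.05         # Type I error rate
lam = 10.0           # noncentrality parameter (assumed effect size)

f_crit = f.ppf(1 - alpha, df1, df2)     # central-F critical value
power = ncf.sf(f_crit, df1, df2, lam)   # P(F' > f_crit | lambda)
```

Design comparisons like the paper's then reduce to asking how each design's reader and case sample sizes change lambda, and hence the power, at fixed alpha.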
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号