91.
As technology evolves, many organizations face the problem of migrating legacy applications from one technology base to another. We report on a case study involving the migration of legacy code into the IBM® WebSphere® Commerce Suite product. Specifically, we focus on the problem of migrating applications that use traditional database access techniques to applications using the Enterprise JavaBean (EJB) programming model. Our results include a practical methodology that facilitates such migration, as well as a tool that supports this methodology. The tool has been released on IBM's alphaWorks site.
92.
Based on importance measures and fuzzy integrals, a new assessment method for image coding quality is presented in this paper. The proposed assessment combines two subevaluations. In the first, errors on edges, textures, and flat regions are computed individually and scored with an assessment function; a global evaluation is then obtained with the Sugeno fuzzy integral, weighted by the importance measures of the edge, texture, and flat regions. In the second, an importance measure is first established according to the type of region in which each error occurs, and a fine-grained evaluation is then obtained by applying the Sugeno fuzzy integral over all pixels of the image. The final evaluation combines the two subevaluations. Experimental results show that the new assessment closely approximates human subjective tests such as the mean opinion score, with a correlation coefficient of 0.963, a significant improvement over peak signal-to-noise ratio, the picture quality scale, and weighted mean square error, whose correlation coefficients are 0.821, 0.875, and 0.716, respectively.
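A minimal sketch of the Sugeno fuzzy integral used in the first subevaluation, assuming hypothetical region-error scores and an illustrative fuzzy measure (the paper's actual importance measures and assessment function are not given in the abstract):

```python
def sugeno_integral(scores, measure):
    """Sugeno fuzzy integral of region scores h(x) in [0, 1] w.r.t. a fuzzy measure g.

    scores  : dict mapping region name -> normalized error score in [0, 1]
    measure : dict mapping frozenset of region names -> importance g(A) in [0, 1]
    Returns max_i min(h(x_(i)), g(A_i)) with scores sorted in decreasing order.
    """
    items = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    best, selected = 0.0, set()
    for name, h in items:
        selected.add(name)
        g = measure[frozenset(selected)]
        best = max(best, min(h, g))
    return best

# Hypothetical per-region error scores (already normalized by an assessment function).
scores = {"edge": 0.8, "texture": 0.5, "flat": 0.3}

# Illustrative fuzzy measure: edge errors weigh most, the full set has measure 1.
measure = {
    frozenset(): 0.0,
    frozenset({"edge"}): 0.6,
    frozenset({"texture"}): 0.3,
    frozenset({"flat"}): 0.2,
    frozenset({"edge", "texture"}): 0.8,
    frozenset({"edge", "flat"}): 0.7,
    frozenset({"texture", "flat"}): 0.4,
    frozenset({"edge", "texture", "flat"}): 1.0,
}

print(sugeno_integral(scores, measure))  # global error evaluation in [0, 1] -> 0.6
```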
93.
Topical Web crawling is an established technique for domain-specific information retrieval. However, almost all conventional topical Web crawlers are built around classifiers, which require large amounts of training data that are very difficult to label manually. This paper presents a novel approach, clustering-based topical Web crawling, which retrieves information on a specific domain based on link context and does not require any labeled training data. To collect domain-specific content units, a bottom-up hierarchical clustering method is used, in which a new data structure, a linked list combined with a CFu-tree, stores each cluster's label, feature vector, and content units. Four metrics are defined for the clustering process. First, comparison variation (CV) decides whether the closest pair of clusters may be merged. Second, cluster impurity (CIP) evaluates the clustering error. The precision and recall of clustering are also defined to evaluate the accuracy and coverage of the whole clustering process. A link-context extraction technique is used to expand the feature vector of anchor text, which greatly improves clustering accuracy. Experimental results show that the proposed method outperforms conventional focused Web crawlers in both harvest rate and target recall.
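A minimal sketch of the bottom-up clustering loop, assuming cosine similarity between term-frequency feature vectors and a simple merge-acceptance threshold standing in for the paper's comparison-variation (CV) test; the CFu-tree storage and link-context extraction are omitted:

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors (dicts)."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def merge(u, v):
    """Merge two feature vectors by summing term frequencies."""
    out = dict(u)
    for t, x in v.items():
        out[t] = out.get(t, 0.0) + x
    return out

def bottom_up_cluster(units, min_similarity=0.3):
    """Agglomerative clustering of content units (anchor-text feature vectors).

    Repeatedly merges the most similar pair of clusters; the min_similarity
    threshold is a stand-in for the paper's comparison-variation (CV) test.
    """
    clusters = [dict(u) for u in units]
    while len(clusters) > 1:
        i, j, sim = max(
            ((a, b, cosine(clusters[a], clusters[b]))
             for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda t: t[2],
        )
        if sim < min_similarity:      # closest pair fails the merge test: stop
            break
        clusters[i] = merge(clusters[i], clusters[j])
        del clusters[j]
    return clusters

units = [{"laser": 2, "optics": 1}, {"laser": 1, "beam": 1}, {"football": 3}]
print(len(bottom_up_cluster(units)))  # 2 clusters: optics-related vs. other
```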
94.
Many emerging online stream processing services require consideration of quality of service (QoS), which depends heavily on where services are placed among the available hosts. This paper investigates QoS-aware placement problems for stream processing services in different settings. When client demands are stable and servers are CPU-uncapacitated, the cost-minimizing QoS-aware placement problem is equivalent to the set cover problem and can be solved by a greedy algorithm with approximation factor O(log n), where n is the number of clients. However, when CPU capacity constraints on servers are taken into account, the problem cannot be approximated unless P = NP. We therefore propose two heuristic algorithms: (1) ISCA (Iterated Set Cover-based Algorithm) and (2) KBA (Knapsack-Based Algorithm). We also consider placement when client demands grow over time, proposing two objectives, extension factor and system lifetime, for the increment-blind and increment-aware demand models respectively; both can be handled by extending ISCA and KBA. Experimental results show that ISCA and KBA suit different demand sizes: ISCA is more efficient when client demands are relatively small, while KBA performs better for larger demands.
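A minimal sketch of the greedy set-cover step underlying the uncapacitated case, assuming each candidate server is described only by an opening cost and the set of clients it can serve within the QoS bound (illustrative data; ISCA's iteration and KBA are not shown):

```python
def greedy_placement(clients, servers):
    """Greedy set cover: open servers until every client is covered.

    clients : set of client ids
    servers : dict server_id -> (cost, set of clients it can serve within the QoS bound)
    At each step, choose the server with the lowest cost per newly covered
    client; this yields the classical O(log n) approximation.
    """
    uncovered = set(clients)
    chosen, total_cost = [], 0.0
    while uncovered:
        best = None
        for sid, (cost, coverage) in servers.items():
            gain = len(coverage & uncovered)
            if gain == 0 or sid in chosen:
                continue
            ratio = cost / gain
            if best is None or ratio < best[0]:
                best = (ratio, sid, coverage, cost)
        if best is None:              # some client cannot be served at all
            break
        _, sid, coverage, cost = best
        chosen.append(sid)
        total_cost += cost
        uncovered -= coverage
    return chosen, total_cost, uncovered

clients = {1, 2, 3, 4, 5}
servers = {"A": (3.0, {1, 2, 3}), "B": (2.0, {3, 4}), "C": (2.0, {4, 5})}
print(greedy_placement(clients, servers))  # (['A', 'C'], 5.0, set())
```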
95.
96.
The rapid development of biophotonics and the biomedical sciences places a high demand on photonic structures to be interfaced with biological systems that are capab...
97.
Based on the laser phase Doppler (PDA) technique, this paper uses a PDA signal detector and a high-speed signal processing system to measure the size distribution of insoluble particles in injection solutions, providing a useful exploration of the practical application of PDA technology.
98.
卢娴 《电子质量》2008,(1):70-72
Numerical electromagnetic compatibility (EMC) simulation requires building a simulation model of the electronic equipment, but real equipment models are complicated and contain many small features such as mounting bosses and grooves. Building a simulation model that reproduces the real geometry consumes considerable effort; at the same time, an overly complex model sharply increases the computational cost and can distort the mesh, leading to erroneous simulation results. The simulation model of the equipment must therefore be simplified or replaced by an equivalent during EMC modeling. Based on domestic and international research on perturbation problems, this paper presents the basic principle for model simplification in equipment modeling, namely perturbation theory, and suggests directions and methods for further research.
99.
To further improve the performance of accelerators, the first cryogenic normal-conducting RF gun in China was designed and manufactured. As a new and attractiv...
100.
An oxidation-coagulation process using Fenton's reagent and lime was investigated for treating ramie degumming (scouring) wastewater. Tests showed that with dosages of 3 g/L FeSO4·7H2O, 3 mL/L H2O2 (30% by mass), and 4 mL/L saturated lime milk, the CODCr removal rate exceeded 50% and the color removal rate exceeded 90%. More importantly, part of the treated effluent can be reused in scouring production, saving alkali, reducing wastewater discharge, and improving the biodegradability of the wastewater, which creates favorable conditions for subsequent biological treatment. The process is a cleaner-production treatment method with broad application prospects.
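A small worked example of scaling the reported dosages to a batch, assuming a hypothetical 10 m³ volume of wastewater (the batch size is illustrative, not from the paper):

```python
# Reported optimal dosages per litre of wastewater.
FESO4_G_PER_L = 3.0      # FeSO4·7H2O, g/L
H2O2_ML_PER_L = 3.0      # 30 wt% H2O2, mL/L
LIME_ML_PER_L = 4.0      # saturated lime milk, mL/L

def batch_dosages(volume_m3):
    """Return (FeSO4·7H2O in kg, H2O2 in L, lime milk in L) for a batch."""
    litres = volume_m3 * 1000.0
    return (FESO4_G_PER_L * litres / 1000.0,   # g -> kg
            H2O2_ML_PER_L * litres / 1000.0,   # mL -> L
            LIME_ML_PER_L * litres / 1000.0)   # mL -> L

print(batch_dosages(10.0))  # (30.0, 30.0, 40.0): 30 kg FeSO4·7H2O, 30 L H2O2, 40 L lime milk
```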