991.
To meet the need for real-time visualization of underwater acoustic signals in underwater acoustic countermeasure scene simulation, the embedding of signal type and spectrum information and their rendering in separate windows were implemented using the OpenGL 3D graphics API and multithreading. Exploiting the rotationally symmetric structure of the receive beams of a square uniform planar array, a horizontal slicing method is proposed, and the beams are drawn with the OpenGL quadric "Cylinder"; beam gain, power, state, and other parameters are mapped to visual attributes, and the rendering is then optimized with OpenGL display lists. In the top view, the search sector is drawn through an appropriate visualization mapping. Finally, the module is embedded into a multi-channel underwater acoustic countermeasure visual simulation environment built with OpenGVS, achieving real-time synchronized display within the scene.
992.
A design method for arbitrary wideband beams based on simulated annealing optimization is proposed. The desired response is constructed from the low-sidelobe beamforming weight coefficients on the equally spaced narrow sub-bands into which the working frequency band is divided; the low-sidelobe beam on each sub-band is obtained with a beam optimization design based on simulated annealing beamforming. Computer-based beam optimization was carried out for a 24-element circular array. The results show that, compared with conventional methods, the beams designed with this approach achieve lower sidelobe levels and a more stable beamwidth, demonstrating the feasibility and effectiveness of the method.
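As an illustration of the general idea only (not the paper's method, array geometry, or parameters), the sketch below uses simulated annealing over real taper weights to lower the peak sidelobe level of a 24-element uniform circular array at a single sub-band frequency. The array radius, sound speed, mainlobe width, and cooling schedule are all assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the paper's): 24-element uniform circular array in the
# horizontal plane, one narrow sub-band, look direction at 0 degrees.
N = 24                       # number of elements
c = 1500.0                   # sound speed in water (m/s), assumed
f = 3000.0                   # sub-band centre frequency (Hz), assumed
radius = c / f               # array radius of one wavelength, assumed
phi_n = 2 * np.pi * np.arange(N) / N            # element positions on the circle
look = 0.0                                      # look direction (rad)
angles = np.deg2rad(np.arange(-180.0, 180.0, 1.0))

def steering(theta):
    """Narrowband steering vector of the circular array for azimuth theta."""
    delays = radius * np.cos(theta - phi_n) / c
    return np.exp(-2j * np.pi * f * delays)

A = np.array([steering(t) for t in angles])     # responses on the azimuth grid
a0 = steering(look)

def peak_sidelobe(taper):
    """Peak sidelobe level (dB) of the beam steered to `look` with real taper weights."""
    w = taper * np.conj(a0)                     # phase steering plus amplitude taper
    pattern = 20 * np.log10(np.abs(A @ w) / np.abs(a0 @ w) + 1e-12)
    mainlobe = np.abs(np.rad2deg(angles - look)) < 15.0   # assumed mainlobe region
    return pattern[~mainlobe].max()

# Simulated annealing over the amplitude taper.
taper = np.ones(N)
cur = peak_sidelobe(taper)
best_taper, best = taper.copy(), cur
T = 1.0
for _ in range(5000):
    cand = np.clip(taper + 0.05 * rng.standard_normal(N), 0.05, 1.0)
    cost = peak_sidelobe(cand)
    if cost < cur or rng.random() < np.exp((cur - cost) / T):
        taper, cur = cand, cost
        if cost < best:
            best_taper, best = cand.copy(), cost
    T *= 0.999                                  # geometric cooling schedule

print(f"uniform taper sidelobe:  {peak_sidelobe(np.ones(N)):.1f} dB")
print(f"annealed taper sidelobe: {best:.1f} dB")
```

In the wideband setting described by the abstract, such a single-frequency optimization would be repeated on each narrow sub-band and the results combined into the desired broadband response.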
993.
Digital filter design based on distributed arithmetic is studied. The principle of distributed arithmetic, which implements multiplication by fixed constants efficiently, is introduced and applied to the design of an FIR low-pass filter; a 16th-order filter was designed and debugged. The filter is implemented on an EP3C25F324C8 device, where a serial adder pre-adds the input data, reducing the 16th-order structure to an 8th-order one and thereby lowering resource usage while raising processing speed. A 256-entry lookup table for the eight fixed coefficients, generated in Matlab, is loaded directly into the FPGA's ROM. The design method is reusable and can be extended to higher-order filters. Functional simulation shows that the method is feasible and efficient.
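A minimal Python model may help make the 256-entry lookup table and the symmetric pre-adder concrete. The coefficients, word length, and test signal below are hypothetical rather than the paper's design, and the bit-serial loop only mimics what the FPGA datapath does in hardware.

```python
import numpy as np

# Hypothetical 16-tap symmetric low-pass coefficients, stored as integers as they
# would be on the FPGA (these are not the paper's coefficient values).
h = np.array([1, 3, 7, 13, 20, 26, 31, 33, 33, 31, 26, 20, 13, 7, 3, 1], dtype=int)
assert np.array_equal(h, h[::-1])        # symmetry is what the serial pre-adder exploits
h_half = h[:8]                           # 8 effective coefficients after pre-addition

# Distributed-arithmetic lookup table: entry `addr` is the sum of the coefficients
# whose select bit in `addr` is 1 -- 2**8 = 256 entries for 8 coefficients.
LUT = np.array([sum(c for k, c in enumerate(h_half) if (addr >> k) & 1)
                for addr in range(256)], dtype=int)

def da_fir_output(window, nbits=12):
    """One output sample from the 16 most recent inputs (oldest first), computed
    bit-serially against the DA lookup table, mimicking the FPGA datapath."""
    window = np.asarray(window, dtype=int)
    pre = window[8:][::-1] + window[:8]  # pre-adder: x[n-k] + x[n-(15-k)], k = 0..7
    acc = 0
    for b in range(nbits):               # least significant bit first
        addr = 0
        for k in range(8):
            addr |= ((int(pre[k]) >> b) & 1) << k
        term = int(LUT[addr])
        # In two's complement the top bit carries a negative weight.
        acc += (-term if b == nbits - 1 else term) << b
    return acc

# Cross-check against a direct-form convolution on a random test signal.
x = np.random.default_rng(1).integers(-100, 100, size=64)
y_ref = np.convolve(x, h)[15:64]                       # fully-overlapped outputs
y_da = [da_fir_output(x[n - 15:n + 1]) for n in range(15, 64)]
print("matches direct form:", np.array_equal(y_ref, np.array(y_da)))
```

The pre-adder halves the LUT address width exactly as in the abstract: without symmetry a 16-tap filter would need a 2**16-entry table, whereas pairing mirrored taps reduces it to 256 entries.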
994.
Drawing on experience with, and research into, the theory and laboratory teaching of the course Microcontroller Principles and Interfaces, this paper analyzes the characteristics of and problems in the current laboratory teaching of the course, and proposes applying EDA simulation technology to laboratory teaching as an improved teaching method, which can greatly raise students' motivation and overall competence.
995.
In this paper, we propose a distributed agent model that applies belief-desire-intention (BDI) reasoning and negotiation for addressing the linear assignment problem (LAP) collaboratively. In resource allocation, LAP is viewed as seeking a concurrent allocation of a distinct resource to every task so as to optimize a linear sum objective function. The proposed model provides a basic agent-based foundation for efficient resource allocation in a distributed environment. A distributed agent algorithm developed from the BDI negotiation model is examined both analytically and experimentally. To improve performance in terms of average negotiation speed and solution quality, two initialization heuristics and two different reasoning control strategies are applied, with the latter yielding different variants of the basic algorithm. Extensive simulations suggest that all the heuristic-algorithm combinations can produce a near-optimal solution quickly, in a well-defined sense. The significance and applicability of the research work are also discussed.
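For readers who want a concrete reference point for LAP itself, the snippet below solves a small instance with SciPy's centralized Hungarian solver and compares it with a naive greedy allocation. It is not the paper's distributed BDI negotiation algorithm, and the cost matrix is made up for the example.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix: cost[i, j] is the cost of giving resource j to task i.
rng = np.random.default_rng(42)
cost = rng.uniform(1.0, 10.0, size=(6, 6))

# Centralized optimum of the linear assignment problem (Hungarian method),
# useful as a yardstick for any distributed allocation scheme.
rows, cols = linear_sum_assignment(cost)
print("optimal assignment:", list(zip(rows.tolist(), cols.tolist())))
print("optimal total cost:", float(cost[rows, cols].sum()))

# A naive greedy allocation for comparison: each task in turn grabs the cheapest
# remaining resource.
remaining = set(range(cost.shape[1]))
greedy = 0.0
for task in range(cost.shape[0]):
    best = min(remaining, key=lambda j: cost[task, j])
    remaining.remove(best)
    greedy += float(cost[task, best])
print("greedy total cost:", greedy)
```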
996.
Snow is an important land cover on the earth's surface, characterized by its changing nature. Monitoring snow cover extent plays a significant role in dynamic studies and in the prevention of snow-caused disasters in pastoral areas. Using NASA EOS Terra/MODIS snow cover products and in situ observation data during the four snow seasons from November 1 to March 31 of the years 2001 to 2005 in the northern Xinjiang area, the accuracy of the MODIS snow cover mapping algorithm under varied snow depths and land cover types was analyzed. The overall accuracy of the MODIS daily snow cover mapping algorithm under clear-sky conditions is high, at 98.5%; snow agreement reaches 98.2%, and ranges from 77.8% to 100% over the 4-year period for individual sites. Snow depth (SD) is one of the major factors affecting the accuracy of MODIS snow cover maps. MODIS does not identify any snow for SD less than 0.5 cm. The overall accuracy increases with snow depth when SD is equal to or greater than 3 cm, and decreases for SD below 3 cm. Land cover also has an important influence on the accuracy of MODIS snow cover maps. The use of MOD10A1 snow cover products is severely affected by cloud cover. The 8-day composite products of MOD10A2 can effectively minimize the effect of cloud cover in most cases. Cloud cover in excess of 10% occurs on 99% of the MOD10A1 products and 14.7% of the MOD10A2 products analyzed during the four snow seasons. User-defined multiple-day composite images based on MOD10A1, with the flexibility to select the composite period, the starting and ending dates, and the composite sequence of MOD10A1 products, have an advantage in effectively monitoring snow cover extent for regional snow-caused disasters in pastoral areas.
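A toy sketch of the kind of accuracy bookkeeping described above: given paired station snow depths and MODIS clear-sky classifications, compute overall accuracy, snow agreement, and accuracy stratified by snow depth. The data, the 0.5 cm detection rule, and the error rate are synthetic stand-ins, not the study's observations.

```python
import numpy as np

# Synthetic station-day records (not the study's data): in situ snow depth in cm
# and the MODIS clear-sky classification (1 = snow, 0 = snow-free land).
rng = np.random.default_rng(7)
snow_depth = rng.choice([0.0, 0.3, 1.0, 2.0, 3.0, 5.0, 10.0], size=2000)
ground = (snow_depth >= 0.5).astype(int)      # assumed station snow/no-snow rule
modis = ground.copy()
flip = rng.random(2000) < 0.03                # inject a few misclassifications
modis[flip] ^= 1

# Overall accuracy and snow agreement, the two headline numbers in such validations.
overall = float((modis == ground).mean())
snow_agreement = float((modis[ground == 1] == 1).mean())
print(f"overall accuracy: {overall:.1%}, snow agreement: {snow_agreement:.1%}")

# Accuracy stratified by snow depth, mirroring the abstract's SD analysis.
for lo, hi in [(0.5, 3.0), (3.0, np.inf)]:
    sel = (snow_depth >= lo) & (snow_depth < hi)
    acc = float((modis[sel] == ground[sel]).mean())
    print(f"SD in [{lo}, {hi}) cm: accuracy {acc:.1%} over {int(sel.sum())} samples")
```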
997.
In this paper, we present a general guideline for finding a better distance measure for similarity estimation based on statistical analysis of distribution models and distance functions. A new set of distance measures is derived from the harmonic distance, the geometric distance, and their generalized variants according to Maximum Likelihood theory. These measures can provide a more accurate feature model than the classical Euclidean and Manhattan distances. We also find that the feature elements are often drawn from heterogeneous sources that may have different influences on similarity estimation; the assumption of a single isotropic distribution model is therefore often inappropriate. To alleviate this problem, we use a boosted distance measure framework that finds multiple distance measures which best fit the distribution of selected feature elements for accurate similarity estimation. The new distance measures for similarity estimation are tested on two applications: stereo matching and motion tracking in video sequences. The performance of the boosted distance measure is further evaluated on several benchmark data sets from the UCI repository and two image retrieval applications. In all the experiments, robust results are obtained with the proposed methods.
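The sketch below illustrates the idea of comparing alternative aggregate distance measures on a toy matching task. The arithmetic, geometric, and harmonic means of element-wise absolute differences are one plausible reading of the measures discussed, not necessarily the paper's exact definitions, and the boosting framework is not reproduced.

```python
import numpy as np

# Candidate aggregate distances over element-wise differences |x_i - y_i|.
def arithmetic_dist(x, y):
    return np.mean(np.abs(x - y))                       # Manhattan distance per dimension

def geometric_dist(x, y, eps=1e-9):
    return np.exp(np.mean(np.log(np.abs(x - y) + eps)))  # geometric mean of differences

def harmonic_dist(x, y, eps=1e-9):
    return 1.0 / np.mean(1.0 / (np.abs(x - y) + eps))     # harmonic mean of differences

# Synthetic matching task: the "true match" differs from x by heavy-tailed noise,
# the "false match" is an unrelated vector.  Count how often each measure ranks the
# true match closer -- a crude stand-in for similarity-estimation accuracy.
rng = np.random.default_rng(0)
measures = {"arithmetic": arithmetic_dist, "geometric": geometric_dist,
            "harmonic": harmonic_dist}
wins = {name: 0 for name in measures}
for _ in range(2000):
    x = rng.normal(size=16)
    true_match = x + rng.laplace(scale=0.2, size=16)
    false_match = rng.normal(size=16)
    for name, d in measures.items():
        if d(x, true_match) < d(x, false_match):
            wins[name] += 1
for name, w in wins.items():
    print(f"{name:10s}: true match ranked closer in {w / 2000:.1%} of trials")
```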
998.
We consider the problem of approximately integrating a Lipschitz function f (with a known Lipschitz constant) over an interval. The goal is to achieve an additive error of at most ε using as few samples of f as possible. We use the adaptive framework: on all problem instances an adaptive algorithm should perform almost as well as the best possible algorithm tuned for the particular problem instance. We distinguish between the performances of the best possible deterministic and randomized algorithms. We give a deterministic algorithm that uses samples and show that an asymptotically better algorithm is impossible. However, any deterministic algorithm requires samples on some problem instance. By combining a deterministic adaptive algorithm and Monte Carlo sampling with variance reduction, we give an algorithm that uses at most samples. We also show that any algorithm requires samples in expectation on some problem instance (f,ε), which proves that our algorithm is optimal.
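A minimal sketch of a deterministic adaptive scheme in this spirit: knowing f(a), f(b), and the Lipschitz constant L, the integral over [a, b] can differ from the trapezoid estimate by at most (L²(b−a)² − (f(b)−f(a))²)/(4L), so an interval is split only while that bound exceeds its share of the error budget ε. This is an illustrative companion, not the authors' algorithm or their sample-complexity bounds.

```python
import math

def lipschitz_integrate(f, a, b, L, eps):
    """Return (estimate, number_of_samples) with |estimate - true integral| <= eps,
    for f Lipschitz with constant L on [a, b]."""
    samples = {a: f(a), b: f(b)}

    def recurse(lo, hi, budget):
        flo, fhi = samples[lo], samples[hi]
        w = hi - lo
        trap = 0.5 * (flo + fhi) * w
        # Worst-case deviation of the integral from the trapezoid estimate,
        # given only the endpoint values and the Lipschitz constraint.
        bound = (L * L * w * w - (fhi - flo) ** 2) / (4.0 * L)
        if bound <= budget:
            return trap
        mid = 0.5 * (lo + hi)
        samples[mid] = f(mid)
        # Split the remaining error budget between the two halves.
        return recurse(lo, mid, budget / 2) + recurse(mid, hi, budget / 2)

    return recurse(a, b, eps), len(samples)

estimate, used = lipschitz_integrate(math.sin, 0.0, math.pi, L=1.0, eps=1e-3)
print(f"estimate = {estimate:.6f} (exact value 2), samples used = {used}")
```

Because the per-interval bound shrinks where f is steep (|f(b)−f(a)| close to L(b−a)), the recursion samples flat and steep regions unevenly, which is exactly the adaptivity the abstract is about.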
999.
An instance of the path hitting problem consists of two families of paths in a common undirected graph: a family of demand paths and a family ℋ of hitting paths, where each path in ℋ is associated with a non-negative cost. When a hitting path p∈ℋ and a demand path q share at least one edge, we say that p hits q. The objective is to find a minimum-cost subset of ℋ whose members collectively hit all demand paths. In this paper we provide constant-factor approximation algorithms for path hitting, confined to instances in which the underlying graph is a tree, a spider, or a star. Although such restricted settings may appear to be very simple, we demonstrate that they still capture some of the most basic covering problems in graphs. Our approach combines several novel ideas: we extend the algorithm of Garg, Vazirani and Yannakakis (Algorithmica, 18:3–20, 1997) for approximate multicuts and multicommodity flows in trees to prove new integrality properties; we present a reduction that involves multiple calls to this extended algorithm; and we introduce a polynomial-time solvable variant of the edge cover problem, which may be of independent interest. An extended abstract of this paper appeared in Proceedings of the 14th Annual European Symposium on Algorithms, 2006. This work is part of D. Segev's Ph.D. thesis prepared at Tel-Aviv University under the supervision of Prof. Refael Hassin.
1000.
Measuring ranked list robustness for query performance prediction
We introduce the notion of ranking robustness, a property of a ranked list of documents that indicates how stable the ranking is in the presence of uncertainty in the ranked documents. We propose a statistical measure called the robustness score to quantify this notion. Our initial motivation for measuring ranking robustness is to predict topic difficulty for content-based queries in the ad-hoc retrieval task. Our results demonstrate that the robustness score is positively and consistently correlated with the average precision of content-based queries across a variety of TREC test collections. Though our focus is on prediction for the ad-hoc retrieval task, we observe an interesting negative correlation with query performance when our technique is applied to named-page finding queries, which are a fundamentally different kind of query. A side effect of this differing behavior of the robustness score between the two query types is that the robustness score is also found to be a good feature for query classification.
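The snippet below is a simplified stand-in for ranking robustness, not the paper's exact definition: perturb the retrieved documents' term vectors, re-score them against the query, and average the Spearman correlation between the original and perturbed score lists. The dot-product retrieval model, noise model, and toy collection are all assumptions of the example.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

def score(query_vec, doc_vecs):
    """Toy retrieval score: dot product between query and document term vectors."""
    return doc_vecs @ query_vec

def robustness_score(query_vec, doc_vecs, trials=50, noise=0.3):
    """Average Spearman correlation between the original scores and the scores of
    randomly perturbed documents -- low values flag rankings that collapse easily."""
    base = score(query_vec, doc_vecs)
    correlations = []
    for _ in range(trials):
        perturbed = doc_vecs + rng.normal(scale=noise, size=doc_vecs.shape)
        rho, _ = spearmanr(base, score(query_vec, perturbed))
        correlations.append(rho)
    return float(np.mean(correlations))

# Toy collection: 20 documents over a 50-term vocabulary and one random query.
docs = rng.random((20, 50))
query = rng.random(50)
print("robustness under light perturbation:", robustness_score(query, docs, noise=0.1))
print("robustness under heavy perturbation:", robustness_score(query, docs, noise=1.0))
```

Heavier perturbation drives the score down; the abstract's finding is that, for content-based ad-hoc queries, rankings that remain stable under such perturbation tend to come from queries on which the system performs well.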