2102 query results found (search took 15 ms)
1.
The metamerism phenomenon can be used in illuminant detection to ensure the accuracy of a light source. A method based on a Long-, Middle-, Short-wavelength cone (LMS) weighting algorithm is proposed for evaluating the degree of metamerism. The chromatic relationship between the degree of metameric mismatch and the light source is studied. The consistency between the metameric indices (MIs) and the CIE1976 L*a*b* color-difference ranking is analyzed using SRCC, KRCC, PLCC, and RMSE. A statistical sampling method is employed to obtain practical LMS cone fundamentals for evaluating the degree of metamerism. The results show that the LMS weighting method has good evaluation ability and stability in both simulation and statistical sampling experiments, in line with human visual characteristics. The proposed method meets the requirements for selecting metameric pairs used in light-source detection, and the analysis results offer useful guidance.
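Two of the agreement measures named in the abstract, SRCC (Spearman rank correlation) and RMSE, can be computed without any external libraries. The sketch below is illustrative only; the function names and the tie-handling details are my own, not taken from the paper.

```python
import math

def rank(values):
    """1-based ranks with ties assigned the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def srcc(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

def rmse(x, y):
    """Root-mean-square error between two equal-length sequences."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))
```

A high SRCC between the metameric indices and the CIELAB color-difference ranking would indicate the kind of consistency the abstract reports.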
2.
3.
This paper describes the main research content of a data-acquisition system for the copper rod production process, outlines the hardware architecture and the principal equipment selected, and explains the main modules of the host-computer software and their implementation. The system is built with object-oriented programming and offers a friendly interface, simple operation, and complete functionality. It has performed well in actual operation and shows good application prospects.
4.
A weighting algorithm to determine the coordinates of the center of a Gaussian laser beam projected onto a matrix photodetector is considered. The influence of the internal noise of the photodetector, the maximum brightness of the signal at the beam maximum, and the beam radius on the precision of the algorithm is investigated. Recommendations on image processing are presented.
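The weighting algorithm described here is, in essence, an intensity-weighted centroid (center of gravity) over the detector pixels. A minimal sketch follows; the `threshold` parameter is my own illustrative addition for suppressing detector noise, one of the influences the abstract investigates.

```python
def beam_centroid(frame, threshold=0.0):
    """Intensity-weighted centroid of a 2-D pixel frame.

    frame: list of rows of pixel brightness values.
    Pixels at or below `threshold` are ignored to suppress detector noise.
    Returns (x, y) in pixel coordinates.
    """
    total = cx = cy = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            w = v - threshold
            if w > 0:
                total += w
                cx += w * x
                cy += w * y
    if total == 0:
        raise ValueError("no signal above threshold")
    return cx / total, cy / total
```

For a symmetric Gaussian spot the centroid coincides with the beam center; noise and a small beam radius (few illuminated pixels) degrade the precision, which is what the study quantifies.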
5.
In 1993, the American Society for Testing and Materials carried out a field test of newly calculated tristimulus weighting factors. These weighting factors had been calculated by a method proposed by Venable. The test also included a method of correction for bandpass dependence put forth by Stearns. The purpose of the trial was to assess the possible reduction in bandpass dependence introduced by each of these sets of weights. A large number of sets of spectral data were gathered from the cooperators in the field test. Results of integration by the various sets of tristimulus weighting factors were calculated. A total of 15 120 color differences were calculated, and statistics were derived to test the probable error resulting from each method of correction. Errors attributable to bandpass dependence were on the order of a few tenths to as much as one CIELAB unit when uncorrected weight sets were used. These errors could be reduced to a few hundredths of a CIELAB unit, and in some cases to a few thousandths of a unit, by employing one correcting strategy or the other. An overall mix of strategies was ultimately chosen to minimize the bandpass dependence over the entire range of weight sets. Utilizing this mixed strategy, the median error introduced into 10-nm integration by bandpass dependence was only 0.004 CIELAB units. © 1995 John Wiley & Sons, Inc.
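The Stearns correction mentioned above deconvolves the triangular bandpass of the spectrophotometer from equally spaced spectral samples. A sketch under the commonly cited Stearns & Stearns coefficient of 0.083 (an assumption on my part; the field test may have used a variant):

```python
def stearns_correct(values, a=0.083):
    """Bandpass-dependence correction for spectral data sampled with a
    triangular bandpass whose width equals the sampling interval.

    Interior points:  m'_i = -a*m_{i-1} + (1 + 2a)*m_i - a*m_{i+1}
    Endpoints:        m'_1 = (1 + a)*m_1 - a*m_2   (and symmetrically at the end)
    """
    n = len(values)
    if n < 3:
        return list(values)
    out = [0.0] * n
    out[0] = (1 + a) * values[0] - a * values[1]
    out[-1] = (1 + a) * values[-1] - a * values[-2]
    for i in range(1, n - 1):
        out[i] = -a * values[i - 1] + (1 + 2 * a) * values[i] - a * values[i + 1]
    return out
```

Because the correction coefficients sum to one, a spectrally flat sample passes through unchanged; only curvature in the spectrum is sharpened.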
6.
Centroid-based categorization is one of the most popular algorithms in text classification. In this approach, normalization is an important factor in improving the performance of a centroid-based classifier when documents in the collection have quite different sizes and/or the numbers of documents per class are unbalanced. In the past, most researchers applied document normalization, e.g., document-length normalization, while some considered a simple kind of class normalization, so-called class-length normalization, to address the imbalance problem. However, no intensive work has clarified how these normalizations affect classification performance or whether other useful normalizations exist. The purpose of this paper is threefold: (1) to investigate the effectiveness of document- and class-length normalizations on several data sets, (2) to evaluate a number of commonly used normalization functions, and (3) to introduce a new type of class normalization, called term-length normalization, which exploits the term distribution among documents in the class. The experimental results show that a classifier with the weight-merge-normalize approach (class-length normalization) performs better than one with the weight-normalize-merge approach (document-length normalization) on data sets with unbalanced numbers of documents per class, and is quite competitive on those with balanced numbers. Among normalization functions, normalization based on term weighting performs best on average. Term-length normalization is useful for improving classification accuracy, and the combination of term- and class-length normalizations outperforms pure class-length normalization, pure term-length normalization, and no normalization by margins of 4.29%, 11.50%, and 30.09%, respectively.
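The weight-merge-normalize order of operations can be made concrete in a few lines: raw term weights are first merged per class, and only the merged class vector is normalized. This is a minimal sketch with raw term counts standing in for the paper's term-weighting schemes; the function names are mine.

```python
import math
from collections import Counter, defaultdict

def build_centroids(docs):
    """docs: iterable of (label, token_list).

    Weight-merge-normalize (class-length normalization): merge raw term
    counts per class first, then L2-normalize each merged class vector,
    so small classes are not drowned out by large ones.
    """
    merged = defaultdict(Counter)
    for label, tokens in docs:
        merged[label].update(tokens)
    centroids = {}
    for label, counts in merged.items():
        norm = math.sqrt(sum(v * v for v in counts.values()))
        centroids[label] = {t: v / norm for t, v in counts.items()}
    return centroids

def classify(centroids, tokens):
    """Assign the class whose centroid has the highest dot product
    with the document's term-count vector."""
    counts = Counter(tokens)
    best, best_score = None, -1.0
    for label, centroid in centroids.items():
        score = sum(centroid.get(t, 0.0) * c for t, c in counts.items())
        if score > best_score:
            best, best_score = label, score
    return best
```

Swapping the merge and normalize steps (normalizing each document vector before merging) would give the weight-normalize-merge variant the paper compares against.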
7.
Calculation Method and Application of Relative Weights for Text Index Terms   (cited 4 times: 0 self-citations, 4 citations by others)
The method used to compute index-term weights determines the accuracy of text classification. This paper proposes a relative weighting method for text index terms: the weight of an index term is computed as the ratio of its frequency in the given text to its average frequency over the whole text space. The method effectively improves the accuracy with which index terms identify the content of a text.
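The relative weight described here (a term's in-document frequency divided by its collection-wide average frequency) can be sketched directly; the data representation below is my own illustrative choice.

```python
def relative_weights(doc_counts, corpus):
    """Relative index-term weights for one document.

    doc_counts: dict term -> count for the document being weighted.
    corpus: list of such dicts, one per document in the collection.
    Weight(term) = freq in this document / average freq over the collection,
    so terms concentrated in this document score above 1.0.
    """
    n = len(corpus)
    weights = {}
    for term, count in doc_counts.items():
        avg = sum(d.get(term, 0) for d in corpus) / n
        weights[term] = count / avg if avg > 0 else 0.0
    return weights
```

A term that appears everywhere at the same rate gets weight 1.0, while a term concentrated in the document at hand is boosted, which is the discriminative effect the abstract claims.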
8.
9.
To meet the needs of protein sequence classification, protein sequence classification algorithms are studied in depth. The characteristic attributes of protein sequences are analyzed extensively, and a description form for these attributes is given. On this basis, a protein sequence classification algorithm based on a weighted decision tree is designed: the construction of the weighted decision tree and the computation of its main parameters are described in detail, the tree is further improved according to the characteristics of protein sequences, and an implementation of the weighted decision tree is presented. Test results show that the algorithm achieves high classification accuracy and fast execution.
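The abstract does not specify the tree's parameter computations, but one common way to weight a decision tree is to scale each attribute's information gain by a per-attribute weight when choosing splits. The sketch below is an assumption-laden illustration of that idea, not the paper's algorithm.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def weighted_gain(rows, labels, attr_index, attr_weight):
    """Information gain of splitting on one attribute, scaled by a
    per-attribute weight -- a criterion a weighted decision tree can use
    to rank sequence features (here `attr_weight` is hypothetical and
    would come from domain knowledge about the attribute)."""
    base = entropy(labels)
    n = len(rows)
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return attr_weight * (base - remainder)
```

At each node the attribute with the highest weighted gain would be chosen, so informative but low-weight attributes can be deliberately deprioritized.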
10.
To counter the problem of acquiring and processing huge amounts of data for synthetic aperture radar (SAR) using traditional sampling techniques, a method for sparse SAR imaging with an optimized azimuthal aperture is presented. The equivalence of an azimuthal matched filter and synthetic array beamforming is shown, so that optimization of the azimuthal sparse aperture can be converted into optimization of synthetic array beamforming. The azimuthal sparse aperture, which is composed of a middle aperture and symmetrical bilateral apertures, can be obtained by optimization algorithms (density weighting and simulated annealing, respectively). Furthermore, sparse imaging of spectrum-analysis SAR based on the optimized sparse aperture is achieved by padding zeros at null samplings and using a non-uniform Taylor window. Compared with traditional sampling, this method reduces the amount of sampling and alleviates the computational burden while retaining acceptable image quality. Unlike periodic sparse sampling, the proposed method exhibits no image ghosts. Results obtained from airborne measurements demonstrate the effectiveness and superiority of the proposed method.
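The simulated-annealing step above searches over which azimuth samples to keep so that the resulting sparse array has low sidelobes. The toy sketch below anneals the peak sidelobe level of a sparse uniform-weight array on a half-wavelength grid; the objective, cooling schedule, and mainlobe-exclusion width are all my own simplifications, not the paper's.

```python
import cmath
import math
import random

def peak_sidelobe(positions, n_angles=181):
    """Peak sidelobe level (relative to the mainlobe) of a sparse array
    with unit-weight elements at integer `positions` on a half-wavelength grid."""
    mainlobe = len(positions)
    worst = 0.0
    for k in range(n_angles):
        u = -1.0 + 2.0 * k / (n_angles - 1)  # u = sin(angle), scanned over [-1, 1]
        if abs(u) < 0.05:  # crude mainlobe exclusion (assumed width)
            continue
        af = abs(sum(cmath.exp(1j * math.pi * u * p) for p in positions))
        worst = max(worst, af / mainlobe)
    return worst

def anneal_aperture(n_total, n_keep, steps=500, seed=0):
    """Simulated annealing over which of n_total azimuth positions to keep."""
    rng = random.Random(seed)
    current = rng.sample(range(n_total), n_keep)
    cost = peak_sidelobe(current)
    for step in range(steps):
        t = 0.1 * (1 - step / steps) + 1e-4  # linear cooling schedule
        cand = current[:]
        cand[rng.randrange(n_keep)] = rng.randrange(n_total)  # perturb one element
        if len(set(cand)) < n_keep:
            continue  # skip duplicate positions
        c = peak_sidelobe(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability
        if c < cost or rng.random() < math.exp((cost - c) / t):
            current, cost = cand, c
    return sorted(current), cost
```

In the paper the annealing is applied only to the symmetrical bilateral apertures (density weighting handles the middle aperture); this sketch anneals the whole aperture for brevity.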

Copyright©北京勤云科技发展有限公司  京ICP备09084417号