91.
An alternative approach to fuzzy control charts: Direct fuzzy approach   (Cited by: 1; self-citations: 0; citations by others: 1)
The major contribution of fuzzy set theory lies in its capability of representing vague data. Fuzzy logic offers a systematic basis for dealing with situations that are ambiguous or not well defined. The few existing papers on fuzzy control charts apply defuzzification methods in the early steps of their algorithms, which makes them closely resemble the classical analysis: linguistic data are transformed into numeric values before the control limits are calculated, so both the control limits and the sample values become numeric. In this paper, some contributions to fuzzy control charts based on fuzzy transformation methods are made by using α-cuts to determine the tightness of the inspection: the higher the value of α, the tighter the inspection. A new alternative approach, the "Direct Fuzzy Approach (DFA)", is also developed. In contrast to the existing fuzzy control charts, the proposed approach does not require defuzzification, which prevents the loss of information carried by the samples; it compares linguistic data directly in fuzzy space without any transformation. Numeric examples illustrate the performance of the method and the interpretation of its results.
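The α-cut idea above can be sketched for a triangular membership function: at level α, the fuzzy number collapses to a crisp interval, and raising α narrows that interval toward the peak, i.e. tightens the inspection. A minimal sketch; the function name and the triple `(3.0, 5.0, 7.0)` are illustrative, not taken from the paper:

```python
def alpha_cut(tri, alpha):
    """Closed interval [lo, hi] of a triangular fuzzy number (a, b, c) at level alpha."""
    a, b, c = tri
    lo = a + alpha * (b - a)  # left bound rises toward the peak b as alpha grows
    hi = c - alpha * (c - b)  # right bound falls toward the peak b
    return lo, hi

# A linguistic sample "approximately 5" as a triangular fuzzy number:
sample = (3.0, 5.0, 7.0)
loose = alpha_cut(sample, 0.0)  # (3.0, 7.0): widest interval, loosest inspection
tight = alpha_cut(sample, 0.8)  # narrower interval, tighter inspection
```

At α = 0 the whole support is kept; at α = 0.8 the interval shrinks to roughly (4.6, 5.4), so a sample must sit much closer to the target to remain in control.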
92.
Noun stem identification in modern Uyghur is an important foundational task in natural language processing; its main goal is to extract noun stems from sentences and improve the efficiency of noun recognition. This paper first introduces the concept of morphological analysis, whose features allow a word's part of speech to be identified accurately. It then discusses the criteria for word-class division in Uyghur and analyzes the morphological features of nouns, summarizing affix ambiguities and the rules for resolving them. The paper presents the overall research approach and designs an algorithm for identifying nouns among new words in modern Uyghur, covering feature selection and parameter estimation, word-internal features, and features of preceding and following dependent words. Finally, Uyghur-language junior and senior high school physics textbooks are used as validation material for the statistical analysis of noun stems.
93.
A wavelet-domain association rules method is proposed for efficient texture characterization. The concept of association rules is used to capture frequently occurring local intensity variations in textures, and the frequency of occurrence of these local patterns within a region serves as the texture feature. Since texture is fundamentally a multi-scale phenomenon, multi-resolution approaches such as wavelets are expected to perform well for texture analysis; this study therefore proposes a new algorithm that uses wavelet-domain association rules for texture classification. Essentially, this work extends the earlier work of Rushing et al. [10], [11], in which the generation of intensity-domain association rules was proposed for efficient texture characterization. Wavelet-domain and intensity-domain (gray-scale) association rules were generated for performance comparison. Rushing et al. [10], [11] demonstrated that intensity-domain association rules yield much more accurate results than the methods compared in their work; the experiments performed here further show that wavelet-domain association rules are more effective than intensity-domain association rules for the texture classification problem. The overall success rate is about 97%.
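The core idea of mining frequent local patterns and using their frequencies as texture features can be sketched in pure Python. This is a rough stand-in, not the paper's method: the actual approach mines association rules over wavelet coefficients, whereas the 2×2 window, the number of quantization levels, and the toy image below are all assumptions made for illustration:

```python
from collections import Counter

def local_pattern_features(image, levels=4):
    """Count quantized 2x2 intensity patterns and return their relative
    frequencies -- a toy analogue of intensity-domain association rules."""
    h, w = len(image), len(image[0])
    q = lambda v: min(v * levels // 256, levels - 1)  # quantize 0..255 into `levels` bins
    counts = Counter()
    for r in range(h - 1):
        for c in range(w - 1):
            patt = (q(image[r][c]), q(image[r][c + 1]),
                    q(image[r + 1][c]), q(image[r + 1][c + 1]))
            counts[patt] += 1
    total = (h - 1) * (w - 1)
    return {p: n / total for p, n in counts.items()}  # frequencies as features

img = [[0, 0, 255, 255]] * 4  # half-dark, half-bright toy "texture"
feats = local_pattern_features(img)
```

In the multi-resolution version, the same counting would be applied to each wavelet subband instead of raw gray levels.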
94.
HIV-1 protease has been the subject of intense research aimed at deciphering the HIV-1 replication process for decades. Knowledge of the substrate specificity of HIV-1 protease will light the way toward the development of HIV-1 protease inhibitors. Various feature encoding techniques and machine learning algorithms have frequently been used to predict HIV-1 protease cleavage sites. In this paper, a new amino acid feature encoding scheme is proposed for this prediction task, combining orthonormal encoding with Taylor's Venn diagram. Linear support vector machines are used as the classifier in the tests, and the technique is analyzed by comparison with several other feature encoding techniques. The tests are carried out on the PR-1625 and PR-3261 datasets. Experimental results show that the proposed amino acid encoding technique leads to better classification performance than the other encoding techniques on a standalone classifier.
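Orthonormal encoding of amino acids is the standard one-hot scheme: each residue maps to a unit vector orthogonal to all others, and the concatenated vectors for a peptide window feed the linear SVM. A minimal sketch of that step only (the combination with Taylor's Venn diagram is not reproduced, and the alphabet ordering and example peptide are assumptions):

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues, one-letter codes

def orthonormal_encode(peptide):
    """One-hot (orthonormal) encoding: each residue becomes a 20-dim unit
    vector, so distinct residues are mutually orthogonal. The concatenation
    of these vectors is the feature vector for a classifier."""
    vec = []
    for aa in peptide:
        one_hot = [0] * len(AMINO_ACIDS)
        one_hot[AMINO_ACIDS.index(aa)] = 1
        vec.extend(one_hot)
    return vec

x = orthonormal_encode("SQNY")  # a hypothetical 4-residue window -> 80 features
```

Cleavage-site predictors typically encode a fixed-length window around the candidate scissile bond this way before classification.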
95.
This paper investigates the control of nonlinear systems by neural networks and fuzzy logic. Three control methods are studied: Gaussian neuro-fuzzy variable structure (GNFVS), the feedback error learning architecture (FELA), and the direct inverse modeling architecture (DIMA). Their performances are comparatively evaluated on a two-degrees-of-freedom direct-drive robotic manipulator with respect to trajectory tracking performance, computational complexity, design complexity, RMS errors, the training time required in the learning phase, and payload variations.
96.
The educational timetabling problem is a challenging real-world problem that has attracted many researchers and practitioners. Its many variants mainly require scheduling events and resources under various constraints. In this study, a curriculum-based course timetabling problem at Yeditepe University is described, and an iterative selection hyper-heuristic is presented as a solution method. A selection hyper-heuristic is a high-level methodology that operates on the space formed by a fixed set of low-level heuristics, which in turn operate directly on the space of solutions; its main components are the move acceptance and heuristic selection methods. The hyper-heuristic proposed here combines a simulated annealing move acceptance method with a learning heuristic selection method and manages a set of low-level constraint-oriented heuristics. A key goal in hyper-heuristic research is to build low-cost methods that are general and can be reused on unseen problem instances as well as in other problem domains, ideally with no additional human expert intervention. Hence, the proposed method is additionally applied to a high school timetabling problem, as well as six other problem domains from a hyper-heuristic benchmark, to test its level of generality. The empirical results show that this easy-to-implement hyper-heuristic is effective in solving the Yeditepe course timetabling problem; moreover, being sufficiently general, it delivers reasonable performance across different problem domains.
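The two components named above, a learning heuristic selector and simulated-annealing move acceptance, can be sketched as a generic loop. This is a schematic of the selection hyper-heuristic idea, not the paper's algorithm: the toy domain, the linear cooling schedule, and the score-update rewards are all assumptions:

```python
import math
import random

def hyper_heuristic(initial, cost, heuristics, iters=5000, t0=1.0, seed=0):
    """Selection hyper-heuristic sketch: a score-based (learning) heuristic
    selector paired with simulated-annealing move acceptance."""
    rng = random.Random(seed)
    scores = [1.0] * len(heuristics)  # reinforcement score per low-level heuristic
    best = current = initial
    for i in range(iters):
        t = t0 * (1 - i / iters) + 1e-9  # simple linear cooling schedule
        k = rng.choices(range(len(heuristics)), weights=scores)[0]
        candidate = heuristics[k](current, rng)
        delta = cost(candidate) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):  # SA acceptance
            current = candidate
            scores[k] += 1.0 if delta < 0 else 0.1  # reward useful heuristics
        if cost(current) < cost(best):
            best = current
    return best

# Toy domain: minimize |x - 42| with two perturbation heuristics.
h1 = lambda x, rng: x + rng.choice([-1, 1])
h2 = lambda x, rng: x + rng.choice([-5, 5])
result = hyper_heuristic(0, lambda x: abs(x - 42), [h1, h2])
```

In the timetabling setting, the low-level heuristics would be constraint-oriented moves on a timetable and the cost a weighted count of constraint violations.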
97.
In this study, the strengthening with polymer of polypropylene fiber reinforced concrete exposed to high temperature was examined. A Taguchi L9 (3³) orthogonal array was used for the design of experiments, with three parameters: polypropylene fiber percentage (0 %, 1 % and 2 %), exposure temperature (300 °C, 600 °C and 900 °C) and curing period (3, 7 and 28 days). Cube samples of 100×100×100 mm were produced for the compressive strength and ultrasonic pulse velocity tests. The samples were removed from the water, dried at 105 ± 5 °C, and then exposed to temperatures of 300 °C, 600 °C and 900 °C. The samples were then impregnated with vinyl acetate monomer and the monomer was polymerized, after which the compressive strength and ultrasonic pulse velocity tests were performed. The Taguchi analysis showed that the largest compressive strength and ultrasonic pulse velocity were obtained from the samples with 0 % polypropylene fiber exposed to 600 °C and cured for 28 days. ANOVA showed that high temperature had the largest effect on the compressive strength and ultrasonic pulse velocity of the polymer-strengthened concrete.
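Taguchi analysis of a response such as compressive strength typically ranks factor settings by a signal-to-noise ratio; for a larger-the-better response the standard formula is S/N = -10·log10(mean(1/y²)). A small sketch of that computation, where the replicate strength values are hypothetical, not data from the study:

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio: -10*log10(mean(1/y^2))."""
    return -10 * math.log10(sum(1 / y ** 2 for y in values) / len(values))

# Hypothetical compressive-strength replicates (MPa) for two factor settings:
sn_a = sn_larger_is_better([42.0, 45.0, 44.0])
sn_b = sn_larger_is_better([18.0, 20.0, 19.0])
# The setting with the higher S/N ratio is both stronger and more robust.
```

With an L9 (3³) array, each of the three factors gets a mean S/N per level, and the level with the highest mean is selected as optimal.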
98.
Neural Computing and Applications - The use of games in daily life, especially in education, has been on the rise during the COVID-19 pandemic. Thus, game-based learning environments have...
99.
We present formal definitions of anonymity properties for voting protocols using the process algebra CSP. We analyse a number of anonymity definitions, and give formal definitions for strong and weak anonymity, highlighting the difference between them. We show that the strong anonymity definition is too strong for practical purposes; the weak anonymity definition, however, turns out to be ideal for analysing voting systems. Two case studies are presented to demonstrate the usefulness of the formal definitions: a conventional voting system, and Prêt à Voter, a paper-based, voter-verifiable scheme. In each case, we give a CSP model of the system and analyse it against our anonymity definitions by specification checks using the Failures-Divergences Refinement (FDR2) model checker. We give a detailed discussion of the results of the analysis, emphasizing the assumptions made in our models as well as the challenges of modelling electronic voting systems using CSP.
100.
Palmprint Recognition by Applying Wavelet-Based Kernel PCA   (Cited by: 2; self-citations: 0; citations by others: 2)
This paper presents a wavelet-based kernel Principal Component Analysis (PCA) method for palmprint recognition, integrating the Daubechies wavelet representation of palm images with the kernel PCA method. Kernel PCA is a technique for nonlinear dimension reduction of data with an underlying nonlinear spatial structure. The intensity values of the palmprint image are first normalized using the mean and standard deviation. The palmprint is then transformed into the wavelet domain to decompose the palm image, and the lowest-resolution subband coefficients are chosen for palm representation. The kernel PCA method is then applied to extract nonlinear features from the subband coefficients. Finally, similarity measurement is accomplished using a weighted Euclidean linear distance-based nearest-neighbor classifier. Experimental results on the PolyU Palmprint Database demonstrate that the proposed approach achieves highly competitive performance with respect to published palmprint recognition approaches.
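The two preprocessing steps described above, mean/standard-deviation normalization followed by keeping the low-resolution wavelet subband, can be sketched compactly. This is a simplified stand-in: a one-level Haar transform is used in place of the paper's Daubechies decomposition, and the 4×4 toy image is an assumption:

```python
def normalize(image):
    """Zero-mean, unit-variance intensity normalization of a 2-D image."""
    flat = [v for row in image for v in row]
    mu = sum(flat) / len(flat)
    sd = (sum((v - mu) ** 2 for v in flat) / len(flat)) ** 0.5 or 1.0
    return [[(v - mu) / sd for v in row] for row in image]

def haar_ll(image):
    """One level of a 2-D Haar transform, keeping only the low-low
    (approximation) subband: each 2x2 block is replaced by its average."""
    return [[(image[r][c] + image[r][c + 1] +
              image[r + 1][c] + image[r + 1][c + 1]) / 4
             for c in range(0, len(image[0]) - 1, 2)]
            for r in range(0, len(image) - 1, 2)]

ll = haar_ll(normalize([[10, 10, 50, 50],
                        [10, 10, 50, 50],
                        [90, 90, 30, 30],
                        [90, 90, 30, 30]]))  # 4x4 palm patch -> 2x2 LL subband
```

In the full method, these subband coefficients would then be fed to kernel PCA for nonlinear feature extraction.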