33.
Most existing worm-detection methods slow worm propagation, and thereby minimize damage, by closing insecure ports or cutting off communication between infected and uninfected hosts. In practice, deploying these methods runs into many obstacles, the largest of which is a high false-detection rate. This paper applies the DC-T cell cooperation mechanism from immune danger theory to worm detection, where DCs (dendritic cells) belong to the innate immune system and T cells to the adaptive immune system. The model treats the system-call sequences triggered by a worm process as antigens, and the host and network anomalies caused by worm infection as danger signals. In this model, DCs collect and detect the danger signals and present the antigens associated with them to T-cell detectors, which then check the antigen structure. Theoretical analysis shows that this dual detection scheme lowers both the false-positive and false-negative rates, and that the use of memory T-cell detectors lets the system respond more quickly to reinfection by similar worms.
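The two-stage detection described above can be sketched as follows. This is a minimal illustrative simplification, not the paper's model: the danger-signal names, the maturation threshold, and the set of known antigen structures are all assumptions made for the example.

```python
# Hypothetical sketch of the DC/T-cell cooperation described above.
# Signal names, the threshold, and the pattern set are illustrative assumptions.

DANGER_THRESHOLD = 1.5          # DC "maturation" threshold (assumed)
KNOWN_BAD_PATTERNS = {          # T-cell detectors: antigen structures (assumed)
    ("open", "write", "exec"),
    ("socket", "connect", "send", "exec"),
}

def dc_stage(danger_signals):
    """Stage 1 (innate): a DC accumulates danger signals (e.g. anomaly
    scores for CPU load, traffic volume, failed connections) and
    'matures' only when the accumulated evidence crosses a threshold."""
    return sum(danger_signals.values()) >= DANGER_THRESHOLD

def t_cell_stage(syscall_seq):
    """Stage 2 (adaptive): check the antigen -- the syscall sequence the
    suspect process triggered -- against known malicious structures."""
    for n in (3, 4):  # pattern lengths present in the detector set
        for i in range(len(syscall_seq) - n + 1):
            if tuple(syscall_seq[i:i + n]) in KNOWN_BAD_PATTERNS:
                return True
    return False

def detect(danger_signals, syscall_seq):
    """Only antigens presented by a matured DC reach the T-cell check;
    requiring both pieces of evidence is what lowers the false-positive
    and false-negative rates."""
    return dc_stage(danger_signals) and t_cell_stage(syscall_seq)
```

A benign process with an alarming syscall sequence but no danger signals, or a noisy host with no malicious antigen, both fail one of the two stages.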
34.
Mao-Zu Guo, Jun Wang, Chun-yu Wang, Yang Liu 《Soft Computing - A Fusion of Foundations, Methodologies and Applications》2009, 13(12): 1143-1151
TagSNP selection, which aims to select a small subset of informative single nucleotide polymorphisms (SNPs) to represent the whole large SNP set, plays an important role in current genomic research. Not only can it cut down the cost of genotyping by filtering out a large number of redundant SNPs, but it can also accelerate genome-wide disease-association studies. In this paper, we propose a new hybrid method called CMDStagger, which combines ideas from clustering and graph algorithms to find the minimum set of tagSNPs. The proposed algorithm uses linkage-disequilibrium association information and haplotype diversity to reduce information loss in tagSNP selection, and it is not constrained by block partitioning. The approach is tested on eight benchmark datasets from HapMap and chromosome 5q31. Experimental results show that the algorithm reduces selection time and obtains fewer tagSNPs with high prediction accuracy, indicating better performance than previous methods.
36.
Coronary artery disease (CAD) is a condition in which the heart muscle does not receive a sufficient blood supply because of the accumulation of fatty deposits in the arteries. As reported by the World Health Organization, around 32% of all deaths worldwide are caused by CAD, and it is estimated that approximately 23.6 million people will die from this disease in 2030. CAD develops over time, and it is difficult to diagnose until a blockage or a heart attack occurs. To avoid the side effects and high costs of current methods, researchers have proposed diagnosing CAD with computer-aided systems, which analyze physical and biochemical values at a lower cost. In this study, for CAD diagnosis, (i) seven different computational feature selection (FS) methods, one domain-knowledge-based FS method, and different classification algorithms are evaluated; and (ii) an exhaustive ensemble FS method and a probabilistic ensemble FS method are proposed. The proposed approach is tested on three publicly available CAD data sets using six different classification algorithms and four different variants of voting algorithms. The performance metrics are comparatively evaluated over numerous combinations of classifiers and FS methods. The multi-layer perceptron classifier obtained satisfactory results on all three data sets. Performance evaluations show that the proposed approach achieves 91.78%, 85.55%, and 85.47% accuracy on the Z-Alizadeh Sani, Statlog, and Cleveland data sets, respectively.
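The ensemble-FS idea, combining the votes of several feature-selection methods, can be sketched in a few lines. This is a hedged illustration of the general voting principle only; the keep-ratio cutoff and feature names are assumptions, not values from the study.

```python
# Illustrative sketch of ensemble feature selection by voting: each FS
# method contributes a feature subset, a feature's score is the fraction
# of methods that selected it, and features above a cutoff are kept.
# The cutoff (keep_ratio) is an assumed parameter.
from collections import Counter

def ensemble_select(selections, keep_ratio=0.5):
    """selections: list of feature-name sets, one per FS method.
    Returns the features chosen by at least keep_ratio of the methods."""
    votes = Counter(f for sel in selections for f in sel)
    n = len(selections)
    return {f for f, v in votes.items() if v / n >= keep_ratio}
```

Lowering `keep_ratio` approaches a union of the subsets; raising it toward 1.0 approaches their intersection.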
37.
Attribute selection with fuzzy decision reducts (cited by 2: 0 self-citations, 2 by others)
Rough set theory provides a methodology for data analysis based on the approximation of concepts in information systems. It revolves around the notion of discernibility: the ability to distinguish between objects based on their attribute values. It makes it possible to infer data dependencies that are useful in the fields of feature selection and decision model construction. In many cases, however, it is more natural, and more effective, to consider a gradual notion of discernibility. Therefore, within the context of fuzzy rough set theory, we present a generalization of the classical rough set framework for data-based attribute selection and reduction using fuzzy tolerance relations. The paper unifies existing work in this direction and introduces the concept of fuzzy decision reducts, dependent on an increasing attribute subset measure. Experimental results demonstrate the potential of fuzzy decision reducts to discover shorter attribute subsets, leading to decision models with better coverage and comparable, or even higher, accuracy.
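The "gradual notion of discernibility" can be made concrete with a tiny sketch of a fuzzy tolerance relation. This illustrates the framework, not the paper's exact reduct algorithm: the similarity formula (1 minus normalized distance) and the min t-norm are common textbook choices assumed here.

```python
# Minimal sketch of graded discernibility: similarity of two objects on
# one attribute is 1 - normalized distance, and similarity over an
# attribute subset B is the minimum over B (t-norm = min). A value of
# 1.0 means indiscernible on B; values in (0, 1) are partial matches.

def tolerance(x, y, attrs, ranges):
    """Fuzzy tolerance R_B(x, y) for attribute subset B = attrs.
    x, y: dicts of attribute values; ranges: attribute value ranges."""
    return min(1 - abs(x[a] - y[a]) / ranges[a] for a in attrs)
```

Adding attributes to the subset can only lower (never raise) the similarity, which is what lets reducts trade attribute-set size against discernibility.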
38.
In developing information systems with a B/S (browser/server) architecture, the number of input combinations of n user-selectable conditions is in general 2^n. The algorithm proposed in this paper, however, reduces the number of combinations that must be handled to between n and n(n+1)/2, greatly simplifying the programming. 
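A common way to see why 2^n hand-coded branches are unnecessary is to assemble the query dynamically from whichever of the n conditions the user actually filled in. This is a generic sketch of that standard technique, not the specific algorithm of the paper:

```python
# Build a WHERE clause from only the conditions the user supplied,
# so one code path handles all 2^n input combinations. Values are
# returned separately for parameterized execution (never interpolated
# into the SQL string).

def build_where(conditions):
    """conditions: dict mapping column name -> value; None or empty
    values mean the user left that condition blank."""
    active = {c: v for c, v in conditions.items() if v not in (None, "")}
    clause = " AND ".join(f"{c} = %s" for c in active)
    return ("WHERE " + clause if clause else ""), list(active.values())
```

For example, a form with name and department filled in but year left blank yields a two-condition clause, with no per-combination branch in the code.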
40.
CBSD (Component-Based Software Development) has become the mainstream technique for embedded software development. In embedded environments there are large numbers of functionally similar components, so QoS-based component selection during development has become a research hotspot. However, the declared QoS of these components is often inconsistent with what the providers actually deliver. This paper therefore proposes a component-selection method based on corrected QoS values: the QoS trust degree is used as a weight to correct the QoS value, and fuzzy logic is then applied to infer the component's overall service capability. An embedded VOD simulation experiment verifies the effectiveness of the algorithm and the objectivity and accuracy of the resulting component selection.
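The trust-weighted correction step can be sketched as a simple blend of declared and observed QoS. This is a hypothetical simplification of the idea in the abstract; the linear blending formula and the example components are assumptions, and the fuzzy-logic inference stage is omitted.

```python
# Hypothetical sketch of trust-weighted QoS correction: a provider's
# declared QoS value is blended with the historically observed value,
# weighted by the provider's trust degree in [0, 1].

def corrected_qos(declared, observed, trust):
    """High trust -> lean on the declaration; low trust -> lean on
    what was actually measured."""
    return trust * declared + (1 - trust) * observed

def rank_components(components):
    """components: list of (name, declared, observed, trust) tuples.
    Rank candidate components by corrected QoS, highest first."""
    return sorted(components,
                  key=lambda c: corrected_qos(c[1], c[2], c[3]),
                  reverse=True)
```

A component with an inflated declaration but a low trust degree can thus rank below an honestly declared competitor.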