991.
Alzheimer’s disease is a non-reversible, incurable, and progressive neurological disorder that causes the shrinkage and death of the neuronal populations associated with memory formation and retention. It accounts for about 60%–80% of dementia cases and is usually observed in people aged 60 and above. Depending on the severity of symptoms, patients can be categorized as Cognitively Normal (CN), Mild Cognitive Impairment (MCI), or Alzheimer’s Disease (AD); AD is the final phase, in which the brain is severely damaged and patients are no longer able to live on their own. Radiomics is an approach for extracting a large number of features from medical images with the help of data-characterization algorithms. Here, 105 radiomic features are extracted and used to predict Alzheimer’s disease. This paper uses Support Vector Machine, K-Nearest Neighbour, Gaussian Naïve Bayes, eXtreme Gradient Boosting (XGBoost), and Random Forest classifiers. The proposed random-forest-based approach with the radiomic features achieved an accuracy of 85%. It also achieved 88% accuracy, 88% recall, 88% precision, and 87% F1-score for AD vs. CN; 72% accuracy, 73% recall, 72% precision, and 71% F1-score for AD vs. MCI; and 69% accuracy, 69% recall, 68% precision, and 69% F1-score for MCI vs. CN. The comparative analysis shows that the proposed approach performs better than the other approaches.
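As a rough illustration of the classification pipeline described above, the following sketch trains a random-forest classifier on a radiomic feature matrix and reports the same metrics. The 105-column matrix `X` and the labels `y` are random placeholders standing in for the paper's actual radiomic features and diagnoses, which are not reproduced here.

```python
# Hypothetical sketch: random forest on pre-extracted radiomic features.
# X (n_samples x 105) and y are stand-ins for the paper's real data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 105))       # 105 radiomic features per scan
y = rng.integers(0, 2, size=200)      # binary task, e.g. AD vs. CN

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall   :", recall_score(y_te, pred))
print("F1       :", f1_score(y_te, pred))
```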
992.
Neural Computing and Applications - COVID-19 has emerged as a global crisis with unprecedented socio-economic challenges, jeopardizing our lives and livelihoods for years to come. The...
993.
There has been extensive and widespread deployment of wireless local area networks (WLANs) for information access. The transmission, being of a broadcast nature, is vulnerable to security threats; hence, security provisioning in these networks has assumed an important dimension. The security of data transmitted over a wireless channel aims at protecting the data from unauthorized access, an objective achieved by providing advanced security mechanisms. Implementing strong security mechanisms, however, affects throughput performance and increases the complexity of the communication system. In this paper, we investigate the security performance of a WLAN based on the IEEE 802.11b/g/n standards on an experimental testbed, in congested and uncongested networks and in single- and multi-client environments. Experimental results are obtained for a layered security model encompassing nine security protocols in terms of throughput, response time, and encryption overhead. The performance impact of transmission control protocol and user datagram protocol traffic streams on secure wireless networks has also been studied. Through numerical results obtained from the testbed, we present quantitative and realistic findings for both the security mechanisms and network performance. The tradeoff between the strength of the security protocol and the associated performance is analyzed through computer simulation results. This real-time analysis enables network designers to make intelligent choices about the implementation of security features and the perceived network performance for a given application scenario.
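The encryption overhead such a testbed measures can be illustrated with a small host-side timing sketch. WPA2 (802.11i) protects frames with AES in CCM mode; the snippet below times AES-CCM encryption of frame-sized payloads using the Python `cryptography` package. It is an illustrative CPU measurement only, not a reproduction of the paper's testbed or its nine-protocol layered model.

```python
# Illustrative only: time AES-CCM (the WPA2 cipher mode) on frame-sized payloads.
import os
import time
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

key = AESCCM.generate_key(bit_length=128)
aead = AESCCM(key)
payload = os.urandom(1500)           # roughly one Ethernet-sized frame

n = 10_000
start = time.perf_counter()
for _ in range(n):
    nonce = os.urandom(13)           # 13-byte CCM nonce, as used in 802.11 CCMP
    aead.encrypt(nonce, payload, None)
elapsed = time.perf_counter() - start

per_frame = elapsed / n
print(f"~{per_frame * 1e6:.1f} us per frame; "
      f"crypto-bound throughput ~{len(payload) * 8 / per_frame / 1e6:.0f} Mbit/s")
```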
994.
In recent years, the Grover search algorithm (Proceedings, 28th annual ACM symposium on the theory of computing, pp. 212–219, 1996), by exploiting quantum parallelism, has revolutionized the solution of a huge class of NP problems in comparison with classical systems. In this work, we explore the idea of extending the Grover search algorithm to approximate algorithms. We analyze the applicability of Grover search to processing an unstructured database with a dynamic selection function, in contrast to the static selection function used in the original work (Grover in Proceedings, 28th annual ACM symposium on the theory of computing, pp. 212–219, 1996). We show that this alteration allows us to extend the application of Grover search to the field of randomized search algorithms. Further, we use the dynamic Grover search algorithm to define the goals for a recommendation system, based on which we propose a recommendation algorithm that uses a binomial similarity distribution space, giving us a quadratic speedup over traditional classical unstructured recommendation systems. Finally, we show how dynamic Grover search can be used to tackle a wide range of optimization problems, improving on the complexity of existing optimization algorithms.
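For reference, a standard Grover iteration (the static-oracle baseline, not the paper's dynamic-selection variant) can be simulated classically on a small search space: each iteration phase-flips the marked amplitudes and then inverts all amplitudes about their mean, and roughly (π/4)√(N/M) iterations maximize the success probability. The sketch below is a plain statevector simulation with NumPy; the space size and marked item are arbitrary examples.

```python
# Classical statevector simulation of standard (static-oracle) Grover search.
import numpy as np

N = 256                  # size of the unstructured search space
marked = {42}            # items the (static) selection function accepts
M = len(marked)

amp = np.full(N, 1 / np.sqrt(N))                 # uniform superposition
iters = int(round(np.pi / 4 * np.sqrt(N / M)))   # ~optimal iteration count

for _ in range(iters):
    for i in marked:                 # oracle: phase-flip marked amplitudes
        amp[i] = -amp[i]
    amp = 2 * amp.mean() - amp       # diffusion: inversion about the mean

p_success = sum(amp[i] ** 2 for i in marked)
print(f"{iters} iterations, success probability ~{p_success:.3f}")
```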
995.
The fault-tolerance of distributed algorithms is investigated in asynchronous message-passing systems with undetectable process failures. Two specific synchronization problems are considered: the dining philosophers problem and the binary committee coordination problem. The abstraction of a bounded doorway is introduced as a general mechanism for achieving individual progress and good failure locality. Using it as a building block, optimal fault-tolerant algorithms are constructed for the two problems.
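As background, the synchronization problem itself can be sketched in a few lines. The following is a minimal dining-philosophers illustration using the classic global resource-ordering trick to avoid deadlock; it does not implement the paper's bounded-doorway mechanism or its fault-tolerance guarantees under undetectable process failures.

```python
# Classic deadlock-free dining philosophers via global resource ordering.
# NOT the paper's bounded-doorway construction; a baseline illustration only.
import threading
import time

N = 5
forks = [threading.Lock() for _ in range(N)]

def philosopher(i, meals=3):
    left, right = i, (i + 1) % N
    first, second = min(left, right), max(left, right)  # acquire in global order
    for _ in range(meals):
        with forks[first]:
            with forks[second]:
                time.sleep(0.01)      # eat
        time.sleep(0.01)              # think

threads = [threading.Thread(target=philosopher, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("all philosophers finished without deadlock")
```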
996.
997.
Our current understanding of Web structure is based on large graphs created by centralized crawlers and indexers. They obtain data almost exclusively from the so-called surface Web, which consists, loosely speaking, of interlinked HTML pages. The deep Web, by contrast, is information that is reachable over the Web, but that resides in databases; it is dynamically available in response to queries, not placed on static pages ahead of time. Recent estimates indicate that the deep Web has hundreds of times more data than the surface Web. The deep Web gives us reason to rethink much of the current doctrine of broad-based link analysis. Instead of looking up pages and finding links on them, Web crawlers would have to produce queries to generate relevant pages. Creating appropriate queries ahead of time is nontrivial without understanding the content of the queried sites. The deep Web's scale would also make it much harder to cache results than to merely index static pages. Whereas a static page presents its links for all to see, a deep Web site can decide whose queries to process and how well. It can, for example, authenticate the querying party before giving it any truly valuable information and links. It can build an understanding of the querying party's context in order to give proper responses, and it can engage in dialogues and negotiate for the information it reveals. The Web site can thus prevent its information from being used by unknown parties. What's more, the querying party can ensure that the information is meant for it.
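A hypothetical sketch of what query-driven crawling looks like in practice: instead of following links on static pages, the crawler submits a query to a site's search form and harvests links from the dynamically generated result page. The endpoint URL and the `q` form-field name below are invented for illustration; real deep-web sites each require their own form analysis.

```python
# Hypothetical sketch of query-based deep-web probing (stdlib only).
# The endpoint and form field name are made up for illustration.
from html.parser import HTMLParser
from urllib.parse import urlencode
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects href targets from anchor tags in a result page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

def probe(endpoint, term):
    data = urlencode({"q": term}).encode()   # "q" is an assumed field name
    with urlopen(endpoint, data=data) as resp:
        parser = LinkCollector()
        parser.feed(resp.read().decode(errors="replace"))
    return parser.links

# links = probe("https://example.org/search", "deep web")  # hypothetical site
```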
998.
In this paper, we propose a novel use of data mining algorithms for extracting knowledge from a large set of flow shop schedules. The purpose of this work is to apply data mining methodologies to explore the patterns in data generated by an ant colony algorithm performing a scheduling operation, and to develop a rule-set scheduler that approximates the ant colony algorithm's scheduler. Ant colony optimization (ACO) is a paradigm for designing metaheuristic algorithms for combinatorial optimization problems; the natural metaphor on which ant algorithms are based is that of ant colonies. Fascinated by the ability of almost-blind ants to establish the shortest route from their nest to a food source and back, researchers found that these ants secrete a substance called ‘pheromone’ and use its trails as a medium for communicating information among each other. The ant algorithm is simple to implement, and the results of the case studies show its ability to provide speedy and accurate solutions; a minimal sketch is given below. Further, we employ genetic algorithm operators such as crossover and mutation to generate new regions of the solution space. The data mining tool we use is the decision tree produced by the See5 software after the instances are classified. The mining targets knowledge about job scheduling with the objective of makespan minimization in a flow shop environment. Data mining systems typically use conditional relationships represented by IF-THEN rules, allowing production managers to easily make decisions regarding flow shop scheduling based on various objective functions and constraints.
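To make the optimization target concrete, the sketch below computes the makespan of a job permutation in a permutation flow shop and runs a deliberately minimal ACO loop over job orderings. The processing-time matrix and all parameter values are invented for the example; the paper's actual ACO design, genetic operators, and See5 rule induction are not reproduced.

```python
# Minimal illustration: permutation flow-shop makespan + a toy ACO loop.
# Processing times and ACO parameters are invented for the example.
import random

p = [[5, 3, 6], [2, 7, 4], [6, 2, 3], [4, 5, 5]]   # p[job][machine]
n_jobs, n_machines = len(p), len(p[0])

def makespan(seq):
    c = [0] * n_machines                 # completion time on each machine
    for j in seq:
        c[0] += p[j][0]
        for m in range(1, n_machines):
            c[m] = max(c[m], c[m - 1]) + p[j][m]
    return c[-1]

tau = [[1.0] * n_jobs for _ in range(n_jobs)]      # pheromone: position x job
best_seq, best_ms = None, float("inf")
for _ in range(100):                               # iterations
    for _ in range(10):                            # ants per iteration
        remaining = list(range(n_jobs))
        seq = []
        for pos in range(n_jobs):
            w = [tau[pos][j] for j in remaining]   # pheromone-weighted choice
            j = random.choices(remaining, weights=w)[0]
            seq.append(j)
            remaining.remove(j)
        ms = makespan(seq)
        if ms < best_ms:
            best_seq, best_ms = seq, ms
    for pos, j in enumerate(best_seq):             # reinforce the best tour
        tau[pos][j] += 1.0 / best_ms

print("best sequence:", best_seq, "makespan:", best_ms)
```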
999.
In the present study, the effect of lifting task parameters on the heart rate and oxygen uptake of workers during manual lifting tasks in different ambient conditions was evaluated. Experiments conducted in two different temperature conditions showed significantly higher oxygen uptake and heart rate in colder conditions than in warmer conditions. Three other factors, namely load, lifting frequency, and vertical distance, were found to significantly affect the responses. Various combinations of the significant factors were used to calculate oxygen uptake and heart rate, which were then compared with the safe limits based on the maximum aerobic capacity of workers. Based on these comparisons, safe combinations were identified that can be used to design lifting tasks in varied ambient conditions. The study further concluded that lifting tasks performed in winter should have different relaxation or fatigue allowances built into the cycle time of the task to compensate for the higher exertion. © 2011 Wiley Periodicals, Inc.
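The comparison against safe limits can be illustrated with a simple check. A common ergonomics rule of thumb caps sustained working oxygen uptake at roughly one third of a worker's maximum aerobic capacity over a full shift; the threshold fraction and the sample values below are illustrative placeholders, not the study's measurements.

```python
# Illustrative safe-limit check for a lifting task.
# The one-third-of-VO2max ceiling is a common ergonomics rule of thumb for a
# full shift; measured values here are invented examples, not study data.
def is_safe(vo2_task_l_min, vo2_max_l_min, limit_fraction=1 / 3):
    """True if task oxygen uptake stays under the sustained-work limit."""
    return vo2_task_l_min <= limit_fraction * vo2_max_l_min

for vo2_task in (0.8, 1.1, 1.5):                 # L/min during the lifting task
    ok = is_safe(vo2_task, vo2_max_l_min=3.0)    # worker with VO2max of 3 L/min
    print(f"task VO2 {vo2_task:.1f} L/min -> {'safe' if ok else 'exceeds limit'}")
```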
1000.