  Subscription full text   263323 articles
  Free   4328 articles
  Free (domestic)   1825 articles
Electrical engineering   5437 articles
Technical theory   5 articles
General   1164 articles
Chemical industry   42485 articles
Metalworking   11061 articles
Machinery and instruments   7776 articles
Building science   6684 articles
Mining engineering   1681 articles
Energy and power   5696 articles
Light industry   26022 articles
Hydraulic engineering   2908 articles
Petroleum and natural gas   6004 articles
Weapons industry   243 articles
Radio and electronics   28007 articles
General industrial technology   49533 articles
Metallurgy   47932 articles
Nuclear technology   5630 articles
Automation   21208 articles
  2021   2511 articles
  2019   2149 articles
  2018   3431 articles
  2017   3383 articles
  2016   3723 articles
  2015   2833 articles
  2014   4643 articles
  2013   11466 articles
  2012   7777 articles
  2011   10301 articles
  2010   8191 articles
  2009   8751 articles
  2008   9593 articles
  2007   9691 articles
  2006   8510 articles
  2005   7450 articles
  2004   6714 articles
  2003   6261 articles
  2002   6255 articles
  2001   6376 articles
  2000   5945 articles
  1999   5932 articles
  1998   13064 articles
  1997   9671 articles
  1996   7363 articles
  1995   5594 articles
  1994   5130 articles
  1993   5009 articles
  1992   3982 articles
  1991   3780 articles
  1990   3849 articles
  1989   3801 articles
  1988   3551 articles
  1987   3043 articles
  1986   3077 articles
  1985   3432 articles
  1984   3340 articles
  1983   3100 articles
  1982   2710 articles
  1981   2913 articles
  1980   2660 articles
  1979   2850 articles
  1978   2752 articles
  1977   2863 articles
  1976   3751 articles
  1975   2469 articles
  1974   2301 articles
  1973   2337 articles
  1972   1993 articles
  1971   1791 articles
Sort order: 10000 query results in total; search took 15 ms
81.
Many activities in today's organizations depend ever more heavily on communications and computer networks, and network managers face the enormous challenge of increasing the availability and efficiency of infrastructures that grow in both size and complexity. This makes it crucial to plan network operation systematically, to define and implement appropriate procedures for regular monitoring and performance assessment, and to set up proper tools for maintenance and troubleshooting. Furthermore, proactive network testing must be pursued: it is vital to document a baseline of normal network operation so that a reference for comparison is available when problems occur. This article surveys the measurement instruments and procedures currently adopted for network testing, and advances in the field of interest to the I&M research community.
82.
Power distribution systems are significantly affected by many outage-causing events. Accurate fault cause identification can help expedite the restoration procedure and improve system reliability. However, the data imbalance present in many real-world data sets often degrades fault cause identification performance. In this paper, the E-algorithm, an extension of Ishibuchi's fuzzy classification algorithm designed to alleviate the effect of an imbalanced data constitution, is applied to Duke Energy outage data for distribution fault cause identification. Three major outage causes (tree, animal, and lightning) are used as prototypes. The performance of the E-algorithm on real-world imbalanced data is compared with that of an artificial neural network. The results show that the E-algorithm can greatly improve performance when the data are imbalanced.
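The abstract does not give the E-algorithm's formulas, but the general idea it names (fuzzy classification rules whose confidences are corrected for class imbalance) can be sketched. Below is a minimal, hypothetical illustration in Python: a single-feature fuzzy-rule classifier in Ishibuchi's style, where each rule's confidence is normalized by class size so a minority class is not drowned out. The fuzzy sets, toy data, and normalization are invented for illustration; this is not the paper's actual E-algorithm.

# Hypothetical sketch of an imbalance-aware fuzzy-rule classifier.
# NOT the E-algorithm itself: fuzzy sets, data, and the per-class
# normalization are invented to illustrate the general idea.
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

# Three fuzzy sets ("low", "medium", "high") over one normalized feature.
FUZZY_SETS = [(-0.5, 0.0, 0.5), (0.0, 0.5, 1.0), (0.5, 1.0, 1.5)]

def rule_confidences(x, y, balanced=True):
    """Confidence of 'IF x is A THEN class c' for each fuzzy set A.

    With balanced=True, each class's total compatibility grade is divided
    by its class size, reducing the bias toward the majority class.
    """
    classes = np.unique(y)
    conf = {}
    for i, (a, b, c) in enumerate(FUZZY_SETS):
        mu = triangular(x, a, b, c)              # compatibility grades
        per_class = []
        for cls in classes:
            s = mu[y == cls].sum()
            if balanced:
                s /= max((y == cls).sum(), 1)    # normalize by class size
            per_class.append(s)
        total = sum(per_class) + 1e-12
        conf[i] = {cls: s / total for cls, s in zip(classes, per_class)}
    return conf

def classify(x_new, conf):
    """Single-winner classification: the most compatible rule decides."""
    scores = {}
    for i, (a, b, c) in enumerate(FUZZY_SETS):
        mu = triangular(np.array([x_new]), a, b, c)[0]
        for cls, cf in conf[i].items():
            scores[cls] = max(scores.get(cls, 0.0), mu * cf)
    return max(scores, key=scores.get)

# Imbalanced toy data: class 0 dominates low feature values, class 1 is rare.
rng = np.random.default_rng(0)
x = np.concatenate([rng.uniform(0.0, 0.6, 950), rng.uniform(0.5, 1.0, 50)])
y = np.concatenate([np.zeros(950, dtype=int), np.ones(50, dtype=int)])

conf = rule_confidences(x, y, balanced=True)
print(classify(0.9, conf))   # the minority class can still win on its region

With balanced=False, the 950 majority samples dominate every rule's confidence and the minority class is rarely predicted; the per-class normalization is one simple way to counter that, which is the kind of effect the E-algorithm targets.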
83.
Electronic textiles, or e-textiles, are an increasingly important part of wearable computing, helping to make pervasive devices truly wearable. These soft, fabric-based computers can function as lovely embodiments of Mark Weiser's vision of ubiquitous computing: providing useful functionality while disappearing discreetly into the fabric of our clothing. E-textiles also give new, expressive materials to fashion designers, textile designers, and artists, and garments stemming from these disciplines usually employ technology in a visible and dramatic style. Integrating computer science, electrical engineering, textile design, and fashion design, e-textiles cross unusual boundaries, appeal to a broad spectrum of people, and provide novel opportunities for creative experimentation in both engineering and design. Moreover, e-textiles are cutting-edge technologies that capture people's imagination in unusual ways. (What other emerging pervasive technology has Vogue magazine featured?) Our work aims to capitalize on these unique features by providing a toolkit that empowers novices to design, engineer, and build their own e-textiles.
84.
This research supports the hypothesis that the Trust Vector model can be modified to fit the CyberCraft Initiative, and that there are limits to the utility of historical data. It proposes some modifications and extensions to the Trust Vector model and identifies areas for future research.
85.
Officially, AI was born in 1956. Since then, very impressive progress has been made in many areas, but not in the realm of human-level machine intelligence. During much of its early history, AI was rife with exaggerated expectations. An article published in the late 1940s was headlined, "Electric brain capable of translating foreign languages is being built". Today, more than half a century later, we do have translation software, but nothing that approaches the quality of human translation. Clearly, the achievement of human-level machine intelligence is a challenge that is hard to meet. A prerequisite to achieving it is the mechanization of human cognitive capabilities and, in particular, of natural language understanding. To make significant progress toward human-level machine intelligence, a paradigm shift is needed. More specifically, what is needed is the addition to the armamentarium of AI of two methodologies: (a) a nontraditional methodology of computing with words (CW) or, more generally, NL-Computation; and (b) a countertraditional methodology which involves a progression from computing with numbers to computing with words. The centerpiece of these methodologies is the concept of precisiation of meaning. The addition of these methodologies to AI would be an important step toward the achievement of human-level machine intelligence and its applications in decision-making, pattern recognition, analysis of evidence, diagnosis, and assessment of causality. Such applications have a position of centrality in our infocentric society.
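"Precisiation of meaning" is usually illustrated in the computing-with-words literature by giving a word a precise, computable meaning as a fuzzy membership function. The following minimal Python sketch does this for the word "tall" and the fuzzy quantifier "most" (a classic example from this literature); the breakpoints 170 cm / 185 cm and the 50% / 90% ramp for "most" are invented for illustration and are not taken from this paper.

# Hypothetical sketch of "precisiation of meaning" with fuzzy sets:
# the words "tall" and "most" are given computable meanings.
# All numeric breakpoints are assumptions made for illustration.
def mu_tall(height_cm: float) -> float:
    """Degree (0..1) to which a height counts as 'tall'."""
    if height_cm <= 170.0:
        return 0.0
    if height_cm >= 185.0:
        return 1.0
    return (height_cm - 170.0) / (185.0 - 170.0)

def mu_most(proportion: float) -> float:
    """Fuzzy quantifier 'most': ramps from 0 at 50% to 1 at 90% (assumed)."""
    return min(max((proportion - 0.5) / 0.4, 0.0), 1.0)

# Once the words are precisiated, we can compute with them, e.g. evaluate
# the truth degree of "most people in this sample are tall".
heights = [168, 172, 178, 181, 186, 190, 175, 183]
avg_tallness = sum(mu_tall(h) for h in heights) / len(heights)
print(f"truth('most are tall') = {mu_most(avg_tallness):.2f}")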
86.
87.
We propose a model that enables software developers to systematically evaluate and compare all possible alternative reuse scenarios. The model supports the clear identification of the basic operations involved and associates a cost component with each basic operation in a focused and precise way. It is a practical tool that helps developers weigh and evaluate different reuse scenarios based on accumulated organizational data, and then decide which option to select in a given situation. The model is currently used at six different companies for cost-benefit analysis of alternative reuse scenarios; we present a case study that illustrates how it has been used in practice.
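The abstract does not list the model's basic operations or cost components, so the scenario names, operations, and cost figures in the Python sketch below are invented purely to illustrate the comparison mechanism it describes: decompose each reuse scenario into basic operations, attach a cost to each, and compare totals.

# Hypothetical sketch of comparing reuse scenarios by summing the costs
# of their basic operations. Operation names and cost figures are invented;
# the paper's model defines its own operations and cost components.
from dataclasses import dataclass

@dataclass
class Operation:
    name: str
    cost_person_days: float

# Each scenario is a bag of basic operations with organization-specific costs.
SCENARIOS = {
    "develop from scratch":  [Operation("design", 20), Operation("code", 40),
                              Operation("test", 25)],
    "reuse as-is":           [Operation("search/assess", 3),
                              Operation("integrate", 5), Operation("test", 10)],
    "reuse with adaptation": [Operation("search/assess", 3),
                              Operation("adapt", 12), Operation("integrate", 6),
                              Operation("test", 15)],
}

def total_cost(ops):
    return sum(op.cost_person_days for op in ops)

best = min(SCENARIOS, key=lambda s: total_cost(SCENARIOS[s]))
for name, ops in SCENARIOS.items():
    print(f"{name:24s} {total_cost(ops):5.1f} person-days")
print("cheapest scenario:", best)

In practice the cost figures would come from the accumulated organizational data the abstract mentions, rather than being fixed constants.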
88.
Theoretical and experimental methods have been developed to characterize the effect of mechanical loading on the mesoscopic and macroscopic mechanical state of polycrystalline materials. Ferritic and austenitic single-phase materials were analyzed first; phase interaction was then studied in a material with two ductile phases (an austeno-ferritic duplex steel) and in a naturally reinforced composite (a pearlitic steel). The theoretical method is based on the self-consistent approach, in which the elastic and plastic characteristics of the phases are introduced through the micromechanical behavior of single crystals, using slip systems and microscopic hardening. The effects of crystallographic texture and of phase interaction during loading and after unloading were studied. The elastic and plastic anisotropy of grains having the same crystallographic orientation was assessed by diffraction strain analysis. The simulation was compared with experiments performed using the X-ray diffraction technique. In the duplex and pearlitic steels considered, the ferrite stress state was observed to be much lower than the austenite and cementite ones. The diffraction strain distribution results show the pertinence of the models and give valuable information, for example, on the yield stress and the hardening parameters of each phase in a two-phase material.
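Diffraction strain analysis of the kind mentioned here measures lattice strains as a function of specimen tilt. For background, the classical sin²ψ relation below is the standard textbook result for a quasi-isotropic material under biaxial stress; it is given as context for how X-ray diffraction yields stresses, not as the paper's self-consistent model, which resolves the anisotropy grain by grain.

% Classical sin^2(psi) relation of X-ray diffraction stress analysis
% (standard background result, not the paper's self-consistent model):
\varepsilon_{\phi\psi}
  = \frac{1+\nu}{E}\,\sigma_{\phi}\sin^{2}\psi
  - \frac{\nu}{E}\left(\sigma_{11}+\sigma_{22}\right)

Here epsilon_phi,psi is the lattice strain measured along the direction defined by the azimuth phi and tilt psi, sigma_phi is the in-plane stress component along phi, and E and nu are the (quasi-isotropic) elastic constants; plotting the measured strain against sin²ψ gives the stress from the slope.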
89.
90.
Sequential Bayesian bit error rate measurement   Total citations: 1 (self-citations: 0, citations by others: 1)
As bit error rates decrease, the time required to measure a bit error rate (BER) or perform a BER test (i.e., to determine that a particular communications device's BER is below some acceptable limit) increases dramatically. One cause of long measurement times is the difficulty of deciding a priori how many bits to measure to establish the BER to within a predetermined confidence interval width. This paper explores a new approach to deciding how many bits to measure: a sequential Bayesian approach. As measurement proceeds, the posterior distribution of the BER is checked to see whether it can be concluded, with high enough probability, that the BER lies within the desired range. Desired properties of the posterior distribution, such as the maximum a posteriori estimate and confidence limits, can be computed quickly using off-the-shelf numerical software. Examples are given of applying this method to bit error data measured with an Agilent 81250 parallel BER tester.
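A minimal Python sketch of the sequential-Bayesian idea described above, assuming a Beta(1,1) prior on the BER and a binomial error-counting model (the standard conjugate setup). The stopping threshold, block size, pass limit, and simulated true BER are invented for illustration; the paper's exact procedure and priors may differ.

# Minimal sketch of a sequential Bayesian BER test. Errors in n measured
# bits are modeled as binomial with unknown BER p; with a Beta(1,1) prior,
# the posterior after k errors in n bits is Beta(1 + k, 1 + n - k).
# Measurement stops once P(p < limit | data) exceeds the confidence
# threshold (pass) or falls below 1 - threshold (fail). All numeric
# settings here are assumptions made for illustration.
import numpy as np
from scipy.stats import beta

def sequential_ber_test(bit_source, limit=1e-6, confidence=0.99,
                        block=1_000_000, max_bits=1_000_000_000):
    n, k = 0, 0
    while n < max_bits:
        errors = bit_source(block)          # errors observed in next block
        n += block
        k += errors
        posterior = beta(1 + k, 1 + n - k)  # Beta posterior on the BER
        p_pass = posterior.cdf(limit)       # P(BER < limit | data)
        if p_pass >= confidence:
            return "pass", n, k, p_pass
        if p_pass <= 1 - confidence:
            return "fail", n, k, p_pass
    return "undecided", n, k, p_pass

# Simulated device with a true BER of 2e-7 (well under the 1e-6 limit).
rng = np.random.default_rng(1)
device = lambda nbits: rng.binomial(nbits, 2e-7)

verdict, n, k, p = sequential_ber_test(device)
print(f"{verdict}: {k} errors in {n} bits, P(BER<limit)={p:.4f}")

Because the test stops as soon as the posterior is decisive, a device whose BER is far from the limit (in either direction) is resolved after far fewer bits than a fixed-length test sized for the worst case, which is the motivation the abstract gives.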