81.
Many activities in today's organizations depend ever more heavily on communications and computer networks, and network managers face the enormous challenge of increasing the availability and efficiency of infrastructures that grow in both size and complexity. This makes it crucial to plan network operation systematically, to define and implement appropriate procedures for regular monitoring and performance assessment, and to set up proper tools for maintenance and troubleshooting. Furthermore, proactive network testing must be pursued: it is vital to document a baseline of normal network operation so that a term of comparison exists when problems occur. This article covers the measurement instruments and procedures currently adopted for network testing, and advances in the field from the I&M research community.
82.
Power distribution systems are significantly affected by many outage-causing events, and accurate fault cause identification can help expedite the restoration procedure and improve system reliability. However, the class imbalance present in many real-world data sets often degrades identification performance. In this paper, the E-algorithm, extended from Ishibuchi's fuzzy classification algorithm to alleviate the effect of an imbalanced data constitution, is applied to Duke Energy outage data for distribution fault cause identification. Three major outage causes (tree, animal, and lightning) are used as prototypes. The performance of the E-algorithm on real-world imbalanced data is compared with that of an artificial neural network; the results show that the E-algorithm greatly improves performance when the data are imbalanced.
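The imbalance-correcting idea behind the E-algorithm can be illustrated with inverse-frequency class weighting. This is a simplified sketch, not the E-algorithm itself: the actual method adjusts fuzzy rule confidences, and the weighting scheme and fault-cause labels below are assumptions for illustration.

```python
from collections import Counter, defaultdict

def class_weights(labels):
    """Inverse-frequency weights so minority fault causes are not drowned out."""
    counts = Counter(labels)
    k = len(counts)
    return {c: len(labels) / (k * n) for c, n in counts.items()}

def weighted_vote(neighbors, weights):
    """Pick the class with the largest weight-adjusted vote."""
    score = defaultdict(float)
    for label in neighbors:
        score[label] += weights[label]
    return max(score, key=score.get)

# Hypothetical imbalanced outage history: "tree" dominates the records.
labels = ["tree"] * 90 + ["animal"] * 7 + ["lightning"] * 3
w = class_weights(labels)
# A plain majority over these votes would be a tie broken toward "tree";
# inverse-frequency weighting lets the rarer cause win.
print(weighted_vote(["tree", "tree", "animal", "animal"], w))  # → animal
```

The design point mirrors the abstract: without reweighting, a classifier trained on such data achieves high accuracy by almost always predicting the majority cause, which is useless for the rare causes that matter.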
83.
Electronic textiles, or e-textiles, are an increasingly important part of wearable computing, helping to make pervasive devices truly wearable. These soft, fabric-based computers can function as lovely embodiments of Mark Weiser's vision of ubiquitous computing: providing useful functionality while disappearing discreetly into the fabric of our clothing. E-textiles also give new, expressive materials to fashion designers, textile designers, and artists, and garments stemming from these disciplines usually employ technology in visible and dramatic style. Integrating computer science, electrical engineering, textile design, and fashion design, e-textiles cross unusual boundaries, appeal to a broad spectrum of people, and provide novel opportunities for creative experimentation both in engineering and design. Moreover, e-textiles are cutting-edge technologies that capture people's imagination in unusual ways. (What other emerging pervasive technology has Vogue magazine featured?) Our work aims to capitalize on these unique features by providing a toolkit that empowers novices to design, engineer, and build their own e-textiles.
84.
Stevens M., Williams P.D., Peterson G.L., Kurkowski S.H. IEEE Computational Intelligence Magazine, 2008, 3(2): 65-68
This research supports the hypothesis that the Trust Vector model can be modified to fit the CyberCraft Initiative, and that there are limits to the utility of historical data. It proposes some modifications and expansions to the Trust Vector model and identifies areas for future research.
85.
Officially, AI was born in 1956. Since then, very impressive progress has been made in many areas, but not in the realm of human-level machine intelligence. During much of its early history, AI was rife with exaggerated expectations. An article published in the late forties of the last century was headlined, "Electric brain capable of translating foreign languages is being built". Today, more than half a century later, we do have translation software, but nothing that approaches the quality of human translation. Clearly, achievement of human-level machine intelligence is a challenge that is hard to meet. A prerequisite to its achievement is the mechanization of such capabilities and, in particular, the mechanization of natural language understanding. To make significant progress toward human-level machine intelligence, a paradigm shift is needed. More specifically, what is needed is the addition to the armamentarium of AI of two methodologies: (a) a nontraditional methodology of computing with words (CW) or, more generally, NL-Computation; and (b) a countertraditional methodology that involves a progression from computing with numbers to computing with words. The centerpiece of these methodologies is the concept of precisiation of meaning. Adding these methodologies to AI would be an important step toward the achievement of human-level machine intelligence and its applications in decision-making, pattern recognition, analysis of evidence, diagnosis, and assessment of causality. Such applications have a position of centrality in our infocentric society.
86.
87.
Tomer A., Goldin L., Kuflik T., Kimchi E., Schach S.R. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 30(9): 601-612
We propose a model that enables software developers to systematically evaluate and compare all possible alternative reuse scenarios. The model supports clear identification of the basic operations involved and associates a cost component with each basic operation in a focused and precise way. It is a practical tool that helps developers weigh and evaluate different reuse scenarios, based on accumulated organizational data, and then decide which option to select in a given situation. The model is currently being used at six different companies for cost-benefit analysis of alternative reuse scenarios; we give a case study that illustrates how it has been used in practice.
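The core of such a cost model can be sketched as a sum of per-operation costs over each reuse scenario. The operation names and cost figures below are hypothetical placeholders, not values from the paper; in practice they would come from accumulated organizational data, as the abstract describes.

```python
# Hypothetical per-operation costs (person-hours); an organization would
# calibrate these from its own historical data.
COSTS = {"search": 2.0, "understand": 6.0, "adapt": 10.0,
         "integrate": 4.0, "develop_new": 40.0, "test": 8.0}

def scenario_cost(operations):
    """Total cost of a reuse scenario = sum of its basic operations."""
    return sum(COSTS[op] for op in operations)

# Three illustrative scenarios for obtaining one software asset.
scenarios = {
    "reuse_as_is":   ["search", "understand", "integrate", "test"],
    "reuse_adapted": ["search", "understand", "adapt", "integrate", "test"],
    "build_new":     ["develop_new", "test"],
}
best = min(scenarios, key=lambda s: scenario_cost(scenarios[s]))
print(best, scenario_cost(scenarios[best]))  # → reuse_as_is 20.0
```

Enumerating every scenario and selecting the minimum-cost one is exactly the kind of systematic comparison the model is meant to support; real deployments would also factor in risk and quality, which this sketch omits.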
88.
Theoretical and experimental methods have been developed to characterize the effect of mechanical loading on the mesoscopic and macroscopic mechanical state of polycrystalline materials. Ferritic and austenitic single-phase materials were analyzed first; then phase interaction was studied in a multiductile-phase material (austeno-ferritic duplex steel) and a natural reinforced composite (pearlitic steel). The theoretical method is based on the self-consistent approach, in which the elastic and plastic characteristics of the phases are applied through the micromechanical behavior of single crystals using slip systems and microscopic hardening. The effects of crystallographic texture and phase interaction during loading and after unloading were studied. The elastic and plastic anisotropy of grains having the same crystallographic orientation was assessed by diffraction strain analysis, and the simulation was compared with experiments performed using the X-ray diffraction technique. In the duplex and pearlitic steels considered, the ferrite stress state was observed to be much lower than those of the austenite and cementite. The diffraction strain distributions demonstrate the pertinence of the models and give valuable information, for example, on the yield stress and hardening parameters of each phase in a two-phase material.
89.
90.
Sequential Bayesian bit error rate measurement (total citations: 1; self-citations: 0; citations by others: 1)
As bit error rates decrease, the time required to measure a bit error rate (BER) or to perform a BER test (i.e., to determine that a particular communications device's BER is below some acceptable limit) increases dramatically. One cause of long measurement times is the difficulty of deciding a priori how many bits to measure to establish the BER within a predetermined confidence interval width. This paper explores a new approach to deciding how many bits to measure: a sequential Bayesian approach. As measurement proceeds, the posterior distribution of the BER is checked to see whether it can be concluded, with high enough probability, that the BER lies within the desired range. Desired properties of the posterior distribution, such as the maximum a posteriori estimate and confidence limits, can be computed quickly using off-the-shelf numerical software. Examples are given of applying this method to bit error data measured with an Agilent 81250 parallel BER tester.
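The sequential stopping rule can be sketched with a Beta posterior over the BER under a uniform prior. The step size, confidence level, and the assumption of an error-free measurement stream below are illustrative choices, not the paper's settings; the posterior probability is computed exactly via the binomial-sum identity for the regularized incomplete beta function with integer shape parameters.

```python
import math

def prob_ber_below(limit, n_bits, n_errors):
    """P(BER < limit) under a Beta(1 + k, 1 + n - k) posterior (uniform prior).

    For integer shape parameters, I_x(k+1, n-k+1) = 1 - BinomCDF(k; n+1, x),
    so the posterior probability reduces to an exact finite sum.
    """
    m = n_bits + 1
    tail = sum(math.comb(m, j) * limit**j * (1 - limit)**(m - j)
               for j in range(n_errors + 1))
    return 1.0 - tail

def bits_needed(limit, confidence=0.99, step=500, max_bits=10**6):
    """Sequentially 'measure' error-free bits until the posterior says
    P(BER < limit) >= confidence; return the stopping sample size."""
    n = 0
    while n < max_bits:
        n += step  # measure another batch of bits (assumed error-free here)
        if prob_ber_below(limit, n, 0) >= confidence:
            return n
    return max_bits

# Bits needed to conclude BER < 1e-3 with 99% posterior probability,
# checking every 500 bits:
print(bits_needed(1e-3))  # → 5000
```

With zero observed errors the posterior check collapses to 1 - (1 - limit)^(n+1) >= confidence, so the rule stops at the first 500-bit multiple past roughly 4603 bits; a real sequential tester would instead update after each measured block and handle observed errors through `n_errors`.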