Full-text availability (articles)
Paid full text | 7787 |
Free | 411 |
Free (domestic) | 14 |
Subject classification (articles)
Electrical engineering | 83 |
General/interdisciplinary | 2 |
Chemical industry | 1897 |
Metalworking | 176 |
Machinery and instrumentation | 149 |
Building science | 281 |
Mining engineering | 13 |
Energy and power engineering | 253 |
Light industry | 718 |
Hydraulic engineering | 75 |
Petroleum and natural gas | 28 |
Weapons industry | 6 |
Radio electronics | 565 |
General industrial technology | 1595 |
Metallurgical industry | 931 |
Nuclear technology | 44 |
Automation technology | 1396 |
Publication year (articles)
2023 | 56 |
2022 | 121 |
2021 | 166 |
2020 | 160 |
2019 | 134 |
2018 | 180 |
2017 | 198 |
2016 | 259 |
2015 | 168 |
2014 | 271 |
2013 | 612 |
2012 | 454 |
2011 | 551 |
2010 | 375 |
2009 | 374 |
2008 | 507 |
2007 | 429 |
2006 | 376 |
2005 | 310 |
2004 | 284 |
2003 | 246 |
2002 | 204 |
2001 | 126 |
2000 | 108 |
1999 | 117 |
1998 | 137 |
1997 | 118 |
1996 | 111 |
1995 | 97 |
1994 | 112 |
1993 | 86 |
1992 | 66 |
1991 | 55 |
1990 | 50 |
1989 | 69 |
1988 | 54 |
1987 | 26 |
1986 | 35 |
1985 | 43 |
1984 | 40 |
1983 | 28 |
1982 | 26 |
1981 | 42 |
1980 | 24 |
1979 | 23 |
1978 | 26 |
1977 | 16 |
1976 | 15 |
1974 | 22 |
1973 | 14 |
Sort order: 8212 results found (search time: 0 ms)
21.
Gold nanoparticles used in most gold-catalysis experiments (1–10 nm) show varying degrees of reactivity, with particles below 5 nm generally being more reactive. The origin of this activity has been the subject of a number of model experiments and theoretical studies on either clusters a few atoms in size or extended surfaces (smooth or stepped). In the work described here, a classical theory for the variation of the metal workfunction with cluster size, Extended Hückel Theory (EHT) calculations combined with DFT calculations, and a carbon monoxide (CO) chemisorption model are combined to develop a relationship between metal particle size and the particle's reactivity towards CO. For gold, it is shown that while the contribution of the d-band hybridization energy to the total CO chemisorption energy is unfavourable for bulk gold, this is not true for gold particles below 5–6 nm; that is, the d-band hybridization energy is negative for small gold particles. This is believed to explain the onset of high reactivity for small gold particles.
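As a rough numerical illustration of the size dependence mentioned above (not the authors' model): one standard classical form of the workfunction–size relation is the spherical-droplet correction, IP(R) ≈ W_bulk + (3/8)·e²/(4πε₀R). The bulk workfunction value and the particle diameters below are assumptions for illustration only.

```python
# Illustrative sketch: classical spherical-droplet correction to the
# ionisation potential of a metal cluster of radius R,
#   IP(R) ≈ W_bulk + (3/8) * e^2 / (4*pi*eps0*R).
# W_BULK_AU (~5.1 eV) and the diameters below are assumed values.
import math

E = 1.602176634e-19        # elementary charge (C)
EPS0 = 8.8541878128e-12    # vacuum permittivity (F/m)
W_BULK_AU = 5.1            # approximate bulk Au workfunction (eV)

def ionisation_potential_ev(radius_nm: float) -> float:
    """Classical size-dependent ionisation potential of a metal sphere, in eV."""
    r = radius_nm * 1e-9
    correction_j = (3.0 / 8.0) * E**2 / (4.0 * math.pi * EPS0 * r)
    return W_BULK_AU + correction_j / E   # convert J -> eV

for d_nm in (1.0, 5.0, 10.0):            # particle diameters
    print(f"d = {d_nm:4.1f} nm: IP ≈ {ionisation_potential_ev(d_nm / 2):.2f} eV")
```

The correction grows as 1/R, so the electronic structure of particles below a few nanometres deviates measurably from the bulk, which is the regime where the abstract reports the d-band hybridization energy changing sign.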
22.
A high temperature Seebeck coefficient measurement apparatus with various features to minimize typical sources of error is designed and built. Common sources of temperature and voltage measurement error are described, and principles to overcome them are proposed. Following these guiding principles, a high temperature Seebeck measurement apparatus with a uniaxial 4-point contact geometry is designed to operate from room temperature to over 1200 K. The instrument is simple to operate and suitable for bulk samples with a broad range of physical types and shapes.
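In a 4-point measurement like the one described, the Seebeck coefficient is commonly extracted as the (negated) slope of a linear fit of the measured voltage against the applied temperature difference, rather than from a single point. A minimal sketch, with invented sample data:

```python
# Sketch: extract S = -dV/dT as the least-squares slope of measured
# thermovoltage (µV) vs temperature difference (K).
# The data points below are invented for illustration.
def seebeck_coefficient_uv_per_k(delta_t_k, delta_v_uv):
    """Least-squares slope of ΔV vs ΔT; returns S = -slope in µV/K."""
    n = len(delta_t_k)
    mt = sum(delta_t_k) / n
    mv = sum(delta_v_uv) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(delta_t_k, delta_v_uv))
    den = sum((t - mt) ** 2 for t in delta_t_k)
    return -num / den

dT = [1.0, 2.0, 3.0, 4.0, 5.0]                  # K
dV = [-150.0, -300.0, -450.0, -600.0, -750.0]   # µV (hypothetical n-type sample)
print(f"S = {seebeck_coefficient_uv_per_k(dT, dV):.1f} µV/K")
```

Using the slope of several (ΔT, ΔV) pairs cancels constant voltage offsets, one of the typical error sources the abstract alludes to.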
23.
Eric M. Nielsen, Stephen D. Prince, Gregory T. Koeln 《Remote sensing of environment》2008,112(11):4061-4074
Although the impacts of wetland loss are often felt at regional scales, effective planning and management require a comparative assessment of local needs, costs, and benefits. Satellite remote sensing can provide spatially explicit, synoptic land cover change information to support such an assessment. However, a common challenge in conventional remote sensing change detection is the difficulty of obtaining phenologically and radiometrically comparable data from the start and end of the time period of interest. An alternative approach is to use a prior land cover classification as a surrogate for historic satellite data and to examine the self-consistency of class spectral reflectances in recent imagery. We produced a 30-meter resolution wetland change probability map for the U.S. mid-Atlantic region by applying an outlier detection technique to a base classification provided by the National Wetlands Inventory (NWI). Outlier-resistant measures – the median and median absolute deviation – were used to represent spectral reflectance characteristics of wetland class populations, and formed the basis for the calculation of a pixel change likelihood index. The individual scene index values were merged into a consistent region-wide map and converted to pixel change probability using a logistic regression calibrated through interpretation of historic and recent aerial photography. The accuracy of a regional change/no-change map produced from the change probabilities was estimated at 89.6%, with a Kappa of 0.779. The change probabilities identify areas for closer inspection of change cause, impact, and mitigation potential. With additional work to resolve confusion resulting from natural spatial heterogeneity and variations in land use, automated updating of NWI maps and estimates of areal rates of wetland change may be possible. 
We also discuss extensions of the technique to address specific applications, such as monitoring marsh degradation due to sea level rise and mapping invasive species.
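The outlier-resistant statistics described above (median and median absolute deviation in place of mean and standard deviation) can be sketched as a robust z-score: a pixel whose reflectance sits many MADs from its class median gets a high change likelihood. The class data and pixel values below are invented for illustration.

```python
# Sketch: MAD-based change index for a pixel against its mapped class.
# A large |x - median| / MAD flags a likely land-cover change.
def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def robust_change_index(pixel_reflectance, class_reflectances):
    """Robust z-score of a pixel relative to its class spectral population."""
    med = median(class_reflectances)
    mad = median([abs(x - med) for x in class_reflectances])
    return abs(pixel_reflectance - med) / mad

# Hypothetical single-band reflectances for pixels mapped as one wetland class
wetland_class = [0.11, 0.12, 0.10, 0.13, 0.11, 0.12, 0.10, 0.11]
print(robust_change_index(0.12, wetland_class))  # unchanged pixel: small index
print(robust_change_index(0.35, wetland_class))  # converted pixel: large index
```

Because the median and MAD are insensitive to a minority of changed pixels contaminating the class population, the statistics stay anchored to the unchanged majority, which is the point of using them here.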
24.
Distance learning interfaces, from correspondence through the postal service to the televised talking head, have traditionally been designed from the top down, supporting banking models of learning or, in writing instruction, current-traditional rhetoric pedagogies. Due to temporal and spatial constraints, these interface designs often support (or encourage) one-way communication from the instructor to the student. Students mostly interact with the instructor by asking questions or submitting work, and they tend to have little correspondence with their peers. These methods clearly privilege the instructor's knowledge and evaluation. Furthermore, these interface designs empower instructors to gaze upon students and assess them, often not as corporeal bodies but as a corpus of texts. Thus, each interface adopted for distance learning sets up a power dynamic in which the capability to share the roles of creating knowledge is juxtaposed with the instructor's capability to normalize students and reify their own authority through their gaze. In this article we examine the traditional classroom interface, the correspondence course interface, the simulated classroom interface, and the synchronous video interface to raise questions about the infrastructures of distance learning and their implications for student learning.
25.
26.
Vidroha Debroy, W. Eric Wong 《Journal of Systems and Software》2011,84(4):587-602
Test set size, in terms of the number of test cases, is an important consideration when testing software systems. Using too few test cases might result in poor fault detection, while using too many might be very expensive and suffer from redundancy. We define the failure rate of a program as the fraction of test cases in an available test pool that result in execution failure on that program. This paper investigates the relationship between failure rates and the number of test cases required to detect the faults. Our experiments based on 11 sets of C programs suggest that an accurate estimation of the failure rates of potential faults in a program can provide a reliable estimate of adequate test set size with respect to fault detection, and should therefore be one of the factors kept in mind during test set construction. Furthermore, the model proposed herein is fairly robust to incorrect estimations of failure rates and can still provide good predictive quality. Experiments are also performed to observe the relationship between multiple faults present in the same program using the concept of a failure rate. When predicting effectiveness against a program with multiple faults, results indicate that not knowing the number of faults in the program is not a significant concern, as the predictive quality is typically not affected adversely.
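The link between failure rate and adequate test set size can be made concrete with a standard probabilistic argument (a sketch under simple assumptions, not the authors' exact model): if a fault's failure rate is θ, a randomly selected test set of size n detects it with probability 1 − (1 − θ)ⁿ, so solving for n gives the smallest adequate size at a chosen confidence level.

```python
# Sketch: smallest test set size n such that a fault with failure rate
# theta is detected with probability >= confidence, assuming tests are
# drawn independently: 1 - (1 - theta)**n >= confidence.
import math

def required_test_set_size(theta: float, confidence: float = 0.95) -> int:
    """Smallest n with detection probability at least `confidence`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - theta))

for theta in (0.10, 0.01, 0.001):
    n = required_test_set_size(theta)
    print(f"failure rate {theta}: need about {n} test cases at 95% confidence")
```

The inverse relationship is sharp: a tenfold drop in failure rate requires roughly ten times as many test cases, which is why the paper argues failure-rate estimates should inform test set construction.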
27.
From Non-Functional Requirements to Design through Patterns (cited 8 times: 2 self-citations, 6 by others)
28.
Eric Steinhart 《Minds and Machines》2007,17(3):261-271
You can survive after death in various kinds of artifacts. You can survive in diaries, photographs, sound recordings, and movies. But these artifacts record only superficial features of yourself. We are already close to the construction of programs that partially and approximately replicate entire human lives (by storing their memories and duplicating their personalities). A digital ghost is an artificially intelligent program that knows all about your life. It is an animated autobiography. It replicates your patterns of belief and desire. You can survive after death in a digital ghost. We discuss a series of digital ghosts over the next 50 years. As time goes by and technology advances, they become progressively more perfect replicas of the lives of their original authors.
Eric Steinhart. URL: http://www.wpunj.edu/cohss/philosophy/steinhart
29.
Identification of statistical patterns from observed time series of spatially distributed sensor data is critical for performance monitoring and decision making in human-engineered complex systems, such as electric power generation, petrochemical processing, and networked transportation. This paper presents an information-theoretic approach to the identification of statistical patterns in such systems, where the main objective is to enhance structural integrity and operational reliability. The core concept of pattern identification is built upon the principles of Symbolic Dynamics, Automata Theory, and Information Theory. To this end, a symbolic time series analysis method has been formulated and experimentally validated on a special-purpose test apparatus designed for data acquisition and real-time analysis of fatigue damage in polycrystalline alloys.
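The symbolic-dynamics step at the heart of such methods can be sketched as follows: partition the signal's range into a small alphabet of bins, map each sample to a symbol, and estimate the symbol-to-symbol transition probability matrix, whose structure serves as the statistical pattern. The sine-wave signal and the 4-symbol alphabet below are illustrative assumptions, not the paper's experimental data.

```python
# Sketch: symbolic time series analysis via uniform-width partitioning
# and a first-order state-transition probability matrix.
import math

def symbolize(signal, n_symbols):
    """Map each sample to a symbol 0..n_symbols-1 by equal-width binning."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / n_symbols
    return [min(int((x - lo) / width), n_symbols - 1) for x in signal]

def transition_matrix(symbols, n_symbols):
    """Row-normalized counts of consecutive symbol pairs."""
    counts = [[0] * n_symbols for _ in range(n_symbols)]
    for a, b in zip(symbols, symbols[1:]):
        counts[a][b] += 1
    return [[c / max(sum(row), 1) for c in row] for row in counts]

signal = [math.sin(0.1 * t) for t in range(500)]   # illustrative signal
P = transition_matrix(symbolize(signal, 4), 4)
for row in P:
    print([round(p, 2) for p in row])
```

Changes in the estimated matrix P over time (for example, measured by an information-theoretic distance between successive estimates) are then used as an indicator of evolving anomalies such as fatigue damage.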
30.
Odourous emissions from sewer networks and wastewater treatment plants (WWTPs) can significantly impact a local population. Sampling techniques such as wind tunnels and flux hood chambers are traditionally used to collect area source samples for subsequent quantification of odour emission rates using dilution olfactometry; however, these methods are unsuitable for assessing liquid samples from point sources due to the large liquid volumes required. To overcome this limitation, a gas phase sample preparation method was developed for assessing the total Odour Emission Ability (OEA) of a liquid sample. The method was validated using two volatile organic sulphur compounds (VOSCs), dimethyl trisulphide (DMTS) and bis(methylthio)methane (BMTM), which are frequently detected from sewers and WWTPs and are relatively stable compared with common VOSCs such as methyl mercaptan. The recovery rates of DMTS and BMTM were quantified by injecting a known volume of a standard liquid sample into Tedlar bags using static and dynamic injection methodologies. It was confirmed that both dynamic and static injection methods at ambient conditions achieved high recovery rates, with no need to increase evaporation by elevating the temperature. This method can also be used to assess the odour removal effectiveness of liquids by comparing the OEA before and after treatment tests. Two application examples are presented.
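The recovery-rate validation described above amounts to comparing the measured gas-phase concentration against the concentration expected if the injected analyte evaporated completely. A minimal sketch, with all masses, volumes, and concentrations invented for illustration:

```python
# Sketch: recovery rate of an analyte injected into a sampling bag,
#   recovery = measured concentration / expected concentration,
# where expected = injected mass / bag volume (complete evaporation).
# All numbers below are hypothetical.
def recovery_rate(injected_mass_ug, bag_volume_l, measured_conc_ug_per_l):
    """Fraction of the injected analyte recovered in the gas phase."""
    expected_conc = injected_mass_ug / bag_volume_l
    return measured_conc_ug_per_l / expected_conc

# e.g. 5 µg of DMTS into a 10 L Tedlar bag (expected 0.5 µg/L), 0.46 µg/L measured
print(f"recovery: {recovery_rate(5.0, 10.0, 0.46):.0%}")
```

The same before/after comparison of gas-phase measurements underlies the OEA-based assessment of treatment effectiveness mentioned at the end of the abstract.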