9632 results found in total; search time 0 ms.
71.
Statistical detection of mass malware has been shown to be highly successful. However, this type of malware is less interesting to cyber security officers of larger organizations, who are more concerned with detecting malware indicative of a targeted attack. Here we investigate the potential of statistically based approaches to detect such malware, using a malware family associated with a large number of targeted network intrusions. Our approach is complementary to the bulk of statistics-based malware classifiers, which typically rely on measures of overall similarity between executable files. One problem with that approach is that a malicious executable sharing some, but limited, functionality with known malware is likely to be misclassified as benign. Here a new approach to malware classification is introduced that classifies programs based on their similarity with known malware subroutines. We illustrate that malware and benign programs can share a substantial amount of code, implying that classification should be based on malicious subroutines that occur infrequently, or not at all, in benign programs. Various approaches to accomplishing this task are investigated; a particularly simple approach appears to be the most effective. It computes the fraction of a program's subroutines that are similar to malware subroutines whose likes have not been found in a larger benign set. If this fraction exceeds around 1.5%, the corresponding program can be classified as malicious at a 1-in-1000 false-alarm rate. It is further shown that combining a local and an overall similarity-based approach can lead to considerably better prediction due to the relatively low correlation of their predictions.
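The fraction-based decision rule described in this abstract can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: subroutines are represented here as opaque fingerprint strings, where in practice they would be normalized disassembly or fuzzy hashes, and the 1.5% threshold is the value quoted in the abstract.

```python
def malicious_fraction(program_subs, malware_subs, benign_subs):
    """Fraction of a program's subroutines that match known-malware
    subroutines whose likes do NOT appear in the benign corpus."""
    suspicious = malware_subs - benign_subs  # malware-only subroutines
    if not program_subs:
        return 0.0
    hits = sum(1 for s in program_subs if s in suspicious)
    return hits / len(program_subs)


def classify(program_subs, malware_subs, benign_subs, threshold=0.015):
    # threshold=0.015 corresponds to the ~1.5% figure in the abstract
    return malicious_fraction(program_subs, malware_subs, benign_subs) > threshold
```

A program sharing even one rare malware-only subroutine out of a few dozen would clear the 1.5% bar, which matches the abstract's point that benign-shared code must be excluded first.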
73.
The ever-accelerating state of technology has powered an increasing interest in heat transfer solutions and process engineering innovations in the microfluidics domain. In order to carry out such developments, reliable heat transfer diagnostic techniques are necessary. Thermo-liquid crystal (TLC) thermography, in combination with particle image velocimetry, has been a widely accepted and commonly used technique for the simultaneous measurement and characterization of temperature and velocity fields in macroscopic fluid flows for several decades. However, low seeding density, volume illumination, and low TLC particle image quality at high magnifications present formidable challenges to its application to three-dimensional flows with microscopic dimensions. In this work, a measurement technique to evaluate the color response of individual non-encapsulated TLC particles is presented. A Shirasu porous glass membrane emulsification approach was used to produce the non-encapsulated TLC particles with a narrow size distribution. A multi-variable calibration procedure, making use of all three RGB and HSI color components as well as the proper orthogonally decomposed RGB components, was used to achieve unprecedentedly low uncertainty levels in the temperature estimation of individual particles, opening the door to simultaneous temperature and velocity tracking using 3D velocimetry techniques.
74.
Remote sensing of invasive species is a critical component of conservation and management efforts, but reliable methods for the detection of invaders have not been widely established. In Hawaiian forests, we recently found that invasive trees often have hyperspectral signatures distinct from those of native trees, but mapping based on spectral reflectance properties alone is confounded by issues of canopy senescence and mortality, intra- and inter-canopy gaps and shadowing, and terrain variability. We deployed a new hybrid airborne system combining the Carnegie Airborne Observatory (CAO) small-footprint light detection and ranging (LiDAR) system with the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) to map the three-dimensional spectral and structural properties of Hawaiian forests. The CAO-AVIRIS systems and data were fully integrated using in-flight and post-flight fusion techniques, facilitating an analysis of forest canopy properties to determine the presence and abundance of three highly invasive tree species in Hawaiian rainforests.

The LiDAR sub-system was used to model forest canopy height and top-of-canopy surfaces; these structural data allowed for automated masking of forest gaps, intra- and inter-canopy shadows, and minimum vegetation height in the AVIRIS images. The remaining sunlit canopy spectra were analyzed using spatially constrained spectral mixture analysis. The results of the combined LiDAR-spectroscopic analysis highlighted the location and fractional abundance of each invasive tree species throughout the rainforest sites. Field validation studies demonstrated < 6.8% and < 18.6% error rates in the detection of invasive tree species at 7 m² and 2 m² minimum canopy cover thresholds, respectively. Our results show that full integration of imaging spectroscopy and LiDAR measurements provides enormous flexibility and analytical potential for studies of terrestrial ecosystems and the species contained within them.

75.
Joseph Fong, Herbert Shiu, Davy Cheung. Software, 2008, 38(11): 1183–1213
Integrating information from multiple data sources is becoming increasingly important for enterprises that partner with other companies for e-commerce. However, companies have their internal business applications deployed on diverse platforms, and no standard solution for integrating information from these sources exists. To support business intelligence query activities, it is useful to build a data warehouse on top of middleware that aggregates the data obtained from various heterogeneous database systems. Online analytical processing (OLAP) can then be used to provide fast access to materialized views from the data warehouse. Since extensible markup language (XML) documents are a common data representation standard on the Internet and relational tables are commonly used for production data, OLAP must handle both relational and XML data. SQL and XQuery can be used to process the materialized relational and XML data cubes created from the aggregated data. This paper shows how to handle the two kinds of data cubes from a relational–XML data warehouse using extract, transformation and loading. Copyright © 2008 John Wiley & Sons, Ltd.
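The idea of aggregating relational and XML sources into one cube can be illustrated with a toy sketch. This is not the paper's ETL pipeline: the `sales` table, the XML layout, and the region key are all invented for illustration; only the pattern (SQL aggregation merged with values pulled from XML) reflects the abstract.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Relational side: aggregate a production table with SQL (GROUP BY).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100.0), ("east", 50.0), ("west", 70.0)])
cube = {region: total for region, total in
        conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")}

# XML side: fold values from an XML document into the same cube.
xml_doc = ET.fromstring(
    "<sales>"
    "<sale region='east' amount='25'/>"
    "<sale region='north' amount='10'/>"
    "</sales>")
for sale in xml_doc.findall("sale"):
    region = sale.get("region")
    cube[region] = cube.get(region, 0.0) + float(sale.get("amount"))
```

A real relational–XML warehouse would materialize such cubes in the middleware layer and query them with SQL or XQuery rather than merging them in application code.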
76.
In geographic information retrieval, queries often name geographic regions that do not have a well-defined boundary, such as “Southern France.” We provide two algorithmic approaches to the problem of computing reasonable boundaries of such regions based on data points that have evidence indicating that they lie either inside or outside the region. Our problem formulation leads to a number of subproblems related to red-blue point separation and minimum-perimeter polygons, many of which we solve algorithmically. We give experimental results from our implementation and a comparison of the two approaches. This research is supported by the EU-IST Project No. IST-2001-35047 (SPIRIT) and by grant WO 758/4-2 of the German Research Foundation (DFG).
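A toy baseline conveys the red-blue separation flavour of this problem: take the convex hull of the "inside" evidence points as a crude candidate boundary and count how many "outside" points it wrongly encloses. The paper's actual algorithms (minimum-perimeter polygons, etc.) are more sophisticated; this sketch and its sample points are purely illustrative.

```python
def cross(o, a, b):
    """2D cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def inside_hull(hull, p):
    # a point is inside (or on) a CCW hull if it lies left of every edge
    n = len(hull)
    return all(cross(hull[i], hull[(i + 1) % n], p) >= 0 for i in range(n))

inside_pts = [(0, 0), (4, 0), (4, 4), (0, 4), (2, 2)]    # "red" evidence
outside_pts = [(5, 5), (-1, 2), (2, 1)]                  # "blue" evidence
hull = convex_hull(inside_pts)
misclassified = [p for p in outside_pts if inside_hull(hull, p)]
```

Here `(2, 1)` is conflicting evidence: an "outside" point that falls inside the hull, exactly the kind of point the paper's separation subproblems must handle.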
77.
The question of how best to model rhythmic movements at self-selected amplitude-frequency combinations, and their variability, is a long-standing issue. This study presents a systematic analysis of a coupled oscillator system that has successfully accounted for the experimental result that humans' preferred oscillation frequencies closely correspond to the linear resonance frequencies of the biomechanical limb systems, a phenomenon known as resonance tuning or frequency scaling. The dynamics of the coupled oscillator model are explored by numerical integration in different areas of its parameter space, where a period-doubling route to chaotic dynamics is discovered. It is shown that even in the regions of the parameter space with chaotic solutions, the model still effectively scales to the biomechanical oscillator's natural frequency. Hence, there is a solution providing for frequency scaling in the presence of chaotic variability. The implications of these results for interpreting variability as fundamentally stochastic or chaotic are discussed.
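The kind of numerical exploration the abstract describes can be sketched with a fixed-step Runge-Kutta integrator applied to a nonlinear oscillator. The van der Pol oscillator below is a stand-in chosen for illustration; the authors' coupled model and its parameter values are not given in the abstract.

```python
def rk4_step(f, state, t, dt):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, state)
    k2 = f(t + dt / 2, [s + dt / 2 * k for s, k in zip(state, k1)])
    k3 = f(t + dt / 2, [s + dt / 2 * k for s, k in zip(state, k2)])
    k4 = f(t + dt, [s + dt * k for s, k in zip(state, k3)])
    return [s + dt / 6 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

def van_der_pol(mu):
    # x'' - mu * (1 - x^2) * x' + x = 0, as a first-order system [x, x']
    return lambda t, s: [s[1], mu * (1 - s[0] ** 2) * s[1] - s[0]]

state, t, dt = [0.5, 0.0], 0.0, 0.01
for _ in range(10_000):
    state = rk4_step(van_der_pol(1.0), state, t, dt)
    t += dt
# after transients, the trajectory settles onto a bounded limit cycle
```

Sweeping a model parameter (here `mu`) over a grid and inspecting the resulting attractors is the standard way such a period-doubling route to chaos is mapped out.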
78.
We investigate the use of the rough set model for financial time-series data analysis and forecasting. The rough set model is an emerging technique for dealing with vagueness and uncertainty in data. It has many advantages over other techniques, such as fuzzy sets and neural networks, including attribute reduction and variable partitioning of data. These characteristics can be very useful for improving the quality of results from data analysis. We demonstrate a rough set data analysis model for the discovery of decision rules from time-series data, using the New Zealand stock exchange as an example. Rules are generated through reducts and can be used for future prediction. A unique ranking system for the decision rules, based on both the strength and the stability of each rule, is used in this study. The ranking system gives the user confidence regarding their market decisions. Our experimental results indicate that forecasting future stock index values using rough sets yields decision rules with high accuracy and coverage.
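The rule-ranking idea can be made concrete with a small sketch. This is an assumed formulation, not the paper's: rule strength is taken here as the fraction of records a rule matches and classifies correctly, the stability score and the 0.7/0.3 weights are invented, and the records are toy market data.

```python
def rule_strength(rule, records):
    """Fraction of all records that the rule matches AND classifies correctly."""
    matched = [r for r in records
               if all(r.get(a) == v for a, v in rule["if"].items())]
    correct = sum(1 for r in matched if r["decision"] == rule["then"])
    return correct / len(records) if records else 0.0

def rank_rules(rules, records, w_strength=0.7, w_stability=0.3):
    # combine strength with a (hypothetical) stability score, best first
    return sorted(rules,
                  key=lambda rule: (w_strength * rule_strength(rule, records)
                                    + w_stability * rule.get("stability", 0.0)),
                  reverse=True)

records = [
    {"trend": "up", "volume": "high", "decision": "buy"},
    {"trend": "up", "volume": "low", "decision": "buy"},
    {"trend": "down", "volume": "low", "decision": "sell"},
]
rules = [
    {"if": {"trend": "down"}, "then": "sell", "stability": 0.9},
    {"if": {"trend": "up"}, "then": "buy", "stability": 0.5},
]
ranked = rank_rules(rules, records)
```

The "buy" rule covers two of the three records, so its higher strength outweighs the "sell" rule's higher stability under these weights.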
79.
Microsystem Technologies - The maximum scan angle amplitude of resonating micro-mirrors intended for micro-projection display applications is limited by air damping. Three-dimensional transient...
80.
The electromigration process has the potential capability to move atoms one by one when properly controlled. It is therefore an appealing tool to tune the cross section of monoatomic compounds with ultimate resolution or, in the case of polyatomic compounds, to change the stoichiometry with the same atomic precision. As demonstrated here, a combination of electromigration and anti-electromigration can be used to reversibly displace atoms with a high degree of control. This enables a fine adjustment of the superconducting properties of Al weak links, whereas in Nb the diffusion of atoms leads to a more irreversible process. In a superconductor with a complex unit cell (La2−xCexCuO4), the electromigration process acts selectively on the oxygen atoms with no apparent modification of the structure. This makes it possible to adjust the doping of this compound and switch from a superconducting to an insulating state in a nearly reversible fashion. In addition, the conditions needed to replace feedback-controlled electromigration by a simpler technique of electropulsing are discussed. These findings have a direct practical application as a method to explore the dependence of the characteristic parameters on the exact oxygen content, and they pave the way for a reversible control of local properties of nanowires.