Paid full text: 181 articles
Free: 6 articles
By subject:
Electrical engineering: 2
Chemical industry: 38
Metalworking: 6
Machinery and instruments: 1
Building science: 11
Mining engineering: 1
Energy and power: 22
Light industry: 11
Radio and electronics: 22
General industrial technology: 34
Metallurgical industry: 12
Automation technology: 27
By year:
2023: 1
2022: 3
2021: 3
2020: 2
2019: 4
2018: 8
2017: 2
2016: 5
2015: 3
2014: 8
2013: 14
2012: 15
2011: 25
2010: 14
2009: 4
2008: 16
2007: 5
2006: 10
2005: 7
2004: 4
2003: 6
2002: 4
2000: 3
1998: 5
1997: 3
1995: 1
1994: 2
1993: 1
1992: 1
1990: 1
1985: 1
1984: 1
1983: 1
1972: 2
1971: 2
Sort order: 187 results in total (search time: 15 ms)
52.
This article considers the detection of image features at different spatial scales. The main focus is on capturing the scale-dependent differences between a pair of noisy images, but the technique developed can also be applied to the analysis of single images. The proposed approach uses Bayesian statistical modeling and simulation-based inference, and it can be viewed as a further development of SiZer technology, originally designed for nonparametric curve fitting. Numerical examples include artificial test images and a preliminary analysis of a pair of Landsat images used in satellite-based forest inventory. This article has supplementary material online.
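A rough sense of the scale-space idea can be conveyed with a short sketch (not the Bayesian, simulation-based method of the paper): smooth the pixelwise difference of the two images at several scales and flag pixels where the smoothed difference is large relative to its standard error under an assumed i.i.d. Gaussian noise model. The function name, scales, noise level and threshold below are illustrative assumptions.

```python
"""Toy SiZer-style scale-space comparison of two noisy images (illustrative only)."""
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_space_differences(img_a, img_b, noise_std, scales=(1, 2, 4, 8), z=3.0):
    diff = img_a.astype(float) - img_b.astype(float)
    flags = {}
    for h in scales:
        smoothed = gaussian_filter(diff, sigma=h)
        # Standard error of the smoothed difference: noise variance of the
        # difference (2 * noise_std**2) times the sum of squared filter weights,
        # obtained here by filtering a unit impulse.
        impulse = np.zeros_like(diff)
        impulse[tuple(s // 2 for s in impulse.shape)] = 1.0
        w2 = np.sum(gaussian_filter(impulse, sigma=h) ** 2)
        se = np.sqrt(2.0 * noise_std**2 * w2)
        flags[h] = np.abs(smoothed) > z * se   # significant scale-dependent difference
    return flags

# Example with synthetic data
rng = np.random.default_rng(0)
truth = np.zeros((128, 128)); truth[40:60, 40:60] = 1.0
a = truth + rng.normal(0, 0.5, truth.shape)
b = rng.normal(0, 0.5, truth.shape)
sig = scale_space_differences(a, b, noise_std=0.5)
print({h: int(m.sum()) for h, m in sig.items()})
```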
53.
M. Mandø, L. Rosendahl, C. Yin, H. Sørensen. Fuel, 2010, 89(10): 3051–3062
A CFD simulation of pulverized coal and straw combustion using a commercial multifuel burner has been undertaken to examine the differences in combustion characteristics. Focus has also been directed toward developing the modeling technique to handle larger, non-spherical straw particles and toward determining the relative importance of different modeling choices for straw combustion. The investigated modeling choices encompass the particle size and shape distribution, the modification of particle motion and heating due to departure from the spherical ideal, the devolatilization rate of straw, the influence of inlet boundary conditions, and the effect of particles on carrier-phase turbulence. It is concluded that, for the present air flow specifications, straw combustion is associated with a significantly longer flame and smaller recirculation zones than coal combustion. The particle size and shape distribution is the most influential parameter for the correct prediction of straw combustion. The inlet boundary conditions and the application of a turbulence modulation model can significantly affect the predicted combustion efficiency, whereas the choice of devolatilization parameters was found to be of minor importance.
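The role of a shape correction in the particle phase can be illustrated with a minimal single-particle sketch; it is not the CFD setup of the study. It integrates the drag-only equation of motion for one particle in a uniform gas stream, using the standard Schiller–Naumann sphere drag correlation multiplied by a user-supplied shape factor standing in for the non-spherical correction; the shape factor and all property values are illustrative assumptions.

```python
"""Drag-only particle response sketch with a generic shape factor (illustrative only)."""
import numpy as np

def drag_coefficient_sphere(re):
    re = max(re, 1e-12)
    if re < 1000.0:
        return 24.0 / re * (1.0 + 0.15 * re**0.687)   # Schiller-Naumann correlation
    return 0.44                                        # Newton regime

def particle_velocity_history(d_p, rho_p, shape_factor, u_gas=10.0,
                              rho_g=1.2, mu_g=1.8e-5, dt=1e-4, t_end=0.2):
    """Explicit Euler integration of m dv/dt = drag force."""
    v = 0.0
    m = rho_p * np.pi / 6.0 * d_p**3
    area = np.pi / 4.0 * d_p**2
    history = []
    for _ in range(int(t_end / dt)):
        u_rel = u_gas - v
        re = rho_g * abs(u_rel) * d_p / mu_g
        cd = shape_factor * drag_coefficient_sphere(re)   # non-spherical correction (assumed factor)
        f_drag = 0.5 * rho_g * cd * area * abs(u_rel) * u_rel
        v += dt * f_drag / m
        history.append(v)
    return np.array(history)

# A large straw-like particle responds more slowly than a small coal particle.
coal = particle_velocity_history(d_p=80e-6, rho_p=1300.0, shape_factor=1.0)
straw = particle_velocity_history(d_p=1.5e-3, rho_p=600.0, shape_factor=1.5)
print(f"velocity after 0.2 s: coal {coal[-1]:.2f} m/s, straw {straw[-1]:.2f} m/s")
```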
54.
Glassy carbon electrodes were grafted with carboxyphenyl groups by reduction of 4-carboxyphenyldiazonium tetrafluoroborate, and the modified electrodes were characterised by cyclic voltammetry and AC impedance measurements. Cu(II) was reacted with the carboxyphenyl groups in the film to give a surface voltammetric response for the immobilised Cu(II)/Cu(I) couple. The results indicated an ECEC mechanism, in which the chemical steps correspond to the change of coordination environment following the electron transfer steps. The relaxation half-life of the Cu(I) species formed after electron transfer was estimated at (140 ± 11) s. The large peak width of 200 mV was analysed by modelling the voltammograms, and the large full width at half maximum (FWHM) could be explained by dispersion in the formal potentials of Cu centres present in a variety of environments in the films studied. An ECEC mechanism (square scheme) is proposed for the electron transfer reaction, considering that the chemical step after reduction of the Cu(II) complex corresponds to conformational changes within the attached layer. Experimental data clearly show that oxidation of the reduced film can take place from the different Cu(I) complexes formed en route to the fully relaxed Cu(I) species.
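The FWHM argument can be illustrated numerically (a sketch only, not the authors' modelling): the ideal one-electron Nernstian surface wave has a FWHM of about 90.6 mV at 25 °C, and averaging that ideal peak over a Gaussian distribution of formal potentials broadens it toward the observed width of roughly 200 mV. The 50 mV dispersion used below is an illustrative assumption.

```python
"""Peak broadening of a surface-confined redox couple from dispersion in E0 (sketch)."""
import numpy as np

F = 96485.0      # C/mol
R = 8.314        # J/(mol K)
T = 298.15       # K

def surface_peak(E, E0):
    """Dimensionless current shape of an ideal one-electron Nernstian surface couple."""
    xi = F * (E - E0) / (R * T)
    return np.exp(xi) / (1.0 + np.exp(xi))**2

def fwhm(E, i):
    half = i.max() / 2.0
    above = E[i >= half]
    return above.max() - above.min()

E = np.linspace(-0.4, 0.4, 4001)                  # potential axis, V relative to E0 = 0
ideal = surface_peak(E, 0.0)

# Average over a Gaussian spread of formal potentials (sigma is an assumed value)
sigma_E0 = 0.05                                    # 50 mV dispersion
E0_samples = np.random.default_rng(1).normal(0.0, sigma_E0, 2000)
dispersed = np.mean([surface_peak(E, e0) for e0 in E0_samples], axis=0)

print(f"ideal FWHM     : {1e3 * fwhm(E, ideal):.0f} mV")      # ~90 mV
print(f"dispersed FWHM : {1e3 * fwhm(E, dispersed):.0f} mV")  # substantially broader
```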
55.
Lasse Makkonen. Structural Safety, 2008, 30(5): 405–419
Engineering design for structural safety is largely based on the statistics of natural hazards. These statistics are utilized by applying the theory of extremes, which predicts a cumulative distribution function of the extreme events. The parameters of this distribution are found by a fit to the historical extremes, and the probabilities of potentially disastrous events are then calculated. It is pointed out here that this procedure often results in underestimation of the risk, because incorrect probability plotting positions are widely used and because theoretical extreme value distributions are only asymptotic, so that in many cases they bring misleading information into the analysis. Means of avoiding these problems in extreme value analysis are outlined.
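As an illustration of the plotting-position issue (a generic sketch, not the paper's analysis), the following fits a Gumbel distribution to ranked annual maxima using the non-exceedance probability P = m/(N+1) and reads off a T-year return value; the data are synthetic.

```python
"""Gumbel extreme-value fit using the m/(N+1) plotting position (illustrative only)."""
import numpy as np

def gumbel_fit_return_value(annual_maxima, return_period):
    x = np.sort(np.asarray(annual_maxima, dtype=float))
    n = len(x)
    m = np.arange(1, n + 1)
    p = m / (n + 1.0)                       # Weibull plotting position
    y = -np.log(-np.log(p))                 # Gumbel reduced variate
    beta, mu = np.polyfit(y, x, 1)          # least-squares line x = mu + beta * y
    y_T = -np.log(-np.log(1.0 - 1.0 / return_period))
    return mu + beta * y_T

# Synthetic 30-year record of annual maximum wind gusts (m/s), for illustration
rng = np.random.default_rng(42)
maxima = 25.0 + 4.0 * rng.gumbel(size=30)
print(f"estimated 50-year return value: {gumbel_fit_return_value(maxima, 50):.1f} m/s")
```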
56.
During the 1990s, Nokia successfully applied a Concurrent Engineering (CE) process in its mobile phone business. Strong growth of the company, more complex technologies, maturing markets and changes in competition have increased the need to develop the company's product process so that it can keep its position as an agile, innovative and productive product developer. A dynamic simulation approach has been one of the activities among other product-process re-engineering efforts in the company. This paper describes the approach and the “Product Process Decision Simulation” (PPDS) solution as its first implemented application. A dynamic model of product development has been created and applied to manage the complex dynamic behavior of the product process at the system level, in order to reduce product development cycle times, slippages and costs as well as to improve perceived product quality. The key contribution of the simulation solution is to provoke facilitated discussion in order to build a shared understanding of interdependencies and of dynamic causes and effects in the product process. Implementation and frequent simulation workshops started in June 2006, and about 500 R&D people have already participated.
57.
The introduction of a methylenethiol group at position 7 of camptothecin was carried out in four steps. This preparation also yielded the corresponding disulfide, which behaves as a prodrug owing to its reactivity with glutathione. Assessment of their antiproliferative activities, investigation of their mechanism of action, and molecular modeling analysis indicated that the 7-modified camptothecin derivatives described herein maintain the biological activity and drug–target interactions of the parent compound.
58.
The availability of a fast and reliable method for non-destructive case depth determination, which can be used to monitor directly the quality of various heat-treatment processes, is of great interest. Conventionally hardened steel components are analyzed by means of depth-resolved microhardness measurements, which provide the case hardening depth (CHD) of the material. However, as a consequence of this mechanical destruction, the investigated part can no longer be used in its original state and must be replaced by a twin part whose properties might differ. In this work, a new approach to non-destructive CHD determination based on Barkhausen noise analysis with an excitation frequency of 0.5 Hz is described. The low frequency enhances the penetration depth of the external magnetic field and at the same time reduces the eddy current damping of the filtered Barkhausen noise (BN) signal. In this way, simultaneous information about the hard case and the soft base-material core is experimentally accessible. Various measurands derived from the detected BN signals are sensitive to the different magnetic properties of the case and the core. The resulting correlations with the measured CHD can be used for non-destructive CHD determination by means of appropriate calibration data. The investigated sample set consists of cylindrical parts of 18CrNiMo7-6 which were case-hardened and ground in order to provide graded CHD values up to 3.6 mm. The sensitivity of the tested low-frequency method is quantitatively demonstrated over the complete range of interest, and its potential for industrial applications is discussed.
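A minimal sketch of the kind of processing involved (not the actual measurement chain of the study): band-pass filter a raw Barkhausen signal, reduce it to scalar measurands such as RMS and peak amplitude, and fit a calibration curve against reference CHD values. Filter band, feature choice and the linear calibration model are assumptions for illustration.

```python
"""Toy Barkhausen-noise feature extraction and CHD calibration (illustrative only)."""
import numpy as np
from scipy.signal import butter, filtfilt

def bn_features(signal, fs, band=(1e3, 100e3)):
    """RMS and peak amplitude of the band-pass-filtered Barkhausen signal."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    filtered = filtfilt(b, a, signal)
    return np.sqrt(np.mean(filtered**2)), np.max(np.abs(filtered))

def fit_chd_calibration(feature_values, chd_values):
    """Assumed linear calibration CHD = a * feature + b from reference samples."""
    a, b = np.polyfit(feature_values, chd_values, 1)
    return lambda f: a * f + b

rng = np.random.default_rng(3)
fs = 1e6
raw = rng.normal(0, 1, 200_000)                 # stand-in for a measured BN burst
rms, peak = bn_features(raw, fs)

# Synthetic calibration set: a fake measurand that decreases with case depth
chd_ref = np.array([0.8, 1.2, 1.8, 2.4, 3.0, 3.6])            # mm
rms_ref = 1.0 / chd_ref + rng.normal(0, 0.02, chd_ref.size)    # fake reference measurands
predict_chd = fit_chd_calibration(rms_ref, chd_ref)
print(f"predicted CHD for a measurand value of 0.5: {predict_chd(0.5):.2f} mm")
```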
59.
Burnt area maps based on satellite observations are frequently used in calculations related to fire regimes, such as those of carbon dioxide emissions. Nevertheless, burnt area estimates vary widely between products, and validation against independent data is scarce, especially for Europe. Here we compare two active fire maps (the ATSR World Fire Atlas and the Moderate Resolution Imaging Spectroradiometer (MODIS) Active Fire Product) and two fire scar maps (the L3JRC and the MODIS Burned Area Product) with independent national statistics from 22 European countries between 1997 and 2008. We also tested the coincidence between satellite products by calculating the fraction of active fires that were confirmed by a subsequent drop in reflectance. As a large proportion of fire pixels (between 40% and 66%, depending on the product) is located on urban land or crop fields, filtering out fires located on these land uses greatly improves the agreement between satellite-based burnt area estimates and national statistics, and it also improves the coincidence between satellite products. The MODIS Active Fire Product appears to be the most suitable proxy for burnt area patterns, showing a high correlation with national statistics (R² = 0.9), relatively low spatial and temporal heterogeneity, and only a slight underestimation of the total burnt area (19,000 ha year⁻¹). Unfiltered products show cases of substantial wildfire overestimation in all products, mainly attributable to anthropogenic activity in the case of active fire products and to drought-induced vegetation dieback in that of fire scar maps. Thus, filtering out fires on anthropogenic land uses seems essential when analysing patterns of forest fires from satellite observations. However, if agricultural fires are to be included, a combination of the MODIS Active Fire and MODIS Burned Area products is recommended. We found that this combination shows low temporal and spatial heterogeneity and the highest coincidence between satellite products (25%), although its correlation with national statistics is not very high (R² = 0.67) and it clearly underestimates the total burnt area (187,000 ha year⁻¹).
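The filtering-and-comparison step can be sketched as follows (column names, land-cover labels and data are assumptions, not the datasets used in the study): discard fire pixels on urban or cropland classes, aggregate burnt area per country and year, and compute R² against national statistics.

```python
"""Sketch of land-use filtering and agreement with national statistics (illustrative only)."""
import numpy as np
import pandas as pd

NON_WILDFIRE_CLASSES = {"urban", "cropland"}   # assumed land-cover labels

def filtered_burnt_area(fire_pixels: pd.DataFrame) -> pd.DataFrame:
    """fire_pixels: one row per detected pixel with columns
    country, year, land_cover, burnt_area_ha (assumed schema)."""
    wild = fire_pixels[~fire_pixels["land_cover"].isin(NON_WILDFIRE_CLASSES)]
    return wild.groupby(["country", "year"], as_index=False)["burnt_area_ha"].sum()

def r_squared(estimates, reference):
    corr = np.corrcoef(estimates, reference)[0, 1]
    return corr**2

# Tiny synthetic example
pixels = pd.DataFrame({
    "country": ["ES", "ES", "ES", "PT", "PT", "FR", "FR"],
    "year": [2005, 2005, 2005, 2005, 2005, 2005, 2005],
    "land_cover": ["forest", "cropland", "shrub", "forest", "urban", "forest", "cropland"],
    "burnt_area_ha": [1200.0, 300.0, 800.0, 900.0, 50.0, 400.0, 150.0],
})
stats = pd.DataFrame({
    "country": ["ES", "PT", "FR"],
    "year": [2005, 2005, 2005],
    "official_ha": [2100.0, 950.0, 380.0],
})
merged = filtered_burnt_area(pixels).merge(stats, on=["country", "year"])
print(merged)
print(f"R^2 = {r_squared(merged['burnt_area_ha'], merged['official_ha']):.2f}")
```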
60.
We present the first deterministic (1+ε)-approximation algorithm for finding a large matching in a bipartite graph in the semi-streaming model, requiring only O((1/ε)^5) passes over the input stream. In this model, the input graph G=(V,E) is given as a stream of its edges in some arbitrary order, and the storage of the algorithm is bounded by O(n·polylog n) bits, where n = |V|. The only previously known arbitrarily good approximation for general graphs is achieved by the randomized algorithm of McGregor (Proceedings of the International Workshop on Approximation Algorithms for Combinatorial Optimization Problems and Randomization and Computation, Berkeley, CA, USA, pp. 170–181, 2005), which uses Ω((1/ε)^{1/ε}) passes. We show that even for bipartite graphs, McGregor's algorithm needs Ω(1/ε)^{Ω(1/ε)} passes, so it is necessarily exponential in the approximation parameter. Both the design and the analysis of our algorithm require the introduction of some new techniques. A novelty of our algorithm is a new deterministic assignment of matching edges to augmenting paths, which is responsible for the complexity reduction and removes the need for randomization. We repeatedly grow an initial matching using augmenting paths of length up to 2k+1 for k = ⌈2/ε⌉. We terminate when the number of augmenting paths found in one iteration falls below a certain threshold, also depending on k, that guarantees a (1+ε) approximation. The main challenge is to find those augmenting paths without requiring an excessive number of passes. In each iteration, using multiple passes, we grow a set of alternating paths in parallel, considering each edge as a possible extension as it comes along in the stream. Backtracking is used on paths that fail to grow any further. Crucial are the so-called position limits: when a matching edge is the i-th matching edge in a path and is then removed by backtracking, it will only be inserted into a path again at a position strictly less than i. This rule strikes a balance between terminating quickly on the one hand and giving the procedure enough freedom on the other.
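The streaming style this builds on can be illustrated with a much simpler multi-pass sketch; it is emphatically not the paper's algorithm (no position limits, no long augmenting paths). Pass 1 computes a greedy maximal matching (a 1/2-approximation); each further pass scans the edge stream once to collect disjoint length-3 augmenting paths and applies them. All names and the stopping rule are simplifications for illustration.

```python
"""Multi-pass semi-streaming matching sketch: greedy matching plus length-3 augmentation."""

def greedy_matching(edge_stream):
    matched = {}                       # vertex -> partner
    for u, v in edge_stream:
        if u not in matched and v not in matched:
            matched[u], matched[v] = v, u
    return matched

def augment_length3(edge_stream, matched):
    """One pass: for each matched vertex, remember one free neighbour seen in
    the stream, then apply vertex-disjoint length-3 augmentations."""
    free_nbr = {}                      # matched vertex -> one free neighbour
    for u, v in edge_stream:
        for a, b in ((u, v), (v, u)):  # a free endpoint next to a matched one
            if a not in matched and b in matched and b not in free_nbr:
                free_nbr[b] = a
    used = set()
    gained = 0
    for u, v in list(matched.items()):
        if u >= v:                     # visit each matched edge once
            continue
        a, b = free_nbr.get(u), free_nbr.get(v)
        if (a is not None and b is not None and a != b
                and a not in used and b not in used
                and a not in matched and b not in matched):
            # augment along a-u, u-v (dropped), v-b
            matched[a], matched[u] = u, a
            matched[b], matched[v] = v, b
            used.update((a, b))
            gained += 1
    return gained

# Toy usage: edges of the path a-b-c-d; the greedy pass may match only (b, c)
edges = [("b", "c"), ("a", "b"), ("c", "d")]
m = greedy_matching(edges)
while augment_length3(edges, m) > 0:   # extra passes until no further improvement
    pass
print(sorted({tuple(sorted(e)) for e in m.items()}))   # [('a', 'b'), ('c', 'd')]
```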