Search results: 1,592 records found (showing items 101–110).
101.
Modern products frequently feature monitors designed to detect actual or impending malfunctions. False alarms (Type I errors) and excessive delays in detecting real malfunctions (Type II errors) can seriously reduce a monitor's utility. Sound engineering practice includes physical evaluation of error rates. Type II error rates are relatively easy to evaluate empirically. Adequate evaluation of a low Type I error rate, however, is difficult without accelerated testing concepts: inducing false alarms with artificially low thresholds and then selecting production thresholds by appropriate extrapolation, as outlined here. This acceleration methodology allows informed selection of detection thresholds and confidence in monitor performance, with substantial reductions in the time and cost required for monitor development relative to current alternatives. Copyright © 2006 John Wiley & Sons, Ltd.
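The extrapolation idea in the abstract above can be sketched numerically. The log-linear false-alarm-rate model, the measured rates, and the target rate below are illustrative assumptions for the sketch, not the paper's actual model:

```python
# Sketch: pick a production threshold by extrapolating false-alarm rates
# measured at artificially low (accelerated) thresholds.
# Assumption: log(rate) is roughly linear in the threshold.
import math

def fit_log_linear(thresholds, rates):
    """Least-squares fit of log(rate) = a + b * threshold."""
    n = len(thresholds)
    ys = [math.log(r) for r in rates]
    mx = sum(thresholds) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(thresholds, ys))
         / sum((x - mx) ** 2 for x in thresholds))
    a = my - b * mx
    return a, b

def production_threshold(a, b, target_rate):
    """Threshold whose extrapolated false-alarm rate equals target_rate."""
    return (math.log(target_rate) - a) / b

# Accelerated test: low thresholds yield measurable false-alarm rates.
thr = [1.0, 1.5, 2.0, 2.5]
rate = [0.20, 0.05, 0.012, 0.003]          # illustrative false alarms per unit time
a, b = fit_log_linear(thr, rate)
t_prod = production_threshold(a, b, 1e-6)  # target: one alarm per 10^6 units
```

The extrapolated production threshold lies well above the accelerated test range, which is exactly the regime where direct empirical evaluation of the Type I rate would be prohibitively slow.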
102.
Optical tweezers are powerful tools for manipulating single DNA molecules under fluorescence microscopy, particularly in nanotechnology-based DNA analysis. We previously proposed a manipulation technique using microstructures driven by optical tweezers that allows the handling of single giant DNA molecules of millimetre length, which cannot be manipulated by conventional techniques. To develop this technique further, the authors characterised the microstructures quantitatively from the viewpoint of fabrication and of the efficiency of DNA manipulation under a fluorescence microscope. The success rate and precision of fabrication were evaluated. The results indicate that the microstructures are obtained in aqueous solution with a precision of ∼50 nm at concentrations on the order of 10⁶ particles/ml. The visibility of these microstructures under a fluorescence microscope was also characterised, along with the fabrication parameters needed to fine-tune visibility. Manipulating yeast chromosomal DNA molecules with the microstructures illustrated the relationship between manipulation efficiency and the geometrical shape of the microstructure. This report provides guidelines for designing microstructures used in single-DNA-molecule analysis based on on-site DNA manipulation, and is expected to broaden the applications of this technique. Inspec keywords: DNA, molecular biophysics, fluorescence, optical microscopy, radiation pressure, biological techniques. Other keywords: optically driven microstructures, single DNA molecule analysis, fluorescence microscopy, optical tweezers, nanotechnology-based DNA analysis, manipulation technique, aqueous solution, fine-tune visibility, yeast chromosomal DNA molecules, geometrical shape, on-site DNA manipulation.
103.
The strong relationship between bank failures and economic growth lends great importance to the predictability of bank failures, and numerous statistical prediction models in the literature focus on this subject. Artificial intelligence techniques have also gained increasing prominence owing to their predictive success. This study distinguishes itself from similar work by comparing three artificial intelligence methods, namely support vector machines (SVMs), radial basis function neural networks (RBF-NN) and multilayer perceptrons (MLPs), and by subjecting the explanatory variables to principal component analysis (PCA). The study covers 37 privately owned commercial banks (17 failed, 20 non-failed) operating in Turkey over the period 1997–2001. The main conclusions are: (i) PCA does not appear to be effective in improving predictive power; (ii) SVMs and RBF-NN demonstrated similar levels of predictive power, although SVMs proved the best model in terms of total predictive power; (iii) the MLP method stood out negatively, exhibiting the lowest predictive power.
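The comparison design can be sketched with scikit-learn. The synthetic data, the model settings, and the PCA dimensionality below are illustrative assumptions; the study's 37-bank Turkish dataset and its specific RBF network are not reproduced:

```python
# Sketch: compare SVM and MLP classifiers on synthetic "financial ratio"
# data, with and without PCA applied to the inputs.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for bank financial ratios and failed/non-failed labels.
X, y = make_classification(n_samples=200, n_features=20, n_informative=6,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "SVM+PCA": make_pipeline(StandardScaler(), PCA(n_components=6),
                             SVC(kernel="rbf")),
    "MLP": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(16,),
                                       max_iter=2000, random_state=0)),
}
scores = {name: m.fit(Xtr, ytr).score(Xte, yte) for name, m in models.items()}
```

Comparing the held-out accuracies with and without the PCA stage mirrors the study's finding that dimensionality reduction does not automatically improve predictive power.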
104.
This paper presents a novel training algorithm for adaptive neuro-fuzzy inference systems. The algorithm combines the error back-propagation algorithm with the variable structure systems approach. Expressing the parameter update rule as a dynamic system in continuous time and applying sliding mode control (SMC) methodology to the dynamic model of the gradient-based training procedure yields the parameter-stabilizing part of the training algorithm. The proposed combination therefore exhibits a degree of robustness to the unmodelled multivariable internal dynamics of the gradient-based training algorithm. With conventional training procedures, excitation of these dynamics during a training cycle can lead to instability, which may be difficult to alleviate owing to the multidimensionality of the solution space and ambiguities concerning the environmental conditions. This paper shows that a neuro-fuzzy model can be trained so that the adjustable parameter values are forced to settle down (parameter stabilization) while minimizing an appropriate cost function (cost optimization) based on state-tracking performance. In the application example, trajectory control of a two-degrees-of-freedom direct-drive robotic manipulator is considered, with an adaptive neuro-fuzzy inference mechanism as the controller and the proposed algorithm used for parameter tuning.
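The core idea of augmenting a gradient update with a sliding-mode term can be sketched on a toy scalar model y = w·x. The plant, the gains, and the cost below are illustrative assumptions, not the paper's neuro-fuzzy controller:

```python
# Sketch: gradient descent plus a sliding-mode (sign-of-error) correction
# term, which drives the parameter toward the sliding surface e = 0 and
# keeps it there (parameter stabilization) while the gradient term
# minimizes the quadratic cost 0.5 * e**2.
def train(w, data, eta=0.05, k=0.01, steps=200):
    for _ in range(steps):
        for x, y_ref in data:
            e = w * x - y_ref            # tracking error (sliding variable)
            grad = e * x                 # gradient of 0.5 * e**2 w.r.t. w
            sgn = (e > 0) - (e < 0)      # sign(e): robustifying SMC term
            w -= eta * grad + k * sgn * x
    return w

data = [(1.0, 2.0), (2.0, 4.0), (0.5, 1.0)]  # consistent with y = 2x
w_final = train(0.0, data)
```

The sign term leaves a small chattering band around the solution w = 2, which is the price of the added robustness; the paper's continuous-time treatment handles this more carefully.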
105.
Despite the huge success of the Internet in providing basic communication services, its economic architecture needs to be upgraded to provide end-to-end guaranteed or more reliable services to its customers. Currently, a user or an enterprise that needs end-to-end bandwidth guarantees between two arbitrary points in the Internet for a short period of time has no way of expressing its needs. To enable these much-needed basic services, we propose a single-domain edge-to-edge (g2g) dynamic capacity contracting mechanism, whereby a network customer can enter into a bandwidth contract on a g2g path at a future time, at a predetermined price. For practical and economic viability, such forward contracts must involve a bailout option to account for bandwidth becoming unavailable at service delivery time, and must be priced appropriately to enable Internet Service Providers (ISPs) to manage risks in their contracting and investments. Our design allows ISPs to advertise a different point-to-point price for each of their g2g paths instead of the current point-to-anywhere prices, allowing discovery of better end-to-end paths, temporal flexibility and efficient bandwidth usage. We compute the risk-neutral prices for these g2g bailout forward contracts (BFCs), taking into account correlations between contracts due to correlated demand patterns and overlapping paths. We apply this multiple-g2g-BFC framework to network models with Rocketfuel topologies, and evaluate the contracting mechanism in terms of key network performance metrics such as the fraction of bailouts, the revenue earned by the provider, and adaptability to link failures. We also explore the trade-offs between pricing complexity and the performance benefits of the BFC mechanism.
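The pricing of a contract with a bailout option can be sketched with a Monte Carlo estimate of the discounted expected payoff. The demand model, the rebate, and all parameters below are illustrative assumptions, not the paper's risk-neutral pricing framework:

```python
# Sketch: Monte Carlo value of a bailout forward contract (BFC) on one
# g2g path. If spare capacity covers the contract at delivery time, the
# buyer receives bandwidth worth the spot value; otherwise the provider
# "bails out" and pays a rebate. The contract's fair price is the
# discounted expected payoff under the assumed demand distribution.
import random

def bfc_price(spot_mean, spot_sd, capacity, demand_mean, demand_sd,
              contracted, rebate, r=0.01, n=100_000, seed=1):
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        spot = max(0.0, random.gauss(spot_mean, spot_sd))
        background = max(0.0, random.gauss(demand_mean, demand_sd))
        if capacity - background >= contracted:
            total += spot * contracted   # delivered: worth spot value
        else:
            total -= rebate              # bailout: provider pays rebate
    return (total / n) / (1 + r)         # discounted expected payoff

price = bfc_price(spot_mean=1.0, spot_sd=0.2, capacity=10.0,
                  demand_mean=6.0, demand_sd=2.0, contracted=2.0,
                  rebate=0.5)
```

The paper's setting is richer: it prices many overlapping g2g contracts jointly, so correlated demand across shared links shifts each contract's bailout probability and hence its price.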
106.
Symbolic Regression (SR) analysis, employing a genetic programming (GP) approach, was used to analyse laboratory strength and elasticity modulus data for some granitic rocks from selected regions in Turkey. Total porosity (n), sonic velocity (vp), point load index (Is) and Schmidt Hammer values (SH) for test specimens were used to develop relations between these index tests and uniaxial compressive strength (σc), tensile strength (σt) and elasticity modulus (E). Three GP models were developed. Each GP model was run more than 50 times to optimise the GP functions. Results from the GP functions were compared with the measured data set and it was found that simple functions may not be adequate in explaining strength relations with index properties. The results also indicated that GP is a potential tool for identifying the key and optimal variables (terminals) for building functions for predicting the elasticity modulus and the strength of granitic rocks.
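The flavour of searching a space of candidate functional forms can be conveyed with a much-simplified stand-in for GP: random coefficient search over a few fixed expression templates linking one index property (porosity) to strength. The synthetic data and the candidate forms are illustrative; the study's actual GP trees and rock data are not used:

```python
# Sketch: crude random search over expression templates, a toy stand-in
# for genetic programming symbolic regression.
import math
import random

random.seed(0)
# Synthetic "measurements": strength (MPa) decays with porosity (%).
n_vals = [float(i) for i in range(1, 21)]
sigma_c = [120.0 * math.exp(-0.1 * n) for n in n_vals]

CANDIDATES = [
    ("a + b*n",       lambda n, a, b: a + b * n),
    ("a * exp(b*n)",  lambda n, a, b: a * math.exp(b * n)),
    ("a / (1 + b*n)", lambda n, a, b: a / (1 + b * n)),
]

def rmse(f, a, b):
    return math.sqrt(sum((f(n, a, b) - s) ** 2
                         for n, s in zip(n_vals, sigma_c)) / len(n_vals))

best = None  # (error, template name, a, b)
for name, f in CANDIDATES:
    for _ in range(5000):                # random coefficient search
        a = random.uniform(0.0, 200.0)
        b = random.uniform(-20.0, 20.0)
        try:
            err = rmse(f, a, b)
        except (OverflowError, ZeroDivisionError):
            continue
        if best is None or err < best[0]:
            best = (err, name, a, b)
```

Real GP differs in the essential respect that the templates themselves evolve (crossover and mutation over expression trees), which is what lets it discover that "simple functions may not be adequate" for these strength relations.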
107.
A promising line of research for radar systems attempts to optimize detector thresholds so as to maximize the overall performance of a radar detector–tracker pair. In the present work, we move toward fulfilling this promise by considering a particular dynamic optimization scheme that relies on a non-simulation performance prediction (NSPP) methodology for the probabilistic data association filter (PDAF), namely the modified Riccati equation (MRE). Using a suitable functional approximation, we propose a closed-form solution for the special case of a Neyman–Pearson (NP) detector. The proposed solution replaces previously proposed iterative formulations and yields a dramatic improvement in computational complexity without sacrificing system performance. Moreover, it provides a theoretical lower bound on the detection signal-to-noise ratio (SNR) indicating when the whole tracking system should be switched to track-before-detect (TBD) mode.
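The Pd–Pfa–SNR coupling that such a scheme optimises can be illustrated with the standard closed form for a Swerling-I fluctuating target in an NP square-law detector, Pd = Pfa^(1/(1+SNR)). The threshold-selection scan below uses a toy utility and is an illustrative assumption, not the paper's MRE-based optimisation:

```python
# Sketch: for a Swerling-I target, detection probability has the closed
# form Pd = Pfa**(1/(1+SNR)), so the detector operating point can be
# scanned directly instead of simulated.
def pd_swerling1(pfa, snr):
    """Detection probability for a Swerling-I target, NP detector."""
    return pfa ** (1.0 / (1.0 + snr))

def best_pfa(snr, pfas, false_alarm_cost=5.0):
    """Pfa maximising a toy utility: Pd minus a false-alarm penalty."""
    return max(pfas, key=lambda p: pd_swerling1(p, snr) - false_alarm_cost * p)

pfas = [10.0 ** -k for k in range(1, 9)]
pfa_star = best_pfa(snr=10.0 ** (13.0 / 10.0), pfas=pfas)  # 13 dB SNR
```

The closed form makes the trade-off explicit: lowering Pfa always lowers Pd, and the rate of that loss grows as SNR falls, which is the regime where switching to track-before-detect becomes attractive.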
108.
Swage casting is a new casting technique that combines the advantages of the squeeze, centrifugal and semi-solid casting methods. In this new method, components with one rotational axis can be produced on a swage casting machine from molten metal in a one-step operation. A “bomb-body” shape was chosen to demonstrate the advantages of the method using the A380 Al–Si–Cu alloy; the same alloy was also cast by the centrifugal and squeeze casting methods. In this study, the swage casting method and its features are briefly described, and the final microstructures, mechanical properties and porosity of the cast pieces produced by squeeze, centrifugal and swage casting are compared. Swage-cast pieces showed a distinct microstructure consisting of fine dendritic particles at the chill ends and a mixture of spherical and rosette-shaped particles at the core. The swage-cast pieces also have slightly higher mechanical strength, as indicated by tensile strength and Brinell hardness values.
109.
Two-armed poly(ε-caprolactone) (TAPCL) polymers were successfully synthesized via the ring-opening polymerization (ROP) of ε-caprolactone (ε-CL) using the Schiff's base complexes [Cu(SAEE)2] (1) and [Ni(SAEE)2] (2), which have two hydroxyl functional groups, as two-site initiators, and tin(II) 2-ethylhexanoate (Sn(Oct)2) as the catalyst, in bulk at 115 °C. The Schiff's base complexes (1 and 2) were synthesized by the concentrated template synthesis method starting from salicylaldehyde, 2-(2-aminoethoxy)ethanol and the related metal acetate salts. The synthesized TAPCL polymers were characterized by GPC, FTIR, UV–vis and electron paramagnetic resonance (EPR) spectroscopy. The molecular weights of the TAPCL polymers increased linearly with increasing molar ratio of monomer to initiator. The FTIR, UV–vis and EPR results indicated that the TAPCL polymers carry the Schiff's base complexes at the junction point of the PCL arms. The crystallization behavior of TAPCL was studied by differential scanning calorimetry (DSC) and polarizing optical microscopy (POM), and its thermal behavior was investigated by thermogravimetric analysis (TGA).
110.
In this paper, we study an implicit iterative algorithm for two nonexpansive mappings and two finite families of nonexpansive mappings in Banach spaces. We prove weak and strong convergence theorems for these iterative algorithms. Our results extend several existing results.
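A minimal numerical illustration of the setting (not the paper's implicit algorithm): the classical Krasnosel'skii–Mann averaged iteration x_{n+1} = (1−a)x_n + aT(x_n) for a single nonexpansive map T, here an illustrative 90° rotation of the plane about (1, 1):

```python
# Sketch: the Mann averaged iteration converges to the fixed point of a
# nonexpansive map even when plain iteration of T would orbit forever,
# as it does for this rotation (an isometry, hence nonexpansive).
def T(x):
    """Rotation by 90 degrees about (1, 1); unique fixed point (1, 1)."""
    cx, cy = x[0] - 1.0, x[1] - 1.0
    return (1.0 - cy, 1.0 + cx)

def mann(x, a=0.5, steps=200):
    """Averaged iteration x <- (1 - a) x + a T(x)."""
    for _ in range(steps):
        tx = T(x)
        x = ((1 - a) * x[0] + a * tx[0], (1 - a) * x[1] + a * tx[1])
    return x

x_star = mann((5.0, -3.0))   # converges to the fixed point (1, 1)
```

The averaging is what produces convergence: the averaged map (1−a)I + aT has linear part of modulus √((1−a)² + a²) < 1 around the fixed point, while T itself preserves distances and never settles.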