41.
Many modeled and observed datasets are available only at coarse resolution and need to be downscaled. This study develops a probabilistic method to downscale 3-hourly runoff to hourly resolution. Hourly data recorded at the Poldokhtar Stream gauge (Karkheh River basin, Iran) during flood events (2009–2019) are divided into calibration and validation groups. Statistical tests, including the Chi-square and Kolmogorov–Smirnov tests, indicate that the Burr distribution is a suitable distribution for the rising and falling limbs of the flood hydrographs in the calibration period (2009–2013). Conditional ascending/descending random sampling from the distributions fitted to the rising/falling limbs is applied to produce hourly runoff. The hourly downscaled runoff is then rescaled against observations so that the mean of the three-hourly data is preserved. To evaluate the method, statistical measures including root mean square error, Nash–Sutcliffe efficiency, the Kolmogorov–Smirnov statistic, and correlation are used to assess its performance in both calibration and validation (2014–2019). Results show that the hourly downscaled runoff is in close agreement with observations in both periods. In addition, cumulative distribution functions of the downscaled runoff closely follow the observed ones on the rising and falling limbs in both periods. Although the performance of many statistical downscaling methods deteriorates for extreme values, the developed model performs well at different quantiles (both less and more frequent values). The method, which can also downscale other hydroclimatological variables at any time and location, is useful for providing high-resolution inputs to drive other models. Furthermore, high-resolution data are required for valid and reliable analysis, risk assessment, and management plans.
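A minimal sketch of the sampling-and-rescaling idea described above, assuming a Burr-XII fit via SciPy and a simple per-block rescaling; the limb segmentation, variable names, and sample data are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch: fit a Burr distribution to hourly runoff on one limb, draw ordered samples
# per 3-hour block, then rescale so the block mean equals the observed 3-hourly value.
# The data, the use of scipy's burr12, and the fixed loc are illustrative assumptions.
import numpy as np
from scipy import stats

def downscale_block(q3h, dist_params, rising=True, rng=None):
    """Produce 3 hourly values whose mean equals the observed 3-hourly runoff q3h."""
    rng = rng or np.random.default_rng()
    c, d, loc, scale = dist_params
    draws = stats.burr12.rvs(c, d, loc=loc, scale=scale, size=3, random_state=rng)
    draws = np.sort(draws)                    # ascending on the rising limb
    if not rising:
        draws = draws[::-1]                   # descending on the falling limb
    return draws * (q3h / draws.mean())       # rescale to preserve the 3-hour mean

# Example: calibrate on hourly runoff from rising limbs, then downscale one block.
hourly_rising = np.array([12.0, 15.0, 21.0, 30.0, 44.0, 61.0])   # hypothetical data
params = stats.burr12.fit(hourly_rising, floc=0.0)
print(downscale_block(q3h=25.0, dist_params=params, rising=True))
```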
42.
Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known about the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. We impose strict deadlines on a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We estimate the probability of network blocking for various sets of parameters, as well as the probability of customers missing their deadline for different loads and deadlines. Finally, we show that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
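To recall the basic estimator the abstract builds on, here is a toy sketch of importance sampling (simulate under a change of measure, reweight by the likelihood ratio) applied to a Gaussian tail probability; it is a generic illustration, not the Jackson-network estimator analysed in the article, and the threshold and tilt parameter are arbitrary choices.

```python
# Estimate the rare-event probability P(Z > 4) for Z ~ N(0, 1) by sampling from the
# tilted measure N(mu, 1) and reweighting each sample by the likelihood ratio dP/dQ.
import numpy as np

rng = np.random.default_rng(0)
a, mu, n = 4.0, 4.0, 100_000                    # threshold, tilt parameter, sample size

z = rng.normal(loc=mu, scale=1.0, size=n)       # samples drawn under the tilted measure
weights = np.exp(-mu * z + 0.5 * mu**2)         # likelihood ratio for a Gaussian mean shift
estimate = np.mean((z > a) * weights)

print(f"IS estimate of P(Z > 4): {estimate:.3e}")   # true value is about 3.17e-5
```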
43.
We consider the minimization over probability measures of the expected value of a random variable, regularized by relative entropy with respect to a given probability distribution. In the general setting we provide a complete characterization of the situations in which a finite optimal value exists and the situations in which a minimizing probability distribution exists. Specializing to the case where the underlying probability distribution is Wiener measure, we characterize finite-relative-entropy changes of measure in terms of square integrability of the corresponding change of drift. For the relative-entropy-weighted optimization, an expression for the optimal change of measure involving the Malliavin derivative of the cost random variable is derived. The theory is illustrated by its application to several examples, including the case where the cost variable is the maximum of a standard Brownian motion over a finite time horizon. For this example we obtain an exact optimal drift, as well as an approximation of the optimal drift through a Monte Carlo algorithm.
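For orientation, a standard variational identity consistent with this setting (stated from general theory, not quoted from the paper, and with the paper's precise conditions possibly differing): for a cost random variable $X$ and relative entropy $R(\cdot\,\|\,\cdot)$,

\[
\inf_{Q \ll P}\Big\{ \mathbb{E}_Q[X] + R(Q \,\|\, P) \Big\} \;=\; -\log \mathbb{E}_P\!\left[e^{-X}\right],
\]

and, when the right-hand side is finite, the infimum is attained by the tilted measure with density $\frac{dQ^*}{dP} = \frac{e^{-X}}{\mathbb{E}_P[e^{-X}]}$. In the Wiener-measure case, Girsanov's theorem identifies such an absolutely continuous change of measure with a change of drift, which is the object the abstract characterizes via square integrability and the Malliavin derivative.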
44.
Sensitivity analysis (SA) is a commonly used approach for identifying the important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) Among the qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening are shown to be ineffective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) Among the quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, a minimum of 1050 samples is needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
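As a reminder of how the MOAT screening mentioned above works, the sketch below implements the generic Morris elementary-effects method on the unit hypercube; it is not PSUADE's implementation, and the toy objective function stands in for a model such as SAC-SMA.

```python
# Generic Morris One-At-a-Time (elementary effects) screening: each trajectory
# perturbs one normalized parameter at a time, and parameters are ranked by the
# mean absolute elementary effect (mu*). Cost: n_traj * (dim + 1) model runs.
import numpy as np

def morris_screening(f, dim, n_traj=20, delta=0.25, rng=None):
    rng = rng or np.random.default_rng(0)
    effects = [[] for _ in range(dim)]
    for _ in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, size=dim)   # trajectory base point
        fx = f(x)
        for i in rng.permutation(dim):                # perturb one parameter at a time
            x_new = x.copy()
            x_new[i] += delta
            f_new = f(x_new)
            effects[i].append((f_new - fx) / delta)   # elementary effect of parameter i
            x, fx = x_new, f_new
    return np.array([np.mean(np.abs(e)) for e in effects])   # mu* per parameter

# Toy model: parameter 0 dominates, parameter 2 is inactive.
toy = lambda x: 5.0 * x[0] + 1.0 * x[1] + 0.0 * x[2] + 0.5 * x[0] * x[1]
print(morris_screening(toy, dim=3))
```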
45.
This paper reviews the state of the art in decentralized networked control systems, with an emphasis on the event-triggered approach. Models or agents with linear continuous-time time-invariant state-space dynamics are considered; they serve as the framework for network phenomena within two basic structures. I/O-oriented systems and interaction-oriented systems with disjoint subsystems are distinguished. The focus is on recent decentralized control design and co-design methods that offer effective tools for overcoming specific difficulties caused mainly by network imperfections. Such side effects include communication constraints, variable sampling, time-varying transmission delays, packet dropouts, and quantization. Decentralized time-triggered methods are briefly discussed. The review deals mainly with decentralized event-triggered methods. In particular, stabilizing controller–observer event-based controller design and decentralized state-controller co-design are presented within the I/O-oriented structures of large-scale complex systems. In this case the sampling instants depend only on local information provided by the local feedback loops. Minimum sampling-time conditions are discussed. Special attention is focused on the interaction-oriented system architecture. A model-based approach combined with event-based state-feedback controller design is presented, in which the event thresholds are fully decentralized. Finally, several selected open decentralized control problems are briefly outlined as current research challenges.
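To make the event-triggered idea concrete, here is a minimal single-loop sketch: the actuator holds the last transmitted state, and a new transmission is triggered only when the sampling error exceeds a threshold proportional to the state norm. The plant matrices, gain, and threshold are assumed values, not one of the decentralized designs reviewed in the paper.

```python
# Event-triggered state feedback: hold u = K @ x_last and refresh x_last only when
# ||x - x_last|| > sigma * ||x||. Dynamics, gain K, and sigma are illustrative assumptions.
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -0.5]])     # assumed continuous-time plant
B = np.array([[0.0], [1.0]])
K = np.array([[-1.0, -1.5]])                  # assumed stabilizing state-feedback gain
sigma, dt, T = 0.2, 1e-3, 10.0                # trigger level, integration step, horizon

x = np.array([1.0, 0.0])
x_last = x.copy()                             # last transmitted state
events = 0
for _ in range(int(T / dt)):
    if np.linalg.norm(x - x_last) > sigma * np.linalg.norm(x):
        x_last = x.copy()                     # event: transmit the current state
        events += 1
    u = K @ x_last                            # control based on the held sample
    x = x + dt * (A @ x + B @ u)              # Euler integration of the plant

print(f"events: {events}, final state norm: {np.linalg.norm(x):.4f}")
```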
46.
This paper studies the H∞ filtering problem for a class of networked systems with random sampling. The sensor's random sampling process is modeled as a Markov chain, the effect of data quantization is converted into parameter uncertainty of the model, and the packet dropout process is described by a binary random variable, so that the filtering error system is described by a Markovian uncertainty model with multiple random variables. Using Lyapunov stability theory and stochastic system analysis, sufficient conditions are derived under which the filtering error system is stochastically stable with a prescribed H∞ performance, and a filter design method is given. Simulation results verify the effectiveness of the proposed method.
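A rough illustration of the measurement channel described above, assuming a sector-bounded quantization uncertainty, a Bernoulli dropout variable, and a two-state Markov chain over sampling modes; all matrices, probabilities, and rates are made-up values, and the H∞ filter itself is not shown.

```python
# Sketch of the networked measurement model: quantization as a bounded multiplicative
# uncertainty (1 + Delta), packet dropout as a Bernoulli variable gamma, and the
# random-sampling mode evolving as a Markov chain. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
C = np.array([[1.0, 0.0]])        # assumed output matrix
delta_q = 0.1                     # quantizer sector bound, |Delta| <= delta_q
p_drop = 0.2                      # packet dropout probability
P = np.array([[0.9, 0.1],         # Markov transition matrix over sampling modes
              [0.3, 0.7]])

x = np.array([1.0, -0.5])         # assumed plant state (held fixed here for brevity)
mode = 0
for k in range(5):
    mode = rng.choice(2, p=P[mode])               # sampling mode follows a Markov chain
    Delta = rng.uniform(-delta_q, delta_q)        # quantization effect as parameter uncertainty
    gamma = float(rng.random() > p_drop)          # binary dropout variable: 1 received, 0 lost
    y = gamma * (1.0 + Delta) * (C @ x)           # measurement available to the filter
    print(k, mode, y)
```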
47.
To address the shortcomings of synchronous generator excitation control systems in control accuracy and stability, and based on an analysis of the development of excitation control systems at home and abroad, a digital excitation control system based on a 32-bit floating-point processor is proposed. By combining an advanced AC sampling algorithm with the high performance of the DSP, the system achieves deep digitization of the excitation controller; by acquiring and computing the key analog quantities of the synchronous generator system, the firing pulse angle α suitable for the current operating conditions is obtained. Tests verify that the design effectively resolves the stability problems in generator excitation control systems.
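A common textbook form of the AC sampling calculation mentioned above computes RMS values from N samples per fundamental cycle; the sketch below shows that step plus a purely illustrative proportional update of the firing angle α. The sample count, gains, limits, and control law are assumptions, not the paper's controller.

```python
# Discrete AC sampling (RMS over one cycle of N samples) and an illustrative
# proportional update of the firing angle alpha. All constants are assumed values.
import numpy as np

N = 32                                     # samples per fundamental cycle (assumed)
t = np.arange(N) / N
u_samples = 311.0 * np.sin(2 * np.pi * t)  # sampled stator voltage waveform (illustrative)

u_rms = np.sqrt(np.mean(u_samples ** 2))   # AC sampling: RMS from one cycle of samples

# Drive the measured RMS toward its setpoint; smaller alpha means more excitation.
u_ref, kp, alpha = 220.0, 0.05, 90.0       # setpoint (V), proportional gain, initial alpha (deg)
alpha = np.clip(alpha - kp * (u_ref - u_rms), 10.0, 150.0)
print(f"U_rms = {u_rms:.1f} V, firing angle alpha = {alpha:.1f} deg")
```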
48.
In autonomous satellite orbit determination, the geometric configuration of the measuring satellites has an important influence on orbit determination accuracy. At present, the influence of the geometry between a ground receiver and the navigation satellites on positioning accuracy is mainly measured by GDOP (geometric dilution of precision), whose lower bound limits the achievable positioning accuracy for a given measurement accuracy. By constructing a Walker-configuration navigation constellation, the boundary at which the ground receiver's GDOP reaches its minimum is determined, and this result is extended to the autonomous navigation environment; the minimum GDOP is obtained independently by uniform sampling and by a genetic algorithm, and the correctness of this value is verified with simulation data.
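GDOP itself is computed from the receiver-satellite geometry matrix; the sketch below shows the standard formulation GDOP = sqrt(trace((HᵀH)⁻¹)), where each row of H is a unit line-of-sight vector followed by a clock term of 1. The satellite positions are made-up values, not the Walker constellation studied here.

```python
# Standard GDOP computation from receiver-satellite geometry. Positions are illustrative.
import numpy as np

def gdop(receiver, satellites):
    rows = []
    for sat in satellites:
        los = sat - receiver                          # line-of-sight vector to the satellite
        rows.append(np.append(los / np.linalg.norm(los), 1.0))
    H = np.array(rows)                                # geometry matrix (one row per satellite)
    return np.sqrt(np.trace(np.linalg.inv(H.T @ H)))

receiver = np.array([0.0, 0.0, 6371.0])               # receiver on the Earth's surface (km)
sats = np.array([[ 10000.0,   5000.0, 23000.0],       # four illustrative satellite positions
                 [ -8000.0,  12000.0, 22000.0],
                 [  6000.0, -15000.0, 21000.0],
                 [-12000.0,  -4000.0, 24000.0]])
print(f"GDOP = {gdop(receiver, sats):.2f}")
```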
49.
In engineering, it is computationally prohibitive to employ costly models directly in optimization. Therefore, surrogate-based optimization is developed to replace the accurate models with cheap surrogates during optimization for efficiency. The two key issues of surrogate-based optimization are how to improve the surrogate accuracy by making the most of the available training samples, and how to sequentially augment the training set with a suitable infill strategy so as to gradually improve the surrogate accuracy and guarantee convergence to the real global optimum of the accurate model. To address these two issues, a radial basis function neural network (RBFNN) based optimization method is proposed in this paper. First, a linear interpolation (LI) based RBFNN modelling method, LI-RBFNN, is developed, which can enhance the RBFNN accuracy by enforcing a gradient match between the surrogate and the trend observed from the training samples. Second, a hybrid infill strategy is proposed, which uses a surrogate lower bound based on the surrogate prediction error as the optimization objective to locate the promising region, and meanwhile employs a linear interpolation-based sequential sampling approach to improve the surrogate accuracy globally. Finally, extensive tests are conducted and the effectiveness and efficiency of the proposed methods are demonstrated.
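To illustrate the generic surrogate-based optimization loop that the abstract builds on (fit a cheap surrogate, pick an infill point, evaluate the true model, refit), the sketch below uses SciPy's RBFInterpolator and a plain minimum-of-surrogate infill; it is not the LI-RBFNN model or the hybrid infill strategy proposed in the paper, and the objective function is a stand-in for an expensive simulation.

```python
# Generic surrogate-based optimization: fit an RBF surrogate to sampled points, find
# its minimum, evaluate the expensive model there, augment the training set, repeat.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

def expensive_model(x):                      # stand-in for a costly simulation
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x[0])

bounds = [(-1.0, 1.0), (-1.0, 1.0)]
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(10, 2))     # initial training samples
y = np.array([expensive_model(x) for x in X])

for it in range(15):                         # sequential infill iterations
    surrogate = RBFInterpolator(X, y, smoothing=1e-10)   # small smoothing for conditioning
    res = differential_evolution(lambda x: surrogate(x.reshape(1, -1))[0],
                                 bounds, seed=it, tol=1e-8)
    x_new = res.x                            # infill point: the surrogate minimizer
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_model(x_new)) # evaluate the true model and augment the set

best = X[np.argmin(y)]
print(f"best point {best}, value {y.min():.4f}")
```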
50.
Determination of trace chromium in tea by slurry sampling–flame atomic absorption spectrometry   Cited by: 3 (self-citations: 1, citations by others: 2)
Tea leaves were suspended in an agar colloid to form a slurry, which was sprayed directly into an air–acetylene flame, and trace chromium in the tea was determined by flame atomic absorption spectrometry. The method is simple, rapid, and of high sensitivity and precision; the results agree with those obtained by ashing the samples, and statistical testing shows no significant difference between the two sample pretreatment methods. The relative standard deviation is 3.4%–6.1%, and the spiked recovery is 95.7%–103.8%.