Full-text access type
Paid full text | 7976 articles |
Free | 1036 articles |
Free domestically | 645 articles |
Subject category
Electrical engineering | 1167 articles |
General | 657 articles |
Chemical industry | 584 articles |
Metalworking | 141 articles |
Machinery and instrumentation | 572 articles |
Building science | 367 articles |
Mining engineering | 293 articles |
Energy and power | 163 articles |
Light industry | 346 articles |
Hydraulic engineering | 140 articles |
Petroleum and natural gas | 195 articles |
Weapons industry | 89 articles |
Radio and electronics | 1434 articles |
General industrial technology | 763 articles |
Metallurgical industry | 302 articles |
Atomic energy technology | 144 articles |
Automation technology | 2300 articles |
Publication year
2024 | 39 articles |
2023 | 136 articles |
2022 | 197 articles |
2021 | 258 articles |
2020 | 273 articles |
2019 | 246 articles |
2018 | 202 articles |
2017 | 294 articles |
2016 | 380 articles |
2015 | 353 articles |
2014 | 514 articles |
2013 | 611 articles |
2012 | 590 articles |
2011 | 592 articles |
2010 | 462 articles |
2009 | 458 articles |
2008 | 468 articles |
2007 | 549 articles |
2006 | 481 articles |
2005 | 416 articles |
2004 | 330 articles |
2003 | 282 articles |
2002 | 238 articles |
2001 | 205 articles |
2000 | 170 articles |
1999 | 149 articles |
1998 | 114 articles |
1997 | 97 articles |
1996 | 84 articles |
1995 | 82 articles |
1994 | 63 articles |
1993 | 57 articles |
1992 | 46 articles |
1991 | 31 articles |
1990 | 33 articles |
1989 | 30 articles |
1988 | 20 articles |
1987 | 14 articles |
1986 | 21 articles |
1985 | 12 articles |
1984 | 8 articles |
1983 | 7 articles |
1982 | 9 articles |
1981 | 4 articles |
1980 | 8 articles |
1979 | 4 articles |
1976 | 3 articles |
1959 | 5 articles |
1958 | 2 articles |
1955 | 2 articles |
9,657 results found (search time: 15 ms)
61.
Tang Yuhui, 《组合机床与自动化加工技术》 (Modular Machine Tool & Automatic Manufacturing Technique), 2004, (7): 57-59
Building on an ideal edge-imaging model, this paper analyzes the relationship between the point spread function of a general imaging system and the spatial sampling frequency, and its effect on image edge-localization accuracy; the edge-localization error is derived. The shift of image edges caused by finite quantization precision is also analyzed, and the corresponding shift error is given. Finally, the application of this analysis to precision visual inspection of machined parts is described.
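The interplay between a Gaussian point spread function and quantization precision described above can be illustrated with a small numeric sketch. Everything here is hypothetical (the edge model, the centroid-based locator, the bit depths); it is not the paper's method, only a demonstration that coarser quantization perturbs a subpixel edge estimate.

```python
import numpy as np
from scipy.special import erf

def edge_position(profile):
    """Locate an edge as the centroid of the gradient magnitude (subpixel)."""
    g = np.abs(np.diff(profile))
    return np.sum(np.arange(len(g)) * g) / np.sum(g)

# Ideal blurred edge: error-function profile (Gaussian PSF of width sigma)
x = np.arange(64)
sigma = 2.0
true_edge = 31.7  # hypothetical subpixel edge location
profile = 0.5 * (1 + erf((x - true_edge) / (sigma * np.sqrt(2))))

# Quantize to 8 bits and to 3 bits, then compare the localized positions
for bits in (8, 3):
    levels = 2**bits - 1
    q = np.round(profile * levels) / levels
    print(bits, "bits -> edge at", edge_position(q))
```

Reducing the bit depth coarsens the staircase in the quantized profile, which shifts the gradient centroid and hence the measured edge position, mirroring the shift error the abstract refers to.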
62.
This paper presents a multivehicle sampling algorithm to generate trajectories for nonuniform coverage of a nonstationary spatiotemporal field characterized by spatial and temporal decorrelation scales that vary in space and time, respectively. The sampling algorithm described in this paper uses a nonlinear coordinate transformation that renders the field locally stationary so that existing multivehicle control algorithms can be used to provide uniform coverage. When transformed back to the original coordinates, the sampling trajectories are concentrated in regions of short spatial and temporal decorrelation scales. For fields with coupled spatial statistics, i.e., where the spatial decorrelation scales are functions of both spatial dimensions, the coordinate transformation is implemented numerically, whereas for decoupled spatial statistics the transformation is expressed analytically. We show that the analytical transformation results in vehicle motion that preserves the vehicle sampling speed (a measure of vehicle speed scaled by the ratio of the spatial and temporal decorrelation scales) in the original domain; the sampling speed determines the minimum number of vehicles needed to cover a spatiotemporal domain. Theoretical results are illustrated by numerical simulations.
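A one-dimensional analogue conveys the core idea: sampling uniformly in a warped coordinate maps back to points concentrated where the decorrelation scale is short. The sketch below is a simplification under assumed names (the decorrelation-scale field `L` is hypothetical), not the paper's multivehicle algorithm.

```python
import numpy as np

def nonuniform_waypoints(L, a, b, n, grid=10_000):
    """Place n waypoints on [a, b] so local spacing tracks the scale L(x).

    Uniform spacing in the warped coordinate s(x) = integral of dx'/L(x')
    maps back to points concentrated where L is small (short decorrelation
    scale), i.e. where the field varies quickly and needs dense sampling."""
    x = np.linspace(a, b, grid)
    s = np.concatenate([[0.0], np.cumsum(1.0 / L(x[:-1]) * np.diff(x))])
    targets = np.linspace(0, s[-1], n)   # uniform in the warped coordinate
    return np.interp(targets, s, x)      # invert the transform numerically

# Hypothetical decorrelation-scale field: short scales near x = 0.3
L = lambda x: 0.05 + (x - 0.3) ** 2
pts = nonuniform_waypoints(L, 0.0, 1.0, 25)
print(np.round(pts, 3))
```

The waypoints cluster around x = 0.3, where L is smallest, and spread out elsewhere, which is exactly the nonuniform-coverage behavior the abstract describes after transforming back to the original coordinates.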
63.
64.
65.
To improve the accuracy and integration of a charge-coupled device (CCD) non-contact one-dimensional measurement system, a measurement system built around a field-programmable gate array (FPGA) was designed. The CCD output signal is low-pass filtered and processed with correlated double sampling, reducing CCD signal noise. The analog signal is converted to a 12-bit digital signal and transferred to a FIFO embedded in the FPGA, improving the system's integration and stability. The drive timing generator was described in Verilog HDL, and the system's reliability and accuracy were verified with a Fraunhofer single-slit diffraction experiment. The experiments show that the system is stable, with an accuracy of 0.82%.
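Correlated double sampling works by reading each pixel twice, once at the reset level and once at the signal level, and subtracting: the reset (kTC) noise is identical in both reads and cancels. The toy model below, with invented noise magnitudes, illustrates that cancellation numerically; it is not the paper's FPGA implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
signal = 0.5                             # true pixel level (arbitrary units)
reset_noise = rng.normal(0, 0.05, n)     # kTC reset noise, shared by both reads
read_noise = 0.005                       # uncorrelated amplifier noise per read

# Two reads per pixel share the same reset offset
s_reset  = reset_noise + rng.normal(0, read_noise, n)
s_signal = signal + reset_noise + rng.normal(0, read_noise, n)

raw = s_signal                  # single read: reset noise remains
cds = s_signal - s_reset        # CDS: the correlated reset noise cancels

print("raw std:", raw.std(), "cds std:", cds.std())
```

With these assumed magnitudes the single-read noise is dominated by the 0.05 reset component, while the CDS output retains only the (sqrt(2)-inflated) read noise, which is why CDS is the standard front-end trick for lowering CCD signal noise.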
66.
67.
Many modeled and observed data are available only at coarse resolution and must be downscaled. This study develops a probabilistic method to downscale 3-hourly runoff to hourly resolution. Hourly data recorded at the Poldokhtar stream gauge (Karkheh River basin, Iran) during flood events (2009–2019) are divided into two groups, calibration and validation. Statistical tests, including the Chi-square and Kolmogorov–Smirnov tests, indicate that the Burr distribution is a suitable distribution function for the rising and falling limbs of the flood hydrographs in calibration (2009–2013). A conditional ascending/descending random sampling from the distributions fitted to the rising/falling limbs is applied to produce hourly runoff. The hourly downscaled runoff is rescaled against observations so that it matches the mean three-hourly data. To evaluate the developed method, statistical measures including root mean square error, Nash–Sutcliffe efficiency, the Kolmogorov–Smirnov statistic, and correlation are used to assess the performance of the downscaling not only in calibration but also in validation (2014–2019). Results show that the hourly downscaled runoff is in close agreement with observations in both periods. In addition, cumulative distribution functions of the downscaled runoff closely follow the observed ones on the rising and falling limbs in both periods. Although the performance of many statistical downscaling methods degrades at extreme values, the developed model performs well across quantiles (both less and more frequent values). The method, which can also downscale other hydroclimatological variables at any time and location, is useful for providing high-resolution inputs to drive other models. Furthermore, high-resolution data are required for valid and reliable analysis, risk assessment, and management plans.
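The sample-sort-rescale idea can be sketched compactly. The Burr XII shape parameters below are placeholders, not the fitted values from the study, and the block sizes and runoff values are invented; the sketch only shows the mechanics: draw hourly candidates from the fitted distribution, order them to respect the limb direction, then rescale so the block mean reproduces the observed 3-hourly value.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def downscale_block(q3h, rising, c=2.0, d=1.5):
    """Split one 3-hourly runoff value into 3 hourly values.

    Draw 3 samples from a (hypothetical) fitted Burr XII distribution,
    sort them ascending on a rising limb / descending on a falling limb,
    then rescale so their mean matches the observed 3-hourly value."""
    draws = np.sort(stats.burr12.rvs(c, d, size=3, random_state=rng))
    if not rising:
        draws = draws[::-1]
    return draws * (q3h / draws.mean())   # preserve the 3-hourly mean exactly

# Hypothetical rising limb of a flood: three consecutive 3-hourly values
hourly = np.concatenate([downscale_block(q, rising=True)
                         for q in [10.0, 25.0, 60.0]])
print(np.round(hourly, 2))
```

The rescaling step guarantees mass conservation per block, while the conditional sorting is what keeps the downscaled series monotone within each limb, matching the hydrograph shape the abstract describes.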
68.
Ebrahim Mahdipour, Amir Masoud Rahmani & Saeed Setayeshi 《International Journal of Systems Science》2014, 45(3): 373-383
Importance sampling is a technique commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We estimate the probability of network blocking for various sets of parameters, as well as the probability of customers missing their deadlines for different loads and deadlines. We finally show that the probability of total population overflow is affected by the deadline values, service rates, and arrival rates.
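The change-of-measure mechanics are easiest to see on a textbook rare event rather than the paper's Jackson network. The sketch below estimates P(X1+...+Xn > a) for i.i.d. Exp(1) variables by exponential tilting: sample from a tilted density that makes the rare region typical, then reweight by the likelihood ratio. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def rare_prob_is(n=10, a=30.0, theta=2.0 / 3.0, m=200_000):
    """Estimate P(X1+...+Xn > a), Xi ~ Exp(1), by exponential tilting.

    Sample Xi from the tilted density Exp(1 - theta) (mean 1/(1-theta) > 1,
    pushing mass toward the rare region) and reweight each sample by the
    likelihood ratio exp(-theta * S) / (1 - theta)**n."""
    x = rng.exponential(1.0 / (1.0 - theta), size=(m, n))
    s = x.sum(axis=1)
    w = np.exp(-theta * s) / (1.0 - theta) ** n
    return float(np.mean(w * (s > a)))

print(rare_prob_is())
```

With theta = 2/3 the tilted mean of the sum equals the threshold a, which is the large-deviations choice of measure; naive Monte Carlo would need on the order of 10^7 samples to see even a handful of hits for this event (its probability is about 7e-6), while the tilted estimator resolves it with a fraction of that budget.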
69.
We consider the minimization over probability measures of the expected value of a random variable, regularized by relative entropy with respect to a given probability distribution. In the general setting we provide a complete characterization of the situations in which a finite optimal value exists and the situations in which a minimizing probability distribution exists. Specializing to the case where the underlying probability distribution is Wiener measure, we characterize finite relative entropy changes of measure in terms of square integrability of the corresponding change of drift. For the optimal change of measure for the relative entropy weighted optimization, an expression involving the Malliavin derivative of the cost random variable is derived. The theory is illustrated by its application to several examples, including the case where the cost variable is the maximum of a standard Brownian motion over a finite time horizon. For this example we obtain an exact optimal drift, as well as an approximation of the optimal drift through a Monte Carlo algorithm.
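The finite-dimensional counterpart of this problem has a closed form worth recalling: minimizing E_Q[f] + KL(Q||P) over distributions Q yields the value -log E_P[exp(-f)], attained at the Gibbs tilt dQ*/dP proportional to exp(-f). The Monte Carlo check below uses the simple case f(x) = x with P = N(0,1), where the optimum is -1/2 and the minimizer is the shifted Gaussian N(-1,1); this is a sanity-check sketch, not the paper's Wiener-measure construction.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
f = x  # cost variable; the reference measure P is N(0, 1)

# Gibbs variational value: min_Q { E_Q[f] + KL(Q||P) } = -log E_P[exp(-f)]
value = -np.log(np.mean(np.exp(-f)))

# For f(x) = x under N(0,1): E[exp(-X)] = exp(1/2), so the optimum is -1/2,
# attained at dQ*/dP proportional to exp(-x), i.e. Q* = N(-1, 1) -- in the
# Wiener-measure setting this shift plays the role of the optimal drift.
print(value)
```

The estimate converges to -0.5 as the sample size grows, matching the analytic value, and the optimal tilt N(-1, 1) is the discrete analogue of the exact optimal drift the abstract mentions for the Brownian-maximum example.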
70.
Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, comprising seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, the Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods prove ineffective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. The Multivariate Adaptive Regression Splines (MARS), Delta Test (DT), and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA), and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them. (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effects and more than 1000 samples to assess the two-way interaction effects; OALH and LPτ (LPTAU) sampling techniques are more appropriate for it. For the Sobol' method, at least 1050 samples are needed to compute the first-order and total sensitivity indices correctly.
These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
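MOAT, the cheapest screening method above, reduces to computing elementary effects, one-parameter-at-a-time finite differences from random base points. The minimal sketch below uses a hypothetical four-parameter test function rather than SAC-SMA, and a simplified uniform-perturbation design rather than PSUADE's trajectory construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def morris_oat(model, k, r=40, delta=0.1):
    """Morris elementary-effects screening on [0, 1]^k.

    For each of r random base points, perturb one parameter at a time by
    delta and record the scaled response change (elementary effect, EE).
    mu_star (mean |EE|) ranks parameter importance; sigma (std of EE)
    flags nonlinearity and interactions."""
    ee = np.empty((r, k))
    for i in range(r):
        x = rng.uniform(0, 1 - delta, size=k)   # keep x + delta inside [0, 1]
        y0 = model(x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta
            ee[i, j] = (model(xp) - y0) / delta
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# Hypothetical model: x0 dominant, x1 moderate, x2 inert, x3 via interaction
model = lambda x: 5 * x[0] + 2 * x[1] + 4 * x[0] * x[3]
mu_star, sigma = morris_oat(model, k=4)
print(np.round(mu_star, 2), np.round(sigma, 2))
```

Each of the r repetitions costs k + 1 model runs (here 5), which is why MOAT can screen the important parameters with only a few hundred samples; the sigma column separates the purely linear x1 (sigma near zero) from x0, whose effect varies with x3.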