A total of 1,334 results matched the query; results 41–50 are listed below.
41.
We propose a general method for predicting multiple steps ahead of a target system while simultaneously estimating the prediction errors in real time. The only requirement of the proposed method is a time series measured from the target system. We demonstrate the method on artificial data, real wind speed data, and real solar irradiation data.
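The abstract does not name a specific forecasting model, so the following is only a hedged illustration of the general idea: fit a simple autoregressive model to the available time series, iterate it to predict several steps ahead, and estimate per-horizon prediction errors by backtesting on the same series. The function names, AR order and horizon are our own assumptions, not the authors' method.

```python
import numpy as np

def fit_ar(series, order=5):
    """Fit an AR(order) model by ordinary least squares."""
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs

def forecast(series, coefs, horizon):
    """Iterate the AR model to predict `horizon` steps ahead."""
    order = len(coefs)
    window = list(series[-order:])          # chronological order, oldest first
    preds = []
    for _ in range(horizon):
        nxt = float(np.dot(coefs, window))
        preds.append(nxt)
        window = window[1:] + [nxt]
    return np.array(preds)

def horizon_errors(series, order=5, horizon=10, n_backtests=50):
    """Estimate per-horizon RMS prediction errors by backtesting on the series."""
    errs, counts = np.zeros(horizon), np.zeros(horizon)
    coefs = fit_ar(series, order)
    for start in range(len(series) - horizon - order - n_backtests,
                       len(series) - horizon - order):
        hist = series[:start + order]
        truth = series[start + order:start + order + horizon]
        errs += (forecast(hist, coefs, horizon) - truth) ** 2
        counts += 1
    return np.sqrt(errs / counts)

# Example with a synthetic noisy oscillation standing in for wind-speed data.
rng = np.random.default_rng(0)
t = np.arange(500)
series = np.sin(0.2 * t) + 0.1 * rng.standard_normal(500)
coefs = fit_ar(series, order=5)
print("10-step forecast:", forecast(series, coefs, 10))
print("per-horizon RMSE:", horizon_errors(series))
```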
42.
Analysing and quantifying parametric uncertainties numerically is a tedious task, even more so when the system exhibits subcritical bifurcations. Here, a novel interpolation-based approach is presented and applied to two simple models exhibiting subcritical Hopf bifurcation. The integrated interpolation scheme is found to be significantly faster than traditional Monte Carlo-based simulations. The advantages of the scheme, and the reasons for its success compared with other uncertainty quantification schemes such as Polynomial Chaos Expansion (PCE), are highlighted. The paper also discusses the advantages of an equi-probable node distribution, which is seen to improve the accuracy of the proposed scheme. Probabilities of failure (POF) are defined and plotted for various operating conditions. The possibility of extending the scheme to experiments is also discussed.
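The following is a hedged sketch of the core idea only: a cheap interpolant, built from model evaluations at equi-probable nodes (equally spaced quantiles of the parameter distribution), replaces the expensive model during Monte Carlo propagation, and the probability of failure is read off the surrogate. The toy response function, parameter distribution, node count and failure threshold are all our assumptions and are unrelated to the models in the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.stats import norm

# Toy "expensive" model response as a function of one uncertain parameter.
def expensive_model(theta):
    return np.tanh(3.0 * (theta - 0.2)) + 0.1 * theta**2

# Uncertain parameter: theta ~ N(0, 0.5^2)  (assumed distribution)
dist = norm(loc=0.0, scale=0.5)

# Equi-probable nodes: parameter values at equally spaced quantiles,
# so each node carries the same probability mass.
n_nodes = 15
probs = (np.arange(n_nodes) + 0.5) / n_nodes
nodes = dist.ppf(probs)
responses = expensive_model(nodes)           # only n_nodes full-model runs

# Cheap surrogate: cubic-spline interpolant through the node responses.
surrogate = CubicSpline(nodes, responses)

# Propagate uncertainty through the surrogate with plain Monte Carlo.
rng = np.random.default_rng(1)
samples = dist.rvs(size=200_000, random_state=rng)
failure_threshold = 0.8                      # assumed failure criterion
pof_surrogate = np.mean(surrogate(samples) > failure_threshold)

# Reference: direct Monte Carlo on the full model (feasible only for a toy model).
pof_direct = np.mean(expensive_model(samples) > failure_threshold)
print(f"POF (surrogate): {pof_surrogate:.4f}, POF (direct MC): {pof_direct:.4f}")
```

Only the node evaluations require the full model; the large Monte Carlo sample is pushed through the inexpensive interpolant.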
43.
The present discourse is directed toward the community that wishes to generate or use flow reactor data from complex chemical reactions as kinetic model development and validation targets. Various methods for comparing experimental data and computational predictions are in evidence in the literature, along with limited insights into the uncertainties associated with each approach. Plug flow is most often assumed in such works as a simple, chemically insightful physical reactor model; however, only brief qualitative justifications for this interpretation are typically offered. Modern tools permit the researcher to quantitatively confirm the validity of the assumption. In a single complex reaction system, chemical time scales can change dramatically with the extent of reaction of the original reactants and with transitions across boundaries separating low-temperature, intermediate-temperature, and chain-branched (high-temperature) kinetic regimes. Such transitions can violate the underlying assumptions of the plug flow interpretation. Further, uncertainties in reaction initialization may confound the interpretation of experiments for which the plug flow assumption may be appropriate. Finally, the various methods of acquiring experimental data can also significantly influence experimental interpretations. The following discussions provide important background for those interested in critically approaching the relatively vast literature on the application of flow reactors for generating kinetic validation data. The less frequently discussed influences of reactor simulation assumptions on modeling predictions are addressed through examples for which the kinetic behavior of specific reactant combinations may cause experimental observations to depart locally from plug flow behavior.
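As a purely illustrative sketch of why changing chemical time scales matter for the plug flow reading, the code below integrates a generic single-step Arrhenius reaction through an adiabatic plug-flow reactor and flags, with a crude heuristic, where the chemical time scale 1/k(T) becomes comparable to an assumed axial-mixing time. The kinetics, rate parameters, reactor conditions and comparison threshold are all hypothetical and are not taken from the flow reactor studies discussed here.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative single-step A -> products with Arrhenius kinetics in an
# adiabatic plug-flow reactor, marched in residence time.
A_pre, Ea, R = 5.0e8, 1.4e5, 8.314       # pre-exponential (1/s), activation energy (J/mol)
T0, dT_ad = 900.0, 400.0                 # inlet temperature and adiabatic temperature rise (K)
tau_mix = 1.0e-2                         # assumed axial mixing/dispersion time scale (s)

def k_arr(T):
    return A_pre * np.exp(-Ea / (R * T))

def rhs(t, y):
    x_A = y[0]                            # remaining mole fraction of A
    T = T0 + dT_ad * (1.0 - x_A)          # adiabatic temperature rise with conversion
    return [-k_arr(T) * x_A]

sol = solve_ivp(rhs, (0.0, 0.5), [1.0], method="LSODA",
                dense_output=True, rtol=1e-8, atol=1e-10)

# The chemical time scale 1/k(T) changes strongly with extent of reaction; where it
# approaches the assumed mixing time (crude heuristic), the plug-flow reading is suspect.
for t in np.linspace(0.0, 0.5, 11):
    x_A = float(sol.sol(t)[0])
    T = T0 + dT_ad * (1.0 - x_A)
    tau_chem = 1.0 / k_arr(T)
    note = "plug-flow assumption questionable" if tau_chem < 10 * tau_mix else "plug-flow plausible"
    print(f"t={t:5.2f} s  x_A={x_A:6.3f}  T={T:6.1f} K  tau_chem={tau_chem:9.3e} s  {note}")
```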
44.
《Ergonomics》2012,55(4):361-380
An exposure measurement approach is described for quantifying repetitive hand activity of individual workers in a prospective epidemiological study of work-related upper extremity musculoskeletal disorders. A total of 733 subjects were involved in the study at baseline. Hand activities were quantified by force and repetition. Force levels were measured by workers' self-reports, ergonomists' estimates based on observation, and measurements with instrumentation. Repetition levels were measured by detailed time–motion analyses using two repetitive hand activity definitions, and by ergonomists' estimates using the American Conference of Governmental Industrial Hygienists (ACGIH) hand activity level scale and the Strain Index. The results showed that the present exposure assessment approach appears able to quantify the force level and repetitiveness of hand activities. Repetitive hand activity is quantified differently depending on whether forceful hand exertion or repetitive muscle activity is used as the definition; these definitions may quantify different physical exposure phenomena. Individual exposure assessment is important in epidemiological research on musculoskeletal disorders because there are interactions between the individual subjects and the measured parameters, and these interactions may vary between exposure parameters.
45.
This work presents uncertainty quantification, including parametric inference and uncertainty propagation, for CO2 adsorption in a hollow fiber sorbent, a complex dynamic chemical process. Parametric inference is performed with a Bayesian approach using Sequential Monte Carlo, a completely parallel algorithm, and predictions are obtained by propagating the posterior distribution through the model. Residual variability in the observed data and model inadequacy often present a significant challenge to parametric inference. In this work, residual variability in the observed data is handled by three different approaches: (a) performing inference with isolated data sets, (b) increasing the uncertainty in the model parameters, and (c) using a model discrepancy term to account for the uncertainty. The pros and cons of each of the three approaches are illustrated, along with the predicted distributions of CO2 breakthrough capacity for a scaled-up process. © 2016 American Institute of Chemical Engineers AIChE J, 62: 3352–3368, 2016
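As a hedged, self-contained illustration of the inference-plus-propagation workflow (not the hollow fiber sorbent model or data of this work), the sketch below runs a minimal likelihood-tempered Sequential Monte Carlo sampler on a toy forward model with an additive model discrepancy term, then propagates the posterior particles to obtain a predictive distribution. The forward model, priors, noise level and tuning constants are all our own assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Toy forward model standing in for the expensive process simulator.
def forward(theta, t):
    return theta[0] * (1.0 - np.exp(-theta[1] * t))

# Synthetic "observed" breakthrough-like data with noise and a small bias
# (the bias plays the role of model inadequacy / residual variability).
t_obs = np.linspace(0.5, 10.0, 20)
y_obs = forward([1.0, 0.6], t_obs) + 0.05 + 0.02 * rng.standard_normal(t_obs.size)

def log_prior(theta):
    # theta = [amplitude, rate, discrepancy]; broad, loosely informative priors.
    return (norm(1.0, 0.5).logpdf(theta[0]) + norm(0.5, 0.3).logpdf(theta[1])
            + norm(0.0, 0.1).logpdf(theta[2]))

def log_like(theta, sigma=0.02):
    resid = y_obs - (forward(theta[:2], t_obs) + theta[2])   # additive discrepancy term
    return float(np.sum(norm(0.0, sigma).logpdf(resid)))

# Minimal likelihood-tempered SMC: reweight, resample, jitter with one MH move.
n_part, betas = 500, np.linspace(0.0, 1.0, 21)
parts = np.column_stack([rng.normal(1.0, 0.5, n_part),
                         rng.normal(0.5, 0.3, n_part),
                         rng.normal(0.0, 0.1, n_part)])
loglik = np.array([log_like(p) for p in parts])
for b0, b1 in zip(betas[:-1], betas[1:]):
    w = np.exp((b1 - b0) * (loglik - loglik.max()))
    idx = rng.choice(n_part, n_part, p=w / w.sum())           # resample
    parts, loglik = parts[idx], loglik[idx]
    for i in range(n_part):                                   # one random-walk MH move
        prop = parts[i] + rng.normal(0.0, [0.05, 0.05, 0.01])
        ll_prop = log_like(prop)
        log_a = b1 * (ll_prop - loglik[i]) + log_prior(prop) - log_prior(parts[i])
        if np.log(rng.uniform()) < log_a:
            parts[i], loglik[i] = prop, ll_prop

# Push the posterior through the model to get a predictive distribution at a new condition.
pred = np.array([forward(p[:2], 12.0) + p[2] for p in parts])
print("posterior mean theta:", parts.mean(axis=0))
print("predictive mean / sd at t=12:", pred.mean(), pred.std())
```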
46.
Electron probe X-ray microanalysis enables concomitant observation of specimens and analysis of their elemental composition. The method is attractive for engineers developing tissue-compatible biomaterials: changes in the elemental composition of either the cells or the biomaterial can be determined according to well-established preparation and quantification procedures. However, qualitative and quantitative elemental analysis becomes more complicated when cells or thin tissue sections are deposited on biomaterials. X-ray spectra generated at the cell/tissue–biomaterial interface are modelled here using a Monte Carlo simulation of a cell deposited on borosilicate glass. Enhanced electron backscattering from the borosilicate glass was observed until the thickness of the biological layer deposited on the substrate reached 1.25 μm; it resulted in a significant increase in the X-ray intensities typical of the elements present in the cellular part. In this case, the mean atomic number of the biomaterial determines the strength of the effect. When elements are present only in the cells, a positive linear relationship appears between X-ray intensity and cell thickness; once the spatial extent of X-ray emission for those elements lies entirely within the biological part, the X-ray intensities become constant. When elements are present in both the cell and the biomaterial, X-ray intensities are registered for the biological part and the substrate simultaneously, leading to a negative linear relationship between X-ray intensity and cell thickness. For an element typical of the biomaterial alone, a strong decrease in X-ray emission is observed as a function of cell thickness, owing to X-ray absorption and to the excitation range being confined increasingly to the biological part rather than the substrate. Correction procedures for calculating element concentrations in thin films and coatings deposited on substrates are well established in materials science, but little is known about the factors that must be taken into account to accurately quantify bioelements in thin and semi-thick biological samples. Thorough tests of currently available quantification procedures are therefore required to verify their applicability to cells or tissues deposited on biomaterials.
47.
Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of the key structural parameters of lithium-ion battery electrode microstructures will enable optimization and motivate systematic numerical studies for the improvement of battery performance. With the rapid development of three-dimensional (3-D) imaging techniques, quantitative assessment of 3-D microstructures from two-dimensional (2-D) image sections by stereological methods may appear outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain 2-D data sets. In this study, stereological prediction and direct 3-D analysis techniques for quantitative assessment of the key geometric parameters characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on the reconstructed image volumes. The analysis showed that geometric parameter estimation from 2-D image sections is bound to be associated with ambiguity, and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially dependent parameters such as tortuosity and pore-phase connectivity.
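As a hedged illustration of volume-based 3-D quantification (using a synthetic segmented volume rather than the tomographic data of this study), the sketch below computes porosity and a simple pore-phase connectivity measure with scipy.ndimage and contrasts them with slice-by-slice 2-D estimates; tortuosity requires a transport or path-length calculation and is omitted here.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Synthetic stand-in for a segmented tomographic volume:
# True = pore phase, False = solid (active material + binder).
vol = ndimage.gaussian_filter(rng.standard_normal((100, 100, 100)), sigma=3) > 0.1

# Porosity: pore-voxel fraction of the whole volume.
porosity = vol.mean()

# Pore-phase connectivity: label connected pore clusters (26-connectivity)
# and report the fraction of pore voxels in the largest cluster.
labels, n_clusters = ndimage.label(vol, structure=np.ones((3, 3, 3)))
sizes = np.bincount(labels.ravel())[1:]            # skip background label 0
connectivity = sizes.max() / sizes.sum() if n_clusters else 0.0

# 2-D "stereological" porosity estimates from individual image sections,
# to contrast section-based and volume-based measurement.
section_porosity = vol.mean(axis=(1, 2))           # one estimate per slice

print(f"3-D porosity: {porosity:.3f}")
print(f"pore clusters: {n_clusters}, largest-cluster fraction: {connectivity:.3f}")
print(f"2-D slice porosity: mean {section_porosity.mean():.3f}, "
      f"spread {section_porosity.std():.3f}")
```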
48.
Scientists have already identified a large number of candidate protein/peptide biomarkers, and these biomarkers must undergo verification and validation before they can be translated into clinical applications. Absolute quantification of proteins/peptides plays a key role in the biomarker verification and validation process. Traditional protein quantification methods, such as the enzyme-linked immunosorbent assay (ELISA), suffer from problems including the difficulty of obtaining protein antibodies, batch-to-batch variability between antibodies, and cross-reactivity in antibody-based detection...
49.
Analysis of the chaotic recurrence characteristics of conductance fluctuation signals in gas–liquid two-phase flow   (total citations: 9; self-citations: 3; citations by others: 6)
金宁德  郑桂波  陈万鹏 《化工学报》2007,58(5):1172-1179
Using the classic Lorenz chaotic system and the logistic map, the influence of the phase-space embedding parameters (embedding dimension, delay time and threshold) on recurrence analysis was investigated. The results show that recurrence analysis does not depend strongly on the embedding dimension or the delay time: changing them only alters the numerical value of the recurrence rate without changing the character of the recurrence structure; likewise, the choice of threshold directly affects the number of recurrence points but does not change the recurrence structure. Finally, recurrence analysis was applied to characterize gas–liquid two-phase flow regimes in a vertical upward pipe. The results show that recurrence plots reflect the evolution of flow regimes well, and that the recurrence quantification measures are sensitive to changes in superficial gas velocity, providing useful feature measures for gas–liquid two-phase flow regime identification.
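As a hedged, generic sketch of recurrence analysis (not the authors' conductance-signal processing chain), the code below builds a time-delay embedding, thresholds pairwise phase-space distances into a recurrence matrix, and reports the recurrence rate; the embedding dimension, delay and threshold values are illustrative assumptions.

```python
import numpy as np

def embed(x, dim, delay):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

def recurrence_plot(x, dim=3, delay=5, threshold=0.2):
    """Binary recurrence matrix: points closer than `threshold` (as a fraction
    of the maximum phase-space distance) are marked as recurrent."""
    v = embed(np.asarray(x, dtype=float), dim, delay)
    d = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=-1)
    return d <= threshold * d.max()

def recurrence_rate(rp):
    """Fraction of recurrent points -- the simplest recurrence quantification measure."""
    return rp.mean()

# Example: the recurrence rate changes with the threshold, but the line/block
# texture of the plot (its structure) is what distinguishes signal types.
t = np.linspace(0, 20 * np.pi, 1000)
periodic = np.sin(t)
noisy = np.random.default_rng(4).standard_normal(1000)
for name, sig in [("periodic", periodic), ("noise", noisy)]:
    rp = recurrence_plot(sig, dim=3, delay=5, threshold=0.2)
    print(f"{name:>8s}: recurrence rate = {recurrence_rate(rp):.3f}")
```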
50.
Recurrence analysis and recurrence quantification analysis (RQA) were used to study the nonlinear characteristics of underwater acoustic signals. After computing recurrence plots for three typical signals (sinusoidal, random and Lorenz), recurrence plots of two real ship-radiated signals were analysed quantitatively and compared with their correlation dimensions. The results show that the RQA feature parameters are related to the chaotic characteristic parameters, and that using RQA measures as feature parameters allows effective recognition and classification of ship signals. This result provides a valuable reference for further research on the recognition and classification of underwater target signals.