91.
This work is devoted to the development of a small-world network model to predict real-time fire spread onboard naval vessels. The model takes into account short-range and long-range connections between neighboring and remote compartments. Fire ignition and flashover, as well as fire transmission through walls and ventilation ducts, are simulated using time-dependent normal probability density functions. Mean durations of fire transmission through the walls and ducts are determined by a three-zone model and a one-dimensional CFD code, respectively. Dedicated experiments are conducted in a steel room representative of a naval vessel compartment in order to validate the zone model. A proof of concept is then developed by applying the network model to a full-scale vessel mockup composed of 113 compartments on 7 decks. A statistical study is conducted to produce fire risk maps, classifying the vessel compartments according to their propensity to burn.
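The small-world idea described above can be sketched in a few lines. The following is a minimal illustration, not the authors' model: the ring topology, `long_range` probability, and fixed per-step transmission probability `p_transmit` are all invented for the example, standing in for the paper's time-dependent normal transmission delays.

```python
import random

def build_small_world(n, k=2, long_range=0.1, rng=None):
    """Ring of n compartments, each linked to its k nearest neighbours
    (walls), plus random long-range links standing in for ducts."""
    rng = rng or random.Random(0)
    edges = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            edges[i].add((i + d) % n)
            edges[(i + d) % n].add(i)
    for i in range(n):
        if rng.random() < long_range:
            j = rng.randrange(n)
            if j != i:
                edges[i].add(j)
                edges[j].add(i)
    return edges

def simulate_fire(edges, start, p_transmit=0.3, steps=20, rng=None):
    """Each step, every burning compartment ignites each unburned
    neighbour with probability p_transmit; returns the burned set."""
    rng = rng or random.Random(1)
    burned, frontier = {start}, {start}
    for _ in range(steps):
        new = set()
        for node in frontier:
            for nb in edges[node]:
                if nb not in burned and rng.random() < p_transmit:
                    new.add(nb)
        burned |= new
        frontier = new
        if not frontier:
            break
    return burned

edges = build_small_world(113)   # 113 compartments, as in the mockup
burned = simulate_fire(edges, start=0)
```

Repeating such simulations over many ignition points and random seeds would give the kind of per-compartment burn statistics behind a fire risk map.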
92.
A transparent and comprehensive statistical system in China would provide an important basis for a better understanding of the country. This paper focuses on energy intensity (EI), one of China's most important indicators. It first reviews China's GDP and energy statistics, showing that China has made great improvements in recent years. The means by which EI data are released and adjusted are then explained, showing that EI releases do not provide complete data for calculating EI and constant GDP, which may reduce policy transparency and comprehensiveness. The paper then develops an EI calculation method that is based on official sources and respects the data availability of different release times. It finds that, in general, China's EI statistics can be considered reliable, because most of the results generated by the author's calculations match the figures in the official releases. However, two data biases were identified, which may necessitate supplementary information on the constant GDP values used in the official calculation of EI data. The paper concludes by proposing short- and long-term measures for improving EI statistics to provide a transparent and comprehensive EI indicator.
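For reference, EI itself is a simple ratio: energy consumption divided by constant-price GDP, with the year-on-year change being the figure usually quoted. A minimal sketch, using invented (non-official) numbers:

```python
def energy_intensity(energy, gdp_constant):
    """Energy intensity: energy consumption per unit of constant-price GDP."""
    return energy / gdp_constant

def ei_change(ei_now, ei_prev):
    """Year-on-year relative EI change, the figure typically released."""
    return (ei_now - ei_prev) / ei_prev

# Invented, non-official figures (energy in 100 Mtce, GDP in trillion yuan):
ei_2019 = energy_intensity(48.6, 98.65)
ei_2020 = energy_intensity(49.8, 101.36)
change = ei_change(ei_2020, ei_2019)
```

The paper's point is that without the constant-price GDP series, the denominator of this ratio cannot be reproduced from the releases alone.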
93.
Cycle time is the total elapsed time to move a unit of work from the beginning to the end of a physical process. It is a variable that is used to reduce the costs and increase the output of the physical process. In this note, the exact probability distribution of the cycle time variable is derived. Exact expressions are provided for its probability density function and moments. It is shown how these results could be used for cycle time reduction. Finally, a computer program and tabulations are provided for the associated percentage points.
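As a simple concrete case (not the note's general derivation): when the stages of the process are modeled as independent exponential service times, the cycle time follows an Erlang distribution with known exact moments. The three-stage, rate-2 setup below is invented for illustration, comparing the exact mean with a Monte-Carlo estimate:

```python
import random

def sample_cycle_time(rates, rng):
    """Cycle time as a sum of independent exponential stage times."""
    return sum(rng.expovariate(r) for r in rates)

def erlang_mean(k, rate):
    """Exact mean of the Erlang cycle time for k identical-rate stages."""
    return k / rate

rng = random.Random(42)
samples = [sample_cycle_time([2.0, 2.0, 2.0], rng) for _ in range(20000)]
mc_mean = sum(samples) / len(samples)
# The exact mean for three rate-2 stages is erlang_mean(3, 2.0) = 1.5
```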
94.
John Tukey and Martin Wilk described what they felt were the future needs in data analysis and statistics in the following terms:

“Current facilities for computing, display, and real time interaction have developed substantially beyond our understanding of how to use them effectively in data analysis. Current limitations in data analysis technology are mainly in explicating and organizing the science of data analysis and of defining and implementing the necessary associated computer software.

“From the statistical side of the discipline must come: broader, more permissive, empirically oriented concepts and theories; more inclusive and realistic classifications of objectives; more effective and coherent classifications of useful techniques; research toward more empirically informative techniques that will provide both exposure and summarization; more understanding and research on techniques of reforming and re-expressing variables; deeper insight into the psychology of graphs, pictures and output formats in general, both for interaction and for communication; progress toward standardized data structures of great flexibility and comprehensiveness.

“From the computing side of the discipline is required software to provide: convenience with flexibility, simple and effective bookkeeping and history keeping, adequate editing, effective means for treating output as input, more flexible and general graphical presentations, and a variety of means to facilitate real time interaction.

“Though some progress is being made on many of these needs, the technology of data analysis is still in its infancy.” (Tukey and Wilk, 1966)
95.
Surface soil contamination is often regulated by using guidance values that specify the maximum amount of pollutant that can be present without prompting a regulatory response. In the United States, there are at least 88 value sets, and another 35 worldwide, that provide guidance for at least one chlorinated ethene. Trichloroethene is the most commonly regulated chlorinated ethene (118 values) and may be the most commonly regulated synthetic organic surface soil contaminant. Cis- and trans-1,2-dichloroethene are the least regulated chlorinated ethenes. Overall, there are 617 guidance values for specific chlorinated ethenes plus another 32 for mixed isomers of dichloroethene. This analysis explores the origin, magnitude, and form of the variability of these values. Results indicate that values span from 4.9 to 6.6 orders of magnitude and follow distributions similar to lognormal random variables. However, the distributions include clusters of values similar to those advocated by the U.S. Environmental Protection Agency (USEPA) or the Canadian Council of Ministers of the Environment (CCME). Although only 9.5% of the regulatory guidance values (RGVs) are identical to USEPA or CCME values, 55% of them fall within the uncertainty bounds estimated for USEPA risk models. Results suggest that stronger national leadership and reduced risk model uncertainty could be effective in reducing the RGV variability of chlorinated ethenes.
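The span-in-orders-of-magnitude statistic quoted above is simply the base-10 logarithm of the ratio of the largest to the smallest guidance value. A small sketch with invented, illustrative values (not actual regulatory data):

```python
import math

def orders_of_magnitude(values):
    """Span of a value set in orders of magnitude: log10(max / min)."""
    return math.log10(max(values) / min(values))

# Invented, illustrative soil guidance values in mg/kg (not regulatory data)
rgvs = [0.005, 0.06, 0.5, 3.0, 28.0, 500.0, 14000.0]
span = orders_of_magnitude(rgvs)
```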
96.
Instead of the frequently applied monochromatic light probes, a white light fibre optic system was employed at the Laboratory of Mechanical Process Design, TU Dortmund University, in order to exploit the color information for concentration measurements within a bulk solid. The system is applied to obtain local particle concentrations of blue- and red-colored quartz sand within the bed of a rotary drum. Sixteen solid mixtures with one or two particle sizes from 100 μm to 2000 μm and different species concentrations were analyzed, and the relationship between probe measurement values and red sand content was determined by statistical regression methods. After transformation of the data, linear models were found to derive the red sand content from given measurement values. Based thereupon, an all-purpose scheme for mono- and bi-disperse solid mixtures was developed and verified in an example with a mean error of 5%.
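The regression step can be illustrated with a plain ordinary-least-squares fit. The probe readings and red-sand mass fractions below are invented for the example and are not the Dortmund measurements:

```python
def linear_fit(x, y):
    """Ordinary least squares for a single predictor: y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx          # slope
    a = my - b * mx        # intercept
    return a, b

# Invented probe readings vs known red-sand mass fractions (illustrative)
probe = [0.10, 0.25, 0.40, 0.55, 0.70]
red = [0.05, 0.20, 0.35, 0.52, 0.68]
a, b = linear_fit(probe, red)
```

Given a new probe reading `v`, the calibrated red-sand content is then `a + b * v`.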
97.
As semiconductor process technology advances, process variation in chip manufacturing becomes increasingly difficult to control, and digital back-end design places greater demands on timing analysis. With more and more process, voltage, and temperature (PVT) corners, the traditional static timing analysis (STA) method finds it increasingly difficult to accurately estimate the impact of process variation on design performance. This paper introduces a new statistics-based timing analysis method: Statistical Static Timing Analysis (SSTA). With an additional set of data, namely an accurate process-variation description file and statistically characterized standard-cell libraries, SSTA is expected to replace the traditional static timing analysis method, better mastering ever more advanced semiconductor process technologies and the design requirements of high-speed chips with tens of millions of gates.
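One common way to realize statistical timing is Monte-Carlo sampling of gate delays. The sketch below is a generic illustration of the idea, not a production SSTA engine: the 20-gate path, the 10% per-gate sigma, and the 1.3× corner margin are invented numbers.

```python
import random
import statistics

def sample_path_delay(nominals, sigma_frac, rng):
    """Path delay with independent Gaussian variation on each gate delay."""
    return sum(rng.gauss(d, sigma_frac * d) for d in nominals)

rng = random.Random(7)
gates = [0.10] * 20                       # 20 gates, 0.10 ns nominal each
samples = [sample_path_delay(gates, 0.10, rng) for _ in range(20000)]

corner = sum(d * 1.3 for d in gates)      # fixed worst-corner bound (+30%)
q997 = statistics.quantiles(samples, n=1000)[996]   # ~99.7% (3-sigma) point
# Independent variations partially cancel along the path, so the
# statistical 3-sigma delay sits well below the fixed-corner bound.
```

This cancellation of independent variations is the core argument for SSTA over corner-based STA: the pessimism of the fixed corner grows with path depth.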
98.
Propagation of uncertainty refers to the evaluation of uncertainty in output(s) given uncertainty in input(s). This can occur across a physical process, or can be predicted based on a process model. Uncertainty can be propagated analytically, by application of Taylor series variance propagation, or numerically, through repeated Monte-Carlo simulations. Propagation of uncertainty is an important concept in process engineering statistics that is not currently widely taught. In this paper, an approach is provided for teaching uncertainty propagation as part of a larger process engineering statistics course, applying analytical and numerical propagation principles, including consideration of correlation in inputs. A saline blending practical is used as a case study, with experimental and theoretical determination of how variability in feed pump flows determines variability in outlet conductivity. Based on a class of 132 second-year Chemical Engineering students, learning outcomes in analytical and numerical linear and non-linear propagation models were attained, and applicability and engagement within the core statistics course were enhanced. An engagement survey particularly noted that the students recognised the importance of propagation as a technical capability, but noted difficulties in linking the experimental work to the theory of propagation. Overall, propagation of uncertainty allows educators to increase the direct relevance of statistics to process engineering and to engage with students through their existing analytical capabilities.
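The two propagation routes the paper teaches, first-order Taylor series and Monte-Carlo, can be compared on a simplified blending balance. The sketch below uses outlet salt concentration as a stand-in for conductivity, with invented flow means and standard deviations, and assumes independent Gaussian flow variation (the paper's correlated case is not shown):

```python
import math
import random
import statistics

def outlet_conc(q1, q2, c1=0.0, c2=10.0):
    """Salt concentration after blending two feed streams (mass balance)."""
    return (q1 * c1 + q2 * c2) / (q1 + q2)

def taylor_sd(q1, q2, s1, s2, c1=0.0, c2=10.0):
    """First-order Taylor propagation, assuming independent flow errors."""
    c = outlet_conc(q1, q2, c1, c2)
    d1 = (c1 - c) / (q1 + q2)          # partial derivative dc/dq1
    d2 = (c2 - c) / (q1 + q2)          # partial derivative dc/dq2
    return math.sqrt((d1 * s1) ** 2 + (d2 * s2) ** 2)

rng = random.Random(3)
mc = [outlet_conc(rng.gauss(2.0, 0.1), rng.gauss(1.0, 0.1))
      for _ in range(50000)]           # numerical (Monte-Carlo) propagation
mc_sd = statistics.stdev(mc)
analytic_sd = taylor_sd(2.0, 1.0, 0.1, 0.1)
```

With the mildly non-linear blending equation, the linearized (Taylor) standard deviation and the Monte-Carlo estimate should agree closely, which is exactly the comparison the practical asks students to make.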
99.
A general procedure for estimating the population mean in random effects models of the nested and/or classification type is considered. The suggested estimator is unbiased and consistent (with respect to the structure of the experimental design). It is also optimal with respect to a particular quadratic, location-sensitive criterion. Finally, for experimental designs which contain a certain degree of structural balance, the suggested estimator coincides with the sample mean.
100.
This study deals with the uncertainty of the measurement of lattice parameters by CBED using the kinematic approximation. The analysis of a large number of diffraction patterns acquired on a silicon sample at 93 K with a LaB6 TEM without an energy filter shows the presence of both systematic and random errors. It is established that the random errors follow the normal statistical distribution and that the precision, quantified by the relative standard deviation, is about 3–4×10⁻⁴ for lattice parameter measurements made from a single pattern. The error sources are analyzed, different ways of enhancement are reviewed, and a new approach is proposed. It is shown that both accuracy and precision can be simply improved by using multiple-pattern analysis for the determination of the actual voltage, the single lattice parameter “a”, or the complete set of lattice parameters. A precision of about 1.5–2×10⁻⁴ can be reached using a minimum of three HOLZ line patterns for the single “a” parameter, and about 5×10⁻⁴ for the complete set of lattice parameters using six diffraction patterns. The use of multiple patterns also allows overcoming the non-uniqueness of solution linked to CBED studies.
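The benefit of multiple-pattern analysis for the random part of the error can be illustrated by simple averaging: the standard deviation of the mean of n independent patterns shrinks by a factor of √n. The sketch below assumes the single-pattern relative random error of roughly 3.5×10⁻⁴ reported above, with the silicon lattice constant 5.431 Å as a nominal value; it is a statistical illustration, not the paper's HOLZ-line fitting procedure:

```python
import random
import statistics

def measure_a(true_a=5.431, rel_sd=3.5e-4, rng=random):
    """One single-pattern measurement with ~3.5e-4 relative random error."""
    return rng.gauss(true_a, rel_sd * true_a)

def combined_a(n_patterns, rng):
    """Average over several patterns to shrink the random error by sqrt(n)."""
    return sum(measure_a(rng=rng) for _ in range(n_patterns)) / n_patterns

rng = random.Random(5)
single = [measure_a(rng=rng) for _ in range(5000)]
triple = [combined_a(3, rng) for _ in range(5000)]
# stdev(triple) should be roughly stdev(single) / sqrt(3)
```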