101.
Bacteriophages, as accessory genetic elements, play a crucial role in the dissemination of genes and the promotion of genetic diversity within bacterial populations. Such horizontal transfer of DNA is critical in the emergence of new pathogenic organisms, through the dissemination of genes encoding virulence factors such as toxins, adhesins and aggressins. Phages can transfer genes that are not necessary for bacteriophage persistence; such phages are generally recognised by their ability to convert their host bacteria to new phenotypes, a phenomenon known as phage conversion. If the converting genes encode virulence factors, the consequences of phage infection may include increased virulence of the host bacteria and the conversion of a non-pathogenic strain into a potentially dangerous pathogen. A number of virulence factors in bacteria causing diseases in plants, animals and humans are encoded by converting phages, the vast majority of which are temperate rather than lytic in nature. © 2001 Society of Chemical Industry
102.
103.
104.
Geological Process Models (GPMs) have been used in the past to simulate the distinctive stratigraphies formed in carbonate sediments, and to explore the interaction of controls that produce heterogeneity. Previous GPMs have only indirectly included the supersaturation of calcium carbonate in seawater, a key physicochemical control on carbonate production in reef and lagoon environments, by modifying production rates based on the distance from open marine sources. Here we use the residence time of water in the lagoon and reef areas as a proxy for the supersaturation state of carbonate in a new process model, Carbonate GPM. Residence times in the model are calculated using a particle-tracking algorithm. Carbonate production is also controlled by water depth and wave power dissipation. Once deposited, sediment can be eroded, transported and re-deposited via both advective and diffusive processes. We show that using residence time as a control on production might explain the formation of non-ordered, three-dimensional carbonate stratigraphies by lateral shifts in the locus of carbonate deposition on timescales comparable to so-called 5th-order sea-level oscillations. We also show that representing supersaturation as a function of distance from open marine sources, as in previous models, cannot correctly predict the supersaturation distribution over a lagoon due to the intricacies of the flow regime.
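The residence-time idea can be illustrated with a toy particle-tracking estimate. The sketch below is a hypothetical 1-D random walk, not the Carbonate GPM algorithm (the lagoon geometry, diffusivity, and release point are all invented for illustration): particles released mid-lagoon walk until they cross the open-marine boundary, and the mean first-passage time serves as the residence-time proxy.

```python
import random

def estimate_residence_time(lagoon_length, diffusivity, dt,
                            n_particles, max_steps, seed=0):
    """Mean residence time of water parcels in a 1-D lagoon (illustrative).

    Particles random-walk from mid-lagoon until they cross the open-marine
    boundary at x = 0; the landward shore at x = lagoon_length reflects.
    """
    rng = random.Random(seed)
    step = (2.0 * diffusivity * dt) ** 0.5   # RMS displacement per time step
    exit_times = []
    for _ in range(n_particles):
        x = lagoon_length / 2.0              # release point: mid-lagoon
        for k in range(max_steps):
            x += rng.choice((-step, step))
            if x <= 0.0:                     # flushed to the open ocean
                exit_times.append((k + 1) * dt)
                break
            x = min(x, lagoon_length)        # reflect at the landward shore
    return sum(exit_times) / len(exit_times)

# A longer lagoon holds water longer, so supersaturation is drawn down more.
t_long = estimate_residence_time(100.0, 1.0, 1.0, 200, 200_000)
t_short = estimate_residence_time(20.0, 1.0, 1.0, 200, 200_000)
```

The comparison at the end shows why residence time, unlike raw distance from open marine sources, responds to the whole flow geometry: doubling the basin size more than doubles the flushing time.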
105.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE), in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories), initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, "Quantification of Margins and Uncertainties: Conceptual and Computational Basis," describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.
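The margin/uncertainty pairing at the heart of QMU can be sketched numerically. The snippet below is a hypothetical illustration only: the performance samples, the requirement, and the choice of two standard deviations as the uncertainty measure are assumptions for the sketch, not taken from the cited analyses, which use far richer uncertainty characterizations.

```python
import statistics

def qmu_confidence_ratio(performance_samples, requirement):
    """A simple margin-to-uncertainty ratio M/U.

    M = distance from mean simulated performance to the requirement;
    U = a spread measure of the samples (here, two standard deviations).
    """
    mean = statistics.fmean(performance_samples)
    margin = mean - requirement                     # M: headroom above requirement
    uncertainty = 2.0 * statistics.stdev(performance_samples)  # U: sample spread
    return margin / uncertainty

# Hypothetical Monte Carlo results; performance must exceed a requirement of 1.0.
samples = [1.42, 1.55, 1.38, 1.61, 1.47, 1.50, 1.44, 1.58]
ratio = qmu_confidence_ratio(samples, requirement=1.0)
# A ratio well above 1 indicates the margin exceeds its own uncertainty.
```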
106.
Quantification of Margins and Uncertainties
107.
Jon G. Hall, Expert Systems, 2009, 26(4): 305-306
108.
We present a computing microsystem that uniquely leverages the bandwidth, density, and latency advantages of silicon photonic interconnect to enable highly compact supercomputer-scale systems. We describe and justify single-node and multinode systems interconnected with wavelength-routed optical links, quantify their benefits vis-à-vis electrically connected systems, analyze the constituent optical component and system requirements, and provide an overview of the critical technologies needed to fulfill this system vision. This vision calls for more than a hundredfold reduction in energy to communicate an optical bit of information. We explore the power dissipation of a photonic link, suggest a roadmap to lower the energy-per-bit of silicon photonic interconnects, and identify the challenges that will be faced by device and circuit designers toward this goal.
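The hundredfold energy-per-bit target can be put in concrete units. The numbers below are hypothetical and not taken from the paper; the sketch only shows the unit arithmetic, exploiting the fact that 1 mW at 1 Gb/s is exactly 1 pJ/bit.

```python
def energy_per_bit_pj(link_power_mw, data_rate_gbps):
    """Energy per bit in picojoules.

    E [pJ/bit] = P [mW] / R [Gb/s], since
    1e-3 W / 1e9 bit/s = 1e-12 J/bit = 1 pJ/bit.
    """
    return link_power_mw / data_rate_gbps

# Hypothetical electrical link: 50 mW at 10 Gb/s.
electrical_pj = energy_per_bit_pj(50.0, 10.0)
# A hundredfold reduction would put optical links near 50 fJ/bit.
optical_target_pj = electrical_pj / 100.0
```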
109.
Hydrogen can be produced via steam reformation of many feedstocks. External heat sources provide the thermal energy required by the endothermic steam reformation reactions. Temperature control of the steam reformation reactor is critical to reactor performance and catalyst life. Closed-loop control systems are typically used to modulate the heat input rate based on a comparison between a set point temperature and a temperature measurement. The location of the temperature sensor relative to the heat input location is a choice made during reactor design that can have significant impact on reactor temperature control.
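The sensor-placement point can be made concrete with a toy closed loop. The sketch below is an illustrative first-order lumped thermal model under proportional control, not a real reformer model (those are distributed and nonlinear); the measurement delay stands in for a sensor mounted far from the heat-input location.

```python
from collections import deque

def simulate_reactor(kp, sensor_delay_steps, setpoint=700.0,
                     t_init=500.0, tau=50.0, dt=1.0, steps=5000):
    """Reactor temperature under proportional control with a delayed sensor.

    Toy model:
        T[k+1] = T[k] + (dt / tau) * (q[k] - T[k])
        q[k]   = max(0, kp * (setpoint - T_measured))   # heat input
    where T_measured lags the true temperature by the sensor delay.
    Returns the temperature after `steps` time steps.
    """
    # Ring buffer of past temperatures; history[0] is the oldest reading.
    history = deque([t_init] * (sensor_delay_steps + 1),
                    maxlen=sensor_delay_steps + 1)
    t = t_init
    for _ in range(steps):
        measured = history[0]                  # what the distant sensor reports
        q = max(0.0, kp * (setpoint - measured))
        t = t + (dt / tau) * (q - t)
        history.append(t)
    return t

# Sensor at the heat-input location: settles with the usual proportional offset.
t_near = simulate_reactor(kp=10.0, sensor_delay_steps=0)
# Sensor far downstream (large delay): the same gain can oscillate or hunt.
t_far = simulate_reactor(kp=10.0, sensor_delay_steps=100)
```

With zero delay the loop settles to kp·setpoint/(1+kp), the classic proportional steady-state offset; increasing the delay adds phase lag and degrades stability, which is the design trade-off the abstract highlights.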
110.
The analysis of many physical and engineering problems involves running complex computational models (simulation models, computer codes). With problems of this type, it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model. In this presentation, an improvement on existing methods for SA of complex computer models is described for use when the model is too computationally expensive for a standard Monte Carlo analysis. In these situations, a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the uncertainty in an input. Most existing approaches to this problem either do not work well with a large number of input variables or ignore the error involved in estimating a sensitivity index. Here, a new approach to sensitivity index estimation using meta-models and bootstrap confidence intervals is described that provides solutions to these drawbacks. Further, an efficient yet effective approach to incorporate this methodology into an actual SA is presented. Several simulated and real examples illustrate the utility of this approach. This framework can be extended to uncertainty analysis as well.
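The surrogate-plus-bootstrap idea can be sketched in a few lines. The example below is a deliberately crude stand-in for the paper's method: the surrogate is linear with marginally fitted slopes (valid only for near-independent inputs), and the toy model y = 3·x1 + x2 + noise is invented. It estimates first-order sensitivity indices and a percentile bootstrap interval for the dominant one, i.e., the estimation error that the abstract notes most approaches ignore.

```python
import random
import statistics

def _cov(a, b):
    """Sample covariance (n - 1 denominator)."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / (len(a) - 1)

def sensitivity_indices(x1, x2, y):
    """First-order indices from a linear surrogate.

    For slope b_i = Cov(x_i, y) / Var(x_i), the index
    S_i = b_i^2 Var(x_i) / Var(y) = Cov(x_i, y)^2 / (Var(x_i) Var(y)).
    A toy surrogate, not the paper's meta-model.
    """
    var_y = statistics.variance(y)
    return [_cov(x, y) ** 2 / (statistics.variance(x) * var_y)
            for x in (x1, x2)]

def bootstrap_ci_s1(x1, x2, y, n_boot=500, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for the first index S_1."""
    rng = random.Random(seed)
    n = len(y)
    draws = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]        # resample with replacement
        s1_b, _ = sensitivity_indices([x1[i] for i in idx],
                                      [x2[i] for i in idx],
                                      [y[i] for i in idx])
        draws.append(s1_b)
    draws.sort()
    return draws[int(n_boot * alpha / 2)], draws[int(n_boot * (1 - alpha / 2)) - 1]

# Toy "expensive model": x1 contributes ~9x the output variance of x2.
rng = random.Random(0)
x1 = [rng.gauss(0.0, 1.0) for _ in range(200)]
x2 = [rng.gauss(0.0, 1.0) for _ in range(200)]
y = [3.0 * a + b + rng.gauss(0.0, 0.1) for a, b in zip(x1, x2)]
s1, s2 = sensitivity_indices(x1, x2, y)
lo, hi = bootstrap_ci_s1(x1, x2, y)
```

Reporting (lo, hi) alongside the point estimate s1 is the step the abstract argues is usually missing: without it, there is no way to tell whether the ranking of inputs is an artifact of surrogate-fitting error.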