101.
The US program for the management and disposal of commercial spent nuclear fuel and high-level waste is in a period of potential programmatic, regulatory, and legislative change. Proposals currently before the US Congress would authorize the development, as soon as possible, of a storage facility adjacent to the potential repository site at Yucca Mountain. The legislation also would establish regulatory requirements for a permanent repository, setting an individual dose limit of 1 mSv year⁻¹ (100 mrem year⁻¹) for the average person living near the repository. Concurrently, the fiscal year 1996 appropriation for characterizing the Yucca Mountain site has been cut by approximately 40%. These initiatives portend possible changes in the focus of the US program, including a fundamental shift in priority from permanent disposal to temporary storage, and a change in the approach to licensing a potential repository at the Yucca Mountain site. This paper provides the perspective of the members of the Nuclear Waste Technical Review Board on the impact these developments could have on the future of the US program. It discusses the Board's opinion on how to address the issues these and other developments raise in a way that moves the US civilian radioactive waste management program forward.
103.
Using a large database, this study examined 3 refinements of validity generalization procedures: (1) a more accurate procedure for correcting the residual standard deviation (SD) for range restriction to estimate SDρ, (2) use of the mean observed correlation (r̄) instead of individual study-observed rs in the formula for sampling error variance, and (3) removal of non-Pearson rs. The 1st procedure does not affect the amount of variance accounted for by artifacts. Adding the 2nd and 3rd procedures increased the mean percentage of validity variance accounted for by artifacts from 70% to 82%, a 17% increase. The cumulative addition of all 3 procedures decreased the mean SDρ estimate from .150 to .106, a 29% decrease. Six additional variance-producing artifacts were identified that could not be corrected for. In light of these, it was concluded that the obtained estimates of mean SDρ and mean validity variance accounted for were consistent with the hypothesis that the true mean SDρ value is close to zero. These findings provide further evidence against the situational specificity hypothesis. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
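The second refinement above can be sketched numerically. A minimal illustration, assuming the standard Hunter–Schmidt sampling-error variance formula (1 − r̄²)²/(N − 1); the study validities and sample sizes below are hypothetical, not data from the paper:

```python
# Hedged sketch of the "use mean r in the sampling-error formula" refinement.
# All numbers here are made-up illustration data, not the study's database.

def sampling_error_variance(r_bar, n):
    """Sampling error variance of one correlation, computed from the mean r."""
    return (1 - r_bar**2)**2 / (n - 1)

rs = [0.25, 0.31, 0.18, 0.40, 0.22]   # hypothetical observed study validities
ns = [68, 75, 60, 90, 72]             # hypothetical study sample sizes

N = sum(ns)
# Sample-size-weighted mean validity (r-bar)
r_bar = sum(r * n for r, n in zip(rs, ns)) / N
# Observed (weighted) variance of the study validities
obs_var = sum(n * (r - r_bar) ** 2 for r, n in zip(rs, ns)) / N
# Expected variance due to sampling error alone, using r_bar for every study
err_var = sum(n * sampling_error_variance(r_bar, n) for n in ns) / N
# Percentage of observed variance attributable to the sampling-error artifact
pct_artifact = min(100.0, 100.0 * err_var / obs_var)

print(f"mean r = {r_bar:.3f}, variance accounted for by sampling error = {pct_artifact:.0f}%")
```

The residual variance (obs_var − err_var), further corrected for range restriction, is what feeds the SDρ estimate discussed in the abstract.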
104.
We study the problem of approximating one-dimensional nonintegrable codistributions by integrable ones and apply the resulting approximations to approximate feedback linearization of single-input systems. The approach derived in this paper allows a linearizable nonlinear system to be found that is close to the given system in a least-squares (L²) sense. A linearly controllable single-input affine nonlinear system is feedback linearizable if and only if its characteristic distribution is involutive (hence integrable) or, equivalently, any characteristic one-form (a one-form that annihilates the characteristic distribution) is integrable. We study the problem of finding (least-squares approximate) integrating factors that make a fixed characteristic one-form close to being exact in an L² sense. A given one-form can be decomposed into exact and inexact parts using the Hodge decomposition. We derive an upper bound on the size of the inexact part of a scaled characteristic one-form and show that a least-squares integrating factor provides the minimum value for this upper bound. We also consider higher-order approximate integrating factors that scale a nonintegrable one-form so that the scaled form is closer to being integrable in L², together with some derivatives, and derive similar bounds for the inexact part. This allows a linearizable nonlinear system to be found that is close to the given system in a least-squares (L²) sense together with some derivatives. Sobolev embedding techniques allow us to obtain an upper bound on the uniform (L∞) distance between the nonlinear system and its linearizable approximation. This research was supported in part by NSF under Grant PYI ECS-9396296, by AFOSR under Grant AFOSR F49620-94-1-0183, and by a grant from the Hughes Aircraft Company.
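The decomposition step can be written out. A sketch in assumed notation (ω the characteristic one-form, λ a candidate integrating factor, ε the inexact remainder), not taken verbatim from the paper:

```latex
% Hodge decomposition of the scaled characteristic one-form (notation assumed):
\lambda\omega \;=\; \underbrace{d\alpha}_{\text{exact part}}
              \;+\; \underbrace{\delta\beta + \gamma}_{\text{inexact part } \varepsilon(\lambda)}

% A least-squares integrating factor minimizes the relative L^2 size of the
% inexact part, which bounds the distance of \lambda\omega from exactness:
\lambda^{*} \;=\; \arg\min_{\lambda \neq 0}\;
              \frac{\|\varepsilon(\lambda)\|_{L^{2}}}{\|\lambda\omega\|_{L^{2}}}
```

When ε(λ*) vanishes, λ*ω is exact and the system is feedback linearizable; otherwise its size bounds how far the system is from a linearizable one.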
107.
The 2-process theory of semantic priming (J. H. Neely, 1977; M. I. Posner and C. R. Snyder, 1975) was used to determine whether automatic processes are maintained after severe closed head injury (CHI) and whether processes that demand attention suffer a deficit. Ss with severe CHI (N = 18, > 2 yrs postinjury) and 18 matched control Ss completed a lexical decision task in which a category prime was followed by a target. Automatic and attentional priming were assessed by orthogonally varying prime–target relatedness, expectancy, and stimulus onset asynchrony. Although the CHI Ss had slower reaction times (RTs) overall, there were no significant group differences in the magnitude of either the automatic or the attentional component of semantic priming. The present results indicate the integrity of semantic processes and normal semantic priming in long-term patients with severe CHI. The results are discussed in relation to an attentional resource hypothesis.
109.
Tested the hypothesis that Ss with severe mental illnesses would achieve better vocational outcomes with an accelerated approach to supported employment (AASE) than with gradual approaches (GA) involving prevocational training. 86 Ss (mean age 35.1 yrs) with a diagnosis of serious mental illness were randomly assigned to either the AASE or the GA, which included a minimum of 4 mo of prevocational training. Data were obtained on indicators of vocational outcomes over 2 yrs and, for a limited number of Ss (n = 36), during the 4th yr after entry into the program. Initially, only 5% of Ss preferred prevocational training. After 1 yr, AASE Ss showed better outcomes on a range of indicators, including achievement of competitive employment, duration of employment, and mean earnings. During the 4th yr, 59% of these Ss were competitively employed, compared with only 6% of GA Ss. Rehabilitation is more effective using AASE than GA.
110.
Single-assignment and functional languages have value semantics that do not permit side-effects. This lack of side-effects makes automatic detection of parallelism and optimization for data locality in programs much easier. However, the same property poses a challenge in implementing these languages efficiently. This paper describes an optimizing compiler system that solves the key problem of aggregate copy elimination. The methods developed rely exclusively on compile-time algorithms, including interprocedural analysis, that are applied to an intermediate data flow representation. By dividing the problem into update-in-place and build-in-place analysis, a small set of relatively simple techniques (edge substitution, graph pattern matching, substructure sharing, and substructure targeting) was found to be very powerful. If combined properly and implemented carefully, the algorithms eliminate unnecessary copy operations to a very high degree. No run-time overhead is imposed on the compiled programs.
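The update-in-place side of the analysis can be illustrated at run time, even though the paper's methods are purely compile-time. A hedged sketch (CPython-specific, relying on reference counts; the function name is hypothetical): copy an aggregate only when it is shared, and mutate it in place when nothing else refers to it, which is exactly the copy the compile-time analysis proves eliminable.

```python
# Hedged illustration of the update-in-place idea, NOT the paper's algorithm.
# A functional "update" must preserve value semantics; copying is only
# necessary when the aggregate is shared with another live reference.
import sys

def functional_update(arr, index, value):
    """Return a list equal to arr with arr[index] replaced by value.

    Copies only when arr is shared; otherwise updates in place."""
    # sys.getrefcount counts its own temporary argument reference, so inside
    # this frame a sole-owner list reports 2 (the parameter + the temporary).
    if sys.getrefcount(arr) > 2:
        arr = arr[:]      # shared elsewhere: copy to preserve value semantics
    arr[index] = value    # sole owner: safe to mutate in place
    return arr

a = [1, 2, 3]
b = functional_update(a, 0, 99)          # 'a' is still live, so a copy is made
assert a == [1, 2, 3] and b == [99, 2, 3]
c = functional_update([7, 8], 1, 0)      # temporary argument: updated in place
assert c == [7, 0]
```

A copy-eliminating compiler makes this decision statically per update site, so no such run-time ownership check (and hence no run-time overhead) remains in the generated code.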