121.
Somatic hypermutation of the immunoglobulin variable genes during germinal center reactions might permit the expansion of B-cell clones with unwanted (e.g. autoreactive) specificities. Here, Ernst Lindhout and colleagues propose three antigen-specific checkpoints that maintain the appropriate antigen specificity of activated B cells by regulating their activation, selection and further differentiation.
122.
123.
The purpose of this paper is to evaluate two methods of assessing the productivity and quality impact of Computer Aided Software Engineering (CASE) and Fourth Generation Language (4GL) technologies: (1) the retrospective method and (2) the cross-sectional method. Both methods involve the use of questionnaire surveys. Developers' perceptions depend on the context in which they are expressed, including expectations about the effectiveness of a given software product. Consequently, it is generally not reliable to base inferences about the relative merits of CASE and 4GLs on a cross-sectional comparison of two separate samples of users. The retrospective method, which requires each respondent to compare the different products directly, is shown to be more reliable. However, there may be scope to employ cross-sectional comparisons of findings from different samples where both sets of respondents use the same reference point for their judgements, and where numerical rather than verbal rating scales are used to measure perceptions.
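The reliability argument above can be illustrated numerically: in a retrospective (within-subject) design the same respondents rate both tools, so individual differences in expectations cancel out of the paired comparison. The following sketch uses invented 1-to-7 numerical ratings; all data are illustrative, not from the study.

```python
from statistics import mean, pstdev

# Hypothetical 1-7 numerical productivity ratings (invented data).
case_ratings = [5, 6, 4, 6, 5, 7, 5]   # each developer rating the CASE tool
gl4_ratings  = [4, 5, 4, 5, 4, 6, 4]   # the SAME developers rating the 4GL

# Retrospective method: per-respondent paired differences.
paired_diffs = [c - g for c, g in zip(case_ratings, gl4_ratings)]
print("mean advantage of CASE:", round(mean(paired_diffs), 2))

# The paired differences vary much less than the raw ratings do, which is
# the noise a cross-sectional comparison of two separate samples faces.
print("spread of diffs vs raw ratings:",
      round(pstdev(paired_diffs), 2), "vs", round(pstdev(case_ratings), 2))
```

The point of the sketch is that the within-subject spread is a fraction of the between-subject spread, which is why the retrospective design supports more reliable inferences.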
124.
125.
To assess the peritoneal transport of small-molecule solutes, the peritoneal equilibration test (PET) was performed in 52 CAPD patients. Analysing the relationship between peritoneal transport function and dialysis adequacy, we found that average urea Kt/V and creatinine clearance were significantly lower in the high and low transport groups (n = 6 and n = 2) than in the high-average and low-average groups (n = 35 and n = 9). Based on the PET results, we adjusted the dialysis regimens of 11 patients, and dialysis adequacy improved markedly. We conclude that PET is helpful for selecting and adjusting CAPD regimens, and we discuss several issues that deserve particular attention when performing PET.
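For readers unfamiliar with the adequacy index used above, a weekly urea Kt/V for CAPD can be sketched from its definition: daily clearance Kt is the dialysate-to-plasma urea ratio times the 24-hour drain volume, scaled to a week and divided by the urea distribution volume V (here estimated with the commonly used Watson formula). All patient numbers below are illustrative, not from the study.

```python
# Minimal sketch of a weekly urea Kt/V calculation for CAPD.
# Watson coefficients are for an adult male; all inputs are invented.

def watson_volume_male(weight_kg, height_cm, age_yr):
    """Watson estimate of total body water (urea distribution volume V, litres)."""
    return 2.447 - 0.09516 * age_yr + 0.1074 * height_cm + 0.3362 * weight_kg

def weekly_urea_ktv(drain_l, dialysate_urea, plasma_urea, v_litres):
    # Daily Kt = (D/P urea) * 24-h drain volume; scale to a week, divide by V.
    kt_daily = (dialysate_urea / plasma_urea) * drain_l
    return 7.0 * kt_daily / v_litres

v = watson_volume_male(weight_kg=65, height_cm=170, age_yr=55)
ktv = weekly_urea_ktv(drain_l=10.0, dialysate_urea=18.0, plasma_urea=20.0, v_litres=v)
print("weekly urea Kt/V:", round(ktv, 2))
```

Adjusting the regimen, as the authors did for 11 patients, amounts to changing the drain volume or exchange schedule until this index reaches the adequacy target.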
126.
For Part I, see ibid., p. 134, 1998. The basic approach outlined in the previous article is applied to the difficult problem of computing the optical modes of a vertical-cavity surface-emitting laser. The formulation uses a finite-difference equation based on the lowest-order term of an infinite-series solution of the scalar Helmholtz equation in a local region. This difference equation becomes exact in the one-dimensional (1-D) limit, and is thus ideally suited to nearly 1-D devices such as vertical-cavity lasers. The performance of the resulting code is tested on both a simple cylindrical cavity with known solutions and an oxide-confined vertical-cavity laser structure, and the results are compared against a second-order-accurate code based on Crank-Nicolson differencing.
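The 1-D limit the abstract appeals to can be shown in miniature: discretise the scalar Helmholtz equation E'' + k²E = 0 on a closed cavity (E = 0 at both mirrors) with ordinary central differences and read the longitudinal-mode wavenumbers off the eigenvalues. This is a generic finite-difference sketch, not the article's infinite-series scheme; grid sizes are illustrative.

```python
import numpy as np

# 1-D scalar Helmholtz modes of an empty cavity of length L with
# perfectly reflecting ends (Dirichlet boundary conditions).
L, N = 1.0, 400
h = L / (N + 1)

# Tridiagonal matrix for -d^2/dz^2 with second-order central differences.
A = (np.diag(np.full(N, 2.0))
     - np.diag(np.ones(N - 1), 1)
     - np.diag(np.ones(N - 1), -1)) / h**2

k = np.sqrt(np.linalg.eigvalsh(A))   # mode wavenumbers, ascending
print(k[:3] / np.pi)                 # exact cavity modes are k_m = m*pi/L
```

The computed wavenumbers land on m·π/L to within the O(h²) truncation error, which is why a difference equation that is exact in 1-D is attractive for nearly 1-D vertical-cavity structures.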
127.
Local autonomous dynamic channel allocation (LADCA), including power control, is essential to accommodating the anticipated explosion of demand for wireless service. The authors simulate call performance for users accessing channels in a regular cellular array with a base station located at the center of each hexagon. The computer model includes stochastic channel demand and a propagation environment characterized by attenuation with distance as well as shadow fading. The study shows that distributed power control and channel access can be combined in an access management policy that achieves satisfactory system capacity and provides the desired call performance. The authors report that: LADCA with power control is observed to be stable, alleviating a major concern about users unaware of the signal-to-interference problems their presence on a channel might cause to others; there can be substantial inadvertent dropping of calls in progress caused by originating calls; modeling user time dynamics is essential; and LADCA contrasts very favorably with fixed channel allocation (FCA) in a comparative example.
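The distributed flavour of such a scheme can be sketched in a toy simulation: each cell admits an arriving call on any channel not in use in the cell or its neighbours, with no central controller. The clique topology, channel count, and traffic model below are illustrative; the article's power control, shadow fading, and hexagonal geometry are omitted.

```python
import random

# Toy local autonomous channel allocation: admission is decided per cell
# using only locally visible channel occupancy.
random.seed(1)
NUM_CELLS, CHANNELS, ARRIVALS = 7, 10, 500
neighbours = {c: [x for x in range(NUM_CELLS) if x != c]
              for c in range(NUM_CELLS)}          # worst case: all cells interfere
in_use = {c: set() for c in range(NUM_CELLS)}
blocked = 0

for _ in range(ARRIVALS):
    cell = random.randrange(NUM_CELLS)
    busy = set().union(in_use[cell], *(in_use[n] for n in neighbours[cell]))
    free = [ch for ch in range(CHANNELS) if ch not in busy]
    if free:
        in_use[cell].add(free[0])                 # local, autonomous choice
    else:
        blocked += 1                              # new call is blocked
    if random.random() < 0.6:                     # memoryless call departures
        c = random.randrange(NUM_CELLS)
        if in_use[c]:
            in_use[c].pop()

print("blocking probability:", blocked / ARRIVALS)
```

Even this stripped-down version preserves the key invariant: no channel is ever simultaneously in use in two interfering cells, despite every decision being made locally.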
128.
Jan van Eijck 《Formal Aspects of Computing》1994,6(1):766-787
Presuppositions of utterances are the pieces of information conveyed by an utterance regardless of whether the utterance is true. We first study presupposition in a very simple framework of updating propositional information, with examples of how the presuppositions of complex propositional updates can be calculated. Next we move on to presupposition and quantification, in the context of a dynamic version of predicate logic, suitably modified to allow for presupposition failure. In both the propositional and the quantificational case, presupposition failure can be viewed as error abortion of procedures. Thus, a dynamic assertion logic that describes the preconditions for error abortion is a suitable tool for analysing presupposition.
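The "failure as error abortion" view can be sketched operationally: treat an information state as a set of possible worlds, an update as a procedure that eliminates worlds, and a presupposition trigger as a precondition check that aborts when the state does not already entail the presupposed content. The encoding below is ours, not the paper's notation.

```python
class PresuppositionFailure(Exception):
    """Raised when an update's presupposition is not entailed by the state."""

def update(state, prop):
    """Eliminative update: keep the worlds where prop holds."""
    return [w for w in state if prop(w)]

def presupposing(presup, assertion):
    """Build an update that aborts unless every world satisfies the presupposition."""
    def run(state):
        if not all(presup(w) for w in state):
            raise PresuppositionFailure
        return update(state, assertion)
    return run

# Worlds are (there_is_a_king, the_king_is_bald) pairs.
S = [(True, True), (True, False), (False, False)]
# "The king is bald" presupposes that there is a king.
bald_king = presupposing(lambda w: w[0], lambda w: w[1])

try:
    bald_king(S)                       # aborts: S contains a kingless world
except PresuppositionFailure:
    print("update aborted: presupposition failure")

S2 = update(S, lambda w: w[0])         # first assert "there is a king"
print(bald_king(S2))                   # now succeeds: [(True, True)]
```

The precondition check in `run` is exactly what a dynamic assertion logic would describe: the weakest condition on the input state under which the procedure does not abort.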
129.
130.
Kemper A. Kilger C. Moerkotte G. 《Knowledge and Data Engineering, IEEE Transactions on》1994,6(4):587-608
View materialization is a well-known optimization technique in relational database systems. We present a similar, yet more powerful, optimization concept for object-oriented data models: function materialization. Exploiting the object-oriented paradigm, namely classification, object identity, and encapsulation, facilitates a rather easy incorporation of function materialization into (existing) object-oriented systems. Only those types (classes) whose instances are involved in some materialization are modified and recompiled, leaving the remainder of the object system invariant. Furthermore, exploiting encapsulation (information hiding) and object identity provides additional performance-tuning measures that drastically decrease the invalidation and rematerialization overhead incurred by updates in the object base. First, it allows us to cleanly separate object instances that are irrelevant to the materialized functions from those involved in the materialization of some function result, and thus to penalize only the involved objects upon an update. Second, the principle of information hiding facilitates fine-grained control over the invalidation of precomputed results. Based on specifications given by the data type implementor, the system can exploit operational semantics to better distinguish update operations that invalidate a materialized result from those that require no rematerialization. The paper concludes with a quantitative analysis of function materialization based on two sample performance benchmarks obtained from our experimental object base system GOM.
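The core idea, precomputed function results whose invalidation is driven by the type implementor's declaration of which update operations are relevant, can be sketched compactly. The class and method names below are ours for illustration, not GOM's API.

```python
# Sketch of function materialization with operation-level invalidation:
# only updates declared relevant by the type implementor clear the cache.
class Part:
    RELEVANT = {"set_weight"}          # ops that invalidate total_cost

    def __init__(self, weight, label):
        self._weight, self._label = weight, label
        self._cache = {}               # materialized function results

    def total_cost(self):              # the "materialized function"
        if "total_cost" not in self._cache:
            self._cache["total_cost"] = self._weight * 2.5  # stand-in for an expensive computation
        return self._cache["total_cost"]

    def _after_update(self, op):
        if op in self.RELEVANT:        # operational semantics: selective
            self._cache.clear()        # invalidation, not blanket flushing

    def set_weight(self, w):
        self._weight = w
        self._after_update("set_weight")

    def set_label(self, s):            # irrelevant to total_cost:
        self._label = s                # the cached result survives
        self._after_update("set_label")

p = Part(10, "bolt")
print(p.total_cost())    # 25.0, computed and cached
p.set_label("nut")       # relabelling does not invalidate
print(p.total_cost())    # 25.0, served from the cache
p.set_weight(20)         # relevant update clears the cache
print(p.total_cost())    # 50.0, recomputed
```

Encapsulation is what makes this safe: because `_weight` can only change through the type's own operations, the `RELEVANT` set is an exhaustive list of the ways the cached result can be invalidated.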