81.
The present study presents a methodology for detailed reliability analysis of nuclear containment without metallic liners against aircraft crash. For this purpose, a nonlinear limit state function has been derived using violation of the tolerable crack width as the failure criterion; this criterion is adopted because radioactive material may escape if the crack width exceeds the tolerable limit. The derived limit state uses the response of the containment obtained from a detailed dynamic analysis under the impact of a large Boeing jet aircraft. Using this response in conjunction with the limit state function, reliabilities and probabilities of failure are obtained at a number of vulnerable locations employing an efficient first-order reliability method (FORM). These values of reliability and probability of failure are then used to estimate the conditional and annual reliabilities of the containment as a function of its distance from the airport. A sensitivity analysis has been performed to study the influence of the various random variables on containment reliability, and some parametric studies of field and academic interest are also included.
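The first-order reliability computation the abstract invokes can be illustrated in miniature (a hedged sketch: the linear limit state, the normal distributions, and all numbers below are hypothetical, not the paper's nonlinear limit state or data). For a linear limit state g = R − S with independent normal capacity R (tolerable crack width) and demand S (impact-induced crack width), the reliability index is β = (μR − μS)/√(σR² + σS²) and the failure probability is Pf = Φ(−β):

```python
from math import sqrt, erf

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Hypothetical linear limit state g = R - S: capacity R is the tolerable
# crack width, demand S the impact-induced crack width, both modeled as
# independent normals (illustrative numbers only, not from the paper).
mu_R, sigma_R = 2.0, 0.3   # mm
mu_S, sigma_S = 1.2, 0.4   # mm

beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)  # reliability index
p_f = phi(-beta)            # probability the crack exceeds the limit
reliability = 1.0 - p_f
```

For a nonlinear limit state like the paper's, FORM would instead iterate to the most probable failure point before linearizing; the closed form above holds only in the linear-normal case.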
82.
83.
This paper proposes that self-deception results from the emotional coherence of beliefs with subjective goals. We apply the HOTCO computational model of emotional coherence to simulate a rich case of self-deception from Hawthorne's The Scarlet Letter. We argue that this model is more psychologically realistic than other available accounts of self-deception, and discuss related issues such as wishful thinking, intention, and the division of the self.
84.
Symmetric multiprocessor systems are increasingly common, not only as high-throughput servers but as a vehicle for executing a single application in parallel in order to reduce its execution latency. This article presents Pedigree, a compilation tool that employs a new partitioning heuristic based on the program dependence graph (PDG). Pedigree creates overlapping, potentially interdependent threads, each executing on a subset of the SMP processors that matches the thread's available parallelism. A unified framework is used to build threads from procedures, loop nests, loop iterations, and smaller constructs. Pedigree does not require any parallel language support; it is a post-compilation tool that reads in object code. The SDIO Signal and Data Processing Benchmark Suite has been selected as an example of real-time, latency-sensitive code. Its coarse-grained data-flow parallelism is naturally exploited by Pedigree to achieve speedups of 1.63×/2.13× (mean/max) and 1.71×/2.41× on two and four processors, respectively, roughly a 20% improvement over existing techniques that exploit only data parallelism. By exploiting the unidirectional flow of data for coarse-grained pipelining, the synchronization overhead is typically limited to less than 6% for a synchronization latency of 100 cycles, and less than 2% for 10 cycles. This research was supported by ONR contract numbers N00014-91-J-1518 and N00014-96-1-0347. We would like to thank the Pittsburgh Supercomputing Center for use of their Alpha systems.
85.
Quality is one of the main concerns in today's systems and software development and use. One important instrument in verification is the use of formal methods, which means that requirements and designs are analyzed formally to determine their relationships. Furthermore, since professional software design is increasingly a distributed process, integrating different systems into a single entity is of great importance in modern system development and design. Various candidates for formalizing system development and integration have been proposed, but very often, particularly for dynamic conflict detection, these introduce non-standard objects and formalisms, leading to severe confusion regarding both semantics and computability. In contrast, we introduce a framework for defining requirement fulfillment by designs, detecting conflicts of various kinds, and integrating heterogeneous schemata. The framework transcends ordinary logical consequence, as it takes into account static and dynamic aspects of design consistency and, in particular, the specific features of the state space of a specification. Another feature of the approach is that it provides a unifying framework for design conflict analysis and schema integration.
86.
Election security: Perception and reality
Voters' trust in elections comes from a combination of the mechanisms and procedures we use to record and tally votes, and from confidence in election officials' competence and honesty. Electronic voting systems pose considerable risks to both the perception and reality of trustworthy elections.
87.
In influential research, R. N. Shepard, C. I. Hovland, and H. M. Jenkins (1961) surveyed humans' categorization abilities using tasks based in rules, exclusive-or (XOR) relations, and exemplar memorization. Humans' performance was poorly predicted by cue-conditioning or stimulus-generalization theories, causing Shepard et al. to describe it in terms of hypothesis selection and rule application that were possibly supported by verbal mediation. The authors of the current article surveyed monkeys' categorization abilities similarly. Monkeys, like humans, found category tasks with a single relevant dimension the easiest and perceptually chaotic tasks requiring exemplar memorization the most difficult. Monkeys, unlike humans, found tasks based in XOR relations very difficult. The authors discuss the character and basis of the species difference in categorization and consider whether monkeys are the generalization-based cognitive system that humans are not. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
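The XOR category structure that separated the species can be stated concretely (a schematic sketch of the task type over two hypothetical binary stimulus dimensions, not the authors' actual stimuli): membership is the exclusive-or of the two dimension values, so neither dimension alone predicts the category.

```python
# XOR category structure over two binary stimulus dimensions,
# e.g. shape and color each coded 0/1 (hypothetical coding).
stimuli = [(a, b) for a in (0, 1) for b in (0, 1)]

def xor_category(stim):
    """Category 1 iff exactly one dimension has value 1."""
    a, b = stim
    return a ^ b

# No single dimension is diagnostic: for either dimension, each value
# occurs equally often in both categories, so one-dimensional rules and
# simple cue conditioning cannot solve the task.
for dim in (0, 1):
    for val in (0, 1):
        cats = [xor_category(s) for s in stimuli if s[dim] == val]
        assert sorted(cats) == [0, 1]
```

This is why XOR tasks defeat learners that generalize along single stimulus dimensions, while rule-forming learners can solve them.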
88.
Three experiments with 204 undergraduates examined the hypothesis that an audience can inhibit overt practice and thereby impair learning of unfamiliar words and enhance learning of familiar words. This hypothesis was derived from an analysis of motoric and symbolic mediation during learning. In comparison with learning while alone, the results show that the audience inhibited overt practice of unfamiliar and familiar words and that reduced practice was detrimental to learning unfamiliar words. Inhibition of overt practice with an audience enhanced learning of familiar words in only 1 of the experiments. Instructions to practice overtly reduced the audience-inhibition effect in learning unfamiliar words. The studies are discussed in the context of drive-theory explanations for social facilitation effects in learning. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
89.
I examine whether it is possible for content relevant to a computer's behavior to be carried without an explicit internal representation. I consider three approaches. First, an example of a chess-playing computer carrying emergent content is offered from Dennett. Next I examine Cummins's response to this example. Cummins says Dennett's computer executes a rule that is inexplicitly represented. Cummins describes a process wherein a computer interprets explicit rules in its program, implements them to form a chess-playing device, and this device then executes the rules in a way that exhibits them inexplicitly. Though this approach is intriguing, I argue that the chess-playing device cannot exist as imagined: the processes of interpretation and implementation produce explicit representations of the content claimed to be inexplicit. Finally, the Chinese Room argument is examined and shown not to save the notion of inexplicit information. This means that the strategy of attributing inexplicit content to a computer executing a rule fails. I wish to thank Fred Dretske, John Perry, and an anonymous reviewer for helpful comments and suggestions. Earlier versions of this paper were read at the American Philosophical Association Pacific Division Meeting in San Francisco in March 1993, and at the 7th International Conference on Computing and Philosophy in Orlando in August 1992.
90.
Copyright©北京勤云科技发展有限公司  京ICP备09084417号