Full text (subscription): 219 articles
Free: 7 articles
By discipline:
  Electrical engineering: 2
  Chemical industry: 21
  Machinery and instruments: 5
  Building science: 22
  Energy and power: 3
  Light industry: 31
  Water conservancy engineering: 1
  Petroleum and natural gas: 1
  Radio and electronics: 15
  General industrial technology: 31
  Metallurgical industry: 38
  Automation technology: 56
By year:
  2023: 2
  2022: 2
  2021: 2
  2020: 1
  2019: 5
  2018: 10
  2017: 7
  2015: 3
  2014: 8
  2013: 16
  2012: 8
  2011: 19
  2010: 5
  2009: 9
  2008: 16
  2007: 13
  2006: 5
  2005: 6
  2004: 8
  2003: 3
  2002: 4
  2001: 5
  2000: 4
  1999: 5
  1998: 5
  1997: 4
  1996: 10
  1995: 3
  1994: 4
  1993: 3
  1992: 5
  1991: 1
  1990: 2
  1989: 3
  1988: 2
  1987: 1
  1984: 2
  1982: 1
  1981: 1
  1980: 1
  1978: 4
  1976: 3
  1975: 2
  1974: 1
  1967: 2
A total of 226 search results.
61.
Two experiments were conducted with a hybrid procedure involving a battery of indirect criterion tests designed to study the activation and metacognition of inaccessible stored information. In each experiment, subjects first attempted to recall rare target words in response to a series of definitions meant to cue retrieval from long-term semantic memory. For the words that could not be recalled initially, the subjects rated their feelings of knowing. They then performed a lexical-decision task in which the target words and other control words were presented. Reaction times were measured as a function of the feeling-of-knowing ratings and the length of the interval between the initial exposure to the definitions and the subsequent lexical decisions. Decisions were faster for the target words than for the controls, especially when strong feelings of knowing had been expressed about the targets. Similar facilitation was obtained in a subsequent old-new recognition task. It appears that unsuccessful attempts to retrieve inaccessible stored information prime the later recognition of that information through a process of spreading activation. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
62.
This paper presents a summary of findings obtained during a series of research projects aimed at selecting, defining and evaluating the significant aggregate and filler properties and relating them to the behaviour of bituminous concrete. The paper discusses a new concept and test method proposed for an objective, quantitative evaluation of the geometric irregularity of aggregate particles (shape, angularity and surface texture combined) over a wide range of particle sizes. The aggregate parameters were found to be significantly correlated with the physical properties and mechanical behaviour of the bituminous mixtures composed of the aggregates tested. The paper also summarizes research work and findings related to a special evaluation of important physico-chemical properties of the filler and their significant effect on the immediate properties and the long-term durability of bituminous paving mixtures.
63.
The size distribution of vesicles exocytosed from secretory cells displays a quantal nature: vesicle volume is periodically multi-modal, suggesting that these heterogeneous vesicles are aggregate sums of a variable number of homogeneous basic granules. Whether this heterogeneity is a lumping-together artifact of the measurement or an inherent intra-cell feature of the vesicles is an unresolved question. Recent empirical evidence is provided for the quantal nature of intra-cell vesicle volume, supporting the controversial paradigm of homotypic fusion: basic cytoplasmic granules fuse with each other to create heterogeneously sized vesicles. An EM-algorithm-based method is presented for the conversion of multi-modal to quantal data, which provides as a by-product estimates of the means and variances of basic granule packaging. Microsc. Res. Tech. 77:1-10, 2014. © 2013 Wiley Periodicals, Inc.
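As an illustration of the kind of quantal analysis described above, the following is a minimal sketch (not the authors' published method) of an EM fit to a mixture whose k-th component is the sum of k independent basic granules, so its mean is k·μ and its variance is k·σ². The function name, the maximum number of components and the initialization are assumptions made for this example.

```python
import numpy as np

def fit_quantal_mixture(x, k_max=5, n_iter=500, tol=1e-9):
    """EM sketch: fit a mixture whose k-th component has mean k*mu and
    variance k*sigma2 (an aggregate of k independent basic granules).
    Returns the estimated granule mean mu, granule variance sigma2 and
    the mixing weights over k = 1..k_max."""
    x = np.asarray(x, dtype=float)
    ks = np.arange(1, k_max + 1)
    mu = np.mean(x) / np.mean(ks)        # crude initial quantum size
    sigma2 = np.var(x) / np.mean(ks)     # crude initial granule variance
    w = np.full(k_max, 1.0 / k_max)      # mixing weights
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibility of component k for each observation
        means, vars_ = ks * mu, ks * sigma2
        dens = w * np.exp(-(x[:, None] - means) ** 2 / (2 * vars_)) \
                 / np.sqrt(2 * np.pi * vars_)
        tot = dens.sum(axis=1, keepdims=True)
        r = dens / tot
        ll = np.log(tot).sum()
        # M-step under the quantal constraints mean_k = k*mu, var_k = k*sigma2
        w = r.mean(axis=0)
        mu = (r * x[:, None]).sum() / (r * ks).sum()
        sigma2 = (r * (x[:, None] - ks * mu) ** 2 / ks).sum() / r.sum()
        if abs(ll - ll_old) < tol:
            break
        ll_old = ll
    return mu, sigma2, w

# Synthetic check: "vesicle volumes" that are sums of 1..3 granules
# with granule mean 1.0 and granule variance 0.04.
rng = np.random.default_rng(1)
k_true = rng.integers(1, 4, size=2000)
x = rng.normal(k_true * 1.0, np.sqrt(k_true * 0.04))
print(fit_quantal_mixture(x))
```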
64.
This paper focuses on the possibilities of the material imagination as a theoretical and practical lens for contemporary housing research. The emphasis is on housing/home as complex material cultural assemblages interwoven across the four key ancient elements: earth, air, fire and water. The principle behind the material imagination is that "matter" – which we are immersed in and indeed ourselves composed of – is important, indeed underpins everything, and yet is typically rendered invisible within housing theory and research. As a critical response to social scientific engagement – "a needed radical corrective" – the potential of the material imagination for housing theory and practice is considered in ways that purposively attend to the elemental dimensions of housing as dynamic, fluid environments composed of living matter. Suggestions for taking this approach forward through empirical housing studies are outlined.
65.
An actual sampling process can be modeled as a random process consisting of the regular (uniform) deterministic sampling process plus an error in the sampling times that constitutes a zero-mean noise (the jitter). In this paper we discuss the problem of estimating the jitter process. By assuming that the jitter process is i.i.d., with a standard deviation that is small compared to the regular sampling time, we show that the variance of the jitter process can be estimated from the nth-order spectrum of the sampled data, n = 2, 3; i.e., the jitter variance can be extracted from the 2nd-order spectrum or the 3rd-order spectrum (the bispectrum) of the sampled data, provided the continuous signal spectrum is known. However, when the signal skewness exceeds a certain level, the potential performance of the bispectrum-based estimation is better than that of the spectrum-based estimation. Moreover, the former can also provide jitter variance estimates when the continuous signal spectrum is unknown, while the latter cannot. This suggests that the bispectrum of the sampled data is potentially better for estimating any parameter of the sampling jitter process, once the signal skewness is sufficiently large.
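To make the setting concrete, here is a small simulation of the sampling model described above: uniform sampling times perturbed by i.i.d. zero-mean jitter whose standard deviation is small compared with the sampling interval. It only illustrates how jitter shows up in the second-order spectrum (periodogram) of the sampled data; it does not implement the paper's spectrum- or bispectrum-based estimators, and the test signal, sample size and jitter level are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1e-3                         # nominal (uniform) sampling interval
N = 8192
sigma_j = 0.02 * T               # jitter std, small relative to T
n = np.arange(N)
t_ideal = n * T                                      # regular sampling times
t_jitter = t_ideal + rng.normal(0.0, sigma_j, N)     # i.i.d. zero-mean jitter

def signal(t):
    # an arbitrary band-limited test signal
    return np.sin(2 * np.pi * 37.0 * t) + 0.5 * np.sin(2 * np.pi * 74.0 * t)

def periodogram(x):
    # windowed periodogram as a stand-in for the 2nd-order spectrum
    w = np.hanning(len(x))
    X = np.fft.rfft(x * w)
    return np.abs(X) ** 2 / np.sum(w ** 2)

P_ideal = periodogram(signal(t_ideal))
P_jit = periodogram(signal(t_jitter))

# Jitter raises the broadband floor of the spectrum; when the continuous
# signal spectrum is known, that excess carries the jitter-variance information.
print("median spectral floor, ideal sampling :", np.median(P_ideal))
print("median spectral floor, jittered sampling:", np.median(P_jit))
```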
66.
We present a new definition of optimality intervals for the parametric right-hand side linear programming (parametric RHS LP) problem φ(λ) = min{cᵀx : Ax = b + λb̄, x ≥ 0}. We then show that an optimality interval consists either of a breakpoint or of the open interval between two consecutive breakpoints of the continuous piecewise linear convex function φ(λ). As a consequence, the optimality intervals form a partition of the closed interval {λ : |φ(λ)| < ∞}. Based on these optimality intervals, we also introduce an algorithm for solving the parametric RHS LP problem which requires an LP solver as a subroutine. If a polynomial-time LP solver is used to implement this subroutine, we obtain a substantial improvement on the complexity of those parametric RHS LP instances which exhibit degeneracy. When the number of breakpoints of φ(λ) is polynomial in terms of the size of the parametric problem, we show that the latter can be solved in polynomial time. This research was partially funded by the United States Navy-Office of Naval Research under Contract N00014-87-K-0202. Its financial support is gratefully acknowledged.
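For readers who want to see the object being parameterized, the sketch below evaluates φ(λ) on a coarse grid with an off-the-shelf LP solver. It only illustrates the piecewise linear convex value function (here φ(λ) = |1 + λ|, with a single breakpoint at λ = -1); it is not the breakpoint-finding algorithm of the paper, and the data c, A, b, b̄ are made up for the example.

```python
import numpy as np
from scipy.optimize import linprog

# phi(lambda) = min{ c^T x : A x = b + lambda * b_bar, x >= 0 }
c = np.array([1.0, 1.0])
A = np.array([[1.0, -1.0]])
b = np.array([1.0])
b_bar = np.array([1.0])

for lam in np.linspace(-3.0, 1.0, 9):
    res = linprog(c, A_eq=A, b_eq=b + lam * b_bar,
                  bounds=[(0, None)] * len(c), method="highs")
    phi = res.fun if res.success else float("inf")   # +inf off the feasible range
    print(f"lambda = {lam:+.2f}   phi(lambda) = {phi:.2f}")
```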
67.
Increasingly, countries around the world are adopting policies that emphasize the importance of partnerships for disaster resilience. The overarching questions that this paper investigates are how to form and sustain (1) effective collaborative arrangements involving governments, businesses, non-governmental organizations and communities to ensure development of disaster resilient communities, and (2) governance institutions that can effectively mobilize geographically dispersed disaster response resources with fragmented ownership. We have reviewed case studies of alternative inter-sectoral collaborative arrangements that were formed to (1) promote the development of resilient communities and critical physical and social systems; (2) mitigate or respond to emerging crises; or (3) facilitate post-disaster recovery and learning. We have developed grounded propositions articulating the antecedents of performance of inter-sectoral collaborative arrangements.
68.
XBn or XBp barrier detectors exhibit diffusion-limited dark currents comparable with the mercury cadmium telluride Rule-07 benchmark, together with high quantum efficiencies. In 2011, SemiConductor Devices (SCD) introduced "HOT Pelican D", a 640 × 512/15-μm pitch InAsSb/AlSbAs XBn mid-wave infrared (MWIR) detector with a 4.2-μm cut-off and an operating temperature of ~150 K. Its low power (~3 W), high pixel operability (>99.5%) and long mean time to failure make HOT Pelican D a highly reliable integrated detector-cooler product with a low size, weight and power. More recently, "HOT Hercules" was launched with a 1280 × 1024/15-μm format and similar advantages. A 3-megapixel, 10-μm pitch version ("HOT Blackbird") is currently completing development. For long-wave infrared applications, SCD's 640 × 512/15-μm pitch "Pelican-D LW" XBp type II superlattice (T2SL) detector has a ~9.3-μm cut-off wavelength. The detector contains InAs/GaSb and InAs/AlSb T2SLs, and is fabricated into focal plane array (FPA) detectors using standard production processes including hybridization to a digital silicon read-out integrated circuit (ROIC), glue underfill and substrate thinning. The ROIC has been designed so that the complete detector closely follows the interfaces of SCD's MWIR Pelican-D detector family. The Pelican-D LW FPA has a quantum efficiency of ~50%, and operates at 77 K with a pixel operability of >99% and a noise equivalent temperature difference of 13 mK at 30 Hz and F/2.7.
69.
Combinatorial property testing deals with the following relaxation of decision problems: given a fixed property and an input x, one wants to decide whether x satisfies the property or is "far" from satisfying it. The main focus of property testing is on identifying large families of properties that can be tested with a certain number of queries to the input. In this paper we study the relation between the space complexity of a language and its query complexity. Our main result is that for any space complexity s(n) ≤ log n there is a language with space complexity O(s(n)) and query complexity 2^{Ω(s(n))}. Our result has implications with respect to testing languages accepted by certain restricted machines. Alon et al. [FOCS 1999] have shown that any regular language is testable with a constant number of queries. It is well known that any language in space o(log log n) is regular, thus implying that such languages can be so tested. It was previously known that there are languages in space O(log n) that are not testable with a constant number of queries, and Newman [FOCS 2000] raised the question of closing the exponential gap between these two results. A special case of our main result resolves this problem, as it implies that there is a language in space O(log log n) that is not testable with a constant number of queries (taking s(n) = Θ(log log n) gives query complexity 2^{Ω(log log n)} = (log n)^{Ω(1)}, which is super-constant). It was also previously known that the class of testable properties cannot be extended to all context-free languages. We further show that one cannot even extend the family of testable languages to the class of languages accepted by single counter machines.
70.
We consider a simple restriction of the PRAM model (called PPRAM), where the input is arbitrarily partitioned between a fixed set of p processors and the shared memory is restricted to m cells. This model allows for investigation of the tradeoffs/bottlenecks with respect to the communication bandwidth (modeled by the shared memory size m) and the number of processors p. The model is quite simple and allows the design of optimal algorithms without losing the effect of communication bottlenecks. We have focused on the PPRAM complexity of problems that have Θ(n) sequential solutions (where n is the input size), and where m ≤ p ≤ n. We show essentially tight time bounds (up to logarithmic factors) for several problems in this model, such as summing, Boolean threshold, routing, integer sorting, list reversal and k-selection. We typically get two sorts of complexity behaviors for these problems: one type is Θ(n/p + p/m), which means that the time scales with the number of processors and with the memory size (in appropriate ranges) but not with both. The other is Θ(n/m), which means that the running time does not scale with p and reflects a communication bottleneck (as long as m < p). We are not aware of any problem whose complexity scales with both p and m (e.g., …). This might explain why in actual implementations one often fails to get p-scalability for p close to n.