191.
Diverse proteomic techniques based on protein MS have been introduced to systematically characterize protein perturbations associated with disease. Progress in clinical proteomics is essential for personalized medicine, wherein treatments will be tailored to individual needs based on patient stratification using noninvasive disease monitoring procedures to reveal the most appropriate therapeutic targets. However, breakthroughs await the successful development and application of a robust proteomic pipeline capable of identifying and rigorously assessing the relevance of multiple candidate proteins as informative diagnostic and prognostic indicators or suitable drug targets involved in a pathological process. While steady progress has been made toward more comprehensive proteome profiling, the emphasis must now shift from in-depth screening of reference samples to stringent quantitative validation of selected lead candidates in a broader clinical context. Here, we present an overview of the emerging proteomic strategies for high-throughput protein detection focused primarily on targeted MS/MS as the basis for biomarker verification in large clinical cohorts. We discuss the conceptual promise and practical pitfalls of these methods in terms of achieving higher dynamic range, higher throughput, and more reliable quantification, highlighting research avenues that merit additional inquiry.
192.
We report a new experimental apparatus for infrared microthermography applicable to a wide class of samples, including semitransparent ones and perforated devices. This setup is particularly well suited for the thermography of microfabricated devices. Traditionally, temperature calibration is performed using calibration hot plates, but this is not applicable to transmissive samples. In this work, a custom-designed miniature calibration oven in conjunction with spatial filtering is used to obtain accurate static and transient temperature maps of actively heated devices. The procedure does not require prior knowledge of the emissivity. Calibration and image processing algorithms are discussed and analyzed. We show that relatively inexpensive uncooled bolometer arrays can be a suitable detector choice in certain radiometric applications. As an example, we apply this method in the analysis of temperature profiles of an actively heated microfabricated preconcentrator device that incorporates a perforated membrane and is used in trace detection of illicit substances.
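The core of any such radiometric calibration is a mapping from raw detector counts to temperature, built from reference measurements. The sketch below shows a generic lookup-table approach under assumed calibration points; it is not the paper's exact oven-plus-spatial-filtering procedure, and all numbers are hypothetical.

```python
import numpy as np

def calibrate(counts, cal_counts, cal_temps):
    """Map raw detector counts to temperature by interpolating a
    calibration curve measured against a reference source (a generic
    sketch of lookup-table radiometric calibration)."""
    order = np.argsort(cal_counts)  # np.interp needs increasing abscissae
    return np.interp(counts,
                     np.asarray(cal_counts, dtype=float)[order],
                     np.asarray(cal_temps, dtype=float)[order])

# Hypothetical calibration points: (detector counts, reference temperature in C)
cal_counts = [1200, 1850, 2600, 3500]
cal_temps  = [25.0, 50.0, 75.0, 100.0]

frame = np.array([[1525, 2225],
                  [3050, 1200]])           # raw counts of a tiny "image"
temps = calibrate(frame, cal_counts, cal_temps)  # per-pixel temperature map
```

Because the mapping is built per setup from measured references, no emissivity value enters the conversion, which is the spirit of the emissivity-free procedure described above.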
193.
Perrow’s normal accident theory suggests that some major accidents are inevitable for technological reasons. An alternative approach explains major accidents as resulting from management failures, particularly in relation to the communication of information. This latter theory has been shown to be applicable to a wide variety of disasters. By contrast, Perrow’s theory seems to be applicable to relatively few accidents, the exemplar case being the Three Mile Island nuclear power station accident in the U.S. in 1979. This article re-examines Three Mile Island. It shows that this was not a normal accident in Perrow’s sense and is readily explicable in terms of management failures. The article also notes that Perrow’s theory is motivated by a desire to shift blame away from front-line operators and that the alternative approach does this equally well.
194.
Just as we can work with two-dimensional floor plans to communicate 3D architectural design, we can exploit reduced-dimension shadows to manipulate the higher-dimensional objects generating the shadows. In particular, by taking advantage of physically reactive 3D shadow-space controllers, we can transform the task of interacting with 4D objects to a new level of physical reality. We begin with a teaching tool that uses 2D knot diagrams to manipulate the geometry of 3D mathematical knots via their projections; our unique 2D haptic interface allows the user to become familiar with sketching, editing, exploration, and manipulation of 3D knots rendered as projected images on a 2D shadow space. By combining graphics and collision-sensing haptics, we can enhance the 2D shadow-driven editing protocol to successfully leverage 2D pen-and-paper or blackboard skills. Building on the reduced-dimension 2D editing tool for manipulating 3D shapes, we develop the natural analogy to produce a reduced-dimension 3D tool for manipulating 4D shapes. By physically modeling the correct properties of 4D surfaces, their bending forces, and their collisions in the 3D haptic controller interface, we can support full-featured physical exploration of 4D mathematical objects in a manner that is otherwise far beyond the experience accessible to human beings. As far as we are aware, this paper reports the first interactive system with force-feedback that provides "4D haptic visualization" permitting the user to model and interact with 4D cloth-like objects.
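The dimensional analogy at the heart of this approach can be made concrete: the same perspective projection that casts a 3D point onto a 2D shadow plane casts a 4D point into a 3D shadow space. The sketch below illustrates that analogy with one assumed camera convention (projection center at distance `d` along the last axis); the paper's actual projection setup is not specified here.

```python
import numpy as np

def project(point, d=2.0):
    """Perspective-project an n-D point to (n-1)-D from a projection
    center at distance d along the last coordinate axis (an illustrative
    convention, not necessarily the paper's)."""
    p = np.asarray(point, dtype=float)
    w = d - p[-1]                # distance from the projection center
    return p[:-1] * (d / w)      # scale the remaining coordinates

# A vertex of a tesseract (4D hypercube), cast down dimension by dimension
v4 = np.array([0.5, 0.5, 0.5, 0.5])
shadow3 = project(v4)        # 4D object -> 3D "shadow" for the haptic tool
shadow2 = project(shadow3)   # 3D knot -> 2D diagram, as in the teaching tool
```

Editing in the shadow space then amounts to constraining updates of the higher-dimensional object so that its projection tracks the user's manipulation.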
195.
It's a banker
Griffiths, Andrew. ITNOW, 2007, 49(3): 12-13.
196.
In participatory ergonomic (PE) interventions, "how" effective participation by workplace parties can be achieved remains unclear. We conducted a case study of the dynamics of an ergonomic change team (ECT) process in a medium-sized (175 employees) automotive foam manufacturing plant. We present analyses of observer field notes and post-intervention interviews from which key elements on the dynamics of the "how" emerged: (1) impacts of facilitators' involvement and interests; (2) tensions in delimiting the scope of ECT activities; issues around (3) managing meetings and (4) realizing labour and management participation; and (5) workplace ECT members' difficulties in juggling other job commitments and facing production pressures. We highlight the ongoing negotiated nature of responses to these challenges by labour, management and ergonomic facilitator members of the ECT. We argue for greater examination of the social dynamics of PE processes to identify additional ways of fostering participation in ergonomic project implementation.
197.
This paper describes models and algorithms for the real-time segmentation of foreground from background layers in stereo video sequences. Automatic separation of layers from color/contrast or from stereo alone is known to be error-prone. Here, color, contrast, and stereo matching information are fused to infer layers accurately and efficiently. The first algorithm, layered dynamic programming (LDP), solves stereo in an extended six-state space that represents both foreground/background layers and occluded regions. The stereo-match likelihood is then fused with a contrast-sensitive color model that is learned on-the-fly and stereo disparities are obtained by dynamic programming. The second algorithm, layered graph cut (LGC), does not directly solve stereo. Instead, the stereo match likelihood is marginalized over disparities to evaluate foreground and background hypotheses and then fused with a contrast-sensitive color model like the one used in LDP. Segmentation is solved efficiently by ternary graph cut. Both algorithms are evaluated with respect to ground truth data and found to have similar performance, substantially better than either stereo or color/contrast alone. However, their characteristics with respect to computational efficiency are rather different. The algorithms are demonstrated in the application of background substitution and shown to give good quality composite video output.
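The marginalization step in LGC can be sketched in a few lines: for each pixel, the per-disparity match likelihood is summed against a layer-specific disparity prior, yielding a per-layer evidence score without committing to a single disparity. The following is an illustrative stand-in for that idea, not the paper's exact formulation; the array names and the toy priors are assumptions.

```python
import numpy as np

def logsumexp(a, axis=-1):
    """Numerically stable log(sum(exp(a))) along an axis."""
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m, axis) + np.log(np.sum(np.exp(a - m), axis=axis))

def layer_log_evidence(match_ll, disparity_prior):
    """Marginalize per-pixel stereo-match log-likelihoods (H, W, D) over
    disparities under one layer's disparity prior (D,), giving per-pixel
    log-evidence (H, W) for that layer."""
    return logsumexp(match_ll + np.log(disparity_prior), axis=-1)

# Toy scene: foreground favors large disparities (near), background small (far)
match_ll = np.log(np.array([[[0.1, 0.1, 0.2, 0.6],     # pixel matching best at high disparity
                             [0.7, 0.1, 0.1, 0.1]]]))  # pixel matching best at low disparity
fg_prior = np.array([0.05, 0.05, 0.30, 0.60])
bg_prior = np.array([0.60, 0.30, 0.05, 0.05])

fg = layer_log_evidence(match_ll, fg_prior)
bg = layer_log_evidence(match_ll, bg_prior)
```

In the full algorithm, these per-pixel evidences are fused with the contrast-sensitive color model and resolved globally by ternary graph cut rather than by per-pixel comparison.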
198.
Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the random multispace quantization (RMQ) of biometric and external random inputs.
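The RMQ construction can be sketched compactly: project the biometric feature vector onto random orthonormal directions derived from the external token, then quantize each projection to one bit. The code below is a minimal illustration of that scheme; the function name, parameter choices, and zero threshold are assumptions, not the paper's exact specification.

```python
import numpy as np

def biohash(features, token_seed, n_bits=4, threshold=0.0):
    """Minimal sketch of random multispace quantization (RMQ) BioHashing:
    project a biometric feature vector onto token-seeded random orthonormal
    directions and threshold each projection to a bit. Requires
    n_bits <= len(features) for the QR orthonormalization."""
    rng = np.random.default_rng(token_seed)       # token-derived randomness
    r = rng.standard_normal((len(features), n_bits))
    q, _ = np.linalg.qr(r)                        # orthonormal basis (reduced QR)
    projections = np.asarray(features) @ q        # mix biometric + external inputs
    return (projections > threshold).astype(np.uint8)

feat = np.array([0.3, -1.2, 0.7, 0.5, -0.1, 0.9])  # hypothetical feature vector
code_a = biohash(feat, token_seed=42)
code_b = biohash(feat, token_seed=42)   # same token -> same BioHash
code_c = biohash(feat, token_seed=7)    # "reissued" token -> a fresh BioHash
```

Cancellability falls out of the construction: revoking the token and issuing a new seed produces a new BioHash from the same unchanged biometric.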
199.
We present visualization tools for analyzing molecular simulations of liquid crystal (LC) behavior. The simulation data consists of terabytes of data describing the position and orientation of every molecule in the simulated system over time. Condensed matter physicists study the evolution of topological defects in these data, and our visualization tools focus on that goal. We first convert the discrete simulation data to a sampled version of a continuous second-order tensor field and then use combinations of visualization methods to simultaneously display combinations of contractions of the tensor data, providing an interactive environment for exploring these complicated data. The system, built using AVS, employs colored cutting planes, colored isosurfaces, and colored integral curves to display fields of tensor contractions including Westin's scalar cl, cp, and cs metrics and the principal eigenvector. Our approach has been in active use in the physics lab for over a year. It correctly displays structures already known; it displays the data in a spatially and temporally smoother way than earlier approaches, avoiding confusing grid effects and facilitating the study of multiple time steps; it extends the use of tools developed for visualizing diffusion tensor data, re-interpreting them in the context of molecular simulations; and it has answered long-standing questions regarding the orientation of molecules around defects and the conformational changes of the defects.
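Westin's metrics mentioned above are simple functions of the tensor's ordered eigenvalues: with λ1 ≥ λ2 ≥ λ3, one common normalization is cl = (λ1−λ2)/Σλ, cp = 2(λ2−λ3)/Σλ, cs = 3λ3/Σλ, so that cl + cp + cs = 1. A small sketch, assuming a symmetric positive tensor and this particular normalization (other variants divide by λ1):

```python
import numpy as np

def westin_metrics(tensor):
    """Westin's linear (cl), planar (cp), and spherical (cs) anisotropy
    measures of a symmetric 3x3 second-order tensor, plus the principal
    eigenvector. Uses the trace-normalized variant where cl+cp+cs = 1."""
    w, v = np.linalg.eigh(tensor)     # eigh returns eigenvalues ascending
    l3, l2, l1 = w                    # so w[0] is smallest, w[2] largest
    s = l1 + l2 + l3
    cl = (l1 - l2) / s
    cp = 2.0 * (l2 - l3) / s
    cs = 3.0 * l3 / s
    principal = v[:, 2]               # eigenvector of the largest eigenvalue
    return cl, cp, cs, principal

# Strongly linear (rod-like) tensor: one dominant eigenvalue along x
T = np.diag([2.0, 0.1, 0.1])
cl, cp, cs, e1 = westin_metrics(T)
```

Color-mapping cl, cp, and cs onto cutting planes and isosurfaces is what lets defect cores (where the order becomes planar or isotropic) stand out against the aligned bulk.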
200.
The MANDAS project has defined a layered architecture for the management of distributed applications. In this paper we examine a vertical slice of this architecture, namely the management applications and services related to configuration management. We introduce an information model which captures the configuration information for distributed applications and discuss a repository service based on the model. We define a set of services and management applications to support maintenance of configuration information, and describe how the different types of configuration information are collected. Finally, we present two management applications that use configuration information.