Results 101-110 of 3,124:
101.
We are interested in information management for decision support applications, especially those that monitor distributed, heterogeneous databases to assess time-critical decisions. Users of such applications can easily be overwhelmed with data that may change rapidly, may conflict, and may be redundant. Developers are faced with a dilemma: either filter out most information and risk excluding critical items, or gather possibly irrelevant or redundant information and overwhelm the decision maker. This paper describes a solution to this dilemma called decision-centric information monitoring (DCIM). First, we observe that decision support systems should monitor only information that can potentially change some decision. We present an architecture for DCIM that meets the requirements implied by this observation. We describe techniques for identifying the highest-value information to monitor and techniques for monitoring that information despite the autonomy, distribution, and heterogeneity of data sources. Finally, we present lessons learned from building LOOKOUT, which is to our knowledge the first implementation of a top-to-bottom system performing decision-centric information monitoring.
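To make the core DCIM idea concrete, here is a minimal, illustrative sketch of the "monitor only what can change a decision" filter: a data item is flagged for monitoring only if moving it within a plausible range would flip the currently best decision. The toy scoring model, option names, and value ranges are assumptions for illustration, not the paper's LOOKOUT implementation.

```python
# Toy sketch of the decision-centric filter: monitor an item only if some
# plausible change in its value would flip the current best decision.
# The scoring model, options, and ranges are invented for illustration;
# this is not the LOOKOUT implementation.

def best_option(options, values):
    """Return the option with the highest weighted score under `values`."""
    return max(options,
               key=lambda o: sum(w * values[k] for k, w in o["weights"].items()))

def worth_monitoring(item, options, values, plausible_values):
    """True if any plausible value of `item` changes the chosen option."""
    baseline = best_option(options, values)["name"]
    return any(best_option(options, dict(values, **{item: v}))["name"] != baseline
               for v in plausible_values)

# Toy decision: traffic matters a lot to route_A, distance to route_B.
options = [{"name": "route_A", "weights": {"traffic": -1.0, "distance": -0.5}},
           {"name": "route_B", "weights": {"traffic": -0.2, "distance": -1.5}}]
values = {"traffic": 10.0, "distance": 6.0}
print(worth_monitoring("traffic", options, values, [0.0, 5.0, 20.0, 40.0]))  # True
print(worth_monitoring("distance", options, values, [5.0, 6.0, 7.0]))        # False
```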
102.
Replica Placement Strategies in Data Grid
Replication is a technique used in Data Grid environments to reduce access latency and network bandwidth utilization; it also increases data availability, thereby enhancing system reliability. This research addresses the replication problem in a Data Grid environment by investigating a set of highly decentralized, dynamic replica placement algorithms. The placement algorithms are based on heuristics that consider both network latency and user requests to select the best candidate sites at which to place replicas. Because of the dynamic nature of the Grid, the sites currently holding replicas may not be the best sites from which to fetch them in subsequent periods, so a replica maintenance algorithm is proposed that relocates replicas to different sites if the performance metric degrades significantly. Our replica placement algorithms are studied using a model of the EU Data Grid Testbed 1 sites and their associated network geometry [Bell et al., Comput. Appl., 17(4), 2003], and are validated against total file transfer times, the number of local file accesses, and the number of remote file accesses.
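As an illustration of the kind of heuristic the abstract describes, the sketch below scores candidate sites by client demand weighted by network latency and places a replica at the cheapest site. The cost model, site names, and numbers are assumptions, not the paper's actual algorithms.

```python
# Illustrative sketch of a latency- and demand-aware placement heuristic:
# score each candidate site by client request counts weighted by network
# latency, then place the replica at the cheapest site.

def best_candidate(sites, latency, requests):
    """sites: candidate site ids; latency: (client, site) -> round-trip time;
    requests: client -> number of file requests in the last period."""
    def cost(site):
        return sum(requests[c] * latency[(c, site)] for c in requests)
    return min(sites, key=cost)

# Example: client c1 is busy and close to s1, so s1 wins.
latency = {("c1", "s1"): 10, ("c1", "s2"): 40, ("c1", "s3"): 25,
           ("c2", "s1"): 50, ("c2", "s2"): 15, ("c2", "s3"): 25}
requests = {"c1": 120, "c2": 30}
print(best_candidate(["s1", "s2", "s3"], latency, requests))  # -> s1
```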
103.
We have developed a unique device, a dynamic diamond anvil cell (dDAC), which repetitively applies a time-dependent load/pressure profile to a sample. This capability allows studies of the kinetics of phase transitions and metastable phases at compression (strain) rates of up to 500 GPa/s (approximately 0.16 s⁻¹ for a metal). Our approach adapts electromechanical piezoelectric actuators to a conventional diamond anvil cell design, which enables precise specification and control of a time-dependent applied load/pressure. Existing DAC instrumentation and experimental techniques are easily adapted to the dDAC to measure the properties of a sample under the varying load/pressure conditions. This capability addresses the sparsely studied regime of dynamic phenomena between static research (diamond anvil cells and large-volume presses) and dynamic shock-driven experiments (gas guns, explosives, and laser shock). We present an overview of a variety of experimental measurements that can be made with this device.
104.
Although recent years have seen significant advances in the spatial resolution possible in the transmission electron microscope (TEM), the temporal resolution of most microscopes is limited to video rate at best. This lack of temporal resolution means that our understanding of dynamic processes in materials is extremely limited. High temporal resolution in the TEM can be achieved, however, by replacing the normal thermionic or field emission source with a photoemission source. In this case the temporal resolution is limited only by the ability to create a short pulse of photoexcited electrons in the source, and this can be as short as a few femtoseconds. The operation of the photoemission source and the control of the subsequent pulse of electrons (containing as many as 5 × 10⁷ electrons) create significant challenges for a standard microscope column that is designed to operate with a single electron in the column at any one time. In this paper, the generation and control of electron pulses in the TEM to obtain a temporal resolution < 10⁻⁶ s will be described, and the effect of the pulse duration and current density on the spatial resolution of the instrument will be examined. The potential of these levels of temporal and spatial resolution for the study of dynamic materials processes will also be discussed.
105.
Although developments in software agents have led to useful applications in the automation of routine tasks such as electronic mail filtering, there is a scarcity of research that empirically evaluates the performance of a software agent against that of the human reasoner whose problem-solving capabilities the agent embodies. In the context of a game of chance, namely Yahtzee©, we identified strategies deployed by expert human reasoners and developed a decision tree for agent development. This paper describes the computer implementation of the Yahtzee game as well as the software agent. It also presents a comparison of the performance of humans versus the automated agent. Results indicate that, in this context, the software agent embodies human expertise at a high level of fidelity.
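A minimal sketch of what one expert-style re-roll rule might look like is shown below; the "hold the most frequent face" heuristic is purely illustrative and is not the decision tree identified in the study.

```python
# Toy sketch of one expert-style Yahtzee re-roll rule ("hold the most
# frequent face"); purely illustrative, not the study's decision tree.
from collections import Counter

def keep_mask(dice):
    """Given five dice, return which to keep before the next roll."""
    face, _ = Counter(dice).most_common(1)[0]
    return [d == face for d in dice]

print(keep_mask([3, 3, 5, 3, 1]))  # -> [True, True, False, True, False]
```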
106.
Pipeline architectures provide a versatile and efficient mechanism for constructing visualizations, and they have been implemented in numerous libraries and applications over the past two decades. In addition to allowing developers and users to freely combine algorithms, visualization pipelines have proven to work well when streaming data and to scale well on parallel distributed-memory computers. However, current pipeline visualization frameworks have a critical flaw: they are unable to manage time-varying data. As data flows through the pipeline, each algorithm has access to only a single snapshot in time of the data. This prevents the implementation of algorithms that do any temporal processing, such as particle tracing; plotting over time; or interpolation, fitting, or smoothing of time-series data. As data acquisition technology improves, as simulation time-integration techniques become more complex, and as simulations save less frequently and less regularly, the ability to analyze the time behavior of data becomes more important. This paper describes a modification to the traditional pipeline architecture that allows it to accommodate temporal algorithms. Furthermore, the architecture allows temporal algorithms to be used in conjunction with algorithms expecting a single time snapshot, thus simplifying software design and allowing adoption into existing pipeline frameworks. Our architecture also continues to work well in parallel distributed-memory environments. We demonstrate our architecture by modifying the popular VTK framework and exposing the functionality to the ParaView application. We use this framework to apply time-dependent algorithms to large data on a parallel cluster computer and thereby exercise a functionality that previously did not exist.
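The sketch below illustrates the architectural idea in miniature: a demand-driven pipeline in which a temporal filter requests a window of time steps from upstream before executing, rather than seeing only a single snapshot. Class and method names are invented for illustration; this is not the VTK/ParaView API.

```python
# Minimal sketch of a demand-driven pipeline extended with time requests:
# a downstream temporal filter asks its upstream source for a window of
# time steps before executing. Names are illustrative, not the VTK API.

class TimeSource:
    """Produces one dataset per discrete time step."""
    def request_data(self, t):
        return {"time": t, "value": float(t) ** 2}  # stand-in dataset

class MovingAverage:
    """Temporal filter: needs a window of snapshots, not just one."""
    def __init__(self, upstream, window=3):
        self.upstream, self.window = upstream, window

    def request_data(self, t):
        # Pull the required time window from upstream (request negotiation).
        snaps = [self.upstream.request_data(t - i) for i in range(self.window)]
        return {"time": t,
                "value": sum(s["value"] for s in snaps) / len(snaps)}

pipe = MovingAverage(TimeSource(), window=3)
print(pipe.request_data(5))  # averages the values at t = 5, 4, 3
```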
107.
OBJECTIVE: This study examined the way in which the type and preexisting strength of association between an auditory icon and a warning event affect the ease with which the icon/event pairing can be learned and retained. BACKGROUND: To be effective, an auditory warning must be audible, identifiable, interpretable, and heeded. Warnings consisting of familiar environmental sounds, or auditory icons, have the potential to facilitate identification and interpretation. The ease with which pairings between auditory icons and warning events can be learned and retained is likely to depend on the type and strength of the preexisting icon/event association. METHOD: Sixty-three participants each learned eight auditory-icon/denotative-referent pairings and attempted to recall them 4 weeks later. Three icon/denotative-referent association types (direct, related, and unrelated) were employed. Participants rated the strength of the association for each pairing on a 7-point scale. RESULTS: The number of errors made while learning pairings was greater for unrelated than for either related or direct associations, whereas the number of errors made while attempting to recall pairings 4 weeks later was greater for unrelated than for related associations and for related than for direct associations. Irrespective of association type, both learning and retention performance remained at very high levels provided the strength of the association was rated greater than 5. CONCLUSION: This suggests that strong preexisting associations facilitate the learning and retention of icon/denotative-referent pairings. APPLICATION: The practical implication of this study is that auditory icons having either direct or strong indirect associations with warning events should be preferred.
108.
In radiotherapy treatment planning, tumor volumes and anatomical structures are manually contoured for dose calculation, which takes clinicians considerable time. This study examines the use of semi-automated segmentation of CT images. A few high-curvature points are manually drawn on a CT slice, and Fourier interpolation is used to complete the contour. Optical flow, a deformable image registration method, is then used to map the original contour to the other slices. This technique has been applied successfully to contour anatomical structures and tumors. The maximum difference between the mapped contours and manually drawn contours was 6 pixels, which is similar in magnitude to the difference one would see between contours drawn manually by different clinicians. The technique fails when the region to contour is topologically different between two slices. The recommended solution is to manually delineate contours on a sparse subset of slices and then map in both directions to fill in the remaining slices.
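The contour-completion step can be illustrated with a short sketch: a few manually picked points are treated as uniform samples of a closed periodic curve and resampled densely by zero-padding their Fourier spectrum. This shows the general technique only; it is not the paper's code, and the sample points below are invented.

```python
# Sketch of closed-contour completion by Fourier interpolation: a few
# picked points are treated as uniform samples of a periodic curve and
# zero-padded in the frequency domain to resample the curve densely.
import numpy as np

def fourier_interp_closed(points, n_out=200):
    """points: (N, 2) array of contour points; returns an (n_out, 2) contour."""
    z = points[:, 0] + 1j * points[:, 1]             # encode (x, y) as complex
    spec = np.fft.fft(z)
    padded = np.zeros(n_out, dtype=complex)
    half = len(z) // 2
    padded[:half + 1] = spec[:half + 1]              # DC and positive frequencies
    padded[-(len(z) - half - 1):] = spec[half + 1:]  # negative frequencies
    dense = np.fft.ifft(padded) * (n_out / len(z))   # rescale amplitude
    return np.column_stack([dense.real, dense.imag])

pts = np.array([[1, 0], [0, 1], [-1, 0], [0, -1]], float)  # coarse "circle"
print(fourier_interp_closed(pts, 8).round(2))  # recovers 8 points on the unit circle
```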
109.
A Study of the Influence of the Structural Parameters of Typical Flexure Hinges on Their Stiffness Performance
王纪武, 陈恳, 李嘉. 《机器人》(Robot), 2001, 23(1): 51-57
Flexure hinges are among the principal components widely used in micro-motion robots, and their stiffness directly affects the robots' end-positioning and manipulation accuracy. Because practical requirements are diverse and complex, the geometric dimensions of real structures cannot fully satisfy the assumptions of traditional theoretical analysis, which compromises the accuracy of performance predictions. This paper uses finite element techniques to systematically study the mechanics of the motion-induced deformation of three typical flexure hinges, compares the results with those of traditional theoretical analysis, analyzes the root causes of the discrepancies between the two, and gives the relationships between the structural parameters and the stiffness performance.
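For context, the "traditional theoretical analysis" for right-circular flexure hinges is commonly the Paros-Weisbord small-deflection approximation; the sketch below computes that closed-form rotational stiffness. This assumes that class of formula is what FEM studies like this one are compared against (the paper itself is not quoted here), and the dimensions are illustrative.

```python
# Classical small-deflection estimate for a right-circular flexure hinge
# (Paros-Weisbord approximation for rotational stiffness):
#   K = 2*E*b*t**2.5 / (9*pi*R**0.5)
# E: Young's modulus, b: hinge depth, t: minimum web thickness, R: cut radius.
# Values below are illustrative, not from the paper.
import math

def hinge_rot_stiffness(E, b, t, R):
    """Rotational stiffness K = M/theta [N*m/rad] of a circular flexure hinge."""
    return 2 * E * b * t**2.5 / (9 * math.pi * math.sqrt(R))

# Steel hinge: E = 210 GPa, depth 10 mm, web 0.5 mm, cut radius 3 mm.
K = hinge_rot_stiffness(210e9, 10e-3, 0.5e-3, 3e-3)
print(f"K = {K:.1f} N*m/rad")  # ~15.2 N*m/rad
```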
110.
A new node-decoupled power-flow algorithm is formed by appropriately simplifying the Newton-Raphson power-flow calculation and removing the coupling between the nodes in the equations. The method reduces the Jacobian matrix to a generalized block-diagonal matrix, which eliminates both the storage the full Jacobian requires and the process of inverting it, and it adopts Gauss-Seidel-style iteration to further accelerate convergence. Numerical examples verify the feasibility of the method, which is notable for its small memory footprint and high computation speed, especially on complex systems.
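The numerical idea, keeping only the per-node diagonal blocks of the Jacobian and sweeping Gauss-Seidel style so that each node update uses its neighbors' latest values, can be sketched on a generic block system, as below. This illustrates the decoupling strategy under that reading of the abstract; it is not the paper's power-flow code.

```python
# Sketch of block-decoupled Gauss-Seidel iteration: instead of factoring
# the full matrix, invert only the 2x2 diagonal block of each "node" and
# sweep node by node, using the latest values of all other nodes.
import numpy as np

def block_gauss_seidel(A, b, block=2, iters=50):
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(0, n, block):
            s = slice(i, i + block)
            # right-hand side using the latest values of all other blocks
            r = b[s] - A[s, :] @ x + A[s, s] @ x[s]
            x[s] = np.linalg.solve(A[s, s], r)  # invert only a small block
    return x

A = np.array([[4.0, 1.0, 0.5, 0.0],
              [1.0, 3.0, 0.0, 0.5],
              [0.5, 0.0, 5.0, 1.0],
              [0.0, 0.5, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])
print(block_gauss_seidel(A, b))  # converges for this diagonally dominant A
print(np.linalg.solve(A, b))     # reference solution for comparison
```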