Full-text access type
Paid full text | 2,970 articles |
Free | 108 articles |
Free (domestic) | 39 articles |
Subject classification
Electrical engineering | 94 articles |
General | 39 articles |
Chemical industry | 749 articles |
Metalworking | 69 articles |
Machinery and instrumentation | 134 articles |
Building science | 111 articles |
Mining engineering | 23 articles |
Energy and power engineering | 91 articles |
Light industry | 267 articles |
Hydraulic engineering | 25 articles |
Petroleum and natural gas | 10 articles |
Weapons industry | 2 articles |
Radio electronics | 263 articles |
General industrial technology | 412 articles |
Metallurgical industry | 260 articles |
Atomic energy technology | 43 articles |
Automation technology | 525 articles |
Publication year
2023 | 12 articles |
2022 | 47 articles |
2021 | 63 articles |
2020 | 28 articles |
2019 | 55 articles |
2018 | 66 articles |
2017 | 44 articles |
2016 | 57 articles |
2015 | 62 articles |
2014 | 81 articles |
2013 | 170 articles |
2012 | 140 articles |
2011 | 210 articles |
2010 | 192 articles |
2009 | 202 articles |
2008 | 215 articles |
2007 | 178 articles |
2006 | 144 articles |
2005 | 167 articles |
2004 | 96 articles |
2003 | 102 articles |
2002 | 102 articles |
2001 | 62 articles |
2000 | 55 articles |
1999 | 44 articles |
1998 | 55 articles |
1997 | 49 articles |
1996 | 48 articles |
1995 | 34 articles |
1994 | 31 articles |
1993 | 44 articles |
1992 | 31 articles |
1991 | 22 articles |
1990 | 23 articles |
1989 | 14 articles |
1988 | 22 articles |
1987 | 19 articles |
1986 | 9 articles |
1985 | 22 articles |
1984 | 18 articles |
1983 | 20 articles |
1982 | 8 articles |
1981 | 17 articles |
1980 | 11 articles |
1979 | 6 articles |
1977 | 5 articles |
1976 | 4 articles |
1974 | 2 articles |
1973 | 3 articles |
1964 | 1 article |
Sort order: 3,117 results found (search time: 15 ms)
101.
Len Seligman, Paul Lehner, Ken Smith, Chris Elsaesser, David Mattox. Journal of Intelligent Information Systems, 2000, 14(1): 29-50
We are interested in information management for decision support applications, especially those that monitor distributed, heterogeneous databases to assess time-critical decisions. Users of such applications can easily be overwhelmed with data that may change rapidly, may conflict, and may be redundant. Developers are faced with a dilemma: either filter out most information and risk excluding critical items, or gather possibly irrelevant or redundant information and overwhelm the decision maker. This paper describes a solution to this dilemma called decision-centric information monitoring (DCIM). First, we observe that decision support systems should monitor only information that can potentially change some decision. We present an architecture for DCIM that meets the requirements implied by this observation. We describe techniques for identifying the highest-value information to monitor and techniques for monitoring that information despite autonomy, distribution, and heterogeneity of data sources. Finally, we present lessons learned from building LOOKOUT, which is to our knowledge the first implementation of a top-to-bottom system performing decision-centric information monitoring.
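The core filtering idea lends itself to a short illustration. Below is a minimal sketch (hypothetical names and utility function, not the LOOKOUT implementation): a data item is worth monitoring only if moving it within its plausible range could change which action is currently best.

```python
# Minimal sketch of decision-centric filtering (hypothetical names and utility
# function, not the LOOKOUT implementation): monitor a data item only if moving
# it within its plausible range could change which action is currently best.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class MonitoredItem:
    name: str
    plausible_range: Tuple[float, float]  # low/high values the item might realistically take

def best_action(data: Dict[str, float], actions: List[str],
                utility: Callable[[str, Dict[str, float]], float]) -> str:
    """Pick the action with the highest utility under the current data."""
    return max(actions, key=lambda a: utility(a, data))

def worth_monitoring(item: MonitoredItem, data: Dict[str, float], actions: List[str],
                     utility: Callable[[str, Dict[str, float]], float]) -> bool:
    """An item has decision value only if a plausible change could flip the decision."""
    current = best_action(data, actions, utility)
    for candidate in item.plausible_range:
        perturbed = {**data, item.name: candidate}
        if best_action(perturbed, actions, utility) != current:
            return True
    return False

# Toy example: fuel level can flip a launch/delay decision; wind (in this model) cannot.
utility = lambda action, d: (d["fuel"] - 50.0) if action == "launch" else 0.0
data = {"fuel": 70.0, "wind": 10.0}
actions = ["launch", "delay"]
print(worth_monitoring(MonitoredItem("fuel", (30.0, 90.0)), data, actions, utility))  # True
print(worth_monitoring(MonitoredItem("wind", (0.0, 40.0)), data, actions, utility))   # False
```

In a real system the plausible ranges and the utility model would come from the decision model itself rather than hand-coded constants.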
102.
Replica Placement Strategies in Data Grid (total citations: 1; self-citations: 0; cited by others: 1)
Replication is a technique used in Data Grid environments that helps to reduce access latency and network bandwidth utilization. Replication also increases data availability, thereby enhancing system reliability. This research addresses the problem of replication in Data Grid environments by investigating a set of highly decentralized dynamic replica placement algorithms. The replica placement algorithms are based on heuristics that consider both network latency and user requests to select the best candidate sites at which to place replicas. Because of the dynamic nature of the Grid, the sites currently holding replicas may not be the best sites from which to fetch replicas in subsequent periods. Therefore, a replica maintenance algorithm is proposed that relocates replicas to different sites if the performance metric degrades significantly. The study of our replica placement algorithms is carried out using a model of the EU Data Grid Testbed 1 [Bell et al. Comput. Appl., 17(4), 2003] sites and their associated network geometry. We validate our replica placement algorithms with total file transfer times, the number of local file accesses, and the number of remote file accesses.
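As a concrete illustration of a latency- and request-driven heuristic of this kind (a minimal sketch with hypothetical site names and weights, not the authors' exact algorithms):

```python
# Illustrative greedy replica-placement heuristic (hypothetical site names and
# weights, not the authors' exact algorithms): rank candidate sites by a weighted
# combination of observed request load (favoring busy sites) and network latency
# (penalizing distant ones), then place replicas at the top-k sites.
from typing import Dict, List

def site_score(site: str, requests: Dict[str, int], latency_ms: Dict[str, float],
               w_requests: float = 1.0, w_latency: float = 0.5) -> float:
    """Higher score = better candidate: many local requests, low access latency."""
    return w_requests * requests.get(site, 0) - w_latency * latency_ms.get(site, float("inf"))

def place_replicas(candidates: List[str], requests: Dict[str, int],
                   latency_ms: Dict[str, float], k: int) -> List[str]:
    """Select the k best candidate sites under the heuristic score."""
    ranked = sorted(candidates, key=lambda s: site_score(s, requests, latency_ms), reverse=True)
    return ranked[:k]

# Example: choose 2 replica sites out of 4 candidates.
sites = ["siteA", "siteB", "siteC", "siteD"]
reqs = {"siteA": 120, "siteB": 80, "siteC": 15, "siteD": 60}
lat = {"siteA": 5.0, "siteB": 20.0, "siteC": 35.0, "siteD": 18.0}
print(place_replicas(sites, reqs, lat, k=2))  # ['siteA', 'siteB']
```

A maintenance pass of the same form could periodically re-rank the sites and migrate replicas whose scores have degraded, mirroring the relocation step described above.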
103.
Evans WJ, Yoo CS, Lee GW, Cynn H, Lipp MJ, Visbeck K. The Review of Scientific Instruments, 2007, 78(7): 073904
We have developed a unique device, a dynamic diamond anvil cell (dDAC), which repetitively applies a time-dependent load/pressure profile to a sample. This capability allows studies of the kinetics of phase transitions and metastable phases at compression (strain) rates of up to 500 GPa/s (approximately 0.16 s⁻¹ for a metal). Our approach adapts electromechanical piezoelectric actuators to a conventional diamond anvil cell design, which enables precise specification and control of a time-dependent applied load/pressure. Existing DAC instrumentation and experimental techniques are easily adapted to the dDAC to measure the properties of a sample under the varying load/pressure conditions. This capability addresses the sparsely studied regime of dynamic phenomena between static research (diamond anvil cells and large volume presses) and dynamic shock-driven experiments (gas guns, explosives, and laser shock). We present an overview of a variety of experimental measurements that can be made with this device.
104.
Armstrong MR, Boyden K, Browning ND, Campbell GH, Colvin JD, DeHope WJ, Frank AM, Gibson DJ, Hartemann F, Kim JS, King WE, LaGrange TB, Pyke BJ, Reed BW, Shuttlesworth RM, Stuart BC, Torralva BR. Ultramicroscopy, 2007, 107(4-5): 356-367
Although recent years have seen significant advances in the spatial resolution possible in the transmission electron microscope (TEM), the temporal resolution of most microscopes is limited to video rate at best. This lack of temporal resolution means that our understanding of dynamic processes in materials is extremely limited. High temporal resolution in the TEM can be achieved, however, by replacing the normal thermionic or field emission source with a photoemission source. In this case the temporal resolution is limited only by the ability to create a short pulse of photoexcited electrons in the source, and this can be as short as a few femtoseconds. The operation of the photoemission source and the control of the subsequent pulse of electrons (containing as many as 5 × 10⁷ electrons) create significant challenges for a standard microscope column that is designed to operate with a single electron in the column at any one time. In this paper, the generation and control of electron pulses in the TEM to obtain a temporal resolution <10⁻⁶ s will be described, and the effect of the pulse duration and current density on the spatial resolution of the instrument will be examined. The potential of these levels of temporal and spatial resolution for the study of dynamic materials processes will also be discussed.
105.
Ken Maynard, Patrick Moss, Marcus Whitehead, S. Narayanan, Matt Garay, Nathan Brannon, Raj Gopal Kantamneni & Todd Kustra. Expert Systems, 2001, 18(2): 88-98
Although developments on software agents have led to useful applications in automation of routine tasks such as electronic mail filtering, there is a scarcity of research that empirically evaluates the performance of a software agent versus that of a human reasoner whose problem-solving capabilities the agent embodies. In the context of a game of chance, namely Yahtzee©, we identified strategies deployed by expert human reasoners and developed a decision tree for agent development. This paper describes the computer implementation of the Yahtzee game as well as the software agent. It also presents a comparison of the performance of humans versus an automated agent. Results indicate that, in this context, the software agent embodies human expertise at a high level of fidelity.
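For illustration only, here is a greedy category chooser of the kind such an agent might contain (hypothetical scoring rules covering a few categories, not the paper's expert-derived decision tree):

```python
# Greedy Yahtzee category chooser for illustration only (hypothetical scoring rules
# covering a few categories, not the paper's expert-derived decision tree): given a
# final roll, score the open categories and keep the best one.
from collections import Counter
from typing import List

UPPER = {"ones": 1, "twos": 2, "threes": 3, "fours": 4, "fives": 5, "sixes": 6}

def score(category: str, dice: List[int]) -> int:
    counts = Counter(dice)
    if category in UPPER:
        face = UPPER[category]
        return face * counts[face]
    if category == "yahtzee":
        return 50 if max(counts.values()) == 5 else 0
    if category == "chance":
        return sum(dice)
    return 0  # other categories omitted in this sketch

def choose_category(dice: List[int], open_categories: List[str]) -> str:
    return max(open_categories, key=lambda c: score(c, dice))

print(choose_category([3, 3, 3, 5, 6], ["threes", "sixes", "chance"]))  # 'chance' (20 vs 9 vs 6)
```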
106.
Boosting is known as a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, AdaBoost, is highly affected by outliers. In this letter, loss functions for robust boosting are studied. Based on the concept of robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, the truncation of loss functions is applied to contamination models that describe the occurrence of mislabels near decision boundaries. Numerical experiments illustrate that the proposed loss functions derived from the contamination models are useful for handling highly noisy data in comparison with other loss functions.
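As a concrete point of reference (standard textbook forms, not necessarily the letter's exact transformation): AdaBoost minimizes the exponential loss of the margin m = yf(x), which grows without bound for strongly misclassified points, so a single extreme outlier can dominate the gradient; truncating the loss caps that influence.

```latex
% Exponential loss used by AdaBoost; unbounded as the margin m -> -infinity.
\[
  \ell_{\exp}(m) = e^{-m}, \qquad m = y\,f(x).
\]
% A simple truncated variant (cases environment from amsmath): the loss, and hence
% each point's contribution to the gradient, is capped once the margin falls below -c.
\[
  \ell_{\mathrm{trunc}}(m) =
  \begin{cases}
    e^{-m} & \text{if } m \ge -c, \\
    e^{c}  & \text{if } m < -c,
  \end{cases}
  \qquad c > 0.
\]
```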
107.
Biddiscombe J, Geveci B, Martin K, Moreland K, Thompson D. IEEE Transactions on Visualization and Computer Graphics, 2007, 13(6): 1376-1383
Pipeline architectures provide a versatile and efficient mechanism for constructing visualizations, and they have been implemented in numerous libraries and applications over the past two decades. In addition to allowing developers and users to freely combine algorithms, visualization pipelines have proven to work well when streaming data and scale well on parallel distributed-memory computers. However, current pipeline visualization frameworks have a critical flaw: they are unable to manage time-varying data. As data flows through the pipeline, each algorithm has access to only a single snapshot in time of the data. This prevents the implementation of algorithms that do any temporal processing such as particle tracing; plotting over time; or interpolation, fitting, or smoothing of time series data. As data acquisition technology improves, as simulation time-integration techniques become more complex, and as simulations save less frequently and regularly, the ability to analyze the time-behavior of data becomes more important. This paper describes a modification to the traditional pipeline architecture that allows it to accommodate temporal algorithms. Furthermore, the architecture allows temporal algorithms to be used in conjunction with algorithms expecting a single time snapshot, thus simplifying software design and allowing adoption into existing pipeline frameworks. Our architecture also continues to work well in parallel distributed-memory environments. We demonstrate our architecture by modifying the popular VTK framework and exposing the functionality to the ParaView application. We use this framework to apply time-dependent algorithms on large data with a parallel cluster computer and thereby exercise a functionality that previously did not exist.
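The architectural change can be sketched generically (plain Python, deliberately not VTK's actual API): a temporal filter declares how many consecutive time steps it needs, and the pipeline delivers that window rather than a single snapshot, so downstream algorithms can interpolate, fit, or trace over time.

```python
# Generic sketch of a temporal pipeline filter (illustrative only, not VTK's API):
# the filter requests a window of time steps and processes them together.
from typing import Callable, Dict, List

class TemporalFilter:
    def __init__(self, window: int, process: Callable[[List[dict]], dict]):
        self.window = window    # number of consecutive time steps this filter needs
        self.process = process  # temporal algorithm operating on a list of snapshots

    def update(self, source: Dict[float, dict], t: float, times: List[float]) -> dict:
        """Fetch the `window` time steps ending at t, then run the temporal algorithm."""
        idx = times.index(t)
        requested = times[max(0, idx - self.window + 1): idx + 1]
        return self.process([source[ti] for ti in requested])

# Example: a two-step filter that averages a scalar field across consecutive steps.
smooth = TemporalFilter(window=2, process=lambda snaps: {
    "field": [(a + b) / 2 for a, b in zip(snaps[0]["field"], snaps[-1]["field"])]
})
data = {0.0: {"field": [1.0, 2.0]}, 1.0: {"field": [3.0, 6.0]}}
print(smooth.update(data, t=1.0, times=[0.0, 1.0]))  # {'field': [2.0, 4.0]}
```

Snapshot-only filters are simply the window = 1 case, which is what lets temporal and non-temporal algorithms coexist in one pipeline.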
108.
OBJECTIVE: This study examined the way in which the type and preexisting strength of association between an auditory icon and a warning event affects the ease with which the icon/event pairing can be learned and retained. BACKGROUND: To be effective, an auditory warning must be audible, identifiable, interpretable, and heeded. Warnings consisting of familiar environmental sounds, or auditory icons, have potential to facilitate identification and interpretation. The ease with which pairings between auditory icons and warning events can be learned and retained is likely to depend on the type and strength of the preexisting icon/event association. METHOD: Sixty-three participants each learned eight auditory-icon/denotative-referent pairings and attempted to recall them 4 weeks later. Three icon/denotative-referent association types (direct, related, and unrelated) were employed. Participants rated the strength of the association for each pairing on a 7-point scale. RESULTS: The number of errors made while learning pairings was greater for unrelated than for either related or direct associations, whereas the number of errors made while attempting to recall pairings 4 weeks later was greater for unrelated than for related associations and for related than for direct associations. Irrespective of association type, both learning and retention performance remained at very high levels, provided the strength of the association was rated greater than 5. CONCLUSION: This suggests that strong preexisting associations are used to facilitate learning and retention of icon/denotative-referent pairings. APPLICATION: The practical implication of this study is that auditory icons having either direct or strong, indirect associations with warning events should be preferred.
109.
Huang TC, Zhang G, Guerrero T, Starkschall G, Lin KP, Forster K. Computer Methods and Programs in Biomedicine, 2006, 84(2-3): 124-134
In radiotherapy treatment planning, tumor volumes and anatomical structures are manually contoured for dose calculation, which takes considerable clinician time. This study examines the use of semi-automated segmentation of CT images. A few high-curvature points are manually drawn on a CT slice, and Fourier interpolation is used to complete the contour. Optical flow, a deformable image registration method, is then used to map the original contour to other slices. This technique has been applied successfully to contour anatomical structures and tumors. The maximum difference between the mapped contours and manually drawn contours was 6 pixels, which is similar in magnitude to the difference one would see between contours drawn manually by different clinicians. The technique fails when the region to contour is topologically different between two slices. The recommended solution is to manually delineate contours on a sparse subset of slices and then map in both directions to fill the remaining slices.
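A minimal sketch of the contour-completion step (hypothetical helper built on NumPy's FFT, not the paper's implementation): a handful of manually picked contour points are treated as complex samples of a closed curve and resampled to a dense, smooth contour by zero-padding their Fourier spectrum.

```python
# Sketch of closed-contour completion by Fourier interpolation (hypothetical helper
# built on NumPy's FFT, not the paper's implementation).
import numpy as np

def fourier_interpolate_contour(points: np.ndarray, n_out: int = 256) -> np.ndarray:
    """points: (N, 2) ordered vertices of a closed contour (N small).
    Returns an (n_out, 2) densely sampled smooth closed contour."""
    z = points[:, 0] + 1j * points[:, 1]        # treat (x, y) as complex samples
    spectrum = np.fft.fft(z)
    half = (len(z) - 1) // 2                    # drop the Nyquist term when N is even
    padded = np.zeros(n_out, dtype=complex)
    padded[:half + 1] = spectrum[:half + 1]     # keep low-frequency coefficients,
    padded[-half:] = spectrum[-half:]           # zero-pad the high frequencies
    dense = np.fft.ifft(padded) * (n_out / len(z))
    return np.column_stack([dense.real, dense.imag])

# Example: 8 points roughly on a circle -> 256-point smooth closed contour.
theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
sparse = np.column_stack([np.cos(theta), np.sin(theta)])
print(fourier_interpolate_contour(sparse).shape)  # (256, 2)
```

The propagation to neighboring slices would then be handled by the optical-flow registration described above, which is beyond this sketch.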
110.