61.
We review a number of formal verification techniques supported by STeP, the Stanford Temporal Prover, describing how the tool can be used to verify properties of several versions of the Bakery algorithm for mutual exclusion. We verify the classic two-process algorithm and simple variants, as well as an atomic parameterized version. The methods used include deductive verification rules, verification diagrams, automatic invariant generation, and finite-state model checking and abstraction.
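For readers unfamiliar with the algorithm being verified, the following is a minimal sketch of the classic two-process Bakery algorithm using Python threads. It is illustrative only and is not the STeP encoding used in the paper; Python's GIL makes this unsuitable as a real lock, and the iteration count is hypothetical.

```python
# Minimal sketch of Lamport's two-process Bakery algorithm (illustrative only).
import threading

NUM = 2
choosing = [False] * NUM     # True while a process is picking its ticket
number = [0] * NUM           # 0 means "not interested"

def lock(i):
    choosing[i] = True
    number[i] = 1 + max(number)          # take a ticket larger than all others
    choosing[i] = False
    for j in range(NUM):
        if j == i:
            continue
        while choosing[j]:               # wait until j has finished choosing
            pass
        # wait while j holds a smaller ticket (ties broken by process id)
        while number[j] != 0 and (number[j], j) < (number[i], i):
            pass

def unlock(i):
    number[i] = 0

counter = 0
def worker(i):
    global counter
    for _ in range(1000):
        lock(i)
        counter += 1                     # critical section
        unlock(i)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(NUM)]
for t in threads: t.start()
for t in threads: t.join()
print(counter)                           # expected 2000 if mutual exclusion holds
```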
62.
Although numerous protein biomarkers have been correlated with advanced disease states, no new clinical assays have been developed. Goals often anticipate disease-specific protein changes that exceed the range of values among healthy individuals, a property common to acute phase reactants. This review considers somewhat different approaches. It focuses on intact protein isoform ratios that provide a biomarker without any change in the total concentration of the protein. These will seldom be detected by peptide-level analysis or by most antibody-based assays. For example, application of an inexpensive method to large sample groups resulted in the observation of several polymorphisms, including the first structural polymorphism of apolipoprotein C1. The isoform distribution of this protein was altered and was eventually linked to increased obesity. Numerous other protein isoforms reflected C- and N-terminal proteolysis, changes in glycoisoform ratios, and certain types of sulfhydryl oxidation. While many of these gave excellent statistical correlation with advanced disease, clinical utility was not apparent. More important may be that protein isoform ratios were very stable within each individual. Diagnosis by longitudinal analysis of the same individual might therefore increase the sensitivity of protein biomarkers by 20-fold or more. Protein changes that exceed the range of values found among healthy individuals may be uncommon.
63.
We report on the experimental realization of an ultrahigh vacuum (UHV) indium seal between a ConFlat knife edge and an optical window. The seal requires a very low clamping force and thus allows for the use of very thin and fragile windows.
64.
It is a well-known fact that Hebbian learning is inherently unstable because of its self-amplifying terms: the more a synapse grows, the stronger the postsynaptic activity, and therefore the faster the synaptic growth. This unwanted weight growth is driven by the autocorrelation term of Hebbian learning, in which the same synapse drives its own growth. The cross-correlation term, on the other hand, performs the actual learning by correlating different inputs with each other. Consequently, we would like to minimize the autocorrelation and maximize the cross-correlation. Here we show that we can achieve this with a third factor that switches on learning when the autocorrelation is minimal or zero and the cross-correlation is maximal. The biological counterpart of such a third factor is a neuromodulator that switches on learning at a certain moment in time. We show in a behavioral experiment that our three-factor learning clearly outperforms classical Hebbian learning.
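The gating idea can be illustrated with a minimal sketch, not the authors' implementation: a plain Hebbian update on a single linear neuron whose weight change is multiplied by a third factor; here the gating signal is a hypothetical, purely time-based neuromodulatory pulse.

```python
# Minimal sketch of a third factor gating a Hebbian weight update.
import numpy as np

def hebbian_step(w, u, eta=0.01, third_factor=1.0):
    """One Hebbian update; `third_factor` in [0, 1] switches learning on/off."""
    v = np.dot(w, u)                      # postsynaptic activity
    dw = eta * third_factor * v * u       # plain Hebb, gated by the third factor
    return w + dw

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=4)
for t in range(1000):
    u = rng.normal(size=4)                # presynaptic inputs
    gate = 1.0 if t % 50 == 0 else 0.0    # hypothetical neuromodulatory pulse
    w = hebbian_step(w, u, third_factor=gate)
```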
65.
We present a powerful framework for 3D-texture-based rendering of multiple arbitrarily intersecting volumetric datasets. Each volume is represented by a multi-resolution octree-based structure, and we use out-of-core techniques to support extremely large volumes. Users define a set of convex polyhedral volume lenses, which may be associated with one or more volumetric datasets. The volumes or the lenses can be moved around interactively while the region inside each lens is rendered using interactively defined multi-volume shaders. Our rendering pipeline splits each lens into multiple convex regions such that each region is homogeneous and contains a fixed number of volumes. Each such region is further split by the brick boundaries of the associated octree representations. The resulting puzzle of lens fragments is sorted in front-to-back or back-to-front order using a combination of a view-dependent octree traversal and a GPU-based depth peeling technique. Our current implementation uses slice-based volume rendering and allows interactive roaming through multiple intersecting multi-gigabyte volumes.
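The view-dependent traversal component can be sketched as follows; this is a simplified assumption-laden illustration (axis-aligned, non-overlapping octants and an orthographic view direction), not the paper's GPU pipeline or its depth-peeling stage.

```python
# Minimal sketch of a view-dependent octree traversal in front-to-back order.
import numpy as np

class OctreeNode:
    def __init__(self, center, half_size, children=None, brick=None):
        self.center = np.asarray(center, dtype=float)
        self.half_size = half_size
        self.children = children or []   # up to 8 child octants
        self.brick = brick               # payload stored at leaf nodes

def front_to_back(node, view_dir):
    """Yield leaf bricks so that bricks nearer along view_dir come first."""
    if not node.children:
        if node.brick is not None:
            yield node.brick
        return
    # Visit children whose centers project earliest onto the view direction;
    # for non-overlapping octants this gives a correct visibility order.
    for child in sorted(node.children, key=lambda c: np.dot(c.center, view_dir)):
        yield from front_to_back(child, view_dir)
```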
66.
Topology provides a foundation for the development of mathematically sound tools for processing and exploration of scalar fields. Existing topology-based methods can be used to identify interesting features in volumetric data sets, to find seed sets for accelerated isosurface extraction, or to treat individual connected components as distinct entities for isosurfacing or interval volume rendering. We describe a framework for direct volume rendering based on segmenting a volume into regions of equivalent contour topology and applying a separate transfer function to each region. Each region corresponds to a branch of a hierarchical contour tree decomposition, and a separate transfer function can be defined for it. The novel contributions of our work are: 1) a volume rendering framework and interface in which a unique transfer function can be assigned to each subvolume corresponding to a branch of the contour tree, 2) a runtime method for adjusting data values to reflect contour tree simplifications, 3) an efficient way of mapping a spatial location into the contour tree to determine the applicable transfer function, and 4) an algorithm for hardware-accelerated direct volume rendering that visualizes the contour tree-based segmentation at interactive frame rates using graphics processing units (GPUs) that support loops and conditional branches in fragment programs.
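The per-branch classification idea can be sketched on the CPU as follows; it assumes a precomputed per-voxel branch-id volume standing in for the contour-tree segmentation, and the branch ids and transfer functions are hypothetical, not the paper's GPU fragment-program implementation.

```python
# Minimal sketch: map a spatial location to a contour-tree branch and apply
# that branch's transfer function.
import numpy as np

def sample_branch_tf(pos, branch_id_volume, scalar_volume, branch_tfs):
    """Nearest-neighbour lookup of the branch at `pos`, then evaluate that
    branch's transfer function at the local scalar value (returns RGBA)."""
    i, j, k = np.round(np.asarray(pos)).astype(int)
    branch = int(branch_id_volume[i, j, k])
    return branch_tfs[branch](float(scalar_volume[i, j, k]))

# Hypothetical example: two branches with different colour/opacity ramps.
branch_tfs = {
    0: lambda v: (1.0, 0.2, 0.2, min(max(v, 0.0), 1.0)),
    1: lambda v: (0.2, 0.2, 1.0, min(max(1.0 - v, 0.0), 1.0)),
}
scalars  = np.random.default_rng(0).random((8, 8, 8))
branches = (scalars > 0.5).astype(int)   # stand-in for a real segmentation
rgba = sample_branch_tf((2, 3, 4), branches, scalars, branch_tfs)
```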
67.
The Morse-Smale complex is an efficient representation of the gradient behavior of a scalar function, and critical points paired by the complex identify topological features and their importance. We present an algorithm that constructs the Morse-Smale complex in a series of sweeps through the data, identifying various components of the complex in a consistent manner. All components of the complex, both geometric and topological, are computed, providing a complete decomposition of the domain. Efficiency is maintained by representing the geometry of the complex in terms of point sets.
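One ingredient of such an analysis, locating the extrema among the critical points, can be sketched very simply; this is a vertex-based 2D approximation with periodic boundaries, not the paper's sweep-based construction, and saddles and the full complex are omitted.

```python
# Minimal sketch: classify grid vertices as local minima or maxima of a
# 2D scalar field (4-neighbourhood, periodic boundaries via np.roll).
import numpy as np

def classify_extrema(f):
    """Return boolean masks of strict local minima and maxima of `f`."""
    minima = np.ones_like(f, dtype=bool)
    maxima = np.ones_like(f, dtype=bool)
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        shifted = np.roll(np.roll(f, di, axis=0), dj, axis=1)
        minima &= f < shifted
        maxima &= f > shifted
    return minima, maxima

field = np.random.default_rng(1).random((64, 64))
mins, maxs = classify_extrema(field)
```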
68.
Heart rate variability (HRV) represents the cardiovascular control mediated by the autonomic nervous system and other mechanisms. In the established Task Force framework for HRV monitoring, different cardiovascular control mechanisms can be approximately identified by power spectral analysis at typical frequencies of heart rate oscillation. HRV measures assessing complex and fractal behavior have partly improved clinical risk stratification; however, their relationship to (patho-)physiology is not sufficiently explored. The objective of the present work is to introduce complexity measures for different physiologically relevant time scales. This is achieved by a new concept, autonomic information flow (AIF) analysis, which was designed in line with the Task Force HRV recommendations. First applications show that different time scales of AIF improve the risk stratification of patients with multiple organ dysfunction syndrome and of cardiac arrest patients in comparison with standard HRV. Each group's significant time scales correspond to its respective pathomechanisms.
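The standard spectral HRV analysis the abstract refers to can be sketched as follows; this is a minimal illustration of Task Force style LF/HF band power (0.04-0.15 Hz and 0.15-0.40 Hz), not the authors' AIF method, and the 4 Hz resampling rate is an assumption.

```python
# Minimal sketch: LF/HF power of an RR-interval series via Welch's method.
import numpy as np
from scipy.signal import welch

def lf_hf_power(rr_s, fs=4.0):
    """rr_s: successive RR intervals in seconds. Returns (LF, HF) band power."""
    t = np.cumsum(rr_s)                                  # beat times
    grid = np.arange(t[0], t[-1], 1.0 / fs)              # even resampling grid
    rr_even = np.interp(grid, t, rr_s)                   # evenly sampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    return np.trapz(pxx[lf_band], f[lf_band]), np.trapz(pxx[hf_band], f[hf_band])
```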
69.
This article explores the achievable transmission electron microscopy specimen thickness and quality obtained with three different preparation methods in the case of a high-strength nanocrystalline Cu-Nb powder alloy. Low specimen thickness is essential for spatially resolved analyses of the grains in nanocrystalline materials. We have found that single-sided as well as double-sided low-angle Ar ion milling of the Cu-Nb powders embedded in epoxy resin produced wedge-shaped particles of very low thickness (<10 nm) near the edge. By means of a modified focused ion beam lift-out technique that generates holes in the lamella interior, large micrometer-sized electron-transparent regions were obtained. However, this lamella displayed a higher thickness of ≥30 nm at the rim. Limiting factors for the observed thicknesses are discussed, including ion damage depths, backscattering, and surface roughness, which depend on ion type, energy, current density, and specimen motion. Finally, sections cut by ultramicrotomy at low stroke rate and low set thickness offered vast, uniformly thin regions of several tens of square micrometers with a minimum thickness of ~10 nm. As major drawbacks, we detected a thin coating on the sections, consisting of the epoxy used as the embedding material, as well as considerable nanoscale thickness variations.
70.
Reliable routing of packets in a Mobile Ad Hoc Network (MANET) has always been a major concern. The open medium and the susceptibility of nodes to faults make the design of protocols for these networks a challenging task. Faults in these networks, which occur either due to the failure of nodes or due to reorganization, can lead to packet loss, and such losses degrade the performance of the routing protocols running on them. In this paper, we propose a routing algorithm, named learning automata based fault-tolerant routing algorithm (LAFTRA), which is capable of routing in the presence of faulty nodes in MANETs using multipath routing. We use the theory of Learning Automata (LA) to optimize the selection of paths, reduce the overhead in the network, and learn about the faulty nodes present in the network. The proposed algorithm can be used alongside any existing routing protocol in a MANET. Simulation results for our protocol in network simulator 2 (ns-2) show an increase in packet delivery ratio and a decrease in overhead compared with existing protocols. In terms of packet delivery ratio with nearly 30% faulty nodes in the network, the proposed protocol gains an edge of nearly 2% over FTAR and E2FT, and of more than 10% over AODV. The overhead generated by our protocol is about 1% lower than that of FTAR and nearly 17% lower than that of E2FT when nearly 30% of the nodes are faulty.
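The learning-automaton idea the protocol builds on can be sketched as follows; this is a generic linear reward-inaction path selector under assumed path names and learning rate, not the LAFTRA protocol or its ns-2 implementation.

```python
# Minimal sketch of a learning automaton choosing among candidate paths and
# reinforcing paths whose packets are delivered (linear reward-inaction).
import random

class PathSelector:
    def __init__(self, paths, learning_rate=0.1):
        self.paths = list(paths)
        self.prob = {p: 1.0 / len(paths) for p in self.paths}
        self.a = learning_rate

    def choose(self):
        """Sample a path according to the current probability vector."""
        r, acc = random.random(), 0.0
        for p in self.paths:
            acc += self.prob[p]
            if r <= acc:
                return p
        return self.paths[-1]

    def feedback(self, chosen, delivered):
        """Reinforce the chosen path on success; do nothing on failure."""
        if not delivered:
            return                                   # inaction on penalty
        for p in self.paths:
            if p == chosen:
                self.prob[p] += self.a * (1.0 - self.prob[p])
            else:
                self.prob[p] -= self.a * self.prob[p]

selector = PathSelector(["path_A", "path_B", "path_C"])   # hypothetical paths
route = selector.choose()
selector.feedback(route, delivered=True)                  # e.g., ACK received
```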