61.
In order to accomplish practical deployment modelling for system performance evaluation and comparison of candidate modulation and equalisation schemes for HIPERLAN, a wideband tapped delay line (WTDL) channel model has been adopted by ETSI to characterise multipath fading in the indoor radio environment. Based on this statistical channel model, and using the Monte Carlo method, this paper evaluates the average probability of error for linear and decision-feedback equalisers as a function of signal-to-noise ratio. It also evaluates the matched filter bound for this channel model. The results show the optimum performance levels achievable via the use of any equaliser. The work described in this paper was supported by the UK DTI/EPSRC LINK project PC2011 High Throughput Radio Modem under EPSRC grant reference GR/K00318, in collaboration with Symbionics Networks Limited.
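The Monte Carlo approach to error-rate estimation can be illustrated with a minimal sketch. For brevity this assumes BPSK over flat AWGN rather than the ETSI WTDL multipath model and the equaliser structures studied in the paper; the matched filter bound then reduces to the analytic BPSK error probability:

```python
import numpy as np
from math import erfc, sqrt

def monte_carlo_ber(snr_db, n_bits=200_000, seed=0):
    """Estimate BPSK bit error rate over AWGN by Monte Carlo simulation."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0                  # map {0,1} -> {-1,+1}
    snr = 10 ** (snr_db / 10)                   # Eb/N0 as a linear ratio
    noise = rng.normal(0.0, sqrt(1 / (2 * snr)), n_bits)
    errors = ((symbols + noise) > 0) != bits    # hard-decision detection
    return errors.mean()

def matched_filter_bound(snr_db):
    """Analytic BER of BPSK on AWGN: the floor any equaliser can at best reach."""
    return 0.5 * erfc(sqrt(10 ** (snr_db / 10)))
```

With enough simulated bits the empirical estimate converges to the bound, which is exactly the kind of comparison the paper performs for the WTDL channel.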
62.
Barton JP, Applied Optics 1996, 35(3):532-541
Theoretical procedures are presented for the determination of the internal and the near-surface electromagnetic fields for an arbitrary monochromatic field (e.g., a focused laser beam) incident upon an irregularly shaped, axisymmetric layered particle. The layered spherical particle solution is also given as a special case of the general solution. Systematic calculations are presented that demonstrate the effects of particle shape and incident focused-beam orientation on the electromagnetic-field distributions.
63.
This paper presents an assumption/commitment specification technique and a refinement calculus for networks of agents communicating asynchronously via unbounded FIFO channels in the tradition of Kahn.
  • We define two types of assumption/commitment specifications, namely simple and general specifications.
  • It is shown that semantically, any deterministic agent can be uniquely characterized by a simple specification, and any nondeterministic agent can be uniquely characterized by a general specification.
  • We define two sets of refinement rules, one for simple specifications and one for general specifications. The rules are Hoare-logic inspired. In particular the feedback rules employ invariants in the style of a traditional while-rule.
  • Both sets of rules have been proved to be sound and (semantically) relatively complete.
  • Conversion rules allow the two logics to be combined. This means that general specifications and the rules for general specifications have to be introduced only at the point in a system development where they are really needed.
64.
Massively parallel processors have begun using commodity operating systems that support demand-paged virtual memory. To evaluate the utility of virtual memory, we measured the behavior of seven shared-memory parallel application programs on a simulated distributed-shared-memory machine. Our results (1) confirm the importance of gang CPU scheduling, (2) show that a page-faulting processor should spin rather than invoke a parallel context switch, (3) show that our parallel programs frequently touch most of their data, and (4) indicate that memory, not just CPUs, must be gang scheduled. Overall, our experiments demonstrate that demand paging has limited value on current parallel machines because of the applications' synchronization and memory reference patterns and the machines' high page-fault and parallel context-switch overheads. An earlier version of this paper was presented at Supercomputing '94. This work is supported in part by NSF Presidential Young Investigator Award CCR-9157366; NSF Grants MIP-9225097, CCR-9100968, and CDA-9024618; Office of Naval Research Grant N00014-89-J-1222; Department of Energy Grant DE-FG02-93ER25176; and donations from Thinking Machines Corporation, Xerox Corporation, and Digital Equipment Corporation.
65.
An instant and quantitative assessment of spatial distances between two objects plays an important role in interactive applications such as virtual model assembly, medical operation planning, or computational steering. While some research has been done on the development of distance-based measures between two objects, only very few attempts have been reported to visualize such measures in interactive scenarios. In this paper we present two different approaches for this purpose, and we investigate the effectiveness of these approaches for intuitive 3D implant positioning in a medical operation planning system. The first approach uses cylindrical glyphs to depict distances, which smoothly adapt their shape and color to changing distances when the objects are moved. This approach computes distances directly on the polygonal object representations by means of ray/triangle mesh intersection. The second approach introduces a set of slices as additional geometric structures, and uses color coding on surfaces to indicate distances. This approach obtains distances from a precomputed distance field of each object. The major findings of the performed user study indicate that a visualization that can facilitate an instant and quantitative analysis of distances between two objects in interactive 3D scenarios is demanding, yet can be achieved by including additional monocular cues into the visualization.
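The per-query distance computation in the first approach reduces to ray/triangle-mesh intersection. A minimal sketch of one ray against one triangle follows, using the standard Möller-Trumbore algorithm; the paper does not state which intersection routine it uses, so this is an illustrative choice:

```python
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle test; returns the ray parameter t or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                  # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv              # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv      # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv             # distance along the ray
    return t if t > eps else None
```

Casting such rays from one surface toward the other yields exactly the per-glyph distances the cylindrical glyphs visualize.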
66.
Numerous numerical methods have been developed in an effort to accurately predict stresses in bones. The largest group are variants of the h-version of the finite element method (h-FEM), in which low-order Ansatz functions are used. By contrast, we investigate a combination of high-order FEM and a fictitious domain approach, the finite cell method (FCM). While the FCM has been verified and validated in previous publications, this article proposes methods for making the FCM computationally efficient to the extent that it can be used for patient-specific, interactive bone simulations. This approach is called computational steering: it allows the user to change input parameters such as the position of an implant, the material, or the loads, and leads to an almost instantaneous change in the output (stress lines, deformations). This direct feedback gives the user an immediate impression of the impact of their actions to an extent that is otherwise hard to obtain with classical non-interactive computations. Specifically, we investigate an application to pre-surgical planning of a total hip replacement, where it is desirable to select an optimal implant for a specific patient. Here, optimal is meant in the sense that the expected post-operative stress distribution in the bone closely resembles that before the operation.
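The fictitious-domain idea behind the FCM can be sketched in one dimension: integrals are evaluated over a simple embedding domain, and quadrature points falling outside the physical domain are weighted by a small penalisation factor α instead of requiring a body-fitted mesh. The indicator function, the value of α, and the midpoint quadrature below are illustrative choices, not the authors' actual discretisation:

```python
def fcm_integral(f, inside, a=0.0, b=1.0, alpha=1e-6, n=1024):
    """Integrate f over the embedding interval [a, b], weighting quadrature
    points outside the physical domain (inside(x) is False) by alpha."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h           # midpoint quadrature point
        w = 1.0 if inside(x) else alpha # penalise the fictitious part
        total += w * f(x) * h
    return total
```

Because the embedding mesh never changes, moving an implant only changes the indicator function, which is what makes near-instantaneous recomputation, and hence computational steering, feasible.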
67.
Data recorded from multiple sources sometimes exhibit non-instantaneous couplings. For simple data sets, cross-correlograms may reveal the coupling dynamics, but when dealing with high-dimensional multivariate data there is no such measure as the cross-correlogram. We propose a simple algorithm based on kernel Canonical Correlation Analysis (kCCA), temporal kCCA (tkCCA), which computes a multivariate temporal filter that links one data modality to another. The filters can be used to compute a multivariate extension of the cross-correlogram, the canonical correlogram, between data sources that have different dimensionalities and temporal resolutions. The canonical correlogram reflects the coupling dynamics between the two sources; the temporal filter reveals which features in the data give rise to these couplings and when they do so. We present results from simulations and neuroscientific experiments showing that tkCCA yields easily interpretable temporal filters and correlograms. In the experiments, we simultaneously performed electrode recordings and functional magnetic resonance imaging (fMRI) in primary visual cortex of the non-human primate. While electrode recordings reflect brain activity directly, fMRI provides only an indirect view of neural activity via the Blood Oxygen Level Dependent (BOLD) response. It is therefore crucial for our understanding and interpretation of fMRI signals in general to relate them to direct measures of neural activity acquired with electrodes. The results computed by tkCCA confirm recent models of the hemodynamic response to neural activity and allow for a more detailed analysis of neurovascular coupling dynamics.
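The canonical-correlogram idea can be sketched with plain regularised linear CCA, computing the leading canonical correlation between one modality and time-shifted copies of the other. Note this is a simplification: tkCCA uses kernels and learns a single temporal filter jointly over lags, whereas the sketch below treats each lag independently:

```python
import numpy as np

def canonical_corr(X, Y, reg=1e-6):
    """Leading canonical correlation between data matrices X, Y (rows = samples)."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = len(X)
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])   # regularised covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))    # whitening transforms
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    M = Wx @ Cxy @ Wy.T                            # whitened cross-covariance
    return np.linalg.svd(M, compute_uv=False)[0]

def canonical_correlogram(X, Y, max_lag):
    """Canonical correlation between X[t] and Y[t + lag] for each lag."""
    lags = list(range(-max_lag, max_lag + 1))
    vals = []
    for lag in lags:
        if lag >= 0:
            vals.append(canonical_corr(X[: len(X) - lag], Y[lag:]))
        else:
            vals.append(canonical_corr(X[-lag:], Y[: len(Y) + lag]))
    return lags, vals
```

A peak in the resulting curve at a nonzero lag is exactly the non-instantaneous coupling the correlogram is designed to expose.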
68.
In silico models that predict the rate of human renal clearance for a diverse set of drugs, that exhibit both active secretion and net re-absorption, have been produced using three statistical approaches. Partial Least Squares (PLS) and Random Forests (RF) have been used to produce continuous models whereas Classification And Regression Trees (CART) has only been used for a classification model. The best models generated from either PLS or RF produce significant models that can predict acids/zwitterions, bases and neutrals with approximate average fold errors of 3, 3 and 4, respectively, for an independent test set that covers oral drug-like property space. These models contain additional information on top of any influence arising from plasma protein binding on the rate of renal clearance. Classification And Regression Trees (CART) has been used to generate a classification tree leading to a simple set of Renal Clearance Rules (RCR) that can be applied to man. The rules are influenced by lipophilicity and ion class and can correctly predict 60% of an independent test set. These percentages increase to 71% and 79% for drugs with renal clearances of < 0.1 ml/min/kg and > 1 ml/min/kg, respectively. As far as the authors are aware these are the first set of models to appear in the literature that predict the rate of human renal clearance and can be used to manipulate molecular properties leading to new drugs that are less likely to fail due to renal clearance.
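The core step that CART repeats recursively, choosing the split that minimises child-node impurity, can be sketched for a single numeric descriptor (e.g. a lipophilicity value) against a binary clearance class. The data and thresholds here are synthetic illustrations, not the paper's fitted Renal Clearance Rules:

```python
def gini(labels):
    """Gini impurity of a list of binary class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(x, y):
    """Threshold on feature x minimising the weighted Gini impurity of the
    two child nodes -- the split-selection step CART applies recursively."""
    best_score, best_t = float("inf"), None
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_score, best_t = score, t
    return best_t
```

Applying this search recursively to descriptors such as lipophilicity and ion class yields a small readable tree, which is what makes CART-derived rule sets like the RCR easy to apply by hand.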
69.
One method for the evaluation of complex environmental and health datasets is the Hasse diagram technique, a discrete mathematical method based on partial orders. The software package introduced here is named PyHasse. In this paper we evaluate a possible association between maternal exposure to organochlorine compounds used as pesticides and cryptorchidism among male children in Finland and Denmark. We identified differences in comparable and incomparable objects and quantified these differences with the Similarity Analysis tool of the PyHasse program. Furthermore, we interpreted the corresponding Hasse diagrams with respect to selected “striking objects”. We found that the positions of the chemicals AHCH (alpha-hexachlorocyclohexane), CHCE (cis-heptachloroepoxide), DIEL (dieldrin), and MIRE (mirex) have some influence on the differentiation of the Hasse diagrams, and hence of each pair of datasets analyzed. The largest disparities are observed when comparing the Finnish and Danish datasets concerning cryptorchidism; these disparities are demonstrated in the corresponding Hasse diagrams.
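A Hasse diagram is the cover relation of a partial order, typically the componentwise order on the objects' indicator values: an edge is drawn from a to b only when a dominates b and no third object lies strictly between them. A minimal sketch of that construction follows; it is an independent illustration, not PyHasse's implementation:

```python
def dominates(a, b):
    """Componentwise partial order: a >= b in every coordinate, a != b."""
    return a != b and all(x >= y for x, y in zip(a, b))

def hasse_covers(objects):
    """Cover pairs (a, b): a dominates b with no object strictly between.
    These pairs are exactly the edges drawn in a Hasse diagram."""
    covers = []
    for a in objects:
        for b in objects:
            if dominates(a, b) and not any(
                dominates(a, c) and dominates(c, b) for c in objects
            ):
                covers.append((a, b))
    return covers
```

Objects connected by a chain of edges are comparable; objects with no such chain are incomparable, which is the distinction the Similarity Analysis above quantifies between the two countries' datasets.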
70.
In this work, we continue the study of the many facets of the Fully Mixed Nash Equilibrium Conjecture, henceforth abbreviated as the FMNE Conjecture, in selfish routing for the special case of n identical users over two (identical) parallel links. We introduce a new measure of Social Cost, defined as the expectation of the square of the maximum congestion on a link; we call it Quadratic Maximum Social Cost. A Nash equilibrium is a stable state where no user can improve her (expected) latency by switching her mixed strategy; a worst-case Nash equilibrium is one that maximizes Quadratic Maximum Social Cost. In the fully mixed Nash equilibrium, all mixed strategies achieve full support.
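In this special case the fully mixed equilibrium has each user choose each of the two identical links with probability 1/2, so the congestion on one link is Binomial(n, 1/2) and the Quadratic Maximum Social Cost is E[max(X, n-X)²]. This gives a closed binomial sum that can be evaluated directly (unit user weights assumed):

```python
from math import comb

def quadratic_max_social_cost(n):
    """E[(max link congestion)^2] for n identical users choosing uniformly
    between two identical links: the Quadratic Maximum Social Cost of the
    fully mixed Nash equilibrium, X ~ Binomial(n, 1/2)."""
    return sum(
        comb(n, k) * 0.5**n * max(k, n - k) ** 2
        for k in range(n + 1)
    )
```

For n = 2, for example, the congestion split is (2, 0), (1, 1), or (0, 2) with probabilities 1/4, 1/2, 1/4, giving 4/4 + 1/2 + 4/4 = 2.5.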