8,519 results found (search time: 31 ms)
191.
The Kruskal-Wallis (KW) nonparametric analysis of variance is often used instead of a standard one-way ANOVA when data come from a suspected non-normal population. The KW omnibus procedure tests for some difference between groups, but provides no specific post hoc pairwise comparisons. This paper provides a SAS® macro implementation of a multiple comparison test based on significant Kruskal-Wallis results from the SAS NPAR1WAY procedure. The implementation is designed for up to 20 groups at a user-specified alpha significance level. A Monte Carlo simulation compared this nonparametric procedure to commonly used parametric multiple comparison tests.
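The idea behind the macro can be sketched in a few lines. The following pure-Python sketch computes the Kruskal-Wallis H statistic (without tie correction) and the Bonferroni-adjusted alpha that a post hoc pairwise procedure would use; it illustrates the concept only and is not the paper's SAS macro, and the group data are invented for the example.

```python
# Sketch of the Kruskal-Wallis omnibus statistic plus a Bonferroni-adjusted
# alpha for post hoc pairwise comparisons. Illustrative only; not the SAS macro.
from itertools import combinations

def kruskal_wallis_h(groups):
    """H = 12 / (N(N+1)) * sum(R_i^2 / n_i) - 3(N+1), where R_i is the rank
    sum of group i over the pooled, jointly ranked observations (no ties)."""
    pooled = sorted((x, g) for g, xs in groups.items() for x in xs)
    ranks = {}  # per-group list of 1-based ranks in the pooled ordering
    for rank, (_, g) in enumerate(pooled, start=1):
        ranks.setdefault(g, []).append(rank)
    n_total = len(pooled)
    rank_term = sum(sum(r) ** 2 / len(r) for r in ranks.values())
    return 12.0 / (n_total * (n_total + 1)) * rank_term - 3 * (n_total + 1)

groups = {
    "A": [1.2, 2.3, 1.9, 2.8, 2.1],
    "B": [3.1, 3.8, 2.9, 4.0, 3.5],
    "C": [1.0, 1.4, 1.8, 1.1, 1.6],
}
h = kruskal_wallis_h(groups)

# If H is significant, compare groups pairwise at a Bonferroni-adjusted level.
pairs = list(combinations(groups, 2))
adjusted_alpha = 0.05 / len(pairs)
print(f"H = {h:.2f}, post hoc alpha = {adjusted_alpha:.4f}")
```

A significant H justifies pairwise rank-based comparisons (e.g. rank-sum tests) at the adjusted level; the SAS implementation wraps the analogous logic around PROC NPAR1WAY output.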
192.
Adaptive Sampling for Network Management
High-performance networks require sophisticated management systems to identify sources of bottlenecks and detect faults. At the same time, the impact of network queries on the latency and bandwidth available to applications must be minimized. Adaptive techniques can be used to control and reduce the rate of sampling of network information, reducing the amount of processed data and lessening the overhead on the network. Two adaptive sampling methods are proposed in this paper, based on linear prediction and fuzzy logic. The performance of these techniques is compared with conventional sampling methods in simulation experiments using Internet and videoconference traffic patterns. The adaptive techniques are significantly more flexible in their ability to adjust dynamically to fluctuations in network behavior, and in some cases they reduce the sample count by as much as a factor of two while maintaining the same accuracy as the best conventional sampling interval. The results illustrate that adaptive sampling provides the potential for better monitoring, control, and management of high-performance networks with higher accuracy, lower overhead, or both.
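The linear-prediction variant can be sketched simply: predict the next measurement from the last two samples, widen the sampling interval while the prediction error stays small, and narrow it when the stream fluctuates. The thresholds, step sizes, and data below are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of linear-prediction adaptive sampling: back off while the
# stream is predictable, sample faster when it is not. Parameters are invented.
def adaptive_sample(stream, err_threshold=0.1, min_iv=1, max_iv=8):
    """Return the (index, value) pairs actually sampled from `stream`."""
    samples = []        # points actually sampled
    interval = min_iv
    i = 0
    while i < len(stream):
        value = stream[i]
        if len(samples) >= 2:
            # Linear prediction from the last two samples.
            (i0, v0), (i1, v1) = samples[-2], samples[-1]
            slope = (v1 - v0) / (i1 - i0)
            predicted = v1 + slope * (i - i1)
            if abs(predicted - value) <= err_threshold:
                interval = min(interval * 2, max_iv)   # predictable: back off
            else:
                interval = max(interval // 2, min_iv)  # fluctuation: speed up
        samples.append((i, value))
        i += interval
    return samples

# A linear ramp is perfectly predictable, so the sampler should back off
# quickly to its maximum interval.
ramp = [0.5 * t for t in range(50)]
picked = adaptive_sample(ramp)
print(len(picked), "of", len(ramp), "points sampled")
```

On this perfectly linear input the sampler touches only 10 of 50 points, the kind of factor-of-several reduction the abstract reports for predictable traffic.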
193.
The genomics, proteomics, clinical, and drug discovery laboratories have a growing need to maintain valuable samples at ultra-low (−80°C) temperatures in a validated, secure environment. Automated sample processing systems have until now required manual (off-line) storage of samples at −80°C, reducing system reliability and speed. Both of these needs are addressed by the Sample Process Management System introduced by BIOPHILE Inc. Conventional sample management processes, such as storage, retrieval, and cataloging, are increasingly strained by growing sample populations; sample types, access requirements, and storage requirements vary, and security and inventory procedures are implemented manually. The evolving technologies present in the laboratory cannot interface with conventional manual storage techniques. Addressing these limitations, the primary benefits of BIOPHILE's solutions are:
• A fully validated sample management process that coordinates the life cycles of samples and their related data.
• Robotic technology to securely store and retrieve samples, improving their accessibility and stability. Thermal shock is reduced, improving sample longevity and quality, and the robotic technology allows integration with larger automation systems.
• A process program to develop a Sample Management Strategy, produced by analyzing long-term research goals, current baseline processes, and current sample life cycles. A full validation documentation package can be generated, providing a high level of quality assurance.
• Improved sample visibility and quality assurance: automated cataloging of sample populations, with controlled access and security for sample management.
194.
195.
Many real-world domains exhibit rich relational structure and stochasticity and motivate the development of models that combine predicate logic with probabilities. These models describe probabilistic influences between attributes of objects that are related to each other through known domain relationships. To keep these models succinct, each such influence is considered independent of others, which is called the assumption of “independence of causal influences” (ICI). In this paper, we describe a language that consists of quantified conditional influence statements and captures most relational probabilistic models based on directed graphs. The influences due to different statements are combined using a set of combining rules such as Noisy-OR. We motivate and introduce multi-level combining rules, where the lower level rules combine the influences due to different ground instances of the same statement, and the upper level rules combine the influences due to different statements. We present algorithms and empirical results for parameter learning in the presence of such combining rules. Specifically, we derive and implement algorithms based on gradient descent and expectation maximization for different combining rules and evaluate them on synthetic data and on a real-world task. The results demonstrate that the algorithms are able to learn both the conditional probability distributions of the influence statements and the parameters of the combining rules.
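The Noisy-OR combining rule mentioned above, and its two-level use, can be shown in a few lines: the probability that an effect occurs given independent causal influences p_1..p_n is 1 − ∏(1 − p_i), and the multi-level scheme applies the rule first within a statement's ground instances and then across statements. The influence values below are invented for illustration; the paper additionally learns such parameters, which this sketch does not.

```python
# Hedged sketch of the Noisy-OR combining rule, applied at two levels as in
# the abstract: within ground instances of a statement, then across statements.
from math import prod

def noisy_or(probs):
    """Combine independent causal influences into one probability."""
    return 1.0 - prod(1.0 - p for p in probs)

# Lower level: influences from ground instances of each statement.
statement_a = [0.2, 0.3]   # two ground instances of statement A
statement_b = [0.5]        # one ground instance of statement B

# Upper level: combine the per-statement results across statements.
p_effect = noisy_or([noisy_or(statement_a), noisy_or(statement_b)])
print(round(p_effect, 3))
```

Because Noisy-OR is differentiable in its inputs, the gradient-descent and EM learning the paper describes can be driven through expressions of exactly this shape.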
196.
Mobile sensing and mapping applications are becoming more prevalent as sensing hardware becomes more portable and more affordable. However, most deployments use small numbers of fixed sensors that report and share multiple sets of environmental data, which raises privacy concerns. Instead, such systems can be decentralized and managed by individuals in their public and private spaces. This paper describes a robust system called MobGeoSens which enables individuals to monitor their local environment (e.g. pollution and temperature) and their private spaces (e.g. activities and health) using mobile phones in their day-to-day lives. MobGeoSens is a combination of software components that uses the phone’s internal sensing devices (e.g. microphone and camera) and external wireless sensors (e.g. data loggers and GPS receivers) for data collection. It also adds a new dimension of spatial localization to the data collection process and provides the user with both textual and spatial cartographic displays. While collecting data, individuals can interactively add annotations and photos, which are automatically integrated into the visualization file/log. This makes it easy to view the data, photos and annotations on a spatial and temporal visualization tool. In addition, the paper presents ways in which mobile phones can be used as noise sensors via the on-device microphone. Finally, we present our experiences with school children using the system to measure their exposure to environmental pollution.
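The core data model of such a system, a sensor reading tagged with time, position, and optional user annotation, can be sketched as follows. This is an illustrative sketch only, not MobGeoSens code; all field names are assumptions.

```python
# Illustrative record for a geotagged, annotated sensor reading, serialized
# as JSON for later spatio-temporal visualization. Not actual MobGeoSens code.
import json
from dataclasses import dataclass, asdict, field

@dataclass
class GeoReading:
    timestamp: float            # seconds since epoch
    lat: float                  # from an external GPS receiver
    lon: float
    sensor: str                 # e.g. "microphone" (noise) or an external logger
    value: float
    annotation: str = ""        # free-text note added by the user
    photos: list = field(default_factory=list)  # attached photo file names

reading = GeoReading(1.7e9, 52.95, -1.15, "microphone", 68.5,
                     annotation="busy road")
log_line = json.dumps(asdict(reading))
print(log_line)
```

One JSON line per reading keeps the log append-only on the phone and trivially replayable on a map-based visualization tool.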
197.
Atomic computing     
Woodward  Alan 《ITNOW》2008,50(1):30-31
Technological progress comes from pushing hard at the limits of what is currently possible, not from merely following trends others have set. In computing, a good illustration of this principle is the life and work of the 19th century computer pioneer Charles Babbage (1791-1871), who spent most of his adult life trying to build a digital computer. Babbage first invented such a machine in 1834. He called it the Analytical Engine.
198.
Although numerous protein biomarkers have been correlated with advanced disease states, no new clinical assays have been developed from them. Efforts typically seek disease-specific protein changes that exceed the range of values seen among healthy individuals, a property common to acute phase reactants. This review considers somewhat different approaches. It focuses on ratios of intact protein isoforms, which can serve as a biomarker without any change in the total concentration of the protein; such ratios will seldom be detected by peptide-level analysis or by most antibody-based assays. For example, application of an inexpensive method to large sample groups revealed several polymorphisms, including the first structural polymorphism of apolipoprotein C1. The isoform distribution of this protein was altered and was eventually linked to increased obesity. Numerous other protein isoform changes involved C- and N-terminal proteolysis, changes in glycoisoform ratios, and certain types of sulfhydryl oxidation. While many of these correlated well statistically with advanced disease, their clinical utility was not apparent. More important may be that protein isoform ratios were very stable within each individual. Diagnosis by longitudinal analysis of the same individual might therefore increase the sensitivity of protein biomarkers by 20-fold or more. Protein changes that exceed the range of values found among healthy individuals may be uncommon.
199.
Although recent years have seen significant advances in the spatial resolution possible in the transmission electron microscope (TEM), the temporal resolution of most microscopes is limited to video rate at best. This lack of temporal resolution means that our understanding of dynamic processes in materials is extremely limited. High temporal resolution in the TEM can be achieved, however, by replacing the normal thermionic or field emission source with a photoemission source. In this case the temporal resolution is limited only by the ability to create a short pulse of photoexcited electrons in the source, and this can be as short as a few femtoseconds. The operation of the photoemission source and the control of the subsequent pulse of electrons (containing as many as 5 × 10^7 electrons) create significant challenges for a standard microscope column that is designed to operate with a single electron in the column at any one time. In this paper, the generation and control of electron pulses in the TEM to obtain a temporal resolution below 10^-6 s will be described, and the effect of the pulse duration and current density on the spatial resolution of the instrument will be examined. The potential of these levels of temporal and spatial resolution for the study of dynamic materials processes will also be discussed.
200.
Chan AH  Tang NY 《Ergonomics》2007,50(2):289-318
In quantitative models of visual search it has usually been assumed that the visual lobe is regular enough in shape to be approximated by a circle or ellipse. However, the irregularities in visual lobe shapes found in studies involving extensive lobe mapping suggest that lobe shape may have important implications for visual search performance and for the accuracy of mathematical models used for performance prediction. Yet no systematic research on the relationship between visual lobe shape and search performance seems to have been carried out, and no comparisons of visual lobe shape characteristics under different levels of target difficulty have been reported. The current study pursued two major objectives in two experiments. Experiment 1 used two different targets (the letters 'O' and 'Y') to map the visual lobes of subjects and provide a systematic, quantitative comparison of lobe shape characteristics; experiment 2 investigated the correlation of visual lobe shape characteristics with visual search time under the effect of target difficulty. The visual lobes of 28 subjects were mapped on 24 imaginary, regularly spaced meridians originating from the centre of the visual field, resembling a full-field mapping situation. Five categories of shape indices were investigated: roundness, boundary smoothness, symmetry, elongation, and shape regularity. The results demonstrated that subjects' visual lobes elongate horizontally, with a medium level of roundness and high levels of boundary smoothness, symmetry, and regularity, for an easy target ('O') against a homogeneous background of 'X's. When a difficult target ('Y') was used, the visual lobes were still elongated horizontally, but to a smaller extent, with a low level of roundness, medium levels of boundary smoothness and regularity, and a similarly high level of symmetry.
Moreover, significant correlations between shape indices and visual search time were found, suggesting that mathematical models for predicting search time should not rely on lobe area alone but should also consider visual lobe shape indices. Finally, a universal mathematical model containing several visual lobe shape indices was developed, applicable to predicting visual search time for a range of similar search tasks.
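A shape index of the kind used in such studies can be computed directly from a mapped lobe boundary. The sketch below computes roundness (circularity), 4πA/P², which equals 1 for a circle and falls as a shape elongates; the polygon vertices are invented for illustration, and the paper's exact index definitions may differ.

```python
# Hedged sketch of a roundness (circularity) index, 4*pi*area / perimeter^2,
# for a polygonal visual lobe boundary. Vertices below are illustrative only.
from math import pi, dist

def polygon_area(pts):
    """Shoelace formula for a simple polygon given as (x, y) vertices."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def perimeter(pts):
    n = len(pts)
    return sum(dist(pts[i], pts[(i + 1) % n]) for i in range(n))

def roundness(pts):
    return 4.0 * pi * polygon_area(pts) / perimeter(pts) ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
elongated = [(0, 0), (4, 0), (4, 1), (0, 1)]   # horizontally elongated "lobe"
print(round(roundness(square), 3), round(roundness(elongated), 3))
```

The horizontally elongated outline scores markedly lower than the square, which is the direction of the effect the study reports for difficult targets.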