91.
In this study, a chromatographic approach for fluorescence reduction in liquid Raman analysis was evaluated. The idea behind the approach is to apply a chromatographic separation step prior to Raman analysis in order to separate fluorescing compounds from the components of interest, thus facilitating better quantitative and qualitative analysis of the latter. A real-time liquid-core Raman waveguide detector designed for chromatographic applications was used, providing real-time chemical pretreatment of liquid samples for Raman analysis. Twenty aqueous mixtures of additives frequently found in beverages were analyzed; for comparison, the mixtures were also analyzed in the Raman waveguide detector without chromatographic separation and with a conventional immersion probe. The chromatographic Raman approach gave satisfactory results both qualitatively and quantitatively, and it provided possibilities for quantitative and qualitative assessment superior to the two other instrumental setups. The technique may provide additional benefits through sensitivity enhancements, and it is simple, inexpensive, and easy to implement in the average applied Raman laboratory. The analysis of further chemical systems, and factors such as system stability over time, need further evaluation to confirm the general applicability of the approach.
92.
Chronic otitis media is a common disease often accompanied by recurrent bacterial infections. These may destroy the middle ear bones, so that prostheses have to be implanted to restore sound transmission. Surface coatings with layered double hydroxides (LDHs) are evaluated here as a drug delivery system offering advantages such as low cytotoxicity and easy synthesis. Male New Zealand White rabbits were implanted with Bioverit® II middle ear prostheses coated with the LDH Mg4Al2(OH)12(SO4)2·6H2O impregnated with ciprofloxacin. Twelve animals (group 1) were infected with Pseudomonas aeruginosa directly at implantation, and another twelve (group 2) one week afterwards. Clinical outcome, blood counts, histological analyses and microbiological examination showed excellent antimicrobial activity in group 1, whereas this effect was attenuated in the animals infected one week after implantation. This is the first study to demonstrate an efficient drug delivery system based on an LDH coating on prostheses in the middle ear.
93.
94.
The purpose of the present investigation was to systematically examine the effectiveness of the Sympson-Hetter technique and rotated content balancing, relative to no exposure control and no content rotation, in a computerized adaptive testing (CAT) system based on the partial credit model. A series of simulated fixed- and variable-length CATs were run using two data sets generated for multiple content areas and three sizes of item pools. The 2 (exposure control) × 2 (content rotation) × 2 (test length) × 3 (item pool size) × 2 (data sets) design yielded a total of 48 conditions. Results show that while both procedures can be used with no deleterious effect on measurement precision, the gains in exposure control, pool utilization, and item overlap appear quite modest. The difficulty of setting the exposure control parameters in small item pools makes the utility of the Sympson-Hetter technique questionable with such pools.
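The Sympson-Hetter technique named above caps item exposure by attaching an administration probability k_i to each item: the most informative candidate is administered only with probability k_i, otherwise the selector falls through to the next candidate. A minimal sketch of that selection step, with illustrative item names and k values not taken from the study:

```python
import random

def sympson_hetter_select(candidates, exposure_k, rng=random):
    """Pick the next CAT item under Sympson-Hetter exposure control.

    candidates: item ids sorted by descending information at the current
    ability estimate; exposure_k: item id -> administration probability k_i.
    Each candidate is administered with probability k_i; on failure the
    selector falls through to the next-most-informative item.
    """
    for item in candidates:
        if rng.random() < exposure_k.get(item, 1.0):
            return item
    return candidates[-1]  # fallback: administer the last candidate anyway

# Illustrative call: item "i1" is fully suppressed (k = 0), so "i2" is given.
rng = random.Random(0)
chosen = sympson_hetter_select(["i1", "i2", "i3"], {"i1": 0.0, "i2": 1.0}, rng)
```

Calibrating the k_i values is the hard part the abstract points to: in a small pool, suppressing popular items leaves few informative alternatives.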
95.
We have performed an extended replication of the Porter-Votta-Basili experiment comparing the Scenario and Checklist methods for inspecting requirements specifications, using identical instruments. The experiment was conducted in our educational context with a more general definition of a defect than the original defect list. Our study, involving 24 undergraduate students, manipulated three independent variables: detection method, requirements specification, and the order of the inspections. The dependent variable measured is the defect detection rate. We found the requirements specification inspected, and not the detection method, to be the most probable explanation for the variance in defect detection rate. This suggests that it is important to understand how a requirements specification can convey an understandable view of the product and to adapt inspection methods accordingly. Contrary to the original experiment, we cannot significantly support the superiority of the Scenario method. This accords with a replication conducted by Fusaro, Lanubile and Visaggio, and might be explained by the weaker individual defect detection skills of our less experienced subjects.
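The dependent variable above reduces to a simple ratio per subject; a sketch of how it is typically computed (the defect identifiers are illustrative, not from the experiment's instruments):

```python
def defect_detection_rate(reported, seeded):
    """Fraction of the seeded defects that an inspector actually reported.

    reported: defect ids the subject listed in the inspection;
    seeded: all known defects in the inspected requirements specification.
    False positives in `reported` are ignored by the intersection.
    """
    seeded = set(seeded)
    return len(set(reported) & seeded) / len(seeded)

# Illustrative call: the subject found 2 of 4 seeded defects -> rate 0.5.
rate = defect_detection_rate([1, 2, 7], [1, 2, 3, 4])
```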
96.
Requirements Engineering - Traceability links recovery (TLR) has been a topic of interest for many years. However, TLR approaches are based on the latent semantics of the software artifacts, and...
97.
This paper deals with a finite element formulation of limit load problems in soil mechanics via limit analysis theory. After recalling the principal results of this theory, the authors describe a numerical formulation for both the static and the kinematic approach to the ultimate load. Thanks to a linearization of the yield criterion, the finite element model leads to a linear programming problem. The efficiency of the two proposed computing procedures is demonstrated by their application to foundation pull-out and slope stability problems.
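Once the yield criterion is linearized, the static approach becomes a linear program: maximize the load factor over stress fields satisfying equilibrium and the linearized yield inequalities. A toy sketch under invented data (the box-shaped yield region and the equilibrium relation lam = (sigma1 + sigma2) / 2 are illustrative, not from the paper); because a linear objective over a polytope attains its optimum at a vertex, a vertex sweep stands in for a real LP solver here:

```python
from itertools import product

def ultimate_load_factor():
    """Lower-bound (static) limit analysis on a 2-stress toy problem.

    Linearized yield criterion: |sigma_i| <= 1 (a box, i.e. 4 linear facets).
    Equilibrium ties the load factor to the stresses: lam = (s1 + s2) / 2.
    The LP "maximize lam" is solved by sweeping the vertices of the box.
    """
    best = float("-inf")
    for s1, s2 in product([-1.0, 1.0], repeat=2):  # vertices of the yield box
        lam = (s1 + s2) / 2.0                      # equilibrium constraint
        best = max(best, lam)
    return best  # attained at the vertex (1, 1)
```

In the paper's setting the unknowns are nodal stress (or velocity) fields and the facets come from linearizing the soil's yield surface, so a simplex-type LP solver replaces the brute-force sweep.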
98.
Many tasks in AI require the representation and manipulation of complex functions. First-Order Decision Diagrams (FODDs) are a compact knowledge representation expressing functions over relational structures. They represent numerical functions that, when constrained to the Boolean range, use only existential quantification. Previous work developed a set of operations for composing FODDs and for removing redundancies in them, thus keeping them compact, and showed how to employ FODDs successfully for solving large-scale stochastic planning problems through the formalism of relational Markov decision processes (RMDPs). In this paper, we introduce several new ideas that enhance the applicability of FODDs. First, we introduce Generalized FODDs (GFODDs) and composition operations for them, generalizing FODDs to arbitrary quantification. Second, we develop a novel approach for reducing (G)FODDs using model checking. This yields, for the first time, a reduction that maximally reduces the diagram in the FODD case, and it provides a sound reduction procedure for GFODDs. Finally, we show how GFODDs can in principle be used to solve RMDPs with arbitrary quantification, and we develop a complete solution for the case where the reward function is specified using an arbitrary number of existential quantifiers followed by an arbitrary number of universal quantifiers.
99.
Interest in psychological experimentation from the Artificial Intelligence community often takes the form of rigorous post-hoc evaluation of completed computer models. Through an example of our own collaborative research, we advocate a different view of how psychology and AI may be mutually relevant, and propose an integrated approach to the study of learning in humans and machines. We begin with the problem of learning appropriate indices for storing and retrieving information from memory. From a planning-task perspective, the most useful indices may be those that predict potential problems and access relevant plans in memory, improving the planner's ability to predict and avoid planning failures. This predictive-features hypothesis is then supported as a psychological claim, with results showing that such features offer an advantage in terms of the selectivity of reminding because they more distinctively characterize planning situations where differing plans are appropriate. We present a specific case-based model of plan execution, RUNNER, along with its indices for recognizing when to select particular plans (appropriateness conditions), and show how these predictive indices serve to enhance learning. The predictive-features claim, as implemented in the RUNNER model, is then tested in a second set of psychological studies. The results show that learning appropriateness conditions leads to greater success in recognizing when a past plan is in fact relevant to current processing, and produces more reliable recall of the related information. This form of collaboration has resulted in a unique integration of computational and empirical efforts to create a model of case-based learning.
100.
This article gives an account of the results of a case study undertaken at a pioneering and particularly prominent firm in the Brazilian computer industry: COBRA — Computadores e Sistemas Brasileiros SA.

The study is part of a research project whose main goal is the identification of viable organizational and technological options that could enhance the performance of firms in the Brazilian electro-electronics industry. Among the more important findings, it was observed that the firm has the potential to evolve towards a more flexible structure, in keeping with the new requirements of the probable future market scenarios within its sector.

Everything indicated, however, that the main factors constraining such a structure were connected with the firm's culture and beliefs, and with the indirect influence of the National Policy of Information Technology upon these. A hypothesis was put forward for an organizational model, and the necessary supporting computerized technology, that could be particularly appropriate in view of the impending deregulation of the Brazilian computer industry. Also, within the current discussion on restructuring this industry towards greater concentration, an alternative to the prevailing school of thought was proposed, for whose success the organizational model at issue is argued to be particularly relevant.