32.
The AHP (analytic hierarchy process) has been applied in many fields, especially to complex engineering problems and applications. AHP is capable of structuring decision problems and of deriving mathematically grounded judgments from knowledge and experience. This suggests that AHP should prove useful in agile software development, where complex decisions occur routinely. This paper describes a ranking approach to help stakeholders select the best prioritization method for prioritizing user stories.
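The core AHP computation behind such a ranking can be sketched as follows. The pairwise-comparison values are illustrative assumptions (not data from the paper), and the geometric-mean method is used as a standard approximation of Saaty's principal-eigenvector priorities:

```python
import numpy as np

def ahp_priorities(A):
    """Approximate the AHP priority vector of a pairwise-comparison
    matrix A via the geometric-mean (row) method, and return the
    consistency ratio (CR) as well."""
    n = A.shape[0]
    # Geometric mean of each row, normalized, approximates the
    # principal eigenvector for near-consistent matrices.
    w = np.prod(A, axis=1) ** (1.0 / n)
    w /= w.sum()
    # lambda_max estimated from A·w / w; CI = (lambda_max - n)/(n - 1).
    lam_max = (A @ w / w).mean()
    ci = (lam_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)  # Saaty's random indices
    return w, ci / ri

# Hypothetical comparison of three prioritization methods on one criterion:
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
w, cr = ahp_priorities(A)  # w: priorities; cr < 0.1 means acceptably consistent
```

A CR above 0.1 would signal that the stakeholders' judgments are too inconsistent to trust the resulting ranking.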
34.
Going through a few examples of robot artists who are recognized worldwide, we try to analyze the deeper meaning of what is called "robot art" and to define the related art field. We also try to delineate its boundaries with neighboring fields such as kinetic sculpture, kinetic art, cyber art, and cyberpunk. A brief excursion into the importance of context, message, and semiotics is also provided, case by case, together with a few hints on the history of the discipline from an artistic perspective. The aim of this article is therefore to summarize the main characteristics that might classify robot art as a unique and innovative discipline, and to track down some of the principles by which a robotic artifact can or cannot be considered an art piece in terms of social, cultural, and strictly artistic interest. This work was presented in part at the 13th International Symposium on Artificial Life and Robotics, Oita, Japan, January 31–February 2, 2008.
35.
Default logics are usually used to describe the regular behavior and normal properties of domain elements. In this paper we suggest, conversely, that the framework of default logics can be exploited for detecting outliers: observations, expressed as sets of literals, that exhibit unexpected semantic characteristics. These sets of literals are selected among those explicitly embodied in the given knowledge base; hence, we essentially perceive outlier detection as a knowledge discovery technique. The paper defines the notion of outlier in two related formalisms for specifying defaults: Reiter's default logic and extended disjunctive logic programs. For each of the two formalisms, we show that finding outliers is quite complex: we prove that several versions of the outlier detection problem lie over the second level of the polynomial hierarchy. We believe that a thorough complexity analysis, as done here, is a useful preliminary step towards developing effective heuristics and exploring tractable subsets of outlier detection problems.
36.
Many time-critical applications require predictable performance in the presence of failures. This paper considers a distributed system with independent periodic tasks which can checkpoint their state on some reliable medium in order to handle failures. The problem of preemptively scheduling a set of such tasks is discussed, where every occurrence of a task has to be completely executed before the next occurrence of the same task can start. Efficient scheduling algorithms are proposed which yield sub-optimal schedules when there is provision for fault tolerance. The performance of the proposed solutions is evaluated in terms of the number of processors and the cost of the checkpoints needed. Moreover, analytical studies are used to reveal interesting trade-offs associated with the scheduling algorithms. This work has been supported by grants from the Italian Ministero dell'Università e della Ricerca Scientifica e Tecnologica and the Consiglio Nazionale delle Ricerche-Progetto Finalizzato Sistemi Informatici e Calcolo Parallelo.
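The basic trade-off such algorithms exploit can be sketched as follows. This is not the paper's scheduling algorithm: it is a minimal single-fault model in which k equidistant checkpoints of cost c inflate a task's worst-case execution time (more checkpoints cost overhead but shrink the segment re-executed after a fault), followed by a plain EDF utilization test on the inflated times:

```python
def inflated_wcet(C, c, k):
    """Worst-case execution time of a task with base time C and k
    equidistant checkpoints of cost c each, assuming at most one fault:
    base time + checkpoint overhead + re-execution of one segment.
    (Checkpoint-read/restart costs are ignored for simplicity.)"""
    return C + k * c + C / (k + 1)

def best_checkpoint_count(C, c, kmax=100):
    """Exhaustive search for the k minimizing the inflated WCET;
    analytically the optimum is near sqrt(C/c) - 1."""
    return min(range(kmax + 1), key=lambda k: inflated_wcet(C, c, k))

def edf_feasible(tasks, c):
    """tasks: list of (C, T) with period == deadline. Sub-optimal
    single-fault feasibility test: EDF utilization bound applied to
    the checkpoint-inflated execution times."""
    u = sum(inflated_wcet(C, c, best_checkpoint_count(C, c)) / T
            for C, T in tasks)
    return u <= 1.0
```

For example, a task with C = 16 and c = 1 is best served by 3 checkpoints (inflated WCET 23 instead of 32 with none), and a small task set can then be checked for single-processor feasibility with `edf_feasible`.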
37.
In the information age, the storage and accessibility of data are of vital importance, and there are several ways to fulfill this task. Magnetic data storage is a well-established method, and the range of materials used for it is continuously being extended. In this study, the magnetic remanence of thermally sprayed tungsten carbide–cobalt (WC-Co) coatings is examined as a function of their thickness. Two magnetic fields differing in value and geometry are imprinted into the coatings and the resulting remanence field is measured. Two effects are found which, in combination, determine the effective magnetic remanence usable for magnetic data storage.
38.
This article proposes an optimization–simulation model for planning the transport of supplies to large public infrastructure works located in congested urban areas. The purpose is to minimize their impact on the environment and on private transportation users on the local road network. To achieve this goal, the authors propose and solve an optimization problem for minimizing the total system cost, made up of the operating costs of the various alternatives for taking supplies to the worksite and the costs borne by private vehicle users as a result of the increased congestion caused by heavy goods vehicles transporting material to the worksite. The proposed optimization problem is a bi-level mathematical program. The upper level defines the total cost of the system, which is minimized subject to environmental constraints on atmospheric and noise pollution. The lower level defines the optimization problem representing private transportation user behavior, assuming users choose the route that minimizes their total individual journey cost. Given the special characteristics of the problem, a heuristic algorithm is proposed for finding optimum solutions. Both the model developed and the specific solution algorithm are applied to the real case of building a new port at Laredo (Northern Spain). A series of interesting conclusions are obtained from the corresponding sensitivity analysis.
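The structure of such a bi-level problem can be illustrated with a deliberately tiny sketch. Everything here is a placeholder assumption, not the paper's model: the supply alternatives and their costs are invented, the lower level is collapsed to a single congested link with a standard BPR delay function, and the environmental constraint is reduced to one emissions cap:

```python
def bpr_time(t0, flow, cap, alpha=0.15, beta=4):
    """Standard BPR link travel-time function under congestion."""
    return t0 * (1 + alpha * (flow / cap) ** beta)

# Hypothetical alternatives: (name, operating cost, heavy-vehicle trips, emissions)
options = [("road",      1000.0, 200, 8.0),
           ("rail+road", 1400.0,  60, 3.0),
           ("barge",     1800.0,  10, 1.0)]

def best_option(options, bg_flow=1500.0, cap=2000.0, t0=10.0,
                vot=0.3, pce=2.5, emission_cap=5.0):
    """Upper level: pick the supply alternative minimizing operating cost
    plus the extra delay cost imposed on private users, subject to an
    emissions cap.  Lower level: user travel time on one BPR link."""
    base = bg_flow * bpr_time(t0, bg_flow, cap)  # delay without trucks
    best, best_cost = None, float("inf")
    for name, op_cost, trips, emis in options:
        if emis > emission_cap:
            continue  # violates the environmental constraint
        flow = bg_flow + pce * trips  # trucks in passenger-car equivalents
        delay_cost = vot * (bg_flow * bpr_time(t0, flow, cap) - base)
        total = op_cost + delay_cost
        if total < best_cost:
            best, best_cost = name, total
    return best, best_cost
```

With these numbers the all-road option is excluded by the emissions cap, and the rail+road combination beats the barge once user delay costs are accounted for; the real model replaces the single link with a full network assignment solved heuristically.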
39.

Background

COSMIC Function Points and traditional Function Points (i.e., IFPUG Function Points and more recent variations such as NESMA and FISMA) are probably the best-known and most widely used Functional Size Measurement methods. The relationship between the two kinds of Function Points still needs to be investigated. If traditional Function Points could be accurately converted into COSMIC Function Points and vice versa, then, by measuring one kind of Function Points, one would be able to obtain the other kind, and one could measure either kind interchangeably. Several studies have been performed to evaluate whether a correlation or a conversion function between the two measures exists. Specifically, it has been suggested that the relationship between traditional Function Points and COSMIC Function Points may not be linear, i.e., the value of COSMIC Function Points seems to increase more than proportionally with an increase of traditional Function Points.

Objective

This paper aims at verifying this hypothesis using available datasets that collect both FP and CFP size measures.

Method

Rigorous statistical analysis techniques are used, specifically Piecewise Linear Regression, whose applicability conditions are systematically checked. The Piecewise Linear Regression curve is a series of interconnected segments. In this paper, we focused on Piecewise Linear Regression curves composed of two segments. We also used Linear and Parabolic Regressions, to check if and to what extent Piecewise Linear Regression may provide an advantage over other regression techniques. We used two categories of regression techniques: Ordinary Least Squares regression is based on the usual minimization of the sum of squares of the residuals, or, equivalently, on the minimization of the average squared residual; Least Median of Squares regression is a robust regression technique that is based on the minimization of the median squared residual. Using a robust regression technique helps filter out the excessive influence of outliers.

Results

It appears that the analysis of the relationship between traditional Function Points and COSMIC Function Points based on the aforementioned data analysis techniques yields valid significant models. However, different results for the various available datasets are achieved. In practice, we obtained statistically valid linear, piecewise linear, and non-linear conversion formulas for several datasets. In general, none of these is better than the others in a statistically significant manner.

Conclusions

Practitioners interested in the conversion of FP measures into CFP (or vice versa) cannot just pick a conversion model and be sure that it will yield the best results. All the regression models we tested provide good results with some datasets. In practice, all the models described in the paper - in particular, both linear and non-linear ones - should be evaluated in order to identify the ones that are best suited for the specific dataset at hand.
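The two-segment Piecewise Linear Regression at the heart of the method can be sketched as follows. This is an ordinary-least-squares version only (the robust Least Median of Squares variant is omitted), and the FP/CFP data are synthetic, used purely to show that the breakpoint scan recovers a known kink:

```python
import numpy as np

def fit_two_segment(x, y):
    """Fit a continuous two-segment piecewise linear model
    y = a + b*x + c*max(x - k, 0) by scanning candidate breakpoints k
    over interior data points and solving OLS for each; the second
    segment's slope is b + c."""
    best = None
    for k in np.unique(x)[1:-1]:  # interior breakpoint candidates
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - k, 0.0)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = ((X @ coef - y) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, k, coef)
    return best  # (sse, breakpoint, [a, b, c])

# Synthetic "FP vs CFP" data: slope 1 up to x = 5, slope 3 afterwards.
x = np.arange(0.0, 10.0)
y = x + 2.0 * np.maximum(x - 5.0, 0.0)
sse, k, coef = fit_two_segment(x, y)
```

On real conversion datasets the residuals are of course non-zero, and comparing this model's fit against plain linear and parabolic regressions (as the paper does) is what decides whether the extra breakpoint is warranted.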
40.
Air pollution has a negative impact on human health. For this reason, it is important to correctly forecast over-threshold events in order to give timely warnings to the population. Nonlinear models of the nonlinear autoregressive with exogenous variable (NARX) class have been extensively used to forecast air pollution time series, mainly using artificial neural networks (NNs) to model the nonlinearities. This work discusses the possible advantages of using polynomial NARX models instead, in combination with suitable model structure selection methods. Furthermore, a suitably weighted mean square error (MSE) (one-step-ahead prediction) cost function is used in the identification/learning process to enhance the model's performance in peak estimation, which is the final purpose of this application. The proposed approach is applied to ground-level ozone concentration time series. An extended simulation analysis is provided to compare the two classes of models on a selected case study (the Milan metropolitan area) and to investigate the effect of different weighting functions in the identification performance index. Results show that polynomial NARX models are able to correctly reconstruct ozone concentrations, with performance similar to NN-based NARX models, but providing additional information, such as the best set of regressors describing the studied phenomena. The simulation analysis also demonstrates the potential benefits of the weighted cost function, especially in increasing reliability in peak estimation.
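A toy version of polynomial NARX identification with a peak-weighted least-squares cost can be sketched as follows. The regressor set, the simulated system, and the weighting scheme are illustrative assumptions, not the paper's ozone model:

```python
import numpy as np

def fit_poly_narx(y, u, w=None):
    """One-step-ahead polynomial NARX:
      y[t] ≈ θ · [1, y[t-1], u[t-1], y[t-1]^2, y[t-1]*u[t-1]],
    fitted by (optionally weighted) least squares.  Weighting residuals
    more heavily on large targets emphasizes peak accuracy."""
    Y = y[1:]
    phi = np.column_stack([np.ones_like(Y), y[:-1], u[:-1],
                           y[:-1] ** 2, y[:-1] * u[:-1]])
    if w is None:
        w = np.ones_like(Y)
    sw = np.sqrt(w)
    theta, *_ = np.linalg.lstsq(sw[:, None] * phi, sw * Y, rcond=None)
    return theta

# Simulated system that lies exactly inside the regressor set:
rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, 300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1] + 0.1 * y[t - 1] ** 2

w = 1.0 + 5.0 * (y[1:] > np.quantile(y, 0.9))  # up-weight the peaks
theta = fit_poly_narx(y, u, w)                 # recovers [0, 0.5, 0.3, 0.1, 0]
```

Unlike a neural NARX model, the fitted θ is directly interpretable: near-zero entries flag regressors that structure-selection methods would prune.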