6,121 results found (search time: 15 ms)
91.
In this paper, we propose a metaheuristic for solving an original scheduling problem with auxiliary resources in the photolithography workshop of a semiconductor plant. The photolithography workshop is often a bottleneck, and improving scheduling decisions there can help to improve the performance indicators of the whole plant. Two optimization criteria are considered separately: the weighted flow time (to be minimized) and the number of products processed (to be maximized). After stating the problem and establishing some properties of the solution space, we show how these properties help to solve the problem efficiently with the proposed memetic algorithm, which has been implemented and tested on large generated instances. Numerical experiments show that good solutions are obtained within reasonable computational time.
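As an illustration of the first criterion mentioned in the abstract, the following sketch computes the weighted flow time of a fixed single-machine job order. The data and the single-machine setting are illustrative assumptions, not the paper's actual model, which also handles auxiliary photolithography resources.

```python
def weighted_flow_time(jobs):
    """jobs: list of (processing_time, weight, release_date) tuples,
    processed in the given order on a single machine."""
    t = 0.0      # current time on the machine
    total = 0.0  # accumulated weighted flow time
    for p, w, r in jobs:
        t = max(t, r) + p      # a job cannot start before its release date
        total += w * (t - r)   # flow time = completion time - release date
    return total

# toy instance: three jobs (processing time, weight, release date)
jobs = [(3, 1, 0), (2, 2, 1), (4, 1, 2)]
print(weighted_flow_time(jobs))  # 18.0
```

A metaheuristic such as the paper's memetic algorithm would search over job orders (and resource assignments) to minimize this quantity.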
92.
Some supervised tasks present a numerical output, but decisions have to be made in a discrete, binarised way, according to a particular cutoff. This binarised regression task is a very common situation that requires its own analysis, distinct from regression, classification, and ordinal regression. We first investigate the application cases in terms of the information available about the distribution and range of the cutoffs, distinguishing six possible scenarios, some of which are more common than others. Next, we study two basic approaches: the retraining approach, which discretises the training set whenever the cutoff is available and learns a new classifier from it, and the reframing approach, which learns a regression model and sets the cutoff when it becomes available during deployment. To assess the binarised regression task, we introduce context plots of error against cutoff. Two special cases are of interest, the \( UCE \) and \( OCE \) curves: the area under the former is the mean absolute error, while the area under the latter is a new metric lying between a ranking measure and a residual-based measure. A comprehensive evaluation of the retraining and reframing approaches is performed using a repository of binarised regression problems created for this purpose, concluding that neither method is clearly better than the other, except when the training data are small.
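The "reframing" approach described above can be sketched in a few lines: train one regression model, then binarise its numeric predictions at whatever cutoff arrives at deployment time. This is a minimal illustration of the idea, not the authors' code; the toy model is an assumption.

```python
def reframe(predict, cutoff):
    """Wrap a regression predictor as a binary classifier for a given cutoff."""
    return lambda x: 1 if predict(x) >= cutoff else 0

# toy regression model: predicted value is simply 2*x
model = lambda x: 2 * x

# the same trained model serves any cutoff chosen at deployment time
clf_at_5 = reframe(model, cutoff=5)
print(clf_at_5(2), clf_at_5(3))  # 2*2=4 < 5 -> 0 ; 2*3=6 >= 5 -> 1
```

The retraining approach would instead threshold the training labels at each new cutoff and fit a fresh classifier, trading deployment-time simplicity for repeated training cost.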
93.
Authorities are imposing an increasing number of hard environmental constraints in urban traffic networks in an attempt to mitigate traffic-related pollution. However, it is not trivial for authorities to assess the cost of imposing such hard environmental constraints, which leads to difficulties both in setting the constraint values and in implementing effective control measures. Quantifying the cost of imposing hard environmental constraints on a given network therefore becomes crucial. This paper first shows that, for a given network, this cost depends not only on the choice of environmental constraints but also on the control measures considered. Next, we present an assessment criterion that quantifies the loss of optimality, under the considered control measures, caused by introducing the environmental constraints. The criterion can be obtained by solving a bi-level programming problem with and without environmental constraints. A simple case study shows its practicability as well as the differences between this framework and other frameworks integrating environmental aspects. The proposed framework is widely applicable for assessing the interaction between traffic and its environmental aspects.
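The assessment criterion amounts to comparing the best achievable objective value with and without the hard constraint. The following toy sketch makes that concrete over a finite set of control settings; the objective and emission functions are invented placeholders, whereas the paper solves a full bi-level program over real traffic control measures.

```python
def optimality_loss(objective, emission, controls, cap):
    """Loss of optimality caused by imposing emission(u) <= cap."""
    best_free = min(objective(u) for u in controls)
    feasible = [u for u in controls if emission(u) <= cap]
    best_constrained = min(objective(u) for u in feasible)
    return best_constrained - best_free

# toy network: delay is minimized at control setting u=3, emissions grow with u
delay = lambda u: (u - 3) ** 2
emission = lambda u: u
print(optimality_loss(delay, emission, controls=range(6), cap=2))  # 1
```

In the toy example the unconstrained optimum (u=3) violates the cap, so the best feasible setting is u=2 and the constraint costs one unit of delay.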
94.
In this paper we introduce a new variant of shape differentiation adapted to the deformation of shapes along their normal direction. This is typically the case in the level-set method for shape optimization, where the shape evolves with a normal velocity. Like all other variants of the original Hadamard method of shape differentiation, our approach yields the same first-order derivative. However, the Hessian, or second-order derivative, is different and somewhat simpler, since only normal movements are allowed. The applications of this new Hessian formula are twofold. First, it leads to a novel extension method for the normal velocity used in the Hamilton-Jacobi equation of front propagation. Second, as could be expected, it is the basis of a Newton optimization algorithm that is conceptually simpler, since no tangential displacements have to be considered. Numerical examples illustrate the potential of these two applications. The key technical tool for our approach is the method of bicharacteristics for solving Hamilton-Jacobi equations; our new idea is to differentiate the shape along these bicharacteristics (a system of two ordinary differential equations).
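For context, the "normal velocity" setting referred to above is, in the standard level-set formulation (textbook material, not a formula quoted from the paper), the Hamilton-Jacobi advection equation

\[ \frac{\partial \phi}{\partial t} + V\,|\nabla \phi| = 0, \]

where the shape is the zero level set of \( \phi \) and \( V \) is the normal velocity. The bicharacteristics of a Hamilton-Jacobi equation with Hamiltonian \( H(x,p) \) are the solutions of the ODE system

\[ \dot{x} = \nabla_p H(x,p), \qquad \dot{p} = -\nabla_x H(x,p), \]

which is the "system of two ordinary differential equations" along which the authors propose to differentiate the shape.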
95.
96.
97.
A quartz crystal viscometer has been developed for measuring the viscosity of liquids under pressure. It employs an AT-cut quartz crystal resonator with a fundamental frequency of 3 MHz, inserted in a variable-volume vessel designed to work at up to 80 MPa. Viscosity is determined by two methods, from resonance frequency and bandwidth measurements at up to eight different overtones. The resonance frequency allows an absolute measurement of the viscosity but limits the accuracy to 5%, whereas the bandwidth technique, which works in a relative way, provides an accuracy of 2%. The techniques were tested by carrying out measurements on two pure compounds, heptane and toluene. The results demonstrate the feasibility of the technique in this viscosity range. The apparatus was also used to determine the viscosity of n-decane with dissolved methane; the results obtained with these mixtures show the applicability of the apparatus to the study of reservoir fluids.
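As background for the resonance-frequency method, a standard result for a thickness-shear quartz resonator immersed in a Newtonian liquid (stated here for context, not a formula quoted from the paper) is the Kanazawa-Gordon relation linking the frequency shift to the liquid's viscosity-density product:

\[ \Delta f = - f_0^{3/2} \sqrt{\frac{\rho_L\,\eta_L}{\pi\,\rho_q\,\mu_q}}, \]

where \( f_0 \) is the fundamental frequency, \( \rho_L \) and \( \eta_L \) are the density and viscosity of the liquid, and \( \rho_q \) and \( \mu_q \) are the density and shear modulus of quartz. Measuring \( \Delta f \) thus yields \( \eta_L \) absolutely once \( \rho_L \) is known, consistent with the absolute method described above.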
98.
Edge matching puzzles have been with us for a long time, and they have traditionally been considered both a children's game and an interesting mathematical divertimento. Their main characteristics have already been studied, and their worst-case complexity has been properly classified as NP-complete. Recently, especially after one was used as the problem behind a money-prized contest offering US$2 million to the first solver, edge matching puzzles have attracted mainstream attention from wider audiences, including, of course, computer scientists working on solving hard problems. Beyond the intrinsic (i.e. monetary) interest of such a contest, we consider these competitions an interesting opportunity to showcase SAT/CSP solving techniques on a real-world problem to a broad audience. This article studies the NP-complete edge matching puzzle problem using SAT and CSP approaches. We focus first and foremost on providing a theoretical framework, including a generalized definition of the problem. We design and present algorithms for easy and fast generation of problem instances, with generators whose hardness is easily tunable. We then provide SAT and CSP models for the problem and study its complexity, both typical-case and worst-case. We also provide some specially crafted heuristics that yield a boost in solving time, and we study the effect of these heuristics.
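The instance-generation idea mentioned above can be sketched simply: colour every edge of a solved n-by-n board at random, read each tile's four edge colours off that board, and shuffle the tiles. This is an illustrative sketch only; the paper's generators additionally expose tunable hardness, and all names here are invented.

```python
import random

def generate_instance(n, colours, seed=0):
    """Generate a solvable n x n edge matching puzzle as a shuffled tile list."""
    rng = random.Random(seed)
    # horizontal edges: (n+1) rows of n edges; vertical edges: n rows of (n+1)
    h = [[rng.randrange(colours) for _ in range(n)] for _ in range(n + 1)]
    v = [[rng.randrange(colours) for _ in range(n + 1)] for _ in range(n)]
    # each tile records its (top, right, bottom, left) edge colours
    tiles = [(h[r][c], v[r][c + 1], h[r + 1][c], v[r][c])
             for r in range(n) for c in range(n)]
    rng.shuffle(tiles)  # forget the solution; solving means re-placing tiles
    return tiles

puzzle = generate_instance(4, colours=3)
print(len(puzzle))  # 16 tiles
```

By construction at least one valid placement exists, so such generators produce satisfiable instances whose difficulty can be steered through n and the number of colours.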
99.
100.
In the density classification problem, a binary cellular automaton (CA) should decide whether an initial configuration contains more 0s or more 1s. The answer is given when all cells of the CA agree on a given state. This problem is known to have no exact solution in the case of binary deterministic one-dimensional CA. We investigate how randomness in CA may help solve the problem. We analyse the behaviour of stochastic CA rules that perform the density classification task, and show that describing stochastic rules as a "blend" of deterministic rules allows us to derive quantitative results on the classification quality and the classification time of previously studied rules. We introduce a new rule whose effect is to spread defects and to wash them out. This stochastic rule solves the problem with arbitrary precision, that is, its quality of classification can be made arbitrarily high, though at the price of an increased convergence time. We experimentally demonstrate that this rule exhibits good scaling properties and that it attains classification qualities never reached before.
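The "blend" of deterministic rules described above can be sketched as follows: at each step, every cell independently applies deterministic local rule A with probability eta and rule B otherwise. The two toy local rules below are illustrative assumptions, not the specific defect-spreading rule introduced in the paper.

```python
import random

def step(config, rule_a, rule_b, eta, rng):
    """One synchronous update of a one-dimensional binary CA on a ring,
    where each cell applies rule_a with probability eta, else rule_b."""
    n = len(config)
    return [
        (rule_a if rng.random() < eta else rule_b)(
            config[(i - 1) % n], config[i], config[(i + 1) % n])
        for i in range(n)
    ]

majority = lambda l, c, r: 1 if l + c + r >= 2 else 0  # local majority vote
identity = lambda l, c, r: c                           # keep current state

rng = random.Random(42)
cfg = [rng.randint(0, 1) for _ in range(20)]
for _ in range(50):
    cfg = step(cfg, majority, identity, eta=0.8, rng=rng)
print(cfg)
```

Analysing such blends as interpolations between their deterministic components is what yields the quantitative results on classification quality and time mentioned in the abstract.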