  Fee-based full text   30863 articles
  Free access   1240 articles
  Free domestic access   113 articles
Electrical Engineering   405 articles
General   410 articles
Chemical Industry   5216 articles
Metalworking Technology   1307 articles
Machinery and Instruments   653 articles
Building Science   1119 articles
Mining Engineering   105 articles
Energy and Power   979 articles
Light Industry   3998 articles
Hydraulic Engineering   258 articles
Petroleum and Natural Gas   169 articles
Weapons Industry   7 articles
Radio and Electronics   2424 articles
General Industrial Technology   4431 articles
Metallurgical Industry   6379 articles
Atomic Energy Technology   302 articles
Automation Technology   4054 articles
  2023   139 articles
  2022   398 articles
  2021   783 articles
  2020   545 articles
  2019   603 articles
  2018   694 articles
  2017   777 articles
  2016   728 articles
  2015   562 articles
  2014   868 articles
  2013   1535 articles
  2012   1283 articles
  2011   1645 articles
  2010   1208 articles
  2009   1273 articles
  2008   1142 articles
  2007   999 articles
  2006   1025 articles
  2005   923 articles
  2004   965 articles
  2003   853 articles
  2002   885 articles
  2001   740 articles
  2000   577 articles
  1999   563 articles
  1998   2075 articles
  1997   1353 articles
  1996   1045 articles
  1995   759 articles
  1994   605 articles
  1993   669 articles
  1992   345 articles
  1991   384 articles
  1990   326 articles
  1989   275 articles
  1988   239 articles
  1987   189 articles
  1986   192 articles
  1985   200 articles
  1984   154 articles
  1983   99 articles
  1982   130 articles
  1981   147 articles
  1980   141 articles
  1979   107 articles
  1978   85 articles
  1977   152 articles
  1976   243 articles
  1975   95 articles
  1973   65 articles
Sort order: 10000 query results found in total; search time: 281 ms
971.
This article tackles a practical three-dimensional packing problem in which a number of cartons of diverse sizes are to be packed into a bin with fixed length and width but open height. Each carton may be packed in any of the six axis-aligned orientations, with the carton edges parallel to the bin edges. Allowing variable carton orientations exponentially enlarges the solution space and makes the problem very challenging to solve. The study first devises a packing procedure that converts an arbitrary sequence of cartons into a compact packing solution, and then develops an improved genetic algorithm (IGA) to evolve a set of solutions. Moreover, a novel global search framework (GSF), built on the concept of an evolutionary gradient, is proposed to further improve solution quality. Numerical experiments indicate that IGA provides faster and better results, and that GSF performs especially well on relatively large and heterogeneous instances. Applying the proposed algorithms to benchmark cases of the three-dimensional strip packing problem also shows that they are robust and effective compared with existing methods in the literature.  Similar articles
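The paper's own packing procedure, IGA and GSF cannot be reconstructed from this abstract, so the following Python sketch only illustrates the general sequence-based approach under stated assumptions: a permutation of cartons is decoded by a naive row/layer heuristic into a stack height, and a simple genetic loop with swap mutation evolves the permutation. The carton list, bin footprint, decoder and GA parameters are all invented for the example.

```python
import random

# Hypothetical instance: carton (length, width, height) triples; the bin has a fixed
# footprint and an open height, as in the abstract.
CARTONS = [(4, 3, 2), (5, 2, 2), (3, 3, 3), (6, 4, 2), (2, 2, 5), (4, 4, 1)]
BIN_L, BIN_W = 10, 8

def orientations(carton):
    l, w, h = carton
    # The six axis-aligned orientations of a carton.
    return {(l, w, h), (l, h, w), (w, l, h), (w, h, l), (h, l, w), (h, w, l)}

def decode(seq):
    """Naive decoder: place cartons left to right in rows, rows front to back in a
    layer, and start a new layer when the footprint is full; return the stack height."""
    x = y = z = 0
    row_depth = layer_height = 0
    for idx in seq:
        l, w, h = min(orientations(CARTONS[idx]))   # crude orientation choice
        if x + l > BIN_L:                           # start a new row
            x, y, row_depth = 0, y + row_depth, 0
        if y + w > BIN_W:                           # start a new layer
            x, y, z = 0, 0, z + layer_height
            row_depth = layer_height = 0
        x += l
        row_depth = max(row_depth, w)
        layer_height = max(layer_height, h)
    return z + layer_height                         # total height, to be minimised

def evolve(pop_size=30, generations=200):
    n = len(CARTONS)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=decode)
        parents = pop[: pop_size // 2]
        children = []
        for p in parents:                           # swap mutation on the packing sequence
            child = p[:]
            i, j = random.sample(range(n), 2)
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = parents + children
    best = min(pop, key=decode)
    return best, decode(best)

best_seq, height = evolve()
print("best packing sequence:", best_seq, "-> stack height:", height)
```

A serious implementation would add crossover, an orientation search and free-space management; those refinements are precisely where the paper's IGA and GSF contribute.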
972.
In multi-agent systems, the study of language and communication is an active field of research. In this paper we apply evolutionary strategies to the self-emergence of a common lexicon in a population of agents. By modeling the vocabulary or lexicon of each agent as an association matrix or look-up table that maps meanings (i.e. the objects encountered by the agents, or the states of the environment itself) to symbols or signals, we check whether the population can converge, in an autonomous and decentralized way, to a common lexicon such that the communication efficiency of the entire population is optimal. We conducted several experiments aimed at testing whether evolutionary strategies can converge to an optimal Saussurean communication system, organized along two main lines: first, we investigated the effect of the population size on the convergence results; second, and foremost, we investigated the effect of the lexicon size. To analyze the convergence of the population of agents, we define consensus as the state in which all agents (i.e. 100% of the population) share the same association matrix or lexicon. As a general conclusion, we show that evolutionary strategies are powerful enough optimizers to guarantee convergence to lexicon consensus in a population of autonomous agents.  Similar articles
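As a rough, assumption-laden illustration of this setting (not the authors' exact evolutionary strategy), the Python sketch below replaces the full association matrix with a deterministic meaning-to-signal map per agent, scores agents by pairwise communicative success, applies a (1+1)-style mutation step per agent, and reports the final consensus level; all sizes and the decoding rule are invented for the example.

```python
import random

N_AGENTS, N_MEANINGS, N_SIGNALS, GENERATIONS = 10, 5, 5, 300

def random_lexicon():
    # Deterministic lexicon: entry m is the signal the agent emits for meaning m.
    return [random.randrange(N_SIGNALS) for _ in range(N_MEANINGS)]

def payoff(speaker, hearer):
    """Communicative success: the hearer decodes a signal as the first meaning it
    would itself encode with that signal (a crude stand-in for matrix decoding)."""
    hits = 0
    for m in range(N_MEANINGS):
        signal = speaker[m]
        decoded = next((k for k in range(N_MEANINGS) if hearer[k] == signal), None)
        hits += decoded == m
    return hits / N_MEANINGS

def fitness(i, pop):
    return sum(payoff(pop[i], pop[j]) + payoff(pop[j], pop[i])
               for j in range(N_AGENTS) if j != i)

def consensus(pop):
    counts = {}
    for lex in pop:
        counts[tuple(lex)] = counts.get(tuple(lex), 0) + 1
    return max(counts.values()) / N_AGENTS

pop = [random_lexicon() for _ in range(N_AGENTS)]
for _ in range(GENERATIONS):
    for i in range(N_AGENTS):
        old_lex, old_fit = pop[i], fitness(i, pop)
        child = old_lex[:]
        child[random.randrange(N_MEANINGS)] = random.randrange(N_SIGNALS)
        pop[i] = child
        if fitness(i, pop) < old_fit:   # (1+1)-style selection: keep the child unless worse
            pop[i] = old_lex
print("consensus level:", consensus(pop))
```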
973.
A supervised learning algorithm for quantum neural networks (QNN), based on a novel quantum neuron node implemented as a very simple quantum circuit, is proposed and investigated. In contrast to the QNNs published in the literature, the proposed model can both perform quantum learning and simulate classical models. This is partly because the neural models used elsewhere rely on weights and non-linear activation functions. Here, a quantum weightless neural network model is proposed as a quantisation of the classical weightless neural networks (WNN), so that the theoretical and practical results on WNN can be inherited by these quantum weightless neural networks (qWNN). In the quantum learning algorithm proposed here, the patterns of the training set are presented concurrently in superposition. This superposition-based learning algorithm (SLA) has computational cost polynomial in the number of patterns in the training set.  Similar articles
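The quantum circuits themselves cannot be reconstructed from this abstract, but the classical weightless (RAM-node) networks being quantised are easy to sketch. Below is a minimal, hypothetical WiSARD-style discriminator in Python, in which each node addresses a small RAM with a fixed tuple of input bits; the glyph patterns and tuple size are illustrative only.

```python
import random

class RAMDiscriminator:
    """Minimal classical weightless (RAM-node) discriminator: each node addresses a
    small RAM with a fixed tuple of input bits (WiSARD style)."""

    def __init__(self, input_bits, tuple_size, seed=0):
        rng = random.Random(seed)
        bits = list(range(input_bits))
        rng.shuffle(bits)
        # Partition the shuffled input bits into one address tuple per RAM node.
        self.tuples = [bits[i:i + tuple_size] for i in range(0, input_bits, tuple_size)]
        self.rams = [set() for _ in self.tuples]

    def _addresses(self, x):
        for tup, ram in zip(self.tuples, self.rams):
            yield tuple(x[b] for b in tup), ram

    def train(self, x):
        for addr, ram in self._addresses(x):
            ram.add(addr)                       # write a 1 at the addressed location

    def score(self, x):
        return sum(addr in ram for addr, ram in self._addresses(x))

# Toy usage on hypothetical 4x3 binary glyphs, flattened row by row.
zero = [1, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 1]
one  = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
d0, d1 = RAMDiscriminator(12, 3), RAMDiscriminator(12, 3)
d0.train(zero)
d1.train(one)
probe = [0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0]    # a noisy "1"
print("classified as:", 0 if d0.score(probe) > d1.score(probe) else 1)
```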
974.
In computer vision, camera calibration is a necessary step whenever information such as angles and distances must be retrieved. This paper addresses the multi-camera calibration problem with a one-dimensional calibration pattern under general motions. The known algorithms for this problem are based on the estimation of vanishing points, but that estimation is highly susceptible to noise, which makes such methods unsuitable for practical applications. Instead, this paper presents a new calibration algorithm in which the cameras are divided into binocular sets. The fundamental matrix of each binocular set is estimated, allowing a projective calibration of each camera to be performed; the calibration is then upgraded to Euclidean space, completing the process. Calibration is possible without imposing any restrictions on the movement of the pattern and without any prior information about the cameras or the motion. Experiments on synthetic and real images validate the new method and show that its accuracy makes it suitable for practical applications as well.  Similar articles
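A hedged sketch of the binocular step on synthetic data, using OpenCV and NumPy: point correspondences of a camera pair give the fundamental matrix, from which a canonical projective camera pair P1' = [I | 0], P2' = [[e']_x F | e'] is built. The intrinsics, motion and point cloud are invented for the example, and the subsequent Euclidean upgrade is not shown.

```python
import numpy as np
import cv2

# Synthetic two-view setup; intrinsics, motion and points are illustrative only.
rng = np.random.default_rng(0)
X = np.c_[rng.uniform(-1, 1, (50, 2)), rng.uniform(4, 8, 50), np.ones(50)]  # homogeneous 3D points
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
R = cv2.Rodrigues(np.array([0.0, 0.2, 0.0]))[0]            # second camera rotated about Y
P2 = K @ np.hstack([R, np.array([[-1.0], [0.0], [0.0]])])

def project(P, Xh):
    x = (P @ Xh.T).T
    return (x[:, :2] / x[:, 2:]).astype(np.float32)

pts1, pts2 = project(P1, X), project(P2, X)

# Fundamental matrix of the binocular pair from point correspondences.
F, _ = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)

# Canonical projective pair: P1' = [I | 0], P2' = [[e']_x F | e'], with e' the
# epipole in the second view (the null vector of F^T).
_, _, Vt = np.linalg.svd(F.T)
e2 = Vt[-1]
e2_x = np.array([[0, -e2[2], e2[1]], [e2[2], 0, -e2[0]], [-e2[1], e2[0], 0]])
P1_proj = np.hstack([np.eye(3), np.zeros((3, 1))])
P2_proj = np.hstack([e2_x @ F, e2.reshape(3, 1)])
print("projective camera for the second view:\n", np.round(P2_proj, 3))
```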
975.
Business processes have become one of the key assets of organizations, since they allow organizations to discover and control what occurs in their environments, with information systems automating most of an organization's processes. Unfortunately, as a result of uncontrolled maintenance, information systems age over time until it becomes necessary to replace them with new, modernized systems. While systems are aging, however, meaningful business knowledge that is not present in any of the organization's other assets gradually becomes embedded in them. Preserving this knowledge by recovering the underlying business processes is therefore a critical problem. This paper provides, as a solution to this problem, a model-driven procedure for recovering business processes from legacy information systems. The procedure proposes a set of models at different abstraction levels, along with the model transformations between them. The paper also provides a supporting tool, which facilitates its adoption. Moreover, a real-life case study concerning an e-government system applies the proposed recovery procedure to validate its effectiveness and efficiency. The case study was carried out by following a formal protocol to improve its rigor and replicability. Copyright © 2011 John Wiley & Sons, Ltd.  Similar articles
976.
In a paper in this journal, Kuss and McLerran provide SAS code for the estimation of multinomial logistic models for correlated data. Their motivation derived from two papers that recommended estimating such models with a Poisson likelihood, which, according to Kuss and McLerran, is "statistically correct but computationally inefficient". Kuss and McLerran propose several estimation methods, some of which rely on the fact that the multinomial model is a multivariate binary model; a procedure proposed by Wright is then exploited to fit the models. In this paper we show that these new computational methods, based on Wright's approach, are statistically incorrect because they do not take into account that multinomial data require a multivariate link function. An alternative estimation strategy using the clustered bootstrap is proposed.  Similar articles
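The clustered bootstrap itself is straightforward to illustrate generically. The sketch below uses Python with statsmodels rather than SAS, and simulated data rather than the authors' example: a multinomial logit is fitted, and standard errors are obtained by resampling whole clusters with replacement and refitting; every constant and the data-generating process are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical clustered data: 3 outcome categories, one covariate, cluster-level noise.
rng = np.random.default_rng(1)
n_clusters, cluster_size = 40, 10
cluster = np.repeat(np.arange(n_clusters), cluster_size)
x = rng.normal(size=n_clusters * cluster_size) \
    + np.repeat(rng.normal(scale=0.5, size=n_clusters), cluster_size)
lin = np.c_[np.zeros_like(x), 0.8 * x, -0.5 * x]            # linear predictors per category
prob = np.exp(lin) / np.exp(lin).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in prob])
df = pd.DataFrame({"y": y, "x": x, "cluster": cluster})

def fit_params(d):
    # Multinomial logit of y on an intercept and x; coefficients flattened to a vector.
    return sm.MNLogit(d["y"], sm.add_constant(d["x"])).fit(disp=0).params.to_numpy().ravel()

# Clustered bootstrap: resample whole clusters with replacement and refit each replicate.
B = 100
estimate = fit_params(df)
replicates = []
for _ in range(B):
    picked = rng.choice(n_clusters, size=n_clusters, replace=True)
    boot = pd.concat([df[df["cluster"] == c] for c in picked], ignore_index=True)
    replicates.append(fit_params(boot))
se = np.std(replicates, axis=0, ddof=1)
print("coefficients:        ", np.round(estimate, 3))
print("cluster-bootstrap SE:", np.round(se, 3))
```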
977.
This article presents a case study on the retrospective verification of the Linux Virtual File System (VFS), aimed at detecting violations of API usage rules and memory properties. Since the VFS maintains dynamic data structures and is written in a mixture of C and inline assembly, modern software model checkers cannot be applied. Our case study centres around our novel automated software verification tool, the SOCA Verifier, which symbolically executes and analyses compiled code. We describe how this verifier deals with complex features such as memory access, pointer aliasing and computed jumps in the VFS implementation, while reducing manual modelling to a minimum. Our results show that the SOCA Verifier is capable of analysing the complex Linux VFS implementation reliably and efficiently, thereby going beyond traditional testing tools and into niches that current software model checkers do not reach. This testifies to the SOCA Verifier's suitability as an effective and efficient bug-finding tool during the development of operating system components.  Similar articles
978.
This paper deals with supervised wrapper-based feature subset selection in datasets with a very large number of attributes. The recent literature contains numerous references to hybrid selection algorithms: based on a filter ranking, they perform an incremental wrapper selection over that ranking. Although these methods work well, they still have two problems: (1) depending on the complexity of the wrapper search method, the number of wrapper evaluations can still be too large; and (2) they rely on a univariate ranking that does not take into account the interaction between the variables already included in the selected subset and the remaining ones. Here we propose a new approach whose main goal is to drastically reduce the number of wrapper evaluations while maintaining good performance (e.g. accuracy and size of the obtained subset). To do this, we propose an algorithm that iteratively alternates between filter ranking construction and wrapper feature subset selection (FSS): the FSS uses only the first block of ranked attributes, and the ranking method uses the currently selected subset to build a new ranking in which this knowledge is taken into account. The algorithm terminates when no new attribute is selected in the last call to the FSS algorithm. The main advantage of this approach is that only a few blocks of variables are analyzed, so the number of wrapper evaluations decreases drastically. The proposed method is tested on eleven high-dimensional datasets (2,400-46,000 variables) using different classifiers. The results show an impressive reduction in the number of wrapper evaluations without degrading the quality of the obtained subset.  Similar articles
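A simplified sketch of the alternating scheme, under several stated assumptions: the re-ranking step is approximated by an mRMR-like score (relevance to the class minus redundancy with the already selected subset) rather than the paper's exact ranking, the wrapper is a Gaussian naive Bayes evaluated by cross-validation, and the block size and dataset are arbitrary.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Hypothetical high-dimensional dataset (the paper uses 2,400-46,000 variables).
X, y = make_classification(n_samples=300, n_features=200, n_informative=10, random_state=0)

def rank(remaining, selected):
    """Filter ranking that re-uses the current subset: an mRMR-like proxy
    (relevance to the class minus mean redundancy with the selected features)."""
    relevance = mutual_info_classif(X[:, remaining], y, random_state=0)
    if not selected:
        return [f for _, f in sorted(zip(-relevance, remaining))]
    redundancy = np.mean([mutual_info_regression(X[:, remaining], X[:, s], random_state=0)
                          for s in selected], axis=0)
    return [f for _, f in sorted(zip(-(relevance - redundancy), remaining))]

def cv_acc(features):
    return cross_val_score(GaussianNB(), X[:, features], y, cv=5).mean()

selected, remaining, block = [], list(range(X.shape[1])), 20
while remaining:
    added = False
    for f in rank(remaining, selected)[:block]:      # incremental wrapper on the first block only
        if cv_acc(selected + [f]) > (cv_acc(selected) if selected else 0.0):
            selected.append(f)
            remaining.remove(f)
            added = True
    if not added:                                    # stop when a full pass selects nothing
        break
print("selected features:", selected, "| CV accuracy:", round(cv_acc(selected), 3))
```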
979.
980.
Control of relative humidity and temperature is essential for the eggs in a bird incubator to hatch. Relative humidity is not easy to control directly, but a measure of it can be obtained as a consequence of controlling the temperature in the incubator. In this article: (1) a mathematical model for temperature control in the bird incubator is presented; (2) a functional network is proposed to approximate the relative-humidity behavior in the incubator; (3) a proportional controller for the temperature is proposed, and the error of this control applied to the temperature model is shown to be uniformly stable; and (4) comparison results of four classic control laws for temperature control, using the proposed temperature model and the functional network for the relative humidity, are presented.  Similar articles
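As a toy illustration only (the paper's incubator model, functional network and stability analysis are not reproduced here), the sketch below simulates a first-order thermal model under proportional control and reads a stand-in relative-humidity approximation off the temperature; every constant is an assumption.

```python
import numpy as np

# Illustrative first-order thermal model of the incubator, dT/dt = -a*(T - T_amb) + b*u,
# under proportional control u = Kp*(T_ref - T). All constants are assumptions.
a, b = 0.02, 0.5            # cooling rate and heater gain
T_amb, T_ref, Kp = 22.0, 37.5, 2.0
dt, steps = 1.0, 3600       # 1 s steps, one simulated hour

def humidity_approx(T):
    """Stand-in for the paper's functional-network approximation of relative humidity:
    here simply a decreasing affine function of temperature, clipped to [0, 100] %."""
    return float(np.clip(90.0 - 1.8 * (T - T_amb), 0.0, 100.0))

T = T_amb
for k in range(steps):
    u = max(0.0, Kp * (T_ref - T))          # heater power, saturated at zero (no active cooling)
    T += dt * (-a * (T - T_amb) + b * u)    # explicit Euler step of the thermal model
    if k % 600 == 0:
        print(f"t = {k:4d} s   T = {T:5.2f} C   RH ~ {humidity_approx(T):5.1f} %")
```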