91.
The computational complexity of scheduling jobs with release dates on an unbounded batch processing machine to minimize total completion time, and on parallel unbounded batch processing machines to minimize total weighted completion time, remains open. In this note we show that the first problem is NP-hard with respect to id-encoding, and that the second is strongly NP-hard.
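To make the objective concrete, here is a minimal sketch (illustrative only, not the paper's construction) of evaluating total completion time for a given batching of jobs with release dates on an unbounded batch processing machine:

```python
def total_completion_time(batches):
    """Total completion time on an unbounded batch processing machine.

    `batches` is an ordered list of batches; each batch is a list of
    (release_time, processing_time) jobs.  All jobs in a batch start
    together, and the batch's processing time is the max over its jobs.
    """
    t = 0.0      # current machine time
    total = 0.0
    for batch in batches:
        start = max(t, max(r for r, _ in batch))    # wait for all releases
        finish = start + max(p for _, p in batch)   # unbounded capacity
        total += finish * len(batch)                # every job in the batch completes at `finish`
        t = finish
    return total

# Two jobs, batched together vs. run in separate batches:
jobs = [(0, 2), (1, 3)]
print(total_completion_time([jobs]))                   # → 8.0
print(total_completion_time([[jobs[0]], [jobs[1]]]))   # → 7.0
```

Even on this tiny instance, the batching decision changes the objective, which hints at why the complexity question is delicate.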
93.
We propose a new link metric called normalized advance (NADV) for geographic routing in multihop wireless networks. NADV selects neighbors with the optimal trade-off between proximity and link cost. Coupled with the local next-hop decision in geographic routing, NADV enables an adaptive and efficient cost-aware routing strategy. Depending on the objective or message priority, applications can use the NADV framework to minimize various types of link cost. We present efficient methods for link cost estimation and perform detailed experiments in simulated environments. Our results show that NADV outperforms current schemes in many aspects: for example, in high-noise environments with frequent packet losses, the use of NADV leads to an 81% higher delivery ratio. When compared to centralized routing under certain settings, geographic routing using NADV finds paths whose cost is close to the optimum. We also conducted experiments on the Emulab testbed, and the results demonstrate that our proposed approach performs well in practice.
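As an illustration of the trade-off NADV captures, the sketch below scores each neighbor by its advance toward the destination weighted by link delivery probability. This particular scoring, and all positions and probabilities, are assumptions for illustration only, not the paper's exact metric or cost estimator:

```python
import math

def advance(cur, nbr, dst):
    """Reduction in Euclidean distance to the destination via this neighbor."""
    return math.dist(cur, dst) - math.dist(nbr, dst)

def best_neighbor(cur, dst, neighbors):
    """Pick the neighbor with the best advance-per-cost trade-off.

    `neighbors` maps position -> link delivery probability.  Here the
    normalized advance is taken as advance x delivery probability
    (advance per transmission attempt) -- an illustrative instantiation.
    """
    scored = {n: advance(cur, n, dst) * p for n, p in neighbors.items()}
    return max(scored, key=scored.get)

cur, dst = (0.0, 0.0), (10.0, 0.0)
neighbors = {(4.0, 0.0): 0.5,   # more advance, but lossy link
             (2.0, 0.0): 0.9}   # less advance, but reliable link
print(best_neighbor(cur, dst, neighbors))   # → (4.0, 0.0)
```

Pure greedy forwarding would always pick the farthest-forward neighbor; weighting by link quality changes the choice whenever the extra advance does not pay for the extra loss.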
94.
This paper investigates and presents conditions that guarantee disturbance-decoupled fault reconstruction using sliding mode observers. These conditions are less stringent than those of previous work and show that disturbance reconstruction is not necessary. An aircraft model validates the ideas proposed in this paper. Copyright © 2010 John Wiley and Sons Asia Pte Ltd and Chinese Automatic Control Society
95.
This paper studies a multi-goal Q-learning algorithm for cooperative teams. Each member of a cooperative team is simulated by an agent. In the virtual cooperative team, agents adapt their knowledge according to cooperative principles. The multi-goal Q-learning algorithm addresses multiple learning goals. In the virtual team, agents learn what knowledge to adopt and how much to learn (by choosing a learning radius). The learning radius is explained in Section 3.1. Five basic experiments are conducted to validate the multi-goal Q-learning algorithm. It is found that the learning algorithm causes agents to converge to optimal actions, based on the agents' continually updated cognitive maps of how actions influence learning goals. It is also shown that the learning algorithm benefits the multiple goals. Furthermore, the paper analyzes how sensitive the learning performance is to the parameter values of the learning algorithm.
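A minimal tabular sketch of the general idea: several goal rewards are scalarized into one Q-learning update on a toy 5-state chain. The chain, the rewards, and the weighted-sum combination are illustrative assumptions, not the paper's algorithm or its learning-radius mechanism:

```python
import random

def q_learning(goals, weights, episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning with a weighted sum of several goal rewards.

    Toy 5-state chain; action 0 moves left, action 1 moves right.
    `goals` is a list of per-goal reward functions r(state).
    """
    random.seed(0)
    Q = {(s, a): 0.0 for s in range(5) for a in (0, 1)}
    for _ in range(episodes):
        s = 0
        for _ in range(20):
            if random.random() < eps:                       # explore
                a = random.choice((0, 1))
            else:                                           # exploit
                a = max((0, 1), key=lambda act: Q[(s, act)])
            s2 = max(0, min(4, s + (1 if a == 1 else -1)))
            r = sum(w * g(s2) for w, g in zip(weights, goals))
            Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, 0)], Q[(s2, 1)]) - Q[(s, a)])
            s = s2
    return Q

# Two goals: reach state 4, and avoid state 0.
Q = q_learning([lambda s: 1.0 if s == 4 else 0.0,
                lambda s: -1.0 if s == 0 else 0.0], [1.0, 0.5])
policy = [max((0, 1), key=lambda a: Q[(s, a)]) for s in range(5)]
print(policy)   # greedy action per state
```

With both goals pulling in the same direction here, the greedy policy moves right toward the rewarding end of the chain; the weights let conflicting goals be traded off.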
96.
A method for analyzing production systems by applying multi-objective optimization and data mining techniques to discrete-event simulation models, called Simulation-based Innovization (SBI), is presented in this paper. The aim of the SBI analysis is to reveal insight into the parameters that affect the performance measures, as well as to gain a deeper understanding of the problem, through post-optimality analysis of the solutions acquired from multi-objective optimization. This paper provides empirical results from an industrial case study, carried out on an automotive machining line, in order to explain the SBI procedure. The SBI method has been found to be particularly suitable in this case study, as the three objectives under study, namely total tardiness, makespan and average work-in-process, are in conflict with each other. Depending on the system load of the line, different decision variables have been found to be influential. How the SBI method is used to find important patterns in the explored solution set, and how it can support decision making to improve scheduling under different system loadings in the machining line, are also addressed.
97.
This paper presents an industrial application of simulation-based optimization (SBO) to the scheduling and real-time rescheduling of a complex machining line at an automotive manufacturer in Sweden. Apart from generating schedules that are robust and adaptive, the scheduler must be able to carry out rescheduling in real time in order to cope with system uncertainty effectively. A real-time scheduling system is therefore needed to support not only the work of the production planner but also the operators on the shop floor, by re-generating feasible schedules when required. This paper describes such a real-time scheduling system, which is in essence an SBO system integrated with the shop-floor database system. The scheduling system, called the OPTIMISE scheduling system (OSS), uses real-time data from the production line and sends expert suggestions directly back to the operators through Personal Digital Assistants (PDAs). The user interface helps in generating new schedules, enables the users to easily monitor production progress through visualization of production status, and allows them to forecast and display target performance measures. Initial results from this industrial application have shown that such a novel scheduling system can help improve line throughput efficiently while simultaneously supporting real-time decision making.
98.
In this paper we investigate the problem of Simultaneous Localization and Mapping (SLAM) for a multi-robot system. Relaxing some assumptions that characterize related work, we propose an application of Rao-Blackwellized Particle Filters (RBPF) for cooperatively estimating the SLAM posterior. We consider a realistic setup in which the robots start from unknown initial poses (relative locations are unknown too) and travel through the environment in order to build a shared representation of it. The robots are required to exchange a small amount of information only when a rendezvous event occurs, and to measure relative poses during the meeting. As a consequence, the approach also applies when using an unreliable wireless channel or short-range communication technologies (Bluetooth, RFID, etc.). Moreover, it allows the uncertainty in relative pose measurements to be taken into account. The proposed technique, which constitutes a distributed solution to the multi-robot SLAM problem, is validated through simulations and experimental tests.
99.
Reliability analysis and optimal version-updating for open source software

Context

Although reliability is a major concern of most open source projects, research on this problem has attracted attention only recently. In addition, optimal version-updating for open source software, considering its special properties, has not yet been discussed.

Objective

In this paper, the reliability analysis and optimal version-updating for open source software are studied.

Method

A modified non-homogeneous Poisson process model is developed for open source software reliability modeling and analysis. Based on this model, optimal version-updating for open source software is investigated as well. In the decision process, the rapid-release strategy and the level of reliability are the two most important factors. However, they essentially contradict each other. In order to consider these two conflicting factors simultaneously, a new decision model based on multi-attribute utility theory is proposed.

Results

Our models are tested on real-world data sets from two well-known open source projects: Apache and GNOME. It is found that traditional software reliability models overestimate the reliability of open source software. In addition, the proposed decision model can help management make a rational decision on optimal version-updating for open source software.

Conclusion

Empirical results reveal that the proposed model for open source software reliability can describe the failure process more accurately. Furthermore, the proposed decision model can assist management in appropriately determining the optimal version-update time for open source software.
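As a hedged illustration of the NHPP framework such models build on, the sketch below uses the classic Goel-Okumoto mean value function m(t) = a(1 − e^(−bt)) as a stand-in; the paper's modified model differs, and the parameter values here are made up:

```python
import math

def go_mean_value(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a(1 - e^{-b t})."""
    return a * (1.0 - math.exp(-b * t))

def expected_remaining(t, a, b):
    """Expected faults still latent at time t under the same model."""
    return a - go_mean_value(t, a, b)

def reliability(t, x, a, b):
    """P(no failure in (t, t+x]) = exp(-(m(t+x) - m(t)))."""
    return math.exp(-(go_mean_value(t + x, a, b) - go_mean_value(t, a, b)))

# Illustrative parameters: a = total expected faults, b = fault detection rate.
a, b = 100.0, 0.05
print(round(expected_remaining(30.0, a, b), 1))   # → 22.3
print(round(reliability(30.0, 5.0, a, b), 3))     # → 0.007
```

A version-update decision can then weigh the reliability reached by time t against the cost of delaying the release, which is where a multi-attribute utility model comes in.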
100.
Web data transmitted over Internet channels in excessive volumes causes data-processing problems, including the need to selectively choose useful information to be retained for various data applications. In this paper, we present an approach for filtering less-informative attribute data from a source Website. A scheme for filtering attributes, instead of tuples (records), from a Website becomes imperative, since filtering a complete tuple would remove some informative, as well as less-informative, attribute data in the tuple. Since filtered data at the source Website may be of interest to the user at the destination Website, we design a data recovery approach that maintains the minimal amount of information needed for data recovery while imposing minimal overhead at the source Website. Our data filtering and recovery approach (1) handles a wide range of Web data in different application domains (such as weather, stock exchanges, Internet traffic, etc.), (2) is dynamic in nature, since each filtering scheme adjusts the amount of data to be filtered as needed, and (3) is adaptive, which is appealing in an ever-changing Internet environment.
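A minimal sketch of the attribute-filtering idea, using per-attribute variance as a hypothetical stand-in for the paper's informativeness test, and keeping the last observed value of each dropped attribute as minimal recovery information:

```python
from statistics import pvariance

def filter_attributes(tuples, threshold):
    """Drop numeric attributes whose variance falls below `threshold`.

    Returns the filtered tuples plus, for recovery purposes, the last
    observed value of each dropped attribute.  The variance test is an
    illustrative assumption, not the paper's actual criterion.
    """
    attrs = tuples[0].keys()
    kept, recovery = [], {}
    for a in attrs:
        values = [row[a] for row in tuples]
        if pvariance(values) < threshold:
            recovery[a] = values[-1]      # minimal recovery information
        else:
            kept.append(a)
    filtered = [{a: row[a] for a in kept} for row in tuples]
    return filtered, recovery

rows = [{"temp": 21.0, "pressure": 1013.0},
        {"temp": 27.0, "pressure": 1013.1},
        {"temp": 18.0, "pressure": 1012.9}]
filtered, rec = filter_attributes(rows, threshold=1.0)
print(sorted(filtered[0]))   # → ['temp']
print(rec)                   # → {'pressure': 1012.9}
```

Filtering per attribute keeps the volatile, informative column (`temp`) while shedding the near-constant one (`pressure`), whereas dropping whole tuples would discard both together.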