Search results: 713 records in total (search time: 31 ms)
11.
A performance study of multiprocessor task scheduling algorithms   (total citations: 1; self-citations: 0; citations by others: 1)
Multiprocessor task scheduling is an important and computationally difficult problem. A large number of algorithms have been proposed, representing various trade-offs between the quality of the solution and the computational complexity and scalability of the algorithm. Previous comparison studies have frequently relied on simplifying assumptions, such as independent tasks, artificially generated problems, or zero communication delay. In this paper, we present a comparison study with realistic assumptions. Our target problems are two well-known problems of linear algebra: LU decomposition and Gauss–Jordan elimination. Both algorithms are naturally parallelizable but have heavy data dependencies. Communication delay is explicitly considered in the comparisons. In our study, we consider nine scheduling algorithms that, to the best of our knowledge, are among the most frequently used: min–min, chaining, A*, genetic algorithms, simulated annealing, tabu search, HLFET, ISH, and DSH with task duplication. Based on experimental results, we present a detailed analysis of the scalability, advantages, and disadvantages of each algorithm.
Damla Turgut
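To make the kind of algorithm compared in this abstract concrete, the sketch below implements a minimal HLFET-style list scheduler (one of the nine algorithms named above) that accounts for communication delays between processors. The task graph, computation costs, and communication costs are hypothetical examples; this is not the study's implementation.

```python
def static_level(task, succ, w):
    """Length of the longest computation-cost path from `task` to an exit task."""
    if not succ[task]:
        return w[task]
    return w[task] + max(static_level(s, succ, w) for s in succ[task])

def hlfet_schedule(tasks, succ, pred, w, comm, n_procs):
    """List scheduling by highest static level first, with communication delays."""
    levels = {t: static_level(t, succ, w) for t in tasks}
    finish = {}                                  # task -> (processor, finish time)
    proc_free = [0.0] * n_procs                  # earliest free time of each processor
    scheduled = set()
    while len(scheduled) < len(tasks):
        ready = [t for t in tasks
                 if t not in scheduled and all(p in scheduled for p in pred[t])]
        task = max(ready, key=lambda t: levels[t])   # highest static level first
        best_proc, best_start = None, None
        for p in range(n_procs):
            # Data from a predecessor on another processor arrives after a comm delay.
            data_ready = max((finish[q][1] + (0 if finish[q][0] == p else comm[(q, task)])
                              for q in pred[task]), default=0.0)
            start = max(proc_free[p], data_ready)
            if best_start is None or start < best_start:
                best_proc, best_start = p, start
        finish[task] = (best_proc, best_start + w[task])
        proc_free[best_proc] = best_start + w[task]
        scheduled.add(task)
    return finish

# Hypothetical 4-task graph: t0 feeds t1 and t2, which both feed t3.
tasks = ["t0", "t1", "t2", "t3"]
succ = {"t0": ["t1", "t2"], "t1": ["t3"], "t2": ["t3"], "t3": []}
pred = {"t0": [], "t1": ["t0"], "t2": ["t0"], "t3": ["t1", "t2"]}
w = {"t0": 2.0, "t1": 3.0, "t2": 4.0, "t3": 1.0}
comm = {("t0", "t1"): 1.0, ("t0", "t2"): 1.0, ("t1", "t3"): 2.0, ("t2", "t3"): 2.0}
print(hlfet_schedule(tasks, succ, pred, w, comm, n_procs=2))
```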
12.
Recent progress in energy harvesting technologies has made it possible to build sensor networks with rechargeable nodes that target indefinitely long operation. In these networks, the goal of energy management is to allocate the available energy such that the important performance metrics, such as the number of detected threats, are maximized. As the harvested energy is not sufficient for continuous operation, scheduling of the active and inactive time is one of the main components of energy management. Active time scheduling protocols need to maintain the energy equilibrium of the nodes while considering the uncertainties of the energy income, which is strongly influenced by the weather, and of the energy expenditure, which depends on the behavior of the targets. In this paper, we describe and experimentally compare three active time scheduling protocols: (a) static active time, (b) dynamic active time based on a multi-parameter heuristic, and (c) utility-based uniform sensing. We show that protocols which take into consideration probabilistic models of the energy income and expenditure, and which can dynamically adapt to changes in the environment, provide a significant performance advantage.
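As a concrete illustration of the simplest of these strategies, the sketch below computes a fixed duty cycle that keeps a node in energy equilibrium under an assumed average daily harvest. All power and harvest figures are hypothetical and are not taken from the paper.

```python
# Static active time baseline: pick a fixed duty cycle that balances the
# node's expected daily energy income against its consumption.
expected_daily_harvest_j = 600.0   # average energy harvested per day, in joules (assumed)
p_active_w = 0.05                  # power draw while sensing/communicating, in watts (assumed)
p_sleep_w = 0.001                  # power draw while asleep, in watts (assumed)
seconds_per_day = 24 * 3600

# Energy equilibrium: harvest = duty * P_active * T + (1 - duty) * P_sleep * T.
duty = (expected_daily_harvest_j / seconds_per_day - p_sleep_w) / (p_active_w - p_sleep_w)
duty = max(0.0, min(1.0, duty))    # clamp to a physically meaningful range
print(f"static active time: {duty:.1%} of each day ({duty * 24:.1f} hours)")
```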
13.
A class of audio-visual data (fiction entertainment: movies and TV series) is segmented into scenes containing dialogs using a novel hidden Markov model (HMM)-based method. Each shot is classified using both the audio track (via classification of speech, silence, and music) and the visual content (face and location information). The result of this shot-based classification is an audio-visual token to be used by the HMM state diagram to achieve scene analysis. After simulations with circular and left-to-right HMM topologies, it is observed that both perform very well with multi-modal inputs. Moreover, for the circular topology, comparisons between different training and observation sets show that audio and face information together gives the most consistent results among the different observation sets.
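To make the token-decoding step concrete, the sketch below runs standard Viterbi decoding over a sequence of per-shot audio-visual tokens to recover the most likely hidden state sequence. The states, token alphabet, and probabilities are hypothetical placeholders, not the paper's trained model or topology.

```python
import numpy as np

def viterbi(tokens, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path for the observed token sequence."""
    V = [{s: np.log(start_p[s]) + np.log(emit_p[s][tokens[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(tokens)):
        V.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda r: V[t - 1][r] + np.log(trans_p[r][s]))
            V[t][s] = V[t - 1][prev] + np.log(trans_p[prev][s]) + np.log(emit_p[s][tokens[t]])
            back[t][s] = prev
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(tokens) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Hypothetical two-state model: shots inside a dialog scene tend to emit
# "speech+face" tokens, while other shots tend to emit "music" or "silence".
states = ["dialog", "non_dialog"]
start_p = {"dialog": 0.5, "non_dialog": 0.5}
trans_p = {"dialog": {"dialog": 0.8, "non_dialog": 0.2},
           "non_dialog": {"dialog": 0.2, "non_dialog": 0.8}}
emit_p = {"dialog": {"speech+face": 0.7, "music": 0.1, "silence": 0.2},
          "non_dialog": {"speech+face": 0.2, "music": 0.5, "silence": 0.3}}
tokens = ["music", "speech+face", "speech+face", "silence", "music"]
print(viterbi(tokens, states, start_p, trans_p, emit_p))
```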
14.
IT outsourcing is a complex and opaque decision problem. Managers facing a decision about IT outsourcing have difficulty framing what needs to be thought about further in their discourses. Framing is one of the most crucial steps of human decision making and needs to be assisted in order to better understand a decision situation. In this research, we examine a number of decision primitives in the context of an IT outsourcing decision situation. We demonstrate how the decision primitives can be employed so that managers can probe deeply, better understand a decision situation, and establish a decision basis. In an organizational setting, we exemplify the use of the decision primitives in relation to the perceived outsourcing implications for managers looking for assistance in accommodating a knowledge management perspective on IT outsourcing. Consequently, we derive insights and a guideline on how to use knowledge management for effective outsourcing in one of the leading financial institutions in Europe.
Mehmet N. Aydin
15.
Coordination of multi-agent systems remains an open problem, since no prominent method offers a universal solution. Metaheuristic agents are a specific implementation of multi-agent systems in which the agents work together to solve optimisation problems using metaheuristic algorithms. An idea for coordinating metaheuristic agents, borrowed from swarm intelligence, is introduced in this paper. This swarm intelligence-based coordination framework has been implemented as swarms of simulated annealing agents collaborating through particle swarm optimization for the multidimensional knapsack problem. A comparative performance analysis is also reported, highlighting that the implementation produces much better results than previous work.
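For readers unfamiliar with the building blocks named here, the sketch below shows a single simulated annealing agent applied to a 0/1 multidimensional knapsack instance. It illustrates only the per-agent search, not the paper's swarm-based coordination or its particle swarm optimization layer; the instance data and cooling schedule are hypothetical.

```python
import math
import random

def mkp_value(x, profits, weights, capacities):
    """Total profit of selection x, or -inf if any capacity constraint is violated."""
    for row, cap in zip(weights, capacities):
        if sum(w * xi for w, xi in zip(row, x)) > cap:
            return float("-inf")
    return sum(p * xi for p, xi in zip(profits, x))

def sa_mkp(profits, weights, capacities, t0=10.0, cooling=0.999, steps=20_000, seed=0):
    """Single simulated annealing agent for the 0/1 multidimensional knapsack problem."""
    rng = random.Random(seed)
    n = len(profits)
    x = [0] * n                                   # start from the empty, feasible knapsack
    cur = best_val = mkp_value(x, profits, weights, capacities)
    best = x[:]
    t = t0
    for _ in range(steps):
        cand = x[:]
        cand[rng.randrange(n)] ^= 1               # flip one item in or out
        val = mkp_value(cand, profits, weights, capacities)
        if val > cur or rng.random() < math.exp((val - cur) / t):
            x, cur = cand, val
            if cur > best_val:
                best, best_val = x[:], cur
        t *= cooling
    return best, best_val

# Hypothetical instance: 6 items, 2 resource constraints.
profits = [10, 13, 7, 8, 11, 9]
weights = [[3, 4, 2, 3, 4, 2],        # consumption of resource 1 per item
           [2, 3, 3, 1, 2, 2]]        # consumption of resource 2 per item
capacities = [10, 8]
print(sa_mkp(profits, weights, capacities))
```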
16.
The influence of duplex surface treatments, consisting of a DC-pulsed plasma nitriding process and subsequent coatings of CrN and TiAlN deposited by physical vapor deposition (PVD), on AISI H13 tool steel was studied in this article. The treated samples were characterized using metallographic techniques, SEM, EDS, and microhardness methods. Hydro-abrasive erosion wear tests were performed in a specifically designed wear tester in which the samples were rotated at a fixed rotational speed in a wear tank containing a mixture of distilled water and ceramic abrasive chips. The wear rates caused by the abrasive particle impacts were assessed based on accumulated weight loss measurements. The worn surfaces were also characterized using optical microscopy, SEM, and EDS. Microhardness measurements indicated a significant increase in the surface hardness of the duplex-treated samples. The surfaces of the samples with the TiAlN coating were approximately 15 times harder than those of the untreated samples and 3 times harder than those of the plasma nitrided samples. The hydro-abrasive erosion wear results showed that the duplex surface treatments, especially the CrN coating, displayed the highest erosion wear resistance.
17.
In this study, a non-aqueous method in a simple one-pot reaction process was employed to synthesize nano-sized BaTiO3 particles, and an electrophoretic deposition technique was then employed for thin film coatings. In the first step of the preparation, metallic barium was directly dissolved in benzyl alcohol at slightly elevated temperatures. Titanium isopropoxide was then added, followed by a solvothermal treatment. At the end of the reaction, nearly spherical BaTiO3 nanoparticles, typically 5 nm in diameter, were obtained. After establishing the stability of the BaTiO3 suspension in ethanol, the electrophoretic deposition process was performed without any additional operation. Alumina with platinum plating was used as the substrate. To find the optimal process parameters, various voltages were applied while varying the cathode-to-anode distance as well as the deposition time. Application of high voltages was possible without causing hydrolysis because of the non-aqueous ethanol medium and the high surface charge of the nanoparticles. The deposited surface coatings were dried in air and sintered at various temperatures. SEM, EDX, and XRD analyses were employed to investigate the coatings.
18.
Predicting the amount of landfill gas (LFG) that will be recovered at a sanitary landfill is generally associated with a high level of uncertainty, which is primarily due to the uncertainty in the definition of the parameters that control the LFG generation rate and LFG transport. To quantify these uncertainties, a three-dimensional stochastic model for the generation and transport of LFG is proposed. Using Monte Carlo simulations, multiple realizations of key input parameters are generated. For each realization, LFG transport is simulated and then used to evaluate probabilistically the rates and efficiency of energy recovery. For demonstration, the stochastic model is applied to the Kemerburgaz landfill in Istanbul, Turkey. Uncertainty in the definition of three key parameters, namely the LFG production rate, the waste gas permeability, and the soil cover gas permeability, was accounted for. The modeling results suggest that the collection system is sufficient to capture most of the generated gas, but that uncertainty in the factors controlling LFG production is the main source of uncertainty in the amount of energy that will be recovered.
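The Monte Carlo idea described here can be illustrated with a much simpler model: the sketch below propagates parameter uncertainty through a first-order-decay LFG generation formula and reports percentiles of recovered gas. The distributions, waste mass, and collection efficiency are hypothetical, and the paper's full three-dimensional transport simulation is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n_realizations = 10_000
years = np.arange(1, 21)                 # years 1..20 after waste placement
waste_mass_t = 1.0e6                     # tonnes of waste placed in year 0 (assumed)

# Uncertain inputs: first-order decay rate k (1/yr), methane generation
# potential L0 (m^3 CH4 per tonne), and gas collection efficiency eta.
k = rng.lognormal(mean=np.log(0.05), sigma=0.3, size=n_realizations)
L0 = rng.normal(loc=100.0, scale=15.0, size=n_realizations)
eta = rng.uniform(0.6, 0.9, size=n_realizations)

# First-order decay generation: Q(t) = L0 * M * k * exp(-k t); fraction eta is collected.
Q = eta[:, None] * L0[:, None] * waste_mass_t * k[:, None] * np.exp(-k[:, None] * years)

total_recovered = Q.sum(axis=1)          # m^3 CH4 recovered over 20 years, per realization
print("median recovery:", np.median(total_recovered))
print("5th-95th percentile:", np.percentile(total_recovered, [5, 95]))
```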
19.
A new approach, called adaptive Q control, for tapping-mode atomic force microscopy (AFM) is introduced and implemented on a homemade AFM setup utilizing a laser Doppler vibrometer and a piezoactuated bimorph probe. In standard Q control, the effective Q factor of the scanning probe is adjusted prior to scanning, depending on the application. However, there is a trade-off in setting the effective Q factor of an AFM probe: the Q factor is either increased to reduce the tapping forces or decreased to increase the maximum achievable scan speed. Realizing these two benefits simultaneously is not possible with standard Q control. In adaptive Q control, the Q factor of the probe is set to an initial value as in standard Q control, but is then modified on the fly during scanning when necessary to achieve both benefits. In this article, we present the basic theory behind adaptive Q control, the electronics enabling online modification of the probe's effective Q factor, and the results of experiments comparing three different methods: scanning (a) without Q control, (b) with standard Q control, and (c) with adaptive Q control. The results show that the performance of adaptive Q control is superior to that of the other two methods.
20.
In this paper, a parallel implementation of the modular simulated annealing algorithm for classical job-shop scheduling is presented. The implementation is for a multi-agent system running on the distributed resource machine, a novel, scalable, distributed virtual machine based on Java technology. The problems tackled are well-known, difficult benchmarks, widely used to measure the efficiency of metaheuristics with respect to both the quality of the solutions and the central processing unit time. The empirical results obtained show that the proposed method is successful in comparison with a sequential version of the modular simulated annealing algorithm and with other methods described in the literature.
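The coarse-grained parallelism described here can be illustrated with independent annealing agents running in separate processes, with the best solution kept at the end. The sketch below uses a hypothetical surrogate objective in place of a real job-shop makespan decoder and Python's multiprocessing module instead of the Java-based distributed resource machine; only the coordination pattern is shown.

```python
from multiprocessing import Pool
import math
import random

def toy_makespan(perm, proc_times):
    """Hypothetical surrogate objective standing in for a job-shop makespan."""
    return sum((pos + 1) * proc_times[job] for pos, job in enumerate(perm))

def anneal_agent(seed, n_jobs=20, steps=50_000, t0=50.0, cooling=0.9995):
    """One annealing agent: simulated annealing over job permutations."""
    inst_rng = random.Random(1234)                 # same toy instance for all agents
    proc_times = [inst_rng.randint(1, 10) for _ in range(n_jobs)]
    rng = random.Random(seed)                      # agent-specific search randomness
    perm = list(range(n_jobs))
    rng.shuffle(perm)
    cur = best = toy_makespan(perm, proc_times)
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n_jobs), 2)        # propose a swap of two jobs
        perm[i], perm[j] = perm[j], perm[i]
        cand = toy_makespan(perm, proc_times)
        if cand < cur or rng.random() < math.exp((cur - cand) / t):
            cur = cand
            best = min(best, cur)
        else:
            perm[i], perm[j] = perm[j], perm[i]    # undo the rejected move
        t *= cooling
    return best

if __name__ == "__main__":
    with Pool(processes=4) as pool:                # one process per annealing agent
        bests = pool.map(anneal_agent, range(4))
    print("best objective found by any agent:", min(bests))
```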