Search results: 863 articles matched.
1.
A steelmaking-continuous casting (SCC) scheduling problem is an example of a complex hybrid flow shop scheduling problem (HFSSP) with a strong industrial background. This paper investigates the SCC scheduling problem with controllable processing times (CPT) and multiple objectives concerning total waiting time, earliness/tardiness, and adjustment cost. The SCC scheduling problem with CPT is seldom discussed in the existing literature. This study is motivated by the practical situation of a large integrated steel company in which just-in-time (JIT) and cost-cutting production strategies have become a significant concern. To address this complex HFSSP, the scheduling problem is decomposed into two subproblems: a parallel machine scheduling problem (PMSP) in the last stage and an HFSSP in the upstream stages. First, a hybrid differential evolution (HDE) algorithm combined with a variable neighborhood decomposition search (VNDS) is proposed for the former subproblem. Second, an iterative backward list scheduling (IBLS) algorithm is presented to solve the latter subproblem. The effectiveness of this bi-layer optimization approach is verified by computational experiments on well-designed and real-world scheduling instances. This study provides a new perspective on modeling and solving practical SCC scheduling problems.
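As a rough illustration of the hybrid differential evolution component, the sketch below runs a plain DE/rand/1/bin loop over random-key vectors that decode into machine assignments for a last-stage parallel-machine subproblem. It is a minimal sketch under stated assumptions: the job times, the makespan objective (standing in for the paper's waiting-time, earliness/tardiness, and adjustment-cost criteria), and all parameter values are illustrative, and the VNDS refinement is omitted.

```python
import random

# Toy stand-in objective: random keys in [0, 1) decode into machine
# assignments; we minimize makespan. The paper's real objective is richer.
JOBS = [5, 3, 8, 6, 2, 7, 4]   # illustrative processing times
MACHINES = 3

def makespan(keys):
    loads = [0.0] * MACHINES
    for key, p in zip(keys, JOBS):
        loads[int(key * MACHINES) % MACHINES] += p
    return max(loads)

def differential_evolution(pop_size=20, gens=100, F=0.5, CR=0.9):
    dim = len(JOBS)
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i, target in enumerate(pop):
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            # DE/rand/1 mutation with binomial crossover
            trial = [a[d] + F * (b[d] - c[d]) if random.random() < CR
                     else target[d] for d in range(dim)]
            trial = [min(max(v, 0.0), 0.999) for v in trial]  # keep keys in [0, 1)
            if makespan(trial) <= makespan(target):  # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=makespan)

best = differential_evolution()
print("best makespan:", makespan(best))
```

The random-key encoding is a common trick for letting a continuous optimizer such as DE search over discrete assignment decisions.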
2.
3.
《Journal of Power Sources》 2006, 158(1): 169-176
A fuel cell is a device that converts chemical energy directly into electricity. Among the various types of fuel cells, both polymer electrolyte membrane fuel cells (PEMFCs) and direct methanol fuel cells (DMFCs) can operate at low temperature (<80 °C), so they can be used to power commercial portable electronics such as laptop computers, digital cameras, PDAs, and cell phones. The focus of this paper is to investigate the performance of a miniaturized DMFC device that uses a micropump to deliver fuel. The core of this micropump is a piezoelectric ring-type bending actuator with an associated nozzle/diffuser for directing fuel flow. Experimental measurements show that the performance of the fuel cell can be significantly improved if sufficient fuel flow is induced by the micropump at the anode. Three factors may contribute to the performance enhancement: replenishment of methanol, reduced diffusion resistance, and removal of carbon dioxide. Compared with conventional mini pumps, the piezoelectric micropump is much smaller and consumes far less energy. It is therefore viable and effective to use a piezoelectric valveless micropump for fuel delivery in miniaturized DMFC power systems.
4.
Frequent Itemset Mining (FIM) is one of the most important data mining tasks and the foundation of many others. In the Big Data era, centralized FIM algorithms cannot meet the time and space demands of mining big data, so Distributed Frequent Itemset Mining (DFIM) algorithms have been designed to meet these challenges. This paper discusses Local-Global and Redistribution-Mining, the two main paradigms of DFIM algorithms, and proposes two algorithms of these paradigms, named LG and RM, on MapReduce, a popular distributed computing model; related work is also discussed. The experimental results show that the RM algorithm performs better in terms of computation and site scalability, and can serve as a basis for designing MapReduce-based DFIM algorithms. The paper also discusses the main ideas for improving DFIM algorithms based on MapReduce.
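The MapReduce structure behind such algorithms can be pictured as a mapper that emits local candidate-itemset counts from each site's database partition and a reducer that sums them into global supports. The toy sketch below imitates that pattern in-process; the partitions, the support threshold, and the restriction to 2-itemsets are assumptions for illustration, not the LG or RM algorithms themselves.

```python
from itertools import combinations
from collections import defaultdict

# Each "site" holds one partition of the transaction database.
partitions = [
    [{"a", "b", "c"}, {"a", "b"}, {"b", "c"}],
    [{"a", "c"}, {"a", "b", "c"}, {"b"}],
]
MIN_SUPPORT = 3  # global support threshold (illustrative)

def map_phase(transactions, k):
    """Emit (itemset, 1) pairs for every k-itemset in each local transaction."""
    for t in transactions:
        for itemset in combinations(sorted(t), k):
            yield itemset, 1

def reduce_phase(pairs):
    """Sum local counts into global supports, as a MapReduce reducer would."""
    counts = defaultdict(int)
    for itemset, c in pairs:
        counts[itemset] += c
    return {s: c for s, c in counts.items() if c >= MIN_SUPPORT}

all_pairs = (p for part in partitions for p in map_phase(part, 2))
print(reduce_phase(all_pairs))  # global supports of frequent 2-itemsets
```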
5.
6.
The problem of sensor deployment to achieve k-coverage of a field, where every point is covered by at least k sensors, is critical in the design of energy-efficient wireless sensor networks (WSNs). It becomes more challenging in mission-oriented WSNs, where sensors have to move in order to k-cover a region of interest in the field. In this type of network, there are multiple missions (or monitoring tasks) to be accomplished, each of which has different requirements, particularly in terms of coverage. In this paper, we consider the problem of k-coverage in mission-oriented mobile WSNs, which we divide into two sub-problems, namely sensor placement and sensor selection. The sensor placement problem is to identify a subset of sensors and their locations in a region of interest so that it is k-covered by a small number of sensors. The sensor selection problem is to determine which sensors should move to these computed locations while minimizing the total energy consumption due to sensor mobility and communication. Specifically, we propose centralized and distributed approaches to solve the k-coverage problem in mission-oriented mobile WSNs. Our solution to the sensor placement problem is based on Helly's Theorem and a geometric analysis of the Reuleaux triangle. First, we consider a deterministic (or disk) sensing model, where the sensing range is modeled as a disk. Then, based on this analysis, we address the k-coverage problem using a more realistic sensing model, known as the probabilistic sensing model, which reflects the stochastic nature of sensor characteristics, namely sensing and communication ranges. Our centralized and distributed protocols enable the sensors to move toward a region of interest and k-cover it with a small number of sensors. Our experiments show a good match between simulation and analytical results. In particular, simulation results show that our solution to the k-coverage problem in mission-oriented mobile WSNs outperforms an existing one in terms of the number of sensors needed to k-cover a region of interest and their total energy consumption due to communication, sensing, and mobility for the correct operation of the protocol.
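Under the deterministic disk sensing model, whether a sampled region is k-covered reduces to counting, at each sample point, the sensors whose sensing disks contain it. The snippet below sketches only this verification step with invented coordinates and ranges; the paper's placement construction via Helly's Theorem and the Reuleaux triangle, and its distributed protocol, are not reproduced.

```python
import math

def is_k_covered(region_pts, sensors, sensing_range, k):
    """Return True if every sample point lies within sensing_range of >= k sensors."""
    for px, py in region_pts:
        hits = sum(1 for sx, sy in sensors
                   if math.hypot(px - sx, py - sy) <= sensing_range)
        if hits < k:
            return False
    return True

# Sample the region of interest on a grid -- a discrete surrogate for the
# paper's continuous geometric argument. Positions and range are invented.
grid = [(x, y) for x in range(0, 11, 2) for y in range(0, 11, 2)]
sensors = [(2, 2), (2, 8), (8, 2), (8, 8), (5, 5)]
print(is_k_covered(grid, sensors, sensing_range=6.0, k=1))  # True
print(is_k_covered(grid, sensors, sensing_range=6.0, k=2))  # False: corners are only 1-covered
```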
7.
To enable the immediate and efficient dispatch of relief to victims of disaster, this study proposes a greedy-search-based multi-objective genetic algorithm capable of regulating the distribution of available resources and automatically generating a variety of feasible emergency logistics schedules for decision-makers. The proposed algorithm dynamically adjusts distribution schedules from various supply points according to the requirements at demand points in order to minimize unsatisfied demand for resources, time to delivery, and transportation costs. The algorithm was applied to the case of the Chi-Chi earthquake in Taiwan to verify its performance. Simulation results demonstrate that, under both limited and unlimited numbers of available vehicles, the proposed algorithm outperforms the MOGA and the standard greedy algorithm in time to delivery by an average of 63.57% and 46.15%, respectively, based on 10,000 iterations.
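The greedy-search ingredient can be pictured as repeatedly shipping along the cheapest feasible supply-demand pair until stock or need runs out. The sketch below shows that heuristic in isolation, with invented quantities and travel times; the paper's actual method embeds greedy search inside a multi-objective genetic algorithm that also weighs unsatisfied demand and transportation cost.

```python
# Greedy dispatch: repeatedly ship from the supply point that can reach an
# unmet demand point fastest. All quantities and times are illustrative.
supply = {"S1": 40, "S2": 30}
demand = {"D1": 35, "D2": 25}
travel_time = {("S1", "D1"): 2, ("S1", "D2"): 5,
               ("S2", "D1"): 4, ("S2", "D2"): 1}

schedule = []
while any(q > 0 for q in demand.values()) and any(q > 0 for q in supply.values()):
    # Pick the pair with positive stock and need and the least travel time.
    s, d = min(((s, d) for s in supply for d in demand
                if supply[s] > 0 and demand[d] > 0),
               key=lambda p: travel_time[p])
    qty = min(supply[s], demand[d])
    supply[s] -= qty
    demand[d] -= qty
    schedule.append((s, d, qty, travel_time[(s, d)]))

print(schedule)  # e.g. [('S2', 'D2', 25, 1), ('S1', 'D1', 35, 2)]
```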
8.
Wang Li, Gao Xianwen, Wang Wei, Wang Qi. 《自动化学报》 (Acta Automatica Sinica), 2014, 40(9): 1991-1997
Production planning in cold-rolled sheet plants is difficult because the production process is complex and large numbers of multi-variety, small-batch contracts are produced on shared lines. To address this problem, this paper proposes a subspace clustering mixed model (SCMM) method that groups production contracts into batches, using as constraints the width of the steel coils to be processed, the entry and exit thicknesses at the cold rolling mill, and the contract due dates. Following the plant's actual production process, the cold rolling mill is treated as the core node, and a full-process contract planning model is built for the batched contracts, taking into account on-time delivery, work-in-process inventory, and the allocation of capacity across production routes. The proposed time-section ant colony optimization (TSA) algorithm is then used to generate the contract plan. Tests on real production data show that the method outperforms manual scheduling and meets the requirements for full-process production planning of cold-rolled sheets.
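The batching step can be approximated by grouping contracts whose coil width, exit gauge, and due date all fall within tolerances of a batch seed. The sketch below is a simplified stand-in for the SCMM batching constraints; the contracts, tolerance values, and first-fit rule are illustrative assumptions, not the paper's clustering model.

```python
# First-fit batching of contracts under width / gauge / due-date tolerances.
contracts = [
    {"id": 1, "width": 1250, "exit_gauge": 0.8, "due": 10},
    {"id": 2, "width": 1260, "exit_gauge": 0.8, "due": 11},
    {"id": 3, "width": 1500, "exit_gauge": 1.2, "due": 9},
]

def batch(contracts, w_tol=30, g_tol=0.1, d_tol=3):
    batches = []
    for c in sorted(contracts, key=lambda c: c["due"]):  # earliest due first
        for b in batches:
            seed = b[0]
            if (abs(c["width"] - seed["width"]) <= w_tol
                    and abs(c["exit_gauge"] - seed["exit_gauge"]) <= g_tol
                    and abs(c["due"] - seed["due"]) <= d_tol):
                b.append(c)
                break
        else:  # no compatible batch found: open a new one
            batches.append([c])
    return batches

for b in batch(contracts):
    print([c["id"] for c in b])  # -> [3] and [1, 2]
```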
9.
The viability of networked communities depends on the creation and disclosure of user-generated content and the frequency of user visits (Facebook 10-K Annual Report, 2012). However, little is known about how to align the interests of users and social networking sites. In this study, we draw upon the principal-agent perspective to extend Pavlou et al.'s (2007) uncertainty mitigation model of online exchange relationships and propose an empirically tested model for aligning the incentives of the principal (user) and the agent (service provider). As suggested by Pavlou et al., we incorporated a multi-dimensional measure of trust: trust of the provider and trust of members. The proposed model is empirically tested with survey data from 305 adults aged 20-55. The results support our model, delineating how real individuals with bounded rationality actually make decisions about information disclosure under uncertainty in the social networking site context. The results show little to no relationship between online privacy concerns and information disclosure on social networking sites. Perceived benefits link the incentives of the principal (user) and the agent (provider), while usage intensity has the most significant impact on information disclosure. We argue that this phenomenon may be explained by Communication Privacy Management Theory. The present study enhances our understanding of agency theory and human judgment theory in the context of social media. Practical implications for understanding and facilitating online social exchange relationships are also discussed.
10.
Partitioning the universe of discourse and determining intervals that contain useful temporal information and offer good interpretability are critical for forecasting with fuzzy time series. In the existing literature, researchers seldom consider the effect of the time variable when partitioning the universe of discourse; as a result, the resulting temporal intervals lack interpretability. In this paper, we take temporal information into account and partition the universe of discourse into intervals of unequal length, which improves forecasting quality. First, the time variable is involved in partitioning the universe through Gath-Geva clustering-based time series segmentation to obtain prototypes of the data; suitable intervals are then determined from the prototypes by means of information granules. An effective method of partitioning and determining intervals is thus proposed, and we show that these intervals carry well-defined semantics. To verify the effectiveness of the approach, we apply the proposed method to forecasting student enrollments at the University of Alabama and the Taiwan Stock Exchange Capitalization Weighted Stock Index. The experimental results show that partitioning with temporal information can greatly improve forecasting accuracy. Furthermore, the proposed method is not sensitive to its parameters.
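The interval-construction idea, cluster the series and then cut the universe of discourse at midpoints between adjacent cluster prototypes, can be sketched as below. Plain 1-D k-means stands in for Gath-Geva clustering, the time variable and the information-granule refinement are omitted, and the sample values are merely enrollment-like; all of these are assumptions.

```python
import random

def kmeans_1d(data, k, iters=50):
    """Plain 1-D k-means: a stand-in for Gath-Geva clustering."""
    centers = sorted(random.sample(data, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            clusters[min(range(k), key=lambda i: abs(x - centers[i]))].append(x)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

def intervals_from_prototypes(data, k):
    """Cut the universe of discourse at midpoints between adjacent prototypes."""
    protos = kmeans_1d(data, k)
    bounds = ([min(data)]
              + [(a + b) / 2 for a, b in zip(protos, protos[1:])]
              + [max(data)])
    return list(zip(bounds, bounds[1:]))  # unequal-length intervals

random.seed(0)
series = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807, 16919]
print(intervals_from_prototypes(series, k=3))
```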