71.
    
The interfacial tension between brine and hydrocarbon is one of the major properties in the petroleum industry because it strongly affects oil trapping in reservoirs and, consequently, oil recovery; this makes its investigation particularly important. In the present study, a fuzzy C-means (FCM) algorithm was developed to predict the interfacial tension between hydrocarbon and brine as a function of pressure, temperature, hydrocarbon carbon number, and brine ionic strength. The predictions showed low relative error and low deviation from the experimental data gathered from the literature, and the coefficients of determination (R2) for the training and testing data were 0.9508 and 0.9309, respectively. This predictive tool is simple and user-friendly, and can help petroleum engineers estimate the interfacial tension between hydrocarbons and brine.
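As a purely illustrative sketch of the clustering step behind a fuzzy C-means model (not the authors' implementation, which additionally maps cluster memberships to an interfacial-tension prediction), the NumPy code below implements the standard FCM center and membership updates; the input matrix of pressure, temperature, carbon number and ionic strength values is assumed to be pre-scaled.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy C-means: X is (n_samples, n_features), m is the fuzzifier."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial membership matrix U (each row sums to 1).
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # Cluster centers as membership-weighted means.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Euclidean distance from every sample to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
        U_new = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Hypothetical usage: rows are (pressure, temperature, carbon number, ionic strength).
X = np.random.default_rng(1).random((50, 4))
centers, U = fuzzy_c_means(X)
```

In an FCM-based predictor the resulting memberships would typically weight per-cluster regression outputs to produce the final interfacial-tension estimate; that regression step is omitted here.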
72.
    
Oil recovery and the rate of production depend strongly on the viscosity of the reservoir fluid, which makes viscosity an important parameter in petroleum engineering. Because viscosity is a strong function of composition, temperature, and pressure, this article uses a grid-partitioning-based fuzzy inference system as a novel predictor of the dynamic viscosity of different normal alkanes over a wide range of operating conditions. To compare the model output with actual data, an experimental dataset of the dynamic viscosity of n-alkanes was gathered. Graphical and statistical comparisons between the model outputs and the experimental data show the high performance of the predicting algorithm: the coefficients of determination (R2) for the training and testing phases are 0.9985 and 0.9980, respectively. These statistical indices demonstrate the model's accuracy in predicting dynamic viscosity.
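Both this and the preceding abstract report the coefficient of determination as their accuracy metric; the short sketch below shows how R2 is conventionally computed for a set of predictions (an illustration with made-up values, not the papers' code or data).

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical example with invented dynamic viscosity values (mPa·s).
measured  = [0.31, 0.54, 0.92, 1.60, 2.80]
predicted = [0.30, 0.56, 0.90, 1.65, 2.75]
print(round(r_squared(measured, predicted), 4))
```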
73.
Enterprise Architecture (EA) is a holistic strategy commonly used to improve the alignment of an enterprise's business and Information Technology. An Enterprise Architecture Implementation Methodology (EAIM) provides a set of methods and practices for developing, managing, and maintaining an EA implementation project. Existing EAIMs are often ineffective because of complexities arising from EAIM practices, models, factors, and strategy; as a result, EA projects may lack support for requirements analysis, governance and evaluation, implementation guidelines, and continual improvement of EA implementation. The aim of this research is to develop an effective EAIM to support EA implementation. The first step is to identify effective EA implementation practices and the factors that affect the effectiveness of EAIM; a Systematic Literature Review (SLR) was conducted for this purpose. Secondly, the proposed EAIM was developed based on the findings of the SLR and on semi-structured interviews with EA practitioners. Finally, the proposed EAIM was evaluated by means of a case study. The target audience for this research is twofold: (1) researchers who would extend effective EA implementation and continue this research topic with further analysis and exploration; (2) practitioners who would like to employ an effective and lightweight EAIM for an EA project.
74.
Optimization plays an important role in day-to-day life, and evolutionary and population-based optimization algorithms are widely employed across many engineering areas. Designing an optimization algorithm, often inspired by physical phenomena, is a challenging endeavor because it must provide appropriate local and global search operators: local operators are generally fast, whereas global operators, which search the entire space for the best solution, are slower. The literature shows that many optimization algorithms, such as the genetic algorithm, particle swarm optimization and artificial bee colony, serve as powerful engineering tools; however, a comprehensive review of their topologies and performance is lacking, and filling this gap is the main goal of this paper. Several aspects of the design and analysis of optimization heuristics are also discussed. As a result, a detailed explanation, comparison, and discussion of these AI methods is provided, and some future research directions in this field are summarized.
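Since the review names particle swarm optimization as a representative population-based method, here is a minimal, generic PSO sketch using the standard textbook velocity and position updates; it is an illustration only, not any specific algorithm from the surveyed papers.

```python
import numpy as np

def pso(objective, dim=2, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimization for box-constrained minimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()                                   # per-particle best positions
    pbest_f = np.apply_along_axis(objective, 1, x)     # per-particle best values
    g = pbest[np.argmin(pbest_f)].copy()               # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Standard update: inertia + cognitive (pbest) + social (gbest) terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, objective(g)

best_x, best_f = pso(lambda z: np.sum(z ** 2))  # toy sphere benchmark
```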
75.
Wireless Personal Communications - One of the challenges faced by machine learning in human activity recognition systems is the different distributions of the training and test samples. Transfer...
76.
Luminescent solar concentrators (LSCs) absorb large-area solar radiation and guide down-converted emission to solar cells for electricity production. Quantum dots (QDs) have been widely engineered at the device and quantum-dot levels for LSCs. Here, we demonstrate cascaded energy transfer and exciton recycling at the nanoassembly level for LSCs. The graded structure, composed of different-sized, toxic-heavy-metal-free InP/ZnS core/shell QDs incorporated on copper-doped InP QDs, facilitates exciton routing toward narrow-band-gap QDs with a high nonradiative energy transfer efficiency of 66%. At the final stage of nonradiative energy transfer, the photogenerated holes make ultrafast electronic transitions to copper-induced mid-gap states for radiative recombination in the near-infrared. Exciton recycling increases the photoluminescence quantum yield by 34% and 61% in comparison with semi-graded and ungraded energy profiles, respectively. Thanks to the suppressed reabsorption and enhanced photoluminescence quantum yield, the graded LSC achieves an optical quantum efficiency of 22.2%. Hence, engineering at the nanoassembly level, combined with nonradiative energy transfer and exciton funneling, offers promise for efficient solar energy harvesting.
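The 66% nonradiative energy transfer efficiency quoted above is the kind of figure commonly estimated from donor photoluminescence lifetime quenching, E = 1 − τ_DA/τ_D; the snippet below illustrates that textbook relation with made-up lifetimes and is not taken from the paper, which may extract the efficiency differently (for example, from spectrally resolved or time-resolved amplitudes).

```python
# Hedged illustration only: Förster-type energy transfer efficiency from lifetime quenching.
# The lifetimes below are invented numbers, not values reported in the paper.
tau_donor_alone = 30.0      # ns, donor QD emission lifetime without acceptors (hypothetical)
tau_donor_assembled = 10.2  # ns, donor QD lifetime inside the graded assembly (hypothetical)

efficiency = 1.0 - tau_donor_assembled / tau_donor_alone
print(f"Energy transfer efficiency ~ {efficiency:.0%}")  # ~ 66% with these inputs
```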
77.
This study investigated and compared practical methods for the efficient Field-Programmable Gate Array (FPGA) implementation of space-time adaptive processing (STAP). The most important part of calculating the STAP weights is the QR decomposition (QRD), which can be implemented using the modified Gram-Schmidt (MGS) algorithm. The results show that using a QRD with a lower computational burden leads to a more effective implementation. Its structure was parameterised with the vector size to create a trade-off between hardware cost and performance. For this purpose, the QRD-MGS algorithm was first modified to increase speed, and then the STAP weight vector was calculated. The implementation results show that decreasing the vector size decreases the resource utilisation, computational burden and power consumption; while the computation time increases slightly, the update rate of the STAP weights is maintained. For example, the STAP weights in a system with 6 antenna arrays, 10 received pulses and 200 range samples are computed in 262 µs using a vector size of 17 on the Arria 10 FPGA, of which 155 µs corresponds to the QRD-MGS algorithm and 107 µs to the other parts. Therefore, the QRD-MGS algorithm is the most important component of the STAP weight-vector calculation, and simplifying it leads to an efficient implementation.
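The numerical kernel named in this abstract is QR decomposition by modified Gram-Schmidt; the following is a minimal floating-point reference sketch of QRD-MGS in software, for orientation only, not the fixed-point, vectorised FPGA design the paper describes.

```python
import numpy as np

def qrd_mgs(A):
    """QR decomposition by the modified Gram-Schmidt algorithm.

    A is (m, n) with m >= n; returns Q (m, n) with orthonormal columns and
    upper-triangular R (n, n) such that A = Q @ R.
    """
    A = np.array(A, dtype=complex)  # STAP snapshots are typically complex-valued
    m, n = A.shape
    Q = np.zeros((m, n), dtype=complex)
    R = np.zeros((n, n), dtype=complex)
    V = A.copy()
    for k in range(n):
        R[k, k] = np.linalg.norm(V[:, k])
        Q[:, k] = V[:, k] / R[k, k]
        for j in range(k + 1, n):
            # Re-project the *updated* remaining columns against the new basis
            # vector; this is what distinguishes MGS from classical Gram-Schmidt.
            R[k, j] = Q[:, k].conj() @ V[:, j]
            V[:, j] -= R[k, j] * Q[:, k]
    return Q, R

# Sanity check on a random complex matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4))
Q, R = qrd_mgs(A)
assert np.allclose(Q @ R, A)
```

Once Q and R are available, the STAP weight vector is typically recovered from the triangular system by back-substitution, which is the "other parts" portion of the reported timing.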
78.
    
This study investigated the microbial decontamination of saffron using low-pressure cold plasma (LPCP) technology, together with the quality characteristics of saffron that create its color, taste, and aroma. The highest microbial log reduction was observed at 110 W for 30 min: total viable count (TVC), coliforms, molds, and yeasts were reduced by 3.52, 4.62, 2.38, and 4.12 log CFU (colony-forming units)/g, respectively. The lowest decimal reduction times (D-values) were also observed at 110 W, namely 9.01, 3.29, 4.17, and 8.93 min for TVC, coliforms, molds, and yeasts. LPCP treatment caused a significant increase in the product's color parameters (L*, a*, b*, ΔE, chroma, and hue angle), indicating that the treatment darkened the color of the treated stigmas. It also reduced picrocrocin, safranal, and crocin in treated samples compared to the untreated control (p < .05). However, after examining these metabolites and comparing them with the relevant ISO standards for saffron, all treated and control samples remained of good quality.
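For context on the D-values reported above: the decimal reduction time is the treatment time needed for a 1-log (90%) reduction, i.e. the negative reciprocal of the slope of log10(survivors) versus time. The sketch below illustrates that fit on invented survivor counts, not the study's raw data.

```python
import numpy as np

# Hypothetical survivor counts (CFU/g) versus plasma exposure time (min).
time_min = np.array([0, 5, 10, 20, 30], dtype=float)
counts   = np.array([1e6, 3e5, 8e4, 6e3, 5e2], dtype=float)

# Fit log10(N) = slope * t + intercept; the D-value is -1 / slope.
slope, intercept = np.polyfit(time_min, np.log10(counts), 1)
d_value = -1.0 / slope
print(f"D-value ~ {d_value:.2f} min per 1-log reduction")
```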
79.
    
In this paper, we investigate a multi-plant production planning and distribution problem for the simultaneous optimisation of production, inventory control, demand allocation and distribution decisions. The objective of this rich problem is to satisfy the dynamic demand of customers while minimising the total cost of production, inventory and distribution. By solving the problem, we determine when production needs to occur, how much has to be produced in each plant, how much has to be stored in each warehouse, and how much needs to be delivered to each customer in each period. On a large real data-set inspired by a case obtained from an industrial partner, we show that the proposed integration is highly effective. Moreover, we study several trade-offs in a detailed sensitivity analysis. Our analyses indicate that the proposed scenarios give the company a competitive advantage in terms of reduced total logistics cost, and also highlight further possibilities that become available by taking an integrated approach towards logistics planning. These opportunities are to be synergised and exploited in an interconnected, open global logistics system.
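To make the kind of integrated decision described above concrete, here is a deliberately tiny single-period sketch with two plants, two customers and made-up costs, capacities and demands, solved as a linear program with SciPy; the paper's model is multi-period and far richer, so this is only an illustration of the modelling style, not the authors' formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables x[p, c]: quantity produced at plant p and shipped to customer c.
# All data below are invented for illustration.
cost = np.array([[4.0, 6.0],    # plant 0 -> customers 0, 1 (unit production + shipping cost)
                 [5.0, 3.0]])   # plant 1 -> customers 0, 1
capacity = np.array([80.0, 70.0])   # plant production capacities
demand = np.array([60.0, 50.0])     # customer demands

c = cost.ravel()                     # minimize total cost over x00, x01, x10, x11
# Capacity constraints: total shipped from each plant <= its capacity.
A_ub = np.array([[1, 1, 0, 0],
                 [0, 0, 1, 1]], dtype=float)
# Demand constraints: total received by each customer == its demand.
A_eq = np.array([[1, 0, 1, 0],
                 [0, 1, 0, 1]], dtype=float)

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand, bounds=(0, None))
shipments = res.x.reshape(2, 2)      # rows: plants, columns: customers
```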
80.