Subscription full text   31 articles
Free full text   0 articles

By subject (number of articles):
  Chemical industry   3
  Machinery and instrumentation   1
  Energy and power   1
  Light industry   1
  Water resources engineering   1
  Petroleum and natural gas   1
  Radio and electronics   1
  General industrial technology   4
  Automation technology   18

By year (number of articles):
  2022   1
  2021   1
  2018   1
  2013   5
  2012   1
  2011   2
  2010   3
  2009   1
  2007   2
  2006   1
  2005   2
  2004   1
  2002   2
  2001   1
  2000   1
  1999   2
  1997   1
  1994   1
  1993   1
  1986   1

A total of 31 results were retrieved.
1.
Multi-objective robust optimization using a sensitivity region concept   (Total citations: 6; self-citations: 2; citations by others: 4)
In multi-objective design optimization, it is quite desirable to obtain solutions that are multi-objectively optimum and insensitive to uncontrollable (noisy) parameter variations. We call such solutions robust Pareto solutions. In this paper we present a method to measure the multi-objective sensitivity of a design alternative, and an approach to use such a measure to obtain multi-objectively robust Pareto optimum solutions. Our sensitivity measure does not require a presumed probability distribution of uncontrollable parameters and does not utilize gradient information; therefore, it is applicable to multi-objective optimization problems that have non-differentiable and/or discontinuous objective functions, and also to problems with large parameter variations. As a demonstration, we apply our robust optimization method to an engineering example, the design of a vibrating platform. We show that the solutions obtained for this example are indeed robust.
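As a rough illustration of a distribution-free, gradient-free sensitivity measure of this kind, the sketch below samples the uncontrollable parameters inside a bounded region and records the worst-case deviation of the objective vector. The function names, the two-objective test function, and the specific worst-case norm are hypothetical placeholders, not the authors' exact sensitivity-region formulation.

```python
import numpy as np

def sensitivity_radius(objectives, x, p_nominal, delta_p, n_samples=200, seed=0):
    """Worst-case deviation of the objective vector when the uncontrollable
    parameters p vary within +/- delta_p of their nominal values.
    Illustrative only: sampling-based, no probability distribution, no gradients."""
    rng = np.random.default_rng(seed)
    f_nominal = np.asarray(objectives(x, p_nominal), dtype=float)
    worst = 0.0
    for _ in range(n_samples):
        p = p_nominal + rng.uniform(-delta_p, delta_p, size=p_nominal.shape)
        deviation = np.linalg.norm(np.asarray(objectives(x, p), dtype=float) - f_nominal)
        worst = max(worst, deviation)
    return worst  # smaller value -> design x is less sensitive (more robust)

# Hypothetical two-objective function: x is the design, p the noisy parameters.
def objectives(x, p):
    return [x[0] ** 2 + p[0] * x[1], (x[1] - 1.0) ** 2 + p[1]]

print(sensitivity_radius(objectives, x=[0.5, 0.8],
                         p_nominal=np.array([0.1, 0.2]), delta_p=0.05))
```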
2.
3.
Oil production during the primary stage is achieved by the natural energy stored in the reservoir. Once this energy is depleted, production ceases or the oil production rate becomes so small that operation is no longer economical. At this stage, a large fraction of the initial oil in place is still trapped underground. Oil recovery efficiency during the primary stage ranges from 10% to 30%, depending on the nature of the reservoir, which means that more than 70% of the initial oil in place is the target for secondary and/or improved oil recovery techniques. During the secondary recovery stage, a fluid is injected to push the oil from the injection well toward the producer. Water and gases are the most commonly used displacing fluids in this process; waterflooding is the most common secondary recovery technique, and injection of carbon dioxide or other gases is also common practice for improving recovery efficiency. Regardless of the fluid used to displace the oil, the displacing fluid can bypass the oil and break through early. In the case of a waterflood, the water/oil ratio can become so high that the process ceases to be economical; for injection of CO2 or other gases, a high gas/oil ratio likewise renders the process uneconomical. This is more dramatic in heterogeneous and layered reservoirs with strong permeability contrasts among the layers. To remedy this problem, a polymer solution is injected into the reservoir and allowed to gel under certain conditions. Because the gel viscosity is much higher than that of the displacing fluid, the gel impedes flow through the already flooded regions; the displacing fluid is therefore forced to find new paths, which means additional oil can be displaced. Profile modification based on in situ gelation technology is a proven, economical process for improving oil recovery. A variety of gelation systems are available on the market for reservoir treatment. Most of them are based on cross-linking of polyacrylamide-type polymers with heavy-metal ions such as chromium to produce a three-dimensional gel structure in situ in the reservoir. Recent research efforts at the University of Kansas have produced a new type of bio-polymer that gels without a cross-linker: gelation occurs by reducing the pH of the alkaline solution, and the process is reversible. This paper discusses in situ gelation techniques based on both the commercially available systems and this newly discovered bio-polymer.
4.
Gelled polymers are being used increasingly to modify the movement of injected fluids in secondary and enhanced oil recovery processes. A common gelation process involves the reduction of Cr(VI) to Cr(III) in the presence of polyacrylamide. The Cr(III) reacts or interacts with the polymer to form a gel network. Although correlations of gelation time with the principal process variables have been obtained, viscometric data have not been reported during or after gelation. These data are needed for fluid flow calculations in surface equipment and for estimation of flow behaviour in reservoir rocks.

A Weissenberg Rheogoniometer, with cone-and-plate geometry, was used to obtain viscometric data for the gelation of polyacrylamide and chromium(III). Solutions consisting of polyacrylamide polymer, sodium dichromate dihydrate and sodium bisulfite were gelled under a steady shear field at constant temperature. The shear stress versus time profile for the gelation process was interpreted to define a gelation time and to determine the apparent viscosity of the gelled fluid. The gelation time decreased as the applied shear rate increased up to about 14.25 s⁻¹ and was affected by shear-rate history. Viscometric properties of the gelled solutions were determined. The apparent viscosity of the gelled solutions decreased as the shear rate under which they were formed increased.

Post-gelation studies indicated that the gels exhibited a residual stress at zero shear rate and behaved as Bingham plastics under steady shear. Gels formed at low shear rates were more viscous than gels formed at high shear rates; however, the structure of these gels was susceptible to shear degradation.
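The Bingham-plastic behaviour noted above is conventionally described by the constitutive relation below (a standard textbook form, not an equation taken from this study), which connects the residual stress at zero shear rate to a yield stress term:

```latex
\tau = \tau_0 + \mu_p \,\dot{\gamma} \quad \text{for } \tau > \tau_0, \qquad \dot{\gamma} = 0 \quad \text{for } \tau \le \tau_0
```

Here \tau is the shear stress, \tau_0 the yield (residual) stress, \mu_p the plastic viscosity, and \dot{\gamma} the shear rate.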
5.

The integrated management of water supply and demand has been considered by many policymakers; owing to its complexity, however, decision makers have so far faced many challenges. In this study, we propose an efficient framework for managing water supply and demand in line with the economic and environmental objectives of the basin. To design this framework, a combination of ANFIS, multi-objective augmented ε-constraint programming models and TOPSIS was used. First, using hydrological data from 2001 to 2017, the rate of water release from the dam reservoir was estimated with the ANFIS model; its allocation to agricultural areas was then determined by combining the multi-objective augmented ε-constraint model with TOPSIS. To demonstrate the reliability of the proposed model, the southern Karkheh basin in Khuzestan province, Iran, was considered as a case study. The results show that the model is able to reduce irrigation water consumption and to improve its economic productivity in the basin.
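For the final ranking step, a minimal generic TOPSIS implementation is sketched below; the decision matrix, criteria, and weights are hypothetical placeholders rather than values from the Karkheh case study.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    decision_matrix: (alternatives x criteria) array of raw scores.
    weights: criterion weights summing to 1.
    benefit: True for criteria to maximise, False for criteria to minimise."""
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Vector-normalise each criterion, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)
    # Ideal and anti-ideal points depend on whether a criterion is benefit or cost.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - ideal, axis=1)
    d_worst = np.linalg.norm(V - anti, axis=1)
    return d_worst / (d_best + d_worst)  # closeness coefficient: higher is better

# Hypothetical example: three allocation plans scored on net benefit (maximise)
# and irrigation water use (minimise).
scores = topsis([[120, 45], [100, 30], [140, 60]], weights=[0.6, 0.4],
                benefit=[True, False])
print(scores.argsort()[::-1])  # alternatives ranked from best to worst
```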

6.
Applications of multi-objective genetic algorithms (MOGAs) in engineering optimization problems often require numerous function calls. One way to reduce the number of function calls is to use an approximation in lieu of exact evaluations. An approximation involves two steps: design of experiments (DOE) and metamodeling. This paper presents a new approach in which both DOE and metamodeling are integrated with a MOGA. In particular, the DOE method reduces the number of generations in a MOGA, while the metamodeling reduces the number of function calls in each generation. In the present approach, the DOE locates a subset of design points that is estimated to better sample the design space, while the metamodeling assists in estimating the fitness of design points. Several numerical and engineering examples are used to demonstrate the applicability of this new approach. The results from these examples show that the proposed approach requires significantly fewer function calls and obtains solutions similar to those of a conventional MOGA and a recently developed metamodeling-assisted MOGA.
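The following is a minimal sketch of the general idea of metamodel-assisted fitness estimation (not the specific DOE/metamodeling combination developed in the paper): only a subset of candidate designs is evaluated exactly, and a surrogate trained on those evaluations estimates the fitness of the rest. The objective function and scikit-learn surrogate are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor  # assumes scikit-learn

# Hypothetical stand-in for an expensive simulation-based objective.
def expensive_objective(x):
    return np.sum((x - 0.3) ** 2, axis=1)

rng = np.random.default_rng(1)
population = rng.random((40, 3))      # candidate designs in [0, 1]^3

# Evaluate only a small subset exactly (these are the costly function calls)...
evaluated = population[:10]
f_exact = expensive_objective(evaluated)

# ...then train a kriging-like Gaussian-process surrogate on those points and
# use its predictions as estimated fitness for the remaining candidates.
surrogate = GaussianProcessRegressor(normalize_y=True).fit(evaluated, f_exact)
f_estimated = surrogate.predict(population[10:])
```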
7.
There is an ever-increasing need to use optimization methods for the thermal design of data centers and the hardware populating them. Airflow simulations of cabinets and data centers are computationally intensive, and this problem is exacerbated when the simulation model is integrated with a design optimization method. Generally speaking, the thermal design of data center hardware can be posed as a constrained multi-objective optimization problem. A popular approach for solving this kind of problem is to use Multi-Objective Genetic Algorithms (MOGAs). However, the large number of simulation evaluations needed by MOGAs has prevented their application to realistic engineering design problems. In this paper, a substantially more efficient MOGA is formulated and demonstrated through a thermal analysis simulation model of a data center cabinet. First, a reduced-order model of the cabinet problem is constructed using Proper Orthogonal Decomposition (POD). The POD model is then used to form the objective and constraint functions of an optimization model. Next, this optimization model is integrated with the new MOGA, which uses a “kriging”-guided operation in addition to conventional genetic algorithm operations to search the design space for globally optimal design solutions. This approach is essential for handling complex multi-objective situations in which the optimal solutions may not be obvious from simple analyses or intuition. It is shown that, in optimizing the data center cabinet problem, the new MOGA outperforms a conventional MOGA by estimating the Pareto front with 50% fewer simulation calls, which makes it very promising for complex thermal design problems.
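For reference, a POD basis is typically extracted from simulation snapshots via a singular value decomposition, as in the minimal sketch below; the snapshot data here are random placeholders rather than fields from the cabinet model.

```python
import numpy as np

# Snapshot matrix: each column is one field from a full simulation run.
# Random placeholder data; in practice these come from the cabinet simulations.
rng = np.random.default_rng(0)
snapshots = rng.random((500, 20))           # 500 grid points, 20 simulation runs

mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean_field, full_matrices=False)

# Keep the leading modes that capture most of the energy (variance).
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.99)) + 1
basis = U[:, :k]                            # reduced-order POD basis

# Any new field can now be approximated by k coefficients instead of 500 values.
coeffs = basis.T @ (snapshots[:, :1] - mean_field)
reconstruction = mean_field + basis @ coeffs
```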
8.
A new approach to metamodeling is introduced whereby a sequential technique is used to construct and simultaneously update mutually dependent metamodels for multiresponse, high-fidelity deterministic simulations. Unlike conventional approaches, which produce a single metamodel for each scalar response independently, the present method uses the correlation among different simulation responses in the construction of the metamodel. These dependent metamodels are solved as a system of equations to estimate all individual responses simultaneously. Since several responses contribute to the construction of each individual metamodel, more information from the computed responses is used, thus improving the accuracy of the obtained metamodels. Examples are used to explore the relative performance of the proposed approach and show that it outperforms conventional metamodeling approaches in terms of approximation accuracy. The new method should be particularly useful in problems that require very computationally intensive simulations.
9.
In this work, the potential for auto-ignition of Iranian heavy oil under in situ combustion (ISC) process conditions was studied. Kinetic studies were carried out using thermal analysis techniques. The effects of oxygen partial pressure, reservoir pressure, and clay on the auto-ignition condition were investigated. Based on the experimental results obtained, a kinetic equation was derived for each of the different oil samples in the presence of different sands. The study of the partial pressure of oxygen in the injected air showed that, at atmospheric pressure, low-temperature combustion (LTC) was initiated at 275°C; enriching the injected air with oxygen lowers the initial LTC temperature by up to 50°C. ARC experiments were undertaken to extend the studies to reservoir pressure conditions (1300 psi). It was found that the activation energy in the LTC region was lowered as a consequence, and LTC then commenced at 115°C when air was injected. The effect of clay as a catalyst was also studied, and it was found that the activation energy decreases considerably when clay is present in the system. Experiments in a high-pressure combustion tube showed that LTC was initiated in the temperature range 120–150°C, which is in line with the results obtained in the ARC. Fire flooding was sustained during the combustion tube test.
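Thermal-analysis kinetic studies of this kind are commonly built on an Arrhenius-type rate expression of the general form below (a standard model form, not the specific kinetic equation derived in the paper):

```latex
\frac{d\alpha}{dt} = A \,\exp\!\left(-\frac{E_a}{RT}\right) f(\alpha)\, P_{\mathrm{O_2}}^{\,n}
```

where \alpha is the fuel conversion, A the pre-exponential factor, E_a the activation energy, R the gas constant, T the absolute temperature, f(\alpha) the assumed reaction model, and n the order with respect to oxygen partial pressure; a lower E_a in the LTC region corresponds to the earlier onset temperatures reported above.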
10.
Gradient-based methods for solving multi-objective optimization problems, including Normal Boundary Intersection (NBI), require solving at least one optimization problem for each solution point. These methods can become computationally expensive as the number of variables and/or constraints of the optimization problem increases. This paper provides a modification to the original NBI algorithm so that continuous Pareto frontiers are obtained “in one go,” i.e., by solving only a single optimization problem, while discontinuous Pareto frontiers require solving significantly fewer optimization problems than with the original NBI algorithm. In the proposed method, the optimization problem is solved using a quasi-Newton method whose history of iterates is used to obtain points on the Pareto frontier. The proposed and the original NBI methods have been applied to a collection of 16 test problems, including a welded beam design and a heat exchanger design problem. The results show that the proposed approach significantly reduces the number of function calls compared to the original NBI algorithm.
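For background, the subproblem that the original NBI method solves for each point on the frontier can be written in its standard form (this is the conventional formulation, not the single-problem modification proposed in the paper):

```latex
\max_{x,\; t} \; t \quad \text{subject to} \quad \Phi\beta + t\,\hat{n} = F(x) - F^{*}, \qquad x \in \mathcal{X}
```

where F^{*} is the utopia point, \Phi the pay-off matrix whose columns are the individual-minimum objective vectors shifted by F^{*}, \beta a convex-combination weight vector, \hat{n} a (quasi-)normal direction pointing toward the Pareto frontier, and \mathcal{X} the feasible set; varying \beta traces out the frontier, one optimization problem per point.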