21.
In this paper the model predictive control (MPC) technology is used to tackle the optimal drug administration problem. An important advantage of MPC compared to other control technologies is that it explicitly takes the constraints of the system into account. In particular, for drug treatments of living organisms, MPC can guarantee satisfaction of the minimum toxic concentration (MTC) constraints. A whole-body physiologically-based pharmacokinetic (PBPK) model serves as the dynamic prediction model of the system after it is formulated as a discrete-time state-space model. Only plasma concentrations are assumed to be measured on-line; the remaining states (drug concentrations in other organs and tissues) are estimated in real time by an artificial observer. The complete system (observer and MPC controller) is able to drive the drug concentration to the desired levels in the organs of interest while satisfying the imposed constraints, even in the presence of modelling errors, disturbances and noise.
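A toy simulation can make the observer-plus-controller structure concrete. In the sketch below, all matrices, targets and bounds are invented for illustration (they are not taken from the paper's PBPK model), and a crudely clipped one-step control law stands in for the full constrained MPC:

```python
import numpy as np

# Hypothetical 2-compartment model (plasma, tissue); values are illustrative only.
A = np.array([[0.90, 0.05],
              [0.08, 0.95]])
B = np.array([[1.0], [0.0]])       # drug enters via plasma
C = np.array([[1.0, 0.0]])         # only plasma concentration is measured

L = np.array([[0.5], [0.3]])       # observer gain (assumed stabilizing)
target_plasma = 2.0                # desired plasma concentration
mtc = 4.0                          # assumed minimum-toxic-concentration bound

x = np.zeros(2)                    # true state
xhat = np.zeros(2)                 # observer estimate
for k in range(100):
    y = C @ x                      # plasma measurement (only on-line signal)
    # crude one-step controller: steer the estimated plasma level toward
    # the target, with the dose clipped to [0, 1]
    u = float(np.clip(target_plasma - xhat[0], 0.0, 1.0))
    # Luenberger observer: model prediction plus output-error correction
    xhat = A @ xhat + B.flatten() * u + L.flatten() * (y - C @ xhat)
    x = A @ x + B.flatten() * u
    assert (x <= mtc).all()        # toxicity bound holds along this run
```

In this noiseless run the observer reconstructs the unmeasured tissue concentration exactly; the real scheme replaces the clipped law with an MPC optimization that enforces the MTC bound as an explicit constraint.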
22.
There are many processes in a pulp and paper mill where an on-line parameter analyzer cannot be used, for several reasons:
•The analyzer is very expensive.
•It cannot survive in the environment in which we want to use it.
•It is not operational due to hardware problems, maintenance, etc.
•Such an analyzer does not exist in the market.
In all these situations it would be valuable for the mill to have an alternative way of measuring those parameters in real time. Neural network models can serve as virtual sensors that infer process parameters from other variables that can be measured on-line. One excellent application of inferential sensors in the pulp and paper industry is the on-line prediction of paper properties such as tensile, stretch, brightness and opacity. In tissue machines, the most important quality parameter is softness, which is usually assessed in a very subjective manner by the touch of a human finger. In this work we examine how neural networks can be deployed to build on-line virtual sensors for softness and other tissue quality properties. The results are promising and show that neural network technology can improve productivity and minimize out-of-spec production in a tissue machine by providing accurate real-time monitoring of quality parameters.
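As a minimal sketch of such a virtual sensor (the data, variable roles and network size are invented for illustration; the paper's actual inputs and model are not reproduced here), a small feedforward network can be trained to infer a softness-like quality variable from on-line measurable process variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 3 on-line measurable process variables
# (think basis weight, moisture, refining energy; names are hypothetical)
X = rng.uniform(-1, 1, size=(200, 3))
# hypothetical nonlinear relation between process variables and "softness"
y = np.tanh(X @ np.array([0.8, -0.5, 0.3])) + 0.05 * rng.normal(size=200)

# one hidden layer, trained by plain full-batch gradient descent
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1
for epoch in range(3000):
    H = np.tanh(X @ W1 + b1)             # hidden layer
    pred = (H @ W2 + b2).ravel()         # virtual-sensor output
    err = pred - y
    # backpropagation of the squared-error gradient
    gW2 = H.T @ err[:, None] / len(X); gb2 = err.mean(keepdims=True)
    dH = (err[:, None] @ W2.T) * (1 - H**2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

rmse = np.sqrt(np.mean(err**2))          # fit error on the training data
```

Once trained on historical lab measurements, such a model runs at the sampling rate of the on-line instruments, which is what makes it usable as a soft sensor.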
23.
Flavonoid fatty esters were prepared by acylation of flavonoids (rutin and naringin) by fatty acids (C8, C10, C12), catalyzed by immobilized lipase from Candida antarctica in various solvent systems. The reaction parameters affecting the conversion of the enzymatic process, such as the nature of the organic solvent and acyl donor used, the water activity (aw) of the system, as well as the acyl donor concentration, have been investigated. At optimum reaction conditions, the conversion of flavonoids was 50-60% in tert‐butanol at aw below 0.11. In all cases studied, only the flavonoid monoester was identified, which indicates that this lipase‐catalyzed esterification is regioselective.
24.
This paper presents a new stochastic algorithm for solving hierarchical multiobjective optimization problems. The algorithm is based on the simulated annealing concept and returns a single solution that corresponds to the lexicographic ordering approach. The algorithm optimizes the multiple objectives simultaneously by assigning a different initial temperature to each one, according to its position in the hierarchy. A major advantage of the proposed method is its low computational cost. This is critical, particularly for online applications, where the time available for decision making is limited. The method is tested on a number of benchmark problems, which illustrate its ability to find near-optimal solutions even in nonconvex multiobjective optimization problems. The results are comparable with those produced by state-of-the-art multiobjective evolutionary algorithms, such as the Nondominated Sorting Genetic Algorithm II. The algorithm is further applied to the solution of a large-scale problem that is formulated online, when a multiobjective adaptive model predictive control (MPC) configuration is adopted. This particular control scheme involves an adaptive discrete-time model of the system, which is developed using the radial-basis-function neural-network architecture. A key issue in the success of the adaptation strategy is the introduction of a persistent excitation constraint, which is transformed into a top-priority objective. The overall methodology is applied to the control problem of a pH reactor and proves to be superior to conventional MPC configurations.
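A rough sketch of the per-objective-temperature idea follows. The test problem, temperature values, cooling schedule and acceptance rule are illustrative assumptions, not the paper's algorithm in full; the point is only that the higher-priority objective gets a lower temperature, so moves that worsen it are accepted less often:

```python
import math, random

random.seed(1)

# Two objectives on x in [-5, 5]; f1 is at the top of the hierarchy.
f1 = lambda x: (x - 1.0) ** 2
f2 = lambda x: (x + 1.0) ** 2

# Higher priority -> lower initial temperature (values chosen for illustration)
T = [0.5, 5.0]
cooling = 0.995
x = random.uniform(-5, 5)
for it in range(4000):
    cand = min(5.0, max(-5.0, x + random.gauss(0, 0.3)))
    deltas = [f1(cand) - f1(x), f2(cand) - f2(x)]
    # a candidate must pass every objective's own Metropolis test
    p = 1.0
    for d, t in zip(deltas, T):
        p *= math.exp(-max(0.0, d) / t)
    if random.random() < p:
        x = cand
    T = [t * cooling for t in T]
```

As the temperatures drop, only moves that improve (or barely worsen) every objective survive, and the low temperature of f1 biases the frozen solution toward its minimizer, mimicking lexicographic ordering.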
25.
Supply chains are complicated dynamical systems triggered by customer demands. Proper selection of equipment, machinery, buildings and transportation fleets is a key component for the success of such systems. However, the efficiency of supply chains mostly depends on management decisions, which are often based on intuition and experience. Due to the increasing complexity of supply chain systems (the result of changes in customer preferences, the globalization of the economy and stiff competition among companies), these decisions are often far from optimal. Another factor that causes difficulties in decision making is that different stages in supply chains are often supervised by different groups of people with different managing philosophies. From the early 1950s it became evident that a rigorous framework for analyzing the dynamics of supply chains and making proper decisions could substantially improve the performance of these systems. Due to the resemblance of supply chains to engineering dynamical systems, control theory has provided a solid background for building such a framework. During the last half century many mathematical tools emerging from the control literature have been applied to the supply chain management problem. These tools vary from classical transfer function analysis to highly sophisticated control methodologies, such as model predictive control (MPC) and neuro-dynamic programming. The aim of this paper is to provide a review of this effort. The reader will find representative references to many alternative control philosophies and identify the advantages, weaknesses and complexities of each one. The bottom line of this review is that joint co-operation between control experts and supply chain managers has the potential to introduce more realism to the dynamical models and develop improved supply chain management policies.
26.
This work presents the non-symmetric fuzzy means algorithm, a new methodology for training Radial Basis Function neural network models. The method is based on a non-symmetric fuzzy partition of the space of input variables, which results in networks with smaller structures and better approximation capabilities compared to other state-of-the-art training procedures. The lower modeling error and the smaller size of the produced models become particularly important when they are used in online applications. This is demonstrated by integrating the model produced by the proposed algorithm in a Model Predictive Control configuration, resulting in better control performance and shorter computational times.
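The core ingredients, a non-symmetric grid of candidate RBF centers and a linear least-squares output layer, can be sketched as follows. The real fuzzy means algorithm selects a subset of centers through a more involved rule; this illustration (toy data, grid sizes and widths all assumed) simply keeps the full anisotropic grid:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(300, 2))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2   # toy target function

# Non-symmetric partition: a different number of fuzzy sets per input
# dimension (here 7 x 3), giving an anisotropic grid of candidate centers.
c0 = np.linspace(0, 1, 7)
c1 = np.linspace(0, 1, 3)
centers = np.array([(a, b) for a in c0 for b in c1])
widths = np.array([c0[1] - c0[0], c1[1] - c1[0]])  # per-dimension widths

# Gaussian hidden layer with per-dimension (non-symmetric) widths
D = ((X[:, None, :] - centers[None, :, :]) / widths) ** 2
Phi = np.exp(-0.5 * D.sum(axis=2))

# linear output weights fitted by least squares
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

The appeal of this family of methods is that the hidden layer is fixed by the partition, so training reduces to one linear least-squares solve, which is what keeps the models cheap enough for online use.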
27.
In this paper we study the problem of parametric minimization of convex piecewise quadratic functions. Our study provides a unifying framework for convex parametric quadratic and linear programs. Furthermore, it extends parametric optimization algorithms to problems with piecewise quadratic cost functions, paving the way for new applications of parametric optimization in explicit dynamic programming and optimal control with quadratic stage cost.
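To fix ideas, a standard formulation of this class of problems can be sketched as follows (the notation is assumed for illustration, not taken from the paper):

```latex
% A convex piecewise quadratic objective, written as a pointwise maximum
% of convex quadratic pieces i = 1, ..., s:
f(x) = \max_{i} \left\{ x^{\top} Q_i x + q_i^{\top} x + r_i \right\},
\qquad Q_i \succeq 0,
% minimized parametrically in \theta over a polyhedral feasible set:
J^{*}(\theta) = \min_{x} \; f(x)
\quad \text{s.t.} \quad A x \le b + S\theta .
```

For a single quadratic piece this reduces to a multi-parametric QP, whose minimizer $x^{*}(\theta)$ is piecewise affine and whose value function $J^{*}(\theta)$ is piecewise quadratic in $\theta$; this is the structure that a unifying piecewise quadratic framework generalizes.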
28.
Migrating organisational services, data and applications to the Cloud is an important strategic decision for organisations due to the large number of benefits introduced by the usage of cloud computing, such as cost reduction and on-demand resources. Despite these benefits, however, there are challenges and risks for cloud adoption related to (amongst others) data leakage, insecure APIs and shared technology vulnerabilities. These challenges need to be understood and analysed in the context of an organisation's security and privacy goals and the relevant cloud computing deployment models. Although the literature provides a large number of works that consider cloud computing security issues, to our knowledge no work supports both the elicitation of security and privacy requirements and the selection of an appropriate cloud deployment model based on such requirements. This work contributes towards closing this gap. In particular, we propose a requirements engineering framework to support the elicitation of security and privacy requirements and the selection of an appropriate deployment model based on the elicited requirements. Our framework provides a modelling language that builds on concepts from requirements, security, privacy and cloud engineering, together with a systematic process. We use a real case study, based on the Greek National Gazette, to demonstrate the applicability of our work.
29.
In this paper, we construct and evaluate all nonisomorphic Latin hypercube designs with n≤16 runs whose use guarantees that the estimates of the first‐order effects are uncorrelated with each other and also uncorrelated with the estimates of the second‐order effects, in polynomial regression models. The produced designs are evaluated using well‐known and popular criteria, and optimal designs are presented in every case studied. An effort to construct nonisomorphic small Latin hypercubes in which only the estimates of the first‐order effects are required to be uncorrelated with each other has also been made, and new designs are presented. All the constructed designs, besides their stand‐alone properties, are useful for the construction of bigger orthogonal Latin hypercubes with desirable properties, using well‐known techniques proposed in the literature. Copyright © 2015 John Wiley & Sons, Ltd.
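The uncorrelatedness property is easy to verify numerically. The design below is a hand-checkable 4-run, 2-factor Latin hypercube given purely as an illustration (it is not claimed to be one of the paper's catalogued designs): each column is a permutation of the centered levels, and the column inner products vanish:

```python
import numpy as np

def random_lhd(n, k, rng):
    """Random Latin hypercube: each column is a permutation of centered levels."""
    levels = np.arange(n) - (n - 1) / 2.0   # centered levels, summing to zero
    return np.column_stack([rng.permutation(levels) for _ in range(k)])

def first_order_correlations(D):
    """Max |correlation| between column pairs (the first-order effects)."""
    R = np.corrcoef(D, rowvar=False)
    return np.max(np.abs(R - np.eye(D.shape[1])))

# A 4-run, 2-factor Latin hypercube with exactly uncorrelated columns
# (levels -1.5, -0.5, 0.5, 1.5); the zero inner product is easy to check by hand.
D = np.array([[-1.5, -0.5],
              [-0.5,  1.5],
              [ 0.5, -1.5],
              [ 1.5,  0.5]])
print(first_order_correlations(D))   # → 0.0
```

A random Latin hypercube from `random_lhd` will generally not have this property, which is why the exhaustive construction and evaluation of nonisomorphic designs is needed.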
30.
Hash functions are special cryptographic algorithms, which are applied wherever message integrity and authentication are critical. Implementations of these functions are cryptographic primitives widely used in common cryptographic schemes and security protocols such as Internet Protocol Security (IPSec) and Virtual Private Network (VPN). In this paper, a novel FPGA implementation of the Secure Hash Algorithm 1 (SHA-1) is proposed. The proposed architecture exploits the benefits of pipelining and re-timing of execution through pre-computation of intermediate temporal values. Pipelining allows division of the calculation of the hash value into four discrete stages, corresponding to the four required rounds of the algorithm. Re-timing is based on the decomposition of the SHA-1 expression to separate information dependencies and independencies. This allows pre-computation of intermediate temporal values in parallel with the calculation of other independent values. Exploiting the information dependencies, the fundamental operational block of SHA-1 is modified so that the maximum operating frequency is increased by approximately 30% with negligible area penalty compared to other academic and commercial implementations. The proposed SHA-1 hash function was prototyped and verified using a XILINX FPGA device. The implementation's characteristics are compared to alternative implementations proposed by academia and industry, which are available in the international IP market. The proposed implementation achieved a throughput exceeding 2.5 Gbps, the highest among all similar IP cores for the targeted XILINX technology.
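The pre-computation idea can be illustrated in software. The sketch below is plain Python, not the FPGA design: it runs the SHA-1 compression loop while forming the next round's e + K + W partial sum before the current round finishes, which is a software analogue of the re-timing described (constants and padding follow the standard SHA-1 specification; the single-block restriction is for brevity):

```python
import hashlib
import struct

MASK = 0xFFFFFFFF
rotl = lambda x, n: ((x << n) | (x >> (32 - n))) & MASK

def sha1_one_block(block: bytes) -> bytes:
    """SHA-1 of a single pre-padded 64-byte block, with the e + K + W term
    pre-computed one round ahead."""
    assert len(block) == 64
    W = list(struct.unpack('>16I', block))
    for t in range(16, 80):
        W.append(rotl(W[t-3] ^ W[t-8] ^ W[t-14] ^ W[t-16], 1))

    h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
    a, b, c, d, e = h
    K = [0x5A827999, 0x6ED9EBA1, 0x8F1BBCDC, 0xCA62C1D6]
    F = [lambda b, c, d: (b & c) | (~b & d),
         lambda b, c, d: b ^ c ^ d,
         lambda b, c, d: (b & c) | (b & d) | (c & d),
         lambda b, c, d: b ^ c ^ d]

    pre = (e + K[0] + W[0]) & MASK        # partial sum computed before round 0
    for t in range(80):
        temp = (rotl(a, 5) + F[t // 20](b, c, d) + pre) & MASK
        # the next round's e equals the current d, so its partial sum can be
        # formed now; this is the addition the hardware overlaps with rotl/F
        if t < 79:
            pre = (d + K[(t + 1) // 20] + W[t + 1]) & MASK
        a, b, c, d, e = temp, a, rotl(b, 30), c, d
    digest = [(x + y) & MASK for x, y in zip(h, [a, b, c, d, e])]
    return struct.pack('>5I', *digest)

# single-block message "abc", padded per the SHA-1 specification
msg = b'abc'
block = (msg + b'\x80' + b'\x00' * (64 - len(msg) - 1 - 8)
         + struct.pack('>Q', 8 * len(msg)))
print(sha1_one_block(block).hex() == hashlib.sha1(msg).hexdigest())  # → True
```

In hardware the payoff is larger than in software: the pre-computed addition leaves only the rotate, the Boolean function and one final addition on the round's critical path, which is what enables the reported frequency gain.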