Similar Documents
20 similar documents found (search time: 31 ms)
1.
Noise is ubiquitous in genetic regulatory networks (GRNs). Gene regulation is inherently a stochastic process because of intrinsic and extrinsic noises that cause kinetic parameter variations and basal rate disturbance. Time delays are usually inevitable due to the different biochemical reactions in such GRNs. In this paper, a delayed stochastic model with additive and multiplicative noises is utilized to describe stochastic GRNs. A feedback gene controller design scheme is proposed to guarantee that the GRN is mean-square asymptotically stable with noise attenuation, where the structure of the controllers can be specified according to engineering requirements. By applying control theory and mathematical tools, an analytical solution to the control design problem is given, which helps to provide some insight into synthetic biology and systems biology. The control scheme is applied to a three-gene network to illustrate the applicability and usefulness of the design. Copyright © 2010 John Wiley & Sons, Ltd.
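The kind of model described above can be sketched numerically. The following is a minimal illustration, not the paper's actual model or controller: an Euler-Maruyama simulation of a single gene with delayed negative feedback and both additive and multiplicative noise terms. All parameter values (`k`, `gamma`, `tau`, the noise intensities) are illustrative assumptions.

```python
import math
import random

def simulate_grn(k=1.0, gamma=0.8, tau=5.0, sigma_add=0.05, sigma_mul=0.05,
                 dt=0.01, steps=20000, seed=42):
    """Euler-Maruyama simulation of one gene with delayed negative feedback.
    drift: Hill-type repression by the delayed state minus linear decay;
    noise: additive term plus a state-proportional (multiplicative) term."""
    random.seed(seed)
    delay = int(tau / dt)
    x = [1.0] * (delay + 1)            # history buffer for the delayed state
    for _ in range(steps):
        x_tau = x[0]                    # state tau time units in the past
        drift = k / (1.0 + x_tau ** 2) - gamma * x[-1]
        dw = random.gauss(0.0, math.sqrt(dt))
        noise = sigma_add * dw + sigma_mul * x[-1] * dw
        x.append(x[-1] + drift * dt + noise)
        x.pop(0)
    return x[-1]
```

With the restoring decay term, trajectories stay bounded near the deterministic steady state; a feedback controller in the paper's sense would add a designed term to `drift` to attenuate the noise contribution.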

2.
3.
Computer integrated design systems will require an improved understanding of the engineering design process, including formalized notions of (1) the structure of the process, (2) the design tasks that need to be performed, and (3) the information required to carry out these tasks. To help formalize the design process, certain conceptual tools are needed, including an organizational model for the process. This paper outlines such a model for structural engineering design, termed the Multilevel Selection-Development model, which is based on a decomposition of design problems into selection and development subproblems. A few basic concepts involved in the engineering design process are reviewed, and the proposed model is outlined. The application of the model to a simple example is presented, and management of the interactions between subproblems is discussed.

4.
One of the grand challenges for computer applications is the creation of a system that will provide accurate computer simulations of physical objects coupled with powerful design optimization tools to allow optimum prototyping and the final design of a broad range of physical objects. We refer to such a software environment as electronic prototyping for physical object design (EPPOD). The research challenges in building such systems are in software integration, in utilizing massive parallelism to satisfy their large computational requirements, in incorporating knowledge into the entire electronic prototyping process, in creating intelligent user interfaces for such systems, and in advancing the algorithmic infrastructure needed to support the desired functionality. In this paper we address issues related to the parallel processing of the computationally intensive components of the EPPOD problem solving environment on message passing parallel machines and present its software architecture. The parallel methodology adopted to map the underlying computations to parallel machines is based on the optimal decomposition of continuous and discrete geometric data associated with the physical object. One of the main goals of this methodology is the reuse of existing software parts while implementing various components of the EPPOD system on parallel computational environments. Finally, some performance data of the parallel algorithmic infrastructure developed are listed and discussed.

5.
Slag foaming is a steel-making process that has been shown to improve the efficiency of electric arc furnace plants. Unfortunately, slag foaming is a highly dynamic process that is difficult to control. This paper describes the development of an adaptive, intelligent control system for effectively manipulating the slag foaming process. The level-2 intelligent control system developed is based on three techniques from the field of computational intelligence (CI): (1) fuzzy logic, (2) genetic algorithms, and (3) neural networks. Results indicate that the computer software architecture presented in this paper is suitable for effectively manipulating complex engineering systems characterized by relatively slow process dynamics like those of a slag foaming operation.

6.

Modern computers allow a methodical search of possibly billions of experiments and the exploitation of interactions that are not known in advance. This enables a bottom-up process of design by assembling or configuring systems and testing the degree to which they fulfill the desired goal. We give two detailed examples of this process. One is referred to as Cartesian genetic programming and the other evolution-in-materio. In the former, evolutionary algorithms are used to exploit the interactions of software components representing mathematical, logical, or computational elements. In the latter, evolutionary algorithms are used to manipulate physical systems particularly at the electrical or electronic level. We compare and contrast both approaches and discuss possible new research directions by borrowing ideas from one and using them in the other.

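The Cartesian genetic programming approach described above can be illustrated with a toy implementation: a linear row of function nodes whose connection genes may only point backwards, evolved with the 1+λ strategy (accepting neutral moves) that is standard in CGP. The function set, grid size, and target function here are all illustrative assumptions, not those of any particular study.

```python
import random

# Illustrative function set for the nodes
FUNCS = [lambda a, b: a + b, lambda a, b: a - b,
         lambda a, b: a * b, lambda a, b: a if abs(a) > abs(b) else b]

N_NODES, N_INPUTS = 10, 1

def random_genome():
    """Each gene: (function index, two source indices into earlier values)."""
    g = []
    for i in range(N_NODES):
        max_src = N_INPUTS + i
        g.append((random.randrange(len(FUNCS)),
                  random.randrange(max_src), random.randrange(max_src)))
    return g

def evaluate(genome, x):
    vals = [x]
    for f, a, b in genome:
        vals.append(FUNCS[f](vals[a], vals[b]))
    return vals[-1]                     # output taken from the last node

def fitness(genome, target=lambda x: x * x + x):
    """Sum of squared errors against an assumed symbolic-regression target."""
    return sum((evaluate(genome, x) - target(x)) ** 2 for x in range(-5, 6))

def mutate(genome, rate=0.2):
    child = []
    for i, gene in enumerate(genome):
        if random.random() < rate:
            max_src = N_INPUTS + i
            gene = (random.randrange(len(FUNCS)),
                    random.randrange(max_src), random.randrange(max_src))
        child.append(gene)
    return child

def evolve(gens=200, lam=4, seed=1):
    """1+lambda evolution; neutral moves (f == best) are accepted, as in CGP."""
    random.seed(seed)
    parent = random_genome()
    best = fitness(parent)
    for _ in range(gens):
        for _ in range(lam):
            child = mutate(parent)
            f = fitness(child)
            if f <= best:
                parent, best = child, f
    return best
```

Because the parent is only replaced when the child is at least as fit, best fitness is non-increasing over generations, which is easy to check empirically.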

7.
Many methods of performing mechanism synthesis rely on an attempt to redefine the dimensions of the system in such a way that a deviation from the desired behavior is minimized by the use of optimization methods. During the optimization process the optimizer may, however, suggest values of the dimensions, or design variables, that lead to infeasible designs, i.e. dimensions for which the mechanism cannot be assembled in one or more positions. With the method proposed here, this problem is overcome by allowing the dimensions to vary during the motion of the system and subsequently minimizing the deviation of each variable dimension over a cycle. That is, for each time step the dimensions are allowed to change in order to obtain assembly as well as the desired kinematic behavior. This leads to a variation of each dimension during a cycle of the mechanism, and it is this variation that the method seeks to minimize. The minimization problem is solved using the optimality criterion method.
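The idea of letting a dimension vary over the cycle and then minimizing its variation can be shown on a deliberately simple toy: a two-link dyad whose second link length is recomputed at each prescribed position so that assembly is always possible, after which the variance of that length over the cycle is the quantity to minimize. The dyad geometry and the closest-approach assumption are illustrative, not the paper's mechanism.

```python
import math

def required_lengths(points, pivot=(0.0, 0.0), l1=2.0):
    """For a two-link dyad (ground pivot -> joint -> coupler point), compute
    the second link length needed to reach each prescribed point when the
    first link length l1 is fixed, taking the closest-approach configuration."""
    out = []
    for (px, py) in points:
        d = math.hypot(px - pivot[0], py - pivot[1])
        out.append(abs(d - l1))         # gap between the l1 circle and the point
    return out

def length_variation(lengths):
    """Variance of the 'variable dimension' over the cycle: the objective
    an optimizer would drive toward zero to recover a rigid mechanism."""
    mean = sum(lengths) / len(lengths)
    return sum((l - mean) ** 2 for l in lengths) / len(lengths)
```

When the prescribed path happens to be reachable with a constant second link (e.g. points on a circle concentric with the pivot), the variation is zero and the design is feasible as a rigid mechanism.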

8.
Future systems will be too complex to design and implement explicitly. Instead, we will have to learn to engineer complex behaviours indirectly: through the discovery and application of local rules of behaviour, applied to simple process components, from which desired behaviours predictably emerge through dynamic interactions between massive numbers of instances. This paper describes a process-oriented architecture for fine-grained concurrent systems that enables experiments with such indirect engineering. Examples are presented showing the differing complex behaviours that can arise from minor (non-linear) adjustments to low-level parameters, the difficulties in suppressing the emergence of unwanted (bad) behaviour, the unexpected relationships between apparently unrelated physical phenomena (shown up by their separate emergence from the same primordial process swamp) and the ability to explore and engineer completely new physics (such as force fields) by their emergence from low-level process interactions whose mechanisms can only be imagined, but not built, at the current time.
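The flavour of "simple local rules, complex global behaviour" can be demonstrated in a few lines with a one-dimensional cellular automaton, a classic stand-in for the massively concurrent process components described above (the paper's own architecture is process-oriented and far richer; this is only an illustration). Rule 110 is a well-known local rule that produces highly complex global patterns.

```python
def step(cells, rule=110):
    """One synchronous update of a 1-D binary cellular automaton with
    periodic boundaries: each cell applies the same 3-neighbour local rule,
    encoded in the bits of `rule`, yet the emergent global pattern (for
    rule 110) is famously complex."""
    n = len(cells)
    out = []
    for i in range(n):
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> idx) & 1)
    return out
```

Changing a single bit of the rule number is exactly the kind of minor low-level adjustment that can flip the system between trivial and complex behaviour.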

9.
This paper is concerned with the design and implementation of a simple but effective switching control strategy to regulate the dynamics of synthetic gene networks. The testbed circuit is IRMA, a recently developed network in S. cerevisiae which has been proposed as a suitable benchmark problem for control design. A proof-of-concept in vivo implementation of the control strategy is presented, based on the use of microfluidic devices and fluorescence microscopy. Preliminary experimental results are given showing good agreement between the model predictions and the experimental observations.

10.
The sibling disciplines of systems and synthetic biology are both engaged in unraveling the complexity of biological networks. One tries to understand the design principles of existing networks, while the other tries to engineer artificial gene networks with predicted functions. The significant role that computational intelligence can play in steering this life-engineering discipline toward its ultimate goal has been acknowledged since the field's birth. However, as the field faces many challenges in building complex modules and systems from simpler parts and devices, whether from scratch or through redesign, the role of computational assistance becomes even more crucial. Evolutionary computation, falling under the broader domain of artificial intelligence, is well acknowledged for its ability to find near-optimal solutions to poorly known and partially understood problems. Since the post-genome period, these natural-selection-simulating algorithms have played a noteworthy role in identifying, analyzing, and optimizing different types of biological networks. This article calls attention to how evolutionary computation can help synthetic biologists assemble larger network systems from Lego-like parts.

11.
Synthetic biology aims to engineer and redesign biological systems for useful real-world applications in biomanufacturing, biosensing and biotherapy, following a typical design-build-test cycle. Inspired by computer science and electronics, synthetic gene circuits have been designed to exert control over the flow of information in biological systems. Two types of computation are used: digital logic, inspired by Boolean TRUE/FALSE operations, and graded analog computation. Key principles for gene circuit engineering include modularity, orthogonality, predictability and reliability. Initial circuits in the field were small and hampered by a lack of modular and orthogonal components; in recent years, however, the library of available parts has increased vastly. New tools for high-throughput DNA assembly and characterization have been developed, enabling rapid prototyping, systematic in situ characterization, and automated design and assembly of circuits. Recently implemented computing paradigms in circuit memory and distributed computing using cell consortia are also discussed. Finally, we examine existing challenges in building predictable large-scale circuits, including modularity, context dependency and metabolic burden, as well as the tools and methods used to resolve them. These new trends and techniques have the potential to accelerate the design of larger gene circuits and to increase our basic understanding of circuit and host behaviour.
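The digital-versus-analog distinction drawn above is commonly modelled with Hill functions: a high Hill coefficient gives a switch-like (digital) response, a low one a graded (analog) response, and multiplying two activating responses gives a toy transcriptional AND gate. The parameter values below are illustrative assumptions, not measured circuit characteristics.

```python
def hill(x, K=1.0, n=4):
    """Activating Hill function: large n gives a switch-like (digital)
    response around threshold K; n = 1 gives a graded (analog) response."""
    return x ** n / (K ** n + x ** n)

def and_gate(a, b, K=1.0, n=4):
    """Toy transcriptional AND gate: appreciable output only when both
    inducer concentrations a and b are above threshold."""
    return hill(a, K, n) * hill(b, K, n)
```

Sweeping `n` from 1 upward reproduces the transition from analog computation to Boolean-like logic that the abstract describes.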

12.
Community structure is an important topological feature of complex networks. Detecting community structure is a highly challenging problem in analyzing complex networks and has great importance in understanding the function and organization of networks. Up until now, numerous algorithms have been proposed for detecting community structure in complex networks. A wide range of these algorithms use the maximization of a quality function called modularity. In this article, three different algorithms, namely, MEM-net, OMA-net, and GAOMA-net, have been proposed for detecting community structure in complex networks. In GAOMA-net algorithm, which is the main proposed algorithm of this article, the combination of genetic algorithm (GA) and object migrating automata (OMA) has been used. In GAOMA-net algorithm, the MEM-net algorithm has been used as a heuristic to generate a portion of the initial population. The experiments on both real-world and synthetic benchmark networks indicate that GAOMA-net algorithm is efficient for detecting community structure in complex networks.
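The modularity quality function mentioned above is concrete and compact enough to compute directly. The sketch below implements the standard Newman-Girvan modularity Q for an undirected graph; it is the objective such algorithms maximize, not the GAOMA-net algorithm itself.

```python
def modularity(adj, communities):
    """Newman-Girvan modularity Q = (1/2m) * sum_ij [A_ij - k_i k_j / 2m]
    over node pairs in the same community, for an undirected graph given as
    an adjacency dict {node: set of neighbours} and a node -> community map."""
    m2 = sum(len(nb) for nb in adj.values())        # 2m (each edge counted twice)
    q = 0.0
    for i in adj:
        for j in adj:
            if communities[i] != communities[j]:
                continue
            a_ij = 1.0 if j in adj[i] else 0.0
            q += a_ij - len(adj[i]) * len(adj[j]) / m2
    return q / m2
```

On two triangles joined by a single bridge edge, splitting at the bridge yields Q = 5/14, the expected value for that classic example.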

13.
The field of computational intelligence (CI) is primarily concerned with the development of computer systems that are capable of adapting to and exploiting information about their environments, much as organisms in natural systems do. It is no coincidence, therefore, that the field of CI relies heavily on computational techniques patterned after natural systems. Many of these techniques, including neural networks, genetic algorithms, and fuzzy logic, have demonstrated their utility in solving problems independently of other methods. However, as the systems we seek to control, design, and improve become increasingly complex, it is unlikely that any single CI technique will prove adequate. This paper describes an architecture combining the three CI techniques listed above that can be used to produce process control systems suitable for effectively manipulating complex engineering systems characterized by relatively slow process dynamics. Implementation of the architecture results in a level-two intelligent control system. The effectiveness of the level-two intelligent controller is demonstrated via application to an operating phosphate processing plant.

14.
Context: Software networks are directed graphs of static dependencies between source code entities (functions, classes, modules, etc.). These structures can be used to investigate the complexity and evolution of large-scale software systems and to compute metrics associated with software design. The extraction of software networks is also the first step in reverse engineering activities.
Objective: The aim of this paper is to present SNEIPL, a novel approach to the extraction of software networks that is based on a language-independent, enriched concrete syntax tree representation of the source code.
Method: The applicability of the approach is demonstrated by the extraction of software networks representing real-world, medium to large software systems written in different languages belonging to different programming paradigms. To investigate the completeness and correctness of the approach, class collaboration networks (CCNs) extracted from real-world Java software systems are compared to CCNs obtained by other tools. Namely, we used Dependency Finder, which extracts entity-level dependencies from Java bytecode, and Doxygen, which implements a language-independent fuzzy parsing approach to dependency extraction. We also compared SNEIPL to fact extractors present in language-independent reverse engineering tools.
Results: Our approach to dependency extraction is validated on six real-world medium to large-scale software systems written in Java, Modula-2, and Delphi. The results of the comparative analysis involving ten Java software systems show that the networks formed by SNEIPL are highly similar to those formed by Dependency Finder and more precise than the comparable networks formed with the help of Doxygen. Regarding the comparison with language-independent reverse engineering tools, SNEIPL provides both language-independent extraction and representation of fact bases.
Conclusion: SNEIPL is a language-independent extractor of software networks and consequently enables language-independent network-based analysis of software systems, computation of software design metrics, and extraction of fact bases for reverse engineering activities.
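What "extracting a software network from a syntax tree" means can be illustrated on a tiny scale with Python's stdlib `ast` module: walk the tree, and record a caller -> callee edge for every simple function call found inside a function body. SNEIPL itself works on an enriched concrete syntax tree across multiple languages; this single-language toy only conveys the idea.

```python
import ast

def call_edges(source):
    """Extract a toy function-level dependency network (caller -> callee,
    by simple name) from Python source using the stdlib ast module."""
    tree = ast.parse(source)
    edges = set()
    for fn in [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]:
        for node in ast.walk(fn):
            if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                edges.add((fn.name, node.func.id))
    return edges
```

The resulting edge set is exactly the kind of directed dependency graph on which design metrics (fan-in, fan-out, coupling) can then be computed.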

15.
This paper describes the use of an evolutionary design system known as GANNET to synthesize the structure of neural networks. Initial results are presented for two benchmark problems: the exclusive-or and the two-spirals. A variety of performance criteria and design components are used and comparisons are drawn between the performance of genetic algorithms and other related techniques on these problems.
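The flavour of evolving neural networks for the exclusive-or benchmark can be sketched with a deliberately small neuroevolution loop. Note the simplification: GANNET evolves network *structure*, whereas this sketch evolves only the weights of a fixed 2-2-1 network with truncation selection and Gaussian mutation; the population size, mutation scale, and topology are illustrative assumptions.

```python
import math
import random

def forward(w, x):
    """Fixed 2-2-1 network: tanh hidden units, sigmoid output; 9 weights."""
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return 1.0 / (1.0 + math.exp(-(w[6] * h1 + w[7] * h2 + w[8])))

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def error(w):
    return sum((forward(w, x) - t) ** 2 for x, t in XOR)

def evolve(pop_size=40, gens=100, seed=3):
    """Elitist truncation GA: keep the better half, mutate it to refill."""
    random.seed(seed)
    pop = [[random.uniform(-2, 2) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=error)
        survivors = pop[:pop_size // 2]
        children = [[g + random.gauss(0, 0.3) for g in p] for p in survivors]
        pop = survivors + children
    pop.sort(key=error)
    return error(pop[0])                # best squared error found
```

Because the better half is always retained, the best error can never increase from one generation to the next.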

16.
The main aim of this paper is to present MOGUL, a Methodology to Obtain Genetic fuzzy rule-based systems Under the iterative rule Learning approach. MOGUL consists of a set of design guidelines that allow us to obtain different genetic fuzzy rule-based systems, i.e., evolutionary-algorithm-based processes to automatically design fuzzy rule-based systems by learning and/or tuning the fuzzy rule base, all following the same generic structure and able to cope with problems of a different nature. A specific evolutionary learning process obtained from the proposed paradigm for designing unconstrained approximate Mamdani-type fuzzy rule-based systems is introduced, and its accuracy in solving a real-world electrical engineering problem is analyzed. ©1999 John Wiley & Sons, Inc.
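For readers unfamiliar with the Mamdani-type systems mentioned above, the inference step they share can be sketched generically: min implication, max aggregation, and centroid defuzzification over a discretized output domain. This is a textbook sketch, not the MOGUL learning process; the triangular membership functions and rule base in the test are illustrative.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani(x, rules, ydom):
    """Minimal Mamdani inference for one input: each rule pairs an antecedent
    membership function with a consequent membership function; firing strength
    clips the consequent (min), rules are aggregated by max, and the output is
    the centroid of the aggregated fuzzy set over the grid ydom."""
    num = den = 0.0
    for y in ydom:
        mu = max(min(ante(x), cons(y)) for ante, cons in rules)
        num += y * mu
        den += mu
    return num / den if den else 0.0
```

A genetic fuzzy rule-based system in MOGUL's sense would learn or tune the `rules` themselves; the inference machinery stays the same.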

17.
In this paper we summarize our experiences in building and integrating new-generation, formal-methods-based computer-aided software engineering (CASE) tools to yield pragmatic improvements in software engineering processes in the telecommunication industry. We define an accelerated development methodology (ADM) for the specification, design, testing and re-engineering of telecommunications software. We identify two of the most significant barriers to the adoption of tools and formal methods to speed up software development, namely the requirements engineering barrier and the legacy code re-engineering barrier, and show how the ADM methodology helps to overcome these barriers and improve time-to-market for telecommunications software. Our ADM methodology is based on the most widely accepted formal languages standardized by the International Telecommunication Union (ITU):

This paper emphasizes the following key components of our ADM methodology and their placement within the most common software engineering processes:

Author Keywords: Time-to-market; SDL tools; Formal methods; Software engineering processes; Telecommunications; Accelerated development

18.
Programming new cellular functions by using synthetic gene circuits is a key goal in synthetic biology, and an important element of this process is the ability to couple to the information processing systems of the host cell using synthetic systems with various signal-response characteristics. Here, we present a synthetic gene system in Escherichia coli whose signal-response curve may be tuned from band detection (strongest response within a band of input concentrations) to a switch-like sigmoidal response, simply by altering the temperature. This change from a band-detection response to a sigmoidal response has not previously been implemented. The system allows investigation of a range of signal-response behaviours with minimal effort: a single system, once inserted into the cells, provides a range of response curves without any genetic alterations or replacement with other systems. By altering its output, the system may couple to other synthetic or natural genetic circuits, and thus serve as a useful modular component. A mathematical model has also been developed which captures the essential qualitative behaviours of the circuit.  相似文献
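A band-detection response of the kind described above is often modelled as the product of a low-threshold activation term and a high-threshold repression term, so that output peaks only in an intermediate band of input. The thresholds and Hill coefficient below are illustrative assumptions, not parameters of the E. coli system.

```python
def band_detector(s, k_low=0.1, k_high=10.0, n=2):
    """Toy band detector: the output is activated once the input s exceeds
    k_low but repressed once s approaches k_high, so the response peaks in
    the band between the two thresholds."""
    act = s ** n / (k_low ** n + s ** n)        # low-threshold activation
    rep = k_high ** n / (k_high ** n + s ** n)  # high-threshold repression
    return act * rep
```

Collapsing the two thresholds together (the role played by temperature in the paper's circuit) turns this band response into a simple sigmoidal switch.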

19.
Genetic process mining: an experimental evaluation
One of the aims of process mining is to retrieve a process model from an event log. The discovered models can be used as objective starting points during the deployment of process-aware information systems (Dumas et al., eds., Process-Aware Information Systems: Bridging People and Software Through Process Technology. Wiley, New York, 2005) and/or as a feedback mechanism to check prescribed models against enacted ones. However, current techniques have problems when mining processes that contain non-trivial constructs and/or when dealing with the presence of noise in the logs. Most of these problems occur because many current techniques are based on local information in the event log. To overcome these problems, we use genetic algorithms to mine process models. The main motivation is to benefit from the global search performed by this kind of algorithm. The non-trivial constructs are tackled by choosing an internal representation that supports them. The problem of noise is naturally tackled by the genetic algorithm because, by definition, these algorithms are robust to noise. The main challenge in a genetic approach is the definition of a good fitness measure, because it guides the global search performed by the genetic algorithm. This paper explains how the genetic algorithm works. Experiments with synthetic and real-life logs show that the fitness measure indeed leads to the mining of process models that are complete (can reproduce all the behavior in the log) and precise (do not allow for extra behavior that cannot be derived from the event log). The genetic algorithm is implemented as a plug-in in the ProM framework.
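The completeness/precision trade-off at the heart of such a fitness measure can be sketched on a drastically simplified model representation: here a "model" is just a set of allowed directly-follows pairs, completeness rewards replaying the log's transitions, and precision penalizes allowed transitions the log never exhibits. The representation and the 0.8/0.2 weighting are illustrative assumptions, not the paper's actual fitness definition.

```python
def fitness(model_edges, log_traces, alphabet):
    """Toy fitness in the spirit of genetic process mining: completeness
    (fraction of log transitions the model can replay) combined with
    precision (penalty for extra behaviour the log never shows)."""
    needed = set()
    replayed = total = 0
    for trace in log_traces:
        for a, b in zip(trace, trace[1:]):
            needed.add((a, b))
            total += 1
            if (a, b) in model_edges:
                replayed += 1
    completeness = replayed / total
    extra = len(model_edges - needed)
    possible_extra = len(alphabet) ** 2 - len(needed)
    precision = 1.0 - extra / possible_extra if possible_extra else 1.0
    return 0.8 * completeness + 0.2 * precision
```

A model that replays the whole log with no extra transitions scores 1.0; adding behaviour the log cannot justify lowers the score, which is what steers the genetic search.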

20.
In this work, we develop a stochastic model, GOP ARIMA (autoregressive integrated moving average for a group of pictures), for VBR processes with a regular GOP pattern. It explicitly incorporates the deterministic time-dependent behavior of frame-level VBR traffic. The GOP ARIMA model elaborately represents the inter- and intra-GOP sample autocorrelation structures and provides a physical explanation of the observed stochastic characteristics of the empirical VBR process. We explain these stochastic characteristics, e.g., slowly decaying sample autocorrelations and strong correlations at lags that are multiples of the GOP size, in terms of the nonstationarity of the underlying process. The GOP ARIMA model generates synthetic traffic that has the same multiplicative periodic sample autocorrelation structure and the same slowly decaying autocorrelations as the empirical VBR process. The simulation results show that the GOP ARIMA process captures the behavior of the empirical process very well in various respects: packet loss, packet delay, and frame corruption. Our work contributes not only a theoretical explanation of the observed characteristics of the empirical VBR process but also an efficient method for generating more realistic synthetic sequences for various engineering purposes and for predicting future bandwidth requirements.
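The structure of such synthetic VBR traffic can be conveyed with a grossly simplified stand-in for a GOP ARIMA generator: a deterministic periodic pattern of I/P/B frame sizes modulated by an AR(1) noise process, which already produces the periodic sample autocorrelation flavour the abstract describes. The GOP pattern, AR coefficient, and noise level are all illustrative assumptions, not the paper's fitted model.

```python
import math
import random

def synth_gop_traffic(n_gops=100,
                      gop=(8.0, 3.0, 3.0, 1.5, 3.0, 1.5, 3.0, 1.5),
                      phi=0.7, sigma=0.2, seed=7):
    """Toy VBR generator: a deterministic per-frame GOP pattern (large I
    frame, medium P frames, small B frames) multiplied by exp of an AR(1)
    process, giving correlated, strictly positive frame sizes."""
    random.seed(seed)
    z, out = 0.0, []
    for _ in range(n_gops):
        for base in gop:
            z = phi * z + random.gauss(0.0, sigma)   # AR(1) modulation
            out.append(base * math.exp(z))
    return out

def mean_frame_size(traffic, period, offset):
    """Average size of frames at a fixed position within the GOP."""
    vals = traffic[offset::period]
    return sum(vals) / len(vals)
```

Averaging by position within the GOP recovers the deterministic pattern (I frames largest), while the AR(1) term supplies the slowly decaying correlations between GOPs.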
