Similar Documents
20 similar documents found (search time: 15 ms)
1.
The purpose of this study is to survey the use of networks and network-based methods in systems biology. The study starts with an introduction to graph theory and basic measures that allow quantifying structural properties of networks. The authors then present important network classes and gene networks, as well as methods for their analysis. In the last part of the study, the authors review approaches that aim at analysing the functional organisation of gene networks and the use of networks in medicine. In addition, the authors advocate networks as a systematic approach to general problems in systems biology, because networks are capable of assuming multiple roles that are beneficial in connecting experimental data with a functional interpretation in biological terms.

2.
Stem cells have the capability to self-renew and maintain their undifferentiated state or to differentiate into one or more specialised cell types. Stem cell expansion and manipulation ex vivo is a promising approach for engineering cell replacement therapies, and endogenous stem cells represent potential drugable targets for tissue repair. Before we can harness stem cells' therapeutic potential, we must first understand the intracellular mechanisms controlling their fate choices. These mechanisms involve complex signal transduction and gene regulation networks that feature, for example, intricate feed-forward loops, feedback loops and cross-talk between multiple signalling pathways. Systems biology applies computational and experimental approaches to investigate the emergent behaviour of collections of molecules and strives to explain how these numerous components interact to regulate molecular, cellular and organismal behaviour. Here we review systems biology, and in particular computational, efforts to understand the intracellular mechanisms of stem cell fate choice. We first discuss deterministic and stochastic models that synthesise molecular knowledge into mathematical formalism, enable simulation of important system behaviours and stimulate further experimentation. In addition, statistical analyses such as Bayesian networks and principal components analysis (PCA)/partial least squares (PLS) regression can distill large datasets into more readily managed networks and principal components that provide insights into the critical aspects and components of regulatory networks. Collectively, integrating modelling with experimentation has strong potential for enabling a deeper understanding of stem cell fate choice and thereby aiding the development of therapies to harness stem cells' therapeutic potential.
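The dimensionality-reduction step described above can be illustrated with a minimal PCA sketch. The data here are simulated (two planted latent "regulatory programs" plus noise), not drawn from any stem cell study:

```python
import numpy as np

# Simulated gene-expression matrix (samples x genes) driven by two
# latent programs; PCA via SVD should recover that low-rank structure.
rng = np.random.default_rng(0)
n_samples, n_genes = 50, 200

latent = rng.normal(size=(n_samples, 2))
loadings = rng.normal(size=(2, n_genes))
X = latent @ loadings + 0.1 * rng.normal(size=(n_samples, n_genes))

# Centre the data, then take the SVD; principal axes are rows of Vt.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)

# The two planted programs should capture nearly all the variance.
print(f"variance explained by first 2 PCs: {explained[:2].sum():.3f}")
```

Because the simulated signal is rank two and the noise is small, the first two components dominate; with real expression data the spectrum decays more gradually and the cut-off is a judgment call.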

3.
Living systems comprise interacting biochemical components in very large networks. Given their high connectivity, biochemical dynamics are surprisingly not chaotic but quite robust to perturbations—a feature C.H. Waddington named canalization. Because organisms are also flexible enough to evolve, they arguably operate in a critical dynamical regime between order and chaos. The established theory of criticality is based on networks of interacting automata where Boolean truth values model presence/absence of biochemical molecules. The dynamical regime is predicted using network connectivity and node bias (to be on/off) as tuning parameters. Revising this to account for canalization leads to a significant improvement in dynamical regime prediction. The revision is based on effective connectivity, a measure of dynamical redundancy that buffers automata response to some inputs. In both random and experimentally validated systems biology networks, reducing effective connectivity makes living systems operate in stable or critical regimes even though the structure of their biochemical interaction networks predicts them to be chaotic. This suggests that dynamical redundancy may be naturally selected to maintain living systems near critical dynamics, providing both robustness and evolvability. By identifying how dynamics propagates preferentially via effective pathways, our approach helps to identify precise ways to design and control network models of biochemical regulation and signalling.
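The classical automata setup the abstract refers to can be sketched as a random Boolean network. In this standard model the sensitivity parameter 2p(1-p)K (connectivity K, bias p) equals 1 at criticality; the values below are illustrative, and this sketch does not include the paper's effective-connectivity revision:

```python
import numpy as np

# Random Boolean network: N nodes, each reading K random inputs through
# a random lookup table whose outputs are 1 with probability p.
# K=2, p=0.5 sits at the classical critical point 2*p*(1-p)*K = 1.
rng = np.random.default_rng(1)
N, K, p = 100, 2, 0.5

inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
tables = (rng.random(size=(N, 2**K)) < p).astype(int)

def step(state):
    # Each node applies its lookup table to the states of its K inputs.
    idx = np.zeros(N, dtype=int)
    for k in range(K):
        idx = idx * 2 + state[inputs[:, k]]
    return tables[np.arange(N), idx]

# Derrida-style probe of the dynamical regime: flip one node and track
# how far the perturbed trajectory diverges (Hamming distance).
state = rng.integers(0, 2, size=N)
perturbed = state.copy()
perturbed[0] ^= 1
for _ in range(20):
    state, perturbed = step(state), step(perturbed)
hamming = int(np.sum(state != perturbed))
print(f"Hamming distance after 20 steps (K={K}, p={p}): {hamming}")
```

In the ordered regime the perturbation dies out (distance near 0), in the chaotic regime it spreads to a finite fraction of nodes; the paper's point is that canalizing rules lower the *effective* connectivity, pushing structurally chaotic networks toward the ordered side.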

4.
The presented work deals with the application of artificial neural networks to the modelling of the thermal decomposition of friction composite systems based on polymer matrices reinforced by yarns. The thermal decomposition of an automotive clutch friction composite system, consisting of a polymer blend reinforced by yarns of organic, inorganic and metallic fibres impregnated with resin, as well as of its individual components, was monitored by non-isothermal thermogravimetry over a wide temperature range. A supervised feed-forward back-propagation multi-layer artificial neural network model, with temperature as the only input parameter, has been developed to predict the thermogravimetric curves of weight loss and the time derivative of weight loss for the studied friction composite system and its individual components, acquired at a fixed constant heating rate under a pure dry nitrogen atmosphere at a constant flow rate. It has been shown that an optimised model with a 1-25-6 artificial neural network architecture trained by the Levenberg-Marquardt algorithm is able to predict all the analysed experimental thermogravimetric curves simultaneously with a high level of reliability, and that it thus represents a highly effective artificial intelligence tool for modelling the thermal stability even of relatively complicated friction composite systems.
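The 1-25-6 architecture (one temperature input, 25 hidden units, 6 outputs for the monitored curves) can be sketched as follows. For brevity the Levenberg-Marquardt training used in the paper is replaced by a fixed random tanh hidden layer with a least-squares fit of the output weights, and the six "weight loss" target curves are synthetic sigmoidal decays, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(2)
T = np.linspace(0.0, 1.0, 200).reshape(-1, 1)   # normalised temperature

# Synthetic residual-weight targets: six shifted sigmoidal decays.
shifts = np.linspace(0.3, 0.8, 6)
Y = 1.0 / (1.0 + np.exp(10.0 * (T - shifts)))   # shape (200, 6)

# Hidden layer: 25 tanh units with random slopes and centres (1-25-6).
W1 = rng.normal(scale=3.0, size=(1, 25))
b1 = rng.uniform(-3.0, 3.0, size=25)
H = np.tanh(T @ W1 + b1)

# Output layer: 6 linear units fitted jointly by least squares (with bias).
Hb = np.hstack([H, np.ones((len(T), 1))])
W2, *_ = np.linalg.lstsq(Hb, Y, rcond=None)
P = Hb @ W2
mse = float(np.mean((P - Y) ** 2))
print(f"mean-squared error over all six curves: {mse:.5f}")
```

Even this simplified readout fits smooth decomposition-like curves closely, which illustrates why a single small network can reproduce several thermogravimetric curves at once; Levenberg-Marquardt additionally adapts the hidden-layer weights.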

5.
A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an ‘engineering’ approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

6.
Aqueous droplets in oil that are coated with lipid monolayers and joined through interface bilayers are useful for biophysical measurements on membrane proteins. Functional networks of droplets that can act as light sensors, batteries and electrical components can also be made by incorporating pumps, channels and pores into the bilayers. These networks of droplets mimic simple tissues, but so far have not been used in physiological environments because they have been constrained to a bulk oil phase. Here, we form structures called multisomes in which networks of aqueous droplets with defined compositions are encapsulated within small drops of oil in water. The encapsulated droplets adhere to one another and to the surface of the oil drop to form interface bilayers that allow them to communicate with each other and with the surrounding aqueous environment through membrane pores. The contents of the droplets can be released by changing the pH or temperature of the surrounding solution. The multicompartment framework of multisomes mimics a tissue and has potential applications in synthetic biology and medicine.

7.
High reliability of railway power systems is one of the essential criteria for ensuring the quality and cost-effectiveness of railway services. Evaluation of reliability at the system level is essential not only for scheduling maintenance activities but also for identifying reliability-critical components. Various methods to compute the reliability of individual components or regularly structured systems have been developed and proven effective. However, they are not adequate for evaluating complicated systems with numerous interconnected components, such as railway power systems, or for locating the reliability-critical components. Fault tree analysis (FTA) integrates the reliability of individual components into the overall system reliability through quantitative evaluation, and identifies the critical components by minimum cut sets and sensitivity analysis. The paper presents the reliability evaluation of railway power systems by FTA and investigates the impact of maintenance activities on overall reliability. The applicability of the proposed methods is illustrated by case studies of AC railways.
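The minimal-cut-set calculation at the heart of FTA can be sketched briefly. The component names, failure probabilities and cut sets below are invented for illustration (the paper's actual fault trees are not shown here); the system fails when every component in at least one minimal cut set fails:

```python
from itertools import combinations

# Hypothetical component failure probabilities and minimal cut sets.
p = {"transformer": 0.01, "rectifier": 0.02, "feeder": 0.005, "breaker": 0.03}
cut_sets = [{"transformer", "rectifier"}, {"feeder"}, {"rectifier", "breaker"}]

def cutset_prob(components):
    # Probability that every component in the set fails (independence assumed).
    prob = 1.0
    for c in components:
        prob *= p[c]
    return prob

def top_event_prob(cut_sets):
    # Exact top-event probability by inclusion-exclusion over the cut sets.
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, r):
            union = set().union(*combo)
            total += (-1) ** (r + 1) * cutset_prob(union)
    return total

q = top_event_prob(cut_sets)
print(f"top-event (system failure) probability: {q:.8f}")
```

The single-component cut set (`feeder`) dominates the result, which is exactly how minimum cut sets flag reliability-critical components; inclusion-exclusion is exact but grows exponentially in the number of cut sets, so large trees use rare-event approximations instead.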

8.
An important area of research in systems biology involves the analysis and integration of genome-wide functional datasets. In this context, a major goal is the identification of a putative molecular network controlling physiological response from experimental data. With very fragmentary mechanistic information, this is a challenging task. A number of methods have been developed, each one with the potential to address an aspect of the problem. Here, we review some of the most widely used methodologies and report new results in support of the usefulness of modularization and other modelling techniques in identifying components of the molecular networks that are predictive of physiological response. We also discuss how system identification in biology could be approached, using a combination of methodologies that aim to reconstruct the relationship between molecular pathways and physiology at different levels of the organizational complexity of the molecular network.

9.
Feed-forward neural networks have been trained to identify and quantify heavy metals in mixtures under conditions where there were significant complications due to intermetallic compound formation. The networks were shown to be capable of (i) correlating voltammetric responses with individual heavy metals in complex mixtures, (ii) determining the relationship between responses and concentrations (including nonlinear relationships due to overlapping peaks and intermetallic compound formation), and (iii) rapidly determining concentrations of individual components from mixtures once trained. Using simulated data, modeled after complex interactions experimentally observed in samples containing Cu and Zn, it has been demonstrated that networks containing two layers of neurons (a nonlinear hidden layer and a linear output layer) can be trained to calculate concentrations under a variety of complicated situations. These include, but are not limited to, cases where the response of the intermetallic compound formed is observed as a shoulder of one of the pure metals and cases where the response of the intermetallic compound formed is not observed in the potential window. In addition, the network described above was trained to simultaneously determine concentrations of four metals (Cu, Pb, Cd, and Zn) in a concentration range where all responses were complicated by intermetallic compound formation (1-500 ppb).

10.
Limitations associated with the study of cancer biology in vitro, including a lack of extracellular matrix, have prompted an interest in analysing the behaviour of tumour cells in a three-dimensional environment. Such model systems can be used to better understand malignancy and metastasis and a cancer’s response to therapies. We review the materials that have been used in such models to date, including their fabrication techniques and the results from their study in cancer. Despite the variety of materials available, obstacles remain to perfecting an in vitro model system and we outline some of the challenges yet to be overcome.

11.
This paper proposes a model of an attractor-based innovation system for understanding tourism. Key components of the model are the attractor (that which attracts visitors), scene-maker, scene, collaborative networks between tourism and other firms and, finally, the crucial function of the scene-taker. Findings from eight in-depth case studies taken from around the world are summarized in the form of seven hypotheses concerning the operations of such innovation systems. It is argued that scene-takers, in the form of individual entrepreneurs and organizations, perform a crucial function in the innovation system in developing and maintaining the scene. Finally, some policy implications for building such a system are suggested.

12.
In practice, many systems exhibit load-sharing behavior, where the surviving components share the total load imposed on the system. Unlike general systems, the components of load-sharing systems are interdependent: when one component fails, the system load must be shared by the remaining components, which increases the failure rate or degradation rate of those components. Because of this load-sharing mechanism, parameter estimation and reliability assessment are usually complicated for load-sharing systems. Although load-sharing systems with components subject to sudden failures have been studied intensively in the literature, with detailed estimation and analysis approaches, those with components subject to degradation are rarely investigated. In this paper, we propose a parameter estimation method for load-sharing systems subject to continuous degradation under a constant load. A likelihood function based on the degradation data of the components is established as a first step. The maximum likelihood estimators for the unknown parameters are then obtained via the expectation-maximization (EM) algorithm, given the non-closed form of the likelihood function. Numerical examples are used to illustrate the effectiveness of the proposed method.
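The load-sharing mechanism itself can be simulated with a small sketch. All parameters below are invented for illustration (this is not the paper's degradation model or its EM estimator): two components degrade in proportion to the load they carry, and when the weaker one fails its share transfers to the survivor:

```python
import numpy as np

rng = np.random.default_rng(3)
L, threshold, dt = 10.0, 50.0, 0.1
rate_per_load = np.array([0.2, 0.1])   # heterogeneous degradation rates

x = np.zeros(2)                        # degradation levels
alive = np.array([True, True])
t, first_failure = 0.0, None
while alive.any() and t < 1000.0:
    share = L / alive.sum()            # survivors split the total load equally
    drift = rate_per_load * share * dt
    noise = 0.05 * np.sqrt(dt) * rng.normal(size=2)
    x[alive] += drift[alive] + noise[alive]
    newly_failed = alive & (x >= threshold)
    if newly_failed.any() and first_failure is None:
        first_failure = t
    alive &= ~newly_failed
    t += dt

# Without load transfer the slower component would fail near t = 100;
# the doubled load after the first failure brings that forward.
print(f"first failure near t={first_failure:.0f}, system failure near t={t:.0f}")
```

This dependence between failure times is what makes the likelihood non-standard: the survivor's degradation increments before and after the first failure come from different rate regimes, which is why the paper resorts to an EM algorithm for the maximum likelihood estimates.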

13.
We argue for a convergence of crystallography, materials science and biology that will come about through asking materials questions about biology and biological questions about materials, illuminated by considerations of information. The complex structures now being studied in biology and produced in nanotechnology have outstripped the framework of classical crystallography, and a variety of organizing concepts are now taking shape into a more modern and dynamic science of structure, form and function. Absolute stability and equilibrium are replaced by metastable structures existing in a flux of energy-carrying information and moving within an energy landscape of complex topology. Structures give place to processes and processes to systems. The fundamental level is that of atoms. As smaller and smaller groups of atoms are used for their physical properties, quantum effects become important; already we see quantum computation taking shape. Concepts move towards those in life with the emergence of specifically informational structures. We now see the possibility of the artificial construction of a synthetic living system, different from biological life, but having many or all of the same properties. Interactions are essentially nonlinear and collective. Structures begin to have an evolutionary history with episodes of symbiosis. Underlying all the structures are constraints of time and space. Through hierarchization, a more general principle than the periodicity of crystals, structures may be found within structures on different scales. We must integrate unifying concepts from dynamical systems and information theory to form a coherent language and science of shape and structure beyond crystals. To this end, we discuss the idea of categorizing structures based on information according to the algorithmic complexity of their assembly.

14.
Many clinical trials for cancer precision medicine have yielded unsatisfactory results due to challenges such as drug resistance and low efficacy. Drug resistance is often caused by the complex compensatory regulation within the biomolecular network in a cancer cell. Recently, systems biological studies have modeled and simulated such complex networks to unravel the hidden mechanisms of drug resistance and identify promising new drug targets or combinatorial or sequential treatments for overcoming resistance to anticancer drugs. However, many of the identified targets or treatments present major difficulties for drug development and clinical application. Nanocarriers represent a path forward for developing therapies with these “undruggable” targets or those that require precise combinatorial or sequential application, for which conventional drug delivery mechanisms are unsuitable. Conversely, a challenge in nanomedicine has been low efficacy due to heterogeneity of cancers in patients. This problem can also be resolved through systems biological approaches by identifying personalized targets for individual patients or promoting the drug responses. Therefore, integration of systems biology and nanomaterial engineering will enable the clinical application of cancer precision medicine to overcome both drug resistance of conventional treatments and low efficacy of nanomedicine due to patient heterogeneity.

15.
Reverse engineering problems concerning the reconstruction and identification of gene regulatory networks from gene expression data are central issues in computational molecular biology and have become the focus of much research in the last few years. An approach has been proposed for inferring the complex causal relationships among genes from microarray experimental data, based on a novel neural fuzzy recurrent network. The method derives information on gene interactions in a highly interpretable form (fuzzy rules) and takes into account the dynamical aspects of gene regulation through its recurrent structure. To determine the efficiency of the proposed approach, microarray data from two experiments relating to Saccharomyces cerevisiae and Escherichia coli have been used, and experiments concerning gene expression time course prediction have been conducted. The interactions retrieved among a set of genes known to be highly regulated during the yeast cell cycle are validated by previous biological studies. The method surpasses other computational techniques that have attempted genetic network reconstruction by being able to recover significantly more biologically valid relationships among genes.

16.
Sequence comparison and alignment has had an enormous impact on our understanding of evolution, biology and disease. Comparison and alignment of biological networks will probably have a similar impact. Existing network alignments use information external to the networks, such as sequence, because no good algorithm for purely topological alignment has yet been devised. In this paper, we present a novel algorithm based solely on network topology, that can be used to align any two networks. We apply it to biological networks to produce by far the most complete topological alignments of biological networks to date. We demonstrate that both species phylogeny and detailed biological function of individual proteins can be extracted from our alignments. Topology-based alignments have the potential to provide a completely new, independent source of phylogenetic information. Our alignment of the protein–protein interaction networks of two very different species—yeast and human—indicates that even distant species share a surprising amount of network topology, suggesting broad similarities in internal cellular wiring across all life on Earth.

17.
Genes regulate each other and form a gene regulatory network (GRN) to realise biological functions. Elucidating GRNs from experimental data remains a challenging problem in systems biology. Numerous techniques have been developed, and sparse linear regression methods have become a promising approach to inferring accurate GRNs. However, most linear methods are either based on steady-state gene expression data, or their statistical properties are not analysed. Here, two sparse penalties, the adaptive least absolute shrinkage and selection operator (LASSO) and smoothly clipped absolute deviation (SCAD), are proposed to infer GRNs from time-course gene expression data based on an auto-regressive model, and their Oracle properties are proved under mild conditions. The effectiveness of these methods is demonstrated by applications to in silico and real biological data.
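The sparse auto-regressive idea can be sketched end to end. This uses a plain L1 penalty solved by iterative soft-thresholding, not the paper's adaptive LASSO or SCAD penalties, and the network and all parameters are simulated for illustration:

```python
import numpy as np

# Order-1 auto-regressive model: x[t+1] = A x[t] + noise, with sparse A.
rng = np.random.default_rng(4)
n_genes, n_steps = 8, 200

A_true = np.zeros((n_genes, n_genes))
A_true[0, 1], A_true[1, 2], A_true[2, 0], A_true[3, 3] = 0.8, -0.6, 0.5, 0.7
X = np.zeros((n_steps, n_genes))
X[0] = rng.normal(size=n_genes)
for t in range(n_steps - 1):
    X[t + 1] = X[t] @ A_true.T + 0.1 * rng.normal(size=n_genes)

def lasso_ista(Phi, y, lam, n_iter=2000):
    # ISTA for  min_w  0.5*||Phi w - y||^2 + lam*||w||_1.
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        w = w - step * (Phi.T @ (Phi @ w - y))
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

# Each row of A is one penalised regression of x_i[t+1] on all x_j[t].
Phi, Y = X[:-1], X[1:]
A_hat = np.vstack([lasso_ista(Phi, Y[:, i], lam=0.5) for i in range(n_genes)])
support = np.abs(A_hat) > 0.1
print("recovered edges:", int(support.sum()))
```

The L1 penalty zeroes out the many absent interactions while keeping the four planted edges; the adaptive LASSO and SCAD penalties studied in the paper additionally reduce the bias that plain L1 shrinkage introduces on large coefficients, which is what underlies their Oracle properties.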

18.
The structure of complex networks has attracted much attention in recent years. It has been noted that many real-world examples of networked systems share a set of common architectural features. This raises important questions about their origin, for example whether such network attributes reflect common design principles or constraints imposed by selectional forces that have shaped the evolution of network topology. Is it possible to place the many patterns and forms of complex networks into a common space that reveals their relations, and what are the main rules and driving forces that determine which positions in such a space are occupied by systems that have actually evolved? We suggest that these questions can be addressed by combining concepts from two currently relatively unconnected fields. One is theoretical morphology, which has conceptualized the relations between morphological traits defined by mathematical models of biological form. The second is network science, which provides numerous quantitative tools to measure and classify different patterns of local and global network architecture across disparate types of systems. Here, we explore a new theoretical concept that lies at the intersection between both fields, the ‘network morphospace’. Defined by axes that represent specific network traits, each point within such a space represents a location occupied by networks that share a set of common ‘morphological’ characteristics related to aspects of their connectivity. Mapping a network morphospace reveals the extent to which the space is filled by existing networks, thus allowing a distinction between actual and impossible designs and highlighting the generative potential of rules and constraints that pervade the evolution of complex systems.

19.
Networks distribute energy, materials and information to the components of a variety of natural and human-engineered systems, including organisms, brains, the Internet and microprocessors. Distribution networks enable the integrated and coordinated functioning of these systems, and they also constrain their design. The similar hierarchical branching networks observed in organisms and microprocessors are striking, given that the structure of organisms has evolved via natural selection, while microprocessors are designed by engineers. Metabolic scaling theory (MST) shows that the rate at which networks deliver energy to an organism is proportional to its mass raised to the 3/4 power. We show that computational systems are also characterized by nonlinear network scaling and use MST principles to characterize how information networks scale, focusing on how MST predicts properties of clock distribution networks in microprocessors. The MST equations are modified to account for variation in the size and density of transistors and terminal wires in microprocessors. Based on the scaling of the clock distribution network, we predict a set of trade-offs and performance properties that scale with chip size and the number of transistors. However, there are systematic deviations between power requirements on microprocessors and predictions derived directly from MST. These deviations are addressed by augmenting the model to account for decentralized flow in some microprocessor networks (e.g. in logic networks). More generally, we hypothesize a set of constraints between the size, power and performance of networked information systems including transistors on chips, hosts on the Internet and neurons in the brain.
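The 3/4-power relation at the core of MST is a one-line computation. The normalisation constant below is arbitrary; the point is the quarter-power exponent:

```python
# Metabolic scaling theory's central relation: metabolic rate B scales
# with body mass M as B = B0 * M**(3/4). B0 is an arbitrary constant here.
B0 = 1.0

def metabolic_rate(mass):
    return B0 * mass ** 0.75

# Sublinear scaling: a 16-fold increase in mass yields only an
# 8-fold increase in delivered energy, since 16**0.75 = 8.
ratio = metabolic_rate(16.0) / metabolic_rate(1.0)
print(f"rate ratio for 16x mass: {ratio:.1f}")
```

The paper's contribution is to fit analogous power laws to clock distribution networks on chips and then explain where real microprocessors deviate from the pure branching-network prediction.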

20.
Complex diseases are commonly believed to be caused by the breakdown of several correlated genes rather than by individual genes. The availability of genome-wide data from high-throughput experiments provides a new opportunity to explore this hypothesis by analysing disease-related biomolecular networks, which are expected to bridge genotypes and disease phenotypes and further reveal the biological mechanisms of complex diseases. In this study, the authors review existing network biology efforts to study complex diseases, such as breast cancer, diabetes and Alzheimer's disease, using high-throughput data and computational tools. Specifically, the authors categorise these existing methods into several classes based on research topic: disease genes, dysfunctional pathways, network signatures and drug-target networks. The authors also summarise the pros and cons of these methods from both computational and application perspectives, and further discuss research trends and future topics of this promising field.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)