Similar documents
20 similar documents found (search time: 15 ms)
1.
2.
3.
Estimating model parameters from experimental data is a crucial technique for working with computational models in systems biology. Since stochastic models are increasingly important, parameter estimation methods for stochastic modelling are also of increasing interest. This study presents an extension to the ‘multiple shooting for stochastic systems (MSS)’ method for parameter estimation. The transition probabilities of the likelihood function are approximated with normal distributions. Means and variances are calculated with a linear noise approximation on the interval between succeeding measurements. Because the system is approximated only on intervals that are short in comparison with the total observation horizon, the method can account for the effects of intrinsic stochasticity. The study presents scenarios in which the extension is essential for successfully estimating the parameters and scenarios in which the extension is of modest benefit. Furthermore, it compares the estimation results with reversible jump techniques, showing that the approximation does not lead to a loss of accuracy. Since the method is not based on stochastic simulations or approximative sampling of distributions, its computational speed is comparable with conventional least-squares parameter estimation methods.
Inspec keywords: stochastic systems, parameter estimation, probability, least squares approximations
Other keywords: deterministic inference, stochastic systems, multiple shooting, linear noise approximation, transition probabilities, systems biology, parameter estimation methods, likelihood function, normal distributions, intrinsic stochasticity effects, reversible jump techniques, approximative sampling, conventional least-squares parameter estimation methods
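The core idea of the extension, a Gaussian transition likelihood whose mean and variance are recomputed by a linear noise approximation on each short inter-measurement interval, can be sketched for the simplest case. The immigration-death process below is a stand-in example; the rate names `k` and `g` and the near-stationary variance formula are illustrative assumptions, not the paper's implementation.

```python
import math

def lna_moments(x0, k, g, dt):
    """Mean and variance of an immigration-death process (immigration rate k,
    per-capita death rate g) over an interval dt, conditioned on the measured
    value x0. The variance is restarted at 0 at each measurement, as in
    multiple shooting."""
    decay = math.exp(-g * dt)
    mean = x0 * decay + (k / g) * (1.0 - decay)
    # LNA variance ODE dV/dt = -2 g V + k + g m(t); near the stationary
    # mean (m ~ k/g) this integrates, from V(0) = 0, to:
    var = (k / g) * (1.0 - math.exp(-2.0 * g * dt))
    return mean, var

def transition_nll(xs, dt, k, g):
    """Negative log-likelihood of equally spaced measurements xs under
    Gaussian transition densities between consecutive points."""
    nll = 0.0
    for x_prev, x_next in zip(xs, xs[1:]):
        m, v = lna_moments(x_prev, k, g, dt)
        nll += 0.5 * (math.log(2.0 * math.pi * v) + (x_next - m) ** 2 / v)
    return nll
```

Minimising `transition_nll` over `(k, g)` with any standard optimiser then plays the role of the conventional least-squares fit, which is why the method's cost is comparable.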

4.
Ordered and chaotic superlattices have been identified in Nature that give rise to a variety of colours reflected by the skin of various organisms. In particular, organisms such as silvery fish possess superlattices that reflect a broad range of light from the visible to the UV. Such superlattices have previously been identified as ‘chaotic’, but we propose that apparent ‘chaotic’ natural structures, which have been previously modelled as completely random structures, should have an underlying fractal geometry. Fractal geometry, often described as the geometry of Nature, can be used to mimic structures found in Nature, but deterministic fractals produce structures that are too ‘perfect’ to appear natural. Introducing variability into fractals produces structures that appear more natural. We suggest that the ‘chaotic’ (purely random) superlattices identified in Nature are more accurately modelled by multi-generator fractals. Furthermore, we introduce fractal random Cantor bars as a candidate for generating both ordered and ‘chaotic’ superlattices, such as the ones found in silvery fish. A genetic algorithm is used to evolve optimal fractal random Cantor bars with multiple generators targeting several desired optical functions in the mid-infrared and the near-infrared. We present optimized superlattices demonstrating broadband reflection as well as single and multiple pass bands in the near-infrared regime.
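The idea of "introducing variability into fractals" can be made concrete with a single-generator random Cantor bar: each kept segment is recursively replaced by sub-segments, with the cut positions jittered. The generator fractions and jitter amplitude below are arbitrary illustrative choices; the paper evolves multiple generators with a genetic algorithm rather than fixing one.

```python
import random

def random_cantor_bar(generations, keep=(0.0, 0.4, 0.6, 1.0), jitter=0.05, rng=None):
    """Return the kept segments of a randomized Cantor bar on [0, 1].
    Consecutive pairs of `keep` fractions delimit the sub-segments retained
    at each generation; every cut point is perturbed by up to `jitter` of
    the parent segment's length (hypothetical single-generator variant)."""
    rng = rng or random.Random(0)
    segments = [(0.0, 1.0)]
    for _ in range(generations):
        nxt = []
        for a, b in segments:
            length = b - a
            pts = sorted(
                min(b, max(a, a + f * length + rng.uniform(-jitter, jitter) * length))
                for f in keep
            )
            for lo, hi in zip(pts[0::2], pts[1::2]):
                if hi > lo:  # discard degenerate segments
                    nxt.append((lo, hi))
        segments = nxt
    return segments
```

Interpreting each kept segment as a high-index layer and each gap as a low-index layer gives a superlattice whose reflection spectrum can then be scored by an optical solver inside the genetic algorithm's fitness function.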

5.
Of considerable interest are the evolutionary and developmental origins of complex, adaptive structures and the mechanisms that stabilize these structures. We consider the relationship between the evolutionary process of gene duplication and deletion and the stability of morphogenetic patterns produced by interacting activators and inhibitors. We compare the relative stability of patterns with a single activator and inhibitor (two-dimensional system) against a ‘redundant’ system with two activators or two inhibitors (three-dimensional system). We find that duplication events can both expand and contract the space of patterns. We study developmental robustness in terms of stochastic escape times from this space, also known as a ‘canalization potential’. We embed the output of pattern formation into an explicit evolutionary model of gene duplication, gene loss and variation in the steepness of the canalization potential. We find that under all constant conditions, the system evolves towards a preference for steep potentials associated with low phenotypic variability and longer lifespans. This preference leads to an overall decrease in the density of redundant genotypes as developmental robustness neutralizes the advantages of genetic robustness.

6.
One of the most dramatic consequences of climate change will be the intensification and increased frequency of extreme events. I used numerical simulations to understand and predict the consequences of directional trend (i.e. mean state) and increased variability of a climate variable (e.g. temperature), increased probability of occurrence of point extreme events (e.g. floods), selection pressure and effect size of mutations on a quantitative trait determining individual fitness, as well as their effects on the population and genetic dynamics of a population of moderate size. The interaction among climate trend, variability and probability of point extremes had a minor effect on risk of extinction, time to extinction and distribution of the trait after accounting for their independent effects. The survival chances of a population strongly and linearly decreased with increasing strength of selection, as well as with increasing climate trend and variability. Mutation amplitude had no effects on extinction risk, time to extinction or genetic adaptation to the new climate. Climate trend and strength of selection largely determined the shift of the mean phenotype in the population. The extinction or persistence of the populations in an ‘extinction window’ of 10 years was well predicted by a simple model including mean population size and mean genetic variance over a 10-year time frame preceding the ‘extinction window’, although genetic variance had a smaller role than population size in predicting contemporary risk of extinction.

7.
We study a simplified model of gene regulatory network evolution in which links (regulatory interactions) are added via various selection rules that are based on the structural and dynamical features of the network nodes (genes). Similar to well-studied models of ‘explosive’ percolation, in our approach, links are selectively added so as to delay the transition to large-scale damage propagation, i.e. to make the network robust to small perturbations of gene states. We find that when selection depends only on structure, evolved networks are resistant to widespread damage propagation, even without knowledge of individual gene propensities for becoming ‘damaged’. We also observe that networks evolved to avoid damage propagation tend towards disassortativity (i.e. directed links preferentially connect high degree ‘source’ genes to low degree ‘target’ genes and vice versa). We compare our simulations to reconstructed gene regulatory networks for several different species, with genes and links added over evolutionary time, and we find a similar bias towards disassortativity in the reconstructed networks.
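A minimal sketch of structure-based link selection in the spirit of explosive percolation: several candidate links are drawn at random, and the one whose endpoints have the smallest degree product is added, delaying the growth of highly connected propagation pathways. The candidate count and the product-based score are illustrative stand-ins for the selection rules studied in the paper.

```python
import random

def grow_by_selection(n, m, candidates=5, rng=None):
    """Grow a directed network of n genes by adding m links one at a time.
    At each step, `candidates` random candidate links are drawn and the one
    minimizing out-degree(source) * in-degree(target) is kept (an
    Achlioptas-style product rule; hypothetical scoring)."""
    rng = rng or random.Random(1)
    out_deg = [0] * n
    in_deg = [0] * n
    edges = set()
    while len(edges) < m:
        pool = []
        for _ in range(candidates):
            s, t = rng.randrange(n), rng.randrange(n)
            if s != t and (s, t) not in edges:
                pool.append((s, t))
        if not pool:
            continue  # all candidates were duplicates or self-loops; redraw
        s, t = min(pool, key=lambda e: out_deg[e[0]] * in_deg[e[1]])
        edges.add((s, t))
        out_deg[s] += 1
        in_deg[t] += 1
    return edges, out_deg, in_deg
```

Comparing the degree correlations of networks grown this way against purely random link addition is one way to probe the emergence of the disassortative bias the abstract reports.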

8.
Although flying insects have limited visual acuity (approx. 1°) and relatively small brains, many species pursue tiny targets against cluttered backgrounds with high success. Our previous computational model, inspired by electrophysiological recordings from insect ‘small target motion detector’ (STMD) neurons, did not account for several key properties described from the biological system. These include the recent observations of response ‘facilitation’ (a slow build-up of response to targets that move on long, continuous trajectories) and ‘selective attention’, a competitive mechanism that selects one target from alternatives. Here, we present an elaborated STMD-inspired model, implemented in a closed loop target-tracking system that uses an active saccadic gaze fixation strategy inspired by insect pursuit. We test this system against heavily cluttered natural scenes. Inclusion of facilitation not only substantially improves success for even short-duration pursuits, but it also enhances the ability to ‘attend’ to one target in the presence of distracters. Our model predicts optimal facilitation parameters that are static in space and dynamic in time, changing with respect to the amount of background clutter and the intended purpose of the pursuit. Our results provide insights into insect neurophysiology and show the potential of this algorithm for implementation in artificial visual systems and robotic applications.

9.
Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1–5 (where 5 is very good), and how good can we expect them to be in 30 years' time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that ‘goodness’ should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a ‘5’ should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. A wide range of ‘goodness’ rankings is found, depending on region and variable (with summer forecasts of rainfall over Northern Europe performing exceptionally poorly). Finally, we discuss the prospects of reaching ‘5’ across all regions and variables in 30 years' time.

10.
The structure of complex networks has attracted much attention in recent years. It has been noted that many real-world examples of networked systems share a set of common architectural features. This raises important questions about their origin, for example whether such network attributes reflect common design principles or constraints imposed by selectional forces that have shaped the evolution of network topology. Is it possible to place the many patterns and forms of complex networks into a common space that reveals their relations, and what are the main rules and driving forces that determine which positions in such a space are occupied by systems that have actually evolved? We suggest that these questions can be addressed by combining concepts from two currently relatively unconnected fields. One is theoretical morphology, which has conceptualized the relations between morphological traits defined by mathematical models of biological form. The second is network science, which provides numerous quantitative tools to measure and classify different patterns of local and global network architecture across disparate types of systems. Here, we explore a new theoretical concept that lies at the intersection between both fields, the ‘network morphospace’. Defined by axes that represent specific network traits, each point within such a space represents a location occupied by networks that share a set of common ‘morphological’ characteristics related to aspects of their connectivity. Mapping a network morphospace reveals the extent to which the space is filled by existing networks, thus allowing a distinction between actual and impossible designs and highlighting the generative potential of rules and constraints that pervade the evolution of complex systems.

11.
In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected ‘neutral networks’ in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 10^45 logic circuits (‘genotypes’) and 10^19 logic functions (‘phenotypes’). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry.
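The genotype-phenotype map for logic circuits can be made concrete with a toy encoding: a genotype is a list of NAND gates wired to earlier signals, and its phenotype is the truth table the circuit computes. Two different genotypes that produce the same table lie on the same neutral network. This encoding is a hypothetical simplification of the programmable circuits studied in the paper.

```python
from itertools import product

def evaluate(circuit, n_inputs):
    """Phenotype of a feed-forward NAND circuit: its full truth table.
    Each gate is a pair (i, j) of indices into the growing list of signals
    (the inputs followed by earlier gate outputs); the last gate's output
    is the circuit output."""
    table = []
    for bits in product((0, 1), repeat=n_inputs):
        wires = list(bits)
        for i, j in circuit:
            wires.append(1 - (wires[i] & wires[j]))  # NAND of two earlier signals
        table.append(wires[-1])
    return tuple(table)
```

For example, the genotypes `[(0, 1), (2, 2)]` and `[(0, 1), (0, 1), (2, 3)]` differ in architecture yet both compute two-input AND, so they share a phenotype and are members of the same neutral network.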

12.
Simplified mechanistic models of gene regulation are fundamental to systems biology and essential for synthetic biology. However, conventional simplified models typically have outputs that are not directly measurable and are based on assumptions that do not often hold under experimental conditions. To resolve these issues, we propose a ‘model reduction’ methodology and simplified kinetic models of total mRNA and total protein concentration, which link measurements, models and biochemical mechanisms. The proposed approach is based on assumptions that hold generally and include typical cases in systems and synthetic biology where conventional models do not hold. We use novel assumptions regarding the ‘speed of reactions’, which are required for the methodology to be consistent with experimental data. We also apply the methodology to propose simplified models of gene regulation in the presence of multiple protein binding sites, providing both biological insights and an illustration of the generality of the methodology. Lastly, we show that modelling total protein concentration allows us to address key questions on gene regulation, such as efficiency, burden, competition and modularity.

13.
Scatter hoarders are animals (e.g. squirrels) who cache food (nuts) over a number of sites for later collection. A certain minimum amount of food must be recovered, possibly after pilfering by another animal, in order to survive the winter. An optimal caching strategy is one that maximizes the survival probability, given worst case behaviour of the pilferer. We modify certain ‘accumulation games’ studied by Kikuta & Ruckle (2000 J. Optim. Theory Appl.) and Kikuta & Ruckle (2001 Naval Res. Logist.), which modelled the problem of optimal diversification of resources against catastrophic loss, to include the depth at which the food is hidden at each caching site. Optimal caching strategies can then be determined as equilibria in a new ‘caching game’. We show how the distribution of food over sites and the site-depths of the optimal caching vary with the animal's survival requirements and the amount of pilfering. We show that in some cases, ‘decoy nuts’ are required to be placed above other nuts that are buried further down at the same site. Methods from the field of search games are used. Some empirically observed behaviour can be shown to be optimal in our model.

14.
Biological organisms rely on their ability to sense and respond appropriately to their environment. The molecular mechanisms that facilitate these essential processes are however subject to a range of random effects and stochastic processes, which jointly affect the reliability of information transmission between receptors and, for example, the physiological downstream response. Information is mathematically defined in terms of entropy, and the extent of information flowing across an information channel or signalling system is typically measured by the ‘mutual information’, or the reduction in the uncertainty about the output once the input signal is known. Here, we quantify how extrinsic and intrinsic noise affects the transmission of simple signals along simple motifs of molecular interaction networks. Even for very simple systems, the effects of the different sources of variability alone and in combination can give rise to bewildering complexity. In particular, extrinsic variability is apt to generate ‘apparent’ information that can, in extreme cases, mask the actual information that for a single system would flow between the different molecular components making up cellular signalling pathways. We show how this artificial inflation in apparent information arises and how the effects of different types of noise alone and in combination can be understood.
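Mutual information between a channel's input and output can be estimated directly from samples of their joint distribution. The sketch below is a plain plug-in estimator (no bias correction) and shows the two extremes: a noiseless binary channel carries 1 bit, while a channel whose output is independent of the input carries none. Comparing the value computed from samples pooled across extrinsic conditions against the per-condition values is one way to probe the 'apparent' information the abstract describes.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate, in bits, of the mutual information of an empirical
    joint distribution given as a list of (input, output) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                 # joint counts
    px = Counter(x for x, _ in pairs)    # input marginal counts
    py = Counter(y for _, y in pairs)    # output marginal counts
    mi = 0.0
    for (x, y), c in pxy.items():
        # c*n / (px*py) equals p(x,y) / (p(x) p(y))
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

For instance, `mutual_information([(0, 0), (1, 1)] * 50)` evaluates to 1.0 bit, while `mutual_information([(0, 0), (0, 1), (1, 0), (1, 1)] * 25)` evaluates to 0.0.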

15.
Biological systems are often represented as Boolean networks and analysed to identify sensitive nodes which on perturbation disproportionately change a predefined output. There exist different kinds of perturbation methods: perturbation of function, perturbation of state and perturbation in update scheme. Nodes may have defects in interpretation of the inputs from other nodes and calculation of the node output. To simulate these defects and systematically assess their effect on the system output, two new function perturbations, referred to as ‘not of function’ and ‘function of not’, are introduced. In the former, the inputs are assumed to be correctly interpreted but the output of the update rule is perturbed; in the latter, each input is perturbed but the correct update rule is applied. These and previously used perturbation methods were applied to two existing Boolean models, namely the human melanogenesis signalling network and the fly segment polarity network. Through mathematical simulations, it was found that these methods successfully identified nodes previously found to be sensitive by other methods, and were also able to identify sensitive nodes which were previously unreported.
Inspec keywords: Boolean functions, perturbation theory, physiological models
Other keywords: Boolean models, biological networks, perturbation methods, human melanogenesis signalling network, fly segment polarity network
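The two new function perturbations have a direct functional reading, sketched here for an arbitrary Boolean update rule: 'not of function' inverts the rule's output, while 'function of not' inverts each input before applying the correct rule.

```python
def not_of_function(rule):
    """'Not of function': inputs read correctly, output of the rule inverted."""
    return lambda *inputs: 1 - rule(*inputs)

def function_of_not(rule):
    """'Function of not': every input inverted, correct update rule applied."""
    return lambda *inputs: rule(*(1 - x for x in inputs))
```

Applied to a two-input AND rule, 'not of function' yields NAND, whereas 'function of not' yields the AND of the negated inputs, i.e. NOR; the two perturbations generally produce different dynamics, which is why both are needed to separate input-side from output-side defects.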

16.
Infection systems where traits of the host, such as acquired immunity, interact with the infection process can show complex dynamic behaviour with counter-intuitive results. In this study, we consider the traits ‘immune status’ and ‘exposure history’, and our aim is to assess the influence of acquired individual heterogeneity in these traits. We have built an individual-based model of Eimeria acervulina infections, a protozoan parasite with an environmental stage that causes coccidiosis in chickens. With the model, we simulate outbreaks of the disease under varying initial contaminations. Heterogeneity in the traits arises stochastically through differences in the dose and frequency of parasites that individuals pick up from the environment. We find that the relationship between the initial contamination and the severity of an outbreak has a non-monotonic ‘wave-like’ pattern. This pattern can be explained by an increased heterogeneity in the host population caused by the infection process at the most severe outbreaks. We conclude that when dealing with these types of infection systems, models that are used to develop or evaluate control measures cannot neglect acquired heterogeneity in the host population traits that interact with the infection process.

17.
Triclocarban and triclosan, two potent antibacterial molecules present in many consumer products, have been subject to growing debate on a number of issues, particularly in relation to their possible role in causing microbial resistance. In this computational study, we present molecular-level insights into the interaction between these antimicrobial agents and hydrated phospholipid bilayers (taken as a simple model for the cell membrane). Simulations are conducted by a novel ‘dual-resolution’ molecular dynamics approach which combines accuracy with efficiency: the antimicrobials, modelled atomistically, are mixed with simplified (coarse-grain) models of lipids and water. A first set of calculations is run to study the antimicrobials' transfer free energies and orientations as a function of depth inside the membrane. Both molecules are predicted to preferentially accumulate in the lipid headgroup–glycerol region; this finding, which reproduces corresponding experimental data, is also discussed in terms of a general relation between solute partitioning and the intramembrane distribution of pressure. A second set of runs involves membranes incorporated with different molar concentrations of antimicrobial molecules (up to one antimicrobial per two lipids). We study the effects induced on fundamental membrane properties, such as the electron density, lateral pressure and electrical potential profiles. In particular, the analysis of the spontaneous curvature indicates that increasing antimicrobial concentrations promote a ‘destabilizing’ tendency towards non-bilayer phases, as observed experimentally. The antimicrobials' influence on the self-assembly process is also investigated. The significance of our results in the context of current theories of antimicrobial action is discussed.

18.
Epidemics are frequently simulated on redundantly wired contact networks, which have many more links between sites than are minimally required to connect all. Consequently, the modelled pathogen can travel numerous alternative routes, complicating effective containment strategies. These networks have moreover been found to exhibit ‘scale-free’ properties and percolation, suggesting resilience to damage. However, realistic H5N1 avian influenza transmission probabilities and containment strategies, here modelled on the British poultry industry network, show that infection dynamics can additionally express characteristic scales. These system-preferred scales constitute small areas within an observed power law distribution that exhibit a lesser slope than the power law itself, indicating a slightly increased relative likelihood. These characteristic scales are here produced by a network-pervading intranet of so-called hotspot sites that propagate large epidemics below the percolation threshold. This intranet is, however, extremely vulnerable; targeted inoculation of a mere 3–6% (depending on incorporated biosecurity measures) of the British poultry industry network prevents large and moderate H5N1 outbreaks completely, offering an order of magnitude improvement over previously advocated strategies affecting the most highly connected ‘hub’ sites. In other words, hotspots and hubs are separate functional entities that do not necessarily coincide, and hotspots can make more effective inoculation targets. Given the ubiquity and relevance of networks (epidemics, Internet, power grids, protein interaction), recognition of this spreading regime elsewhere would suggest a similar disproportionate sensitivity to such surgical interventions.

19.
Population extinction is a fundamental biological process with applications to ecology, epidemiology, immunology, conservation biology and genetics. Although a monotonic relationship between initial population size and mean extinction time is predicted by virtually all theoretical models, attempts at empirical demonstration have been equivocal. We suggest that this anomaly is best explained with reference to the transient properties of ensembles of populations. Specifically, we submit that under experimental conditions, many populations escape their initially vulnerable state to reach quasi-stationarity, where effects of initial conditions are erased. Thus, populations initialized far from quasi-stationarity may be exposed to a two-phase extinction hazard. An empirical prediction of this theory is that a Cox proportional hazards regression model fitted to the observed survival time distribution of a group of populations will violate the proportional hazards assumption early in the experiment, but not at later times. We report results of two experiments with the cladoceran zooplankton Daphnia magna designed to exhibit this phenomenon. In one experiment, habitat size was also varied. Statistical analysis showed that in one of these experiments there existed, very early on, a transient phase during which the extinction hazard was primarily owing to the initial population size, and that this was gradually replaced by a more stable quasi-stationary phase. In the second experiment, only habitat size unambiguously displayed an effect. Analysis of data pooled from both experiments suggests that the overall extinction time distribution in this system results from the mixture of extinctions during the initial rapid phase, during which the effects of initial population size can be considerable, and a longer quasi-stationary phase, during which only habitat size has an effect. These are the first results, to our knowledge, of a two-phase population extinction process.

20.
Parameterisation of kinetic models plays a central role in computational systems biology. Besides the lack of experimental data of high enough quality, some of the biggest challenges here are identification issues. Model parameters can be structurally non-identifiable because of functional relationships. Noise in measured data is usually considered to be a nuisance for parameter estimation. However, it turns out that intrinsic fluctuations in particle numbers can make parameters identifiable that were previously non-identifiable. The authors present a method to identify model parameters that are structurally non-identifiable in a deterministic framework. The method takes time course recordings of biochemical systems in steady state or transient state as input. Often a functional relationship between parameters presents itself as a one-dimensional manifold in parameter space containing parameter sets of optimal goodness. Although the system's behaviour cannot be distinguished on this manifold in a deterministic framework, it might be distinguishable in a stochastic modelling framework. Their method exploits this by using an objective function that includes a measure of fluctuations in particle numbers. They show on three example models, immigration-death, gene expression and Epo-EpoReceptor interaction, that this resolves the non-identifiability even in the case of measurement noise with known amplitude. The method is applied to partially observed recordings of biochemical systems with measurement noise. It is simple to implement and usually very fast to compute. The optimisation can be realised in a classical or Bayesian fashion.
Inspec keywords: biochemistry, physiological models, stochastic processes, measurement errors, fluctuations, parameter estimation
Other keywords: model parameter identification, deterministic framework, biochemical system, steady state, transient state, stochastic modelling framework, objective function, immigration-death model, gene expression, Epo–EpoReceptor interaction, stochastic fluctuations, measurement noise


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号