Similar literature
20 similar documents found.
1.
Plasticity-inducing stimuli must typically be presented many times before synaptic plasticity is expressed, perhaps because induction signals gradually accumulate before overt strength changes occur. We consider memory dynamics in a mathematical model with synapses that integrate plasticity induction signals before expressing plasticity. We find that the memory trace initially rises before reaching a maximum and then falling. The memory signal dissociates into separate oblivescence and reminiscence components, with reminiscence initially dominating recall. In radical contrast, related but nonintegrative models exhibit only a highly problematic oblivescence. Synaptic integration mechanisms possess natural timescales, depending on the statistics of the induction signals. Together with neuromodulation, these timescales may therefore also begin to provide a natural account of the well-known spacing effect in the transition to late-phase plasticity. Finally, we propose experiments that could distinguish between integrative and nonintegrative synapses. Such experiments should further elucidate the synaptic signal processing mechanisms postulated by our model.
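The core mechanism described here can be sketched in a few lines: a hidden variable integrates induction signals, and the expressed strength changes only when that variable crosses a threshold. The Python sketch below is purely illustrative; the class name, threshold, and leak parameter are assumptions, not the paper's model.

```python
import random

class IntegrativeSynapse:
    """Toy synapse that accumulates plasticity induction signals in a
    hidden trace and expresses a discrete strength change only when the
    trace crosses a threshold (illustrative parameters)."""

    def __init__(self, threshold=5.0, leak=0.99):
        self.strength = 0.0        # expressed synaptic strength
        self.trace = 0.0           # hidden, integrated induction signal
        self.threshold = threshold
        self.leak = leak           # slow decay of the hidden trace

    def induce(self, signal):
        """Integrate a potentiating (+) or depressing (-) induction signal."""
        self.trace = self.leak * self.trace + signal
        if self.trace >= self.threshold:      # enough evidence: potentiate
            self.strength += 1.0
            self.trace = 0.0
        elif self.trace <= -self.threshold:   # enough evidence: depress
            self.strength -= 1.0
            self.trace = 0.0

# Noisy induction: overt plasticity appears only after many stimuli
# whose signals accumulate consistently in one direction.
random.seed(0)
syn = IntegrativeSynapse()
for step in range(200):
    syn.induce(random.gauss(0.2, 1.0))  # weakly potentiating, noisy drive
print("expressed strength:", syn.strength)
```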

2.
Neural systems as nonlinear filters
Maass W, Sontag ED. Neural Computation, 2000, 12(8): 1743-1772.
Experimental data show that biological synapses behave quite differently from the symbolic synapses in all common artificial neural network models. Biological synapses are dynamic; their "weight" changes on a short timescale by several hundred percent, depending on the past input to the synapse. In this article we address the question of how this inherent synaptic dynamics (which should not be confused with long-term learning) affects the computational power of a neural network. In particular, we analyze computations on temporal and spatiotemporal patterns, and we give a complete mathematical characterization of all filters that can be approximated by feedforward neural networks with dynamic synapses. It turns out that even with just a single hidden layer, such networks can approximate a very rich class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model for synaptic dynamics. Our characterization result provides for all nonlinear filters that are approximable by Volterra series a new complexity hierarchy related to the cost of implementing such filters in neural systems.
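A minimal illustration of such a dynamic synapse is a short-term depression model in the spirit of Tsodyks and Markram, in which the effective weight depends on the recent input history. The formalism and parameters below are assumptions for illustration, not the filter characterization of the paper.

```python
import numpy as np

def depressing_synapse(spike_train, dt=1e-3, tau_rec=0.5, U=0.4, w=1.0):
    """Short-term depression: each presynaptic spike consumes a fraction
    U of the available resources r, which recover with time constant
    tau_rec (s). The effective weight w*U*r thus depends on past input."""
    r = 1.0
    out = np.zeros_like(spike_train, dtype=float)
    for t, s in enumerate(spike_train):
        r += dt * (1.0 - r) / tau_rec      # resource recovery
        if s:                              # presynaptic spike arrives
            out[t] = w * U * r             # transmitted efficacy
            r -= U * r                     # resources depleted
    return out

# A 100 Hz burst (dt = 1 ms, a spike every 10 steps): successive
# responses shrink, i.e. the synapse filters the temporal pattern
# rather than applying a fixed weight.
train = np.zeros(200)
train[::10] = 1
print(np.round(depressing_synapse(train)[train > 0][:5], 3))
```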

3.
Stochastic models of synaptic plasticity propose that single synapses perform a directed random walk of fixed step sizes in synaptic strength, thereby embracing the view that the mechanisms of synaptic plasticity constitute a stochastic dynamical system. However, fluctuations in synaptic strength present a formidable challenge to such an approach. We have previously proposed that single synapses must interpose an integration and filtering mechanism between the induction of synaptic plasticity and the expression of synaptic plasticity in order to control fluctuations. We analyze a class of three such mechanisms in the presence of possibly non-Markovian plasticity induction processes, deriving expressions for the mean expression time in these models. One of these filtering mechanisms constitutes a discrete low-pass filter that could be implemented on a small collection of molecules at single synapses, such as CaMKII, and we analyze this discrete filter in some detail. After considering Markov induction processes, we examine our own stochastic model of spike-timing-dependent plasticity, for which the probability density functions of the induction of plasticity steps have previously been derived. We determine the dependence of the mean time to express a plasticity step on pre- and postsynaptic firing rates in this model, and we also consider, numerically, the long-term stability against fluctuations of patterns of neuronal connectivity that typically emerge during neuronal development.
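The discrete low-pass filter described here can be probed with a simple Monte Carlo sketch: each induction event increments or decrements a bounded counter, and a plasticity step is expressed only when the counter saturates. Parameter names and values below are hypothetical.

```python
import random

def mean_expression_time(p_up=0.6, filter_size=10, trials=10000, seed=1):
    """Monte Carlo estimate of the mean number of induction events
    needed before one plasticity step is expressed. The filter is a
    bounded counter: +1 per potentiating event, -1 per depressing
    event, expression when it reaches +/-filter_size (a discrete
    low-pass filter, as could be realized by a few molecules)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        count, steps = 0, 0
        while abs(count) < filter_size:
            count += 1 if rng.random() < p_up else -1
            steps += 1
        total += steps
    return total / trials

# Biased induction expresses quickly; balanced induction (p_up = 0.5)
# takes far longer, suppressing fluctuation-driven strength changes.
print(mean_expression_time(p_up=0.6))
print(mean_expression_time(p_up=0.5))
```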

4.
We postulate that a simple, three-state synaptic switch governs changes in synaptic strength at individual synapses. Under this switch rule, we show that a variety of experimental results on timing-dependent plasticity can emerge from temporal and spatial averaging over multiple synapses and multiple spike pairings. In particular, we show that a critical window for the interaction of pre- and postsynaptic spikes emerges as an ensemble property of the collective system, with individual synapses exhibiting only a minimal form of spike coincidence detection. In addition, we show that a Bienenstock-Cooper-Munro-like, rate-based plasticity rule emerges directly from such a model. This demonstrates that two apparently separate forms of neuronal plasticity can emerge from a much simpler rule governing the plasticity of individual synapses.
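One possible reading of such a switch, as a toy state machine, is sketched below; the state names, stochastic reset, and transition rules are illustrative assumptions rather than the paper's exact switch rule.

```python
import random

class ThreeStateSwitch:
    """Toy three-state switch: OFF -> UP (armed by a presynaptic spike),
    OFF -> DOWN (armed by a postsynaptic spike); the complementary
    spike, if it arrives before the switch stochastically resets,
    commits a fixed-size strength change. Illustrative logic only."""

    def __init__(self, p_reset=0.1, seed=0):
        self.state = "OFF"
        self.p_reset = p_reset   # chance per step of relaxing back to OFF
        self.rng = random.Random(seed)
        self.dw = 0.0            # accumulated strength change

    def tick(self, pre, post):
        if self.state == "UP" and post:    # pre then post: potentiate
            self.dw += 1.0
            self.state = "OFF"
        elif self.state == "DOWN" and pre: # post then pre: depress
            self.dw -= 1.0
            self.state = "OFF"
        elif self.state == "OFF":
            if pre:
                self.state = "UP"
            elif post:
                self.state = "DOWN"
        elif self.rng.random() < self.p_reset:
            self.state = "OFF"             # switch decays back to rest

# Averaging dw over many such synapses and spike pairings yields a
# timing-dependent curve, even though each switch is only a minimal
# coincidence detector.
syn = ThreeStateSwitch()
for _ in range(1000):
    syn.tick(pre=(syn.rng.random() < 0.05), post=(syn.rng.random() < 0.05))
print("net change:", syn.dw)
```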

5.
In most neural network models, synapses are treated as static weights that change only with the slow time scales of learning. It is well known, however, that synapses are highly dynamic and show use-dependent plasticity over a wide range of time scales. Moreover, synaptic transmission is an inherently stochastic process: a spike arriving at a presynaptic terminal triggers the release of a vesicle of neurotransmitter from a release site with a probability that can be much less than one. We consider a simple model for dynamic stochastic synapses that can easily be integrated into common models for networks of integrate-and-fire neurons (spiking neurons). The parameters of this model have direct interpretations in terms of synaptic physiology. We investigate the consequences of the model for computing with individual spikes and demonstrate through rigorous theoretical results that the computational power of the network is increased through the use of dynamic synapses.
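A hedged sketch of a dynamic stochastic synapse along these lines: release probability is transiently raised by a decaying facilitation variable, and transmission is a coin flip on each spike. The parameterization is an assumption, not the model's actual equations.

```python
import math
import random

def stochastic_release(spike_times, p0=0.3, tau_fac=50.0, a_fac=0.15, seed=2):
    """Each presynaptic spike (times in ms) releases a vesicle with
    probability p(t) <= 1. A facilitation variable, incremented after
    every spike and decaying with time constant tau_fac (ms),
    transiently raises the release probability."""
    rng = random.Random(seed)
    releases, fac, last_t = [], 0.0, None
    for t in spike_times:
        if last_t is not None:
            fac *= math.exp(-(t - last_t) / tau_fac)  # decay since last spike
        p = min(1.0, p0 + fac)
        releases.append(rng.random() < p)             # stochastic release
        fac += a_fac
        last_t = t
    return releases

# A fast burst followed by a lone spike: later spikes in the burst are
# transmitted more reliably; reliability relaxes back by t = 500 ms.
print(stochastic_release([0, 10, 20, 30, 40, 500]))
```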

6.
Experimental studies have observed synaptic potentiation when a presynaptic neuron fires shortly before a postsynaptic neuron and synaptic depression when the presynaptic neuron fires shortly after. The dependence of synaptic modulation on the precise timing of the two action potentials is known as spike-timing-dependent plasticity (STDP). We derive STDP from a simple computational principle: synapses adapt so as to minimize the postsynaptic neuron's response variability to a given presynaptic input, causing the neuron's output to become more reliable in the face of noise. Using an objective function that minimizes response variability and the biophysically realistic spike-response model of Gerstner (2001), we simulate neurophysiological experiments and obtain the characteristic STDP curve along with other phenomena, including the reduction in synaptic plasticity as synaptic efficacy increases. We compare our account to other efforts to derive STDP from computational principles and argue that our account provides the most comprehensive coverage of the phenomena. Thus, reliability of neural response in the face of noise may be a key goal of unsupervised cortical adaptation.
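The characteristic STDP curve referred to here is conventionally summarized by a double-exponential window. The sketch below shows that standard phenomenological form, with illustrative amplitudes and time constants, not the variability-minimizing derivation of the paper.

```python
import math

def stdp_window(dt, A_plus=1.0, A_minus=0.5, tau_plus=17.0, tau_minus=34.0):
    """Phenomenological STDP curve: dt = t_post - t_pre in ms.
    Pre-before-post (dt > 0) potentiates; post-before-pre depresses."""
    if dt > 0:
        return A_plus * math.exp(-dt / tau_plus)
    return -A_minus * math.exp(dt / tau_minus)

for dt in (-40, -10, -1, 1, 10, 40):
    print(f"dt = {dt:+4d} ms  ->  dw = {stdp_window(dt):+.3f}")
```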

7.
In standard Hebbian models of developmental synaptic plasticity, synaptic normalization must be introduced in order to constrain synaptic growth and ensure the presence of activity-dependent, competitive dynamics. In such models, multiplicative normalization cannot segregate afferents whose patterns of electrical activity are positively correlated, while subtractive normalization can. It is now widely believed that multiplicative normalization cannot segregate positively correlated afferents in any Hebbian model. However, we recently provided a counterexample to this belief by demonstrating that our own neurotrophic model of synaptic plasticity, which can segregate positively correlated afferents, can be reformulated as a nonlinear Hebbian model with competition implemented through multiplicative normalization. We now perform an analysis of a general class of Hebbian models under general forms of synaptic normalization. In particular, we extract conditions on the forms of these rules that guarantee that such models possess a fixed-point structure permitting the segregation of all but perfectly correlated afferents. We find that the failure of multiplicative normalization to segregate positively correlated afferents in a standard Hebbian model is quite atypical.
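The multiplicative-versus-subtractive distinction can be stated concretely. In the generic sketch below (illustrative inputs and learning rate, not the paper's general class of models), subtractive normalization drives one of two positively correlated afferents to zero while multiplicative normalization preserves their ratio.

```python
import numpy as np

def hebb_step(w, x, y, lr=0.01, mode="subtractive"):
    """One Hebbian update followed by normalization that holds the total
    weight fixed. Multiplicative: rescale all weights by a common
    factor. Subtractive: subtract the same amount from each weight."""
    w = w + lr * y * x                       # plain Hebbian growth
    target = 1.0                             # conserved total strength
    if mode == "multiplicative":
        w = w * (target / w.sum())
    else:                                    # subtractive
        w = w - (w.sum() - target) / w.size
    return np.clip(w, 0.0, None)             # weights stay non-negative

# Two positively correlated afferents: subtractive normalization lets
# the slightly stronger one win (segregation); multiplicative tends to
# preserve the ratio, so the afferents fail to segregate.
rng = np.random.default_rng(0)
for mode in ("multiplicative", "subtractive"):
    w = np.array([0.55, 0.45])
    for _ in range(2000):
        x = rng.random() * np.array([1.0, 0.9]) + 0.1 * rng.random(2)
        y = w @ x
        w = hebb_step(w, x, y, mode=mode)
    print(mode, np.round(w, 3))
```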

8.
9.
Attractor neural networks (ANNs) based on the Ising model are naturally fully connected and are homogeneous in structure. These features permit a deep understanding of the underlying mechanism, but limit the applicability of these models to the brain. A more biologically realistic model can be derived from an equally simple physical model by utilizing recurrent self-trapping inputs to supplement very sparse intranetwork interactions. This paper reports the analysis of a one-dimensional (1-D) ANN coupled to a second system that computes overlaps with a single stored memory. Results show that: 1) the 1-D self-trapping model is equivalent to an isolated ANN with both full connectivity of one strength and nearest neighbor synapses of an independent strength; 2) the dynamics of ANN and self-trapping updates are independent; 3) there is a critical synaptic noise level below which memory retrieval occurs; 4) the 1-D self-trapping model converges to a fully connected Hopfield model for zero strength nearest neighbor synapses, and has a greater magnitude memory overlap for nonzero strength nearest neighbor synapses; and 5) the mechanism of self-trapping is an iterative map on the mean overlap as a function of the reentrant input.
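Result 5's iterative map on the mean overlap can be illustrated with the standard single-pattern mean-field map m <- tanh(beta*m), which also exhibits the critical noise level of result 3. This stand-in omits the nearest-neighbor and self-trapping terms of the actual model; beta and j_self are illustrative.

```python
import math

def overlap_map(m, beta, j_self=1.0):
    """Mean-field iterative map for the overlap with a single stored
    pattern: m <- tanh(beta * j_self * m). beta = 1/T is the inverse
    synaptic noise; j_self is the strength of the reentrant drive."""
    return math.tanh(beta * j_self * m)

def fixed_overlap(beta, m0=0.5, iters=200):
    m = m0
    for _ in range(iters):
        m = overlap_map(m, beta)
    return m

# Below the critical noise level (beta > 1) a retrieval state m > 0
# survives; above it (beta < 1) the overlap collapses to zero.
for beta in (0.8, 1.0, 1.5, 3.0):
    print(f"beta = {beta:.1f}  ->  m* = {fixed_overlap(beta):.3f}")
```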

10.
Senn W, Fusi S. Neural Computation, 2005, 17(10): 2106-2138.
Learning in a neuronal network is often thought of as a linear superposition of synaptic modifications induced by individual stimuli. However, since biological synapses are naturally bounded, a linear superposition would cause fast forgetting of previously acquired memories. Here we show that this forgetting can be avoided by introducing additional constraints on the synaptic and neural dynamics. We consider Hebbian plasticity of excitatory synapses. A synapse is modified only if the postsynaptic response does not match the desired output. With this learning rule, the original memory performance with unbounded weights is regained, provided that (1) there is some global inhibition, (2) the learning rate is small, and (3) the neurons can discriminate small differences in the total synaptic input (e.g., by making the neuronal threshold small compared to the total postsynaptic input). We prove in the form of a generalized perceptron convergence theorem that under these constraints, a neuron learns to classify any linearly separable set of patterns, including a wide class of highly correlated random patterns. During the learning process, excitation becomes roughly balanced by inhibition, and the neuron classifies the patterns on the basis of small differences around this balance. The fact that synapses saturate has the additional benefit that nonlinearly separable patterns, such as similar patterns with contradicting outputs, eventually generate a subthreshold response, and therefore silence neurons that cannot provide any information.
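A minimal sketch of such a rule, assuming bounded non-negative weights, mismatch-gated updates, and a crude global inhibition term (here simply the mean excitatory drive). All names, values, and the toy task are illustrative, not the paper's construction.

```python
import numpy as np

def train_bounded_perceptron(X, labels, epochs=500, lr=0.02, w_max=1.0):
    """Perceptron with bounded, non-negative excitatory weights: a
    weight moves only when the output mismatches the desired label, and
    weights are clipped to [0, w_max]. A global inhibitory term roughly
    balances excitation, so classification rests on small differences
    around that balance."""
    rng = np.random.default_rng(0)
    w = np.full(X.shape[1], 0.5 * w_max)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            inhibition = X.mean(axis=0) @ w        # global balancing term
            y = 1 if X[i] @ w - inhibition > 0 else 0
            if y != labels[i]:                     # modify only on mismatch
                w += lr * (labels[i] - y) * X[i]
                w = np.clip(w, 0.0, w_max)         # synapses are bounded
    return w

# A linearly separable toy task on random patterns.
rng = np.random.default_rng(1)
X = rng.random((40, 20))
labels = (X[:, 0] > 0.5).astype(int)
w = train_bounded_perceptron(X, labels)
inh = X.mean(axis=0) @ w
acc = np.mean([(1 if x @ w - inh > 0 else 0) == t for x, t in zip(X, labels)])
print("training accuracy:", acc)
```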

11.
We studied the hypothesis that synaptic dynamics is controlled by three basic principles: (1) synapses adapt their weights so that neurons can effectively transmit information, (2) homeostatic processes stabilize the mean firing rate of the postsynaptic neuron, and (3) weak synapses adapt more slowly than strong ones, while maintenance of strong synapses is costly. Our results show that a synaptic update rule derived from these principles shares features with spike-timing-dependent plasticity, is sensitive to correlations in the input, and is useful for synaptic memory. Moreover, input selectivity (sharply tuned receptive fields) of postsynaptic neurons develops only if stimuli with strong features are presented. Sharply tuned neurons can coexist with unselective ones, and the distribution of synaptic weights can be unimodal or bimodal. The formulation of synaptic dynamics through an optimality criterion provides a simple graphical argument for the stability of synapses, necessary for synaptic memory.

12.
Synapses play a central role in neural computation: the strengths of synaptic connections determine the function of a neural circuit. In conventional models of computation, synaptic strength is assumed to be a static quantity that changes only on the slow timescale of learning. In biological systems, however, synaptic strength undergoes dynamic modulation on rapid timescales through mechanisms such as short-term facilitation and depression. Here we describe a general model of computation that exploits dynamic synapses, and use a backpropagation-like algorithm to adjust the synaptic parameters. We show that such gradient descent suffices to approximate a given quadratic filter by a rather small neural system with dynamic synapses. We also compare our network model to artificial neural networks designed for time series processing. Our numerical results are complemented by theoretical analyses which show that even with just a single hidden layer such networks can approximate a surprisingly large class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model for synaptic dynamics.

13.
In earlier work we presented a stochastic model of spike-timing-dependent plasticity (STDP) in which STDP emerges only at the level of temporal or spatial synaptic ensembles. We derived the two-spike interaction function from this model and showed that it exhibits an STDP-like form. Here, we extend this work by examining the general n-spike interaction functions that may be derived from the model. A comparison between the two-spike interaction function and the higher-order interaction functions reveals profound differences. In particular, we show that the two-spike interaction function cannot support stable, competitive synaptic plasticity, such as that seen during neuronal development, without including modifications designed specifically to stabilize its behavior. In contrast, we show that all the higher-order interaction functions exhibit a fixed-point structure consistent with the presence of competitive synaptic dynamics. This difference originates in the unification of our proposed "switch" mechanism for synaptic plasticity, coupling synaptic depression and synaptic potentiation processes together. While three or more spikes are required to probe this coupling, two spikes can never do so. We conclude that this coupling is critical to the presence of competitive dynamics and that multispike interactions are therefore vital to understanding synaptic competition.

14.
The cerebellar cortical circuitry may support a distinct second form of associative learning, complementary to the well-known synaptic plasticity (long-term depression, LTD) that has been previously shown. As the granule cell axons ascend to the molecular layer, they make multiple synapses on the overlying Purkinje cells (PC). This ascending branch (AB) input, which has been ignored in models of cerebellar learning, is likely to be functionally distinct from the parallel fiber (PF) synaptic input. We predict that AB-PF correlations lead to Hebbian-type learning at the PF-PC synapse, including long-term potentiation (LTP), allowing the cortical circuit to combine AB-PF LTP for feedforward state prediction with climbing fiber LTD for feedback error correction. The new learning mechanism could therefore add computational capacity to cerebellar models and may explain more of the experimental data.

15.
Cortical neurons in vivo undergo a continuous bombardment due to synaptic activity, which acts as a major source of noise. Here, we investigate the effects of the noise filtering by synapses with various levels of realism on integrate-and-fire neuron dynamics. The noise input is modeled by white (for instantaneous synapses) or colored (for synapses with a finite relaxation time) noise. Analytical results for the modulation of firing probability in response to an oscillatory input current are obtained by expanding a Fokker-Planck equation for small parameters of the problem: when both the amplitude of the modulation is small compared to the background firing rate and the synaptic time constant is small compared to the membrane time constant. We report here the detailed calculations showing that if a synaptic decay time constant is included in the synaptic current model, the firing-rate modulation of the neuron due to an oscillatory input remains finite in the high-frequency limit with no phase lag. In addition, we characterize the low-frequency behavior and the behavior of the high-frequency limit for intermediate decay times. We also characterize the effects of introducing a rise time to the synaptic currents and the presence of several synaptic receptors with different kinetics. In both cases, we determine, using numerical simulations, an effective decay time constant that describes the neuronal response completely.
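The white- versus colored-noise setup can be reproduced numerically: a leaky integrate-and-fire neuron driven by an Ornstein-Uhlenbeck current with decay time tau_s, where tau_s -> 0 recovers the white-noise (instantaneous synapse) limit. The parameters below are illustrative; this simulates the setup, not the paper's Fokker-Planck expansion.

```python
import numpy as np

def lif_with_ou_noise(T=5.0, dt=1e-4, tau_m=0.02, tau_s=0.005,
                      mu=21e-3, sigma=6e-3, v_th=20e-3, v_reset=10e-3,
                      seed=3):
    """Leaky integrate-and-fire neuron driven by Ornstein-Uhlenbeck
    ('colored') synaptic noise with relaxation time tau_s (s). Returns
    the mean firing rate in Hz."""
    rng = np.random.default_rng(seed)
    v, I, spikes = v_reset, mu, 0
    for _ in range(int(T / dt)):
        # OU synaptic current: relaxes to mu, kicked by white noise
        I += dt * (mu - I) / tau_s \
             + sigma * np.sqrt(2 * dt / tau_s) * rng.standard_normal()
        v += dt * (-v + I) / tau_m           # membrane integration
        if v >= v_th:                        # threshold crossing: spike
            v, spikes = v_reset, spikes + 1
    return spikes / T

print("firing rate (Hz):", round(lif_with_ou_noise(), 1))
```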

16.
We model experience-dependent plasticity in the adult rat S1 cortical representation of the whiskers (the barrel cortex) which has been produced by trimming all whiskers on one side of the snout except two. This manipulation alters the pattern of afferent sensory activity while avoiding any direct nerve damage. Our simplified model circuitry represents multiple cortical layers and inhibitory neurons within each layer of a barrel-column. Utilizing a computational model, we show that the evolution of the response bias in the barrel-column towards spared whiskers is consistent with synaptic modifications that follow the rules of the Bienenstock, Cooper and Munro (BCM) theory. The BCM theory postulates that a neuron possesses a dynamic synaptic modification threshold, thetaM, which dictates whether the neuron's activity at any given instant will lead to strengthening or weakening of the synapses impinging on it. However, the major prediction of our model is the explanation of the delay in response potentiation in the layer-IV neurons through a masking effect produced by the thresholded monotonically increasing inhibition expressed by either the logarithmic function, h(x) = mu log(1 + x), or by the power function, h(x) = mu x^(0.8-0.9), where mu is a constant. Furthermore, simulated removal of the supragranular layers (layers II/III) reduces plasticity of neurons in the remaining layers (IV-VI) and points to the role of noise in synaptic plasticity.
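A hedged sketch of the BCM ingredients named here: a sliding threshold thetaM tracking the squared postsynaptic activity, and the logarithmic inhibition h(x) = mu*log(1 + x) masking the output. The two-channel input statistics and all constants are assumptions standing in for spared versus trimmed whiskers, not the paper's multilayer circuit.

```python
import numpy as np

def bcm_simulation(steps=20000, lr=1e-4, tau_theta=1000.0, mu=0.5, seed=4):
    """BCM rule with a sliding modification threshold theta_M (a running
    average of the squared postsynaptic activity) and inhibition
    h(x) = mu*log(1 + x) subtracted from the excitatory drive. Two
    input channels mimic a spared and a trimmed whisker."""
    rng = np.random.default_rng(seed)
    w = np.array([0.5, 0.5])
    theta = 1.0
    for _ in range(steps):
        x = rng.random(2) * np.array([1.0, 0.2])    # spared vs trimmed input
        drive = w @ x
        y = max(0.0, drive - mu * np.log1p(drive))  # inhibition masks output
        w += lr * y * (y - theta) * x               # BCM: sign set by theta_M
        w = np.clip(w, 0.0, None)
        theta += (y**2 - theta) / tau_theta         # sliding threshold
    return w, theta

w, theta = bcm_simulation()
print("weights (spared, trimmed):", np.round(w, 3), " theta_M:", round(theta, 3))
```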

17.
Neural associative memories are perceptron-like single-layer networks with fast synaptic learning typically storing discrete associations between pairs of neural activity patterns. Previous work optimized the memory capacity for various models of synaptic learning: linear Hopfield-type rules, the Willshaw model employing binary synapses, or the BCPNN rule of Lansner and Ekeberg, for example. Here I show that all of these previous models are limit cases of a general optimal model where synaptic learning is determined by probabilistic Bayesian considerations. Asymptotically, for large networks and very sparse neuron activity, the Bayesian model becomes identical to an inhibitory implementation of the Willshaw and BCPNN-type models. For less sparse patterns, the Bayesian model becomes identical to Hopfield-type networks employing the covariance rule. For intermediate sparseness or finite networks, the optimal Bayesian learning rule differs from the previous models and can significantly improve memory performance. I also provide a unified analytical framework to determine memory capacity at a given output noise level that links approaches based on mutual information, Hamming distance, and signal-to-noise ratio.
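Of the limit cases mentioned, the Willshaw model is the easiest to write down: binary synapses switched on by coincident pre- and postsynaptic activity, with retrieval by thresholding the dendritic sums at the cue activity. The pattern sizes below are illustrative.

```python
import numpy as np

def willshaw_store(pairs, n_in, n_out):
    """Willshaw associative memory: a binary weight matrix in which a
    synapse is switched on if its pre and post units are ever
    simultaneously active in a stored pair."""
    W = np.zeros((n_out, n_in), dtype=np.uint8)
    for x, y in pairs:
        W |= np.outer(y, x).astype(np.uint8)
    return W

def willshaw_recall(W, x):
    """Retrieve by thresholding the dendritic sums at the number of
    active cue units (exact for noiseless cues in the sparse limit)."""
    return (W @ x >= x.sum()).astype(np.uint8)

rng = np.random.default_rng(5)
n, k, patterns = 200, 5, 30          # sparse patterns: k of n units active

def sparse_pattern():
    v = np.zeros(n, dtype=np.uint8)
    v[rng.choice(n, k, replace=False)] = 1
    return v

pairs = [(sparse_pattern(), sparse_pattern()) for _ in range(patterns)]
W = willshaw_store(pairs, n, n)
ok = sum(int(np.array_equal(willshaw_recall(W, x), y)) for x, y in pairs)
print(f"{ok}/{patterns} associations recalled exactly")
```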

18.
Computer simulation of a CA1 hippocampal pyramidal neuron is used to estimate the effects of synaptic and spatio-temporal noise on such a cell's ability to accurately calculate the weighted sum of its inputs, presented in the form of transient patterns of activity. Comparison is made between the pattern recognition capability of the cell in the presence of this noise and that of a noise-free computing unit in an artificial neural network model of a heteroassociative memory. Spatio-temporal noise due to the spatial distribution of synaptic input and quantal variance at each synapse degrade the accuracy of signal integration and consequently reduce pattern recognition performance in the cell. It is shown here that a certain degree of asynchrony in action potential arrival at different synapses, however, can improve signal integration. Signal amplification by voltage-dependent conductances in the dendrites, provided by synaptic NMDA receptors, and sodium and calcium ion channels, also improves integration and pattern recognition. While the biological sources of noise are significant when few patterns are stored in the associative memory of which the cell is a part, when large numbers of patterns are stored the noise from the other stored patterns comes to dominate the pattern recognition process. In this situation, the pattern recognition performance of the pyramidal cell is within a factor of two of that of the computing unit in the artificial neural network model.
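A loose, simplified analogue of the comparison described here (a single abstract unit rather than a compartmental CA1 model): quantal variance is modeled as multiplicative gain noise at each synapse, and discriminability between stored and novel cues is summarized by d'. All parameters are assumptions.

```python
import numpy as np

def discriminability(n=1000, k=100, cv=0.4, trials=2000, seed=6):
    """A model unit stores one binary pattern in its weights and must
    separate it from novel patterns with the same number of active
    inputs. Each synapse transmits with multiplicative quantal noise
    (coefficient of variation cv). Returns d' between the dendritic-sum
    distributions for stored versus novel cues."""
    rng = np.random.default_rng(seed)

    def rand_pattern():
        v = np.zeros(n)
        v[rng.choice(n, k, replace=False)] = 1
        return v

    stored = rand_pattern()
    w = stored.copy()                        # one-shot Hebbian weights

    def noisy_sum(x):
        gains = rng.normal(1.0, cv, n)       # quantal variance per synapse
        return (w * gains) @ x

    s_on = np.array([noisy_sum(stored) for _ in range(trials)])
    s_off = np.array([noisy_sum(rand_pattern()) for _ in range(trials)])
    return (s_on.mean() - s_off.mean()) / np.sqrt(0.5 * (s_on.var() + s_off.var()))

# Larger quantal variance degrades the weighted-sum computation and
# hence pattern recognition, mirroring the qualitative effect reported.
for cv in (0.0, 0.4, 1.0):
    print(f"quantal CV = {cv:.1f}  ->  d' = {discriminability(cv=cv):.1f}")
```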

19.
A distributed and locally reprogrammable address-event receiver has been designed, in which incoming address-events are monitored simultaneously by all synapses, allowing for arbitrarily large axonal fan-out without reducing channel capacity. Synapses can change the address of their presynaptic neuron, allowing the distributed implementation of a biologically realistic learning rule, with both synapse formation and elimination (synaptic rewiring). Probabilistic synapse formation leads to topographic map development, made possible by a cross-chip current-mode calculation of Euclidean distance. As well as synaptic plasticity in rewiring, synapses change weights using a competitive Hebbian learning rule (spike-timing-dependent plasticity). The weight plasticity allows receptive fields to be modified based on spatio-temporal correlations in the inputs, and the rewiring plasticity allows these modifications to become embedded in the network topology.

20.
Recently we presented a stochastic, ensemble-based model of spike-timing-dependent plasticity. In this model, single synapses do not exhibit plasticity depending on the exact timing of pre- and postsynaptic spikes, but spike-timing-dependent plasticity emerges only at the temporal or synaptic ensemble level. We showed that such a model reproduces a variety of experimental results in a natural way, without the introduction of various, ad hoc nonlinearities characteristic of some alternative models. Our previous study was restricted to an analytical examination of two-spike interactions, while higher-order, multispike interactions were examined only briefly and numerically. Here we derive exact, analytical results for the general n-spike interaction functions in our model. Our results form the basis for a detailed examination, performed elsewhere, of the significant differences between these functions and the implications these differences have for the presence, or otherwise, of stable, competitive dynamics in our model.
