Similar Documents
20 similar documents found.
1.
Synapses play a central role in neural computation: the strengths of synaptic connections determine the function of a neural circuit. In conventional models of computation, synaptic strength is assumed to be a static quantity that changes only on the slow timescale of learning. In biological systems, however, synaptic strength undergoes dynamic modulation on rapid timescales through mechanisms such as short-term facilitation and depression. Here we describe a general model of computation that exploits dynamic synapses and use a backpropagation-like algorithm to adjust the synaptic parameters. We show that such gradient descent suffices to approximate a given quadratic filter with a rather small neural system with dynamic synapses. We also compare our network model to artificial neural networks designed for time-series processing. Our numerical results are complemented by theoretical analyses showing that, even with just a single hidden layer, such networks can approximate a surprisingly large class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model of synaptic dynamics.
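As a point of reference for what "dynamic synapses" means operationally, here is a minimal Python sketch of one standard phenomenological scheme for short-term facilitation and depression (a Tsodyks-Markram-style model); the parameter names and values (U, tau_f, tau_d) are illustrative assumptions, not the model trained in the paper.

```python
import numpy as np

def dynamic_synapse(spike_times, U=0.2, tau_f=0.6, tau_d=0.3, w=1.0):
    """Effective amplitude of each presynaptic spike under short-term
    facilitation (utilization u) and depression (resources r).
    Illustrative Tsodyks-Markram-style sketch, not the paper's model."""
    u, r = 0.0, 1.0
    last_t, amplitudes = None, []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u *= np.exp(-dt / tau_f)                    # facilitation decays between spikes
            r = 1.0 - (1.0 - r) * np.exp(-dt / tau_d)   # resources recover toward 1
        u += U * (1.0 - u)      # spike transiently raises utilization (facilitation)
        a = w * u * r           # effective weight applied to this spike
        r *= (1.0 - u)          # spike consumes a fraction u of resources (depression)
        amplitudes.append(a)
        last_t = t
    return amplitudes

# a burst followed by a late spike: amplitudes facilitate, depress, then recover
print(dynamic_synapse([0.0, 0.05, 0.10, 0.15, 0.60]))
```

In a trainable network of this kind, U, tau_f, and tau_d would be among the synaptic parameters adjusted by the backpropagation-like algorithm.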

2.
In this article we revisit the classical neuroscience paradigm of Hebbian learning. We find that it is difficult to achieve effective associative memory storage by Hebbian synaptic learning alone, since it requires either network-level information at the synaptic level or sparsely coded patterns. Effective learning can nevertheless be achieved, even with nonsparse patterns, by a neuronal process that maintains a zero sum of the incoming synaptic efficacies. This weight correction improves the memory capacity of associative networks from an essentially bounded one to a capacity that scales linearly with network size. It also enables the effective storage of patterns with multiple levels of activity within a single network. Such neuronal weight correction can be carried out by activity-dependent homeostasis of the neuron's synaptic efficacies, which has recently been observed in cortical tissue. Our findings thus suggest that Hebbian associative learning should be accompanied by continuous synaptic remodeling driven by neuronal regulatory processes in the brain.
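A NumPy sketch of the zero-sum correction described above: after Hebbian storage, each neuron shifts its incoming efficacies so that they sum to zero. Pattern statistics and network size here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 20
patterns = rng.choice([-1.0, 1.0], size=(P, N))   # nonsparse patterns (illustrative)

# Hebbian storage
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Zero-sum weight correction: shift each neuron's incoming efficacies
# (one row of W) so that the off-diagonal entries sum to zero.
W -= W.sum(axis=1, keepdims=True) / (N - 1)
np.fill_diagonal(W, 0.0)

print(np.abs(W.sum(axis=1)).max())   # incoming sums are now zero (to rounding)
```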

3.
It is known that storage capacity per synapse increases with synaptic pruning in the case of a correlation-type associative memory model. However, the storage capacity of the entire network then decreases. To overcome this difficulty, we propose decreasing the connectivity while keeping the total number of synapses constant by introducing delayed synapses. In this paper, a discrete synchronous-type model with both delayed synapses and their pruning is discussed as a concrete example of the proposal. First, we explain the Yanai-Kim theory by employing statistical neurodynamics; this theory provides macrodynamical equations for the dynamics of a network with serial delay elements. Next, exploiting the translational symmetry of these equations, we rederive the macroscopic steady-state equations of the model using the discrete Fourier transform. The storage capacities are analyzed quantitatively. Furthermore, two types of synaptic pruning are treated analytically: random pruning and systematic pruning. It becomes clear that with both types of pruning, the storage capacity increases as the length of delay increases and the connectivity of the synapses decreases when the total number of synapses is constant. Moreover, an interesting fact emerges: under random pruning the storage capacity asymptotically approaches 2/π, whereas under systematic pruning it diverges in proportion to the logarithm of the length of delay, with proportionality constant 4/π. These results theoretically support the significance of pruning following an overgrowth of synapses in the brain, and may suggest that the brain prefers to store dynamic attractors such as sequences and limit cycles rather than equilibrium states.
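The delayed-synapse analysis itself is beyond a short sketch, but the two pruning rules compared in the abstract can be illustrated on a plain Hebbian weight matrix; sizes and connectivity below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 500, 50
xi = rng.choice([-1.0, 1.0], size=(P, N))
W = xi.T @ xi / N
np.fill_diagonal(W, 0.0)

keep = 0.2   # target connectivity after pruning (illustrative)

# Random pruning: delete synapses uniformly at random.
W_random = W * (rng.random((N, N)) < keep)

# Systematic pruning: keep only the synapses with the largest |efficacy|.
k = int(keep * N * N)
threshold = np.partition(np.abs(W).ravel(), -k)[-k]
W_systematic = W * (np.abs(W) >= threshold)
```

The paper's result is that, at fixed total synapse count, it is the systematic variant whose capacity grows logarithmically with the delay length.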

4.
Neural systems as nonlinear filters
Maass W, Sontag ED. Neural Computation, 2000, 12(8): 1743-1772
Experimental data show that biological synapses behave quite differently from the symbolic synapses in all common artificial neural network models. Biological synapses are dynamic; their "weight" changes on a short timescale by several hundred percent, depending on the past input to the synapse. In this article we address the question of how this inherent synaptic dynamics (which should not be confused with long-term learning) affects the computational power of a neural network. In particular, we analyze computations on temporal and spatiotemporal patterns, and we give a complete mathematical characterization of all filters that can be approximated by feedforward neural networks with dynamic synapses. It turns out that even with just a single hidden layer, such networks can approximate a very rich class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model for synaptic dynamics. Our characterization result provides, for all nonlinear filters approximable by Volterra series, a new complexity hierarchy related to the cost of implementing such filters in neural systems.
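For reference, a filter characterized by a Volterra series has the standard textbook form, shown here truncated at second order (this notation is ours, not the paper's):

```latex
y(t) = h_0
     + \int_0^{\infty} h_1(\tau)\, x(t-\tau)\, d\tau
     + \int_0^{\infty}\!\!\int_0^{\infty} h_2(\tau_1,\tau_2)\, x(t-\tau_1)\, x(t-\tau_2)\, d\tau_1\, d\tau_2
     + \cdots
```

The characterization result says that any filter expressible in this form can be approximated by a feedforward network with dynamic synapses and a single hidden layer.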

5.
We study the effect of competition between short-term synaptic depression and facilitation on the dynamic properties of attractor neural networks, using Monte Carlo simulation and a mean-field analysis. Depending on the balance of depression, facilitation, and the underlying noise, the network displays different behaviors, including associative memory and switching of activity between different attractors. We conclude that synaptic facilitation enhances attractor instability in a way that (1) intensifies the system's adaptability to external stimuli, in agreement with experiments, and (2) favors the retrieval of information with less error during short time intervals.

6.
It has been shown in studies of biological synaptic plasticity that synaptic efficacy can change within a very short time window compared to the time scale of typical neural events. This time scale is small enough to possibly affect pattern recall in neural networks. We study the properties of a neural network that uses a cyclic Hebb rule, and then add short-term potentiation of synapses in the recall phase. We show that this approach preserves the ability of the network to recognize the stored patterns without responding to other patterns at the same time, and that it dramatically increases the capacity of the network at the cost of a longer pattern recall process. The network thus possesses two types of recall: fast recall, which does not need synaptic plasticity to recognize a pattern, and slower recall, which utilizes synaptic plasticity. This is something we all experience in daily life: some memories can be recalled promptly, whereas recollection of other memories requires much more time.

7.
Neurophysiological experiments show that the strength of synaptic connections can undergo substantial changes on a short time scale, depending on the history of the presynaptic input. Using mean-field techniques, we study how the short-time dynamics of synaptic connections influence the performance of attractor neural networks, in terms of both memory capacity and the capability to process external signals. For binary discrete-time as well as firing-rate continuous-time neural networks, the fixed points of the network dynamics are shown to be unaffected by synaptic dynamics; however, the stability of patterns changes considerably. Synaptic depression turns out to reduce the storage capacity, but is shown to be advantageous for processing pattern sequences. The analytical results on stability, the size of the basins of attraction, and the switching between patterns are complemented by numerical simulations.

8.
9.
The CA3 region of the hippocampus is a recurrent neural network that is essential for the storage and replay of sequences of patterns that represent behavioral events. Here we present a theoretical framework to calculate a sparsely connected network's capacity to store such sequences. As in CA3, only a limited subset of neurons in the network is active at any one time, pattern retrieval is subject to error, and the resources for plasticity are limited. Our analysis combines an analytical mean-field approach, stochastic dynamics, and cellular simulations of a time-discrete McCulloch-Pitts network with binary synapses. To maximize the number of sequences that can be stored in the network, we concurrently optimize the number of active neurons, that is, pattern size, and the firing threshold. We find that for one-step associations (i.e., minimal sequences), the optimal pattern size is inversely proportional to the mean connectivity c, whereas the optimal firing threshold is independent of the connectivity. If the number of synapses per neuron is fixed, the maximum number P of stored sequences in a sufficiently large, nonmodular network is independent of its number N of cells. On the other hand, if the number of synapses scales as the network size to the power of 3/2, the number of sequences P is proportional to N. In other words, sequential memory is scalable. Furthermore, we find that there is an optimal ratio r between silent and nonsilent synapses at which the storage capacity α = P/[c(1 + r)N] assumes a maximum. For long sequences, the capacity of sequential memory is about one order of magnitude below the capacity for minimal sequences, but otherwise behaves similarly to the case of minimal sequences. In a biologically inspired scenario, the information content per synapse is far below theoretical optimality, suggesting that the brain trades off error tolerance against information content in encoding sequential memories.

10.
In this paper, we investigate the effect of synaptogenesis on memories in the brain, using an abstract associative memory model: the Hopfield model with zero-order synaptic decay. Using numerical simulations, we demonstrate that synaptogenesis can play a role in maintaining recent memories embedded in the network while avoiding overloading. For a network of 1000 units, it turns out that the minimum decay rate needed to avoid overloading is 0.02, and the decay rate that maximizes the storage capacity is 0.08. We also show that the average numbers of synapses replaced at each learning step corresponding to these two values are 1187 and 21024, respectively.
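As a loose illustration only: one plausible reading of "zero-order synaptic decay" is that every weight shrinks toward zero by a fixed amount at each learning step, independent of its current value (first-order decay would instead shrink weights proportionally). This interpretation, and all parameters below, are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d = 1000, 0.02   # N units; d is the decay rate per learning step

W = np.zeros((N, N))
for step in range(100):
    # zero-order decay (assumed form): fixed-size step toward zero
    W = np.sign(W) * np.maximum(np.abs(W) - d / N, 0.0)
    # Hebbian storage of one new pattern per step
    xi = rng.choice([-1.0, 1.0], size=N)
    W += np.outer(xi, xi) / N
    np.fill_diagonal(W, 0.0)
```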

11.
In most neural network models, synapses are treated as static weights that change only with the slow time scales of learning. It is well known, however, that synapses are highly dynamic and show use-dependent plasticity over a wide range of time scales. Moreover, synaptic transmission is an inherently stochastic process: a spike arriving at a presynaptic terminal triggers the release of a vesicle of neurotransmitter from a release site with a probability that can be much less than one. We consider a simple model for dynamic stochastic synapses that can easily be integrated into common models for networks of integrate-and-fire neurons (spiking neurons). The parameters of this model have direct interpretations in terms of synaptic physiology. We investigate the consequences of the model for computing with individual spikes and demonstrate through rigorous theoretical results that the computational power of the network is increased through the use of dynamic synapses.
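A Python sketch of the stochastic-release idea: each presynaptic spike releases a vesicle with a history-dependent probability that can be well below one. The facilitation and recovery rules and all parameters below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(3)

def stochastic_release(spike_times, p0=0.3, f=0.2, tau_f=0.5, tau_rec=0.8):
    """For each presynaptic spike, draw whether a vesicle is released.
    Facilitation raises the release probability after each spike; after a
    release, the site is unavailable until it recovers. Illustrative only."""
    releases, p = [], p0
    recovered_at, last_t = -np.inf, None
    for t in spike_times:
        if last_t is not None:
            p = p0 + (p - p0) * np.exp(-(t - last_t) / tau_f)  # facilitation decays
        released = (t >= recovered_at) and (rng.random() < p)
        if released:
            recovered_at = t + rng.exponential(tau_rec)  # depression: site must recover
        p = min(1.0, p + f)    # each spike facilitates subsequent release
        releases.append(released)
        last_t = t
    return releases

print(stochastic_release([0.0, 0.02, 0.04, 0.06, 1.0]))
```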

12.
Recent experimental findings show that the efficacy of transmission in cortical synapses depends on presynaptic activity. In most neural models, however, synapses are regarded as static entities in which this dependence is not included. We study the role of activity-dependent (dynamic) synapses in neuronal responses to temporal patterns of afferent activity. Our results demonstrate that, for suitably chosen threshold values, dynamic synapses are capable of coincidence detection (CD) over a much larger range of frequencies than static synapses. The phenomenon appears to hold for an integrate-and-fire as well as a Hodgkin-Huxley neuron, and for various types of CD tasks.

13.
Image preprocessing with dynamic synapses
In image processing, different algorithms are developed for specific classes of pictures. Here we demonstrate the filtering capability of a spiking neural network based on dynamic synapses. For this purpose we chose an X-ray image of the human coronary trees and another noisy image. In other words, the task at hand is to show how accurately such a network can store various aspects (object/background) of a stimulus in the variables that describe the dynamics of the synaptic response. The behavior of these synapses influences the effective connectivity of the network on a short time scale. Such a network exhibits low activity and balanced behavior. Dynamic synapses can adjust their behavior to rapidly changing stimuli, retaining the stimulus information in variables such as potential and time.

14.
15.
Electronic neuromorphic devices with on-chip, online learning should be able to modify their synaptic couplings quickly, to acquire information about new patterns to be stored (synaptic plasticity), and at the same time preserve this information on very long time scales (synaptic stability). Here we illustrate the electronic implementation of a simple solution to this stability-plasticity problem, recently proposed and studied in various contexts. It is based on the observation that reducing the analog depth of the synapses to the extreme (bistable synapses) does not necessarily disrupt the performance of the device as an associative memory, provided that 1) the number of neurons is large enough, 2) the transitions between stable synaptic states are stochastic, and 3) learning is slow. The drastic reduction of the analog depth of the synaptic variable also makes this solution appealing from the point of view of electronic implementation, and offers a simple methodological alternative to the technological solution based on floating gates. We describe the full-custom analog very large-scale integration (VLSI) realization of a small network of integrate-and-fire neurons connected by bistable deterministic plastic synapses that can implement this idea of stochastic learning. In the absence of stimuli, the memory is preserved indefinitely. During stimulation the synapse undergoes quick temporary changes through the activities of the pre- and postsynaptic neurons; those changes stochastically result in a long-term modification of the synaptic efficacy. The intentionally disordered pattern of connectivity allows the system to generate the randomness needed to drive the stochastic selection mechanism. Using a suitable stimulation protocol, we verify that the stochastic synaptic plasticity produces the expected pattern of potentiation and depression in the electronic network.
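The learning rule the chip implements can be caricatured in a few lines: synapses are binary, candidate transitions are driven by pre/postsynaptic coincidences, and each transition is accepted only with a small probability, which is what makes learning slow and memory stable. The specific transition conditions and probabilities below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 1000
J = rng.integers(0, 2, size=(N, N)).astype(float)   # bistable synapses: 0 or 1
q_up = q_down = 0.02                                 # small probabilities => slow learning

def stochastic_learning_step(J, pre, post):
    """Candidate potentiation where pre and post are both active; candidate
    depression where pre is active but post is silent. Each candidate
    transition between the two stable states is taken stochastically."""
    pot = np.outer(post, pre).astype(bool)           # pre & post active
    dep = np.outer(1.0 - post, pre).astype(bool)     # pre active, post silent
    J[pot & (J == 0) & (rng.random(J.shape) < q_up)] = 1.0
    J[dep & (J == 1) & (rng.random(J.shape) < q_down)] = 0.0
    return J

pattern = (rng.random(N) < 0.1).astype(float)        # sparse 0/1 activity
J = stochastic_learning_step(J, pattern, pattern)
```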

16.
Zhou L, Zhao S, Nadim F. Neurocomputing, 2007, 70(10-12): 2050-2054
Network plasticity arises in large part from the effects of exogenous neuromodulators. We investigate the neuromodulatory effects on short-term synaptic dynamics. The synapse from the lateral pyloric (LP) neuron to the pyloric dilator (PD) neuron in the pyloric network of the crab C. borealis has both spike-mediated and non-spike-mediated (graded) components. Previous studies have shown that the graded component of this synapse exhibits short-term depression. Recent results from our lab indicate that, in the presence of the neuromodulatory peptide proctolin, low-amplitude presynaptic stimuli switch the short-term dynamics of this graded component from depression to facilitation. In this study, we show that this facilitation is correlated with the activation of a presynaptic inward current that is blocked by Mn²⁺, suggesting that it is a slowly accumulating Ca²⁺ current. We modify a mechanistic model of synaptic release by assuming that the low-voltage-activated Ca²⁺ current in our system is composed of two currents with fast (I_CaF) and slow (I_CaS) kinetics. We show that if proctolin adjusts the activation rate of I_CaS, local intracellular Ca²⁺ accumulates in response to multiple presynaptic voltage stimuli, which in turn results in synaptic facilitation. Additionally, we assume that proctolin increases the maximal conductances of the Ca²⁺ currents in the model, consistent with the increased synaptic release found in the experiments. We find that these two presynaptic actions of proctolin in the model are sufficient to describe its actions on the short-term dynamics of the LP to PD synapse.
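A qualitative numerical sketch of the proposed mechanism: two activation variables with fast and slow time constants are driven by repeated depolarizing pulses; the slow one accumulates across pulses, local Ca²⁺ builds up, and release (taken here as proportional to a power of Ca²⁺) facilitates. All equations and parameters below are illustrative assumptions, not the published model.

```python
# Qualitative sketch: slow Ca-current activation accumulates over pulses.
dt, tau_f, tau_s, tau_ca = 1e-4, 0.005, 0.5, 0.2   # seconds (illustrative)
m_f = m_s = ca = 0.0
peak_release = []

t = 0.0
while t < 2.0:
    pulse = (t % 0.25) < 0.05                 # 50-ms pulses every 250 ms
    m_inf = 1.0 if pulse else 0.0             # target activation during a pulse
    m_f += dt * (m_inf - m_f) / tau_f         # fast current tracks each pulse
    m_s += dt * (m_inf - m_s) / tau_s         # slow current accumulates across pulses
    ca += dt * (m_f + m_s - ca / tau_ca)      # local Ca: influx minus clearance
    if pulse:
        peak_release.append(ca ** 3)          # release ~ Ca^n (cooperative)
    t += dt

print(peak_release[0], peak_release[-1])      # later pulses release more: facilitation
```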

17.
Stochastic models of synaptic plasticity propose that single synapses perform a directed random walk of fixed step sizes in synaptic strength, thereby embracing the view that the mechanisms of synaptic plasticity constitute a stochastic dynamical system. However, fluctuations in synaptic strength present a formidable challenge to such an approach. We have previously proposed that single synapses must interpose an integration and filtering mechanism between the induction of synaptic plasticity and the expression of synaptic plasticity in order to control fluctuations. We analyze a class of three such mechanisms in the presence of possibly non-Markovian plasticity induction processes, deriving expressions for the mean expression time in these models. One of these filtering mechanisms constitutes a discrete low-pass filter that could be implemented on a small collection of molecules at single synapses, such as CaMKII, and we analyze this discrete filter in some detail. After considering Markov induction processes, we examine our own stochastic model of spike-timing-dependent plasticity, for which the probability density functions of the induction of plasticity steps have previously been derived. We determine the dependence of the mean time to express a plasticity step on pre- and postsynaptic firing rates in this model, and we also consider, numerically, the long-term stability against fluctuations of patterns of neuronal connectivity that typically emerge during neuronal development.
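The discrete low-pass filter variant can be stated very compactly; a minimal sketch, with an assumed threshold value and not the authors' exact mechanism:

```python
def filtered_plasticity(induction_events, theta=5):
    """Integrate +1/-1 induction events; express a plasticity step only
    when the running count reaches +theta or -theta, then reset.
    Minimal sketch of a discrete synaptic low-pass filter."""
    count, expressed = 0, []
    for e in induction_events:      # e is +1 (potentiating) or -1 (depressing)
        count += e
        if count >= theta:
            expressed.append(+1)    # express one potentiation step
            count = 0
        elif count <= -theta:
            expressed.append(-1)    # express one depression step
            count = 0
    return expressed

# noisy induction: mostly potentiating with occasional depression events
print(filtered_plasticity([+1, +1, -1, +1, +1, +1, +1, -1, +1, +1]))
```

The filter suppresses isolated, fluctuation-driven induction events; only a sustained imbalance of inductions is expressed as a change in synaptic strength.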

18.
Short-term synaptic plasticity and network behavior.
We develop a minimal time-continuous model for use-dependent synaptic short-term plasticity that can account for both short-term depression and short-term facilitation. It is analyzed in the context of the spike response neuron model. Explicit expressions are derived for the synaptic strength as a function of previous spike arrival times. These results are then used to investigate the behavior of large networks of highly interconnected neurons in the presence of short-term synaptic plasticity. We extend previous results so as to elucidate the existence and stability of limit cycles with coherently firing neurons. After the onset of an external stimulus, we have found complex transient network behavior that manifests itself as a sequence of different modes of coherent firing until a stable limit cycle is reached.
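For reference, one widely cited explicit per-spike form of such a model is the Tsodyks-Pawelzik-Markram recursion, with interspike interval Δ (the expressions derived in the paper, which uses the spike response model, may differ in detail):

```latex
u_{n+1} = u_n e^{-\Delta/\tau_{\mathrm{fac}}} + U\left(1 - u_n e^{-\Delta/\tau_{\mathrm{fac}}}\right),
\qquad
R_{n+1} = R_n\left(1 - u_{n+1}\right) e^{-\Delta/\tau_{\mathrm{rec}}} + 1 - e^{-\Delta/\tau_{\mathrm{rec}}},
\qquad
A_n \propto u_n R_n
```

Here u_n captures facilitation, R_n the remaining resources (depression), and A_n the resulting synaptic strength at the n-th spike.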

19.
Human and animal studies show that mammalian brains undergo massive synaptic pruning during childhood, losing about half of their synapses by puberty. We have previously shown that maintaining network performance while synapses are deleted requires that synapses be properly modified and pruned, with the weaker synapses removed. We now show that neuronal regulation, a mechanism recently observed to maintain the average input field of a postsynaptic neuron, results in weight-dependent synaptic modification. Within the correct range of the degradation dimension and synaptic upper bound, neuronal regulation removes the weaker synapses and judiciously modifies the remaining ones. By deriving optimal synaptic modification functions in an excitatory-inhibitory network, we prove that neuronal regulation implements near-optimal synaptic modification and maintains the performance of a network undergoing massive synaptic pruning. These findings support the possibility that neuronal regulation complements the action of Hebbian synaptic changes in the self-organization of the developing brain.
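A sketch of one round of the regulation-plus-pruning process in NumPy: each neuron multiplicatively rescales its incoming excitatory weights toward a target input field, weights that fall below a deletion threshold are pruned, and stronger ones saturate at an upper bound. Thresholds, bounds, and the degradation step are illustrative assumptions.

```python
import numpy as np

def neuronal_regulation(W, target_field, prune_below=0.01, w_max=1.0):
    """One illustrative round of neuronal regulation on excitatory weights:
    homeostatic rescaling of each neuron's incoming field, deletion of the
    weakest synapses, and a synaptic upper bound."""
    field = W.sum(axis=1, keepdims=True)                 # current input field
    W = W * (target_field / np.maximum(field, 1e-12))    # multiplicative upscaling
    W[W < prune_below] = 0.0                             # prune the weakest synapses
    return np.minimum(W, w_max)                          # enforce the upper bound

rng = np.random.default_rng(5)
W = rng.random((100, 100)) * 0.05
target = W.sum(axis=1).mean()                            # field to be maintained
for _ in range(10):                                      # degradation/regulation cycles
    W *= 0.9                                             # uniform synaptic degradation
    W = neuronal_regulation(W, target)
```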

20.
Shao J, Tsao TH, Butera R. Neural Computation, 2006, 18(9): 2029-2035
Bursting, a dynamical phenomenon in which episodes of neural action potentials are punctuated by periodic episodes of inactivity, is ubiquitous in neural systems. Examples include components of the respiratory rhythm-generating circuitry in the brain stem, spontaneous activity in the neonatal rat spinal cord, and developing neural networks in the retina of the immature ferret. Bursting can also manifest itself in single neurons. Bursting dynamics require one or more kinetic processes slower than the timescale of the action potentials. Such processes usually take the form of intrinsic ion channel properties, such as slow voltage-dependent gating or calcium-dependent processes, or synaptic mechanisms, such as synaptic depression. In this note, we show rhythmic bursting in a simulated neural network where no such slow processes exist at the cellular or synaptic level. Rather, the existence of rhythmic bursting depends critically on the connectivity of the network and manifests itself only when the connectivity is small-world. The slow process underlying the timescale of bursting emerges as a progressive synchronization of the network within each burst.
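The connectivity regime in question can be generated with the standard Watts-Strogatz construction, sketched here with networkx (parameter values are illustrative, not the paper's):

```python
import networkx as nx

# Watts-Strogatz small world: a ring lattice (each of n nodes linked to its
# k nearest neighbors) with each edge rewired with probability p. At
# intermediate p the graph keeps high clustering but gains short path
# lengths -- the "small world" regime.
n, k, p = 200, 8, 0.1
G = nx.connected_watts_strogatz_graph(n, k, p, tries=100, seed=42)

print(nx.average_clustering(G), nx.average_shortest_path_length(G))
```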
