Similar Literature
20 similar documents found (search time: 31 ms)
1.
Some neurons encode information about the orientation or position of an animal and can maintain their response properties in the absence of visual input. Examples include head direction cells in rats and primates, place cells in rats, and spatial view cells in primates. 'Continuous attractor' neural networks model these continuous physical spaces using recurrent collateral connections between the neurons that reflect the distance between the neurons in the state space (e.g. head direction space) of the animal. These networks maintain a localized packet of neuronal activity representing the current state of the animal. We show how the synaptic connections in a one-dimensional continuous attractor network (of, for example, head direction cells) could be self-organized by associative learning. We also show how the activity packet could be moved from one location to another by idiothetic (self-motion) inputs, for example vestibular or proprioceptive signals, and how the synaptic connections could self-organize to implement this. The models described use 'trace' associative synaptic learning rules that utilize a form of temporal average of recent cell activity to associate the firing of rotation cells with the recent change in the representation of head direction in the continuous attractor. We also show how a nonlinear neuronal activation function, which could be implemented by NMDA receptors, could contribute to the stability of the activity packet that represents the current state of the animal.
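A minimal sketch of the core mechanism, our illustration rather than the authors' implementation, showing how distance-dependent recurrent weights on a ring of head direction cells can hold a stable activity packet; the Gaussian weight profile, threshold-linear activation, and normalization are illustrative assumptions:

```python
import numpy as np

N = 100                                    # head direction cells on a ring
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

# Recurrent weights fall off with the angular distance between the
# preferred directions of each pair of cells (local excitation on top
# of uniform inhibition).
d = np.abs(theta[:, None] - theta[None, :])
d = np.minimum(d, 2 * np.pi - d)           # wrap-around distance on the ring
W = np.exp(-d**2 / (2 * 0.3**2)) - 0.5

r = np.exp(-((theta - np.pi) ** 2) / 0.1)  # initial activity bump at pi
for _ in range(200):                       # relax under recurrent dynamics
    r = np.maximum(W @ r, 0.0)             # threshold-linear activation
    r /= np.linalg.norm(r) + 1e-12         # keep total activity bounded

print("activity packet centred near", theta[np.argmax(r)])
```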

2.
We introduce a model of generalized Hebbian learning and retrieval in oscillatory neural networks modeling cortical areas such as the hippocampus and olfactory cortex. Recent experiments have shown that synaptic plasticity depends on spike timing, especially at synapses from excitatory pyramidal cells, in the hippocampus and in sensory and cerebellar cortex. Here we study how such plasticity can be used to form memories and input representations when the neural dynamics are oscillatory, as is common in the brain (particularly in the hippocampus and olfactory cortex). Learning is assumed to occur in a phase of neural plasticity in which the network is clamped to external teaching signals. By suitable manipulation of the nonlinearity of the neurons or of the oscillation frequencies during learning, the model can be made, in a retrieval phase, either to categorize new inputs or to map them, in a continuous fashion, onto the space spanned by the imprinted patterns. We identify the first of these possibilities with the function of olfactory cortex and the second with the observed response characteristics of place cells in the hippocampus. We investigate both kinds of networks analytically and by computer simulation, and we link the models with experimental findings, exploring in particular how the spike-timing dependence of the synaptic plasticity constrains the computational function of the network and vice versa.

3.
Realistic neural network models involve the coexistence of stiff, coupled, continuous differential equations, arising from the integration of individual neurons, with the discrete, delayed events used to model synaptic connections. We present here an integration method, the local variable time-step method (lvardt), that uses separate variable-step integrators for the individual neurons in the network. Cells that are undergoing excitation tend to have small time steps, while cells that are at rest with little synaptic input tend to have large time steps. A synaptic input to a cell causes reinitialization of only that cell's integrator, without affecting the integration of other cells. We illustrate the use of lvardt on three models: a worst-case synchronizing mutual-inhibition model, a best-case synfire chain model, and a more realistic thalamocortical network model.
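A schematic sketch of the idea behind lvardt, separate per-cell integrators that are reinitialized only by the events they receive; the event queue, the leak-only "ODE solve", and the step-size handling are heavily simplified assumptions, not the actual implementation:

```python
import heapq

class Cell:
    def __init__(self, name):
        self.name, self.t, self.v = name, 0.0, -65.0
        self.dt = 1.0                       # resting cells take large steps

    def advance(self, t_target):
        # Stand-in for a variable-step ODE solve up to t_target.
        while self.t < t_target:
            step = min(self.dt, t_target - self.t)
            self.v += -0.1 * (self.v + 65.0) * step   # leak toward rest
            self.t += step

    def receive_spike(self, t_event, w):
        self.advance(t_event)               # integrate up to the event,
        self.v += w                         # apply the synaptic input,
        self.dt = 0.01                      # and restart with a small step

# Only the cell receiving an event is touched; all other cells keep
# integrating with their own, possibly much larger, time steps.
events = [(5.0, "A", 10.0), (7.5, "B", 10.0)]    # (time, target, weight)
heapq.heapify(events)
cells = {n: Cell(n) for n in "AB"}
while events:
    t, target, w = heapq.heappop(events)
    cells[target].receive_spike(t, w)
```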

4.
It is generally accepted among neuroscientists that the sensory cortex of the brain is arranged in a layered structure. Based on a unified quantum holographic approach to artificial neural network models implemented with coherent, hybrid optoelectronic, or analog electronic neurocomputer architectures, the present paper establishes a novel identity for the matching polynomials of complete bichromatic graphs, which implement the intrinsic connections between neurons of local networks located in neural layers.

5.
The role of synaptic dynamics in processing neural information is investigated in a neural information channel built from realistic model neurons with chaotic intrinsic dynamics. Our neuron models are realized in simple analogue circuits, and our synaptic connections are realized both in analogue circuits and through a dynamic clamp program. The information that is input to the first chaotic neuron in the channel emerges partially absent and partially 'hidden'. Part is absent because of the dynamical effects of the chaotic oscillation, which effectively acts as a noisy channel. The 'hidden' part is recoverable. We show that synaptic parameters, most significantly receptor binding time constants, can be tuned to enhance the information transmission achieved by the combination of a neuron plus a synapse. We discuss how the dynamics of the synapse can be used to recover 'hidden' information, using average mutual information as a measure of the quality of information transport.

6.
Many biological neural network models face a problem of scalability because of the limited computational power of today's computers. It is therefore difficult to assess the ability of these models to solve complex problems such as image processing. Here we describe how this problem can be tackled using event-driven computation. Only the neurons that emit a discharge are processed, and, as long as the average spike discharge rate is low, millions of neurons and billions of connections can be modelled. We describe the underlying computation and its implementation in SpikeNET, our neural network simulation package. The type of model one can build is not only biologically compliant but also computationally efficient: 400,000 synaptic weights can be propagated per second on a standard desktop computer. In addition, for large networks, we can set very small time steps (<0.01 ms) without significantly increasing the computation time. As an example, this method is applied to solve complex cognitive tasks such as face recognition in natural images.
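A toy sketch of event-driven computation in this spirit, where per-step work is proportional to the number of spikes rather than the number of neurons; the network size, connectivity, and threshold-and-reset rule are illustrative assumptions, not SpikeNET's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)
N, threshold = 10_000, 1.0
potentials = np.zeros(N)
# Sparse fan-out: each neuron projects to a handful of random targets.
targets = [rng.integers(0, N, size=20) for _ in range(N)]
weights = [rng.normal(0.1, 0.02, size=20) for _ in range(N)]

active = rng.integers(0, N, size=50)       # externally driven seed spikes
for _ in range(10):                        # a few discrete time steps
    for pre in active:                     # work only for spiking neurons
        np.add.at(potentials, targets[pre], weights[pre])
    active = np.flatnonzero(potentials >= threshold)
    potentials[active] = 0.0               # reset the neurons that fired
```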

7.
Hoshino O 《Neural computation》2007,19(12):3310-3334
Accumulating evidence suggests that auditory cortical neurons exhibit widespread-onset responses and restricted sustained responses to sound stimuli. When a sound stimulus is presented to a subject, the auditory cortex first responds with transient discharges across a relatively large population of neurons, showing widespread-onset responses. As time passes, the activation becomes restricted to a small population of neurons that are preferentially driven by the stimulus, showing restricted sustained responses. The sustained responses are considered to have a role in expressing information about the stimulus, but it remains to be seen what role the widespread-onset responses play in auditory information processing. We carried out numerical simulations of a neural network model of a lateral belt area of auditory cortex. In the network, dynamic cell assemblies expressed information about auditory sounds. Lateral excitatory and inhibitory connections were made between cell assemblies by direct projections and by indirect projections via interneurons, respectively. Widespread-onset neuronal responses to sound stimuli (bandpassed noises) took place over the network if lateral excitation preceded lateral inhibition, opening a time window for the onset responses. The widespread-onset responses contributed to accelerating the reaction time of neurons to sensory stimulation. Lateral interaction among dynamic cell assemblies was also essential for maintaining ongoing membrane potentials near the threshold for action potential generation, thereby accelerating the reaction time to subsequent sensory input as well. We suggest that the widespread-onset neuronal responses and the ongoing subthreshold cortical state, both of which depend on the coordination of lateral synaptic interaction among dissimilar cell assemblies, may work together to allow the auditory cortex to quickly detect the sudden occurrence of sounds in the external environment.

8.
Identifying the optimal stimuli for a sensory neuron is often a difficult process involving trial and error. By analyzing the relationship between stimuli and responses in feedforward and stable recurrent neural network models, we find that the stimulus yielding the maximum firing rate response always lies on the topological boundary of the collection of all allowable stimuli, provided that individual neurons have increasing input-output relations or gain functions and that the synaptic connections are convergent between layers with nondegenerate weight matrices. This result suggests that in neurophysiological experiments under these conditions, only stimuli on the boundary need to be tested in order to maximize the response, thereby potentially reducing the number of trials needed for finding the most effective stimuli. Even when the gain functions allow firing rate cutoff or saturation, a peak still cannot exist in the stimulus-response relation in the sense that moving away from the optimum stimulus always reduces the response. We further demonstrate that the condition for nondegenerate synaptic connections also implies that proper stimuli can independently perturb the activities of all neurons in the same layer. One example of this type of manipulation is changing the activity of a single neuron in a given processing layer while keeping that of all others constant. Such stimulus perturbations might help experimentally isolate the interactions of selected neurons within a network.
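A small numerical illustration of the boundary result, our construction rather than the paper's analysis: for a convergent feedforward network with increasing gain functions and full-rank weights, a grid search over the box of allowable stimuli locates the maximal response on the box's boundary:

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 3))           # convergent layers: 3 -> 2 -> 1,
W2 = rng.normal(size=(1, 2))           # full-rank (nondegenerate) weights
xs = np.linspace(-1.0, 1.0, 41)        # allowable stimulus box [-1, 1]^3

# Evaluate the network response at every grid point in the box.
S = np.array(np.meshgrid(xs, xs, xs)).reshape(3, -1)
R = np.tanh(W2 @ np.tanh(W1 @ S)).ravel()    # tanh: increasing gain
best = S[:, np.argmax(R)]
print("optimal stimulus:", best,
      "on boundary:", bool(np.any(np.abs(best) == 1.0)))
```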

9.
Cortical sensory neurons are known to be highly variable, in the sense that responses evoked by identical stimuli often change dramatically from trial to trial. The origin of this variability is uncertain, but it is usually interpreted as detrimental noise that reduces the computational accuracy of neural circuits. Here we investigate the possibility that such response variability might in fact be beneficial, because it may partially compensate for a decrease in accuracy due to stochastic changes in the synaptic strengths of a network. We study the interplay between two kinds of noise, response (or neuronal) noise and synaptic noise, by analyzing their joint influence on the accuracy of neural networks trained to perform various tasks. We find an interesting, generic interaction: when fluctuations in the synaptic connections are proportional to their strengths (multiplicative noise), a certain amount of response noise in the input neurons can significantly improve network performance, compared to the same network without response noise. Performance is enhanced because response noise and multiplicative synaptic noise are in some ways equivalent. So if the algorithm used to find the optimal synaptic weights can take into account the variability of the model neurons, it can also take into account the variability of the synapses. Thus, the connection patterns generated with response noise are typically more resistant to synaptic degradation than those obtained without response noise. As a consequence of this interplay, if multiplicative synaptic noise is present, it is better to have response noise in the network than not to have it. These results are demonstrated analytically for the most basic network consisting of two input neurons and one output neuron performing a simple classification task, but computer simulations show that the phenomenon persists in a wide range of architectures, including recurrent (attractor) networks and sensorimotor networks that perform coordinate transformations. The results suggest that response variability could play an important dynamic role in networks that continuously learn.

10.
The discrete-time neural network proposed by Hopfield can be used for storing and recognizing binary patterns. Here, we investigate how the performance of this network on a pattern recognition task is altered when neurons are removed and the weights of the synapses belonging to the deleted neurons are divided among the remaining synapses. Five distinct ways of distributing these weights are evaluated. We speculate on how this numerical work on synaptic compensation may help guide experimental studies of memory rehabilitation interventions.
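A compact sketch of the setup, with a uniform redistribution rule standing in for the five schemes evaluated in the paper (the specific compensation rule below is our illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N            # Hebbian storage of P patterns
np.fill_diagonal(W, 0.0)

# Delete some neurons and redistribute the total weight of their
# synapses uniformly over the surviving ones.
dead = rng.choice(N, size=10, replace=False)
alive = np.setdiff1d(np.arange(N), dead)
lost = np.abs(W[np.ix_(alive, dead)]).sum()
W = W[np.ix_(alive, alive)]
W += (lost / W.size) * np.sign(W)          # boost the surviving synapses

# Recall a stored pattern from a corrupted cue with synchronous updates.
state = patterns[0, alive] * rng.choice([1, -1], size=alive.size, p=[0.9, 0.1])
for _ in range(20):
    state = np.where(W @ state >= 0, 1, -1)
print("overlap with stored pattern:", state @ patterns[0, alive] / alive.size)
```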

11.
Hopfield networks are a class of neural network models in which nonlinear graded-response neurons, organized into networks with effectively symmetric synaptic connections, are able to implement interesting algorithms, thereby introducing the concept of information storage in the stable states of dynamical systems. In addition to opening up the possibility of using system dynamics as a vehicle for gaining potentially useful insights into the behaviour of such networks, especially in the field of nonelectrical engineering, we study the dynamics of the state-space trajectory as well as the time-domain evolution of the sensitivities of the states with respect to circuit parameters.

12.
Information geometry of Boltzmann machines
A Boltzmann machine is a network of stochastic neurons. The set of all the Boltzmann machines with a fixed topology forms a geometric manifold of high dimension, where the modifiable synaptic weights of connections play the role of a coordinate system to specify networks. A learning trajectory, for example, is a curve in this manifold. It is important to study the geometry of the neural manifold, rather than the behavior of a single network, in order to know the capabilities and limitations of neural networks of a fixed topology. Using the new theory of information geometry, a natural invariant Riemannian metric and a dual pair of affine connections on the Boltzmann neural network manifold are established. The meaning of the geometrical structures is elucidated from the stochastic and the statistical points of view. This leads to a natural modification of the Boltzmann machine learning rule.
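In standard information-geometric treatments, the natural invariant Riemannian metric referred to here is the Fisher information metric; in our notation (following the general theory rather than this paper's exact symbols), for the family of network distributions $p(x;\theta)$ parameterized by the synaptic weights and biases $\theta$:

```latex
g_{ij}(\theta) \;=\; \mathbb{E}_{\theta}\!\left[
  \frac{\partial \log p(x;\theta)}{\partial \theta^{i}}\,
  \frac{\partial \log p(x;\theta)}{\partial \theta^{j}}
\right]
```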

13.
Mining temporal network models from discrete event streams is an important problem with applications in computational neuroscience, physical plant diagnostics, and human–computer interaction modeling. In this paper, we introduce the notion of excitatory networks, which are essentially temporal models where all connections are stimulative rather than inhibitive. The emphasis on excitatory connections facilitates learning of network models by creating bridges to frequent episode mining. Specifically, we show that frequent episodes help identify nodes with high mutual information relationships and that such relationships can be summarized into a dynamic Bayesian network (DBN). This leads to an algorithm that is significantly faster than state-of-the-art methods for inferring DBNs, while simultaneously providing theoretical guarantees on network optimality. We demonstrate the advantages of our approach through an application in neuroscience, where we show how strong excitatory networks can be efficiently inferred from both mathematical models of spiking neurons and several real neuroscience datasets.

14.
We present a dynamical theory of integrate-and-fire neurons with strong synaptic coupling. We show how phase-locked states that are stable in the weak coupling regime can destabilize as the coupling is increased, leading to states characterized by spatiotemporal variations in the interspike intervals (ISIs). The dynamics is compared with that of a corresponding network of analog neurons in which the outputs of the neurons are taken to be mean firing rates. A fundamental result is that for slow interactions, there is good agreement between the two models (on an appropriately defined timescale). Various examples of desynchronization in the strong coupling regime are presented. First, a globally coupled network of identical neurons with strong inhibitory coupling is shown to exhibit oscillator death, in which some of the neurons suppress the activity of others. However, the stability of the synchronous state persists for very large networks and fast synapses. Second, an asymmetric network with a mixture of excitation and inhibition is shown to exhibit periodic bursting patterns. Finally, a one-dimensional network of neurons with long-range interactions is shown to desynchronize to a state with a spatially periodic pattern of mean firing rates across the network. This is modulated by deterministic fluctuations of the instantaneous firing rate whose size is an increasing function of the speed of synaptic response.

15.
Dayhoff JE 《Neural computation》2007,19(9):2433-2467
We demonstrate a model in which synchronously firing ensembles of neurons are networked to produce computational results. Each ensemble is a group of biological integrate-and-fire spiking neurons, with probabilistic interconnections between groups. An analogy is drawn in which each individual processing unit of an artificial neural network corresponds to a neuronal group in the biological model. The activation value of a unit in the artificial neural network corresponds to the fraction of active neurons firing synchronously in a biological neuronal group. The weights of the artificial neural network correspond to the product of the interconnection density between groups, the size of the presynaptic group, and the postsynaptic potential heights in the synchronous group model; all three of these parameters can modulate connection strengths between neuronal groups. We give a nonlinear classification example (XOR) and a function approximation example in which the capability of the artificial neural network is captured by a network of synchronously firing ensembles of biological integrate-and-fire neurons. We point out that the general function approximation capability proven for feedforward artificial neural networks appears to be approximated by networks of neuronal groups that fire in synchrony, where the groups comprise integrate-and-fire neurons. We discuss the advantages of this type of model for biological systems, its possible learning mechanisms, and the associated timing relationships.

16.
Understanding the nonlinear dynamics of the olfactory bulb (OB) is essential for modelling the brain and nervous system. We have analysed the nature of odour-receptor interactions and the conditions controlling neural oscillations. This analysis is the basis for the proposed biologically plausible three-tiered model of an oscillation-driven neural network (ODNN) with three nonlinearities. The layered architecture of the bulb is viewed as a composition of different processing stages performing specific computational tasks. The presented three-tiered model of the olfactory system (TTOS) contains the sensory, olfactory bulb, and anterior nucleus tiers. The number of excitatory (mitral/tufted) cells differs from the number of inhibitory (granule) cells, which improves the cognitive ability of the model. Odour molecules are first received at the sensory layer, where receptor neurons spatio-temporally encode them in terms of spiking frequencies. Neurons expressing a specific receptor project to two or more topographically fixed glomeruli in the OB and create a sensory map. Excitatory postsynaptic potentials are formed in the primary dendrites of mitral cells and are encoded for presentation to the coupled nonlinear oscillatory model of the next, mitral-granule, layer. In a noisy background, our model functions as an associative memory, although it operates in an oscillatory mode. While feedforward networks and recurrent networks with symmetric connections always converge to static states, learning and pattern retrieval in an asymmetrically connected neural network based on oscillations are not well studied. We derive the requirements under which a state is stable and test whether a given equilibrium state is stable against noise. The ODNN demonstrates its capability to discriminate odours by using nonlinear dendro-dendritic interactions between neurons. This model allows us to visualise and analyse how the brain is able to encode information from countless molecules with different odour receptors.

17.
Karsten  Andreas  Bernd  Ana D.  Thomas 《Neurocomputing》2008,71(7-9):1694-1704
Biologically plausible excitatory neural networks develop a persistent synchronized pattern of activity that depends on spontaneous activity and synaptic refractoriness (short-term depression). With fixed synaptic weights, synchronous bursts of oscillatory activity are stable and involve the whole network. In our modeling study we investigate the effect of a dynamic Hebbian-like learning mechanism, spike-timing-dependent plasticity (STDP), on the synaptic weights as a function of synchronous activity and network connection strategy (small-world topology). We show that STDP modifies the weights of synaptic connections in such a way that synchronization of neuronal activity is considerably weakened. Networks with a higher proportion of long connections can sustain a higher level of synchronization in spite of the influence of STDP. The resulting distribution of synaptic weights in single neurons depends both on the global statistics of the firing dynamics and on the number of incoming and outgoing connections.
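A minimal sketch of the pair-based STDP rule such studies build on; the exponential windows and the parameter values are generic textbook assumptions rather than this paper's exact settings:

```python
import numpy as np

A_plus, A_minus = 0.01, 0.012      # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0   # window time constants (ms)

def stdp(dt):
    """Weight change for a spike pair, dt = t_post - t_pre in ms."""
    if dt > 0:                                  # pre before post: potentiate
        return A_plus * np.exp(-dt / tau_plus)
    return -A_minus * np.exp(dt / tau_minus)    # post before pre: depress

w = 0.5
for dt in [5.0, -5.0, 15.0, -2.0]:              # illustrative spike pairs
    w = np.clip(w + stdp(dt), 0.0, 1.0)         # keep the weight bounded
print("weight after the pairs:", w)
```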

18.
Synaptic noise due to intense network activity can have a significant impact on the electrophysiological properties of individual neurons. This is the case for the cerebral cortex, where ongoing activity leads to strong barrages of synaptic inputs, which act as the main source of synaptic noise affecting neuronal dynamics. Here, we characterize the subthreshold behavior of neuronal models in which synaptic noise is represented by either additive or multiplicative noise, described by Ornstein-Uhlenbeck processes. We derive and solve the Fokker-Planck equation for this system, which describes the time evolution of the probability density function of the membrane potential. We obtain an analytic expression for the membrane potential distribution at steady state and compare this expression with the subthreshold activity obtained in Hodgkin-Huxley-type models with stochastic synaptic inputs. The differences between the multiplicative and additive noise models suggest that multiplicative noise is adequate to describe the high-conductance states similar to those found in vivo. Because the steady-state membrane potential distribution is easily obtained experimentally, this approach provides a possible method for estimating the mean and variance of synaptic conductances in real neurons.
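A brief sketch of the multiplicative (conductance) case described here: excitatory and inhibitory conductances follow Ornstein-Uhlenbeck processes and multiply the driving forces in a passive membrane equation. The Euler-Maruyama discretization and the parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, steps = 0.05, 40_000                       # ms, ~2 s of activity
C, gL, EL = 1.0, 0.05, -70.0                   # membrane parameters
Ee, Ei = 0.0, -80.0                            # synaptic reversal potentials
ge0, gi0, tau_e, tau_i = 0.012, 0.057, 2.7, 10.5
sig_e, sig_i = 0.003, 0.0066                   # OU noise amplitudes

V, ge, gi = EL, ge0, gi0
samples = np.empty(steps)
for k in range(steps):
    # Euler-Maruyama updates of the two OU conductance processes.
    ge += dt * (ge0 - ge) / tau_e + sig_e * np.sqrt(2 * dt / tau_e) * rng.normal()
    gi += dt * (gi0 - gi) / tau_i + sig_i * np.sqrt(2 * dt / tau_i) * rng.normal()
    # Multiplicative noise: the conductances scale the driving forces.
    V += dt / C * (-gL * (V - EL) - ge * (V - Ee) - gi * (V - Ei))
    samples[k] = V

print("steady-state Vm: mean %.1f mV, std %.2f mV"
      % (samples.mean(), samples.std()))
```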

19.
Synapses play a central role in neural computation: the strengths of synaptic connections determine the function of a neural circuit. In conventional models of computation, synaptic strength is assumed to be a static quantity that changes only on the slow timescale of learning. In biological systems, however, synaptic strength undergoes dynamic modulation on rapid timescales through mechanisms such as short-term facilitation and depression. Here we describe a general model of computation that exploits dynamic synapses and use a backpropagation-like algorithm to adjust the synaptic parameters. We show that such gradient descent suffices to approximate a given quadratic filter with a rather small neural system with dynamic synapses. We also compare our network model to artificial neural networks designed for time-series processing. Our numerical results are complemented by theoretical analyses showing that, even with just a single hidden layer, such networks can approximate a surprisingly large class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model of synaptic dynamics.
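One standard formalization of such rapid modulation is the Tsodyks-Markram style depression-facilitation update, which is representative of, though not necessarily identical to, the dynamic synapse model used here; the parameter values are illustrative:

```python
import numpy as np

U, tau_rec, tau_fac = 0.2, 200.0, 600.0   # baseline use, recovery and
                                          # facilitation time constants (ms)

def efficacies(spike_times):
    """Effective synaptic strength at each presynaptic spike time."""
    x, u, t_prev, out = 1.0, 0.0, None, []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # resources recover
            u = u * np.exp(-dt / tau_fac)                # facilitation decays
        u += U * (1.0 - u)             # each spike increments utilization
        out.append(u * x)              # transmitted efficacy at this spike
        x -= u * x                     # resources consumed by the spike
        t_prev = t
    return out

print(efficacies([0.0, 20.0, 40.0, 60.0, 400.0]))
```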

20.
Stochastic dynamics of a finite-size spiking neural network
Soula H  Chow CC 《Neural computation》2007,19(12):3262-3292
We present a simple Markov model of spiking neural dynamics that can be analytically solved to characterize the stochastic dynamics of a finite-size spiking neural network. We give closed-form estimates for the equilibrium distribution, mean rate, variance, and autocorrelation function of the network activity. The model is applicable to any network where the probability of firing of a neuron in the network depends on only the number of neurons that fired in a previous temporal epoch. Networks with statistically homogeneous connectivity and membrane and synaptic time constants that are not excessively long could satisfy these conditions. Our model completely accounts for the size of the network and correlations in the firing activity. It also allows us to examine how the network dynamics can deviate from mean field theory. We show that the model and solutions are applicable to spiking neural networks in biophysically plausible parameter regimes.
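A toy instance of the model class described, our construction for illustration: the number of neurons firing in each epoch is binomial, with a per-neuron probability that depends only on the count in the previous epoch; the sigmoidal gain and parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
N, w, theta = 200, 0.05, 2.0       # network size, coupling, threshold

def p_fire(k_prev):
    """Per-neuron firing probability given last epoch's spike count."""
    return 1.0 / (1.0 + np.exp(-(w * k_prev - theta)))

k, counts = 40, []
for _ in range(5000):              # Markov chain over epoch spike counts
    k = rng.binomial(N, p_fire(k))
    counts.append(k)

counts = np.array(counts[500:])    # discard the transient
print("mean rate %.3f, count variance %.1f"
      % (counts.mean() / N, counts.var()))
```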
