Similar Documents (20 results)
1.
The determination of temporal and spatial correlations in neuronal activity is one of the most important neurophysiological tools to gain insight into the mechanisms of information processing in the brain. Its interpretation is complicated by the difficulty of disambiguating the effects of architecture, single-neuron properties, and network dynamics. We present a theory that describes the contribution of the network dynamics in a network of "spiking" neurons. For a simple neuron model including refractory properties, we calculate the temporal cross-correlations in a completely homogeneous, excitatory, fully connected network in a stable, stationary state, for stochastic dynamics in both discrete and continuous time. We show that even for this simple network architecture, the cross-correlations exhibit a large variety of qualitatively different properties, strongly dependent on the level of noise, the decay constant of the refractory function, and the network activity. At the critical point, the cross-correlations oscillate with a frequency that depends on the refractory properties or decay exponentially with a diverging damping constant (for "weak" refractory properties). We also investigate the effect of the synaptic time constants. It is shown that these time constants may, apart from their influence on the asymmetric peak arising from the direct synaptic connection, also affect the long-term properties of the cross-correlations.

2.
The emergence of synchrony in the activity of large, heterogeneous networks of spiking neurons is investigated. We define the robustness of synchrony by the critical disorder at which the asynchronous state becomes linearly unstable. We show that at low firing rates, synchrony is more robust in excitatory networks than in inhibitory networks, but excitatory networks cannot display any synchrony when the average firing rate becomes too high. We introduce a new regime where all inputs, external and internal, are strong and have opposite effects that cancel each other when averaged. In this regime, the robustness of synchrony is strongly enhanced, and robust synchrony can be achieved at a high firing rate in inhibitory networks. On the other hand, in excitatory networks, synchrony remains limited in frequency due to the intrinsic instability of strong recurrent excitation.

3.
A network of leaky integrate-and-fire (IAF) neurons is proposed to segment gray-scale images. The network architecture with local competition between neurons that encode segment assignments of image blocks is motivated by a histogram clustering approach to image segmentation. Lateral excitatory connections between neighboring image sites yield a local smoothing of segments. The mean firing rate of class membership neurons encodes the image segmentation. A weight modification scheme is proposed that estimates segment-specific prototypical histograms. The robustness properties of the network implementation make it amenable to an analog VLSI realization. Results on synthetic and real-world images demonstrate the effectiveness of the architecture.
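The building block of the architecture above is the leaky integrate-and-fire (IAF) neuron. As a point of reference, a minimal single-neuron sketch of the IAF dynamics (not the paper's segmentation network; all parameter values are illustrative assumptions) could look like:

```python
import numpy as np

def simulate_lif(I, dt=1.0, tau=20.0, v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron driven by input current I.

    Euler integration of tau * dv/dt = -(v - v_rest) + I(t); when v crosses
    the threshold v_th, a spike is recorded and v is reset.
    Returns the membrane trace and the spike times (in step indices).
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_t in enumerate(I):
        v += dt / tau * (-(v - v_rest) + i_t)
        if v >= v_th:           # threshold crossing -> emit spike, reset
            spikes.append(t)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes
```

With a supra-threshold constant input the neuron fires periodically; with zero input it stays silent.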

4.
Spiking neural networks constitute a modern neural network paradigm that overlaps machine learning and computational neurosciences. Spiking neural networks use neuron models that possess a great degree of biological realism. The most realistic model of the neuron is the one created by Alan Lloyd Hodgkin and Andrew Huxley. However, the Hodgkin–Huxley model, while accurate, is computationally very inefficient. Eugene Izhikevich created a simplified neuron model based on the Hodgkin–Huxley equations. This model has better computational efficiency than the original proposed by Hodgkin and Huxley, and yet it can successfully reproduce all known firing patterns. However, there are not many articles dealing with implementations of this model for a functional neural network. This study presents a spiking neural network architecture that utilizes improved Izhikevich neurons with the purpose of evaluating its speed and efficiency. Since the field of spiking neural networks has reinvigorated the interest in biological plausibility, biological realism was an additional goal. The network is tested on the correct classification of logic gates (including XOR) and on the iris dataset. Results and possible improvements are also discussed.

5.
We present in this paper a general model of recurrent networks of spiking neurons, composed of several populations, and whose interaction pattern is set with a random draw. We use for simplicity discrete-time neuron updating, and the emitted spikes are transmitted through randomly delayed lines. In excitatory-inhibitory networks, we show that inhomogeneous delays may favour synchronization provided that the inhibitory delay distribution is significantly stronger than the excitatory one. In that case, slow waves of synchronous activity appear (this synchronous activity is stronger in the inhibitory population). This synchrony allows for a fast adaptivity of the network to various input stimuli. In networks obeying the constraint of short-range excitation and long-range inhibition, we show that under some parameter settings, this model displays properties of (1) dynamic retention, (2) input normalization, and (3) target tracking. Those properties are of interest for modelling biological topologically organized structures, and for robotic applications taking place in noisy environments where targets vary in size, speed and duration. This revised version was published online in June 2006 with corrections to the Cover Date.

6.
Solving graph algorithms with networks of spiking neurons
Spatio-temporal coding that combines spatial constraints with temporal sequencing is of great interest to brain-like circuit modelers. In this paper we present some new ideas of how these types of circuits can self-organize. We introduce a temporal correlation rule based on the time difference between the firing of neurons. With the aid of this rule we show an analogy between a graph and a network of spiking neurons. The shortest path, clustering based on the nearest neighbor, and the minimal spanning tree algorithms are solved using the proposed approach.
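The graph analogy described above can be illustrated with a sketch: if each synaptic connection carries a delay equal to an edge weight, and a neuron fires only on its first incoming spike (later arrivals fall in its refractory period), then first-spike times reproduce shortest-path distances. This is an illustrative reconstruction of the analogy, not the paper's temporal correlation rule:

```python
import heapq

def spiking_shortest_paths(delays, source):
    """Shortest paths via spike propagation: each edge (u, v) has a synaptic
    delay; a neuron fires the first time a spike reaches it, and
    refractoriness suppresses all later arrivals. The first-spike time of
    each neuron then equals its shortest-path cost from the source.

    delays: dict mapping node -> list of (neighbour, delay) pairs.
    """
    fire_time = {}                        # first spike time per neuron
    queue = [(0.0, source)]               # (arrival time, neuron)
    while queue:
        t, u = heapq.heappop(queue)
        if u in fire_time:                # already fired: refractory, ignore
            continue
        fire_time[u] = t
        for v, d in delays.get(u, []):
            if v not in fire_time:
                heapq.heappush(queue, (t + d, v))
    return fire_time
```

The event queue makes this equivalent to Dijkstra's algorithm, which is exactly why the spiking formulation solves the shortest-path problem.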

7.
We investigate possibilities of inducing temporal structures without fading memory in recurrent networks of spiking neurons strictly operating in the pulse-coding regime. We extend the existing gradient-based algorithm for training feedforward spiking neuron networks, SpikeProp (Bohte, Kok, & La Poutré, 2002), to recurrent network topologies, so that temporal dependencies in the input stream are taken into account. It is shown that temporal structures with unbounded input memory specified by simple Moore machines (MM) can be induced by recurrent spiking neuron networks (RSNN). The networks are able to discover pulse-coded representations of abstract information processing states coding potentially unbounded histories of processed inputs. We show that it is often possible to extract from trained RSNN the target MM by grouping together similar spike trains appearing in the recurrent layer. Even when the target MM was not perfectly induced in a RSNN, the extraction procedure was able to reveal weaknesses of the induced mechanism and the extent to which the target machine had been learned.

8.
Dayhoff JE 《Neural computation》2007,19(9):2433-2467
We demonstrate a model in which synchronously firing ensembles of neurons are networked to produce computational results. Each ensemble is a group of biological integrate-and-fire spiking neurons, with probabilistic interconnections between groups. An analogy is drawn in which each individual processing unit of an artificial neural network corresponds to a neuronal group in a biological model. The activation value of a unit in the artificial neural network corresponds to the fraction of active neurons, synchronously firing, in a biological neuronal group. Weights of the artificial neural network correspond to the product of the interconnection density between groups, the group size of the presynaptic group, and the postsynaptic potential heights in the synchronous group model. All three of these parameters can modulate connection strengths between neuronal groups in the synchronous group models. We give an example of nonlinear classification (XOR) and a function approximation example in which the capability of the artificial neural network can be captured by a neural network model with biological integrate-and-fire neurons configured as a network of synchronously firing ensembles of such neurons. We point out that the general function approximation capability proven for feedforward artificial neural networks appears to be approximated by networks of neuronal groups that fire in synchrony, where the groups comprise integrate-and-fire neurons. We discuss the advantages of this type of model for biological systems, its possible learning mechanisms, and the associated timing relationships.

9.
An increasing number of research groups are developing custom hybrid analog/digital very large scale integration (VLSI) chips and systems that implement hundreds to thousands of spiking neurons with biophysically realistic dynamics, with the intention of emulating brainlike real-world behavior in hardware and robotic systems rather than simply simulating their performance on general-purpose digital computers. Although the electronic engineering aspects of these emulation systems are proceeding well, progress toward the actual emulation of brainlike tasks is restricted by the lack of suitable high-level configuration methods of the kind that have already been developed over many decades for simulations on general-purpose computers. The key difficulty is that the dynamics of the CMOS electronic analogs are determined by transistor biases that do not map simply to the parameter types and values used in typical abstract mathematical models of neurons and their networks. Here we provide a general method for resolving this difficulty. We describe a parameter mapping technique that permits an automatic configuration of VLSI neural networks so that their electronic emulation conforms to a higher-level neuronal simulation. We show that the neurons configured by our method exhibit spike timing statistics and temporal dynamics that are the same as those observed in the software simulated neurons and, in particular, that the key parameters of recurrent VLSI neural networks (e.g., implementing soft winner-take-all) can be precisely tuned. The proposed method permits a seamless integration between software simulations and hardware emulations and intertranslatability between the parameters of abstract neuronal models and their emulation counterparts. Most importantly, our method offers a route toward a high-level task configuration language for neuromorphic VLSI systems.

10.
In this paper, we describe a new Synaptic Plasticity Activity Rule (SAPR) developed for use in networks of spiking neurons. Such networks can be used for simulations of physiological experiments as well as for other computations like image analysis. Most synaptic plasticity rules use artificially defined functions to modify synaptic connection strengths. In contrast, our rule makes use of the existing postsynaptic potential values to compute the value of adjustment. The network of spiking neurons we consider consists of excitatory and inhibitory neurons. Each neuron is implemented as an integrate-and-fire model that accurately mimics the behavior of biological neurons. To test the performance of our new plasticity rule, we designed a model of a biologically inspired signal processing system and used it for object detection in eye images of diabetic retinopathy patients and lung images of cystic fibrosis patients. The results show that the network detects the edges of objects within an image, essentially segmenting it. Our ultimate goal, however, is not the development of an image segmentation tool that would be more efficient than nonbiological algorithms, but developing a physiologically correct neural network model that could be applied to a wide range of neurological experiments. We decided to validate the SAPR by using it in a network of spiking neurons for image segmentation because it is easy to visually assess the results. Importantly, the image segmentation is done in an entirely unsupervised way.

11.
A simulation procedure is described for making feasible large-scale simulations of recurrent neural networks of spiking neurons and plastic synapses. The procedure is applicable if the dynamic variables of both neurons and synapses evolve deterministically between any two successive spikes. Spikes introduce jumps in these variables, and since spike trains are typically noisy, spikes introduce stochasticity into both dynamics. Since all events in the simulation are guided by the arrival of spikes, at neurons or synapses, we name this procedure event-driven. The procedure is described in detail, and its logic and performance are compared with conventional (synchronous) simulations. The main impact of the new approach is a drastic reduction of the computational load incurred upon introduction of dynamic synaptic efficacies, which vary organically as a function of the activities of the pre- and postsynaptic neurons. In fact, the computational load per neuron in the presence of the synaptic dynamics grows linearly with the number of neurons and is only about 6% more than the load with fixed synapses. Even the latter is handled quite efficiently by the algorithm. We illustrate the operation of the algorithm in a specific case with integrate-and-fire neurons and specific spike-driven synaptic dynamics. Both dynamical elements have been found to be naturally implementable in VLSI. This network is simulated to show the effects on the synaptic structure of the presentation of stimuli, as well as the stability of the generated matrix to the neural activity it induces.
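The core trick of event-driven simulation, as the abstract describes it, is that state variables evolve deterministically between spikes, so a neuron's state needs updating only when a spike arrives, using the closed-form decay for the elapsed interval. A simplified single-neuron illustration of this lazy-update idea (the paper's algorithm additionally handles plastic synapses, which are omitted here):

```python
import math

class EventDrivenLIF:
    """Event-driven leaky integrate-and-fire neuron: between spikes the
    membrane potential decays exponentially in closed form, so the state is
    only brought up to date when an input event arrives."""

    def __init__(self, tau=20.0, v_th=1.0):
        self.tau, self.v_th = tau, v_th
        self.v, self.t_last = 0.0, 0.0

    def receive(self, t, weight):
        """Process a spike of the given synaptic weight arriving at time t.
        Returns True if the neuron fires in response."""
        # Lazy update: apply the decay accumulated since the last event.
        self.v *= math.exp(-(t - self.t_last) / self.tau)
        self.t_last = t
        self.v += weight
        if self.v >= self.v_th:
            self.v = 0.0          # reset after firing
            return True
        return False
```

No time stepping occurs while the neuron is silent, which is what makes the per-spike cost independent of the step size in a full event-driven simulator.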

12.
Oscillatory and synchronized neural activities are commonly found in the brain, and evidence suggests that many of them are caused by global feedback. Their mechanisms and roles in information processing have been discussed often using purely feedforward networks or recurrent networks with constant inputs. On the other hand, real recurrent neural networks are abundant and continually receive information-rich inputs from the outside environment or other parts of the brain. We examine how feedforward networks of spiking neurons with delayed global feedback process information about temporally changing inputs. We show that the network behavior is more synchronous as well as more correlated with and phase-locked to the stimulus when the stimulus frequency is resonant with the inherent frequency of the neuron or that of the network oscillation generated by the feedback architecture. The two eigenmodes have distinct dynamical characteristics, which are supported by numerical simulations and by analytical arguments based on frequency response and bifurcation theory. This distinction is similar to the class I versus class II classification of single neurons according to the bifurcation from quiescence to periodic firing, and the two modes depend differently on system parameters. These two mechanisms may be associated with different types of information processing.

13.
We demonstrate that spiking neural networks encoding information in the timing of single spikes are capable of computing and learning clusters from realistic data. We show how a spiking neural network based on spike-time coding and Hebbian learning can successfully perform unsupervised clustering on real-world data, and we demonstrate how temporal synchrony in a multilayer network can induce hierarchical clustering. We develop a temporal encoding of continuously valued data to obtain adjustable clustering capacity and precision with an efficient use of neurons: input variables are encoded in a population code by neurons with graded and overlapping sensitivity profiles. We also discuss methods for enhancing scale-sensitivity of the network and show how the induced synchronization of neurons within early RBF layers allows for the subsequent detection of complex clusters.
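The population code described above (graded, overlapping sensitivity profiles mapped to single-spike latencies) can be sketched as follows. The Gaussian receptive fields follow the abstract's description, but the parameter values and the linear activation-to-latency mapping are illustrative assumptions:

```python
import numpy as np

def encode_population(x, n_neurons=8, x_min=0.0, x_max=1.0, t_max=10.0):
    """Encode a scalar x into first-spike times of n_neurons whose Gaussian
    receptive fields have evenly spaced centers and overlapping widths.
    Strong activation -> early spike (latency coding)."""
    centers = np.linspace(x_min, x_max, n_neurons)
    width = (x_max - x_min) / (n_neurons - 1)   # neighbouring fields overlap
    activation = np.exp(-0.5 * ((x - centers) / width) ** 2)  # in (0, 1]
    return t_max * (1.0 - activation)           # activation 1 -> fires at t = 0
```

A value lying exactly on a receptive-field center makes that neuron fire at time zero, while neurons with distant centers fire late (or, with a firing-time cutoff, not at all), which is what gives the code its adjustable precision.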

14.
A notation for the functional specification of a wide range of neural networks consisting of temporal or non-temporal neurons is proposed. The notation is primarily a mathematical framework, but it can also be illustrated graphically and can be extended into a language in order to be automated. Its basic building blocks are processing entities, finer grained than neurons, connected by instant links, and as such they form sets of interacting entities resulting in bigger and more sophisticated structures. The hierarchical nature of the notation supports both top-down and bottom-up specification approaches. The use of the notation is evaluated by a detailed example of an integrated tangible agent consisting of sensors, a computational part, and actuators. A process from specification to both software and hardware implementation is proposed.

15.
Many biological neural network models face the problem of scalability because of the limited computational power of today's computers. Thus, it is difficult to assess the efficiency of these models to solve complex problems such as image processing. Here, we describe how this problem can be tackled using event-driven computation. Only the neurons that emit a discharge are processed and, as long as the average spike discharge rate is low, millions of neurons and billions of connections can be modelled. We describe the underlying computation and implementation of such a mechanism in SpikeNET, our neural network simulation package. The type of model one can build is not only biologically compliant, it is also computationally efficient as 400 000 synaptic weights can be propagated per second on a standard desktop computer. In addition, for large networks, we can set very small time steps (< 0.01 ms) without significantly increasing the computation time. As an example, this method is applied to solve complex cognitive tasks such as face recognition in natural images.

16.
We present a dynamical theory of integrate-and-fire neurons with strong synaptic coupling. We show how phase-locked states that are stable in the weak coupling regime can destabilize as the coupling is increased, leading to states characterized by spatiotemporal variations in the interspike intervals (ISIs). The dynamics is compared with that of a corresponding network of analog neurons in which the outputs of the neurons are taken to be mean firing rates. A fundamental result is that for slow interactions, there is good agreement between the two models (on an appropriately defined timescale). Various examples of desynchronization in the strong coupling regime are presented. First, a globally coupled network of identical neurons with strong inhibitory coupling is shown to exhibit oscillator death in which some of the neurons suppress the activity of others. However, the stability of the synchronous state persists for very large networks and fast synapses. Second, an asymmetric network with a mixture of excitation and inhibition is shown to exhibit periodic bursting patterns. Finally, a one-dimensional network of neurons with long-range interactions is shown to desynchronize to a state with a spatially periodic pattern of mean firing rates across the network. This is modulated by deterministic fluctuations of the instantaneous firing rate whose size is an increasing function of the speed of synaptic response.

17.
Simple model of spiking neurons
A model is presented that reproduces spiking and bursting behavior of known types of cortical neurons. The model combines the biological plausibility of Hodgkin-Huxley-type dynamics and the computational efficiency of integrate-and-fire neurons. Using this model, one can simulate tens of thousands of spiking cortical neurons in real time (1 ms resolution) using a desktop PC.
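The model referred to here is the Izhikevich two-variable system, v' = 0.04v² + 5v + 140 - u + I and u' = a(bv - u), with the reset v ← c, u ← u + d whenever v reaches 30 mV. A direct Python transcription of the published update rule, using the standard "regular spiking" parameter set (a = 0.02, b = 0.2, c = -65, d = 8):

```python
import numpy as np

def izhikevich(I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    """Izhikevich neuron model:
        v' = 0.04 v^2 + 5 v + 140 - u + I
        u' = a (b v - u)
        if v >= 30 mV: v <- c, u <- u + d
    Returns the membrane trace and spike step indices."""
    v, u = c, b * c
    trace, spikes = [], []
    for t, i_t in enumerate(I):
        # Two half-steps for v improve numerical stability at dt = 1 ms.
        v += 0.5 * dt * (0.04 * v * v + 5 * v + 140 - u + i_t)
        v += 0.5 * dt * (0.04 * v * v + 5 * v + 140 - u + i_t)
        u += dt * a * (b * v - u)
        if v >= 30.0:             # spike peak reached: record and reset
            spikes.append(t)
            v, u = c, u + d
        trace.append(v)
    return np.array(trace), spikes
```

The per-step cost is a handful of multiply-adds, which is why tens of thousands of such neurons can be simulated in real time; other firing patterns (bursting, fast spiking, etc.) are obtained purely by changing a, b, c, and d.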

18.
19.
20.
We introduce and test a system for simulating networks of conductance-based neuron models using analog circuits. At the single-cell level, we use custom-designed analog circuits (ASICs) that simulate two types of spiking neurons based on Hodgkin-Huxley like dynamics: "regular spiking" excitatory neurons with spike-frequency adaptation, and "fast spiking" inhibitory neurons. Synaptic interactions are mediated by conductance-based synaptic currents described by kinetic models. Connectivity and plasticity rules are implemented digitally through a real time interface between a computer and a PCI board containing the ASICs. We show a prototype system of a few neurons interconnected with synapses undergoing spike-timing dependent plasticity (STDP), and compare this system with numerical simulations. We use this system to evaluate the effect of parameter dispersion on the behavior of small circuits of neurons. It is shown that, although the exact spike timings are not precisely emulated by the ASIC neurons, the behavior of small networks with STDP matches that of numerical simulations. Thus, this mixed analog-digital architecture provides a valuable tool for real-time simulations of networks of neurons with STDP. They should be useful for any real-time application, such as hybrid systems interfacing network models with biological neurons.
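The spike-timing-dependent plasticity mentioned above is commonly modelled as a pair-based rule with exponential windows. The abstract does not give the amplitudes or time constants used on the ASIC board, so the values below are illustrative; only the shape of the rule (pre-before-post potentiates, post-before-pre depresses) is standard:

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for a spike pair with dt = t_post - t_pre.

    Pre-before-post (dt >= 0) yields potentiation that decays with the delay;
    post-before-pre (dt < 0) yields depression. Amplitudes and time constants
    (in ms) are illustrative, not taken from the paper."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)
```

A slightly larger depression amplitude than potentiation amplitude, as assumed here, is a common choice to keep weights from growing without bound.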


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号