20 similar documents retrieved.
1.
Sterne P 《Neural computation》2012,24(8):2053-2077
We present a neural network that is capable of completing and correcting a spiking pattern given only a partial, noisy version. It operates in continuous time and represents information using the relative timing of individual spikes. The network is capable of correcting and recalling multiple patterns simultaneously. We analyze the network's performance in terms of information recall. We explore two measures of the capacity of the network: one that values the accurate recall of individual spike times and another that values only the presence or absence of complete patterns. Both measures of information are found to scale linearly in both the number of neurons and the period of the patterns, suggesting these are natural measures of network information. We show a smooth transition from encodings that provide precise spike times to flexible encodings that can encode many scenes. This makes it plausible that many diverse tasks could be learned with such an encoding.
2.
The responses of neurons to time-varying injected currents are reproducible on a trial-by-trial basis in vitro, but when a constant current is injected, small variances in interspike intervals across trials add up, eventually leading to a high variance in spike timing. It is unclear whether this difference is due to the nature of the input currents or the intrinsic properties of the neurons. Neuron responses can fail to be reproducible in two ways: dynamical noise can accumulate over time and lead to a desynchronization over trials, or several stable responses can exist, depending on the initial condition. Here we show, through simulations and theoretical considerations, that for a general class of spiking neuron models, which includes, in particular, the leaky integrate-and-fire model as well as nonlinear spiking models, aperiodic currents, contrary to periodic currents, induce reproducible responses, which are stable under noise, changes in initial conditions, and deterministic perturbations of the input. We provide a theoretical explanation for aperiodic currents that cross the threshold.
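To make the reproducibility claim concrete, here is a minimal sketch that drives a leaky integrate-and-fire neuron with the same frozen aperiodic current from several different initial voltages and compares the resulting late spike times; all parameter values and the construction of the input are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal LIF sketch: identical frozen aperiodic input, different initial conditions.
# All parameters below are illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)
dt, T = 0.1e-3, 2.0                         # time step (s), duration (s)
tau, v_rest, v_th, v_reset = 20e-3, 0.0, 1.0, 0.0
t = np.arange(0.0, T, dt)

# Frozen aperiodic drive: filtered noise, identical on every trial.
drive = np.convolve(rng.normal(0.0, 1.0, t.size), np.ones(200) / 200, mode="same")
drive = 1.2 + 0.8 * drive                   # keep the mean drive above threshold

def lif_spike_times(v0):
    """Spike times of a LIF neuron driven by the frozen input, starting at v0."""
    v, spikes = v0, []
    for i, ti in enumerate(t):
        v += dt / tau * (-(v - v_rest) + drive[i])
        if v >= v_th:
            spikes.append(ti)
            v = v_reset
    return np.array(spikes)

# Different initial conditions; late spike times typically coincide once the
# trials lock onto the common aperiodic input.
for v0 in (0.0, 0.3, 0.6, 0.9):
    print(v0, lif_spike_times(v0)[-3:])
```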
3.
We introduce and test a system for simulating networks of conductance-based neuron models using analog circuits. At the single-cell level, we use custom-designed analog circuits (ASICs) that simulate two types of spiking neurons based on Hodgkin-Huxley-like dynamics: "regular spiking" excitatory neurons with spike-frequency adaptation, and "fast spiking" inhibitory neurons. Synaptic interactions are mediated by conductance-based synaptic currents described by kinetic models. Connectivity and plasticity rules are implemented digitally through a real-time interface between a computer and a PCI board containing the ASICs. We show a prototype system of a few neurons interconnected with synapses undergoing spike-timing-dependent plasticity (STDP), and compare this system with numerical simulations. We use this system to evaluate the effect of parameter dispersion on the behavior of small circuits of neurons. It is shown that, although the exact spike timings are not precisely emulated by the ASIC neurons, the behavior of small networks with STDP matches that of numerical simulations. Thus, this mixed analog-digital architecture provides a valuable tool for real-time simulations of networks of neurons with STDP. It should be useful for any real-time application, such as hybrid systems interfacing network models with biological neurons.
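As a reminder of the kind of plasticity rule such a board implements digitally, here is a minimal pair-based STDP update in Python; the exponential window form and all constants (A_PLUS, A_MINUS, TAU_PLUS, TAU_MINUS) are generic textbook assumptions, not the parameters used in the paper.

```python
import numpy as np

# Pair-based STDP: potentiate when pre precedes post, depress otherwise.
# Window shape and constants are generic textbook assumptions, not the paper's values.
A_PLUS, A_MINUS = 0.01, 0.012        # learning amplitudes
TAU_PLUS, TAU_MINUS = 20e-3, 20e-3   # time constants (s)

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt >= 0:                                   # pre before post -> potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)      # post before pre -> depression

# Example: weight trajectory for a synapse seeing a few spike pairs.
w = 0.5
for t_pre, t_post in [(0.000, 0.005), (0.050, 0.048), (0.100, 0.110)]:
    w = np.clip(w + stdp_dw(t_pre, t_post), 0.0, 1.0)
    print(round(w, 4))
```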
4.
Francis George C. Cabarle, Henry N. Adorna, Mario J. Pérez-Jiménez 《Neural computing & applications》2016,27(5):1337-1347
Spiking neural P systems (in short, SNP systems) are parallel, distributed, and nondeterministic computing devices inspired by biological spiking neurons. Recently, a class of SNP systems known as SNP systems with structural plasticity (in short, SNPSP systems) was introduced. SNPSP systems represent a class of SNP systems that have dynamism applied to the synapses, i.e. neurons can use plasticity rules to create or remove synapses. In this work, we impose the restriction of sequentiality on SNPSP systems, using four modes: max, min, max-pseudo-, and min-pseudo-sequentiality. We also impose a normal form for SNPSP systems as number acceptors and generators. Conditions for (non)universality are then provided. Specifically, acceptors are universal in all modes, while generators need a nondeterminism source in two modes, which in this work is provided by the plasticity rules.
5.
Spiking neural P systems (SN P systems, for short) are a class of distributed parallel computing devices inspired by the way neurons communicate by means of spikes, where neurons work in parallel in the sense that each neuron that can fire should fire at each computation step, and neurons can be different in the sense that they can have different sets of spiking rules. In this work, we consider SN P systems with the restrictions: (1) all neurons are homogeneous in the sense that each neuron has the same set of rules; (2) at each step the neuron with the maximum number of spikes among the neurons that are active (can spike) will fire. These restrictions correspond to the fact that the system consists of only one kind of neuron and a global view of the whole network makes the system sequential. The computation power of homogeneous SN P systems working in the sequential mode induced by the maximum spike number is investigated. Specifically, it is proved that such systems are universal as both generating and accepting devices.
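The sketch below is a deliberately simplified, hypothetical simulator of the sequential mode induced by the maximum spike number: every neuron carries the same rule set, and at each step only the active neuron holding the most spikes fires. Real SN P systems use regular-expression firing conditions and delays, which are omitted here; the rule format and example network are made up for illustration.

```python
# Simplified, hypothetical SN P system simulator: homogeneous rules, and at each
# step only the active neuron with the maximum number of spikes fires.
# Real SN P systems use regular-expression conditions and delays, omitted here.

# A rule: fire when the spike count is at least `need`, consume `consume` spikes,
# and send `emit` spikes along every outgoing synapse.
RULES = [{"need": 2, "consume": 2, "emit": 1}]           # same set in every neuron

def step(spikes, synapses):
    """One sequential step; `spikes` maps neuron -> spike count."""
    active = [n for n in spikes if any(spikes[n] >= r["need"] for r in RULES)]
    if not active:
        return False                                     # halting configuration
    n = max(active, key=lambda x: spikes[x])             # max-spike-number mode
    rule = next(r for r in RULES if spikes[n] >= r["need"])
    spikes[n] -= rule["consume"]
    for m in synapses.get(n, []):
        spikes[m] += rule["emit"]
    return True

spikes = {"n1": 4, "n2": 3, "n3": 0}
synapses = {"n1": ["n2", "n3"], "n2": ["n3"]}
while step(spikes, synapses):
    print(spikes)
```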
6.
Brezina V 《Neurocomputing》2007,70(10-12):1863-1869
Variability of the neuronal spike pattern is usually thought of in terms of the information that the different interspike intervals might be encoding. However, the very presence of the variability can have other kinds of functional significance. Here we consider the example of the B15/B16-ARC neuromuscular system of Aplysia, a model system for the study of neuromuscular modulation and control. We show that variability of motor neuron spike timing at the input to the system penetrates throughout the system, affecting all downstream variables including modulator release, modulator concentrations, modulatory actions, and the contraction of the muscle. Furthermore, not only does the variability penetrate through the system, but it is actually instrumental in maintaining its modulation and contractions at a robust, physiological level.
7.
8.
Neural Computing and Applications - The transmission of weather information for a location at certain time intervals affects the living conditions of the people there directly or indirectly....
9.
In the area of membrane computing, time-freeness has been defined as the ability of a timed membrane system to always produce the same result, independently of the execution times associated with the rules. In this paper, we use a similar idea in the framework of spiking neural P systems, a model inspired by the structure and the functioning of neural cells. In particular, we introduce stochastic spiking neural P systems, where the time of firing for an enabled spiking rule is chosen probabilistically, and we investigate when, and how, these probabilities can influence the ability of the systems to simulate, in a reliable way, universal machines such as register machines.
10.
Zafeirios C. Papazachos, Helen D. Karatza 《Simulation Modelling Practice and Theory》2009,17(7):1276-1289
Gang scheduling is a common task scheduling policy for parallel and distributed systems which combines elements of space-sharing and time-sharing. In this paper we present a migration strategy which reduces the fragmentation in the schedule caused by gang scheduled jobs. We consider the existence of high priority jobs in the workload. These jobs need to be started immediately and they may interrupt a parallel job’s execution. A distributed system consisting of two homogeneous clusters is simulated to evaluate the performance for various workloads. We study the impact on performance of the variability in service time of the parallel tasks. Our simulation results indicate that the proposed strategy can result in a significant performance gain and that the performance improvement depends on the variability of gang tasks’ service time.
11.
Turlough Neary 《Natural computing》2010,9(4):831-851
It is shown here that there is no standard spiking neural P system that simulates Turing machines with less than exponential time and space overheads. The spiking neural P systems considered here have a constant number of neurons that is independent of the input length. Following this, we construct a universal spiking neural P system with exhaustive use of rules that simulates Turing machines in linear time and has only 10 neurons.
12.
Francis George C. Cabarle, Henry N. Adorna, Mario J. Pérez-Jiménez 《Natural computing》2016,15(4):533-539
Spiking neural P systems (in short, SN P systems) are membrane computing models inspired by the pulse coding of information in biological neurons. SN P systems with standard rules have neurons that emit at most one spike (the pulse) each step, and have either an input or output neuron connected to the environment. A variant known as SN P modules generalizes SN P systems by using extended rules (more than one spike can be emitted each step) and a set of input and output neurons. In this work we continue relating SN P modules and finite automata. In particular, we amend and improve previous constructions for the simulations of deterministic finite automata and state transducers. Our improvements reduce the number of neurons from three down to one, so our results are optimal. We also simulate finite automata with output, and we use these simulations to generate automatic sequences.
13.
Theory of input spike auto- and cross-correlations and their effect on the response of spiking neurons
Spike correlations between neurons are ubiquitous in the cortex, but their role is not understood. Here we describe the firing response of a leaky integrate-and-fire neuron (LIF) when it receives a temporally correlated input generated by presynaptic correlated neuronal populations. Input correlations are characterized in terms of the firing rates, Fano factors, correlation coefficients, and correlation timescale of the neurons driving the target neuron. We show that the sum of the presynaptic spike trains cannot be well described by a Poisson process. In fact, the total input current has a nontrivial two-point correlation function described by two main parameters: the correlation timescale (how precise the input correlations are in time) and the correlation magnitude (how strong they are). Therefore, the total current generated by the input spike trains is not well described by a white noise gaussian process. Instead, we model the total current as a colored gaussian process with the same mean and two-point correlation function, leading to the formulation of the problem in terms of a Fokker-Planck equation. Solutions of the output firing rate are found in the limit of short and long correlation timescales. The solutions described here expand and improve on our previous results (Moreno, de la Rocha, Renart, & Parga, 2002) by presenting new analytical expressions for the output firing rate for general IF neurons, extending the validity of the results for arbitrarily large correlation magnitude, and by describing the differential effect of correlations on the mean-driven or noise-dominated firing regimes. The details of this novel formalism are also given here for the first time. We employ numerical simulations to confirm the analytical solutions and study the firing response to sudden changes in the input correlations. We expect this formalism to be useful for the study of correlations in neuronal networks and their role in neural processing and information transmission.
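To illustrate the modeling step described above (replacing the summed input by a colored Gaussian process), the sketch below drives a LIF neuron with an Ornstein-Uhlenbeck current whose correlation time and magnitude can be varied, and estimates the output firing rate by simulation; the parameter values are illustrative assumptions, not those of the paper.

```python
import numpy as np

# Illustrative only: LIF neuron driven by a colored (Ornstein-Uhlenbeck) Gaussian
# current with correlation time tau_c and magnitude sigma.
rng = np.random.default_rng(1)
dt, T = 0.05e-3, 5.0                     # step (s), duration (s)
tau_m, v_th, v_reset = 10e-3, 1.0, 0.0   # membrane time constant, threshold, reset

def firing_rate(mu, sigma, tau_c):
    """Simulated output rate (Hz) for mean mu, noise sigma, correlation time tau_c."""
    n = int(T / dt)
    v, I, n_spikes = 0.0, 0.0, 0
    for _ in range(n):
        # OU current: correlation time tau_c, stationary standard deviation sigma
        I += dt / tau_c * (-I) + sigma * np.sqrt(2 * dt / tau_c) * rng.normal()
        v += dt / tau_m * (-v + mu + I)
        if v >= v_th:
            n_spikes += 1
            v = v_reset
    return n_spikes / T

for tau_c in (2e-3, 10e-3, 50e-3):       # short vs. long correlation timescales
    print(tau_c, round(firing_rate(mu=0.9, sigma=0.4, tau_c=tau_c), 1))
```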
14.
The gastric mill network of the stomatogastric ganglion of the crab Cancer borealis comprises a set of neurons that require modulatory input from outside the stomatogastric ganglion and input from the pyloric network of the animal in order to oscillate. Here we study how the frequency of the gastric mill network is determined when it receives rhythmic input from two different sources but where the timing of these inputs may differ. We find that over a certain range of the time difference one of the two rhythmic inputs plays no role whatsoever in determining the network frequency, while in another range, both inputs work together to determine the frequency. The existence and stability of periodic solutions to model sets of equations are obtained analytically using geometric singular perturbation theory. The results are validated through numerical simulations. Comparisons to experiments are also presented.
15.
Touboul J 《Neural computation》2011,23(7):1704-1742
Bidimensional spiking models are garnering a lot of attention for their simplicity and their ability to reproduce various spiking patterns of cortical neurons and are used particularly for large network simulations. These models describe the dynamics of the membrane potential by a nonlinear differential equation that blows up in finite time, coupled to a second equation for adaptation. Spikes are emitted when the membrane potential blows up or reaches a cutoff θ. The precise simulation of the spike times and of the adaptation variable is critical, for it governs the spike pattern produced, and it is hard to compute accurately because of the exploding nature of the system at the spike times. We thoroughly study the precision of fixed time-step integration schemes for this type of model and demonstrate that these methods produce systematic errors that are unbounded, as the cutoff value is increased, in the evaluation of the two crucial quantities: the spike time and the value of the adaptation variable at this time. Precise evaluation of these quantities therefore involves very small time steps and long simulation times. In order to achieve a fixed absolute precision in a reasonable computational time, we propose here a new algorithm to simulate these systems based on a variable integration step method that either integrates the original ordinary differential equation or the equation of the orbits in the phase plane, and compare this algorithm with the fixed time-step Euler scheme and other more accurate simulation algorithms.
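A minimal illustration of the step-size sensitivity discussed above: the sketch below integrates a quadratic adaptive integrate-and-fire model with fixed-step Euler at two step sizes and compares the first spike time and the adaptation value at the spike. The model constants and cutoff are arbitrary assumptions chosen only to show the effect, not the models or algorithm studied in the paper.

```python
import numpy as np

# Quadratic adaptive integrate-and-fire, fixed-step Euler. The membrane potential
# blows up in finite time; a cutoff THETA defines the spike. Constants are arbitrary
# assumptions used only to illustrate step-size sensitivity near the blow-up.
THETA, V_RESET, A, B, TAU_W, I_EXT = 200.0, -1.0, 1.0, 4.0, 0.5, 2.0

def first_spike(dt):
    """(spike time, adaptation value at spike) with Euler step dt."""
    v, w, t = -1.0, 0.0, 0.0
    while v < THETA:
        dv = A * v * v - w + I_EXT          # quadratic spiking current
        dw = (B * v - w) / TAU_W            # adaptation variable
        v, w, t = v + dt * dv, w + dt * dw, t + dt
    return t, w

for dt in (1e-3, 1e-5):
    t_spike, w_spike = first_spike(dt)
    print(f"dt={dt:g}  spike time={t_spike:.4f}  adaptation at spike={w_spike:.4f}")
```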
16.
Lumer ED 《Neural computation》2000,12(1):181-194
Synaptic interactions in cortical circuits involve strong recurrent excitation between nearby neurons and lateral inhibition that is more widely spread. This architecture is commonly thought to promote a winner-take-all competition, in which a small fraction of neuronal responses is selected for further processing. Here I report that such a competition is remarkably sensitive to the timing of neuronal action potentials. This is shown using simulations of model neurons and synaptic connections representing a patch of cortical tissue. In the simulations, uncorrelated discharge among neuronal units results in patterns of response dominance and suppression, that is, in a winner-take-all competition. Synchronization of firing, however, prevents such competition. These results demonstrate a novel property of recurrent cortical-like circuits, suggesting that the temporal patterning of cortical activity may play an important part in selection among stimuli competing for the control of attention and motor action.
17.
Stochastic dynamics of a finite-size spiking neural network
We present a simple Markov model of spiking neural dynamics that can be analytically solved to characterize the stochastic dynamics of a finite-size spiking neural network. We give closed-form estimates for the equilibrium distribution, mean rate, variance, and autocorrelation function of the network activity. The model is applicable to any network where the probability of firing of a neuron in the network depends on only the number of neurons that fired in a previous temporal epoch. Networks with statistically homogeneous connectivity and membrane and synaptic time constants that are not excessively long could satisfy these conditions. Our model completely accounts for the size of the network and correlations in the firing activity. It also allows us to examine how the network dynamics can deviate from mean field theory. We show that the model and solutions are applicable to spiking neural networks in biophysically plausible parameter regimes.
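The defining property above (the firing probability depends only on how many neurons fired in the previous epoch) is easy to simulate. The sketch below draws each epoch's activity from a binomial whose per-neuron firing probability is a saturating function of the previous count, then reports the mean, variance, and lag-one autocorrelation of the activity; the gain function and all constants are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Finite-size Markov sketch: the number of neurons firing in epoch t+1 is a
# binomial draw whose per-neuron probability depends only on the count at epoch t.
# The saturating gain function and all constants are illustrative assumptions.
rng = np.random.default_rng(2)
N, EPOCHS = 100, 20000

def p_fire(k_prev):
    """Per-neuron firing probability given k_prev neurons fired last epoch."""
    drive = 0.5 + 2.0 * k_prev / N
    return 1.0 - np.exp(-drive * 0.1)

k = np.zeros(EPOCHS, dtype=int)
for t in range(1, EPOCHS):
    k[t] = rng.binomial(N, p_fire(k[t - 1]))

burn = k[1000:]                        # drop the transient
lag1 = np.corrcoef(burn[:-1], burn[1:])[0, 1]
print(f"mean={burn.mean():.2f}  var={burn.var():.2f}  lag-1 autocorr={lag1:.3f}")
```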
18.
In this paper, a binaural sound source lateralization spiking neural network (NN) is presented, inspired by recent neurophysiological studies on the role of certain nuclei in the superior olivary complex (SOC) and the inferior colliculus (IC). The binaural sound source lateralization neural network (BiSoLaNN) is a spiking NN based on neural mechanisms, utilizing complex neural models, and attempting to simulate certain parts of nuclei of the auditory system in detail. The BiSoLaNN utilizes both excitatory and inhibitory ipsilateral and contralateral influences arrayed in only one delay line originating in the contralateral side to achieve sharp azimuthal localization. It is shown that the proposed model can be used both for purposes of understanding the mechanisms of an NN of the auditory system and for sound source lateralization tasks in technical applications, e.g., its use with the Darmstadt robotic head (DRH).
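The delay-line idea above is reminiscent of classic interaural-time-difference estimation. As a rough, non-spiking illustration (not the BiSoLaNN model itself), the sketch below recovers the lag between two ear signals by cross-correlation; the signals, sample rate, and delay are made up for the example.

```python
import numpy as np

# Rough illustration of interaural-time-difference estimation by cross-correlation.
# This is not the BiSoLaNN model itself; signals and sample rate are made up.
rng = np.random.default_rng(3)
fs = 44_100                                  # sample rate (Hz)
true_delay = 15                              # right ear lags by 15 samples (~0.34 ms)

source = rng.normal(size=4096)
left = source
right = np.concatenate([np.zeros(true_delay), source[:-true_delay]])

# Full cross-correlation; the peak position gives the estimated lag.
xcorr = np.correlate(right, left, mode="full")
lags = np.arange(-len(left) + 1, len(left))
est = lags[np.argmax(xcorr)]
print(f"estimated ITD: {est} samples = {est / fs * 1e3:.3f} ms")
```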
19.
Kate Forbes-Riley, Mihai Rotaru, Diane J. Litman 《User Modeling and User-Adapted Interaction》2008,18(1-2):11-43
We hypothesize that student affect is a useful predictor of spoken dialogue system performance, relative to other parameters. We test this hypothesis in the context of our spoken dialogue tutoring system, where student learning is the primary performance metric. We first present our system and corpora, which have been annotated with several student affective states, student correctness and discourse structure. We then discuss unigram and bigram parameters derived from these annotations. The unigram parameters represent each annotation type individually, as well as system-generic features. The bigram parameters represent annotation combinations, including student state sequences and student states in the discourse structure context. We then use these parameters to build learning models. First, we build simple models based on correlations between each of our parameters and learning. Our results suggest that our affect parameters are among our most useful predictors of learning, particularly in specific discourse structure contexts. Next, we use the PARADISE framework (multiple linear regression) to build complex learning models containing only the most useful subset of parameters. Our approach is a value-added one; we perform a number of model-building experiments, both with and without including our affect parameters, and then compare the performance of the models on the training and the test sets. Our results show that when included as inputs, our affect parameters are selected as predictors in most models, and many of these models show high generalizability in testing. Our results also show that overall, the affect-included models significantly outperform the affect-excluded models.
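The PARADISE-style step described here is ordinary multiple linear regression over dialogue parameters. The sketch below fits two least-squares models, one with and one without a hypothetical "affect" feature, and compares their R² on a held-out split; all feature names and data are synthetic and fabricated purely for illustration, not the authors' corpus or parameters.

```python
import numpy as np

# Value-added comparison in the PARADISE spirit: fit a linear learning model with
# and without a hypothetical "affect" feature and compare held-out R^2.
# The data here are synthetic and for illustration only.
rng = np.random.default_rng(4)
n = 400
correctness = rng.uniform(0, 1, n)           # fraction of correct student turns
n_turns = rng.integers(20, 120, n).astype(float)
affect = rng.uniform(0, 1, n)                # hypothetical affect parameter
learning = 0.4 * correctness + 0.3 * affect + 0.001 * n_turns + rng.normal(0, 0.1, n)

def r2_on_split(features):
    """Held-out R^2 of an ordinary least-squares fit on the given feature columns."""
    X = np.column_stack([np.ones(n)] + features)
    train, test = slice(0, 300), slice(300, n)
    beta, *_ = np.linalg.lstsq(X[train], learning[train], rcond=None)
    resid = learning[test] - X[test] @ beta
    return 1 - resid.var() / learning[test].var()

print("without affect:", round(r2_on_split([correctness, n_turns]), 3))
print("with affect:   ", round(r2_on_split([correctness, n_turns, affect]), 3))
```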
20.
Emery N Brown, Riccardo Barbieri, Valérie Ventura, Robert E Kass, Loren M Frank 《Neural computation》2002,14(2):325-346
Measuring agreement between a statistical model and a spike train data series, that is, evaluating goodness of fit, is crucial for establishing the model's validity prior to using it to make inferences about a particular neural system. Assessing goodness of fit is a challenging problem for point process neural spike train models, especially for histogram-based models such as peristimulus time histograms (PSTH) and rate functions estimated by spike train smoothing. The time-rescaling theorem is a well-known result in probability theory, which states that any point process with an integrable conditional intensity function may be transformed into a Poisson process with unit rate. We describe how the theorem may be used to develop goodness-of-fit tests for both parametric and histogram-based point process models of neural spike trains. We apply these tests in two examples: a comparison of PSTH, inhomogeneous Poisson, and inhomogeneous Markov interval models of neural spike trains from the supplementary eye field of a macaque monkey, and a comparison of temporal and spatial smoothers, inhomogeneous Poisson, inhomogeneous gamma, and inhomogeneous inverse gaussian models of rat hippocampal place cell spiking activity. To help make the logic behind the time-rescaling theorem more accessible to researchers in neuroscience, we present a proof using only elementary probability theory arguments. We also show how the theorem may be used to simulate a general point process model of a spike train. Our paradigm makes it possible to compare parametric and histogram-based neural spike train models directly. These results suggest that the time-rescaling theorem can be a valuable tool for neural spike train data analysis.
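For concreteness, the sketch below applies the time-rescaling transformation to spikes drawn from an inhomogeneous Poisson process with a known intensity: each interspike interval is rescaled by the integral of the conditional intensity, and the result is checked against the unit-rate exponential with a KS test. The intensity function and all constants are arbitrary choices for the example.

```python
import numpy as np
from scipy import stats

# Time-rescaling check: rescaled interspike intervals of a point process with
# conditional intensity lam(t) should be unit-rate exponential. The intensity
# below is an arbitrary choice made only for this example.
rng = np.random.default_rng(5)

def lam(t):
    """Arbitrary example intensity (spikes/s)."""
    return 20.0 + 15.0 * np.sin(2 * np.pi * t)

LAM_MAX, T = 35.0, 50.0

# Simulate the inhomogeneous Poisson process by thinning.
cand = np.cumsum(rng.exponential(1.0 / LAM_MAX, size=int(3 * LAM_MAX * T)))
cand = cand[cand < T]
spikes = cand[rng.uniform(0, LAM_MAX, size=cand.size) < lam(cand)]

# Rescale: tau_k = integral of lam(t) over the k-th interspike interval.
grid = np.linspace(0.0, T, 200_001)
cum = np.concatenate([[0.0], np.cumsum(lam(grid[:-1]) * np.diff(grid))])
Lambda = np.interp(spikes, grid, cum)                    # cumulative intensity at spikes
taus = np.diff(Lambda)

# Under the model, taus ~ Exponential(1); the KS test should not reject.
print(stats.kstest(taus, "expon"))
```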