Similar Documents
20 similar records found (search time: 62 ms).
1.
We consider networks of a large number of neurons (or units, processors, ...), whose dynamics are fully asynchronous with overlapping updating. We suppose that the neurons take a finite number of states (discrete states), and that the updating scheme is discrete in time. We make no hypotheses on the activation function of the neurons; the networks may have multiple cycles and basins. We derive conditions on the initialization of the networks that ensure convergence to fixed points only. Application to a fully asynchronous Hopfield neural network allows us to validate our study.
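As a concrete point of reference for the kind of dynamics discussed above, here is a minimal sketch (not the paper's formulation) of a fully asynchronous Hopfield-style update: neurons are updated one at a time in random order, which, with a symmetric zero-diagonal weight matrix, converges to a fixed point. All names and parameters are illustrative.

```python
import numpy as np

def async_hopfield(W, x, rng, max_sweeps=100):
    """Asynchronous Hopfield dynamics: update one neuron at a time."""
    n = len(x)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):
            new = 1 if W[i] @ x >= 0 else -1
            if new != x[i]:
                x[i] = new
                changed = True
        if not changed:      # no neuron flipped: fixed point reached
            return x
    return x

# Store one pattern with the Hebb rule and recall it from a noisy cue.
rng = np.random.default_rng(0)
p = np.array([1, -1, 1, 1, -1], dtype=int)
W = np.outer(p, p).astype(float)
np.fill_diagonal(W, 0.0)     # no self-connections
cue = p.copy()
cue[0] = -cue[0]             # corrupt one bit
out = async_hopfield(W, cue, rng)
```

With a single stored pattern, the corrupted bit is restored and the network settles on the pattern as a fixed point.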

2.
This paper presents a correction to the formulation of the basins of fixed-point states of fully asynchronous discrete-time discrete-state dynamic networks presented in the above-titled paper (ibid., vol. 17, no. 2, pp. 397-408, Mar. 2006). In our subsequent work on totally asynchronous systems, we discovered that the formulation given in that previous paper lacks an additional condition. We explain here why the previous formulation is incomplete and give the correct formulation.

3.
Complex-valued multistate neural associative memory
A model of a multivalued associative memory is presented. This memory has the form of a fully connected attractor neural network composed of multistate complex-valued neurons. Such a network is able to perform the task of storing and recalling gray-scale images. It is also shown that the complex-valued fully connected neural network may be considered as a generalization of a Hopfield network containing real-valued neurons. A computational energy function is introduced and evaluated in order to prove network stability for asynchronous dynamics. Storage capacity as related to the number of accessible neuron states is also estimated.
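A minimal sketch of the multistate complex-valued neuron underlying such memories (function names are ours, not the paper's): the K gray levels map onto the K-th roots of unity, and the activation quantizes the phase of the weighted input sum back onto one of the K states.

```python
import numpy as np

def encode(gray, K):
    """Map a gray level in {0, ..., K-1} onto the unit circle."""
    return np.exp(2j * np.pi * gray / K)

def csign(z, K):
    """Multivalued sign: quantize the phase of z onto one of K states."""
    theta = np.angle(z) % (2 * np.pi)   # phase in [0, 2*pi)
    k = int(theta // (2 * np.pi / K))   # index of the phase sector
    return np.exp(2j * np.pi * k / K)
```

Recall then iterates `x[i] = csign(W[i] @ x, K)` over the neurons; the magnitude of the weighted sum is discarded and only its phase sector matters.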

4.
Golomb D, Hansel D. Neural Computation, 2000, 12(5): 1095-1139
The prevalence of coherent oscillations in various frequency ranges in the central nervous system raises the question of the mechanisms that synchronize large populations of neurons. We study synchronization in models of large networks of spiking neurons with random sparse connectivity. Synchrony occurs only when the average number of synapses, M, that a cell receives is larger than a critical value, Mc. Below Mc, the system is in an asynchronous state. In the limit of weak coupling, assuming identical neurons, we reduce the model to a system of phase oscillators that are coupled via an effective interaction, gamma. In this framework, we develop an approximate theory for sparse networks of identical neurons to estimate Mc analytically from the Fourier coefficients of gamma. Our approach relies on the assumption that the dynamics of a neuron depend mainly on the number of cells that are presynaptic to it. We apply this theory to compute Mc for a model of inhibitory networks of integrate-and-fire (I&F) neurons as a function of the intrinsic neuronal properties (e.g., the refractory period Tr), the synaptic time constants, and the strength of the external stimulus, Iext. The number Mc is found to be nonmonotonous with the strength of Iext. For Tr = 0, we estimate the minimum value of Mc over all the parameters of the model to be 363.8. Above Mc, the neurons tend to fire in smeared one-cluster states at high firing rates and smeared two-or-more-cluster states at low firing rates. Refractoriness decreases Mc at intermediate and high firing rates. These results are compared to numerical simulations. We show numerically that systems with different sizes, N, behave in the same way provided the connectivity, M, is such that 1/Meff = 1/M - 1/N remains constant when N varies. This allows extrapolating the large N behavior of a network from numerical simulations of networks of relatively small sizes (N = 800 in our case). 
We find that our theory predicts with remarkable accuracy the value of Mc and the patterns of synchrony above Mc, provided the synaptic coupling is not too large. We also study the strong coupling regime of inhibitory sparse networks. All of our simulations demonstrate that increasing the coupling strength reduces the level of synchrony of the neuronal activity. Above a critical coupling strength, the network activity is asynchronous. We point out a fundamental limitation for the mechanisms of synchrony relying on inhibition alone, if heterogeneities in the intrinsic properties of the neurons and spatial fluctuations in the external input are also taken into account.

5.
Small networks of cultured hippocampal neurons respond to transient stimulation with rhythmic network activity (reverberation) that persists for several seconds, constituting an in vitro model of synchrony, working memory, and seizure. This mode of activity has been shown theoretically and experimentally to depend on asynchronous neurotransmitter release (an essential feature of the developing hippocampus) and is supported by a variety of developing neuronal networks despite variability in the size of populations (10-200 neurons) and in patterns of synaptic connectivity. It has previously been reported in computational models that "small-world" connection topology is ideal for the propagation of similar modes of network activity, although this has been shown only for neurons utilizing synchronous (phasic) synaptic transmission. We investigated how topological constraints on synaptic connectivity could shape the stability of reverberations in small networks that also use asynchronous synaptic transmission. We found that reverberation duration in such networks was resistant to changes in topology and scaled poorly with network size. However, normalization of synaptic drive, by reducing the variance of synaptic input across neurons, stabilized reverberation in such networks. Our results thus suggest that the stability of both normal and pathological states in developing networks might be shaped by variance-normalizing constraints on synaptic drive. We offer an experimental prediction for the consequences of such regulation on the behavior of small networks.

6.
We study the emergence of synchronized burst activity in networks of neurons with spike adaptation. We show that networks of tonically firing adapting excitatory neurons can evolve to a state where the neurons burst in a synchronized manner. The mechanism leading to this burst activity is analyzed in a network of integrate-and-fire neurons with spike adaptation. The dependence of this state on the different network parameters is investigated, and it is shown that this mechanism is robust against inhomogeneities, sparseness of the connectivity, and noise. In networks of two populations, one excitatory and one inhibitory, we show that decreasing the inhibitory feedback can cause the network to switch from a tonically active, asynchronous state to the synchronized bursting state. Finally, we show that the same mechanism also causes synchronized burst activity in networks of more realistic conductance-based model neurons.

7.
Zemel RS, Mozer MC. Neural Computation, 2001, 13(5): 1045-1064
Attractor networks, which map an input space to a discrete output space, are useful for pattern completion--cleaning up noisy or missing input features. However, designing a net to have a given set of attractors is notoriously tricky; training procedures are CPU intensive and often produce spurious attractors and ill-conditioned attractor basins. These difficulties occur because each connection in the network participates in the encoding of multiple attractors. We describe an alternative formulation of attractor networks in which the encoding of knowledge is local, not distributed. Although localist attractor networks have similar dynamics to their distributed counterparts, they are much easier to work with and interpret. We propose a statistical formulation of localist attractor net dynamics, which yields a convergence proof and a mathematical interpretation of model parameters. We present simulation experiments that explore the behavior of localist attractor networks, showing that they yield few spurious attractors, and they readily exhibit two desirable properties of psychological and neurobiological models: priming (faster convergence to an attractor if the attractor has been recently visited) and gang effects (in which the presence of an attractor enhances the attractor basins of neighboring attractors).

8.
Brunel N, Hansel D. Neural Computation, 2006, 18(5): 1066-1110
GABAergic interneurons play a major role in the emergence of various types of synchronous oscillatory patterns of activity in the central nervous system. Motivated by these experimental facts, modeling studies have investigated mechanisms for the emergence of coherent activity in networks of inhibitory neurons. However, most of these studies have focused either on the case where the noise in the network is absent or weak, or on the opposite case where it is strong. Hence, a full picture of how noise affects the dynamics of such systems is still lacking. The aim of this letter is to provide a more comprehensive understanding of the mechanisms by which the asynchronous states in large, fully connected networks of inhibitory neurons are destabilized as a function of the noise level. Three types of single-neuron models are considered: the leaky integrate-and-fire (LIF) model, the exponential integrate-and-fire (EIF) model, and conductance-based models involving sodium and potassium Hodgkin-Huxley (HH) currents. We show that in all models, the instabilities of the asynchronous state can be classified in two classes. The first one consists of clustering instabilities, which exist in a restricted range of noise. These instabilities lead to synchronous patterns in which the population of neurons is broken into clusters of synchronously firing neurons. The irregularity of the firing patterns of the neurons is weak. The second class of instabilities, termed oscillatory firing rate instabilities, exists at any value of noise. They lead to cluster states at low noise. As the noise is increased, the instability occurs at larger coupling, and the pattern of firing that emerges becomes more irregular. In the regime of high noise and strong coupling, these instabilities lead to stochastic oscillations in which neurons fire in an approximately Poisson way with a common instantaneous probability of firing that oscillates in time.

9.
Dynamic characteristics of Hopfield neural networks with dynamic synapses
王直杰, 范宏, 严晨. 控制与决策 (Control and Decision), 2006, 21(7): 771-775
A discrete Hopfield neural network model with dynamic synapses (DSDNN) is proposed, together with a dynamic evolution model for its connection weights and an update model for its neuron states. It is proved that the equilibrium points of the DSDNN are in one-to-one correspondence with those of the conventional discrete Hopfield neural network, and the stability of the equilibrium points is analyzed. Finally, the relationship between the dynamic evolution behavior of the DSDNN and its parameters is analyzed through simulation.

10.
The high-conductance state of cortical networks
We studied the dynamics of large networks of spiking neurons with conductance-based (nonlinear) synapses and compared them to networks with current-based (linear) synapses. For systems with sparse and inhibition-dominated recurrent connectivity, weak external inputs induced asynchronous irregular firing at low rates. Membrane potentials fluctuated a few millivolts below threshold, and membrane conductances were increased by a factor of 2 to 5 with respect to the resting state. This combination of parameters characterizes the ongoing spiking activity typically recorded in the cortex in vivo. Many aspects of the asynchronous irregular state in conductance-based networks could be sufficiently well characterized with a simple numerical mean field approach. In particular, it correctly predicted an intriguing property of conductance-based networks that does not appear to be shared by current-based models: they exhibit states of low-rate asynchronous irregular activity that persist for some period of time even in the absence of external inputs and without cortical pacemakers. Simulations of larger networks (up to 350,000 neurons) demonstrated that the survival time of self-sustained activity increases exponentially with network size.

11.
We analyze a routing scheme for a broad class of networks which converges (in the Cesaro sense) with probability one to the set of approximate Cesaro-Wardrop equilibria, an extension of the notion of a Wardrop equilibrium. The network model allows for wireline networks where delays are caused by flows on links, as well as wireless networks, a primary motivation for us, where delays are caused by other flows in the vicinity of nodes. The routing algorithm is distributed, using only the local information about observed delays by the nodes, and is moreover impervious to clock offsets at nodes. The scheme is also fully asynchronous, since different iterates have their own counters and the orders of packets and their acknowledgments may be scrambled. Finally, the scheme is adaptive to the traffic patterns in the network. The demonstration of convergence in a fully dynamic context involves the treatment of two-time scale distributed asynchronous stochastic iterations. Using an ordinary differential equation approach, the invariant measures are identified. Due to a randomization feature in the algorithm, a direct stochastic analysis shows that the algorithm avoids non-Wardrop equilibria. Finally, some comments on the existence, uniqueness, stability, and other properties of Wardrop equilibria are made.

12.
The discrete Hopfield neural network is a special class of feedback network, widely applied to associative memory design and combinatorial optimization. The stability of feedback neural networks is regarded not only as one of the most fundamental problems of neural networks but also as the basis for their various applications. Using state transition equations and a suitably defined energy function, the asymptotic behavior of discrete Hopfield neural networks under a partially parallel update mode is studied. A counterexample shows that an existing result is wrong, and some new sufficient conditions for the network to converge to a stable state are given. These results further generalize several existing conclusions.

13.
We investigate the formation of synfire waves in a balanced network of integrate-and-fire neurons. The synaptic connectivity of this network embodies synfire chains within a sparse random connectivity. This network can exhibit global oscillations but can also operate in an asynchronous activity mode. We analyze the correlations of two neurons in a pool as convenient indicators for the state of the network. We find, using different models, that these indicators depend on a scaling variable. Beyond a critical point, strong correlations and large network oscillations are obtained. We looked for the conditions under which a synfire wave could be propagated on top of an otherwise asynchronous state of the network. This condition was found to be highly restrictive, requiring a large number of neurons for its implementation in our network. The results are based on analytic derivations and simulations.

14.
Fast oscillations, and in particular gamma-band oscillations (20-80 Hz), are commonly observed during brain function and are at the center of several neural processing theories. In many cases, mathematical analysis of fast oscillations in neural networks has been focused on the transition between irregular and oscillatory firing viewed as an instability of the asynchronous activity. But in fact, brain slice experiments as well as detailed simulations of biological neural networks have produced a large corpus of results concerning the properties of fully developed oscillations that are far from this transition point. We propose here a mathematical approach to deal with nonlinear oscillations in a network of heterogeneous or noisy integrate-and-fire neurons connected by strong inhibition. This approach involves limited mathematical complexity and gives a good sense of the oscillation mechanism, making it an interesting tool to understand fast rhythmic activity in simulated or biological neural networks. A surprising result of our approach is that under some conditions, a change of the strength of inhibition only weakly influences the period of the oscillation. This is in contrast to standard theoretical and experimental models of interneuron network gamma oscillations (ING), where frequency tightly depends on inhibition strength, but it is similar to observations made in some in vitro preparations in the hippocampus and the olfactory bulb and in some detailed network models. This result is explained by the phenomenon of suppression that is known to occur in strongly coupled oscillating inhibitory networks but had not yet been related to the behavior of oscillation frequency.

15.
Miller P. Neural Computation, 2006, 18(6): 1268-1317
Attractor networks are likely to underlie working memory and integrator circuits in the brain. It is unknown whether continuous quantities are stored in an analog manner or discretized and stored in a set of discrete attractors. In order to investigate the important issue of how to differentiate the two systems, here we compare the neuronal spiking activity that arises from a continuous (line) attractor with that from a series of discrete attractors. Stochastic fluctuations cause the position of the system along its continuous attractor to vary as a random walk, whereas in a discrete attractor, noise causes spontaneous transitions to occur between discrete states at random intervals. We calculate the statistics of spike trains of neurons firing as a Poisson process with rates that vary according to the underlying attractor network. Since individual neurons fire spikes probabilistically and since the state of the network as a whole drifts randomly, the spike trains of individual neurons follow a doubly stochastic (Poisson) point process. We compare the series of spike trains from the two systems using the autocorrelation function, Fano factor, and interspike interval (ISI) distribution. Although the variation in rate can be dramatically different, especially for short time intervals, surprisingly both the autocorrelation functions and Fano factors are identical, given appropriate scaling of the noise terms. Since the range of firing rates is limited in neurons, we also investigate systems for which the variation in rate is bounded by either rigid limits or because of leak to a single attractor state, such as the Ornstein-Uhlenbeck process. In these cases, the time dependence of the variance in rate can be different between discrete and continuous systems, so that in principle, these processes can be distinguished using second-order spike statistics.
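To make the "doubly stochastic Poisson" idea above concrete, here is an illustrative simulation (all parameters are ours, not the paper's): when the firing rate itself drifts as a random walk, the Fano factor of the spike counts rises above the value of 1 expected for a plain Poisson process.

```python
import numpy as np

def fano(counts):
    """Fano factor: variance of spike counts divided by their mean."""
    return counts.var() / counts.mean()

rng = np.random.default_rng(1)
trials, T, dt, base = 1000, 200, 0.01, 20.0

# Rate drifts as a random walk around the base rate, clipped to stay positive.
rate = np.clip(base + np.cumsum(rng.normal(0.0, 1.0, size=(trials, T)), axis=1),
               5.0, 60.0)
counts_ds = rng.poisson(rate * dt).sum(axis=1)              # doubly stochastic
counts_p = rng.poisson(base * dt, size=(trials, T)).sum(axis=1)  # plain Poisson
```

Comparing `fano(counts_ds)` with `fano(counts_p)` shows the rate drift inflating count variability well beyond the Poisson baseline of 1.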

16.
In this paper, we present an energy conservation scheme for wireless ad hoc and sensor networks using gossiping to place nodes in an energy saving sleep state. The technique is termed the Gossip-based Sleep Protocol (GSP). With GSP, each node randomly goes to sleep for some time with gossip sleep probability p. GSP is based on the observation that in a well connected network there are usually many paths between a source and destination, so a percentage of nodes can be in an energy conserving sleep mode without losing network connectivity. GSP needs few operations, scales to large networks and does not require a wireless node to maintain the states of other nodes. We propose two versions of GSP, one for synchronous networks and one for asynchronous networks, and afterward extend GSP to adapt to network traffic conditions. We show the advantages of the GSP approach through both simulations and analysis.
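The core GSP decision described above is simple enough to sketch directly (names are ours, and this omits the protocol's timing and traffic-adaptation details): each node independently sleeps for a round with probability p, relying on path redundancy to preserve connectivity.

```python
import random

def gossip_sleep(node_ids, p, rng):
    """One GSP round: each node sleeps with probability p.
    Returns the set of nodes that stay awake."""
    return {n for n in node_ids if rng.random() >= p}

# With p = 0.3, roughly 70% of a 1000-node network stays awake each round.
rng = random.Random(42)
awake = gossip_sleep(range(1000), 0.3, rng)
```

No node needs to know any other node's state, which is what keeps the scheme cheap and scalable.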

17.
We study asynchronous broadcasting in packet radio networks. A radio network is represented by a directed graph, in which one distinguished source node stores a message that needs to be disseminated among all the remaining nodes. An asynchronous execution of a protocol is a sequence of events, each consisting of simultaneous deliveries of messages. The correctness of protocols is considered for specific adversarial models defined by restrictions on events the adversary may schedule. A protocol specifies how many times the source message is to be retransmitted by each node. The total number of transmissions over all the nodes is called the work of the broadcast protocol; it is used as complexity measure. We study computational problems, to be solved by deterministic centralized algorithms, either to find a broadcast protocol or to verify the correctness of a protocol, for a given network. The amount of work necessary to make a protocol correct may have to be exponential in the size of network. There is a polynomial-time algorithm to find a broadcast protocol for a given network. We show that certain problems about broadcasting protocols for given networks are complete in NP and co-NP complexity classes.

18.
Determining the architecture of a neural network is an important issue for any learning task. For recurrent neural networks no general methods exist that permit the estimation of the number of layers of hidden neurons, the size of layers or the number of weights. We present a simple pruning heuristic that significantly improves the generalization performance of trained recurrent networks. We illustrate this heuristic by training a fully recurrent neural network on positive and negative strings of a regular grammar. We also show that rules extracted from networks trained with this pruning heuristic are more consistent with the rules to be learned. This performance improvement is obtained by pruning and retraining the networks. Simulations are shown for training and pruning a recurrent neural net on strings generated by two regular grammars, a randomly-generated 10-state grammar and an 8-state, triple-parity grammar. Further simulations indicate that this pruning method can have generalization performance superior to that obtained by training with weight decay.
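A pruning step of this general kind can be sketched as magnitude-based weight removal (the exact criterion here is ours for illustration, not necessarily the paper's heuristic): the smallest-magnitude weights are zeroed out, after which the network would be retrained.

```python
import numpy as np

def prune_small_weights(W, fraction):
    """Zero out the given fraction of smallest-magnitude weights in W."""
    k = int(W.size * fraction)
    if k == 0:
        return W.copy()
    # threshold = magnitude of the k-th smallest weight
    thresh = np.sort(np.abs(W), axis=None)[k - 1]
    pruned = W.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

W = np.array([[0.1, -2.0],
              [0.5,  3.0]])
W_pruned = prune_small_weights(W, 0.5)   # removes the two smallest weights
```

In a prune-and-retrain loop, this step would alternate with further training until generalization stops improving.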

19.
Artificial neural networks (ANNs) are one of the hottest topics in computer science and artificial intelligence due to their potential and advantages in analyzing real-world problems in various disciplines, including but not limited to physics, biology, chemistry, and engineering. However, ANNs lack several key characteristics of biological neural networks, such as sparsity, scale-freeness, and small-worldness. The concept of sparse and scale-free neural networks has been introduced to fill this gap. Network sparsity is implemented by removing weak weights between neurons during the learning process and replacing them with random weights. When the network is initialized, the neural network is fully connected, which means the number of weights is four times the number of neurons. In this study, considering that a biological neural network has some degree of initial sparsity, we design an ANN with a prescribed level of initial sparsity. The neural network is tested on handwritten digits, Arabic characters, CIFAR-10, and Reuters newswire topics. Simulations show that it is possible to reduce the number of weights by up to 50% without losing prediction accuracy. Moreover, in both cases, the testing time is dramatically reduced compared with fully connected ANNs.

20.
A synthesis procedure for brain-state-in-a-box neural networks
In this paper, some new qualitative properties of discrete-time neural networks based on the "brain-state-in-a-box" model are presented. These properties concern both the characterization of equilibrium points and the global dynamical behavior. Next, the analysis results are used as guidelines in developing an efficient synthesis procedure for networks that function as associative memories. A constrained design algorithm is presented that gives completely stable dynamical neural networks sharing some interesting features. The absence of nonbinary stable equilibria, that is, stable states with nonsaturated components, is guaranteed. It is also guaranteed that in close proximity (Hamming distance one) to the stored patterns there is no other binary equilibrium point. Moreover, the presented method allows one to optimize a design parameter that controls the size of the attraction basins of the stored vectors and the accuracy needed in a digital realization of the network.
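For readers unfamiliar with the model, the basic "brain-state-in-a-box" (BSB) iteration can be sketched as follows (the weight matrix, gain, and starting state here are illustrative, not the paper's design): the state is pushed by linear feedback and then clipped to the unit hypercube, so trajectories settle onto saturated corners.

```python
import numpy as np

def bsb_step(x, W, beta=0.2):
    """One BSB iteration: linear feedback followed by clipping to [-1, 1]."""
    return np.clip(x + beta * (W @ x), -1.0, 1.0)

# A small perturbation inside the box flows to the nearest stored corner.
W = np.array([[1.0, 0.4],
              [0.4, 1.0]])
x = np.array([0.3, 0.1])
for _ in range(50):
    x = bsb_step(x, W)
```

The saturated corner reached by the iteration plays the role of the recalled binary pattern; the synthesis procedure in the abstract is about choosing W so that only the desired corners are stable.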
