Similar Documents
20 similar documents found.
1.
This paper studies the switched-current (SI) implementation of bidirectional associative memory (BAM) neural networks. Two switched-current cell circuits are proposed, one for realizing negative weights and one for storing association vectors; on this basis, a switched-current circuit for the BAM network is presented. PSPICE simulation of a three-neuron BAM SI network shows that the proposed SI associative memory network operates correctly.
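As a software illustration of what such a BAM circuit computes (not of the switched-current implementation itself), the sketch below uses the standard Kosko correlation-matrix storage rule and bidirectional thresholded recall; pattern values and names are illustrative.

```python
import numpy as np

def bam_train(pairs):
    """Standard Kosko BAM: W is the sum of outer products of bipolar pattern pairs."""
    return sum(np.outer(x, y) for x, y in pairs)

def bam_recall(W, x, steps=10):
    """Bidirectional recall: alternate layer updates until a stable pair is reached."""
    sign = lambda v: np.where(v >= 0, 1, -1)
    y = sign(W.T @ x)
    for _ in range(steps):
        x_new = sign(W @ y)
        y_new = sign(W.T @ x_new)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y

# Store one three-neuron pair (mirroring the three-neuron SI example) and recall from a noisy cue.
pairs = [(np.array([1, -1, 1]), np.array([-1, 1, 1]))]
W = bam_train(pairs)
print(bam_recall(W, np.array([1, -1, -1])))  # converges to the stored pair
```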

2.
Based on current research on applications of chaotic neuron networks for information processing, the stability and convergence of the chaotic neuron network are proved from the viewpoint of an energy function. Moreover, a new auto-associative matrix is devised for an artificial neural network composed of chaotic neurons, yielding an improved chaotic neuron network for associative memory. Finally, the associative recall process of the network is analyzed in detail and the improvements are explained.
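The energy-function argument can be illustrated numerically for a standard (non-chaotic) auto-associative network: with a symmetric, zero-diagonal Hebbian matrix, asynchronous threshold updates never increase the quadratic energy. A minimal sketch under those assumptions (the chaotic-neuron dynamics of the paper are not modeled):

```python
import numpy as np

def hebbian_matrix(patterns):
    """Symmetric auto-associative (Hebbian) matrix with zero diagonal."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, x):
    return -0.5 * x @ W @ x

patterns = [np.array([1, -1, 1, -1, 1]), np.array([1, 1, -1, -1, 1])]
W = hebbian_matrix(patterns)
x = np.array([-1, -1, 1, -1, 1])            # noisy probe pattern
for i in np.random.default_rng(0).permutation(len(x)):
    e_before = energy(W, x)
    x[i] = 1 if W[i] @ x >= 0 else -1       # asynchronous update of one neuron
    assert energy(W, x) <= e_before + 1e-9  # energy never increases
print(x, energy(W, x))
```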

3.
Low-power and low-variability artificial neuronal devices are highly desired for high-performance neuromorphic computing. In this paper, an oscillation neuron based on a low-variability Ag nanodots (NDs) threshold switching (TS) device with low operation voltage, large on/off ratio and high uniformity is presented. Measurement results indicate that this neuron demonstrates self-oscillation behavior under applied voltages as low as 1 V. The oscillation frequency increases with the applied voltage pulse amplitude and decreases with the load resistance. It can then be used to evaluate the resistive random-access memory (RRAM) synaptic weights accurately when the oscillation neuron is connected to the output of the RRAM crossbar array for neuromorphic computing. Meanwhile, simulation results show that a large RRAM crossbar array (>128×128) can be supported by our oscillation neuron owing to the high on/off ratio (>10^8) of the Ag NDs TS device. Moreover, the high uniformity of the Ag NDs TS device helps improve the distribution of the output frequency and suppress the degradation of neural network recognition accuracy (<1%). Therefore, the developed oscillation neuron based on the Ag NDs TS device shows great potential for future neuromorphic computing applications.

4.
The multiple-channel novelty filters of associative memory are derived from the retina neuron network point-spread function. Scale and rotation invariance is achieved by the input data flowing through the nonuniform polar exponential sampling to a uniform output space. Image domain bandpassing techniques are then used to extract the low-band and high-band frequency contents of the equivalent novelty filters.
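A minimal sketch of the polar-exponential (log-polar) resampling step described above, using scipy interpolation and assumed grid sizes; scaling or rotating the input shifts the output along its two axes rather than changing its shape.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def log_polar(image, n_rho=64, n_theta=64):
    """Resample an image onto a log-polar grid centred on the image centre.
    Scaling the input shifts the rho axis; rotating it shifts the theta axis."""
    cy, cx = (np.array(image.shape) - 1) / 2.0
    r_max = min(cy, cx)
    rho = np.exp(np.linspace(0.0, np.log(r_max), n_rho))        # exponentially spaced radii
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    R, T = np.meshgrid(rho, theta, indexing="ij")
    ys, xs = cy + R * np.sin(T), cx + R * np.cos(T)
    return map_coordinates(image, [ys, xs], order=1, mode="nearest")

img = np.random.default_rng(0).random((128, 128))
print(log_polar(img).shape)  # (64, 64) uniform output space
```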

5.
An approach to the storage of temporal sequences that deals with complex temporal sequences directly is presented. Short-term memory (STM) is modeled by units comprised of recurrent excitatory connections between two neurons. A dual-neuron model is proposed. By applying the Hebbian learning rule at each synapse and a normalization rule among all synaptic weights of a neuron, it is shown that a quantity called the input potential increases monotonically with sequence presentation, and that the neuron can only be fired when its input signals are arranged in a specific sequence. These sequence-detecting neurons form the basis for a model of complex sequence recognition that can tolerate distortions of the learned sequences. A recurrent network of two layers is provided for reproducing complex sequences.
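A toy sketch of the learning rule described above: Hebbian increments on a decaying short-term-memory trace followed by normalization of the neuron's weights, so the resulting "input potential" is largest for the trained ordering. The decay and learning-rate values are assumptions, and the dual-neuron STM circuit itself is not modeled.

```python
import numpy as np

ALPHABET = "ABC"
DECAY = 0.6          # short-term-memory decay between symbols (assumed value)
ETA = 0.5            # Hebbian learning rate (assumed value)

def stm_trace(sequence):
    """Exponentially decaying short-term-memory trace of a symbol sequence."""
    s = np.zeros(len(ALPHABET))
    for sym in sequence:
        s *= DECAY
        s[ALPHABET.index(sym)] = 1.0
    return s

def train(sequence, epochs=20):
    w = np.full(len(ALPHABET), 1.0 / len(ALPHABET))
    for _ in range(epochs):
        w += ETA * stm_trace(sequence)   # Hebbian increment at sequence end
        w /= w.sum()                     # normalization over the neuron's weights
    return w

w = train("ABC")
for seq in ["ABC", "CBA", "BAC"]:
    print(seq, round(float(w @ stm_trace(seq)), 3))  # largest for the trained order
```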

6.
Owing to their low power consumption, memory capability and nanoscale size, memristors are ideal devices for implementing artificial synapses. To build a concise, efficient and fully functional associative memory circuit, this paper first proposes a simple neuron circuit and a synapse circuit based on a voltage-controlled threshold memristor. A corresponding associative memory circuit is then designed according to the Pavlovian associative memory model. The circuit structure is simple, containing only three neuron circuits with their synapse circuits, which effectively reduces network complexity and power consumption. Most importantly, the circuit can emulate full-function associative memory behavior: it realizes not only learning, forgetting, accelerated learning, decelerated forgetting and decelerated natural forgetting, but its learning rate and natural forgetting rate also adjust automatically according to the number of learning trials, making the circuit more biomimetic. In addition, the circuit agrees with the Ebbinghaus forgetting curve, which widens its range of applications.
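The Ebbinghaus-style forgetting mentioned above is commonly written as exponential decay of retention, R(t) = exp(-t/S), with the stability S growing as learning is repeated. A hedged numerical sketch of that relationship (not of the memristive circuit itself):

```python
import numpy as np

def retention(t, stability):
    """Ebbinghaus-style forgetting curve: retention decays exponentially with time.
    Repeated learning is modeled here simply as a larger stability constant."""
    return np.exp(-t / stability)

t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
for trials, s in [(1, 1.0), (3, 3.0)]:    # assumed: stability grows with learning trials
    print(trials, np.round(retention(t, s), 2))
```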

7.
A new sigmoid function generator is proposed. It is not only simple and fast, with a good fit to the ideal sigmoid function, but also allows the threshold and gain factor to be programmed, giving it a wide range of applications and good prospects. The basic building blocks of the neural network, including the neuron, a Gilbert multiplier, digital memory and a D/A converter, are designed. The factors that make the genetic algorithm (GA) favorable as a learning algorithm for artificial neural networks (ANNs) are explained. Using these circuits and the GA, a reconfigurable ANN is designed. Each unit circuit and the complete ANN are simulated in HSPICE with the level-47 model of a standard 1.2 μm CMOS process. The results show that they are functionally correct and perform well.
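For reference, the programmable threshold and gain correspond to the shift and slope parameters of the ideal sigmoid; a minimal software version with illustrative parameter names:

```python
import numpy as np

def sigmoid(x, gain=1.0, threshold=0.0):
    """Ideal sigmoid with programmable gain (slope) and threshold (horizontal shift)."""
    return 1.0 / (1.0 + np.exp(-gain * (x - threshold)))

x = np.linspace(-3, 3, 7)
print(np.round(sigmoid(x, gain=2.0, threshold=0.5), 3))
```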

8.
Modified intraconnected bidirectional associative memory
Jeng Y.-J., Yeh C.-C. Electronics Letters, 1991, 27(20): 1818-1819
A modified model for the intraconnected bidirectional associative memory (IBAM) is introduced in which there are not only interfield connections but also intrafield connections added in each neuron field. In the modified IBAM recall process, the intralayer feedback processes run in parallel with the interlayer processes, instead of sequentially as in the IBAM. This results in both removal of the complement encoding problem and relaxation of the continuity assumption required for reliable recall in the BAM.

9.
Neuromorphic computing, which merges learning and memory functions, is a new computing paradigm surpassing traditional von Neumann architecture. Apart from the plasticity of artificial synapses, the simulation of neurons’ multi-input signal integration is also of great significance to realize efficient neuromorphic computing. Since the structure of transistors and neurons is strikingly similar, capacitively coupled multi-terminal pectin-gated oxide electric double layer transistors are proposed here as artificial neurons for classification. In this work, the free logic switching of “AND” and “OR” is realized in the device with triple in-plane gates. More importantly, the linear classification function on a single neuron transistor is demonstrated experimentally for the first time. All the results obtained in this work indicate that the prepared artificial neuron can improve the efficiency of artificial neural networks and thus will play an important role in neuromorphic computing.

10.
A quantizer neuron model and a hardware implementation of the model are described. A multifunctional layered network (MFLN) built from quantizer neurons is proposed and applied to a character recognition system. Each layer of the MFLN has a specific function defined by the quantizer input, and the weights between neurons are set dynamically according to the quantizer inputs. The learning speed of the MFLN is extremely fast in comparison with conventional multilayered perceptrons using back propagation, and the structure of the MFLN is suitable for supplemental learning with extraneous learning data sets. We tested the learning speed and compared it with three other network models: RCE networks, LVQ3, and a multilayered neural network with back propagation. Based on these simulations, we also developed a quantizer neuron chip (QNC) using two newly developed schemes. The QNC simulates the MFLN and has 4736 neurons and 2,000,000 synaptic weights. The processing speed of the chip reaches 20.3 billion connections per second (20.3 GCPS) for recognition and 20 million connection updates per second (20 MCUPS) for learning. The QNC is implemented in a 1.2 μm double-metal CMOS sea-of-gates process and contains 27,000 gates on a 10.99×10.93 mm² die. The neuroboard, which consists of a main board with a QNC and a memory board for the synaptic weights of the neurons, can be connected to a host personal computer and used for image or character recognition and learning. The quantizer neuron model, the quantizer neuron chip, and the neuroboard with the QNC can realize adaptive learning or filtering.

11.
A general model of the intraconnected bidirectional associative memory (GIBAM) is proposed. In a GIBAM, states are represented by complex values on the unit circle of the complex plane, and the weight matrices are learned using the generalised inverse technique. The stability of the GIBAM is demonstrated by defining an energy function which decreases with any change in neuron states. Computer simulation demonstrates that the GIBAM has a much higher storage capacity and much better error-correcting capability than Jeng's modified IBAM.
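The generalised-inverse learning rule can be sketched with real bipolar patterns (the GIBAM's complex unit-circle states are not reproduced here): the weight matrix is W = Y X⁺, where X⁺ is the Moore-Penrose pseudoinverse of the stacked input patterns, so stored inputs are mapped onto their associated outputs exactly whenever the inputs are linearly independent.

```python
import numpy as np

# Each column is one stored pattern (inputs X, associated outputs Y).
X = np.array([[ 1,  1, -1,  1],
              [-1,  1,  1,  1]], dtype=float).T   # shape (4, 2): two 4-dim input patterns
Y = np.array([[ 1, -1,  1],
              [-1, -1,  1]], dtype=float).T       # shape (3, 2): two 3-dim output patterns

W = Y @ np.linalg.pinv(X)      # generalized-inverse (pseudoinverse) learning rule
print(np.sign(W @ X))           # recovers the stored outputs
print(np.allclose(W @ X, Y))    # exact for linearly independent inputs
```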

12.
The switched-current (SI) implementation of discrete Hopfield neural networks (DHNN) and associative memory neural networks is studied. A discrete Hopfield neural network is realized using multiple-weight-input transconductors, switched-current delay units (SID) and programmable current comparators (PCC), and the corresponding switched-current circuit for auto-associative memory based on the discrete Hopfield network is presented. The proposed switched-current neural network is suitable for very-large-scale integration and can operate at low supply voltages (e.g., 3.3 V).

13.
With the rapid development of artificial intelligence, the simulation of the human brain for neuromorphic computing has demonstrated unprecedented progress. Photonic artificial synapses are strongly desirable owing to their higher neuron selectivity, lower crosstalk, wavelength multiplexing capabilities, and low operating power compared to their electric counterparts. This study demonstrates a highly transparent and flexible artificial synapse with a two-terminal architecture that emulates photonic synaptic functionalities. This optically triggered artificial synapse exhibits clear synaptic characteristics such as paired-pulse facilitation, short/long-term memory, and synaptic behavior analogous to that of the iris in the human eye. Ultraviolet light illumination-induced neuromorphic characteristics exhibited by the synapse are attributed to carrier trapping and detrapping at the SnO2 nanoparticle and CsPbCl3 perovskite interface. Moreover, the ability to detect deep red light without changes in synaptic behavior indicates the potential for dual-mode operation. This study establishes a novel two-terminal architecture for highly transparent and flexible photonic artificial synapses that can facilitate higher integration density of transparent 3D stacking memristors and make it possible to approach optical learning, memory, computing, and visual recognition.

14.
杨淑云, 李盼池. 《信号处理》, 2014, 30(4): 374-383
When a neural network is used to solve a problem, the quality of the result depends strongly on the network's approximation capability, and there is still no satisfactory way to improve it. This paper proposes a new method of constructing a neural network model using multi-qubit controlled-NOT gates. The model has a three-layer structure, with quantum neurons in the hidden layer and ordinary neurons in the output layer. A quantum neuron consists of quantum rotation gates and a multi-qubit controlled-NOT gate; feedback from the output of the target qubit of the controlled-NOT gate to its input provides holistic memory of the input sequence, and the controlled relation of the gate yields the quantum neuron's output. An L-M learning algorithm for the model is designed based on quantum computing principles. The model can capture the features of input sequences in terms of both width and depth. Experimental results on playing-card prediction show that when the number of input nodes is close to the sequence length, the model's recognition rate on the training set is about 8% higher than that of an ordinary neural network, demonstrating the effectiveness of quantum computing mechanisms in improving network approximation capability.
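For readers unfamiliar with the gates named above, the sketch below builds the two standard primitives, a single-qubit rotation gate and a two-qubit controlled-NOT, as matrices in numpy; it illustrates only these building blocks, not the paper's quantum-neuron model or its L-M training.

```python
import numpy as np

def rotation(theta):
    """Single-qubit rotation gate R(theta) acting on [cos a, sin a]^T amplitude vectors."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Two-qubit controlled-NOT: flips the target qubit when the control qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

control = rotation(np.pi / 3) @ np.array([1.0, 0.0])   # qubit prepared by a rotation gate
target = np.array([1.0, 0.0])                           # target initially |0>
state = CNOT @ np.kron(control, target)
print(np.round(state, 3))   # amplitude moves onto |11> in proportion to the control's |1> part
```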

15.
杨忠斌, 吕明钊, 胡宝雷. 《电子科技》, 2013, 26(7): 67-71, 73
An improved intersecting cortical model (ICM) capable of continuous firing is proposed and applied to image enhancement. The new model exploits the long-term memory of neuron behavior carried by the firing threshold, using it as feedback that, together with the external stimulus, determines the neuron's state value. The threshold decay factor is replaced by a functional transform, so an appropriate decay function can be chosen flexibly for a specific application. In addition, a neighborhood test is added when the threshold of a fired neuron is raised, to decide whether that threshold should jump sharply; this enables continuous firing of neurons. On this basis, a histogram equalization method based on FC_ICM is proposed. Compared with CLAHE- and ICM-based image enhancement methods, experimental results show that the proposed method effectively improves global and local contrast and suppresses noise.
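For context, the basic (unmodified) ICM iterates a decaying internal state, a binary firing map and a decaying threshold. A hedged sketch of that standard iteration is shown below with assumed decay constants; the continuous-firing and neighborhood-test modifications of FC_ICM are not included.

```python
import numpy as np
from scipy.ndimage import convolve

def icm(stimulus, iterations=8, f=0.9, g=0.8, h=20.0):
    """Basic ICM iteration: F is the feedback state, Y the binary firing map, Theta the
    dynamic threshold. f, g, h are assumed constants; W is a fixed 3x3 coupling kernel."""
    W = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])
    F = np.zeros_like(stimulus, dtype=float)
    Y = np.zeros_like(stimulus, dtype=float)
    Theta = np.ones_like(stimulus, dtype=float)
    for _ in range(iterations):
        F = f * F + stimulus + convolve(Y, W, mode="constant")
        Y = (F > Theta).astype(float)
        Theta = g * Theta + h * Y
    return Y

img = np.random.default_rng(0).random((32, 32))
print(icm(img).sum())   # number of pixels firing on the final iteration
```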

16.
Artificial synapses are key elements of a nervous-system emulation of sensory and motor neuron signal transmission. Here, the design and fabrication of a redox-behavior metal carbide nanosheet device, termed the MXene artificial synapse, which uses a highly conductive MXene electrode, are reported. Benefiting from the special working mechanism of ion migration with adsorption and insertion, the device achieves record-low power consumption (460 fW) among two-terminal synaptic devices, and the bidirectionally functioned synaptic device responds effectively to ultra-small stimuli with an amplitude of ±80 mV, even exceeding that of a biological synapse. Potential applications have also been demonstrated, such as dendritic integration and memory enhancement. The special strategy and superior electrical characteristics of the bidirectionally functioned electronic device pave the way to high-power-efficiency brain-inspired electronics and artificial peripheral systems.

17.
Designing a neuron circuit with multiple neuronal response modes, a compact structure and low power consumption is of great importance for building large-scale neuromorphic hardware. The basic principles of the LIF and Izhikevich neuron models are analyzed, and the design methods, operating principles, advantages and disadvantages of digital and analog neuron circuits are reviewed in detail. Finally, design trends and challenges of neuron circuits are discussed.
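As a reference point for the first of the two models, a leaky integrate-and-fire (LIF) neuron can be simulated in a few lines; the parameter values below are typical textbook choices, not those of any circuit discussed in the paper.

```python
import numpy as np

def lif(current, dt=0.1, tau=10.0, v_rest=-65.0, v_reset=-65.0, v_th=-50.0, r=10.0):
    """Leaky integrate-and-fire: tau dV/dt = -(V - v_rest) + R*I, spike and reset at threshold."""
    v = v_rest
    spikes, trace = [], []
    for i in current:
        v += dt / tau * (-(v - v_rest) + r * i)
        if v >= v_th:
            spikes.append(len(trace))   # record the spike time index
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

t = np.arange(0, 100, 0.1)
i_in = np.where((t > 20) & (t < 80), 2.0, 0.0)   # step input in arbitrary units
trace, spikes = lif(i_in)
print(len(spikes), "spikes")
```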

18.
A McCulloch-Pitts neuron is the simplified neuron model which has been successfully used for many optimisation problems. A neural network with the hysteresis property can suppress the oscillatory behaviour of the neural dynamics so that the convergence time is shortened. In this paper, a digital CMOS layout design of the hysteresis McCulloch-Pitts neuron is presented. Based on simulation results using the hysteresis McCulloch-Pitts binary neuron model, 6-bit fixed-point 2's complement arithmetic was adopted for the calculation of the input U of each neuron. Each neuron needs 204 transistors and requires a 399λ × 368λ layout area using the MOSIS scalable CMOS/bulk (SCMOS) VLSI technology with a 2 μm, P-well, double-level-metal rule. The layout design of the hysteresis McCulloch-Pitts neuron chip has been completed, and fabrication of the chip and the design of the test circuit for the fabricated CMOS VLSI chip are under way.
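The hysteresis update referred to above holds a neuron's previous output inside a dead band around the threshold, which is what damps oscillation; a minimal software sketch with assumed upper and lower thresholds (the chip's 6-bit fixed-point arithmetic is not modeled):

```python
def hysteresis_neuron(u, prev_output, theta_up=1.0, theta_down=-1.0):
    """Binary McCulloch-Pitts neuron with hysteresis: the output only changes when the
    input potential u leaves the [theta_down, theta_up] band; inside it, the previous
    output is held, which suppresses oscillation and speeds convergence."""
    if u > theta_up:
        return 1
    if u < theta_down:
        return 0
    return prev_output

y = 0
for u in [0.3, 1.2, 0.5, -0.2, -1.5, 0.8]:
    y = hysteresis_neuron(u, y)
    print(u, "->", y)   # output stays at its last value while u is inside the band
```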

19.
The resistively coupled neuron MOS transistor was proposed on the basis of the capacitively coupled (floating-gate) neuron MOS transistor, and it overcomes the drawbacks of the capacitively coupled neuron MOS transistor that arise from capacitive coupling. This paper introduces the basic structure and characteristics of the resistively coupled neuron MOS transistor and applies it to a differential four-quadrant analog multiplier.

20.
We present an analog BiCMOS neuron circuit with linear input synapses which implements the standard Hopfield neuron model. The use of bipolar transistors allows linear multiplication, exact replication of the sigmoid function, and precise linear gain control. SPICE simulations are presented which demonstrate the properties of the neuron. The use of the basic neuron structure to build multilayer networks is discussed, as are some large-scale integration issues.
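The standard Hopfield neuron that the circuit implements integrates weighted inputs through a sigmoid with adjustable gain; a minimal continuous-time sketch with Euler integration and illustrative parameters:

```python
import numpy as np

def hopfield_step(u, W, I, gain=2.0, dt=0.05, tau=1.0):
    """One Euler step of the continuous Hopfield model:
    tau du/dt = -u + W @ g(u) + I, with g a sigmoid of programmable gain."""
    g = np.tanh(gain * u)                  # sigmoid activation with gain control
    return u + dt / tau * (-u + W @ g + I)

W = np.array([[0.0, 1.0], [1.0, 0.0]])    # symmetric two-neuron network
I = np.array([0.1, -0.1])                 # external bias currents
u = np.zeros(2)
for _ in range(200):
    u = hopfield_step(u, W, I)
print(np.round(np.tanh(2.0 * u), 3))       # converged outputs of the two neurons
```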
