Area Efficient Pattern Representation of Binary Neural Networks on RRAM
Authors: Feng Wang, Guo-Jie Luo, Guang-Yu Sun, Yu-Hao Wang, Di-Min Niu, Hong-Zhong Zheng
Affiliations: Center for Energy-Efficient Computing and Applications, Peking University, Beijing 100871, China; Pingtouge, Alibaba Group, Hangzhou 310052, China
Abstract: Resistive random access memory (RRAM) has been demonstrated to implement multiply-and-accumulate (MAC) operations in a highly parallel analog fashion, which dramatically accelerates convolutional neural networks (CNNs). Since CNNs require considerable numbers of converters between analog crossbars and digital peripheral circuits, recent studies map binary neural networks (BNNs) onto RRAM and binarize the weights to {+1, -1}. However, the two mainstream representations of BNN weights introduce patterns of redundant 0s and 1s when dealing with negative weights. In this work, we reduce the area occupied by these redundant 0s and 1s by proposing a BNN weight representation framework based on a novel pattern representation and a corresponding architecture. First, we split the weight matrix into several small matrices by clustering adjacent columns together. Second, we extract the 1s' patterns, i.e., the submatrices containing only 1s, from each small weight matrix, so that each final output can be represented as the sum of several patterns. Third, we map these patterns onto RRAM crossbars, including pattern computation crossbars (PCCs) and pattern accumulation crossbars (PACs). Finally, we compare the pattern representation with the two mainstream representations and adopt the more area-efficient one. The evaluation results demonstrate that our framework saves over 20% of crossbar area compared with the two mainstream representations.
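The pattern idea in the abstract can be illustrated with a toy example. The sketch below is not the authors' implementation; the matrix, the input vector, and the grouping heuristic (rows with identical column support form an all-1s submatrix) are assumptions for illustration only. It shows how each output column can be reconstructed as a sum of pattern partial sums, mirroring the PCC (compute each pattern once) and PAC (accumulate patterns per column) stages:

```python
import numpy as np
from collections import defaultdict

# Toy 0/1 crossbar matrix for one column cluster (assumed example data).
M = np.array([
    [1, 1, 0],
    [1, 1, 0],
    [0, 1, 1],
    [1, 1, 0],
])
x = np.array([1, 0, 1, 1])  # binarized input vector (assumed)

# Group rows by identical column support; each group is an all-1s
# submatrix (a "pattern") over its support columns.
patterns = defaultdict(list)          # column-support tuple -> row indices
for r, row in enumerate(M):
    patterns[tuple(np.nonzero(row)[0])].append(r)

# Each output column is the sum of the partial sums of the patterns
# that cover it; each partial sum is computed only once.
y = np.zeros(M.shape[1], dtype=int)
for cols, rows in patterns.items():
    partial = x[rows].sum()           # PCC stage: one pattern computation
    for c in cols:
        y[c] += partial               # PAC stage: accumulate per column

assert np.array_equal(y, x @ M)       # matches the direct MAC result
```

Here rows 0, 1, and 3 share the support {0, 1} and collapse into a single pattern, so their contribution to columns 0 and 1 is computed once instead of three times; the area saving in the paper comes from storing such shared patterns only once on the crossbars.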
Keywords: binary neural network (BNN); pattern; resistive random access memory (RRAM)
This article is indexed in Wanfang Data and other databases. The original abstract and a free full-text PDF are available from the Journal of Computer Science and Technology (《计算机科学技术学报》).