AB9: A neural processor for inference acceleration
Authors: Yong Cheol Peter Cho, Jaehoon Chung, Jeongmin Yang, Chun-Gi Lyuh, HyunMi Kim, Chan Kim, Je-seok Ham, Minseok Choi, Kyoungseon Shin, Jinho Han, Youngsu Kwon
Abstract: We present AB9, a neural processor for inference acceleration. AB9 consists of a systolic tensor core (STC) neural network accelerator designed to accelerate artificial intelligence applications by exploiting the data reuse and parallelism inherent in neural networks while providing fast access to large on-chip memory. Complementing the hardware is an intuitive, user-friendly development environment that includes a simulator and an implementation flow, providing a high degree of programmability with a short development time. Along with a 40-TFLOP STC that includes 32k arithmetic units and over 36 MB of on-chip SRAM, our baseline implementation of AB9 consists of a 1-GHz quad-core setup with various other industry-standard peripheral intellectual properties. The acceleration performance and power efficiency were evaluated using YOLOv2, and the results show that AB9 has superior performance and power efficiency to those of a general-purpose graphics processing unit implementation. AB9 has been taped out in the TSMC 28-nm process with a chip size of 17 × 23 mm². Delivery is expected later this year.
Keywords: AI SoC; inference; neural network accelerator
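
The abstract attributes the STC's efficiency to data reuse in a systolic dataflow. The sketch below is a toy, cycle-level model of an output-stationary systolic array computing a matrix product; the array shape, dataflow, and all names are illustrative assumptions and do not describe AB9's actual microarchitecture.

# Illustrative sketch only: a toy cycle-level model of an output-stationary
# systolic array computing C = A @ B, showing how skewed operand streaming
# lets each value be reused across a row/column of processing elements.
# Array shape, dataflow, and naming are assumptions, not AB9's design.
import numpy as np

def systolic_matmul(A, B):
    """Multiply A (m x k) by B (k x n) on a toy m x n output-stationary array.

    PE (i, j) accumulates one output C[i, j]. Rows of A flow left-to-right,
    columns of B flow top-to-bottom, each skewed by one cycle per row/column
    so matching operands arrive at a PE in the same cycle.
    """
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n), dtype=A.dtype)

    # The last operand pair reaches PE (m-1, n-1) after (m-1)+(n-1)+k cycles.
    total_cycles = (m - 1) + (n - 1) + k
    for t in range(total_cycles):
        for i in range(m):
            for j in range(n):
                # Which operand pair (if any) arrives at PE (i, j) this cycle.
                step = t - i - j
                if 0 <= step < k:
                    C[i, j] += A[i, step] * B[step, j]
    return C

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 6)).astype(np.float32)
    B = rng.standard_normal((6, 5)).astype(np.float32)
    assert np.allclose(systolic_matmul(A, B), A @ B, atol=1e-5)
    print("toy systolic matmul matches numpy reference")

In this output-stationary arrangement, each element of A and B is consumed by an entire row or column of processing elements without being refetched from memory, which is the data-reuse property the abstract cites as the basis for the STC's efficiency.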