[1] KRIZHEVSKY A, SUTSKEVER I, HINTON G E. ImageNet classification with deep convolutional neural networks[J]. Communications of the ACM, 2017, 60(6): 84-90.
[2] SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1409.1556.
[3] SZEGEDY C, LIU W, JIA Y Q, et al. Going deeper with convolutions[C]//Proceedings of 2015 IEEE Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2015: 1-9.
[4] HE K M, ZHANG X Y, REN S Q, et al. Deep residual learning for image recognition[C]//Proceedings of 2016 IEEE Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2016: 770-778.
[5] DENIL M, SHAKIBI B, DINH L, et al. Predicting parameters in deep learning[EB/OL]. [2020-07-14]. http://export.arxiv.org/pdf/1306.0543.
[6] LI H, KADAV A, DURDANOVIC I, et al. Pruning filters for efficient ConvNets[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1608.08710.
[7] HE Y, LIU P, WANG Z W, et al. Filter pruning via geometric median for deep convolutional neural networks acceleration[C]//Proceedings of 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2019: 1-8.
[8] HAN S, POOL J, TRAN J, et al. Learning both weights and connections for efficient neural network[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2015: 1135-1143.
[9] HAN S, MAO H, DALLY W J. Deep compression: compressing deep neural networks with pruning, trained quantization and Huffman coding[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1510.00149.
[10] YE H J, LIU X Y. Research and application of convolutional neural network based on sparse convolution kernel[J]. Information Technology, 2017, 41(10): 5-9. (in Chinese)
[11] WEN W, WU C P, WANG Y D, et al. Learning structured sparsity in deep neural networks[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1608.03665.
[12] CHANGPINYO S, SANDLER M, ZHMOGINOV A. The power of sparsity in convolutional neural networks[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1702.06257.
[13] RANZATO M A, POULTNEY C, CHOPRA S, et al. Efficient learning of sparse representations with an energy-based model[C]//Proceedings of NIPS'07. New York, USA: ACM Press, 2007: 1137-1144.
[14] LEE H, EKANADHAM C, NG A Y. Sparse deep belief net model for visual area V2[C]//Proceedings of the 20th International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2008: 873-880.
[15] RANZATO M A, BOUREAU Y L, LECUN Y. Sparse feature learning for deep belief networks[C]//Proceedings of NIPS'07. New York, USA: ACM Press, 2007: 1185-1192.
[16] LEE H, BATTLE A, RAINA R, et al. Efficient sparse coding algorithms[C]//Proceedings of NIPS'07. New York, USA: ACM Press, 2007: 801-808.
[17] MAO H Z, HAN S, POOL J, et al. Exploring the regularity of sparse structure in convolutional neural networks[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1705.08922.
[18] DAI B, ZHU C, WIPF D. Compressing neural networks using the variational information bottleneck[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1802.10399.
[19] LOUIZOS C, WELLING M, KINGMA D P. Learning sparse neural networks through L0 regularization[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1712.01312.
[20] THEIS L, KORSHUNOVA I, TEJANI A, et al. Faster gaze prediction with dense networks and Fisher pruning[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1801.05787.
[21] HU H Y, PENG R, TAI Y W, et al. Network trimming: a data-driven neuron pruning approach towards efficient deep architectures[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1607.03250v1.
[22] LU H W, XIA H F, YUAN X T. Dynamic network pruning via filter attention mechanism and feature scaling factor[J]. Journal of Chinese Computer Systems, 2019, 40(9): 1832-1838. (in Chinese)
[23] LIU Z, LI J G, SHEN Z Q, et al. Learning efficient convolutional networks through network slimming[C]//Proceedings of 2017 IEEE International Conference on Computer Vision. Washington D.C., USA: IEEE Press, 2017: 2755-2763.
[24] SCARDAPANE S, COMMINIELLO D, HUSSAIN A, et al. Group sparse regularization for deep neural networks[J]. Neurocomputing, 2017, 241: 81-89.
[25] HE K M, ZHANG X Y, REN S Q, et al. Identity mappings in deep residual networks[C]//Proceedings of European Conference on Computer Vision. Berlin, Germany: Springer, 2016: 630-645.
[26] HUANG G, LIU Z, VAN DER MAATEN L, et al. Densely connected convolutional networks[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2017: 2261-2269.
[27] LIU Z, SUN M J, ZHOU T H, et al. Rethinking the value of network pruning[EB/OL]. [2020-07-14]. https://arxiv.org/abs/1810.05270v2.