Fine-grained neural architecture search for image super-resolution
|
Affiliation: | 1. School of Electronic and Information Engineering, Anhui Jianzhu University, Hefei 230601, China; 2. School of Mathematics, Hefei University of Technology, Hefei 230009, China
| |
Abstract: | Designing efficient deep neural networks has attracted great interest in image super-resolution (SR). However, exploring diverse network structures is computationally expensive. More importantly, each layer in a network plays a distinct role, which calls for a specialized structure per layer. In this work, we present a novel neural architecture search (NAS) algorithm that efficiently explores layer-wise structures. Specifically, we construct a supernet that allows flexibility in choosing the number of channels and the per-channel activation functions according to the role of each layer. The search runs efficiently via channel pruning, since gradient descent jointly optimizes the Mult-Adds and the accuracy of the searched models. We enable differentiable estimation of the model's Mult-Adds by using relaxations in the backward pass. The searched model, named FGNAS, outperforms state-of-the-art NAS-based SR methods by a large margin.
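The differentiable Mult-Adds estimation described above can be illustrated with a minimal sketch. The idea is that if each channel of a convolutional layer is kept with a soft probability (a continuous relaxation of a hard keep/prune gate), the expected Mult-Adds of the layer become a smooth function of those probabilities and can be optimized by gradient descent alongside accuracy. The function and parameter names below are hypothetical illustrations, not the actual FGNAS implementation:

```python
def conv_mult_adds(h, w, k, in_probs, out_probs):
    """Expected Mult-Adds of a k x k convolution on an h x w feature map.

    in_probs / out_probs are soft keep-probabilities per input / output
    channel (hypothetical relaxation of hard channel-pruning gates).
    With hard 0/1 gates this recovers the exact Mult-Adds count; with
    soft probabilities it is differentiable in each probability.
    """
    exp_in = sum(in_probs)    # expected number of active input channels
    exp_out = sum(out_probs)  # expected number of active output channels
    return h * w * k * k * exp_in * exp_out

# Hard gates: a 3x3 conv, 16 -> 32 channels on a 32x32 map.
full_cost = conv_mult_adds(32, 32, 3, [1.0] * 16, [1.0] * 32)
# Relaxed gates: halving every output-channel probability halves the cost.
relaxed_cost = conv_mult_adds(32, 32, 3, [1.0] * 16, [0.5] * 32)
```

In a search framework, a regularizer on this expected cost would push the channel probabilities toward zero, and channels whose probabilities fall below a threshold would be pruned, which is consistent with the channel-pruning view of the search described in the abstract.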
| |
Keywords: | Image super-resolution; Neural architecture search; Convolutional neural network
This article is indexed in ScienceDirect and other databases.
|