[ECCV'20] A Generic Graph-based Neural Architecture Encoding Scheme for Predictor-based NAS

Figure: The predictor-based NAS workflow and the two designs in GATES.

Abstract

This work proposes a novel Graph-based neural ArchiTecture Encoding Scheme, a.k.a. GATES, to improve predictor-based neural architecture search. Specifically, unlike existing graph-based schemes, GATES models the operations as transformations of the propagating information, which mimics the actual data processing of a neural architecture. GATES thus provides a more reasonable modeling of neural architectures, and can consistently encode architectures from both the "operation on node" and "operation on edge" cell search spaces. Experimental results on various search spaces confirm GATES's effectiveness in improving the performance predictor. Furthermore, equipped with the improved performance predictor, the sample efficiency of the predictor-based neural architecture search (NAS) flow is boosted.
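The core idea, modeling each operation as a transformation applied to information propagating through the cell graph, can be illustrated with a toy NumPy sketch. Everything here is an assumption for illustration: the gating form, the parameter names (`op_emb`, `W_gate`), and the edge-list format are hypothetical simplifications, not the paper's actual GATES implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "operation on edge" cell: each operation type has an embedding,
# and an illustrative linear map turns it into a soft gate that
# transforms the information flowing along the edge (an assumption,
# simplified from the paper's formulation).
EMB_DIM = 8
NUM_OPS = 4  # e.g. {skip, conv3x3, conv5x5, maxpool}
op_emb = rng.normal(size=(NUM_OPS, EMB_DIM))
W_gate = rng.normal(size=(EMB_DIM, EMB_DIM))  # illustrative parameters

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode_cell(edges, num_nodes):
    """edges: list of (src, dst, op_id) with src < dst (a DAG).
    Returns the cell encoding: the information at the output node."""
    info = np.zeros((num_nodes, EMB_DIM))
    info[0] = np.ones(EMB_DIM)  # input node carries fixed "virtual" information
    for src, dst, op in sorted(edges, key=lambda e: e[1]):  # topological order
        gate = sigmoid(op_emb[op] @ W_gate)  # operation acts as a transform
        info[dst] += gate * info[src]        # gated propagation, summed at dst
    return info[-1]

# Two cells differing in a single operation get distinct encodings,
# mimicking how the actual computation of the architecture would differ.
arch_a = [(0, 1, 1), (0, 2, 2), (1, 2, 0)]
arch_b = [(0, 1, 1), (0, 2, 3), (1, 2, 0)]
emb_a = encode_cell(arch_a, num_nodes=3)
emb_b = encode_cell(arch_b, num_nodes=3)
```

In the actual method these parameters are trained so that the encoding predicts architecture performance; the sketch only shows why a transformation-based encoding distinguishes architectures that a plain operation-as-node-feature encoding may conflate.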

Publication
In ECCV'20
Xuefei Ning
Research Assistant Professor at Tsinghua University

My primary research interests are neural architecture search and efficient deep learning.