Xuefei Ning
Neural Architecture Search
[TPAMI'23] A Generic Graph-based Neural Architecture Encoding Scheme with Multifaceted Information
An extension of GATES (ECCV'20). We incorporate multifaceted information about a neural network's operation-level and architecture-level computing semantics into the construction and training of the encoder.
Xuefei Ning, Yin Zheng, Zixuan Zhou, Tianchen Zhao, Huazhong Yang, Yu Wang
Last updated on May 7, 2024
Cite
Code
Model Compression Towards Efficient Deep Learning Inference
A talk on model compression techniques for efficient deep learning inference.
Last updated on Aug 29, 2023
Slides
Neural Architecture Search and Architecture Encoding
A talk on NAS research, given at Renmin University.
Last updated on Dec 12, 2022
Slides
[ECCV'22] CLOSE: Curriculum Learning On the Sharing Extent Towards Better One-shot NAS
To improve one-shot NAS, we apply curriculum learning to the sharing extent of the one-shot supernet to achieve better ranking correlation.
Zixuan Zhou, Xuefei Ning, Yi Cai, Jiashu Han, Yiping Deng, Yuhan Dong, Huazhong Yang, Yu Wang
Last updated on May 7, 2024
PDF
Cite
Code
Slides
Website
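To give a feel for the curriculum idea, here is a minimal sketch of a schedule that gradually reduces parameter sharing in a supernet as training progresses. The function name, milestones, and group counts are illustrative assumptions, not CLOSE's actual schedule:

```python
def sharing_groups_at(epoch, milestones=(10, 20, 30), groups=(1, 2, 4, 8)):
    """Curriculum on the sharing extent: start with one fully shared
    parameter group (maximal sharing) and split into more groups
    (less sharing) as training progresses.

    NOTE: milestones and group counts are hypothetical examples,
    not the schedule used in the CLOSE paper.
    """
    for milestone, num_groups in zip(milestones, groups):
        if epoch < milestone:
            return num_groups
    return groups[-1]

# Early epochs share everything; later epochs share less.
print(sharing_groups_at(0))    # → 1
print(sharing_groups_at(25))   # → 4
```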
[NeurIPS'21] Evaluating Efficient Performance Estimators of Neural Architectures
We study one-shot performance estimators and 8 types of zero-shot estimators on 5 benchmarks (NAS-Bench-101/201/301, NDS ResNet, and NDS ResNeXt-A).
Xuefei Ning, Changcheng Tang, Wenshuo Li, Zixuan Zhou, Shuang Liang, Huazhong Yang, Yu Wang
Last updated on May 7, 2024
PDF
Cite
Code
Poster
Slides
Website
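Efficient estimators like these are typically judged by the rank correlation between their scores and the architectures' true accuracies. A minimal Kendall's tau implementation sketches the evaluation criterion (a generic helper, not taken from the paper's codebase):

```python
from itertools import combinations

def kendall_tau(scores, accuracies):
    """Kendall's tau-a rank correlation between estimator scores and
    ground-truth accuracies over a set of architectures: the fraction
    of concordant pairs minus the fraction of discordant pairs."""
    assert len(scores) == len(accuracies) and len(scores) > 1
    concordant = discordant = 0
    for (s_i, a_i), (s_j, a_j) in combinations(zip(scores, accuracies), 2):
        prod = (s_i - s_j) * (a_i - a_j)
        if prod > 0:
            concordant += 1
        elif prod < 0:
            discordant += 1
    n_pairs = len(scores) * (len(scores) - 1) // 2
    return (concordant - discordant) / n_pairs

# A perfect estimator orders architectures exactly as their accuracies do.
print(kendall_tau([0.1, 0.4, 0.3], [70.0, 92.0, 85.0]))  # → 1.0
```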
[NeurIPS'22] TA-GATES: An Encoding Scheme for Neural Network Architectures
TA-GATES is an encoding scheme specially designed for neural architectures, considering their distinguishing properties as DAGs with trainable operations. TA-GATES encodes an architecture by mimicking its training process, and thereby provides more discriminative architecture-level and operation-level encodings.
Xuefei Ning, Zixuan Zhou, Junbo Zhao, Tianchen Zhao, Yiping Deng, Changcheng Tang, Shuang Liang, Huazhong Yang, Yu Wang
Last updated on May 7, 2024
PDF
Cite
Code
Slides
Website
[ECCV'20] A Generic Graph-based Neural Architecture Encoding Scheme for Predictor-based NAS
To improve the sample efficiency of NAS, we follow the line of predictor-based NAS and improve the encoder design and the training of the predictor. (1) We design a Generic Graph-based neural ArchiTecture Encoding Scheme (GATES) to better encode NN architectures. (2) We propose to train the predictor with a ranking loss.
Xuefei Ning, Yin Zheng, Tianchen Zhao, Yu Wang, Huazhong Yang
Last updated on May 7, 2024
PDF
Cite
Code
Slides
Website
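The intuition behind ranking-loss training is that NAS only needs the predictor to order architectures correctly, not to regress exact accuracies. A minimal sketch of one common pairwise hinge formulation (function name and margin value are illustrative assumptions, not the paper's exact loss):

```python
def pairwise_ranking_loss(scores, accuracies, margin=0.1):
    """Hinge-style pairwise ranking loss: for every pair where the first
    architecture is truly better, penalize the predictor unless it scores
    the better one higher by at least `margin`.

    NOTE: illustrative sketch; the margin value and exact formulation
    are assumptions, not taken from the GATES paper.
    """
    loss, n_pairs = 0.0, 0
    for i in range(len(scores)):
        for j in range(len(scores)):
            if accuracies[i] > accuracies[j]:
                loss += max(0.0, margin - (scores[i] - scores[j]))
                n_pairs += 1
    return loss / max(n_pairs, 1)

# Correctly ordered scores with a wide gap incur zero loss.
print(pairwise_ranking_loss([1.0, 0.0], [0.9, 0.5]))  # → 0.0
```

Because only score differences matter, the absolute scale of the predictor's output is unconstrained, which is exactly the flexibility ranking-based training exploits.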
[AAAI'23] Dynamic Ensemble of Low-fidelity Experts: Mitigating NAS Cold-Start
To mitigate the cold-start problem of predictor-based NAS, we design an ensemble method to fuse the knowledge from multiple experts trained with low-fidelity architectural information (e.g., complexities, zero-shot metrics).
Junbo Zhao, Xuefei Ning, Enshu Liu, Binxin Ru, Zixuan Zhou, Tianchen Zhao, Chen Chen, Jiajin Zhang, Qingmin Liao, Yu Wang
Last updated on May 7, 2024
PDF
Cite
Code
Website
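The fusion step of such an ensemble can be sketched as a softmax-gated weighted sum over per-expert predictions. In the paper the gate is architecture-dependent and learned; in this simplified sketch the gate logits are simply given, so it only illustrates the fusion arithmetic (names are hypothetical):

```python
import math

def ensemble_score(expert_scores, gate_logits):
    """Fuse per-expert predictions with softmax gating weights.

    NOTE: in the actual method the gating weights come from a learned,
    architecture-dependent gate; here the logits are passed in directly
    as a simplifying assumption.
    """
    # Numerically stable softmax over the gate logits.
    m = max(gate_logits)
    exps = [math.exp(g - m) for g in gate_logits]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of the low-fidelity experts' scores.
    return sum(w * s for w, s in zip(weights, expert_scores))

# Equal logits give equal weights: the fused score is the plain average.
print(ensemble_score([1.0, 3.0], [0.0, 0.0]))  # → 2.0
```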