This repository is the official implementation of Supernet Shifting, proposed in
"Boosting Order-Preserving and Transferability for Neural Architecture Search: A Joint Architecture Refined Search and Fine-tuning Approach" (accepted by CVPR 2024)
Beichen Zhang, Xiaoxing Wang, Xiaohan Qin, Junchi Yan
Download the ImageNet dataset and arrange the images into labeled folders. Also download the FLOPs table used for FLOPs calculation; it was proposed by SPOS and can be found in Link. The dataset structure should be:
data
|--- train                 # ImageNet training set
|--- val                   # ImageNet validation set
|--- op_flops_dict.pkl     # FLOPs table
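Before training, it can help to verify the layout above is in place. A minimal sketch (the helper name `check_dataset_layout` is ours, not part of this repo):

```python
import os

def check_dataset_layout(root):
    """Return the expected entries that are missing under `root`.

    Expected layout (from this README):
        root/train              ImageNet training set, labeled folders
        root/val                ImageNet validation set, labeled folders
        root/op_flops_dict.pkl  FLOPs table from SPOS
    """
    expected = ["train", "val", "op_flops_dict.pkl"]
    return [name for name in expected
            if not os.path.exists(os.path.join(root, name))]
```

An empty return value means the data directory is complete.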
Train the supernet with the following command:
cd supernet
python3 train.py --train-dir $YOUR_TRAINDATASET_PATH --val-dir $YOUR_VALDATASET_PATH
First, change the data root in imagenet_dataset.py. Then apply supernet shifting and architecture search with the following commands:
cd search
python3 search.py
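Conceptually, supernet shifting interleaves the architecture search with brief supernet fine-tuning, so the supernet's ranking of candidates stays consistent (order-preserving) as the search narrows. A toy sketch of that loop, assuming placeholder callbacks; every name here is illustrative, not this repo's API:

```python
import random

def shifting_search(candidates, evaluate, finetune_step, rounds=3, topk=2):
    """Toy sketch of search interleaved with supernet fine-tuning.

    evaluate(arch)      -> score of an architecture under current supernet weights.
    finetune_step(pool) -> briefly fine-tune the supernet on paths sampled from
                           the surviving architectures (the "shifting" step).
    All arguments are illustrative placeholders.
    """
    pool = list(candidates)
    for _ in range(rounds):
        pool.sort(key=evaluate, reverse=True)  # rank under current supernet
        pool = pool[:topk]                     # keep the most promising
        finetune_step(pool)                    # shift supernet toward survivors
        # re-expand the pool with toy mutations of the survivors
        pool = pool + [arch + (random.choice((-1, 1)),) for arch in pool]
    return max(pool, key=evaluate)
```

The actual evolutionary search, evaluation protocol, and fine-tuning schedule live in search.py; this only illustrates the alternation.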
To transfer the supernet weights to a new dataset, first change the data root and the dataloader in imagenet_dataset.py, then run the following commands:
cd search
python3 search.py --new_dataset True --n_class $new_dataset_classes
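When transferring, only the backbone supernet weights carry over; the classification head must be rebuilt for the new number of classes (hence the --n_class flag). A hedged sketch of the weight-filtering idea, assuming a state dict keyed by layer names; the function and prefix below are illustrative, not this repo's code:

```python
def transfer_state_dict(pretrained, model_keys, head_prefix="classifier"):
    """Keep backbone weights, drop the old classification head.

    pretrained:  dict of weights from the ImageNet-trained supernet.
    model_keys:  parameter names of the new model (with a fresh head
                 sized for the new dataset's class count).
    head_prefix: name of the head layers to discard (illustrative;
                 match it to the repo's actual layer names).
    """
    return {k: v for k, v in pretrained.items()
            if k in model_keys and not k.startswith(head_prefix)}
```

Loading the filtered dict with a non-strict load would then initialize the backbone while leaving the new head randomly initialized.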
Retrieve the searched architecture with the following commands:
cd evaluation
python3 eval.py
Finally, train and evaluate the searched architecture with the following commands:
cd evaluation/data/$YOUR_ARCHITECTURE
python3 train.py --train-dir $YOUR_TRAINDATASET_PATH --val-dir $YOUR_VALDATASET_PATH
If you use these models in your research, please cite:
@article{zhang2024boosting,
title={Boosting Order-Preserving and Transferability for Neural Architecture Search: a Joint Architecture Refined Search and Fine-tuning Approach},
author={Beichen Zhang and Xiaoxing Wang and Xiaohan Qin and Junchi Yan},
journal={arXiv preprint arXiv:2403.11380},
year={2024}
}