
[CVPR2025] Dataset Distillation with Neural Characteristic Function: A Minmax Perspective

Official PyTorch implementation of the paper "Dataset Distillation with Neural Characteristic Function" (NCFM) in CVPR 2025.

🔥 News

  • [2025/03/02] The code of our paper has been released.
  • [2025/02/27] Our NCFM paper has been accepted to CVPR 2025 (Rating: 555). Thanks!

🚀 Pipeline

Here's an overview of the process behind our Neural Characteristic Function Matching (NCFM) method:

Figure 1: Overview of the NCFM pipeline.
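As a rough illustration of the idea (not the repository's NCFM implementation; the function names and the simple squared-difference discrepancy below are illustrative assumptions), characteristic function matching compares the empirical characteristic functions of real and synthetic features at a set of sampled frequencies. In the paper's minmax formulation, the synthetic data is optimized to minimize such a discrepancy while a sampling network adapts the frequency distribution to maximize it.

import torch

# Conceptual sketch only: empirical characteristic function matching.
def empirical_cf(feats, freqs):
    # feats: (N, d) features; freqs: (K, d) sampled frequency vectors
    proj = feats @ freqs.t()  # (N, K) inner products <t_k, x_n>
    # real and imaginary parts of E[exp(i <t, x>)]
    return torch.cos(proj).mean(dim=0), torch.sin(proj).mean(dim=0)

def cf_discrepancy(real_feats, syn_feats, freqs):
    re_r, im_r = empirical_cf(real_feats, freqs)
    re_s, im_s = empirical_cf(syn_feats, freqs)
    # squared distance between the two characteristic functions, averaged over frequencies
    return ((re_r - re_s) ** 2 + (im_r - im_s) ** 2).mean()

# Toy usage: the synthetic data descends on the discrepancy; in NCFM the frequency
# distribution itself would be learned adversarially (the "max" in minmax).
real = torch.randn(128, 64)
syn = torch.randn(16, 64, requires_grad=True)
freqs = torch.randn(256, 64)
cf_discrepancy(real, syn, freqs).backward()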

🔍 TODO

We are currently organizing all the code. Stay tuned!

  • Distillation code
  • Evaluation code
  • Sampling network
  • Config files
  • Pretrained models
  • Distilled datasets
  • Continual learning code
  • Project page

🛠️ Getting Started

To get started with NCFM, follow the installation instructions below.

  1. Clone the repo
git clone https://github.com/gszfwsb/NCFM.git
  2. Install dependencies
pip install -r requirements.txt
  3. Pretrain the models yourself, or download the pretrained_models from Google Drive.
cd pretrain
torchrun --nproc_per_node={n_gpus} --nnodes=1 pretrain_script.py --gpu={gpu_ids} --config_path=../config/{ipc}/{dataset}.yaml
  4. Condense
cd condense
torchrun --nproc_per_node={n_gpus} --nnodes=1 condense_script.py --gpu={gpu_ids} --ipc={ipc} --config_path=../config/{ipc}/{dataset}.yaml
  5. Evaluate
cd evaluation
torchrun --nproc_per_node={n_gpus} --nnodes=1 evaluation_script.py --gpu={gpu_ids} --ipc={ipc} --config_path=../config/{ipc}/{dataset}.yaml --load_path={distilled_dataset.pt}
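For reference, a pretraining command with the placeholders filled in might look like the lines below; the 8-GPU setting and the ipc50/cifar10.yaml config are assumptions borrowed from the examples in the next section, so adjust them to your hardware and target dataset.

cd pretrain
torchrun --nproc_per_node=8 --nnodes=1 pretrain_script.py --gpu="0,1,2,3,4,5,6,7" --config_path=../config/ipc50/cifar10.yaml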

📘 Example Usage

  1. CIFAR-10
#ipc50
cd condense
torchrun --nproc_per_node=8 --nnodes=1 --master_port=34153 condense_script.py --gpu="0,1,2,3,4,5,6,7" --ipc=50 --config_path=../config/ipc50/cifar10.yaml
  2. CIFAR-100
#ipc10
cd condense
torchrun --nproc_per_node=8 --nnodes=1 --master_port=34153 condense_script.py --gpu="0,1,2,3,4,5,6,7" --ipc=10 --config_path=../config/ipc10/cifar100.yaml
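
Evaluating the distilled CIFAR-10 set follows the same pattern; the --load_path value below is a hypothetical placeholder for the .pt file produced by the condense step, not a file shipped with the repository.

#ipc50
cd evaluation
torchrun --nproc_per_node=8 --nnodes=1 --master_port=34153 evaluation_script.py --gpu="0,1,2,3,4,5,6,7" --ipc=50 --config_path=../config/ipc50/cifar10.yaml --load_path=/path/to/distilled_dataset.pt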

📮 Contact

If you have any questions, please contact Shaobo Wang ([email protected]).

📌 Citation

If you find NCFM useful for your research and applications, please cite using this BibTeX:

@misc{wang2025datasetdistillationneuralcharacteristic,
      title={Dataset Distillation with Neural Characteristic Function: A Minmax Perspective}, 
      author={Shaobo Wang and Yicun Yang and Zhiyuan Liu and Chenghao Sun and Xuming Hu and Conghui He and Linfeng Zhang},
      year={2025},
      eprint={2502.20653},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2502.20653}, 
}

Acknowledgement

We sincerely thank the developers of the following projects for their valuable contributions and inspiration: MTT, DATM, DC/DM, IDC, SRe2L, RDED, and DANCE.
