Please download and preprocess the point cloud datasets according to the dataset guidance.
- 3DTrans supports autonomous-driving pre-training using PointContrast.
- We are exploring effective pre-training solutions based on the ONCE dataset; if you are interested in this topic, do not hesitate to contact me ([email protected]).
3DTrans provides detailed network-structure information for each module. For pre-training, we leverage multi-source domains with significant data-level differences to perform the point-cloud pre-training task.
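As background, PointContrast trains the backbone by optimizing a PointInfoNCE objective over matched point features from two augmented views of the same scene. A minimal sketch of that loss is below; the function name and tensor layout are illustrative assumptions, not the exact 3DTrans implementation:

```python
import torch
import torch.nn.functional as F

def pointcontrast_info_nce(feats_a, feats_b, temperature=0.07):
    """PointInfoNCE loss over matched point features from two views.

    feats_a, feats_b: (N, C) features of N corresponding points, where row i
    of feats_a matches row i of feats_b (positives); all mismatched rows in
    the same batch act as negatives. (Illustrative sketch, not 3DTrans code.)
    """
    feats_a = F.normalize(feats_a, dim=1)
    feats_b = F.normalize(feats_b, dim=1)
    # (N, N) cosine-similarity matrix scaled by the temperature
    logits = feats_a @ feats_b.t() / temperature
    # The correct match for row i is column i
    targets = torch.arange(feats_a.shape[0], device=feats_a.device)
    return F.cross_entropy(logits, targets)
```

With identical features for the two views, the diagonal dominates the similarity matrix and the loss approaches zero; mismatched features yield a loss near `log(N)`.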
- a) Train the PV-RCNN++ backbone with PointContrast using multiple GPUs:
```shell
sh scripts/PRETRAIN/dist_train_pointcontrast.sh ${NUM_GPUs} \
--cfg_file ./cfgs/once_models/unsupervised_model/pointcontrast_pvrcnn_res_plus_backbone.yaml \
--batch_size 4 \
--epochs 30
```
or
- a) Train the PV-RCNN++ backbone with PointContrast using multiple machines:
```shell
sh scripts/PRETRAIN/slurm_train_pointcontrast.sh ${PARTITION} ${JOB_NAME} ${NUM_NODES} \
--cfg_file ./cfgs/once_models/unsupervised_model/pointcontrast_pvrcnn_res_plus_backbone.yaml \
--batch_size 4 \
--epochs 30
```
- b) Fine-tune PV-RCNN++ on other 3D datasets such as Waymo. Note that you need to set `--pretrained_model ${PRETRAINED_MODEL}` to the checkpoint obtained in the pre-training phase:
```shell
sh scripts/dist_train.sh ${NUM_GPUs} \
--cfg_file ${CONFIG_FILE} \
--batch_size ${BATCH_SIZE} \
--pretrained_model ${PRETRAINED_MODEL}
```
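Conceptually, loading a pre-trained checkpoint for fine-tuning means copying over every backbone parameter whose name and shape match the new detector, while task-specific heads start fresh. A hedged sketch of that initialization step follows; the function name and the `model_state` checkpoint key are assumptions for illustration, not the exact 3DTrans loading logic:

```python
import torch

def load_pretrained_backbone(model, ckpt_path):
    """Initialize a detector from a pre-trained checkpoint.

    Keeps only parameters whose names and shapes match the target model,
    so mismatched detection heads are left at their fresh initialization.
    (Illustrative sketch; the "model_state" key is an assumed layout.)
    """
    ckpt = torch.load(ckpt_path, map_location="cpu")
    pretrained = ckpt.get("model_state", ckpt)
    model_state = model.state_dict()
    matched = {k: v for k, v in pretrained.items()
               if k in model_state and v.shape == model_state[k].shape}
    model_state.update(matched)
    model.load_state_dict(model_state)
    # Return the transferred parameter names for logging/inspection
    return sorted(matched.keys())
```

This mirrors the common "partial state-dict" pattern: parameters missing from the checkpoint (or with incompatible shapes) are simply skipped rather than raising an error.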
💪 💪 We are actively exploring how to boost the generalization ability of 3D pre-training. The corresponding code is coming soon in 3DTrans-v0.2.0.