A clean-code version of YOLOv5 (v6) pruning.
The original code comes from https://github.com/midasklr/yolov5prune.
-
Basic training
- On the COCO dataset:
python train.py --data data/coco.yaml --cfg models/yolov5s.yaml --weights '' --batch-size 32 --device 0 --epochs 300 --name coco --optimizer AdamW
-
Sparse training
- On the COCO dataset:
python train.py --batch 32 --epochs 50 --weights weights/yolov5s.pt --data data/coco.yaml --cfg models/yolov5s.yaml --name coco_sparsity --optimizer AdamW --bn_sparsity --sparsity_rate 0.00005 --device 0
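With --bn_sparsity, training presumably adds an L1 penalty on the BatchNorm scale factors (gamma), network-slimming style, scaled by --sparsity_rate. A minimal sketch of that idea in plain PyTorch, assuming a standard training loop (the function name and hook point are illustrative, not the repo's actual API):

```python
import torch
import torch.nn as nn

def add_bn_l1_subgradient(model: nn.Module, sparsity_rate: float = 5e-5) -> None:
    """Push BN scale factors (gamma) toward zero with an L1 sub-gradient.

    Call between loss.backward() and optimizer.step(); sparsity_rate plays
    the role of the --sparsity_rate flag (an assumption about the repo).
    """
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d) and m.weight.grad is not None:
            # d/d(gamma) of sparsity_rate * |gamma| = sparsity_rate * sign(gamma)
            m.weight.grad.add_(sparsity_rate * torch.sign(m.weight.data))

# Usage inside the training loop (sketch):
#   loss.backward()
#   add_bn_l1_subgradient(model, sparsity_rate=0.00005)
#   optimizer.step()
```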
-
Pruning
- On the COCO dataset:
python prune.py --percent 0.5 --weights runs/train/coco_sparsity13/weights/last.pt --data data/coco.yaml --cfg models/yolov5s.yaml --imgsz 640
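The --percent flag is most naturally read as the fraction of channels to prune, judged by the magnitude of each channel's BN gamma after sparse training. A hedged sketch of how a global threshold and the surviving-channel masks are typically computed (function names are illustrative, not necessarily what prune.py does internally):

```python
import torch
import torch.nn as nn

def global_bn_threshold(model: nn.Module, percent: float = 0.5) -> float:
    """Gamma magnitude below which `percent` of all BN channels fall."""
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    sorted_gammas, _ = torch.sort(gammas)
    index = min(int(gammas.numel() * percent), gammas.numel() - 1)
    return sorted_gammas[index].item()

def surviving_channel_masks(model: nn.Module, threshold: float) -> dict:
    """Boolean mask per BN layer: True for channels kept after pruning."""
    return {name: m.weight.data.abs() > threshold
            for name, m in model.named_modules()
            if isinstance(m, nn.BatchNorm2d)}
```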
-
Fine-tuning
- On the COCO dataset:
python train.py --img 640 --batch 32 --epochs 100 --weights runs/val/exp1/pruned_model.pt --data data/coco.yaml --cfg models/yolov5s.yaml --name coco_ft --device 0 --optimizer AdamW --ft_pruned_model --hyp hyp.finetune_prune.yaml
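Before fine-tuning, it can be worth checking that the pruned checkpoint really is smaller. Assuming prune.py writes a standard YOLOv5-style checkpoint with a 'model' entry (an assumption, as is the exact output path), a quick sanity check looks like:

```python
import torch

# Load the pruned checkpoint and report its parameter count.
# The 'model' key and the path below are assumptions, not verified against prune.py.
ckpt = torch.load('runs/val/exp1/pruned_model.pt', map_location='cpu')
model = ckpt['model'].float()
n_params = sum(p.numel() for p in model.parameters())
print(f'pruned model parameters: {n_params / 1e6:.2f}M')
```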
-
Results on the COCO dataset
| exp_name | model | optimizer | epochs | lr | sparsity rate | mAP@0.5 | note | prune threshold | weights |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| coco | yolov5s | AdamW | 100 | 0.01 | - | 0.5402 | - | - | - |
| coco2 | yolov5s | AdamW | 300 | 0.01 | - | 0.5534 | - | - | last.pt |
| coco_sparsity | yolov5s | AdamW | 50 | 0.0032 | 0.0001 | 0.4826 | resume official SGD | 0.54 | - |
| coco_sparsity2 | yolov5s | AdamW | 50 | 0.0032 | 0.00005 | 0.50354 | resume official SGD | 0.48 | - |
| coco_sparsity3 | yolov5s | AdamW | 50 | 0.0032 | 0.0005 | 0.39514 | resume official SGD | 0.576 | - |
| coco_sparsity4 | yolov5s | AdamW | 50 | 0.0032 | 0.001 | 0.34889 | resume official SGD | 0.576 | - |
| coco_sparsity5 | yolov5s | AdamW | 50 | 0.0032 | 0.00001 | 0.52948 | resume official SGD | 0.579 | - |
| coco_sparsity6 | yolov5s | AdamW | 50 | 0.01 | 0.0005 | 0.51202 | resume coco | 0.564 | - |
| coco_sparsity10 | yolov5s | AdamW | 50 | 0.01 | 0.001 | 0.49504 | resume coco2 | 0.6 | - |
| coco_sparsity11 | yolov5s | AdamW | 50 | 0.01 | 0.0005 | 0.52609 | resume coco2 | 0.6 | - |
| coco_sparsity13 | yolov5s | AdamW | 100 | 0.01 | 0.0005 | 0.533 | resume coco2 | 0.55 | last.pt |
| coco_sparsity14 | yolov5s | AdamW | 50 | 0.01 | 0.0007 | 0.515 | resume coco2 | 0.61 | - |
| coco_sparsity15 | yolov5s | AdamW | 100 | 0.01 | 0.001 | 0.501 | resume coco2 | 0.54 | - |
-
Pruning results for the coco_sparsity13 model
| coco_sparsity13 | mAP@0.5 | Params / FLOPs |
| --- | --- | --- |
| origin | 0.537 | 7.2M / 16.5G |
| after 10% prune | 0.5327 | 6.2M / 15.6G |
| after 20% prune | 0.5327 | 5.4M / 14.7G |
| after 30% prune | 0.5324 | 4.4M / 13.8G |
| after 33% prune | 0.5281 | 4.2M / 13.6G |
| after 34% prune | 0.5243 | 4.18M / 13.5G |
| after 34.5% prune | 0.5203 | 4.14M / 13.5G |
| after 35% prune | 0.2548 | 4.1M / 13.4G |
| after 38% prune | 0.2018 | 3.88M / 13.0G |
| after 40% prune | 0.1622 | 3.7M / 12.7G |
| after 42% prune | 0.1194 | 3.6M / 12.4G |
| after 45% prune | 0.0537 | 3.4M / 12.0G |
| after 50% prune | 0.0032 | 3.1M / 11.4G |
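A table like the one above can be reproduced by sweeping prune.py over several ratios and then evaluating or fine-tuning each pruned model. A rough sketch using only the flags shown earlier (collecting the mAP numbers is left out):

```python
import subprocess

# Sweep pruning ratios on the coco_sparsity13 checkpoint; each run writes
# a pruned model that can then be validated or fine-tuned as shown above.
for percent in (0.1, 0.2, 0.3, 0.33, 0.34, 0.35, 0.4, 0.45, 0.5):
    subprocess.run([
        'python', 'prune.py',
        '--percent', str(percent),
        '--weights', 'runs/train/coco_sparsity13/weights/last.pt',
        '--data', 'data/coco.yaml',
        '--cfg', 'models/yolov5s.yaml',
        '--imgsz', '640',
    ], check=True)
```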