torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and support benchmarking.
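torchdistill itself is configuration-driven, so using it requires no code; for orientation, here is a minimal PyTorch sketch of the classic Hinton-style distillation loss that many implemented methods build on. This is illustrative only, not torchdistill's API, and all function and parameter names here are hypothetical:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    """Hinton-style knowledge distillation: blend hard-label cross-entropy
    with a temperature-scaled KL term against the teacher's predictions."""
    # Soft targets: KL divergence between softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: standard cross-entropy on ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# Example usage with random tensors
s = torch.randn(8, 10)          # student logits
t = torch.randn(8, 10)          # teacher logits
y = torch.randint(0, 10, (8,))  # ground-truth labels
print(kd_loss(s, t, y))
```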
sc2-benchmark
[TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
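Split computing partitions a model between a mobile device and a server, and supervised compression trains the on-device half to emit a compact, task-relevant representation. Below is a minimal sketch of such a bottleneck-injected split model; it is not the sc2-benchmark API, and the layer shapes are arbitrary assumptions:

```python
import torch
import torch.nn as nn

class BottleneckSplitModel(nn.Module):
    """Encoder runs on-device, producing a narrow 'bottleneck' feature
    that is cheap to transmit; the decoder/head runs server-side."""
    def __init__(self, bottleneck_channels=12, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(              # on-device half
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, bottleneck_channels, 3, stride=2, padding=1),
        )
        self.decoder = nn.Sequential(              # server-side half
            nn.Conv2d(bottleneck_channels, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        z = self.encoder(x)   # this tensor crosses the network link
        return self.decoder(z)

model = BottleneckSplitModel()
print(model(torch.randn(1, 3, 64, 64)).shape)  # torch.Size([1, 10])
```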
feddistill
Code to reproduce the experiments of the ICLR 2025 paper "On the Byzantine-Resilience of Distillation-Based Federated Learning"
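In distillation-based federated learning, clients exchange predictions on a shared public dataset rather than model weights, so resilience hinges on how those predictions are aggregated. As a hedged illustration of why the aggregation rule matters, the sketch below compares a mean aggregator with a coordinate-wise median, a standard robust rule that is not necessarily the defense analyzed in the paper:

```python
import torch

def aggregate_logits(client_logits, rule="median"):
    """Aggregate per-client logits on a shared public batch.
    client_logits: tensor of shape (num_clients, batch, num_classes).
    A coordinate-wise median tolerates a minority of Byzantine
    clients sending arbitrary predictions; the mean does not."""
    if rule == "mean":
        return client_logits.mean(dim=0)
    return client_logits.median(dim=0).values  # per-coordinate median

# Nine honest clients agree; one Byzantine client sends garbage.
honest = torch.randn(1, 4, 10).repeat(9, 1, 1)
byzantine = 1e3 * torch.randn(1, 4, 10)
logits = torch.cat([honest, byzantine], dim=0)
print(aggregate_logits(logits, "median")[0, :3])  # near the honest consensus
print(aggregate_logits(logits, "mean")[0, :3])    # pulled toward the outlier
```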
pgd (https://github.com/chenhongyiyang/pgd)
[ECCV 2022] Prediction-Guided Distillation for Dense Object Detection
icsfsurvey
Explore concepts like Self-Correct, Self-Refine, Self-Improve, Self-Contradict, Self-Play, and Self-Knowledge, alongside o1-like reasoning elevation🍓 and hallucination alleviation🍄.
mtcm_kd
Multi-teacher cross-modal knowledge distillation for unimodal brain tumor segmentation
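In this setting, teachers trained on different MRI modalities supervise a student that sees only one. A common formulation averages temperature-scaled KL terms over the teachers; the sketch below assumes simple logit averaging, which may differ from the repository's actual method:

```python
import torch
import torch.nn.functional as F

def multi_teacher_kd(student_logits, teacher_logits_list, temperature=2.0):
    """Average temperature-scaled KL terms against several teachers,
    e.g. one teacher per MRI modality for a single-modality student."""
    log_p = F.log_softmax(student_logits / temperature, dim=1)
    loss = 0.0
    for t_logits in teacher_logits_list:
        q = F.softmax(t_logits / temperature, dim=1)
        loss = loss + F.kl_div(log_p, q, reduction="batchmean")
    return loss / len(teacher_logits_list) * temperature ** 2

# Voxel-wise segmentation logits: (batch, classes, D, H, W)
s = torch.randn(2, 4, 8, 8, 8)
teachers = [torch.randn(2, 4, 8, 8, 8) for _ in range(3)]
print(multi_teacher_kd(s, teachers))
```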