Updated 6 months ago

torchdistill • Rank 15.0 • Science 59%

A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
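torchdistill itself is configuration-driven rather than code-driven, but the technique these repositories share can be illustrated with the classic Hinton-style distillation loss: KL divergence between temperature-softened teacher and student distributions. The sketch below is a generic, stdlib-only illustration, not torchdistill's actual API; the function names and the temperature value are assumptions for the example.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # Hinton-style KD: KL(teacher || student) on temperature-softened
    # distributions, scaled by T^2 so gradient magnitudes stay comparable
    # to a standard cross-entropy term when the two are combined.
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# A student that exactly matches the teacher incurs zero loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
```

In practice this term is mixed with the ordinary hard-label loss, e.g. `alpha * distillation_loss(...) + (1 - alpha) * cross_entropy(...)`; frameworks like torchdistill let you express such combinations declaratively in a config file.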

Updated 6 months ago

rotated-ld • Science 67%

Rotated Localization Distillation (CVPR 2022, TPAMI 2023)

Updated 6 months ago

fasterai • Science 57%

FasterAI: Prune and Distill your models with FastAI and PyTorch

Updated 6 months ago

feddistill • Science 54%

Code to reproduce the experiments of the ICLR 2025 paper "On the Byzantine-Resilience of Distillation-Based Federated Learning"

Updated 6 months ago

https://github.com/chenhongyiyang/pgd • Science 10%

[ECCV 2022] Prediction-Guided Distillation for Dense Object Detection

Updated 6 months ago

icsfsurvey • Science 54%

Explore concepts like Self-Correct, Self-Refine, Self-Improve, Self-Contradict, Self-Play, and Self-Knowledge, alongside o1-like reasoning elevation🍓 and hallucination alleviation🍄.

Updated 6 months ago

mtcm_kd • Science 54%

Multi-teacher cross-modal knowledge distillation for unimodal brain tumor segmentation