Updated 6 months ago

instances • Rank 33.1 • Science 77%

The largest collection of PyTorch image encoders / backbones. Includes training, evaluation, inference, and export scripts, plus pretrained weights for ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNetV4, MobileNet-V3 & V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more.

Updated 6 months ago

hivemind • Rank 23.0 • Science 77%

Decentralized deep learning in PyTorch. Built to train models across thousands of volunteer machines around the world.

Updated 4 months ago

https://github.com/deeprec-ai/deeprec • Rank 14.9 • Science 23%

DeepRec is a high-performance recommendation deep learning framework based on TensorFlow. It is in incubation at the LF AI & Data Foundation.

Updated 6 months ago

relora • Rank 6.1 • Science 26%

Official code for ReLoRA from the paper "Stack More Layers Differently: High-Rank Training Through Low-Rank Updates"
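The core idea behind ReLoRA is that a sequence of low-rank updates, each merged into the dense weights before the next begins, can accumulate into a high-rank change. A minimal NumPy sketch of that merge-and-reinitialize step is below; the function name, shapes, and the fake "training" update are illustrative assumptions, not the repo's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

def relora_merge(W, A, B):
    """Fold the low-rank delta B @ A into the dense weight W, then
    reset the factors so the next training segment starts from a
    zero delta. (Illustrative sketch, not the repo's API.)"""
    W = W + B @ A                              # merge rank-r update into W
    A = rng.normal(scale=0.01, size=A.shape)   # reinitialize down-projection
    B = np.zeros_like(B)                       # zero up-projection: delta restarts at 0
    return W, A, B

# Toy run: three segments of rank-2 updates on an 8x8 weight.
d, r = 8, 2
W = rng.normal(size=(d, d))
A = rng.normal(scale=0.01, size=(r, d))
B = np.zeros((d, r))
for _ in range(3):
    # A real run would train A and B here; we substitute a random update.
    B = B + rng.normal(size=B.shape)
    W, A, B = relora_merge(W, A, B)
```

Each segment contributes at most rank r, but because the deltas are merged into W between segments, the cumulative update to W can have rank well above r.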

Updated 5 months ago

https://github.com/bytedance/byteps • Rank 16.0 • Science 10%

A high-performance, generic framework for distributed DNN training.

Updated 5 months ago

https://github.com/cornell-zhang/hoga • Science 23%

Hop-Wise Graph Attention for Scalable and Generalizable Learning on Circuits

Updated 5 months ago

https://github.com/awslabs/dynamic-training-with-apache-mxnet-on-aws • Science 10%

Dynamic training with Apache MXNet reduces the cost and time of training deep neural networks by leveraging AWS cloud elasticity and scale: the system dynamically resizes the training cluster during training, with minimal impact on model accuracy.