https://github.com/google-research/vmoe
Science Score: 36.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file — found codemeta.json file
- ✓ .zenodo.json file — found .zenodo.json file
- ○ DOI references
- ✓ Academic publication links — links to: arxiv.org
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity — low similarity (10.3%) to scientific vocabulary
Keywords from Contributors
Repository
Basic Info
- Host: GitHub
- Owner: google-research
- License: apache-2.0
- Language: Jupyter Notebook
- Default Branch: main
- Size: 1.77 MB
Statistics
- Stars: 659
- Watchers: 13
- Forks: 53
- Open Issues: 21
- Releases: 0
Metadata Files
README.md
Scaling Vision with Sparse Mixture of Experts
This repository contains the code for training and fine-tuning Sparse MoE models for vision (V-MoE) on ImageNet-21k, reproducing the results presented in the paper:
- Scaling Vision with Sparse Mixture of Experts, by Carlos Riquelme, Joan Puigcerver, Basil Mustafa, Maxim Neumann, Rodolphe Jenatton, André Susano Pinto, Daniel Keysers, and Neil Houlsby.
We will soon provide a Colab analysing one of the models that we have released, as well as "config" files to train models from scratch and fine-tune checkpoints. Stay tuned.
We also provide checkpoints, a notebook, and a config for Efficient Ensemble of Experts (E3), presented in the paper:
- Sparse MoEs meet Efficient Ensembles, by James Urquhart Allingham, Florian Wenzel, Zelda E Mariet, Basil Mustafa, Joan Puigcerver, Neil Houlsby, Ghassen Jerfel, Vincent Fortuin, Balaji Lakshminarayanan, Jasper Snoek, Dustin Tran, Carlos Riquelme Ruiz, and Rodolphe Jenatton.
Installation
Simply clone this repository.
The file requirements.txt lists the dependencies, which can be installed
via PyPI. However, we recommend installing jax, flax and optax
directly from GitHub, since we use some of the latest features that are not part
of any release yet.
In addition, you also have to clone the Vision Transformer repository, since we use some parts of it.
If you want to use RandAugment to train models (which we recommend if you train
on ImageNet-21k or ILSVRC2012 from scratch), you must also clone the
Cloud TPU repository, and name it
cloud_tpu.
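The steps above can be sketched as a short setup script. This is a minimal sketch, not the README's own instructions: the exact repository URLs for jax, flax, optax, the Vision Transformer, and the Cloud TPU repository are assumptions based on their well-known GitHub locations.

```shell
# Clone this repository and install the PyPI requirements.
git clone https://github.com/google-research/vmoe.git
cd vmoe
pip install -r requirements.txt

# Install jax, flax and optax from GitHub HEAD, since some of the
# features used here are not part of any release yet.
pip install git+https://github.com/google/jax.git
pip install git+https://github.com/google/flax.git
pip install git+https://github.com/google-deepmind/optax.git

# Clone the Vision Transformer repository, since parts of it are used.
git clone https://github.com/google-research/vision_transformer.git

# Optional, for RandAugment training: clone the Cloud TPU repository
# and name the directory cloud_tpu.
git clone https://github.com/tensorflow/tpu.git cloud_tpu
```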
Checkpoints
We release the checkpoints containing the weights of some models that we trained
on ImageNet (either ILSVRC2012 or ImageNet-21k). All checkpoints contain an
index file (with .index extension) and one or multiple data files (
with extension .data-nnnnn-of-NNNNN, called shards). In the following
list, we indicate only the prefix of each checkpoint.
We recommend using gsutil to
obtain the full list of files, download them, etc.
- V-MoE S/32, 8 experts on the last two odd blocks, trained from scratch on
  ILSVRC2012 with RandAugment for 300 epochs:
  gs://vmoe_checkpoints/vmoe_s32_last2_ilsvrc2012_randaug_light1
  - Fine-tuned on ILSVRC2012 with a resolution of 384 pixels:
    gs://vmoe_checkpoints/vmoe_s32_last2_ilsvrc2012_randaug_light1_ft_ilsvrc2012
- V-MoE S/32, 8 experts on the last two odd blocks, trained from scratch on
  ILSVRC2012 with RandAugment for 1000 epochs:
  gs://vmoe_checkpoints/vmoe_s32_last2_ilsvrc2012_randaug_medium
- V-MoE B/16, 8 experts on every odd block, trained from scratch on ImageNet-21k
  with RandAugment:
  gs://vmoe_checkpoints/vmoe_b16_imagenet21k_randaug_strong
  - Fine-tuned on ILSVRC2012 with a resolution of 384 pixels:
    gs://vmoe_checkpoints/vmoe_b16_imagenet21k_randaug_strong_ft_ilsvrc2012
- E3 S/32, 8 experts on the last two odd blocks, with two ensemble members
  (i.e., the 8 experts are partitioned into two groups), trained from scratch on
  ILSVRC2012 with RandAugment for 300 epochs:
  gs://vmoe_checkpoints/eee_s32_last2_ilsvrc2012
  - Fine-tuned on CIFAR100:
    gs://vmoe_checkpoints/eee_s32_last2_ilsvrc2012_ft_cifar100
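Since each prefix above covers an index file plus one or more data shards, listing and downloading a checkpoint with gsutil might look like the following sketch (the bucket path is one of the prefixes above; the local target directory is an arbitrary choice):

```shell
# List every file under one checkpoint prefix: the .index file and the
# .data-nnnnn-of-NNNNN shards.
gsutil ls "gs://vmoe_checkpoints/vmoe_s32_last2_ilsvrc2012_randaug_light1*"

# Download all matching files into a local directory; -m parallelises
# the copy across shards.
mkdir -p checkpoints
gsutil -m cp \
  "gs://vmoe_checkpoints/vmoe_s32_last2_ilsvrc2012_randaug_light1*" \
  checkpoints/
```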
Disclaimers
This is not an officially supported Google product.
Owner
- Name: Google Research
- Login: google-research
- Kind: organization
- Location: Earth
- Website: https://research.google
- Repositories: 226
- Profile: https://github.com/google-research
GitHub Events
Total
- Issues event: 1
- Watch event: 86
- Delete event: 10
- Issue comment event: 4
- Push event: 43
- Pull request event: 30
- Fork event: 6
- Create event: 14
Last Year
- Issues event: 1
- Watch event: 86
- Delete event: 10
- Issue comment event: 3
- Push event: 43
- Pull request event: 29
- Fork event: 6
- Create event: 14
Committers
Last synced: 11 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Joan Puigcerver | j****r@g****m | 98 |
| Peter Hawkins | p****s@g****m | 15 |
| Yash Katariya | y****a@g****m | 11 |
| Jake VanderPlas | v****s@g****m | 8 |
| V-MoE Authors | n****y@g****m | 6 |
| Colin Gaffney | c****y@g****m | 5 |
| Tianlin Liu | t****u@g****m | 5 |
| Marcus Chiam | m****m@g****m | 4 |
| Rebecca Chen | r****n@g****m | 4 |
| Sergei Lebedev | s****v@g****m | 4 |
| Carlos Riquelme | r****l@g****m | 2 |
| Parker Schuh | p****s@g****m | 2 |
| Ayush Dubey | a****d@g****m | 1 |
| Andreas Steiner | a****n@g****m | 1 |
| Brennan Saeta | s****a@g****m | 1 |
| Dan Foreman-Mackey | d****m@g****m | 1 |
| Daniel Keysers | k****s@g****m | 1 |
| Iurii Kemaev | i****v@g****m | 1 |
| Jacob Burnim | j****m@g****m | 1 |
| Jake Harmon | j****n@g****m | 1 |
| Marvin Ritter | m****r@g****m | 1 |
| SE Gyges | s****s@g****m | 1 |
| Zac Mustin | z****n@g****m | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 8 months ago
All Time
- Total issues: 15
- Total pull requests: 157
- Average time to close issues: 5 months
- Average time to close pull requests: 10 days
- Total issue authors: 13
- Total pull request authors: 4
- Average comments per issue: 1.73
- Average comments per pull request: 0.07
- Merged pull requests: 56
- Bot issues: 0
- Bot pull requests: 153
Past Year
- Issues: 2
- Pull requests: 28
- Average time to close issues: N/A
- Average time to close pull requests: 20 days
- Issue authors: 2
- Pull request authors: 2
- Average comments per issue: 0.0
- Average comments per pull request: 0.18
- Merged pull requests: 20
- Bot issues: 0
- Bot pull requests: 27
Top Authors
Issue Authors
- t5862755 (2)
- nowazrabbani (2)
- DoctorLiQ (1)
- BDHU (1)
- daixiangzi (1)
- dbsi-pinkman (1)
- sorobedio (1)
- kaikai23 (1)
- wangning7149 (1)
- firestonelib (1)
- nathanlmz (1)
- seliayeu (1)
- guoyan223 (1)
Pull Request Authors
- copybara-service[bot] (170)
- segyges (2)
- rcampbell95 (2)
- guspan-tanadi (1)