py-torchmeta

A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch

https://github.com/tristandeleu/pytorch-meta

Science Score: 38.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
    3 of 12 committers (25.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (9.0%) to scientific vocabulary

Keywords

few-shot-learning meta-learning pytorch
Last synced: 6 months ago

Repository

A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch

Basic Info
Statistics
  • Stars: 2,034
  • Watchers: 41
  • Forks: 261
  • Open Issues: 60
  • Releases: 0
Topics
few-shot-learning meta-learning pytorch
Created about 7 years ago · Last pushed over 2 years ago
Metadata Files
Readme License Citation

README.md

Torchmeta


A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch. Torchmeta contains popular meta-learning benchmarks, fully compatible with both torchvision and PyTorch's DataLoader.

Features

  • A unified interface for both few-shot classification and regression problems, to allow easy benchmarking on multiple problems and reproducibility.
  • Helper functions for some popular problems, with default arguments from the literature.
  • A thin extension of PyTorch's Module, called MetaModule, that simplifies the creation of certain meta-learning models (e.g. gradient-based meta-learning methods). See the MAML example for a full example using MetaModule, or the minimal sketch below.
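
To make the MetaModule idea concrete, here is a minimal sketch in the spirit of the repository's MAML example. It uses the MetaModule, MetaSequential, and MetaLinear classes from torchmeta.modules and the get_subdict helper; the class and variable names below are illustrative, not part of the library.

```python
import torch.nn as nn
from torchmeta.modules import MetaModule, MetaSequential, MetaLinear

class LinearMetaClassifier(MetaModule):
    """A tiny MetaModule: forward() optionally takes a dict of "fast" weights."""

    def __init__(self, in_features, num_ways, hidden_size=64):
        super().__init__()
        self.features = MetaSequential(
            MetaLinear(in_features, hidden_size),
            nn.ReLU(),
        )
        self.classifier = MetaLinear(hidden_size, num_ways)

    def forward(self, inputs, params=None):
        # With params=None this behaves like a regular nn.Module; with a
        # dict of adapted parameters (e.g. after an inner gradient step),
        # those are used in place of the stored ones.
        features = self.features(inputs, params=self.get_subdict(params, 'features'))
        return self.classifier(features, params=self.get_subdict(params, 'classifier'))
```

In the MAML example, the adapted parameters are produced with torchmeta.utils.gradient_based.gradient_update_parameters and passed back through forward(params=...).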

Datasets available

Installation

You can install Torchmeta either using Python's package manager pip, or from source. To avoid any conflict with your existing Python setup, it is suggested to work in a virtual environment with virtualenv. To install virtualenv:

```bash
pip install --upgrade virtualenv
virtualenv venv
source venv/bin/activate
```

Requirements

  • Python 3.6 or above
  • PyTorch 1.4 or above
  • Torchvision 0.5 or above
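
If an installation misbehaves, a quick way to confirm the environment meets these requirements (and the upper bounds pinned in setup.py, listed under Dependencies below) is a check along these lines:

```python
import sys
import torch
import torchvision

# Lower bounds from this README; setup.py additionally pins
# torch < 1.10.0 and torchvision < 0.11.0 (see Dependencies below).
print("Python >= 3.6:", sys.version_info >= (3, 6))
print("torch:", torch.__version__, "(needs >= 1.4.0)")
print("torchvision:", torchvision.__version__, "(needs >= 0.5.0)")
```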

Using pip

This is the recommended way to install Torchmeta:

```bash
pip install torchmeta
```

From source

You can also install Torchmeta from source. This is recommended if you want to contribute to Torchmeta.

```bash
git clone https://github.com/tristandeleu/pytorch-meta.git
cd pytorch-meta
python setup.py install
```

Example

Minimal example

The minimal example below shows how to create a dataloader for the 5-shot 5-way Omniglot dataset with Torchmeta. The dataloader loads a batch of randomly generated tasks, and all the samples are concatenated into a single tensor. For more examples, check the examples folder.

```python
from torchmeta.datasets.helpers import omniglot
from torchmeta.utils.data import BatchMetaDataLoader

dataset = omniglot("data", ways=5, shots=5, test_shots=15, meta_train=True, download=True)
dataloader = BatchMetaDataLoader(dataset, batch_size=16, num_workers=4)

for batch in dataloader:
    train_inputs, train_targets = batch["train"]
    print('Train inputs shape: {0}'.format(train_inputs.shape))    # (16, 25, 1, 28, 28)
    print('Train targets shape: {0}'.format(train_targets.shape))  # (16, 25)

    test_inputs, test_targets = batch["test"]
    print('Test inputs shape: {0}'.format(test_inputs.shape))      # (16, 75, 1, 28, 28)
    print('Test targets shape: {0}'.format(test_targets.shape))    # (16, 75)
```
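
To read the shapes above: the first dimension indexes the 16 tasks in the meta-batch, and the second indexes the ways * shots support samples (or ways * test_shots query samples) of each task. The sketch below unpacks individual tasks from one meta-batch; the support_/query_ variable names are illustrative, not part of the Torchmeta API.

```python
from torchmeta.datasets.helpers import omniglot
from torchmeta.utils.data import BatchMetaDataLoader

dataset = omniglot("data", ways=5, shots=5, test_shots=15, meta_train=True, download=True)
dataloader = BatchMetaDataLoader(dataset, batch_size=16, num_workers=4)

batch = next(iter(dataloader))                 # one meta-batch of 16 tasks
train_inputs, train_targets = batch["train"]   # (16, 25, 1, 28, 28), (16, 25)
test_inputs, test_targets = batch["test"]      # (16, 75, 1, 28, 28), (16, 75)

for task_idx in range(train_inputs.size(0)):
    # Support ("train") and query ("test") set of a single task.
    support_x, support_y = train_inputs[task_idx], train_targets[task_idx]
    query_x, query_y = test_inputs[task_idx], test_targets[task_idx]
    # support_x: (25, 1, 28, 28); support_y holds task-local labels 0..4
```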

Advanced example

Helper functions are only available for some of the datasets. However, all of them are available through the unified interface provided by Torchmeta. The variable dataset defined above is equivalent to the following:

```python
from torchmeta.datasets import Omniglot
from torchmeta.transforms import Categorical, ClassSplitter, Rotation
from torchvision.transforms import Compose, Resize, ToTensor
from torchmeta.utils.data import BatchMetaDataLoader

dataset = Omniglot("data",
                   # Number of ways
                   num_classes_per_task=5,
                   # Resize the images to 28x28 and convert them to PyTorch tensors (from Torchvision)
                   transform=Compose([Resize(28), ToTensor()]),
                   # Transform the labels to integers (e.g. ("Glagolitic/character01", "Sanskrit/character14", ...) to (0, 1, ...))
                   target_transform=Categorical(num_classes=5),
                   # Create new virtual classes with rotated versions of the images (from Santoro et al., 2016)
                   class_augmentations=[Rotation([90, 180, 270])],
                   meta_train=True,
                   download=True)
dataset = ClassSplitter(dataset, shuffle=True, num_train_per_class=5, num_test_per_class=15)
dataloader = BatchMetaDataLoader(dataset, batch_size=16, num_workers=4)
```

Note that the dataloader, receiving the dataset, remains the same.
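
Because the helper and the explicit construction are interchangeable, the same BatchMetaDataLoader loop from the minimal example works unchanged. For evaluation, the helpers also expose meta_val and meta_test flags mirroring meta_train; the sketch below builds a loader over the held-out meta-test classes (treat the exact arguments as a typical setup rather than the canonical example).

```python
from torchmeta.datasets.helpers import omniglot
from torchmeta.utils.data import BatchMetaDataLoader

# Tasks drawn from the held-out meta-test classes, with the same
# 5-way 5-shot configuration used for meta-training above.
test_dataset = omniglot("data", ways=5, shots=5, test_shots=15,
                        meta_test=True, download=True)
test_dataloader = BatchMetaDataLoader(test_dataset, batch_size=16, num_workers=4)
```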

Citation

Tristan Deleu, Tobias Würfl, Mandana Samiei, Joseph Paul Cohen, and Yoshua Bengio. Torchmeta: A Meta-Learning library for PyTorch, 2019 [ArXiv]

If you want to cite Torchmeta, use the following BibTeX entry:

```
@misc{deleu2019torchmeta,
  title={{Torchmeta: A Meta-Learning library for PyTorch}},
  author={Deleu, Tristan and W\"urfl, Tobias and Samiei, Mandana and Cohen, Joseph Paul and Bengio, Yoshua},
  year={2019},
  url={https://arxiv.org/abs/1909.06576},
  note={Available at: https://github.com/tristandeleu/pytorch-meta}
}
```

Owner

  • Name: Tristan Deleu
  • Login: tristandeleu
  • Kind: user

Citation (CITATION)

@misc{deleu2019torchmeta,
  title={{Torchmeta: A Meta-Learning library for PyTorch}},
  author={Deleu, Tristan and W\"urfl, Tobias and Samiei, Mandana and Cohen, Joseph Paul and Bengio, Yoshua},
  year={2019},
  url={https://arxiv.org/abs/1909.06576},
  note={Available at: https://github.com/tristandeleu/pytorch-meta}
}

GitHub Events

Total
  • Watch event: 77
  • Issue comment event: 2
  • Pull request event: 1
  • Fork event: 10
Last Year
  • Watch event: 77
  • Issue comment event: 2
  • Pull request event: 1
  • Fork event: 10

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 353
  • Total Committers: 12
  • Avg Commits per committer: 29.417
  • Development Distribution Score (DDS): 0.235
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
| Name | Email | Commits |
| --- | --- | --- |
| Tristan Deleu | t****u@g****m | 270 |
| Maren Mahsereci | m****c@a****m | 29 |
| Tobias Wuerfl | t****l@f****e | 24 |
| Marc Rußwurm | m****m@t****e | 8 |
| Gabriel Dahia | g****a@g****m | 6 |
| Edward Grefenstette | e****n | 4 |
| Mattie Tesfaldet | m****e@e****m | 4 |
| Mennatullah Siam | m****l@u****a | 3 |
| sleepyowl | s****2@y****m | 2 |
| marcociccone | m****e@p****t | 1 |
| Pavel Denisov | p****v@i****e | 1 |
| Gene | e****n@g****m | 1 |
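
For readers unfamiliar with the metric, the Development Distribution Score (DDS) is commonly defined as one minus the top committer's share of commits; the figures above are consistent with that reading, as the quick check below shows (the formula is an assumption about how the score is computed, not something stated by the source).

```python
# Quick check of the summary statistics against the committer table above,
# assuming DDS = 1 - (commits by the most active committer / total commits).
total_commits = 353
total_committers = 12
top_committer_commits = 270  # Tristan Deleu

avg_commits_per_committer = total_commits / total_committers   # 29.417
dds = 1 - top_committer_commits / total_commits                # 0.235
print(round(avg_commits_per_committer, 3), round(dds, 3))
```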
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 9 months ago

All Time
  • Total issues: 93
  • Total pull requests: 13
  • Average time to close issues: 27 days
  • Average time to close pull requests: 4 months
  • Total issue authors: 60
  • Total pull request authors: 11
  • Average comments per issue: 2.99
  • Average comments per pull request: 0.38
  • Merged pull requests: 3
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 1
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 1
  • Average comments per issue: 0
  • Average comments per pull request: 0.0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • brando90 (19)
  • renesax14 (8)
  • guozhiyao (2)
  • ajoshi80 (2)
  • sudarshan1994 (2)
  • ximinng (2)
  • raymond00000 (2)
  • jaeho3690 (2)
  • Ingvar-Y (2)
  • SebGGruber (2)
  • andro-demir (1)
  • carmete (1)
  • ojss (1)
  • Lelouch-Ice (1)
  • QiyaoWei (1)
Pull Request Authors
  • mmahsereci (2)
  • pbsds (2)
  • zobertke (2)
  • Clyde21c (1)
  • RobvanGastel (1)
  • tristandeleu (1)
  • janbolle (1)
  • rosikand (1)
  • rees-c (1)
  • TrellixVulnTeam (1)
  • sleepy-owl (1)
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 2
  • Total downloads:
    • pypi 7,352 last-month
  • Total docker downloads: 85
  • Total dependent packages: 0
    (may contain duplicates)
  • Total dependent repositories: 39
    (may contain duplicates)
  • Total versions: 29
  • Total maintainers: 2
pypi.org: torchmeta

Dataloaders for meta-learning in Pytorch

  • Versions: 28
  • Dependent Packages: 0
  • Dependent Repositories: 39
  • Downloads: 7,352 Last month
  • Docker Downloads: 85
Rankings
Stargazers count: 1.6%
Dependent repos count: 2.3%
Docker downloads count: 3.2%
Forks count: 3.4%
Average: 4.1%
Downloads: 4.2%
Dependent packages count: 10.0%
Maintainers (1)
Last synced: 7 months ago
spack.io: py-torchmeta

A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch. Torchmeta contains popular meta-learning benchmarks, fully compatible with both torchvision and PyTorch's DataLoader.

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Stargazers count: 5.5%
Forks count: 6.4%
Average: 17.3%
Dependent packages count: 57.3%
Maintainers (1)
Last synced: 7 months ago

Dependencies

setup.py pypi
  • Pillow >=7.0.0
  • h5py *
  • numpy >=1.14.0
  • ordered-set *
  • requests *
  • torch >=1.4.0,<1.10.0
  • torchvision >=0.5.0,<0.11.0
  • tqdm >=4.0.0