adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

https://github.com/adapter-hub/adapters

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.4%) to scientific vocabulary

Keywords

adapters bert lora natural-language-processing nlp parameter-efficient-learning parameter-efficient-tuning pytorch transformers

Keywords from Contributors

cryptocurrency cryptography jax transformer
Last synced: 6 months ago

Repository

A Unified Library for Parameter-Efficient and Modular Transfer Learning

Basic Info
  • Host: GitHub
  • Owner: adapter-hub
  • License: apache-2.0
  • Language: Python
  • Default Branch: main
  • Homepage: https://docs.adapterhub.ml
  • Size: 96.9 MB
Statistics
  • Stars: 2,760
  • Watchers: 28
  • Forks: 369
  • Open Issues: 50
  • Releases: 25
Topics
adapters bert lora natural-language-processing nlp parameter-efficient-learning parameter-efficient-tuning pytorch transformers
Created almost 6 years ago · Last pushed 6 months ago
Metadata Files
Readme Contributing License Citation

README.md

Adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

Website   •   Documentation   •   Paper


Adapters is an add-on library to HuggingFace's Transformers, integrating 10+ adapter methods into 20+ state-of-the-art Transformer models with minimal coding overhead for training and inference.

Adapters provides a unified interface for efficient fine-tuning and modular transfer learning, supporting features such as full-precision or quantized training (e.g. Q-LoRA, Q-Bottleneck Adapters, or Q-PrefixTuning), adapter merging via task arithmetic, and the combination of multiple adapters via composition blocks, enabling advanced research in parameter-efficient transfer learning for NLP tasks.
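For example, two trained adapters can be merged into a new adapter by combining their weights. A minimal sketch, assuming the library's `average_adapter` method with a linear combine strategy; the adapter names are hypothetical:

```python
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")
# Hypothetical adapters standing in for two separately trained tasks.
model.add_adapter("task_a", config="lora")
model.add_adapter("task_b", config="lora")

# Merge both adapters into a new weight-space combination ("task arithmetic").
model.average_adapter(
    "merged",
    ["task_a", "task_b"],
    weights=[0.7, 0.3],
    combine_strategy="linear",
)
model.set_active_adapters("merged")
```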

Note: The Adapters library has replaced the adapter-transformers package. All previously trained adapters are compatible with the new library. For transitioning, please read: https://docs.adapterhub.ml/transitioning.html.

Installation

adapters currently supports Python 3.9+ and PyTorch 2.0+. After installing PyTorch, you can install adapters from PyPI ...

pip install -U adapters

... or from source by cloning the repository:

git clone https://github.com/adapter-hub/adapters.git
cd adapters
pip install .

Quick Tour

Load pre-trained adapters:

```python
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf", set_active=True)

print(model(**tokenizer("This works great!", return_tensors="pt")).logits)
```

Learn More

Adapt existing model setups:

```python
import adapters
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("t5-base")

adapters.init(model)

model.add_adapter("my_lora_adapter", config="lora")
model.train_adapter("my_lora_adapter")

# Your regular training loop...
```
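The placeholder training loop can be any PyTorch loop; the library also provides an `AdapterTrainer` mirroring HuggingFace's `Trainer` that updates only the adapter weights. A minimal sketch continuing the example above; the dataset, tokenizer, and hyperparameters are illustrative assumptions:

```python
from datasets import load_dataset
from adapters import AdapterTrainer
from transformers import AutoTokenizer, TrainingArguments

# Tokenizer matching the model above (assumed).
tokenizer = AutoTokenizer.from_pretrained("t5-base")

# Small illustrative subset of IMDB, tokenized to fixed length.
train_dataset = load_dataset("imdb", split="train[:1000]").map(
    lambda batch: tokenizer(
        batch["text"], truncation=True, padding="max_length", max_length=128
    ),
    batched=True,
)

trainer = AdapterTrainer(
    model=model,  # the model above, with train_adapter(...) already called
    args=TrainingArguments(output_dir="./out", num_train_epochs=1),
    train_dataset=train_dataset,
)
trainer.train()

# Persist only the adapter weights, not the full model.
model.save_adapter("./my_lora_adapter", "my_lora_adapter")
```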

Learn More

Flexibly configure adapters:

```python
from adapters import ConfigUnion, PrefixTuningConfig, ParBnConfig, AutoAdapterModel

model = AutoAdapterModel.from_pretrained("microsoft/deberta-v3-base")

adapter_config = ConfigUnion(
    PrefixTuningConfig(prefix_length=20),
    ParBnConfig(reduction_factor=4),
)
model.add_adapter("my_adapter", config=adapter_config, set_active=True)
```
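This particular union, a parallel bottleneck adapter plus prefix tuning, reproduces the Mix-and-Match adapter setup of He et al. (2021) listed in the methods table below.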

Learn More

Easily compose adapters in a single model:

```python
from adapters import AdapterSetup, AutoAdapterModel
import adapters.composition as ac
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

qc = model.load_adapter("AdapterHub/roberta-base-pf-trec")
sent = model.load_adapter("AdapterHub/roberta-base-pf-imdb")

with AdapterSetup(ac.Parallel(qc, sent)):
    print(model(**tokenizer("What is AdapterHub?", return_tensors="pt")))
```

Learn More

Useful Resources

HuggingFace's great documentation on getting started with Transformers can be found here. adapters is fully compatible with Transformers.

To get started with adapters, refer to these locations:

  • Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
  • https://docs.adapterhub.ml, our documentation on training and using adapters with adapters
  • https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
  • Examples folder of this repository containing HuggingFace's example training scripts, many adapted for training adapters

Implemented Methods

Currently, adapters integrates all architectures and methods listed below:

| Method | Paper(s) | Quick Links |
| --- | --- | --- |
| Bottleneck adapters | Houlsby et al. (2019), Bapna and Firat (2019), Steitz and Roth (2024) | Quickstart, Notebook |
| AdapterFusion | Pfeiffer et al. (2021) | Docs: Training, Notebook |
| MAD-X, Invertible adapters | Pfeiffer et al. (2020) | Notebook |
| AdapterDrop | Rücklé et al. (2021) | Notebook |
| MAD-X 2.0, Embedding training | Pfeiffer et al. (2021) | Docs: Embeddings, Notebook |
| Prefix Tuning | Li and Liang (2021) | Docs |
| Parallel adapters, Mix-and-Match adapters | He et al. (2021) | Docs |
| Compacter | Mahabadi et al. (2021) | Docs |
| LoRA | Hu et al. (2021) | Docs |
| MTL-LoRA | Yang et al. (2024) | Docs |
| (IA)^3 | Liu et al. (2022) | Docs |
| VeRA | Kopiczko et al. (2024) | Docs |
| DoRA | Liu et al. (2024) | Docs |
| UniPELT | Mao et al. (2022) | Docs |
| Prompt Tuning | Lester et al. (2021) | Docs |
| QLoRA | Dettmers et al. (2023) | Notebook |
| ReFT | Wu et al. (2024) | Docs |
| Adapter Task Arithmetics | Chronopoulou et al. (2023), Zhang et al. (2023) | Docs, Notebook |
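Most of these methods can be selected either via a configuration string or via a config class passed to `add_adapter`. A brief sketch; the identifiers shown follow the documentation's naming and should be verified against your installed version:

```python
from adapters import AutoAdapterModel, LoRAConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# Select a method by its string identifier ...
model.add_adapter("compacter_adapter", config="compacter")

# ... or by a config class with explicit hyperparameters.
model.add_adapter("lora_adapter", config=LoRAConfig(r=8, alpha=16))
```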

Supported Models

We currently support the PyTorch versions of all models listed on the Model Overview page in our documentation.

Developing & Contributing

To get started with developing on Adapters yourself and learn more about ways to contribute, please see https://docs.adapterhub.ml/contributing.html.

Citation

If you use Adapters in your work, please consider citing our library paper: Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning

```bibtex
@inproceedings{poth-etal-2023-adapters,
    title = "Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning",
    author = {Poth, Clifton and Sterz, Hannah and Paul, Indraneil and Purkayastha, Sukannya and Engl{\"a}nder, Leon and Imhof, Timo and Vuli{\'c}, Ivan and Ruder, Sebastian and Gurevych, Iryna and Pfeiffer, Jonas},
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.emnlp-demo.13",
    pages = "149--160",
}
```

Alternatively, for the predecessor adapter-transformers, the Hub infrastructure and adapters uploaded by the AdapterHub team, please consider citing our initial paper: AdapterHub: A Framework for Adapting Transformers

```bibtex
@inproceedings{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and R{\"u}ckl{\'e}, Andreas and Poth, Clifton and Kamath, Aishwarya and Vuli{\'c}, Ivan and Ruder, Sebastian and Cho, Kyunghyun and Gurevych, Iryna},
    booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
    pages={46--54},
    year={2020}
}
```

Owner

  • Name: AdapterHub
  • Login: adapter-hub
  • Kind: organization
  • Location: Germany

Citation (CITATION.cff)

cff-version: "1.2.0"
date-released: 2023-11
message: "If you use this software, please cite it as below."
title: "Adapters: A Unified Library for Parameter-Efficient and
  Modular Transfer Learning"
url: "https://github.com/Adapter-Hub/adapters"
authors:
  - family-names: Poth
    given-names: Clifton
  - family-names: Sterz
    given-names: Hannah
  - family-names: Paul
    given-names: Indraneil
  - family-names: Purkayastha
    given-names: Sukannya
  - family-names: Engländer
    given-names: Leon
  - family-names: Imhof
    given-names: Timo
  - family-names: Vulić
    given-names: Ivan
  - family-names: Ruder
    given-names: Sebastian
  - family-names: Gurevych
    given-names: Iryna
  - family-names: Pfeiffer
    given-names: Jonas
preferred-citation:
  type: conference-paper
  authors:
  - family-names: Poth
    given-names: Clifton
  - family-names: Sterz
    given-names: Hannah
  - family-names: Paul
    given-names: Indraneil
  - family-names: Purkayastha
    given-names: Sukannya
  - family-names: Engländer
    given-names: Leon
  - family-names: Imhof
    given-names: Timo
  - family-names: Vulić
    given-names: Ivan
  - family-names: Ruder
    given-names: Sebastian
  - family-names: Gurevych
    given-names: Iryna
  - family-names: Pfeiffer
    given-names: Jonas
  booktitle: "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations"
  month: 12
  start: 149
  end: 160
  title: "Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning"
  year: 2023
  publisher: "Association for Computational Linguistics"
  url: "https://aclanthology.org/2023.emnlp-demo.13"
  address: "Singapore"

GitHub Events

Total
  • Create event: 6
  • Commit comment event: 2
  • Release event: 4
  • Issues event: 49
  • Watch event: 186
  • Delete event: 2
  • Issue comment event: 87
  • Push event: 54
  • Pull request review comment event: 89
  • Pull request review event: 109
  • Pull request event: 83
  • Fork event: 32
Last Year
  • Create event: 6
  • Commit comment event: 2
  • Release event: 4
  • Issues event: 49
  • Watch event: 186
  • Delete event: 2
  • Issue comment event: 87
  • Push event: 54
  • Pull request review comment event: 89
  • Pull request review event: 109
  • Pull request event: 83
  • Fork event: 32

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 179
  • Total Committers: 16
  • Avg Commits per committer: 11.188
  • Development Distribution Score (DDS): 0.38
Past Year
  • Commits: 68
  • Committers: 12
  • Avg Commits per committer: 5.667
  • Development Distribution Score (DDS): 0.426
Top Committers
Name Email Commits
calpt c****t@m****e 111
Leon Engländer l****r@g****m 21
TimoImhof 6****f 15
Hannah Sterz h****6@g****m 13
Julian Fong 4****g 7
francois_ledoyen l****s@g****m 2
Aditya Ranjan a****5@g****m 1
Alex Yun 3****p 1
Boris k****y@g****m 1
FahadEbrahim 6****m 1
Ikko Eltociear Ashimine e****r@g****m 1
KorventennFR 1****R 1
Stefan Schweter s****n@s****t 1
TheoWeih t****y@w****e 1
William Soto 9****i 1
divyanshuaggarwal d****l@g****m 1

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 79
  • Total pull requests: 139
  • Average time to close issues: 4 months
  • Average time to close pull requests: 3 months
  • Total issue authors: 54
  • Total pull request authors: 24
  • Average comments per issue: 2.46
  • Average comments per pull request: 0.94
  • Merged pull requests: 100
  • Bot issues: 0
  • Bot pull requests: 1
Past Year
  • Issues: 28
  • Pull requests: 73
  • Average time to close issues: about 1 month
  • Average time to close pull requests: 23 days
  • Issue authors: 23
  • Pull request authors: 10
  • Average comments per issue: 1.32
  • Average comments per pull request: 0.77
  • Merged pull requests: 55
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • arueckle (7)
  • rabeehkarimimahabadi (7)
  • km5ar (5)
  • calpt (5)
  • FrLdy (4)
  • FahadEbrahim (3)
  • JoPfeiff (3)
  • julian-fong (3)
  • sbassam (2)
  • prajjwal1 (2)
  • swoldemichael (2)
  • SSamDav (2)
  • z-lai (2)
  • mkgs210 (2)
  • xplip (2)
Pull Request Authors
  • calpt (104)
  • TimoImhof (21)
  • julian-fong (14)
  • lenglaender (11)
  • JoPfeiff (6)
  • hSterz (6)
  • FrLdy (4)
  • devin-astrumu (2)
  • Soham2000 (2)
  • amitkumarj441 (2)
  • divyanshuaggarwal (2)
  • killershrimp (2)
  • arueckle (2)
  • xmarva (2)
  • joao-alves97 (2)
Top Labels
Issue Labels
bug (56) enhancement (35) question (14) Stale (2) do-not-stale (2) wontfix (1) discussion (1) bug:encoder-decoder (1) duplicate (1) external-dependency (1)
Pull Request Labels
sync (23) model-requires-upgrade (10) bug (4) do-not-merge (3) documentation (1) enhancement (1) dependencies (1)

Packages

  • Total packages: 3
  • Total downloads:
    • pypi 1,414,365 last-month
  • Total docker downloads: 1,612
  • Total dependent packages: 2
    (may contain duplicates)
  • Total dependent repositories: 51
    (may contain duplicates)
  • Total versions: 47
  • Total maintainers: 3
pypi.org: adapter-transformers

Deprecated adapter-transformers package. Use adapters package instead.

  • Versions: 19
  • Dependent Packages: 2
  • Dependent Repositories: 49
  • Downloads: 5,070 Last month
  • Docker Downloads: 1,612
Rankings
Stargazers count: 1.5%
Downloads: 1.8%
Docker downloads count: 1.9%
Dependent repos count: 2.1%
Average: 2.2%
Forks count: 3.0%
Dependent packages count: 3.2%
Maintainers (3)
Last synced: 6 months ago
proxy.golang.org: github.com/adapter-hub/adapters
  • Versions: 11
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent packages count: 5.4%
Average: 5.6%
Dependent repos count: 5.8%
Last synced: 6 months ago
pypi.org: adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

  • Versions: 17
  • Dependent Packages: 0
  • Dependent Repositories: 2
  • Downloads: 1,409,295 Last month
Rankings
Stargazers count: 1.5%
Forks count: 3.0%
Downloads: 5.4%
Average: 6.3%
Dependent packages count: 10.1%
Dependent repos count: 11.5%
Maintainers (3)
Last synced: 6 months ago

Dependencies

examples/pytorch/_tests_requirements.txt pypi
  • accelerate main test
  • conllu * test
  • datasets >=1.13.3 test
  • elasticsearch * test
  • faiss-cpu * test
  • fire * test
  • git-python ==1.0.3 test
  • jiwer * test
  • librosa * test
  • matplotlib * test
  • nltk * test
  • pandas * test
  • protobuf * test
  • psutil * test
  • pytest * test
  • rouge-score * test
  • sacrebleu >=1.4.12 test
  • scikit-learn * test
  • sentencepiece * test
  • seqeval * test
  • streamlit * test
  • tensorboard * test
  • tensorflow_datasets * test
  • torchvision * test
examples/pytorch/audio-classification/requirements.txt pypi
  • datasets >=1.14.0
  • librosa *
  • torch >=1.6
  • torchaudio *
examples/pytorch/contrastive-image-text/requirements.txt pypi
  • datasets >=1.8.0
  • torch >=1.5.0
  • torchvision >=0.6.0
examples/pytorch/dependency-parsing/requirements.txt pypi
  • conllu *
  • datasets >=1.8.0
  • torch >=1.3
examples/pytorch/image-classification/requirements.txt pypi
  • datasets >=1.8.0
  • torch >=1.5.0
  • torchvision >=0.6.0
examples/pytorch/image-pretraining/requirements.txt pypi
  • datasets >=1.8.0
  • torch >=1.5.0
  • torchvision >=0.6.0
examples/pytorch/language-modeling/requirements.txt pypi
  • accelerate *
  • datasets >=1.8.0
  • protobuf *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/multiple-choice/requirements.txt pypi
  • accelerate *
  • protobuf *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/question-answering/requirements.txt pypi
  • accelerate *
  • datasets >=1.8.0
  • torch >=1.3.0
examples/pytorch/semantic-segmentation/requirements.txt pypi
  • datasets >=2.0.0
  • torch >=1.3
examples/pytorch/speech-pretraining/requirements.txt pypi
  • accelerate >=0.5.0
  • datasets >=1.12.0
  • librosa *
  • torch >=1.5
  • torchaudio *
examples/pytorch/speech-recognition/requirements.txt pypi
  • datasets >=1.18.0
  • jiwer *
  • librosa *
  • torch >=1.5
  • torchaudio *
examples/pytorch/summarization/requirements.txt pypi
  • accelerate *
  • datasets >=1.8.0
  • nltk *
  • protobuf *
  • py7zr *
  • rouge-score *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/text-classification/requirements.txt pypi
  • accelerate *
  • datasets >=1.8.0
  • protobuf *
  • scikit-learn *
  • scipy *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/text-generation/requirements.txt pypi
  • protobuf *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/token-classification/requirements.txt pypi
  • accelerate *
  • datasets >=1.8.0
  • seqeval *
  • torch >=1.3
examples/pytorch/translation/requirements.txt pypi
  • accelerate *
  • datasets >=1.8.0
  • protobuf *
  • py7zr *
  • sacrebleu >=1.4.12
  • sentencepiece *
  • torch >=1.3
setup.py pypi
  • deps *
tests/sagemaker/scripts/pytorch/requirements.txt pypi
  • datasets ==1.8.0 test
.github/workflows/adapter_docs_build.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v2 composite
  • peaceiris/actions-gh-pages v3 composite
.github/workflows/pr_dependencies.yml actions
  • z0al/dependent-issues v1 composite
.github/workflows/stale.yml actions
  • actions/stale v6 composite
.github/workflows/tests_torch.yml actions
  • actions/cache v2 composite
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
docker/transformers-all-latest-gpu/Dockerfile docker
  • nvidia/cuda 11.2.2-cudnn8-devel-ubuntu20.04 build
docker/transformers-cpu/Dockerfile docker
  • ubuntu 18.04 build
docker/transformers-doc-builder/Dockerfile docker
  • python 3.8 build
docker/transformers-gpu/Dockerfile docker
  • nvidia/cuda 10.2-cudnn7-devel-ubuntu18.04 build
docker/transformers-past-gpu/Dockerfile docker
  • $BASE_DOCKER_IMAGE latest build
docker/transformers-pytorch-cpu/Dockerfile docker
  • ubuntu 18.04 build
docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile docker
  • nvcr.io/nvidia/pytorch 21.03-py3 build
docker/transformers-pytorch-deepspeed-nightly-gpu/Dockerfile docker
  • nvcr.io/nvidia/pytorch 21.03-py3 build
docker/transformers-pytorch-gpu/Dockerfile docker
  • nvidia/cuda 11.2.2-cudnn8-devel-ubuntu20.04 build
docker/transformers-pytorch-tpu/Dockerfile docker
  • google/cloud-sdk slim build
docker/transformers-tensorflow-cpu/Dockerfile docker
  • ubuntu 18.04 build
docker/transformers-tensorflow-gpu/Dockerfile docker
  • nvidia/cuda 11.2.2-cudnn8-devel-ubuntu20.04 build