Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.8%) to scientific vocabulary
Last synced: 6 months ago

Repository

Basic Info
  • Host: GitHub
  • Owner: KorventennFR
  • License: apache-2.0
  • Language: Jupyter Notebook
  • Default Branch: main
  • Size: 2.37 MB
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created over 2 years ago · Last pushed over 2 years ago
Metadata Files
Readme Contributing License Citation

README.md

Note: This repository holds the codebase of the Adapters library, which has replaced adapter-transformers. For the legacy codebase, go to: https://github.com/adapter-hub/adapter-transformers-legacy.

Adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

adapters is an add-on to HuggingFace's Transformers library, integrating adapters into state-of-the-art language models by incorporating AdapterHub, a central repository for pre-trained adapter modules.

Installation

adapters currently supports Python 3.8+ and PyTorch 1.10+. After installing PyTorch, you can install adapters from PyPI ...

pip install -U adapters

... or from source by cloning the repository:

git clone https://github.com/adapter-hub/adapters.git
cd adapters
pip install .

Quick Tour

Load pre-trained adapters:

```python
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf", set_active=True)

print(model(**tokenizer("This works great!", return_tensors="pt")).logits)
```
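The call above prints raw logits; a softmax turns them into class probabilities, and the highest-probability index picks the predicted label. A minimal, dependency-free sketch of that post-processing step (the example logit values and label order are illustrative, not read from the adapter's head config):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Example logits as a two-class sentiment head might produce them.
logits = [-1.2, 2.3]
probs = softmax(logits)
labels = ["negative", "positive"]  # illustrative label order, not from the model
prediction = labels[probs.index(max(probs))]
print(prediction)  # → positive
```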

Learn More

Adapt existing model setups:

```python
import adapters
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("t5-base")

adapters.init(model)

model.add_adapter("my_lora_adapter", config="lora")
model.train_adapter("my_lora_adapter")

# Your regular training loop...
```
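The point of `train_adapter` is that it freezes the base model's weights and leaves only the adapter's parameters trainable, so gradients flow into a small fraction of the model. A toy sketch of that selection idea over a hypothetical named-parameter dict (the parameter names are illustrative, not the library's internals):

```python
def select_trainable(named_params, adapter_name):
    # Keep only parameters whose name belongs to the given adapter;
    # everything else stays frozen.
    return {name: p for name, p in named_params.items() if adapter_name in name}

params = {
    "encoder.layer.0.attention.weight": "base",
    "encoder.layer.0.adapters.my_lora_adapter.lora_A": "adapter",
    "encoder.layer.0.adapters.my_lora_adapter.lora_B": "adapter",
}
trainable = select_trainable(params, "my_lora_adapter")
print(sorted(trainable))  # only the two adapter parameters remain
```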

Learn More

Flexibly configure adapters:

```python
from adapters import ConfigUnion, PrefixTuningConfig, ParBnConfig, AutoAdapterModel

model = AutoAdapterModel.from_pretrained("microsoft/deberta-v3-base")

adapter_config = ConfigUnion(
    PrefixTuningConfig(prefix_length=20),
    ParBnConfig(reduction_factor=4),
)
model.add_adapter("my_adapter", config=adapter_config, set_active=True)
```
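For intuition on these hyperparameters: a bottleneck adapter with reduction factor r projects the hidden size d down to d/r and back up, adding roughly 2·d·(d/r) weights per layer, while prefix tuning adds one key and one value vector per prefix position per layer. A back-of-the-envelope count (hidden size, layer count, and the full-model size below are generic base-model assumptions, not read from the DeBERTa config):

```python
def bottleneck_params(hidden_size, reduction_factor):
    bottleneck = hidden_size // reduction_factor
    # Down-projection + up-projection weight matrices (biases omitted).
    return 2 * hidden_size * bottleneck

def prefix_tuning_params(hidden_size, prefix_length, num_layers):
    # One key and one value vector per prefix position, per layer.
    return 2 * prefix_length * hidden_size * num_layers

d, layers = 768, 12           # assumed base-sized transformer
adapter = bottleneck_params(d, 4) * layers
prefix = prefix_tuning_params(d, 20, layers)
full = 110_000_000            # ballpark parameter count, for scale
print(adapter, prefix, (adapter + prefix) / full)
```

Even combined, the two methods here stay in the low single-digit percent of the full model's parameters, which is the point of parameter-efficient fine-tuning.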

Learn More

Easily compose adapters in a single model:

```python
from adapters import AdapterSetup, AutoAdapterModel
from transformers import AutoTokenizer
import adapters.composition as ac

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

qc = model.load_adapter("AdapterHub/roberta-base-pf-trec")
sent = model.load_adapter("AdapterHub/roberta-base-pf-imdb")

with AdapterSetup(ac.Parallel(qc, sent)):
    print(model(**tokenizer("What is AdapterHub?", return_tensors="pt")))
```
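Conceptually, Parallel shares one forward pass through the base model and then branches into each adapter and its prediction head, returning one output per branch. A toy sketch of that control flow (the encoder and heads below are stand-ins, not the library's internals):

```python
def parallel(shared_encode, heads, text):
    # Encode the input once, then apply every adapter head
    # to the same shared representation.
    hidden = shared_encode(text)
    return {name: head(hidden) for name, head in heads.items()}

# Stand-in "encoder" and heads for illustration only.
encode = lambda text: len(text.split())  # fake shared representation
heads = {
    "qc": lambda h: "question" if h < 5 else "statement",
    "sent": lambda h: "positive",
}
outputs = parallel(encode, heads, "What is AdapterHub?")
print(outputs)  # one result per adapter branch
```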

Learn More

Useful Resources

HuggingFace provides great documentation on getting started with Transformers at https://huggingface.co/docs/transformers. adapters is fully compatible with Transformers.

To get started with adapters, refer to these locations:

  • Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
  • https://docs.adapterhub.ml, our documentation on training and using adapters with adapters
  • https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
  • Examples folder of this repository containing HuggingFace's example training scripts, many adapted for training adapters

Implemented Methods

Currently, adapters integrates all architectures and methods listed below:

| Method | Paper(s) | Quick Links |
| --- | --- | --- |
| Bottleneck adapters | Houlsby et al. (2019); Bapna and Firat (2019) | Quickstart, Notebook |
| AdapterFusion | Pfeiffer et al. (2021) | Docs: Training, Notebook |
| MAD-X, Invertible adapters | Pfeiffer et al. (2020) | Notebook |
| AdapterDrop | Rücklé et al. (2021) | Notebook |
| MAD-X 2.0, Embedding training | Pfeiffer et al. (2021) | Docs: Embeddings, Notebook |
| Prefix Tuning | Li and Liang (2021) | Docs |
| Parallel adapters, Mix-and-Match adapters | He et al. (2021) | Docs |
| Compacter | Mahabadi et al. (2021) | Docs |
| LoRA | Hu et al. (2021) | Docs |
| (IA)^3 | Liu et al. (2022) | Docs |
| UniPELT | Mao et al. (2022) | Docs |
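As a concrete example of one method in the table: LoRA (Hu et al., 2021) represents the weight update ΔW as a low-rank product B·A, with B initialized to zero so training starts from the unmodified model. A dependency-free sketch on small matrices (an illustration of the idea, not the library's implementation):

```python
def matmul(A, B):
    # Plain nested-loop matrix multiply for small illustrative matrices.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def lora_forward(x, W, A, B, scaling=1.0):
    # y = x @ (W + scaling * B @ A): the frozen base path
    # plus the trainable low-rank update.
    base = matmul(x, W)
    delta = matmul(x, matmul(B, A))
    return [[b + scaling * d for b, d in zip(rb, rd)]
            for rb, rd in zip(base, delta)]

x = [[1, 2]]              # one input row, hidden size 2
W = [[1, 0], [0, 1]]      # frozen base weight (identity here)
A = [[1, 1]]              # rank-1 factors: B is (2 x 1), A is (1 x 2)
B_zero = [[0], [0]]       # LoRA's zero init leaves the base model unchanged
```

With B at its zero initialization, `lora_forward(x, W, A, B_zero)` reproduces `matmul(x, W)` exactly, which is why training can start from the pretrained model's behavior.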

Supported Models

We currently support the PyTorch versions of all models listed on the Model Overview page in our documentation.

Developing & Contributing

To get started with developing on Adapters yourself and learn more about ways to contribute, please see https://docs.adapterhub.ml/contributing.html.

Citation

If you use this library for your work, please consider citing our paper AdapterHub: A Framework for Adapting Transformers:

@inproceedings{pfeiffer2020AdapterHub,
  title={AdapterHub: A Framework for Adapting Transformers},
  author={Pfeiffer, Jonas and R{\"u}ckl{\'e}, Andreas and Poth, Clifton and Kamath, Aishwarya and Vuli{\'c}, Ivan and Ruder, Sebastian and Cho, Kyunghyun and Gurevych, Iryna},
  booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
  pages={46--54},
  year={2020}
}

Owner

  • Login: KorventennFR
  • Kind: user

Hi, I am a programmer with a passion for AI-related projects and topics. This GitHub repository serves as a showcase for some of my projects.

Citation (CITATION.cff)

cff-version: "1.2.0"
date-released: 2020-10
message: "If you use this software, please cite it as below."
title: "AdapterHub: A Framework for Adapting Transformers"
url: "https://github.com/Adapter-Hub/adapters"
authors: 
  - family-names: Pfeiffer
    given-names: Jonas
  - family-names: Rücklé
    given-names: Andreas
  - family-names: Poth
    given-names: Clifton
  - family-names: Kamath
    given-names: Aishwarya
  - family-names: Vulić
    given-names: Ivan
  - family-names: Ruder
    given-names: Sebastian
  - family-names: Cho
    given-names: Kyunghyun
  - family-names: Gurevych
    given-names: Iryna
preferred-citation:
  type: inproceedings
  authors:
  - family-names: Pfeiffer
    given-names: Jonas
  - family-names: Rücklé
    given-names: Andreas
  - family-names: Poth
    given-names: Clifton
  - family-names: Kamath
    given-names: Aishwarya
  - family-names: Vulić
    given-names: Ivan
  - family-names: Ruder
    given-names: Sebastian
  - family-names: Cho
    given-names: Kyunghyun
  - family-names: Gurevych
    given-names: Iryna
  booktitle: "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations"
  month: 10
  start: 46
  end: 54
  title: "AdapterHub: A Framework for Adapting Transformers"
  year: 2020
  publisher: "Association for Computational Linguistics"
  url: "https://aclanthology.org/2020.emnlp-demos.7"
  address: "Online"


Dependencies

.github/workflows/adapter_docs_build.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v2 composite
  • peaceiris/actions-gh-pages v3 composite
.github/workflows/pr_dependencies.yml actions
  • z0al/dependent-issues v1 composite
.github/workflows/stale.yml actions
  • actions/stale v8 composite
.github/workflows/tests_torch.yml actions
  • actions/cache v2 composite
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
examples/pytorch/_tests_requirements.txt pypi
  • accelerate main test
  • conllu * test
  • datasets >=1.13.3 test
  • elasticsearch * test
  • evaluate >=0.2.0 test
  • faiss-cpu * test
  • fire * test
  • git-python ==1.0.3 test
  • jiwer * test
  • librosa * test
  • matplotlib * test
  • nltk * test
  • pandas * test
  • protobuf * test
  • psutil * test
  • pytest * test
  • rouge-score * test
  • sacrebleu >=1.4.12 test
  • scikit-learn * test
  • sentencepiece * test
  • seqeval * test
  • streamlit * test
  • tensorboard * test
  • tensorflow_datasets * test
  • torchvision * test
examples/pytorch/dependency-parsing/requirements.txt pypi
  • conllu *
  • datasets >=1.8.0
  • torch >=1.3
examples/pytorch/language-modeling/requirements.txt pypi
  • accelerate >=0.12.0
  • datasets >=1.8.0
  • evaluate *
  • protobuf *
  • scikit-learn *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/multiple-choice/requirements.txt pypi
  • accelerate >=0.12.0
  • evaluate *
  • protobuf *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/question-answering/requirements.txt pypi
  • accelerate >=0.12.0
  • datasets >=1.8.0
  • evaluate *
  • torch >=1.3.0
examples/pytorch/summarization/requirements.txt pypi
  • accelerate >=0.12.0
  • datasets >=1.8.0
  • evaluate *
  • nltk *
  • protobuf *
  • py7zr *
  • rouge-score *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/text-classification/requirements.txt pypi
  • accelerate >=0.12.0
  • datasets >=1.8.0
  • evaluate *
  • protobuf *
  • scikit-learn *
  • scipy *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/text-generation/requirements.txt pypi
  • protobuf *
  • sentencepiece *
  • torch >=1.3
examples/pytorch/token-classification/requirements.txt pypi
  • accelerate >=0.12.0
  • datasets >=1.8.0
  • evaluate *
  • seqeval *
  • torch >=1.3
examples/pytorch/translation/requirements.txt pypi
  • accelerate >=0.12.0
  • datasets >=1.8.0
  • evaluate *
  • protobuf *
  • py7zr *
  • sacrebleu >=1.4.12
  • sentencepiece *
  • torch >=1.3
pyproject.toml pypi
setup.py pypi
  • deps *