hf-trim

Reduce the size of pretrained Hugging Face models via vocabulary trimming.

https://github.com/IamAdiSri/hf-trim

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.2%) to scientific vocabulary
Last synced: 6 months ago

Repository

Reduce the size of pretrained Hugging Face models via vocabulary trimming.

Basic Info
  • Host: GitHub
  • Owner: IamAdiSri
  • License: mpl-2.0
  • Language: Python
  • Default Branch: main
  • Size: 52.7 KB
Statistics
  • Stars: 45
  • Watchers: 2
  • Forks: 5
  • Open Issues: 3
  • Releases: 2
Created almost 4 years ago · Last pushed about 3 years ago
Metadata Files
Readme License Citation

README.md

hf-trim


A package to reduce the size of 🤗 Hugging Face models via vocabulary trimming.

The library currently supports the following models (and their pretrained versions available on the Hugging Face Models hub):

  1. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation
  2. mBART: Multilingual Denoising Pre-training for Neural Machine Translation
  3. T5: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
  4. mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer

"Why would I need to trim the vocabulary on a model?" 🤔

To put it simply, vocabulary trimming is a way to reduce a language model's memory footprint while retaining most of its performance.

Read more here.
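To illustrate the idea, here is a minimal, library-free sketch of what vocabulary trimming does conceptually: keep only the embedding rows for tokens that actually occur in your data, and remap token ids to the smaller table. The `trim_vocab` function and the toy vocabulary below are hypothetical illustrations, not hf-trim's API (and a real trimmer, hf-trim included, would also preserve special tokens like padding).

```python
def trim_vocab(embedding, vocab, corpus_tokens):
    """Keep only embedding rows for tokens seen in the corpus.

    embedding: list of vectors, indexed by old token id
    vocab: dict mapping token -> old id
    corpus_tokens: iterable of tokens observed in the data
    Returns (trimmed_embedding, trimmed_vocab) with new, contiguous ids.
    """
    # old ids that survive trimming, in a stable order
    kept = sorted({vocab[t] for t in corpus_tokens if t in vocab})
    old_to_new = {old: new for new, old in enumerate(kept)}
    # slice the embedding table down to the surviving rows
    trimmed_embedding = [embedding[old] for old in kept]
    # rebuild the vocabulary with remapped ids
    trimmed_vocab = {t: old_to_new[i] for t, i in vocab.items() if i in old_to_new}
    return trimmed_embedding, trimmed_vocab

# toy example: a 5-token vocabulary, but the corpus uses only 2 distinct tokens
vocab = {"<pad>": 0, "un": 1, "chief": 2, "says": 3, "syria": 4}
embedding = [[float(i)] * 4 for i in range(5)]  # 5 rows of dimension 4
emb2, vocab2 = trim_vocab(embedding, vocab, ["un", "says", "un"])
```

After trimming, the embedding table has 2 rows instead of 5; for a real multilingual model like mT5, whose vocabulary is ~250k tokens, shrinking the embedding (and output projection) tables this way is where the memory savings come from.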

Citation

If you use this software, please cite it as given below:

```bibtex
@software{Srivastava_hf-trim,
    author = {Srivastava, Aditya},
    license = {MPL-2.0},
    title = {{hf-trim}},
    url = {https://github.com/IamAdiSri/hf-trim}
}
```

Installation

You can run the following command to install from PyPI (recommended):

```bash
$ pip install hf-trim
```

You can also install from source:

```bash
$ git clone https://github.com/IamAdiSri/hf-trim
$ cd hf-trim
$ pip install .
```

Usage

Simple Example

```python
from transformers import MT5Config, MT5Tokenizer, MT5ForConditionalGeneration
from hftrim.TokenizerTrimmer import TokenizerTrimmer
from hftrim.ModelTrimmers import MT5Trimmer

data = [
    " UN Chief Says There Is No Military Solution in Syria",
    "Şeful ONU declară că nu există o soluţie militară în Siria"
]

# load pretrained config, tokenizer and model
config = MT5Config.from_pretrained("google/mt5-small")
tokenizer = MT5Tokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# trim tokenizer
tt = TokenizerTrimmer(tokenizer)
tt.make_vocab(data)
tt.make_tokenizer()

# trim model
mt = MT5Trimmer(model, config, tt.trimmed_tokenizer)
mt.make_weights(tt.trimmed_vocab_ids)
mt.make_model()
```

You can directly use the trimmed model with mt.trimmed_model and the trimmed tokenizer with tt.trimmed_tokenizer.

Saving and Loading

```python
# save with
tt.trimmed_tokenizer.save_pretrained('trimT5')
mt.trimmed_model.save_pretrained('trimT5')

# load with
config = MT5Config.from_pretrained("trimT5")
tokenizer = MT5Tokenizer.from_pretrained("trimT5")
model = MT5ForConditionalGeneration.from_pretrained("trimT5")
```

Limitations

  • Fast tokenizers are currently unsupported.
  • TensorFlow and Flax models are currently unsupported.

Roadmap

  • Add support for MarianMT models.
  • Add support for FSMT models.

Issues

Feel free to open an issue if you run into bugs, have any queries or want to request support for an architecture.

Contributing

Contributions are welcome, especially those adding functionality for new or currently unsupported models.

Owner

  • Name: Aditya Srivastava
  • Login: IamAdiSri
  • Kind: user
  • Location: United States
  • Company: University of Colorado, Boulder

Graduate Student at CU Boulder | Ex NLProc and ML Engineer at SentiSum | Ex NLProc Researcher at LTRC IIIT-H | Ex ML Research Intern at ICAR-CNR Italy

Citation (CITATION.cff)

# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: hf-trim
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Aditya
    family-names: Srivastava
    email: adi.srivastava@hotmail.com
    affiliation: Independent
    orcid: 'https://orcid.org/0000-0002-2908-0273'
identifiers:
  - type: url
    value: 'https://github.com/IamAdiSri/hf-trim'
    description: Homepage
abstract: >-
  A package to reduce the size of Hugging Face models
  via vocabulary trimming.
keywords:
  - Machine Learning
  - Deep Learning
  - Neural Networks
  - Artificial Intelligence
  - Python
  - Pytorch
  - Hugging Face
license: MPL-2.0

GitHub Events

Total
  • Watch event: 3
Last Year
  • Watch event: 3

Issues and Pull Requests

Last synced: 9 months ago

All Time
  • Total issues: 6
  • Total pull requests: 0
  • Average time to close issues: 10 days
  • Average time to close pull requests: N/A
  • Total issue authors: 5
  • Total pull request authors: 0
  • Average comments per issue: 4.17
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 1
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 1
  • Pull request authors: 0
  • Average comments per issue: 0.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • IamAdiSri (2)
  • tatiana-iazykova (1)
  • SoshyHayami (1)
  • silver-seashell (1)
  • BakingBrains (1)
Pull Request Authors
Top Labels
Issue Labels
bug (2)
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 21 last month
  • Total dependent packages: 0
  • Total dependent repositories: 1
  • Total versions: 4
  • Total maintainers: 1
pypi.org: hf-trim

A tool to reduce the size of Hugging Face models via vocabulary trimming.

  • Versions: 4
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 21 last month
Rankings
Dependent packages count: 10.1%
Stargazers count: 11.6%
Average: 18.4%
Downloads: 19.1%
Dependent repos count: 21.5%
Forks count: 29.8%
Maintainers (1)
Last synced: 8 months ago

Dependencies

setup.py pypi
  • numpy >=1.22.3
  • protobuf >=3.19.4
  • sentencepiece >=0.1.96
  • tokenizers >=0.11.6
  • torch >=1.11.0
  • tqdm >=4.63.1
  • transformers >=4.17.0