transformers

Flexible transformer implementation for research

https://github.com/willguimont/transformers

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (8.4%) to scientific vocabulary

Keywords

deep-learning deep-neural-networks learning machine-learning pytorch transformer transformer-architecture
Last synced: 6 months ago

Repository

Flexible transformer implementation for research

Basic Info
  • Host: GitHub
  • Owner: willGuimont
  • License: MIT
  • Language: Python
  • Default Branch: master
  • Homepage:
  • Size: 107 KB
Statistics
  • Stars: 5
  • Watchers: 1
  • Forks: 1
  • Open Issues: 0
  • Releases: 0
Topics
deep-learning deep-neural-networks learning machine-learning pytorch transformer transformer-architecture
Created almost 3 years ago · Last pushed about 1 year ago
Metadata Files
Readme License Citation

README.md

transformers

A collection of easy-to-understand transformer-based models in PyTorch. The implementations are heavily commented and should be easy to follow.

Installation

```bash
pip install git+https://github.com/willGuimont/transformers
```

Implemented models

General:

  • Transformer (Vaswani et al., 2017)
  • Parallel Transformer (Dehghani et al., 2023)
  • PerceiverIO (Jaegle et al., 2022)
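At the core of all of these models is the scaled dot-product attention of Vaswani et al. (2017). The sketch below is illustrative only (it is not this repository's API) and uses plain Python lists to keep the arithmetic visible; a real implementation would use batched tensor operations.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of floats
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention; Q, K, V are lists of float vectors,
    with queries and keys sharing dimension d_k."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # convex combination of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Because the softmax weights sum to one, each output row is a convex combination of the value vectors, pulled toward the values whose keys best match the query.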

Positional encoding:

  • Sinusoidal positional encoding
  • Relative positional encoding
  • Learnable positional encoding
  • Learnable Fourier positional encoding (Li, 2021)
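For reference, the fixed sinusoidal variant from Vaswani et al. (2017) can be written in a few lines. This is an illustrative sketch of the formula, not this repository's implementation:

```python
import math

def sinusoidal_encoding(max_len, d_model):
    """PE[pos][2i] = sin(pos / 10000^(2i/d_model)),
    PE[pos][2i+1] = cos(pos / 10000^(2i/d_model))."""
    pe = [[0.0] * d_model for _ in range(max_len)]
    for pos in range(max_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe
```

Each dimension pair oscillates at a different frequency, so every position gets a distinct pattern and relative offsets correspond to rotations, which is what lets the model attend by relative position.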

Vision:

  • VisionTransformer (Dosovitskiy et al., 2021)
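A Vision Transformer's first step is splitting the image into non-overlapping patches that are flattened into token vectors (Dosovitskiy et al., 2021). The sketch below shows the idea on a single-channel image with plain lists; it is not this repository's code, and a real implementation would typically use a strided `Conv2d` for the same effect.

```python
def patchify(image, patch):
    """image: H x W grid (list of lists of floats/ints), with H and W
    divisible by `patch`; returns one flattened vector per patch,
    scanned left-to-right, top-to-bottom."""
    H, W = len(image), len(image[0])
    patches = []
    for r in range(0, H, patch):
        for c in range(0, W, patch):
            patches.append([image[r + dr][c + dc]
                            for dr in range(patch) for dc in range(patch)])
    return patches
```

Each flattened patch is then linearly projected to the model dimension and treated exactly like a word token, with a positional encoding added.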

NLP:

  • Simple character-level Transformer language model
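A character-level language model needs only a trivial tokenizer: the vocabulary is the set of characters seen in the training text. A minimal sketch of that encoding step (illustrative, not this repository's API):

```python
def make_char_codec(text):
    """Build encode/decode functions over the characters of `text`."""
    chars = sorted(set(text))                      # the vocabulary
    stoi = {ch: i for i, ch in enumerate(chars)}   # char -> id
    itos = {i: ch for ch, i in stoi.items()}       # id -> char
    encode = lambda s: [stoi[c] for c in s]
    decode = lambda ids: "".join(itos[i] for i in ids)
    return encode, decode
```

The resulting integer sequences are what the Transformer consumes; decoding maps sampled ids back to text.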

Next steps

  • VICReg
  • Rotary positional encoding (Su et al., 2021) https://arxiv.org/pdf/2104.09864.pdf
  • Optimizing Deeper Transformers on Small Datasets (Xu et al., 2021)
  • Neural Machine Translation by Jointly Learning to Align and Translate (Bahdanau et al., 2016)
  • Universal Transformers (Dehghani et al., 2018)
  • Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles (Ryali et al., 2023)
  • Swin Transformer (Liu et al., 2021)
  • DINO: Emerging Properties in Self-Supervised Vision Transformers (Caron et al., 2021)
  • FlashAttention (Dao et al., 2022)
  • DETR (Carion et al., 2020)
  • Unlimiformer: Long-Range Transformers with Unlimited Length Input (Bertsch et al., 2023)
  • Point-BERT (Yu et al., 2022)
  • Hydra Attention: Efficient Attention with Many Heads (Bolya et al., 2022)
  • Hyena Hierarchy: Towards Larger Convolutional Language Models (Poli et al., 2023)
  • Thinking Like Transformers (Weiss et al., 2021)
  • Long short-term memory (Hochreiter & Schmidhuber, 1997)
  • Rethinking Positional Encoding in Language Pre-training (Ke et al., 2021)

Cite this repository

@software{Guimont-Martin_transformer_flexible_and_2023,
  author  = {Guimont-Martin, William},
  month   = {2},
  title   = {{transformer: flexible and easy to understand transformer models}},
  version = {0.1.0},
  year    = {2023}
}

Owner

  • Name: William Guimont-Martin
  • Login: willGuimont
  • Kind: user
  • Location: Quebec City, Canada
  • Company: @norlab-ulaval, Université Laval

Ph.D. student in deep learning and mobile robotics at Norlab, Université Laval. Candidate to the engineering profession (CEP). Inventor.

Citation (CITATION.cff)

# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: >-
  transformer: flexible and easy to understand transformer
  models
message: 'If you use this software, please cite it as below.'
type: software
authors:
  - family-names: Guimont-Martin
    given-names: William
    orcid: 'https://orcid.org/0000-0002-8850-8399'
    email: william.guimont-martin.1@ulaval.ca
    affiliation: Université Laval
repository-code: 'https://github.com/willGuimont/transformers'
url: 'https://github.com/willGuimont/transformers'
abstract: >-
  Collection of easy to understand transformer-based models
  in PyTorch. The implementation is heavily commented and
  should be easy to follow.
keywords:
  - deep-learning
  - transformer
license: MIT
version: 0.1.0
date-released: '2023-02-23'

GitHub Events

Total
  • Watch event: 2
  • Push event: 2
  • Fork event: 1
Last Year
  • Watch event: 2
  • Push event: 2
  • Fork event: 1

Committers

Last synced: 8 months ago

All Time
  • Total Commits: 44
  • Total Committers: 1
  • Avg Commits per committer: 44.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 3
  • Committers: 1
  • Avg Commits per committer: 3.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
William Guimont-Martin w****0@h****m 44

Issues and Pull Requests

Last synced: 8 months ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels