transformers-tf-finetune

Scripts to fine-tune Hugging Face Transformers models with TensorFlow 2

https://github.com/cosmoquester/transformers-tf-finetune

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (8.3%) to scientific vocabulary

Keywords

nlp tensorflow transformers
Last synced: 6 months ago

Repository

Scripts to fine-tune Hugging Face Transformers models with TensorFlow 2

Basic Info
  • Host: GitHub
  • Owner: cosmoquester
  • License: MIT
  • Language: Python
  • Default Branch: master
  • Size: 113 KB
Statistics
  • Stars: 6
  • Watchers: 1
  • Forks: 1
  • Open Issues: 0
  • Releases: 0
Topics
nlp tensorflow transformers
Created over 4 years ago · Last pushed about 1 year ago
Metadata Files
Readme · License · Citation

README.md

Transformers tf finetune


  • Scripts and notebooks to train Hugging Face Transformers models with TensorFlow 2.
  • You can train models with a Jupyter notebook or Python script on your own machine, and even without one you can train from Colab in just two clicks.
  • Select a task below, open it via [Open in Colab], then click [Runtime] - [Run all] to load the data, train, and evaluate automatically.
  • All code supports both GPU and TPU (a minimal sketch of this pattern follows below).


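For illustration, here is a minimal, self-contained sketch of the general pattern such scripts follow: pick a TPU or default (CPU/GPU) distribution strategy, load a pretrained checkpoint with transformers' TF classes, tokenize a few examples, and train with Keras. The checkpoint name, sample sentences, and hyperparameters below are placeholders, not taken from this repository.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Use a TPU if one is reachable, otherwise fall back to the default
# (CPU/GPU) strategy -- the usual pattern for code supporting both.
try:
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)
except ValueError:
    strategy = tf.distribute.get_strategy()

model_name = "klue/bert-base"  # placeholder checkpoint, not from this repo
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Toy binary-classification data (placeholder, NSMC-style sentiment).
texts = ["정말 재밌었어요", "시간이 아까운 영화"]
labels = [1, 0]
encodings = tokenizer(texts, padding=True, truncation=True,
                      max_length=128, return_tensors="tf")
dataset = tf.data.Dataset.from_tensor_slices((dict(encodings), labels)).batch(2)

with strategy.scope():
    model = TFAutoModelForSequenceClassification.from_pretrained(
        model_name, num_labels=2)
    # transformers' TF models compute a suitable loss internally when
    # labels are provided, so compile() only needs an optimizer.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5))

model.fit(dataset, epochs=1)
```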

Tasks

| TaskName | Supported Models | Script | Colab |
| --- | --- | --- | --- |
| Chatbot | EncoderDecoder (e.g. BART, T5, ...) | Link | Open In Colab |
| HateSpeech | BART | Link | Open In Colab |
| KLUE NLI | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | Open In Colab |
| KLUE STS (Bi-Encoder) | SequenceClassification (e.g. BERT, BART, ...) | Link | Open In Colab |
| KLUE TC | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | Open In Colab |
| KorSTS (Bi-Encoder) | SequenceClassification (e.g. BERT, BART, ...) | Link | Open In Colab |
| NSMC | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | Open In Colab |
| QuestionPair | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | Open In Colab |
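A note on the Bi-Encoder STS rows: a bi-encoder embeds each sentence of a pair independently with the same encoder and scores the pair by similarity, rather than feeding both sentences through the model jointly. A rough sketch of that scoring pattern, with a placeholder checkpoint and sentences (not this repo's exact script):

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModel

model_name = "klue/bert-base"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
encoder = TFAutoModel.from_pretrained(model_name)

def embed(sentences):
    """Mean-pool token embeddings into one vector per sentence."""
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="tf")
    hidden = encoder(**enc).last_hidden_state              # (batch, seq, dim)
    mask = tf.cast(enc["attention_mask"], tf.float32)[..., tf.newaxis]
    return tf.reduce_sum(hidden * mask, axis=1) / tf.reduce_sum(mask, axis=1)

# Cosine similarity between the two sides of a sentence pair.
a = tf.math.l2_normalize(embed(["흐린 날씨가 계속된다"]), axis=-1)
b = tf.math.l2_normalize(embed(["하늘이 계속 흐리다"]), axis=-1)
score = tf.reduce_sum(a * b, axis=-1)  # in [-1, 1]; trained to match STS labels
print(float(score[0]))
```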

Owner

  • Name: ParkSangJun
  • Login: cosmoquester
  • Kind: user
  • Location: Seoul, Korea
  • Company: @scatterlab @pingpong-ai

Machine Learning Engineer @scatterlab Korea. Thank you.

Citation (CITATION.cff)

cff-version: 1.2.0
type: generic
message: "If you use this code, please cite this as below."
authors:
- family-names: "Park"
  given-names: "Sangjun"
  orcid: "https://orcid.org/0000-0002-1838-9259"
title: "transformers-tf-finetune"
version: 1.0.0
date-released: 2022-10-28
url: "https://github.com/cosmoquester/transformers-tf-finetune"

GitHub Events

Total
  • Create event: 1
Last Year
  • Create event: 1

Dependencies

requirements-dev.txt (PyPI)
  • black (any version, development)
  • codecov (any version, development)
  • isort (any version, development)
  • pytest (any version, development)
  • pytest-cov (any version, development)
requirements.txt (PyPI)
  • tensorflow >=2
  • tensorflow-addons (any version)
  • transformers (any version)
setup.py (PyPI)
  • tensorflow >=2
pyproject.toml (PyPI)