transformers-tf-finetune
Scripts to fine-tune Hugging Face Transformers models with TensorFlow 2
Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (8.3%) to scientific vocabulary
Keywords
Repository
Scripts to fine-tune Hugging Face Transformers models with TensorFlow 2
Basic Info
Statistics
- Stars: 6
- Watchers: 1
- Forks: 1
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
Transformers tf finetune
- Scripts and notebooks to train Hugging Face Transformers models with TensorFlow 2.
- You can train models with Jupyter notebooks or Python scripts on a separate machine, or, without one, train in two clicks from Colab.
- Select a task below, open it with [Open in Colab], and click [Runtime]-[Run all] to automatically load the data, train, and evaluate.
- All code supports both GPU and TPU.
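The fine-tuning setup these scripts automate can be sketched with the Transformers TF API. This is a minimal sketch, not the repo's actual code; the model name, learning rate, and dataset variables are illustrative assumptions:

```python
def build_finetune_model(model_name: str, num_labels: int):
    """Load a pretrained checkpoint and compile it for sequence classification."""
    # Heavy imports kept inside the function so the sketch is cheap to import.
    import tensorflow as tf
    from transformers import TFAutoModelForSequenceClassification

    model = TFAutoModelForSequenceClassification.from_pretrained(
        model_name, num_labels=num_labels
    )
    # Hugging Face TF models output raw logits, hence from_logits=True.
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=5e-5),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )
    return model


# Example (downloads pretrained weights when run):
# model = build_finetune_model("klue/bert-base", num_labels=2)
# model.fit(train_dataset, validation_data=dev_dataset, epochs=3)
```

The same compiled model works on GPU or TPU; for TPU, the construction would be wrapped in a `tf.distribute.TPUStrategy` scope.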
Tasks
| TaskName | Supported Models | Script | Colab |
| --- | --- | --- | --- |
| Chatbot | EncoderDecoder (e.g. BART, T5, ...) | Link | |
| HateSpeech | BART | Link | |
| KLUE NLI | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | |
| KLUE STS (Bi-Encoder) | SequenceClassification (e.g. BERT, BART, ...) | Link | |
| KLUE TC | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | |
| KorSTS (Bi-Encoder) | SequenceClassification (e.g. BERT, BART, ...) | Link | |
| NSMC | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | |
| QuestionPair | SequenceClassification (e.g. BERT, BART, GPT, ...) | Link | |
Owner
- Name: ParkSangJun
- Login: cosmoquester
- Kind: user
- Location: Seoul, Korea
- Company: @scatterlab @pingpong-ai
- Website: https://cosmoquester.github.io
- Repositories: 12
- Profile: https://github.com/cosmoquester
Machine Learning Engineer @scatterlab Korea. Thank you.
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
type: generic
message: "If you use this code, please cite this as below."
authors:
  - family-names: "Park"
    given-names: "Sangjun"
    orcid: "https://orcid.org/0000-0002-1838-9259"
title: "transformers-tf-finetune"
version: 1.0.0
date-released: 2022-10-28
url: "https://github.com/cosmoquester/transformers-tf-finetune"
```
GitHub Events
Total
- Create event: 1
Last Year
- Create event: 1
Dependencies
- black * development
- codecov * development
- isort * development
- pytest * development
- pytest-cov * development
- tensorflow >=2
- tensorflow-addons *
- transformers *
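The runtime dependencies above correspond to a pip requirements fragment like the following (a sketch; development-only tools such as black, isort, and pytest omitted):

```
tensorflow>=2
tensorflow-addons
transformers
```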