Science Score: 44.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references: not detected
- ○ Academic publication links: not detected
- ○ Academic email domains: not detected
- ○ Institutional organization owner: not detected
- ○ JOSS paper metadata: not detected
- ○ Scientific vocabulary similarity: low similarity (4.1%) to scientific vocabulary
Repository
Basic Info
- Host: GitHub
- Owner: sasukexie
- License: apache-2.0
- Language: Python
- Default Branch: main
- Size: 5.11 MB
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
Install LLaMA Factory:
```bash
git clone --depth 1 https://github.com/hiyouga/LLaMA-Factory.git
cd LLaMA-Factory
pip install -e ".[torch,metrics]"
```
On Windows, install the prebuilt bitsandbytes wheel:
```bash
pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/download/wheels/bitsandbytes-0.41.2.post2-py3-none-win_amd64.whl
```
Datasets: /data/resource
Preprocessing: /tests/preprocess/chain
The /data and /tests data can be downloaded here: https://drive.google.com/drive/folders/1bySWmDulEZ125QKe1eykR5HsGgnTe3-?usp=drive_link
The configuration files can be found under /examples/rec. You can run training with the following commands:
```bash
# pretrain
nohup llamafactory-cli train examples/rec/llama3/lora_pretrain_ml.yaml > log/llama3/ml/pt_`date +"%Y%m%d%H%M%S"`.log 2>&1 &
# sft
nohup llamafactory-cli train examples/rec/llama3/lora_sft_ml.yaml > log/llama3/ml/ft_`date +"%Y%m%d%H%M%S"`.log 2>&1 &
# predict
nohup llamafactory-cli train examples/rec/llama3/lora_predict_ml.yaml > log/llama3/ml/pd_`date +"%Y%m%d%H%M%S"`.log 2>&1 &
```
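The backtick substitution in the commands above stamps each log filename with the launch time, so repeated runs never overwrite each other's logs. A small Python sketch (the `pt` prefix is taken from the pretrain command above) shows the same naming scheme:

```python
from datetime import datetime

def log_name(prefix: str) -> str:
    """Build a log filename like pt_20240315093045.log, mirroring
    the shell's `date +"%Y%m%d%H%M%S"` substitution."""
    return f"{prefix}_{datetime.now():%Y%m%d%H%M%S}.log"

name = log_name("pt")
print(name)
```

Because the timestamp resolves to the second, two runs started in the same second would still collide; the shell commands share that limitation.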
Owner
- Name: sasuke
- Login: sasukexie
- Kind: user
- Repositories: 1
- Profile: https://github.com/sasukexie
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
date-released: 2024-03
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Zheng"
    given-names: "Yaowei"
  - family-names: "Zhang"
    given-names: "Richong"
  - family-names: "Zhang"
    given-names: "Junhao"
  - family-names: "Ye"
    given-names: "Yanhan"
  - family-names: "Luo"
    given-names: "Zheyan"
  - family-names: "Feng"
    given-names: "Zhangchi"
  - family-names: "Ma"
    given-names: "Yongqiang"
title: "LlamaFactory: Unified Efficient Fine-Tuning of 100+ Language Models"
url: "https://arxiv.org/abs/2403.13372"
preferred-citation:
  type: conference-paper
  conference:
    name: "Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)"
  authors:
    - family-names: "Zheng"
      given-names: "Yaowei"
    - family-names: "Zhang"
      given-names: "Richong"
    - family-names: "Zhang"
      given-names: "Junhao"
    - family-names: "Ye"
      given-names: "Yanhan"
    - family-names: "Luo"
      given-names: "Zheyan"
    - family-names: "Feng"
      given-names: "Zhangchi"
    - family-names: "Ma"
      given-names: "Yongqiang"
  title: "LlamaFactory: Unified Efficient Fine-Tuning of 100+ Language Models"
  url: "https://arxiv.org/abs/2403.13372"
  year: 2024
  publisher: "Association for Computational Linguistics"
  address: "Bangkok, Thailand"
```
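Hosting platforms and citation tools read structured fields like `family-names` out of this YAML. A minimal stdlib sketch (regex-based for illustration, not a real YAML parser; the two-author fragment is abbreviated from the file above) extracts the author family names:

```python
import re

# Abbreviated fragment of the CITATION.cff authors list above.
CFF_FRAGMENT = """\
authors:
  - family-names: "Zheng"
    given-names: "Yaowei"
  - family-names: "Zhang"
    given-names: "Richong"
"""

# Collect every quoted value that follows a family-names key.
family_names = re.findall(r'family-names:\s*"([^"]+)"', CFF_FRAGMENT)
print(family_names)  # ['Zheng', 'Zhang']
```

A real consumer would load the file with a YAML library (pyyaml is already in this project's dependency list) rather than a regex.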
GitHub Events
Total
- Push event: 1
- Create event: 2
Last Year
- Push event: 1
- Create event: 2
Dependencies
- nvcr.io/nvidia/pytorch 24.02-py3 build
- ascendai/cann 8.0.rc1-910b-ubuntu22.04-py3.8 build
- hardandheavy/transformers-rocm 2.2.0 build
- accelerate >=0.30.1,<=0.34.2
- datasets >=2.16.0,<=2.21.0
- einops *
- fastapi *
- fire *
- gradio >=4.0.0
- matplotlib >=3.7.0
- numpy <2.0.0
- packaging *
- pandas >=2.0.0
- peft >=0.11.1,<=0.12.0
- protobuf *
- pydantic *
- pyyaml *
- scipy *
- sentencepiece *
- sse-starlette *
- tiktoken *
- transformers >=4.41.2,<=4.45.0
- trl >=0.8.6,<=0.9.6
- uvicorn *
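Several of the ranges above are closed intervals (e.g. `transformers >=4.41.2,<=4.45.0`). A small sketch (naive dotted-version comparison, not PEP 440-aware, so pre-release suffixes like `.post2` are out of scope) checks whether a version falls inside such a range:

```python
def vtuple(version: str) -> tuple:
    """Parse a plain dotted version like '4.45.0' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def in_range(version: str, low: str, high: str) -> bool:
    """True if low <= version <= high (a closed interval, as in '>=x,<=y')."""
    return vtuple(low) <= vtuple(version) <= vtuple(high)

# transformers >=4.41.2,<=4.45.0
ok = in_range("4.45.0", "4.41.2", "4.45.0")      # True: upper bound is inclusive
too_new = in_range("4.46.0", "4.41.2", "4.45.0")  # False: past the upper bound
print(ok, too_new)
```

Real dependency resolvers use the `packaging` library's `SpecifierSet` for this; the tuple comparison here only illustrates why `4.46.0` fails the pin.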