llms-from-scratch

Build your own Large Language Model from scratch with this code repository. Learn the ins and outs of LLMs like GPT. 🚀💻

https://github.com/quin12124223/llms-from-scratch

Science Score: 26.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • ○ CITATION.cff file
  • ✓ codemeta.json file: found
  • ✓ .zenodo.json file: found
  • ○ DOI references
  • ○ Academic links in README
  • ○ Academic email domains
  • ○ Institutional organization owner
  • ○ JOSS paper metadata
  • ○ Scientific vocabulary similarity: low similarity (2.6%) to scientific vocabulary
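
The page does not say how the 26.0% figure is derived from these indicators. As a purely hypothetical sketch (the indicator names come from the list above; the equal weighting and the formula are assumptions for illustration, not the actual scoring method), a simple presence-based score could be computed like this:

```python
# Hypothetical illustration only: the real scoring formula is not documented here.
# Indicators marked True are the ones reported as found in the list above.
indicators = {
    "CITATION.cff file": False,
    "codemeta.json file": True,
    ".zenodo.json file": True,
    "DOI references": False,
    "Academic links in README": False,
    "Academic email domains": False,
    "Institutional organization owner": False,
    "JOSS paper metadata": False,
    "Scientific vocabulary similarity": False,  # low similarity (2.6%)
}

score = 100 * sum(indicators.values()) / len(indicators)
print(f"Naive presence score: {score:.1f}%")
```

With these assumptions the naive score comes out to roughly 22.2%, not the reported 26.0%, so the actual method presumably weights the indicators differently (for example, by factoring in the 2.6% vocabulary similarity).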

Keywords

bert book chatgpt deberta flan-t5 from-scratch language-model large-language-models llm llms-book machine-learning mcp neural-networks nlp prompt-engineering python pytorch roberta
Last synced: 6 months ago

Repository

Build your own Large Language Model from scratch with this code repository. Learn the ins and outs of LLMs like GPT. 🚀💻
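
The chapters build a GPT-style model step by step; the tiktoken pin and the ch02 byte-pair-encoder bonus material listed under Dependencies below suggest that GPT-2-style byte-pair encoding is used for tokenization. A minimal sketch of that kind of tokenization, assuming only the tiktoken package (as pinned in requirements.txt), looks like this:

```python
# Minimal sketch: GPT-2-style byte-pair encoding with tiktoken
# (assumes `pip install tiktoken`; not a verbatim excerpt from the repository).
import tiktoken

enc = tiktoken.get_encoding("gpt2")           # GPT-2 BPE vocabulary (50,257 tokens)
ids = enc.encode("Building an LLM from scratch")
print(ids)                                    # list of integer token IDs
print(enc.decode(ids))                        # round-trips back to the original text
print(enc.n_vocab)                            # vocabulary size an embedding layer would need
```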

Basic Info
  • Host: GitHub
  • Owner: quin12124223
  • License: other
  • Language: Jupyter Notebook
  • Default Branch: main
  • Homepage: https://quin12124223.github.io
  • Size: 11.1 MB
Statistics
  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Topics
bert book chatgpt deberta flan-t5 from-scratch language-model large-language-models llm llms-book machine-learning mcp neural-networks nlp prompt-engineering python pytorch roberta
Created 8 months ago · Last pushed 8 months ago
Metadata Files
  • Readme
  • License
  • Citation

Owner

  • Login: quin12124223
  • Kind: user

GitHub Events

Total
  • Push event: 12
  • Create event: 1
Last Year
  • Push event: 12
  • Create event: 1

Dependencies

setup/03_optional-docker-environment/.devcontainer/Dockerfile (docker)
  • pytorch/pytorch 2.5.0-cuda12.4-cudnn9-runtime (build)
ch02/02_bonus_bytepair-encoder/requirements-extra.txt (pypi)
  • requests *
  • tqdm *
  • transformers >=4.33.2
ch04/02_performance-analysis/requirements-extra.txt (pypi)
  • thop *
ch05/06_user_interface/requirements-extra.txt (pypi)
  • chainlit >=1.2.0
ch05/07_gpt_to_llama/requirements-extra.txt (pypi)
  • blobfile >=3.0.0
  • huggingface_hub >=0.24.7
  • ipywidgets >=8.1.2
  • safetensors >=0.4.4
  • sentencepiece >=0.1.99
ch05/07_gpt_to_llama/tests/test-requirements-extra.txt (pypi)
  • pytest >=8.1.1 (test)
  • transformers >=4.44.2 (test)
ch06/03_bonus_imdb-classification/requirements-extra.txt (pypi)
  • scikit-learn >=1.3.0
  • transformers >=4.33.2
ch06/04_user_interface/requirements-extra.txt (pypi)
  • chainlit >=1.2.0
ch07/02_dataset-utilities/requirements-extra.txt (pypi)
  • openai >=1.30.3
  • scikit-learn >=1.3.1
  • tqdm >=4.65.0
ch07/03_model-evaluation/requirements-extra.txt (pypi)
  • openai >=1.30.3
  • tqdm >=4.65.0
ch07/05_dataset-generation/requirements-extra.txt (pypi)
  • openai >=1.30.3
  • tqdm >=4.65.0
ch07/06_user_interface/requirements-extra.txt (pypi)
  • chainlit >=1.2.0
pyproject.toml (pypi)
  • jupyterlab >=4.0
  • matplotlib >=3.7.1
  • numpy >=1.26,<2.1
  • pandas >=2.2.1
  • pip >=25.0.1
  • pytest >=8.3.5
  • tensorflow >=2.18.0
  • tiktoken >=0.5.1
  • torch >=2.3.0
  • tqdm >=4.66.1
requirements.txt (pypi)
  • jupyterlab >=4.0
  • matplotlib >=3.7.1
  • numpy >=1.26,<2.1
  • pandas >=2.2.1
  • psutil >=5.9.5
  • tensorflow >=2.18.0
  • tiktoken >=0.5.1
  • torch >=2.3.0
  • tqdm >=4.66.1
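
The core pins above are lower bounds rather than exact versions. A quick, hedged way to check that a local environment satisfies them is a sketch like the following (it assumes the third-party packaging package is installed, e.g. via `pip install packaging`; the pins are copied from the requirements.txt listing above):

```python
# Sketch: check the current environment against the requirements.txt lower bounds.
from importlib.metadata import version, PackageNotFoundError
from packaging.specifiers import SpecifierSet

pins = {
    "jupyterlab": ">=4.0",
    "matplotlib": ">=3.7.1",
    "numpy": ">=1.26,<2.1",
    "pandas": ">=2.2.1",
    "psutil": ">=5.9.5",
    "tensorflow": ">=2.18.0",
    "tiktoken": ">=0.5.1",
    "torch": ">=2.3.0",
    "tqdm": ">=4.66.1",
}

for name, spec in pins.items():
    try:
        installed = version(name)
    except PackageNotFoundError:
        print(f"{name}: not installed (requires {spec})")
        continue
    status = "ok" if installed in SpecifierSet(spec) else f"does not satisfy {spec}"
    print(f"{name} {installed}: {status}")
```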