tensorlm-webui

Simple and modern web UI for LLM models based on LLaMA.

https://github.com/ehristoforu/tensorlm-webui

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.5%) to scientific vocabulary

Keywords

ai fooocus gradio gradio-python-llm linux llamacpp llm llm-inference lm macos ml portable tensorlm text-generation-webui ui webui windows
Last synced: 6 months ago

Repository

Simple and modern web UI for LLM models based on LLaMA.

Basic Info
Statistics
  • Stars: 6
  • Watchers: 2
  • Forks: 5
  • Open Issues: 0
  • Releases: 8
Topics
ai fooocus gradio gradio-python-llm linux llamacpp llm llm-inference lm macos ml portable tensorlm text-generation-webui ui webui windows
Created about 2 years ago · Last pushed over 1 year ago
Metadata Files
Readme · Funding · License · Citation · Codeowners

README.md

TensorLM - webui for LLM models

(preview image)

This aims to be what Fooocus is to Stable Diffusion, but for text generation: the same ease of use and the same convenience.

This is a simple and modern Gradio web UI for LLaMA-based LLM models in GGML (.bin) or GGUF (.gguf) format.
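The README does not show the loading code itself, but llama-cpp-python is listed in requirements.txt, so loading such a model presumably follows the usual llama-cpp-python pattern. A minimal, illustrative sketch (the model path is a placeholder, and GGUF support depends on the llama-cpp-python version installed):

from llama_cpp import Llama

# Placeholder path: point this at a GGML (.bin) or GGUF (.gguf) model you have downloaded.
# Older llama-cpp-python releases load GGML only; newer ones load GGUF.
llm = Llama(model_path="models/model.gguf", n_ctx=2048)

# One-off completion; max_tokens and temperature are the usual sampling knobs.
output = llm("Q: What is a GGUF model? A:", max_tokens=128, temperature=0.7)
print(output["choices"][0]["text"])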

You can try this app online in a (very slow) demo of version 3.0.0: Open In Spaces


Navigation: Installing · Presets · Model downloading · API · Args

Fast use

You can use this web UI in the Colab cloud service: Open In Colab

Features

  • Simple to use
  • Comfortable to work with
  • Light on system resources
  • Beautiful, pleasant interface
  • Supports GGML and GGUF formats (.bin and .gguf)
  • Supports the OpenAI API and MistralAI API

Installing

On Windows

>>> Portable one-click package <<<

Step-by-step installation:
1. Install Python 3.10.6 and Git
2. Run git clone https://github.com/ehristoforu/TensorLM-webui.git
3. Run cd TensorLM-webui
4. Run update_mode.bat and, at its prompt, enter 1 and then 2
5. Run start.bat

Or download the .exe installer from this repo.

On macOS

Step-by-step installation:
1. Install Python 3.10.6 and Git
2. Run git clone https://github.com/ehristoforu/TensorLM-webui.git
3. Run cd TensorLM-webui
4. Run python -m pip install -r requirements.txt
5. Run python webui.py

On Linux

Step-by-step installation:
1. Install Python 3.10.6 and Git
2. Run git clone https://github.com/ehristoforu/TensorLM-webui.git
3. Run cd TensorLM-webui
4. Run python -m pip install -r requirements.txt
5. Run python webui.py

Presets

This app ships with 23 default presets. Thanks to @mustvlad for the system prompts!

You can also create custom presets; instructions are in the presets folder (a .md file).

Model downloading

With this interface you don't need to scour the Internet for a compatible model: enable the "Tabs" checkbox and, in the "ModelGet" tab, choose a model to download from our verified repository on Hugging Face.
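The README does not say exactly how the download works, but huggingface_hub is in requirements.txt, so a tab like ModelGet most plausibly calls hf_hub_download. A hedged sketch; the repository and filename below are examples only, not necessarily the project's verified repository:

from huggingface_hub import hf_hub_download

# Example repo_id and filename only; substitute whatever model the ModelGet tab offers.
model_path = hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGUF",
    filename="llama-2-7b-chat.Q4_K_M.gguf",
    local_dir="models",
)
print("Model saved to:", model_path)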

API

Warning: an Internet connection is required.

OpenAI API

(OpenAI API preview screenshot)

You can use the OpenAI API in this UI (select "OpenAI" in the Mode radio button).

  • You can select an OpenAI model or enter your own custom model name
  • You can select an OpenAI endpoint or enter your own custom endpoint

To set your OpenAI key, open configure.txt and fill in the openai_key parameter.
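What the UI does with that key is, in the end, a standard OpenAI chat-completions request. Purely as an illustration (not the project's actual code), such a request made directly with the requests library looks roughly like this:

import requests

OPENAI_KEY = "sk-..."  # the value you put in configure.txt as openai_key
ENDPOINT = "https://api.openai.com/v1/chat/completions"  # or your custom endpoint

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {OPENAI_KEY}"},
    json={
        "model": "gpt-3.5-turbo",  # or the custom model name entered in the UI
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])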

MistralAI API

(MistralAI API preview screenshot)

You can use the MistralAI API (served via Hugging Face) in this UI for free (select "MistralAI" in the Mode radio button); a sketch of such a call follows below.

  • You can select a MistralAI model or enter your own custom model name
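"MistralAI API (from HuggingFace)" presumably refers to Mistral models served through the Hugging Face Inference API, which huggingface_hub (already in requirements.txt) can call. A hedged sketch; the model name is only an example:

from huggingface_hub import InferenceClient

# Example model; the UI lets you pick or type a different Mistral model.
client = InferenceClient("mistralai/Mistral-7B-Instruct-v0.2")

# Anonymous usage is rate-limited; pass token="hf_..." to InferenceClient for higher limits.
reply = client.text_generation("Explain GGUF in one sentence.", max_new_tokens=100)
print(reply)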

Args

To use args:
  • On Windows: edit start.bat in Notepad and change the line python webui.py to python webui.py [your args], e.g. python webui.py --inbrowser
  • On macOS and Linux: run python webui.py with your args, e.g. python webui.py --inbrowser

Args list

--inbrowser --share --lowvram --debug --quiet
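The README does not show how webui.py consumes these flags; the sketch below is purely illustrative of how boolean flags like these are typically parsed with argparse, and the real script may differ:

import argparse

parser = argparse.ArgumentParser(description="TensorLM webui (illustrative argument parsing)")
parser.add_argument("--inbrowser", action="store_true", help="open the UI in the default browser")
parser.add_argument("--share", action="store_true", help="create a public Gradio share link")
parser.add_argument("--lowvram", action="store_true", help="reduce memory usage")
parser.add_argument("--debug", action="store_true", help="enable debug output")
parser.add_argument("--quiet", action="store_true", help="suppress non-essential output")
args = parser.parse_args()

# Gradio's launch() accepts matching keyword arguments for several of these flags, e.g.:
# demo.launch(inbrowser=args.inbrowser, share=args.share, quiet=args.quiet)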

Forks

There are no forks yet 😔; perhaps you will be the first to significantly improve this application!

Citation

@software{ehristoforu_TensorLM-webui_2024,
  author = {ehristoforu},
  month = apr,
  title = {{TensorLM-webui}},
  url = {https://github.com/ehristoforu/TensorLM-webui},
  year = {2024}
}

Owner

  • Name: Evgeniy Hristoforu
  • Login: ehristoforu
  • Kind: user
  • Location: The Diffusers World
  • Company: @openskyml

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - given-names: ehristoforu
title: "TensorLM-webui"
date-released: 2024-04-03
url: "https://github.com/ehristoforu/TensorLM-webui"

GitHub Events

Total
  • Watch event: 1
  • Fork event: 1
Last Year
  • Watch event: 1
  • Fork event: 1

Committers

Last synced: 7 months ago

All Time
  • Total Commits: 44
  • Total Committers: 1
  • Avg Commits per committer: 44.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
  • Evgeniy Hristoforu (email: 1****u): 44 commits

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors: none (no issue or pull request authors)
Top Labels: none (no issue or pull request labels)

Dependencies

requirements.txt (pypi)
  • art ==6.1
  • gradio ==4.1.0
  • huggingface_hub *
  • llama-cpp-python ==0.1.73
  • python-dotenv *
  • torch ==2.0.1
  • transformers *