Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.0%) to scientific vocabulary
Last synced: 6 months ago

Repository

Basic Info
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created over 2 years ago · Last pushed over 2 years ago
Metadata Files
  • Readme
  • License
  • Code of conduct
  • Citation

README.md

LLMStack

LLMStack is a no-code platform for building generative AI applications, chatbots, agents and connecting them to your data and business processes.

Quickstart | Documentation | Promptly

Overview

Build tailor-made generative AI applications, chatbots and agents that cater to your unique needs by chaining multiple LLMs. Seamlessly integrate your own data and GPT-powered models without any coding experience using LLMStack's no-code builder. Trigger your AI chains from Slack or Discord. Deploy to the cloud or on-premise.

llmstack-quickstart

See full demo video here

Getting Started

Check out our Cloud offering at Promptly or follow the instructions below to deploy LLMStack on your own infrastructure.

Clone this repository or download the latest release. Install Docker if it is not already installed. Copy .env.prod to .env and update SECRET_KEY, CIPHER_SALT and DATABASE_PASSWORD in the .env file:

cp .env.prod .env
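
One way to generate strong random values for these settings is with openssl. This is a sketch, not part of LLMStack: the use of openssl and the key lengths shown are assumptions, not project requirements.

```shell
# Generate random hex values for the secrets in .env (lengths are illustrative).
SECRET_KEY=$(openssl rand -hex 32)
CIPHER_SALT=$(openssl rand -hex 16)
DATABASE_PASSWORD=$(openssl rand -hex 16)

# Print them for pasting into the corresponding lines of .env.
echo "SECRET_KEY=${SECRET_KEY}"
echo "CIPHER_SALT=${CIPHER_SALT}"
echo "DATABASE_PASSWORD=${DATABASE_PASSWORD}"
```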

Run LLMStack using the following command:

./run-llmstack.sh

If you are on Windows, you can use run-llmstack.bat instead.

Once LLMStack is up and ready, it should automatically open your browser and point it to localhost:3000. Alternatively, you can run docker compose up to start the containers manually and open localhost:3000 to log in to the platform. Make sure to wait for the API server to be ready before trying to load LLMStack.
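
The wait for readiness can be scripted. A minimal sketch (this helper is not part of LLMStack) that retries a command until it succeeds, such as a curl against the UI:

```shell
# wait_for TRIES CMD...: retry CMD up to TRIES times, sleeping 1s between
# attempts; returns non-zero if CMD never succeeds. Not part of LLMStack.
wait_for() {
  tries=$1; shift
  i=0
  until "$@" >/dev/null 2>&1; do
    i=$((i + 1))
    if [ "$i" -ge "$tries" ]; then return 1; fi
    sleep 1
  done
}

# Example: block until the UI answers, then open it in a browser.
# wait_for 60 curl -fsS http://localhost:3000 && xdg-open http://localhost:3000
```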

The LLMStack deployment comes with a default admin account whose username is admin and password is promptly. Be sure to change the password from the admin panel after logging in.

Users of the platform can add their own keys for providers like OpenAI, Cohere, Stability etc. from the Settings page. If you want to provide default keys for all users of your LLMStack instance, you can add them to the .env file. Make sure to restart the containers after adding the keys.

Remember to update POSTGRES_VOLUME, REDIS_VOLUME and WEAVIATE_VOLUME in the .env file if you want to persist data across container restarts.
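
For example, the following .env lines would keep the data under a local ./data directory. The variable names are the ones referenced above; the paths are illustrative assumptions, so pick locations that suit your setup.

```shell
# Illustrative host paths for persisting container data across restarts.
POSTGRES_VOLUME=./data/postgres
REDIS_VOLUME=./data/redis
WEAVIATE_VOLUME=./data/weaviate
```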

LLMStack: Quickstart video

Features

🔗 Chain multiple models: LLMStack allows you to chain multiple LLMs together to build complex generative AI applications.

📊 Use generative AI on your data: Import your data into your account and use it in AI chains. LLMStack supports importing various data types (CSV, TXT, PDF, DOCX, PPTX etc.) from a variety of sources (Google Drive, Notion, websites, direct uploads etc.). The platform takes care of preprocessing and vectorizing your data and stores it in the vector database that is provided out of the box.

🛠️ No-code builder: LLMStack comes with a no-code builder that allows you to build AI chains without any coding experience. You can chain multiple LLMs together and connect them to your data and business processes.

☁️ Deploy to the cloud or on-premise: LLMStack can be deployed to the cloud or on-premise. You can deploy it to your own infrastructure or use our cloud offering at Promptly.

🚀 API access: Apps or chatbots built with LLMStack can be accessed via HTTP API. You can also trigger your AI chains from Slack or Discord.

🏢 Multi-tenant: LLMStack is multi-tenant. You can create multiple organizations and add users to them. Users can only access the data and AI chains that belong to their organization.

What can you build with LLMStack?

Using LLMStack you can build a variety of generative AI applications, chatbots and agents. Here are some examples:

📝 Text generation: You can build apps that generate product descriptions, blog posts, news articles, tweets, emails, chat messages, etc., by using text generation models and optionally connecting your data. Check out this marketing content generator for example

🤖 Chatbots: You can build chatbots powered by ChatGPT and trained on your data, like Promptly Help, which is embedded on the Promptly website.

🎨 Multimedia generation: Build complex applications that can generate text, images, videos, audio, etc. from a prompt. This story generator is an example

🗣️ Conversational AI: Build conversational AI systems that can have a conversation with a user. Check out this Harry Potter character chatbot

🔍 Search augmentation: Build search augmentation systems that enrich search results with additional information using APIs. Sharebird uses LLMStack to augment search results with an AI-generated answer from their content, similar to Bing's chatbot.

💬 Discord and Slack bots: Apps built on LLMStack can be triggered from Slack or Discord. You can easily connect your AI chains to Slack or Discord from LLMStack's no-code app editor. Check out our Discord server to interact with one such bot.

Administration

Log in to http://localhost:3000/admin using the admin account. You can add users and assign them to organizations in the admin panel.

Cloud Offering

Check out our cloud offering at Promptly. You can sign up for a free account and start building your own generative AI applications.

Documentation

Check out our documentation at llmstack.ai/docs to learn more about LLMStack.

Development

Run the following commands from the root of the repository to bring up the application containers in development mode. Make sure you have Docker and npm installed on your system before running these commands.

cd client
npm install
npm run build
cd ..
docker compose -f docker-compose.dev.yml --env-file .env.dev up --build

This will mount the source code into the containers and restart the containers on code changes. Update .env.dev as needed. Please note that LLMStack is available at http://localhost:9000 in development mode.

You can skip running npm install and npm run build if you have already built the client before.

For frontend development, you can use npm start to start the development server in the client directory. You can also use npm run build to build the frontend and serve it from the backend server.

To update the documentation, make changes in the web/docs directory and run npm run build in the web directory to build the documentation. You can use npm start in the web directory to serve the documentation locally.

Contributing

We welcome contributions to LLMStack. Please check out our contributing guide to learn more about how you can contribute to LLMStack.

Owner

  • Login: Yanking1202
  • Kind: user

Citation (CITATION.cff)

cff-version: 1.2.0
title: "LLMStack: A platform to build and deploy LLM applications"
message: "If you use this software, please cite it as below."
type: software
authors:
  - given-names: "Ajay Kumar"
    family-names: "Chintala"
  - given-names: "Vignesh"
    family-names: "Aigal"
url: "https://github.com/trypromptly/llmstack"

GitHub Events


Issues and Pull Requests

Last synced: almost 2 years ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0

Dependencies

.github/workflows/codeql.yml actions
  • actions/checkout v3 composite
  • github/codeql-action/analyze v2 composite
  • github/codeql-action/autobuild v2 composite
  • github/codeql-action/init v2 composite
.github/workflows/docker-images.yml actions
  • actions/checkout v3 composite
  • docker/build-push-action v4 composite
  • docker/login-action v2.2.0 composite
  • docker/metadata-action v4 composite
  • docker/setup-buildx-action v2 composite
  • docker/setup-qemu-action v2 composite
  • sigstore/cosign-installer v3.1.1 composite
Dockerfile docker
  • python 3.11 build
  • python 3.11-slim build
docker-compose.dev.yml docker
  • llmstack-playwright latest
  • postgres 15.1-alpine
  • redis alpine
  • semitechnologies/weaviate 1.20.5
docker-compose.yml docker
  • ${REGISTRY:-ghcr.io/trypromptly/}llmstack-api
  • ${REGISTRY:-ghcr.io/trypromptly/}llmstack-nginx
  • ${REGISTRY:-ghcr.io/trypromptly/}llmstack-playwright
  • postgres 15.1-alpine
  • redis alpine
  • semitechnologies/weaviate 1.20.5
playwright/Dockerfile docker
  • mcr.microsoft.com/playwright v1.37.0-jammy build
client/package-lock.json npm
  • 1546 dependencies
client/package.json npm
  • http-proxy-middleware ^2.0.6 development
  • source-map-explorer ^2.5.3 development
  • @ginkgo-bioworks/react-json-schema-form-builder ^2.10.1
  • @jsonforms/material-renderers ^3.0.0
  • @jsonforms/react ^3.0.0
  • @lexical/react ^0.9.1
  • @mui/icons-material ^5.11.11
  • @mui/lab ^5.0.0-alpha.124
  • @rjsf/core ^5.2.0
  • @rjsf/mui ^5.2.0
  • @rjsf/utils ^5.2.0
  • @rjsf/validator-ajv8 ^5.2.0
  • @testing-library/jest-dom ^5.16.5
  • @testing-library/react ^13.4.0
  • @testing-library/user-event ^13.5.0
  • antd ^5.1.6
  • antd-img-crop ^4.12.2
  • axios ^1.3.1
  • js-yaml ^4.1.0
  • lexical ^0.9.1
  • liquidjs ^10.7.0
  • moment ^2.29.4
  • notistack ^3.0.1
  • pretty-bytes ^6.1.0
  • react ^18.2.0
  • react-ace ^10.1.0
  • react-cookie ^4.1.1
  • react-diff-viewer-continued ^3.2.5
  • react-dom ^18.2.0
  • react-dropzone ^14.2.3
  • react-ga4 ^2.0.0
  • react-markdown ^8.0.5
  • react-router-dom ^6.7.0
  • react-scripts 5.0.1
  • react-share ^4.4.1
  • recoil ^0.7.6
  • remark-gfm ^3.0.1
  • ua-parser-js ^1.0.35
  • web-vitals ^2.1.4
web/package-lock.json npm
  • 1031 dependencies
web/package.json npm
  • @docusaurus/module-type-aliases 2.4.1 development
  • @docusaurus/core 2.4.1
  • @docusaurus/plugin-google-gtag ^2.4.1
  • @docusaurus/plugin-sitemap ^2.4.1
  • @docusaurus/preset-classic 2.4.1
  • @mdx-js/react ^1.6.22
  • clsx ^1.2.1
  • prism-react-renderer ^1.3.5
  • react ^17.0.2
  • react-dom ^17.0.2
  • react-github-btn ^1.4.0
  • react-player ^2.12.0
requirements_base.txt pypi
  • Authlib ==1.2.0
  • Automat ==22.10.0
  • Django ==4.2.1
  • Jinja2 ==3.1.2
  • PyJWT ==2.6.0
  • PyYAML ==6.0
  • SQLAlchemy ==1.4.46
  • Twisted ==22.10.0
  • aiohttp ==3.8.4
  • aiosignal ==1.3.1
  • anyio ==3.6.2
  • asgiref ==3.6.0
  • async-timeout ==4.0.2
  • attrs ==22.2.0
  • autobahn ==23.1.2
  • backoff ==2.2.1
  • beautifulsoup4 ==4.12.2
  • channels ==4.0.0
  • click ==8.1.3
  • constantly ==15.1.0
  • defusedxml ==0.7.1
  • django-allauth ==0.52.0
  • django-environ ==0.10.0
  • django-flags ==5.0.12
  • django-jsonform ==2.17.4
  • django-picklefield ==3.1
  • django-redis ==5.2.0
  • django-rq ==2.7.0
  • djangorestframework ==3.14.0
  • fastapi ==0.93.0
  • geoip2 ==4.7.0
  • google-auth ==2.22.0
  • gunicorn ==20.1.0
  • h11 ==0.14.0
  • h2 ==4.1.0
  • httpcore ==0.16.3
  • httptools ==0.5.0
  • httpx ==0.23.3
  • hyperlink ==21.0.0
  • idna ==3.4
  • importlib-metadata ==6.0.0
  • incremental ==22.10.0
  • joblib ==1.2.0
  • jsonschema ==4.17.3
  • lz4 ==4.3.2
  • marshmallow ==3.19.0
  • marshmallow-enum ==1.5.1
  • multidict ==6.0.4
  • oauthlib ==3.2.2
  • orjson ==3.8.14
  • packaging ==23.0
  • playwright ==1.35.0
  • protobuf ==3.19.5
  • psycopg2-binary ==2.9.5
  • pyOpenSSL ==23.1.1
  • pyasn1 ==0.4.8
  • pyasn1-modules ==0.2.8
  • pykka ==3.1.1
  • python-dateutil ==2.8.2
  • python3-openid ==3.2.0
  • pytz ==2022.7
  • redis ==4.5.4
  • rfc3986 ==1.5.0
  • rq ==1.13.0
  • sendgrid ==6.10.0
  • service-identity ==21.1.0
  • sqlparse ==0.4.3
  • starlette ==0.25.0
  • tenacity ==8.2.2
  • txaio ==23.1.1
  • typer ==0.7.0
  • typing-inspect ==0.8.0
  • typing_extensions ==4.5.0
  • ujson ==5.7.0
  • urllib3 ==1.26.13
  • uvicorn ==0.21.0
  • uvloop ==0.17.0
  • validators ==0.19.0
  • weaviate-client ==3.22.1
  • websockets ==10.4
  • yarl ==1.8.2
  • zope.interface ==6.0
requirements_datasources.txt pypi
  • Markdown ==3.4.4
  • PyNaCl ==1.5.0
  • Scrapy ==2.8.0
  • boto3 ==1.26.122
  • botocore ==1.29.122
  • ffmpeg-python ==0.2.0
  • pdf2image ==1.16.3
  • pdfminer.six ==20221105
  • pydub ==0.25.1
  • python-docx ==0.8.11
  • python-magic ==0.4.27
  • python-pptx ==0.6.21
  • spacy ==3.6.0
  • striprtf ==0.0.22
  • tiktoken ==0.4.0
  • unstructured ==0.9.0
  • yt-dlp ==2023.3.4
requirements_processors.txt pypi
  • openai ==0.27.0
  • stability-sdk ==0.8.4