https://github.com/faisalhakimi22/automated-customer-support-chatbot
A smart AI-powered chatbot designed to automate customer support by combining Rasa's conversation management with OpenAI's GPT models. The chatbot understands queries, provides relevant responses, and can be deployed across multiple platforms.
Science Score: 26.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (10.4%) to scientific vocabulary
Repository
Basic Info
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
Automated Customer Support Chatbot
A next-generation hybrid customer support chatbot combining the power of Rasa 3, LangChain, Llama.cpp, and FastAPI, with a sleek Streamlit chat UI. Deliver both structured and generative answers, leveraging local LLMs and Retrieval-Augmented Generation (RAG), for exceptional customer experiences.
Features
- Hybrid Intelligence: Rasa's robust dialogue + LLM-powered generative answers
- Local LLaMA Backend: Fast, private, and cost-effective (supports `unsloth.Q8_0.gguf` and more)
- RAG Support: LangChain + ChromaDB for document-grounded answers
- FastAPI Microservice: Scalable LLM API for custom actions
- Modern Streamlit UI: Chat-style, responsive, and cloud-ready
- Flexible Deployment: Run locally, on your server, or deploy the UI to Streamlit Cloud
Architecture
```mermaid
flowchart TD
    A["User"] --> B["Streamlit Chat UI"]
    B --> C["Rasa 3 Server"]
    C -- "Custom Action (action_ask_gpt)" --> D["FastAPI LLM API"]
    D --> E["Llama.cpp LLM"]
    D --> F["LangChain RAG (ChromaDB, Docs)"]
```
Project Structure
```text
Automated-Customer-Support-Chatbot/
    actions/            # Custom Rasa actions (calls LLM API)
    data/               # Rasa NLU, stories, rules
    models/             # Trained Rasa models, LLaMA GGUF files
    results/            # Output, logs, etc.
    app.py              # Streamlit frontend
    llm_api.py          # FastAPI LLM+RAG backend
    start_chatbot.py    # Script to launch all services locally
    requirements.txt    # Python dependencies
    README.md           # This file
    ...                 # Other configs and scripts
```
How It Works
- User chats via the Streamlit web UI
- Streamlit sends messages to the Rasa 3 server (REST API)
- Rasa handles intent/entity recognition and dialogue
- For open-ended/knowledge queries, Rasa triggers `action_ask_gpt`:
  - Calls the FastAPI LLM API
  - API uses LangChain to retrieve context (RAG) and generate a response with Llama.cpp
- The answer flows back to the user via Rasa and Streamlit
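The hand-off in the `action_ask_gpt` step can be sketched as a small helper the custom action might use. The endpoint URL and JSON field names (`query`, `history`, `answer`) below are assumptions for illustration; the real contract is defined in `llm_api.py`.

```python
import json
from urllib import request

# Assumed FastAPI endpoint; adjust to match llm_api.py.
LLM_API_URL = "http://localhost:8001/generate"

def build_llm_payload(user_message, history=None):
    """Assemble the JSON body the custom action sends to the LLM API."""
    return {"query": user_message, "history": history or []}

def ask_llm(user_message, timeout=30.0):
    """POST the user's question to the LLM API and return the generated answer."""
    body = json.dumps(build_llm_payload(user_message)).encode("utf-8")
    req = request.Request(
        LLM_API_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["answer"]
```

Inside a Rasa custom action, the returned string would then be sent back to the user with `dispatcher.utter_message(text=...)`.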
Tech Stack
| Backend         | LLM & RAG | Frontend  | Deployment        |
|-----------------|-----------|-----------|-------------------|
| Rasa 3          | Llama.cpp | Streamlit | Local/Cloud       |
| FastAPI         | LangChain |           | Streamlit Cloud   |
| Python 3.8-3.10 | ChromaDB  |           | Docker (optional) |
Quickstart
1. Clone the Repository
```sh
git clone https://github.com/your-username/Automated-Customer-Support-Chatbot.git
cd Automated-Customer-Support-Chatbot
```
2. Install Dependencies
```sh
pip install -r requirements.txt
```
3. Set Up Environment Variables
Create a .env file:
```env
OPENAI_API_KEY=your_openai_key
GGUF_MODEL_PATH=path_to_your_llama_model.gguf
```
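These values should not be hard-coded in the application. The usual choice is the python-dotenv package, but as a dependency-free sketch, a minimal `.env` loader (assuming simple `KEY=VALUE` lines) could look like:

```python
import os
from pathlib import Path

def load_env(path=".env"):
    """Minimal .env loader: reads KEY=VALUE lines, skipping blanks and
    comments, and publishes values via os.environ without overriding
    variables that are already set."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
        os.environ.setdefault(key.strip(), value.strip())
    return env
```

After calling `load_env()`, the backend can read `os.environ["GGUF_MODEL_PATH"]` to locate the LLaMA model file.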
4. Train and Start Rasa
```sh
rasa train
rasa run --enable-api
```
5. Start the LLM API
```sh
python llm_api.py
```
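The LLM API performs RAG in two phases: retrieve relevant documents, then generate an answer grounded in them. The real service uses LangChain with ChromaDB embedding search; as a toy, stdlib-only stand-in that illustrates the retrieval phase, a keyword-overlap ranker:

```python
def retrieve(query, documents, top_k=2):
    """Rank documents by word overlap with the query. This is a toy
    stand-in for the embedding-similarity search ChromaDB performs;
    it only illustrates the shape of the retrieval step."""
    q_words = set(query.lower().split())
    scored = [(len(q_words & set(doc.lower().split())), doc) for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]
```

The retrieved snippets would then be prepended to the prompt before Llama.cpp generates the final answer.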
6. Start the Streamlit Frontend
```sh
streamlit run app.py
```
Or use `start_chatbot.py` to launch all services together (locally).
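The contents of `start_chatbot.py` are not shown here, but a launcher like it can be sketched with `subprocess`; the exact commands and flags below are assumptions based on steps 4-6:

```python
import subprocess

# The three services from the quickstart; flags/ports are assumptions.
SERVICES = [
    ["rasa", "run", "--enable-api"],
    ["python", "llm_api.py"],
    ["streamlit", "run", "app.py"],
]

def launch_all(commands=SERVICES):
    """Start each service as a child process and return the handles,
    so the caller can wait on or terminate them later."""
    return [subprocess.Popen(cmd) for cmd in commands]
```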
Deploying on Streamlit Cloud
- Only the Streamlit frontend is deployed on Streamlit Cloud.
- Rasa and the LLM API must run on a separate server (local, VM, or cloud).
- Set the Rasa server URL in your Streamlit Cloud environment variables.
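For example, the frontend can derive the webhook URL from that environment variable. The variable name `RASA_SERVER_URL` is an assumption; `/webhooks/rest/webhook` is Rasa's standard REST channel endpoint:

```python
import os

def rasa_webhook_url(default="http://localhost:5005"):
    """Build the Rasa REST webhook URL from RASA_SERVER_URL (an assumed
    variable name), falling back to a local default."""
    base = os.environ.get("RASA_SERVER_URL", default).rstrip("/")
    return f"{base}/webhooks/rest/webhook"
```

On Streamlit Cloud, the same value could instead come from `st.secrets`.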
Customization
- Add Documents: Place PDFs, Markdown, or text files in the `docs/` folder for RAG.
- NLU & Stories: Edit `data/nlu.yml`, `data/stories.yml`, and `data/rules.yml` for intents and flows.
- Custom Actions: Extend logic in `actions/`.
- Model: Swap out LLaMA models by changing `GGUF_MODEL_PATH`.
Troubleshooting
- See TROUBLESHOOTING.md for common issues and solutions.
- Key tips:
- Ensure Rasa and LLM API are running and accessible
- Check environment variables and API keys
- Review logs for errors
License
MIT License
Made by Faisal Hakimi
Owner
- Name: Faisal Hakimi
- Login: Faisalhakimi22
- Kind: user
- Location: Pakistan
- Website: https://medium.com/@faisalh5556
- Repositories: 1
- Profile: https://github.com/Faisalhakimi22
Computer Science | Aspiring Data Analyst | AI Enthusiast | Machine Learning
GitHub Events
Total
- Push event: 17
- Create event: 2
Last Year
- Push event: 17
- Create event: 2