Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.5%) to scientific vocabulary
Last synced: 8 months ago

Repository

Basic Info
  • Host: GitHub
  • Owner: MaheyDS
  • License: apache-2.0
  • Language: Jupyter Notebook
  • Default Branch: main
  • Size: 782 KB
Statistics
  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created 10 months ago · Last pushed 10 months ago
Metadata Files
Readme License Citation

README.md

AI Engineering Bootcamp - Multi-Provider Chatbot

A modern Streamlit-based chatbot application that supports multiple LLM providers, including OpenAI, Groq, and Google Gemini. The project demonstrates how to build a unified interface for different AI models with configurable parameters. It uses Amazon Electronics product data and customer reviews; you can find and download the Amazon Electronics Category dataset overview by clicking here.

🚀 Features

  • Multi-Provider Support: Switch between OpenAI, Groq, and Google Gemini models
  • Real-time Chat Interface: Built with Streamlit for an intuitive user experience
  • Configurable Parameters: Adjust temperature and max tokens for response control
  • Docker Support: Easy deployment with containerization
  • Environment-based Configuration: Secure API key management

📋 Prerequisites

  • Python 3.12 or higher
  • Docker (optional, for containerized deployment)
  • API keys for your chosen providers:
    • OpenAI API key
    • Groq API key
    • Google Gemini API key

🛠️ Installation & Setup

1. Clone the Repository

```bash
git clone <repository-url>
cd 01-ai-engg-bootcamp
```

2. Install Dependencies

The project uses uv for dependency management. Install dependencies with:

```bash
uv sync
```

Alternative (if not using uv):

```bash
pip install -r requirements.txt
```

3. Environment Configuration

Create a .env file in the project root with your API keys:

```env
OPENAI_API_KEY=your_openai_api_key_here
GROQ_API_KEY=your_groq_api_key_here
GOOGLE_API_KEY=your_google_api_key_here
```
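The project structure lists a `core/config.py` that manages this configuration (the repository depends on pydantic-settings for it). As a rough illustration of what loading a `.env` file amounts to, here is a stdlib-only sketch; `load_env` is a hypothetical helper and not part of the project:

```python
import os
from pathlib import Path


def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=value lines from a .env file and export them.

    Existing environment variables are not overwritten, mirroring the
    usual python-dotenv / pydantic-settings behaviour.
    """
    values: dict[str, str] = {}
    env_file = Path(path)
    if env_file.exists():
        for line in env_file.read_text().splitlines():
            line = line.strip()
            # Skip blanks, comments, and malformed lines.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
            os.environ.setdefault(key.strip(), value.strip())
    return values
```

In the actual app, pydantic-settings handles this automatically, with validation on top.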

🏃‍♂️ Running Locally

Option 1: Using Make (Recommended)

```bash
make run-streamlit
```

Option 2: Direct Streamlit Command

```bash
streamlit run src/chatbot-ui/streamlit_app.py
```

The application will be available at:

  • Local URL: http://localhost:8501
  • Network URL: http://your-ip:8501

🐳 Running with Docker

1. Build the Docker Image

```bash
make build-docker-streamlit
```

2. Run the Container

```bash
make run-docker-streamlit
```

Note: If port 8501 is already in use, you can modify the port mapping in the Makefile or run directly:

```bash
docker run -v "$(PWD)/.env:/app/.env" -p 8502:8501 streamlit-app:latest
```

Then access the app at http://localhost:8502

🎛️ Usage

  1. Select Provider: Choose OpenAI, Groq, or Google from the sidebar
  2. Choose Model: Select your preferred model for the chosen provider
  3. Adjust Parameters:
    • Temperature: Controls randomness (0.0 = deterministic, 1.0 = creative)
    • Max Tokens: Limits response length
  4. Start Chatting: Type your message and press Enter
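The sidebar controls above map onto per-request parameters sent to whichever provider is selected. A minimal sketch of that mapping (the function and dictionary names are hypothetical, not taken from the project's source):

```python
# Models offered per provider, per the Configuration section below.
SUPPORTED_MODELS = {
    "OpenAI": ["gpt-4o-mini", "gpt-4o"],
    "Groq": ["llama-3.3-70b-versatile"],
    "Google": ["gemini-2.0-flash"],
}


def build_request(provider: str, model: str,
                  temperature: float, max_tokens: int) -> dict:
    """Validate the sidebar selection and clamp slider values into range."""
    if model not in SUPPORTED_MODELS.get(provider, []):
        raise ValueError(f"{model!r} is not available for provider {provider!r}")
    return {
        "model": model,
        "temperature": min(max(temperature, 0.0), 1.0),  # 0.0 = deterministic
        "max_tokens": max(1, int(max_tokens)),           # response length cap
    }
```

The returned dictionary can then be unpacked into the chosen provider's chat-completion call.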

📁 Project Structure

```
01-ai-engg-bootcamp/
├── src/
│   └── chatbot_ui/
│       ├── core/
│       │   └── config.py        # Configuration management
│       └── streamlit_app.py     # Main Streamlit application
├── notebooks/                   # Jupyter notebooks for exploration
├── Dockerfile                   # Docker configuration
├── Makefile                     # Build and run commands
├── pyproject.toml               # Project dependencies
└── .env                         # Environment variables (create this)
```

🔧 Configuration

Supported Models

  • OpenAI: gpt-4o-mini, gpt-4o
  • Groq: llama-3.3-70b-versatile
  • Google: gemini-2.0-flash

Environment Variables

| Variable | Description | Required |
|----------|-------------|----------|
| OPENAI_API_KEY | OpenAI API key | For OpenAI models |
| GROQ_API_KEY | Groq API key | For Groq models |
| GOOGLE_API_KEY | Google Gemini API key | For Google models |

🚨 Troubleshooting

Common Issues

  1. Port Already in Use

```bash
# Check what's using port 8501
lsof -i :8501
# Kill the process or use a different port
```

  2. API Key Errors

    • Ensure your .env file exists and contains valid API keys
    • Verify API keys have sufficient credits/permissions
  3. Docker Mount Issues

    • Ensure the .env file exists in the project root
    • Check Docker Desktop file sharing settings on macOS
  4. Google API Errors

    • Verify you're using the correct Google Generative AI library version
    • Check API quotas and billing status

Performance Optimization

For better performance, install the Watchdog module:

```bash
pip install watchdog
```

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test thoroughly
  5. Submit a pull request

📄 License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

🙏 Acknowledgments

  • Streamlit for the amazing web framework
  • OpenAI, Groq, and Google for their LLM APIs
  • The AI engineering community for inspiration and support
  • Bridging Language and Items for Retrieval and Recommendation

Citations

@article{hou2024bridging,
  title={Bridging Language and Items for Retrieval and Recommendation},
  author={Hou, Yupeng and Li, Jiacheng and He, Zhankui and Yan, An and Chen, Xiusi and McAuley, Julian},
  journal={arXiv preprint arXiv:2403.03952},
  year={2024}
}

Owner

  • Login: MaheyDS
  • Kind: user

Citation (CITATION.cff)

@article{hou2024bridging,
  title={Bridging Language and Items for Retrieval and Recommendation},
  author={Hou, Yupeng and Li, Jiacheng and He, Zhankui and Yan, An and Chen, Xiusi and McAuley, Julian},
  journal={arXiv preprint arXiv:2403.03952},
  year={2024}
}

GitHub Events

Total
  • Push event: 1
  • Pull request event: 1
Last Year
  • Push event: 1
  • Pull request event: 1

Dependencies

Dockerfile docker
  • ghcr.io/astral-sh/uv python3.12-bookworm-slim build
pyproject.toml pypi
  • google-genai *
  • groq *
  • openai *
  • pydantic *
  • pydantic-settings >=2.0.0
  • streamlit *
uv.lock pypi
  • 01-ai-engg-bootcamp 0.1.0
  • altair 5.5.0
  • annotated-types 0.7.0
  • anyio 4.9.0
  • attrs 25.3.0
  • blinker 1.9.0
  • cachetools 5.5.2
  • certifi 2025.6.15
  • charset-normalizer 3.4.2
  • click 8.2.1
  • colorama 0.4.6
  • distro 1.9.0
  • gitdb 4.0.12
  • gitpython 3.1.44
  • google-auth 2.40.3
  • google-genai 1.23.0
  • groq 0.29.0
  • h11 0.16.0
  • httpcore 1.0.9
  • httpx 0.28.1
  • idna 3.10
  • jinja2 3.1.6
  • jiter 0.10.0
  • jsonschema 4.24.0
  • jsonschema-specifications 2025.4.1
  • markupsafe 3.0.2
  • narwhals 1.44.0
  • numpy 2.3.1
  • openai 1.93.0
  • packaging 25.0
  • pandas 2.3.0
  • pillow 11.2.1
  • protobuf 6.31.1
  • pyarrow 20.0.0
  • pyasn1 0.6.1
  • pyasn1-modules 0.4.2
  • pydantic 2.11.7
  • pydantic-core 2.33.2
  • pydantic-settings 2.10.1
  • pydeck 0.9.1
  • python-dateutil 2.9.0.post0
  • python-dotenv 1.1.1
  • pytz 2025.2
  • referencing 0.36.2
  • requests 2.32.4
  • rpds-py 0.25.1
  • rsa 4.9.1
  • six 1.17.0
  • smmap 5.0.2
  • sniffio 1.3.1
  • streamlit 1.46.1
  • tenacity 8.5.0
  • toml 0.10.2
  • tornado 6.5.1
  • tqdm 4.67.1
  • typing-extensions 4.14.0
  • typing-inspection 0.4.1
  • tzdata 2025.2
  • urllib3 2.5.0
  • watchdog 6.0.0
  • websockets 15.0.1