azllm

A Python package that provides an easier user interface for multiple LLM providers.

https://github.com/hanifsajid/azllm

Science Score: 67.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 2 DOI reference(s) in README
  • Academic publication links
    Links to: zenodo.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.5%) to scientific vocabulary
Last synced: 6 months ago

Repository

A Python package that provides an easier user interface for multiple LLM providers.

Basic Info
Statistics
  • Stars: 1
  • Watchers: 2
  • Forks: 0
  • Open Issues: 0
  • Releases: 2
Created 11 months ago · Last pushed 8 months ago
Metadata Files
  • Readme
  • Citation

README.md

azllm: A Unified LLM Interface for Multi-Provider Access


azllm is a Python package that provides a unified interface to work with multiple LLM providers including OpenAI, DeepSeek, Grok, Gemini, Meta's LLaMA, Anthropic, Ollama, and more.

NOTE: For advanced usage, see the azllm documentation and/or examples.

Features

  • One unified interface for all major LLM APIs
  • Batch and parallel prompt generation
  • Structured outputs (parsing) with Pydantic for models that support parsed outputs natively
  • Structured outputs (parsing) with Pydantic for DeepSeek and Anthropic
  • Per-model configurations and lazy initialization
  • Clean error handling

  • .env-based API key management
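
Structured outputs are driven by a Pydantic model describing the shape of the response. A minimal sketch of such a schema and how a parsed JSON response validates against it (the `CapitalAnswer` model and its fields are illustrative, not part of azllm's API):

```python
from pydantic import BaseModel


class CapitalAnswer(BaseModel):
    """Illustrative target schema for a structured LLM response."""
    city: str
    country: str


# A provider that supports parsed outputs returns JSON matching the schema;
# validating it yields a typed object instead of raw text.
raw = '{"city": "Paris", "country": "France"}'
answer = CapitalAnswer.model_validate_json(raw)
print(answer.city)  # Paris
```

Defining the schema once lets the same model class be reused across providers, whether parsing is native or handled by the library.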

Supported Clients

NOTE: If you would like to request support for additional LLMs, please open an issue on our GitHub page.

Currently supported providers include OpenAI, DeepSeek, Grok (xAI), Gemini, Meta's LLaMA, Anthropic, Fireworks, and Ollama.

Installation

You can install the azllm package via pip:

```bash
pip install azllm
```

Prerequisites

  • Python 3.11+
  • Create a .env file to store your API keys. For example:

```bash
OPENAI_API_KEY=your_openai_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key
XAI_API_KEY=your_xai_api_key
GEMINI_API_KEY=your_gemini_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
FIREWORKS_API_KEY=your_fireworks_api_key
```

  • Ollama must be installed and running locally to use Ollama models.
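
For reference, a `.env` file is just one `KEY=value` pair per line. azllm pulls these in via python-dotenv; the effect can be sketched with the standard library alone (the file contents and keys below are placeholders):

```python
import os
import tempfile

# Write a throwaway .env file with placeholder keys (illustrative values only).
env_text = "OPENAI_API_KEY=your_openai_api_key\nXAI_API_KEY=your_xai_api_key\n"
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write(env_text)
    path = f.name

# Minimal stand-in for dotenv.load_dotenv(): parse KEY=value lines into os.environ.
# (Unlike this sketch, load_dotenv does not override variables already set.)
with open(path) as f:
    for line in f:
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()

print(os.environ["OPENAI_API_KEY"])  # your_openai_api_key
```

In normal use you never parse the file yourself; keeping keys in `.env` simply keeps them out of source code and version control.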

Quick Start

Basic Initialization

```python
from azllm import azLLM

manager = azLLM()  # Instantiated with default parameters
```

Generate Text from a Single Prompt

```python
prompt = 'What is the capital of France?'
generated_text = manager.generate_text('openai', prompt)
print(generated_text)
```

Batch Generation

Generate responses for multiple prompts at once:

```python
batch_prompts = [
    'What is the capital of France?',
    'Tell me a joke.'
]

results = manager.batch_generate('openai', batch_prompts)
for result in results:
    print(result)
```

Parallel Generation

Run a single prompt across multiple models simultaneously:

```python
prompt = 'What is the capital of France?'
models = ['openai', 'grok', 'ollama']

results = manager.generate_parallel(prompt, models)
for model, result in results.items():
    print(f"Model: {model},\nResult: {result}\n")
```
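
This is not azllm's internal implementation, but the fan-out pattern behind parallel generation can be sketched with `concurrent.futures`, using stub clients in place of real provider calls:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub clients standing in for real providers (assumption: each provider
# exposes a blocking "generate" callable; these just echo for illustration).
clients = {
    "openai": lambda p: f"openai says: {p}",
    "grok":   lambda p: f"grok says: {p}",
    "ollama": lambda p: f"ollama says: {p}",
}


def generate_parallel(prompt, models):
    """Run one prompt against several models concurrently, keyed by model name."""
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        futures = {m: pool.submit(clients[m], prompt) for m in models}
        return {m: fut.result() for m, fut in futures.items()}


results = generate_parallel("What is the capital of France?", ["openai", "grok"])
for model, result in results.items():
    print(f"Model: {model}\nResult: {result}\n")
```

Because each provider call is network-bound, a thread pool gives near-linear speedup over issuing the same requests sequentially.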

License

MIT License

Citation

@misc{azLLM,
  title = {azllm},
  author = {Hanif Sajid and Benjamin Radford and Yaoyao Dai and Jason Windett},
  year = {2025},
  month = apr,
  version = {0.1.6},
  howpublished = {https://github.com/hanifsajid/azllm},
  note = {MIT License},
  abstract = {azllm is a Python package designed to interface with various large language models (LLMs) from different AI providers. It offers a unified interface for interacting with models from providers like OpenAI, DeepSeek, Grok, Gemini, Meta's Llama, Anthropic, Ollama, and others. The package allows for customizable configurations, batch generation, parallel generation, error handling, and the ability to parse structured responses from different models.}
}

Owner

  • Name: Hanif Sajid
  • Login: hanifsajid
  • Kind: user

Citation (CITATION.cff)

cff-version: 0.1.2
title: "azllm"
authors:
  - family-names: Sajid
    given-names: Hanif
  - family-names: Radford
    given-names: Benjamin
  - family-names: Dai
    given-names: Yaoyao
  - family-names: Windett
    given-names: Jason
version: "0.1.6"
date-released: 2025-04-25
repository-code: https://github.com/hanifsajid/azllm
license: MIT
abstract: >
  `azllm` is a Python package designed to interface with various large language models (LLMs) 
  from different AI providers. It offers a unified interface for interacting 
  with models from providers like OpenAI, DeepSeek, Grok, Gemini, Meta's Llama, Anthropic,
  Ollama, and others. The package allows for customizable configurations, batch generation,
  parallel generation, error handling, and the ability 
  to parse structured responses from different models.

GitHub Events

Total
  • Release event: 2
  • Watch event: 1
  • Delete event: 1
  • Push event: 21
  • Create event: 4
Last Year
  • Release event: 2
  • Watch event: 1
  • Delete event: 1
  • Push event: 21
  • Create event: 4

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 94 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 6
  • Total maintainers: 4
pypi.org: azllm

A Python package that provides an easier user interface for multiple LLM providers.

  • Versions: 6
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 94 Last month
Rankings
Dependent packages count: 9.2%
Average: 30.6%
Dependent repos count: 52.0%
Last synced: 7 months ago

Dependencies

.github/workflows/tests.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
poetry.lock pypi
  • alabaster 1.0.0
  • annotated-types 0.7.0
  • anyio 4.9.0
  • appnope 0.1.4
  • asttokens 3.0.0
  • babel 2.17.0
  • certifi 2025.1.31
  • cffi 1.17.1
  • charset-normalizer 3.4.1
  • colorama 0.4.6
  • comm 0.2.2
  • debugpy 1.8.14
  • decorator 5.2.1
  • distro 1.9.0
  • docutils 0.21.2
  • executing 2.2.0
  • ghp-import 2.1.0
  • h11 0.14.0
  • httpcore 1.0.8
  • httpx 0.28.1
  • idna 3.10
  • imagesize 1.4.1
  • iniconfig 2.1.0
  • ipykernel 6.29.5
  • ipython 8.35.0
  • jedi 0.19.2
  • jinja2 3.1.6
  • jiter 0.9.0
  • jupyter-client 8.6.3
  • jupyter-core 5.7.2
  • markdown-it-py 3.0.0
  • markupsafe 3.0.2
  • matplotlib-inline 0.1.7
  • mdit-py-plugins 0.4.2
  • mdurl 0.1.2
  • myst-parser 4.0.1
  • nest-asyncio 1.6.0
  • openai 1.76.0
  • packaging 25.0
  • parso 0.8.4
  • pexpect 4.9.0
  • platformdirs 4.3.7
  • pluggy 1.5.0
  • prompt-toolkit 3.0.51
  • psutil 7.0.0
  • ptyprocess 0.7.0
  • pure-eval 0.2.3
  • pycparser 2.22
  • pydantic 2.11.3
  • pydantic-core 2.33.1
  • pygments 2.19.1
  • pytest 8.3.5
  • python-dateutil 2.9.0.post0
  • python-dotenv 1.1.0
  • pywin32 310
  • pyyaml 6.0.2
  • pyzmq 26.4.0
  • requests 2.32.3
  • roman-numerals-py 3.1.0
  • six 1.17.0
  • sniffio 1.3.1
  • snowballstemmer 2.2.0
  • sphinx 8.2.3
  • sphinx-autodoc-typehints 3.1.0
  • sphinx-rtd-theme 3.0.2
  • sphinxcontrib-applehelp 2.0.0
  • sphinxcontrib-devhelp 2.0.0
  • sphinxcontrib-htmlhelp 2.1.0
  • sphinxcontrib-jquery 4.1
  • sphinxcontrib-jsmath 1.0.1
  • sphinxcontrib-qthelp 2.0.0
  • sphinxcontrib-serializinghtml 2.0.0
  • stack-data 0.6.3
  • tornado 6.4.2
  • tqdm 4.67.1
  • traitlets 5.14.3
  • typing-extensions 4.13.2
  • typing-inspection 0.4.0
  • urllib3 2.4.0
  • wcwidth 0.2.13
pyproject.toml pypi
  • ghp-import ^2.1.0 develop
  • ipykernel ^6.29.5 develop
  • myst-parser ^4.0.1 develop
  • sphinx ^8.2.3 develop
  • sphinx-autodoc-typehints ^3.1.0 develop
  • sphinx-rtd-theme ^3.0.2 develop
  • openai (>=1.76.0,<2.0.0)
  • python-dotenv (>=1.1.0,<2.0.0)
  • pyyaml (>=6.0.2,<7.0.0)
  • pytest ^8.3.5 test