liteswarm

Lightweight Agent Framework for building AI apps with any LLM

https://github.com/glyphyai/liteswarm

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.9%) to scientific vocabulary

Keywords

agent-ops agent-systems ai-agents llm llm-ops llm-orchestration multi-agent multi-agent-systems
Last synced: 6 months ago

Repository

Lightweight Agent Framework for building AI apps with any LLM

Basic Info
Statistics
  • Stars: 5
  • Watchers: 1
  • Forks: 2
  • Open Issues: 0
  • Releases: 8
Topics
agent-ops agent-systems ai-agents llm llm-ops llm-orchestration multi-agent multi-agent-systems
Created over 1 year ago · Last pushed about 1 year ago
Metadata Files
Readme Changelog Contributing License Citation

README.md

LiteSwarm 🐝

A lightweight, LLM-agnostic framework for building AI agents with dynamic agent switching. Supports 100+ language models through litellm.

[!WARNING] LiteSwarm is currently in early preview and the API is likely to change as we gather feedback.

If you find any issues or have suggestions, please open an issue in the Issues section.

Features

  • Lightweight Core: Minimal base implementation that's easy to understand and extend
  • LLM Agnostic: Support for OpenAI, Anthropic, Google, and many more through litellm
  • Dynamic Agent Switching: Switch between specialized agents during execution
  • Type-Safe Context: Full type safety for context parameters and outputs
  • Stateful Chat Interface: Build chat applications with built-in state management
  • Event Streaming: Real-time streaming of agent responses and tool calls

Installation

```bash
pip install liteswarm
```
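
If the install succeeded, the core entry points used throughout this README should be importable. A minimal check, assuming installation into the currently active environment:

```python
# Minimal post-install check: these are the same entry points
# the Quick Start examples below import.
from liteswarm import LLM, Agent, Message, Swarm

print("liteswarm imported:", all(x is not None for x in (LLM, Agent, Message, Swarm)))
```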

Requirements

  • Python: Version 3.11 or higher
  • Async Runtime: LiteSwarm provides an async-only API, so you need to run it inside an event loop (e.g. with asyncio)
  • LLM Provider Key: You'll need an API key from a supported LLM provider (see supported providers)
    The snippet below shows common ways to set API keys:

```python
# Environment variable (shell):
#   export OPENAI_API_KEY=sk-...
import os

os.environ["OPENAI_API_KEY"] = "sk-..."

# .env file:
#   OPENAI_API_KEY=sk-...

# Direct in code:
LLM(model="gpt-4o", key="sk-...")
```

Quick Start

[!NOTE] All examples below are complete and can be run as is.

Hello World

Here's a minimal example showing how to use LiteSwarm's core functionality:

```python
import asyncio

from liteswarm import LLM, Agent, Message, Swarm


async def main() -> None:
    # Create a simple agent
    agent = Agent(
        id="assistant",
        instructions="You are a helpful assistant.",
        llm=LLM(model="gpt-4o"),
    )

    # Create swarm and run
    swarm = Swarm()
    result = await swarm.run(
        agent=agent,
        messages=[Message(role="user", content="Hello!")],
    )
    print(result.final_response.content)


if __name__ == "__main__":
    asyncio.run(main())
```

Streaming with Agent Switching

This example demonstrates real-time streaming and dynamic agent switching:

```python
import asyncio

from liteswarm import LLM, Agent, Message, Swarm, ToolResult, tool_plain


async def main() -> None:
    # Define a tool that can switch to another agent
    @tool_plain
    def switch_to_expert(domain: str) -> ToolResult:
        expert_agent = Agent(
            id=f"{domain}-expert",
            instructions=f"You are a {domain} expert.",
            llm=LLM(
                model="gpt-4o",
                temperature=0.0,
            ),
        )
        return ToolResult.switch_to(expert_agent)

    # Create a router agent that can switch to experts
    router = Agent(
        id="router",
        instructions="Route questions to appropriate experts.",
        llm=LLM(model="gpt-4o"),
        tools=[switch_to_expert],
    )

    # Stream responses in real-time
    swarm = Swarm()
    stream = swarm.stream(
        agent=router,
        messages=[Message(role="user", content="Explain quantum physics like I'm 5")],
    )

    async for event in stream:
        if event.type == "agent_response_chunk":
            completion = event.chunk.completion
            if completion.delta.content:
                print(completion.delta.content, end="", flush=True)
            if completion.finish_reason == "stop":
                print()

    # Optionally, get complete run result from stream
    result = await stream.get_return_value()
    print(result.final_response.content)


if __name__ == "__main__":
    asyncio.run(main())
```

Stateful Chat

Here's how to build a stateful chat application that maintains conversation history:

```python
import asyncio

from liteswarm import LLM, Agent, SwarmChat, SwarmEvent


def handle_event(event: SwarmEvent) -> None:
    if event.type == "agent_response_chunk":
        completion = event.chunk.completion
        if completion.delta.content:
            print(completion.delta.content, end="", flush=True)
        if completion.finish_reason == "stop":
            print()


async def main() -> None:
    # Create an agent
    agent = Agent(
        id="assistant",
        instructions="You are a helpful assistant. Provide short answers.",
        llm=LLM(model="gpt-4o"),
    )

    # Create stateful chat
    chat = SwarmChat()

    # First message
    print("First message:")
    async for event in chat.send_message("Tell me about Python", agent=agent):
        handle_event(event)

    # Second message - chat remembers the context
    print("\nSecond message:")
    async for event in chat.send_message("What are its key features?", agent=agent):
        handle_event(event)

    # Access conversation history
    messages = await chat.get_messages()
    print(f"\nMessages in history: {len(messages)}")


if __name__ == "__main__":
    asyncio.run(main())
```

For more examples, check out the examples directory. To learn more about advanced features and API details, see our documentation.

Documentation

Full documentation is available at https://liteswarm.readthedocs.io/.

Citation

If you use LiteSwarm in your research, please cite our work:

```bibtex
@software{Mozharovskii_LiteSwarm_2025,
  author = {Mozharovskii, Evgenii and {GlyphyAI}},
  license = {MIT},
  month = jan,
  title = {{LiteSwarm}},
  url = {https://github.com/glyphyai/liteswarm},
  version = {0.6.0},
  year = {2025}
}
```

License

MIT License - see LICENSE file for details.

Owner

  • Name: GlyphyAI
  • Login: GlyphyAI
  • Kind: organization
  • Email: hello@glyphy.ai
  • Location: San Francisco, CA

Shaping the future of coding with AI agents.

Citation (CITATION.cff)

cff-version: 1.2.0
title: LiteSwarm
message: "If you use LiteSwarm in your research or project, please cite it as below."
type: software
authors:
  - family-names: "Mozharovskii"
    given-names: "Evgenii"
  - name: "GlyphyAI"
repository-code: "https://github.com/glyphyai/liteswarm"
url: "https://github.com/glyphyai/liteswarm"
abstract: |
  An extensible framework for building AI agent systems with a focus on structured
  interactions, LLM-agnostic design, and composable architecture. Features include
  multi-agent orchestration, structured outputs, and provider-agnostic response parsing.
keywords:
  - agents
  - multi-agent
  - ai-agents
  - agent-framework
  - agent-orchestration
  - agent-ops
license: MIT
version: 0.6.0
date-released: 2025-01-09

GitHub Events

Total
  • Release event: 19
  • Watch event: 5
  • Delete event: 35
  • Issue comment event: 27
  • Push event: 169
  • Pull request event: 50
  • Fork event: 2
  • Create event: 43
Last Year
  • Release event: 19
  • Watch event: 5
  • Delete event: 35
  • Issue comment event: 27
  • Push event: 169
  • Pull request event: 50
  • Fork event: 2
  • Create event: 43

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 0
  • Total pull requests: 2
  • Average time to close issues: N/A
  • Average time to close pull requests: 2 days
  • Total issue authors: 0
  • Total pull request authors: 1
  • Average comments per issue: 0
  • Average comments per pull request: 1.0
  • Merged pull requests: 2
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 2
  • Average time to close issues: N/A
  • Average time to close pull requests: 2 days
  • Issue authors: 0
  • Pull request authors: 1
  • Average comments per issue: 0
  • Average comments per pull request: 1.0
  • Merged pull requests: 2
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • mozharovsky (2)
Pull Request Authors
  • mozharovsky (21)
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 73 last month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 8
  • Total maintainers: 1
pypi.org: liteswarm

A lightweight framework for building AI agent systems

  • Documentation: https://liteswarm.readthedocs.io/
  • License: MIT License Copyright (c) 2025 GlyphyAI Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
  • Latest release: 0.6.0
    published about 1 year ago
  • Versions: 8
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 73 Last month
Rankings
  • Dependent packages count: 9.9%
  • Average: 32.8%
  • Dependent repos count: 55.8%
Maintainers (1)
Last synced: 6 months ago

Dependencies

pyproject.toml pypi
  • litellm >=1.52.2
  • orjson >=3.10.11
  • pydantic >=2.9.2
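
For reference, a small script like the sketch below can check whether an installed environment meets these minimum versions. It assumes the `packaging` library is available (it is not one of liteswarm's declared dependencies); the package names and pins are taken from the list above.

```python
from importlib.metadata import version

from packaging.version import Version  # assumed available; not a declared liteswarm dependency

# Minimum runtime versions as declared in pyproject.toml above.
MINIMUMS = {"litellm": "1.52.2", "orjson": "3.10.11", "pydantic": "2.9.2"}

for name, minimum in MINIMUMS.items():
    installed = version(name)
    status = "OK" if Version(installed) >= Version(minimum) else "too old"
    print(f"{name}: installed {installed}, requires >={minimum} ({status})")
```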