
tinyagent

provenance:github:askbudi/tinyagent

Tiny Agent: Production-Ready LLM Agent SDK for Every Developer

README
# TinyAgent
🛠️ **Build Your Own AI Coding Assistant** - Break free from vendor lock-in and create powerful agents with *any* AI model you choose

[![AskDev.AI | Chat with TinyAgent](https://img.shields.io/badge/AskDev.AI-Chat_with_TinyAgent-blue?style=flat-square)](https://askdev.ai/github/askbudi/tinyagent)


![TinyAgent Logo](https://raw.githubusercontent.com/askbudi/tinyagent/main/public/logo.png)


Inspired by:
- [Tiny Agents blog post](https://huggingface.co/blog/tiny-agents)
- [12-factor-agents repository](https://github.com/humanlayer/12-factor-agents)
- Created by chatting with the source code of the JS Tiny Agent using [AskDev.ai](https://askdev.ai/search)

## Quick Links
- [Build your own Tiny Agent](https://askdev.ai/github/askbudi/tinyagent)


## Live Projects using TinyAgent (🔥)
- [AskDev.AI](https://askdev.ai) - Understand, chat, and summarize codebase of any project on GitHub.
- [HackBuddy AI](https://huggingface.co/spaces/ask-dev/HackBuddyAI) - A Hackathon Assistant Agent, built with TinyCodeAgent and Gradio. Matches individuals to teams based on their skills, interests, and organizer preferences.

- [TinyCodeAgent Demo](https://huggingface.co/spaces/ask-dev/TinyCodeAgent) - A playground for TinyCodeAgent, built with tinyagent, Gradio and Modal.com

**Building something with TinyAgent? Let us know and we'll add it here!**


## 🚀 The Vision: Your AI, Your Choice, Your Rules

Tired of being locked into specific AI providers? Want the power of advanced coding assistants without the constraints? TinyAgent gives you **complete freedom** to build intelligent agents that work with *any* AI model - from OpenAI and Anthropic to your own local Ollama models.

**This isn't just another AI wrapper.** It's your gateway to building the coding assistant of your dreams:

### 🎯 Why TinyAgent Changes Everything

- **🔓 Model Freedom**: Switch between GPT-5, Claude-4, Llama, or any of 100+ other models instantly
- **🏠 Local Privacy**: Run everything locally with Ollama - your code never leaves your machine  
- **🛡️ Production Security**: Enterprise-grade sandboxing across macOS, Linux, and Windows
- **⚡ Parallel Intelligence**: Multiple specialized AI agents working together on complex tasks
- **🔧 Complete Control**: Extend, customize, and hook into every aspect of agent behavior

**Three Revolutionary Components:**
- **TinyAgent**: Your universal AI interface - one API, infinite models
- **TinyCodeAgent**: Secure code execution with cross-platform sandboxing
- **Subagent Swarm**: Parallel specialized workers that collaborate intelligently
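
The subagent-swarm idea above can be sketched in plain Python with `asyncio`: each specialized worker runs its task in an isolated context, and results are gathered concurrently. The `run_subagent` helper below is hypothetical, a stand-in for an LLM-backed worker, not TinyAgent's actual API.

```python
import asyncio

async def run_subagent(name: str, task: str) -> str:
    # Hypothetical worker: each subagent works on its task
    # independently of its siblings (stand-in for an LLM call).
    await asyncio.sleep(0)
    return f"{name} finished: {task}"

async def swarm(tasks: dict[str, str]) -> dict[str, str]:
    # Run all specialized workers concurrently and collect results.
    results = await asyncio.gather(
        *(run_subagent(name, task) for name, task in tasks.items())
    )
    return dict(zip(tasks, results))

out = asyncio.run(swarm({
    "researcher": "compare flight prices",
    "coder": "draft the itinerary parser",
}))
print(out["researcher"])  # -> researcher finished: compare flight prices
```

Because `asyncio.gather` preserves argument order and dicts preserve insertion order, results pair back up with their worker names cleanly.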

### What's new for developers

- **Sandboxed File Tools**: `read_file`, `write_file`, `update_file`, `glob`, `grep` now route through provider sandboxes (Seatbelt/Modal) for secure file operations
- **Enhanced Shell Tool**: Improved `bash` tool with better safety validation, platform-specific tips, and provider-backed execution
- **TodoWrite Tool**: Built-in task management system for tracking progress and organizing complex workflows
- **Provider System**: Pluggable execution backends (Modal.com, Seatbelt sandbox) with unified API
- **Universal Tool Hooks**: Control any tool execution via `before_tool_execution`/`after_tool_execution` callbacks
- **Subagent Tools**: Revolutionary parallel task execution with specialized workers and context isolation
- **Enhanced Security**: Comprehensive validation, sandboxing, and permission controls
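
The hook names `before_tool_execution` and `after_tool_execution` come from the list above; the wiring below is only a generic sketch of that callback pattern, not TinyAgent's actual registration API. Hooks fire around every tool call, so one callback can validate, log, or transform any tool's inputs and outputs.

```python
from typing import Any, Callable

class ToolRunner:
    """Minimal sketch of before/after tool-execution hooks.
    The class itself is hypothetical; only the hook names mirror the README."""

    def __init__(self) -> None:
        self.before_tool_execution: list[Callable[..., None]] = []
        self.after_tool_execution: list[Callable[..., None]] = []

    def execute(self, tool: Callable[..., Any], **kwargs: Any) -> Any:
        for hook in self.before_tool_execution:
            hook(tool.__name__, kwargs)   # e.g. validate or log arguments
        result = tool(**kwargs)
        for hook in self.after_tool_execution:
            hook(tool.__name__, result)   # e.g. audit or redact results
        return result

runner = ToolRunner()
runner.before_tool_execution.append(lambda name, args: print(f"-> {name}({args})"))
runner.after_tool_execution.append(lambda name, res: print(f"<- {name} = {res}"))
print(runner.execute(lambda x, y: x + y, x=2, y=3))  # -> 5
```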

## Installation

### Using pip
```bash
# Basic installation
pip install tinyagent-py

# Install with all optional dependencies
pip install tinyagent-py[all]

# Install with Code Agent support
pip install tinyagent-py[code]

# Install with PostgreSQL support
pip install tinyagent-py[postgres]

# Install with SQLite support
pip install tinyagent-py[sqlite]

# Install with Gradio UI support
pip install tinyagent-py[gradio]
```

### Using uv
```bash
# Basic installation
uv pip install tinyagent-py

# Install with Code Agent support
uv pip install tinyagent-py[code]

# Install with PostgreSQL support
uv pip install tinyagent-py[postgres]

# Install with SQLite support
uv pip install tinyagent-py[sqlite]

# Install with Gradio UI support
uv pip install tinyagent-py[gradio]

# Install with all optional dependencies
uv pip install tinyagent-py[all]
```

## Developer Boilerplate & Quick Start

### OpenAI Responses API (optional)

TinyAgent supports OpenAI's Responses API alongside the default Chat Completions flow. To opt in without changing your code, set an environment variable:

```bash
export TINYAGENT_LLM_API=responses
```

Your existing TinyAgent code continues to work. Under the hood, TinyAgent translates your chat `messages`/`tools` into a Responses request and maps the Responses result back to the same structure it already uses (including `tool_calls` and usage accounting). To switch back, unset the variable or set `TINYAGENT_LLM_API=chat`.
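
To make the translation concrete, here is one plausible sketch of mapping Chat Completions-style messages to a Responses-style request body. The field names (`input`, `instructions`) follow OpenAI's Responses API, but this function is illustrative only; TinyAgent's real adapter also maps tools and usage accounting.

```python
def chat_to_responses(messages: list[dict]) -> dict:
    """Illustrative sketch: convert chat messages into a Responses-style
    request body. System messages become top-level `instructions`;
    the remaining turns become the `input` list."""
    system = [m["content"] for m in messages if m["role"] == "system"]
    turns = [
        {"role": m["role"], "content": m["content"]}
        for m in messages
        if m["role"] != "system"
    ]
    body: dict = {"input": turns}
    if system:
        body["instructions"] = "\n".join(system)
    return body

req = chat_to_responses([
    {"role": "system", "content": "Be terse."},
    {"role": "user", "content": "Hi"},
])
print(req)
```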

Example with explicit toggle:

```python
import os
import asyncio
from tinyagent import TinyAgent

async def main():
    # Option A: via environment variable
    os.environ["TINYAGENT_LLM_API"] = "responses"  # or "chat" (default)
    agent = await TinyAgent.create(
        model="gpt-5-mini",
        api_key=os.getenv("OPENAI_API_KEY"),
        # Option B: programmatic preference via model_kwargs
        model_kwargs={"llm_api": "responses"},  # or {"use_responses_api": True}
    )
    print(await agent.run("List three safe git commands for a repo"))

asyncio.run(main())
```

Notes:
- The adapter preserves TinyAgent hooks, storage schema, and tool-calling behavior.
- Streaming and semantic events can be added later without changing your code.
- Optional tracing: set `RESPONSES_TRACE_FILE=./responses_trace.jsonl` to capture raw request/response JSON for debugging. Set `DEBUG_RESPONSES=1` to print pairing details.
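
The `RESPONSES_TRACE_FILE` variable comes from the note above; the writer below is a generic sketch of how such JSONL tracing could work, appending one request/response record per line. It is not TinyAgent's internal implementation.

```python
import json
import os
import time

def trace_response(request: dict, response: dict) -> None:
    """Append a raw request/response pair to the JSONL file named by
    RESPONSES_TRACE_FILE, if that variable is set (sketch only)."""
    path = os.environ.get("RESPONSES_TRACE_FILE")
    if not path:
        return  # tracing disabled
    record = {"ts": time.time(), "request": request, "response": response}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

os.environ["RESPONSES_TRACE_FILE"] = "./responses_trace.jsonl"
trace_response({"model": "gpt-5-mini"}, {"output": "ok"})
```

One record per line keeps the file greppable and lets you tail it while an agent run is in flight.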

Examples you can run:
- `examples/openai_sdk_responses_multiturn.py` — baseline SDK multi-turn chaining
- `examples/openai_sdk_responses_extended_tools.py` — SDK multi-turn with function calls
- `examples/litellm_responses_extended_tools.py` — LiteLLM multi-turn with function calls
- `examples/litellm_responses_three_tools.py` — LiteLLM three-tool demo
- `examples/tinyagent_responses_three_tools.py` — TinyAgent three-tool demo (Responses)
- `examples/seatbelt_verbose_tools.py` — TinyCodeAgent + seatbelt, verbose hook stream
- `examples/seatbelt_responses_three_tools.py` — TinyCodeAgent + seatbelt three-tool demo

### 🚀 TinyAgent with New Tools

```python
import asyncio
import os
from tinyagent import TinyAgent
from tinyagent.tools.subagent import create_general_subagent

async def create_enhanced_tinyagent():
    """Create a TinyAgent with all new tools and capabilities."""
    
    # Initialize TinyAgent (TodoWrite is enabled by default)
    agent = TinyAgent(
        model="gpt-5-mini",
        api_key=os.getenv("OPENAI_API_KEY"),
        enable_todo_write=True  # Enable TodoWrite tool (True by default)
    )
    
    # Add a general-purpose subagent for parallel tasks
    helper_subagent = create_general_subagent(
        name="helper",
        model="gpt-5-mini",
        max_turns=20,
        enable_python=True,
        enable_shell=True
    )
    agent.add_tool(helper_subagent)
    
    # Check available tools
    available_tools = list(agent.custom_tool_handlers.keys())
    print(f"Available tools: {available_tools}")  # ['TodoWrite', 'helper']
    
    # Connect to MCP servers for extended functionality
    await agent.connect_to_server("npx", ["@openbnb/mcp-server-airbnb", "--ignore-robots-txt"])
    
    return agent

async def main():
    agent = await create_enhanced_tinyagent()
    
    try:
        # Example: Complex task with subagent delegation
        result = await agent.run("""
            I need help with a travel planning project:
            1. Create a todo list for 
```

[truncated…]

PUBLIC HISTORY

First discovered: Mar 22, 2026

IDENTITY

inferred

Identity inferred from code signals. No PROVENANCE.yml found.


METADATA

platform: github
first seen: May 8, 2025
last updated: Mar 13, 2026
last crawled: 1 day ago
version

README BADGE

Add to your README:

![Provenance](https://getprovenance.dev/api/badge?id=provenance:github:askbudi/tinyagent)