# LocalAgentCLI

LocalAgentCLI is a local-first AI command-line assistant with a persistent shell, chat and agent modes, remote and local model backends, tool execution, session restore, and a centralized safety layer.

## Features

- Interactive `localagentcli` shell with slash commands, history, tab completion, and streaming output
- Remote provider support for OpenAI-compatible, Anthropic, and generic REST APIs
- Local model support for MLX, GGUF, and safetensors backends
- Chat mode with context compaction, pinned instructions, and automatic repository-root `AGENTS.md` loading
- Agent mode with planning, tool execution, approvals, rollback, and undo support
- Persistent config, model registry, sessions, logs, and cache under `~/.localagent/`
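The per-user state listed above can be pictured as a directory sketch. Only `config.toml` is confirmed by this README; the subdirectory names below are assumptions and the actual layout may differ:

```text
~/.localagent/
├── config.toml   # created by the setup wizard on first launch
├── models/       # model registry (name is an assumption)
├── sessions/     # saved /session snapshots (name is an assumption)
├── logs/
└── cache/
```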

## Installation

Recommended:

```bash
pipx install localagentcli
```

Development install:

```bash
git clone https://github.com/rainzhang05/LocalAgentCLI.git
cd LocalAgentCLI
pip install -e ".[dev]"
```

Optional backend extras can be installed manually:

```bash
pip install "localagentcli[mlx]"
pip install "localagentcli[gguf]"
pip install "localagentcli[torch]"
pip install "localagentcli[all]"
```

LocalAgentCLI also prompts to install missing backend dependencies automatically the first time you load a local model that needs them.
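If you prefer to check for a backend yourself before relying on the auto-install prompt, a small shell probe works. This is a sketch, not part of LocalAgentCLI; `llama_cpp` is an assumed module name for the GGUF backend — substitute the module your backend actually imports:

```shell
# Probe for an optional backend module before installing the extra.
# "llama_cpp" is an assumption; adjust to the backend you need.
if python3 -c "import importlib.util, sys; sys.exit(0 if importlib.util.find_spec('llama_cpp') else 1)"; then
  echo "gguf backend available"
else
  echo "gguf backend missing - run: pip install 'localagentcli[gguf]'"
fi
```

The same pattern applies to the `mlx` and `torch` extras with their respective module names.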

## Quick Start

Launch the shell:

```bash
localagentcli
```

`localagent` remains available as a compatibility alias.

On first launch the setup wizard creates `~/.localagent/config.toml`.
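For orientation only, a config of roughly this shape might result. The key names below are illustrative assumptions, not the tool's documented schema — run `/setup` or open the file to see the real keys:

```toml
# Hypothetical sketch of ~/.localagent/config.toml.
# Key names are assumptions for illustration, not the actual schema.
[general]
mode = "chat"              # "chat" or "agent"

[model]
active = "my-local-model"  # set interactively via /set
```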

Common commands:

```text
/help
/status
/setup
/mode chat
/mode agent
/set
/models
/providers list
/session save my-work
/session load my-work
```

## Local and Remote Models

Remote providers:

- `/providers add`
- `/providers list`
- `/providers test <name>`

Local models:

- `/set` to choose the active local model or provider model interactively
- `/models` for the interactive Hugging Face picker (backend → family → live Hub-discovered model) with continuously refreshed download progress
- `/models install hf <repo>`
- `/models install url <url>`
- `/models list`
- `/models inspect <name[@version]>`

## Development

Run the required checks locally:

```bash
python -m pytest --cov=localagentcli --cov-fail-under=80
ruff check .
ruff format --check .
mypy localagentcli/
python -m build
python -m twine check dist/*
```
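A small wrapper can run the checks above in sequence and stop at the first failure. This is a sketch, not a script shipped with the repo; the skip branch is an assumption for machines where a tool is not installed yet:

```shell
# Run a check if its tool is installed; abort on the first failure.
run_check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "running: $*"
    "$@" || { echo "FAILED: $*" >&2; exit 1; }
  else
    echo "skipped (not installed): $1"
  fi
}

# Real usage would chain the project checks, e.g.:
#   run_check ruff check .
#   run_check mypy localagentcli/
# Demonstration with a command that is always available:
run_check echo "all checks wired up"
```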

## Documentation

Project documentation lives in the repository:

- [Architecture](https://github.com/rainzhang05/LocalAgentCLI/blob/main/docs/architecture.md)
- [Current State](https://github.com/rainzhang05/LocalAgentCLI/blob/main/docs/current-state.md)
- [Roadmap](https://github.com/rainzhang05/LocalAgentCLI/blob/main/docs/roadmap.md)
- [Packaging and Release](https://github.com/rainzhang05/LocalAgentCLI/blob/main/docs/packaging-and-release.md)
