github · inferred · active

local-llm-server

provenance:github:strikersam/local-llm-server

Self-hosted local LLM server & AI control plane — OpenAI-compatible proxy for Ollama, multi-agent orchestration, unified dashboard, and zero API bills.
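Because the proxy is OpenAI-compatible, any OpenAI-style client can target it by swapping the base URL for a local one. A minimal sketch of building such a request, assuming the server exposes the usual `/v1/chat/completions` path; the host, port, and model name are illustrative, not taken from this repository:

```python
import json
import urllib.request

# Hypothetical local endpoint; the real host/port depend on your deployment.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("llama3", "Hello!")

# Sending it (requires the local server to be running):
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the wire format matches OpenAI's, existing SDKs work unchanged once pointed at the local base URL, which is what makes the "zero API bills" claim possible.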

PUBLIC HISTORY

First discovered: May 1, 2026

IDENTITY

inferred

Identity inferred from code signals. No PROVENANCE.yml found.


METADATA

platform: github
first seen: Mar 28, 2026
last updated: Apr 30, 2026
last crawled: today
version:

README BADGE

Add to your README:

![Provenance](https://getprovenance.dev/api/badge?id=provenance:github:strikersam/local-llm-server)