AI in your browser that supports MCP Servers, no network call, no costs, 100% private
Updated Feb 16, 2026 - JavaScript
Free, private AI in your browser
Scaffold for browser-local LLM chat apps. Runs Gemma 4 entirely in the browser via MediaPipe + WebGPU — no server, no cloud, no tokens. Mobile-first PWA, Atomic Design, i18n (DE/EN), multi-conversation. Fork it. Own it. Ship it.
Run small LLMs directly in your web browser; no cloud computing needed.
Starling - Lightning Fast Autofill Extension
On-device RAG app (WebLLM + Transformers.js + pdf.js). Drop a PDF, ask questions, get cited answers. Runs fully in the browser; data stays local.
Privacy-first document editor with local in-browser LLM inference via HuggingFace Transformers.js, WebGPU, and Tiptap.