Describe a UI in a text prompt. Get a native macOS app. Real AppKit buttons, text fields, split views, outline views. Not a webview. Not Electron.
```sh
build/canopy --prompt "Build a calculator with dark theme and orange operators"
```

That one line produces a native macOS window with working buttons, a display, and arithmetic. The LLM figures out the layout, wires up the data model, and Canopy renders it as real Cocoa widgets.
Requires macOS 13+ and Go 1.25+.
```sh
git clone https://github.com/artpar/canopy.git
cd canopy
make build
```

Bundle any Canopy app into a standalone .app:
```sh
canopy bundle myapp/                          # -> MyApp.app
canopy bundle -o Notes.app sample_apps/notes  # custom output path
canopy bundle --sign --identity "-" myapp/    # ad-hoc sign (no Apple account)
canopy bundle --sign --notarize myapp/        # sign + notarize for distribution
```

The bundled .app is self-contained — double-click to launch, no CLI needed. Add `--sign` for codesigning with the hardened runtime, and `--notarize` to submit to Apple so Gatekeeper accepts the app without warnings.
From a file (no LLM needed):

```sh
build/canopy testdata/contact_form.jsonl
```

From a prompt (needs `ANTHROPIC_API_KEY` or another provider):

```sh
build/canopy --prompt "Build a todo list with add/remove and a count of remaining items"
```

From Claude Code (spawns a `claude` subprocess that builds the UI with MCP tools):

```sh
build/canopy --claude-code "Build an Apple Notes clone with three-pane layout"
```

The LLM output is cached, so the second run is instant. `--regenerate` forces a fresh call.
Every component maps to a real AppKit class. No HTML, no CSS.
- **Layout:** Row, Column, Card, SplitView, Tabs, List, Modal
- **Input:** TextField, CheckBox, Slider, ChoicePicker, DateTimeInput, SearchField, Button
- **Display:** Text, Icon, Image, Divider, ProgressBar
- **Rich content:** RichTextEditor, OutlineView, Video, AudioPlayer
- **Media capture:** CameraView, AudioRecorder
The data model is a JSON document. Components bind to paths in it with JSON Pointers. Type in a TextField bound to /name, and every Text displaying /name updates immediately.
```jsonl
{"componentId":"input","type":"TextField","props":{"dataBinding":"/name","placeholder":"Your name"}},
{"componentId":"greeting","type":"Text","props":{"content":{"path":"/name"}}}
```

No state management library. The engine handles propagation.
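The model can also be seeded directly. A minimal sketch using `updateDataModel` — the same message type background processes and the LLM emit — with the `ops` shape shown in the process example later in this README:

```jsonl
{"type":"updateDataModel","surfaceId":"main","ops":[{"op":"replace","path":"/name","value":"Ada"}]}
```

After this message, every component bound to `/name` re-renders with the new value.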
Point Canopy at any LLM and describe what you want. The LLM gets 11 A2UI tools (createSurface, updateComponents, updateDataModel, etc.) and builds the UI through tool calls. User interactions flow back as conversation turns.
```sh
build/canopy --llm openai --model gpt-4o --prompt "Build a settings panel"
build/canopy --llm ollama --model llama3 --prompt-file spec.txt --mode raw
build/canopy --llm gemini --model gemini-2.0-flash --prompt "Build a dashboard"
```

Providers: Anthropic, OpenAI, Gemini, Ollama, DeepSeek, Groq, Mistral.
Edit a .jsonl file, save, and the window rebuilds. No restart.
```sh
build/canopy --watch testdata/contact_form.jsonl
```

Polls every 500ms. Tears down existing surfaces and re-reads from scratch.
Define a component once, use it many times with different parameters. State is scoped per instance.
```jsonl
{"type":"defineComponent","name":"DigitButton","params":["digit"],"components":[
  {"componentId":"_root","type":"Button","props":{"label":{"param":"digit"}}}
]}
{"componentId":"btn7","useComponent":"DigitButton","args":{"digit":"7"}}
```

`defineFunction` does the same for expressions, and `include` splits apps across files. See `testdata/calculator_v2/` for all three working together.
Defined components persist in ~/.canopy/library/ and show up in LLM prompts automatically.
Spawn background goroutines with their own transports. Timers, background LLM conversations, async file loading. Pub/sub channels connect them.
```jsonl
{"type":"createProcess","processId":"ticker","transport":{
  "type":"interval","interval":1000,
  "message":{"type":"updateDataModel","surfaceId":"main","ops":[
    {"op":"replace","path":"/counter","value":{"functionCall":{"name":"add","args":[{"path":"/counter"},1]}}}
  ]}
}}
```

Channels support broadcast (every subscriber gets every message) and queue (round-robin) modes. Published values land in the data model, so `dataBinding` just works.
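A minimal channel sketch. The message field names below (`createChannel`, `channelId`, `mode`, `publish`) are assumptions inferred from the MCP tool names `create_channel` and `publish` listed later in this README, not verified message shapes:

```jsonl
{"type":"createChannel","channelId":"ticks","mode":"broadcast"}
{"type":"publish","channelId":"ticks","value":{"path":"/counter"}}
```

Check `sample_apps/channel_demo` for the working message schema.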
Canopy apps can access native macOS APIs through evaluator functions — clipboard, notifications, file dialogs, alerts, HTTP, shell commands, camera, microphone, and screen capture. No webview shims.
| Function | What it does |
|---|---|
| `notify(title, body)` | macOS notification via UNUserNotificationCenter |
| `clipboardRead()` / `clipboardWrite(text)` | System clipboard (NSPasteboard) |
| `fileOpen()` / `fileSave()` | Native file dialogs (NSOpenPanel / NSSavePanel) |
| `alert(title, msg)` | Modal alert sheet (NSAlert) |
| `openURL(url)` | Open in default app (NSWorkspace) |
| `httpGet(url)` / `httpPost(url, body)` | HTTP requests with 30s timeout |
| `shell(command)` | Run any shell command |
| `cameraCapture(position?)` | Take a photo, returns JPEG path (AVFoundation) |
| `audioRecordStart(format?)` / `audioRecordStop(id)` | Record audio from microphone |
| `screenCapture(type?)` | Screenshot via ScreenCaptureKit, returns PNG path |
File dialogs and alerts are non-blocking — they use beginSheetModalForWindow:completionHandler: so the main thread stays free for rendering and MCP calls while a dialog is open.
All system functions are also available as MCP tools.
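As a sketch of wiring one of these functions into a component: the `onClick`/`action` nesting below is an assumption modeled on the `onDrop` and `functionCall` shapes used elsewhere in this README, not a verified schema:

```jsonl
{"componentId":"ping","type":"Button","props":{"label":"Notify me","onClick":{"action":{"functionCall":{"name":"notify","args":["Canopy","Button clicked"]}}}}}
```

The fixtures in `testdata/` show the exact event schema.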
The CameraView component shows a live camera preview using AVCaptureVideoPreviewLayer and supports still photo capture via AVCapturePhotoOutput. The AudioRecorder component records from the microphone with a native UI — record/stop button, level meter, and elapsed time display.
```jsonl
{"componentId":"cam","type":"CameraView","props":{"devicePosition":"front","mirrored":true},"style":{"width":320,"height":240}}
{"componentId":"rec","type":"AudioRecorder","props":{"format":"m4a","sampleRate":44100}}
```

Camera and audio recording are also available as headless functions (no component needed) — useful for MCP tools and evaluator expressions. Screen capture uses ScreenCaptureKit on macOS 14+ with a CGWindowList fallback for macOS 13.
Any component can accept dropped files and text:
```jsonl
{"componentId":"zone","type":"Card","props":{
  "dataBinding":"/dropped",
  "onDrop":{"action":{"event":{"name":"fileDrop"}}}
}}
```

Drop data (`{"paths":["/path/to/file"],"text":"..."}`) is written to the `dataBinding` path and the action event is fired. Under the hood, a transparent NSView overlay acts as the NSDraggingDestination.
Switch between dock app, menubar app, and invisible background app at runtime:
```jsonl
{"type":"setAppMode","mode":"menubar","icon":"bolt.fill","title":"Canopy"}
```

| Mode | Behavior |
|---|---|
| `"normal"` | Default — dock icon, windows, standard app |
| `"menubar"` | NSStatusItem in menu bar, no dock icon, click toggles windows |
| `"accessory"` | No dock icon, no menu bar — invisible background process |
In menubar mode the app stays alive when all windows close.
Every Canopy instance is an MCP server on stdin/stdout. Click buttons, fill text fields, read the data model, take screenshots, capture photos, record audio, send raw messages. Claude Code connects through .mcp.json for interactive development.
```sh
build/canopy mcp testdata/hello.jsonl   # dedicated MCP mode
build/canopy --mcp-http localhost:8080 ...  # also on HTTP
```

Tools include: click, fill, toggle, interact, get_tree, get_component, get_data_model, set_data_model, take_screenshot, get_layout, get_style, send_message, get_logs, list_surfaces, create_process, create_channel, publish, subscribe, notify, clipboard_read, clipboard_write, open_url, file_open, file_save, alert, camera_capture, camera_capture_headless, audio_recorder_toggle, audio_record_start, audio_record_stop, screen_capture, screen_record_start, screen_record_stop, and more.
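A `.mcp.json` entry for Claude Code might look like the sketch below. It uses the standard MCP server-config shape; the command and args mirror the `canopy mcp` invocation above, but treat the exact arguments as an assumption:

```json
{
  "mcpServers": {
    "canopy": {
      "command": "build/canopy",
      "args": ["mcp", "testdata/hello.jsonl"]
    }
  }
}
```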
Any surface can switch between light and dark appearance at runtime. Cards, buttons, text, backgrounds all follow the system theme.
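The message type, per the theme_switcher sample below, is `setTheme`. A sketch — the field names other than `type` are assumptions:

```jsonl
{"type":"setTheme","surfaceId":"main","theme":"dark"}
```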
Load any .dylib at runtime and call its functions from component expressions. No wrappers.
```jsonl
{"type":"loadLibrary","path":"libcurl.dylib","prefix":"curl","functions":[
  {"name":"version","symbol":"curl_version","returnType":"string","paramTypes":[]}
]}
```

The sysinfo sample app loads libcurl, libsqlite3, and libz, calling their version functions and `compressBound` to show computed results.
All in sample_apps/. Each has a prompt.txt (the human-language spec) and prompt.jsonl (the cached LLM output).
| App | What it shows |
|---|---|
| calculator | defineComponent + defineFunction, dark theme, grid layout |
| notes_llm | Three-pane SplitView, OutlineView, RichTextEditor, search |
| todo | List with add/remove, data binding, item count |
| sysinfo | Native FFI calling libcurl, libsqlite3, libz |
| dashboard | Cards, stats, nested layout |
| settings | ChoicePicker, Slider, CheckBox, DateTimeInput |
| theme_switcher | setTheme, light/dark toggle |
| scrollable_feed | Scrollable List with 15 Card items |
| channel_demo | Pub/sub channels between processes |
| live_monitor | Background processes with interval transport |
```sh
make run-app A=calculator   # run from cache (or generate if no cache)
make regen-app A=notes_llm  # force-regenerate from LLM
```

A Canopy app is a directory. `canopy myapp/` looks for `app.jsonl` or `main.jsonl` as the entry point, falling back to alphabetical order.
```
myapp/
  app.jsonl         # entry point
  components.jsonl  # defineComponent definitions
  functions.jsonl   # defineFunction definitions
  assets/           # images, fonts, audio
  prompt.txt        # LLM prompt for regeneration
```
`app.jsonl` uses `include` to pull in the rest:

```jsonl
{"type":"include","path":"components.jsonl"}
{"type":"include","path":"functions.jsonl"}
{"type":"createSurface","surfaceId":"main","title":"My App","width":800,"height":600}
{"type":"updateComponents","surfaceId":"main","components":[...]}
```

Add a `canopy.json` manifest for bundling metadata:

```json
{"name":"My App","version":"1.0.0","type":"app","entry":"app.jsonl","icon":"icon.icns","bundleId":"com.example.myapp"}
```

Then `canopy bundle myapp/` creates `My App.app` — a standalone macOS app that runs without the CLI.
```
JSONL source --> Transport --> Engine (Go) --> CGo --> AppKit (ObjC) --> window
      ^                           |                                        |
      +--- user actions (clicks, input, toggles) <-------------------------+
```
The engine maintains a component tree, a JSON data model, and a binding tracker. When the data model changes (user input, LLM update, process message), the binding tracker finds affected components and re-renders them. All rendering happens on the main thread through a dispatcher.
The protocol is A2UI JSONL. Each line is a JSON message: createSurface, updateComponents, updateDataModel, etc. Any source that produces these messages can drive the UI.
```sh
make build            # build binary to build/canopy
canopy bundle myapp/  # create standalone .app from any Canopy app
make test             # headless unit + integration tests (387 tests, -race)
make verify           # screenshot every fixture (48 fixtures)
make check            # test + verify (the gate before commits)
```

```
protocol/         JSONL parsing and message types
engine/           session, surfaces, data model, bindings, resolver, library, cache, FFI
renderer/         Renderer interface (platform-agnostic) + mock for tests
platform/darwin/  CGo + ObjC implementation of Renderer (25 components + native APIs)
transport/        file, directory, watch, LLM, Claude Code, interval transports
mcp/              MCP server (JSON-RPC 2.0, stdin/stdout + HTTP)
cmd/              CLI subcommands (pkg, bundle)
pkg/              package registry and GitHub integration
testdata/         48 JSONL fixtures
sample_apps/      10 LLM-generated apps
```
Unit and integration tests run headless with a mock renderer. Screenshot verification builds the real binary and captures every fixture. Native e2e tests run with real AppKit and assert on computed frames, fonts, and colors. MCP tests drive a running instance through tool calls.
```sh
make test                                           # headless (CI-safe)
make verify                                         # screenshots (needs display)
build/canopy test testdata/contact_form_test.jsonl  # native e2e
```

macOS only for now. The engine, protocol, and transport layers are pure Go. The rendering layer sits behind a Renderer interface; adding Linux (GTK4) or Windows (WinUI 3) means writing one package that implements that interface.
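As a sketch of what that boundary could look like — the method names and signatures below are illustrative assumptions, not Canopy's actual `renderer` package:

```go
package main

import "fmt"

// Renderer is a hypothetical sketch of the platform-agnostic boundary:
// the engine calls it; platform/darwin (or a future GTK4/WinUI 3 package)
// implements it. Method names here are assumptions, not Canopy's real API.
type Renderer interface {
	CreateSurface(id, title string, width, height int) error
	UpdateComponent(surfaceID, componentID string, props map[string]any) error
	TeardownSurface(id string) error
}

// MockRenderer records calls instead of drawing, the way headless tests might.
type MockRenderer struct{ Calls []string }

func (m *MockRenderer) CreateSurface(id, title string, w, h int) error {
	m.Calls = append(m.Calls, fmt.Sprintf("createSurface %s %q %dx%d", id, title, w, h))
	return nil
}

func (m *MockRenderer) UpdateComponent(sid, cid string, props map[string]any) error {
	m.Calls = append(m.Calls, "updateComponent "+sid+"/"+cid)
	return nil
}

func (m *MockRenderer) TeardownSurface(id string) error {
	m.Calls = append(m.Calls, "teardown "+id)
	return nil
}

func main() {
	var r Renderer = &MockRenderer{}
	r.CreateSurface("main", "Hello", 800, 600)
	r.UpdateComponent("main", "greeting", map[string]any{"content": "Hi"})
	fmt.Println(len(r.(*MockRenderer).Calls)) // prints 2
}
```

A port supplies one concrete type like this per platform; the engine, protocol, and transports stay untouched.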
MIT









