# Pi Integration
This page is migrated from the previous canonical Markdown guide.
## Source

- Original file: `docs/PI_INTEGRATION.md`
## Content
Chatons relies on Pi Coding Agent to avoid rebuilding a full AI execution stack in the application layer.
## Runtime resolution
In the local app environment, Chatons executes Pi through its internal runtime:

- the bundled `@mariozechner/pi-coding-agent/dist/cli.js` when available
- otherwise `<Chatons userData>/.pi/agent/bin/pi`
Important: `bin/pi` is not the sole source of truth for runtime behavior.
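The fallback order above can be sketched as a small helper. This is a sketch only: the function name and the `appRoot`/`userDataDir` parameters are assumptions for illustration, not the actual code in `electron/ipc/workspace.ts`.

```typescript
import * as fs from "fs";
import * as path from "path";

// Resolve the Pi entry point: prefer the bundled CLI, otherwise fall
// back to the agent binary under the Chatons userData directory.
// `appRoot` and `userDataDir` are hypothetical parameters.
export function resolvePiRuntime(appRoot: string, userDataDir: string): string {
  const bundled = path.join(
    appRoot,
    "node_modules",
    "@mariozechner",
    "pi-coding-agent",
    "dist",
    "cli.js"
  );
  if (fs.existsSync(bundled)) return bundled;
  return path.join(userDataDir, ".pi", "agent", "bin", "pi");
}
```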
## Practical flow
- Pi loads user configuration from the detected configuration directory.
- Pi builds the available provider/model registry.
- Pi optionally applies model scope using `enabledModels`.
- Pi starts interactive mode or runs a one-shot command.
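In data terms, the middle two steps reduce to building a registry and optionally narrowing it. A minimal sketch, where every name and shape (`Settings`, `Model`, `buildRegistry`, `applyScope`, the second model entry) is invented for illustration; only the `enabledModels` key and the `provider/modelId` format come from this guide:

```typescript
// Hypothetical shapes mirroring the flow: settings carry an optional
// scope, the registry is the full provider/model list.
interface Settings {
  enabledModels?: string[]; // model keys like "provider/modelId"
}

interface Model {
  provider: string;
  id: string;
}

// Step 2: build the full registry (stubbed with static data here;
// the second entry is a made-up placeholder).
function buildRegistry(): Model[] {
  return [
    { provider: "openai-codex", id: "gpt-5.3-codex" },
    { provider: "example-provider", id: "example-model" },
  ];
}

// Step 3: apply the optional scope from enabledModels.
function applyScope(all: Model[], settings: Settings): Model[] {
  if (!settings.enabledModels) return all; // no scope: keep everything
  const keys = new Set(settings.enabledModels);
  return all.filter((m) => keys.has(`${m.provider}/${m.id}`));
}
```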
## Important files
- `models.json`: custom provider/model definitions
- `settings.json`: global preferences including `enabledModels`
- internal Pi agent dir: `<Chatons userData>/.pi/agent`
- app runtime resolution logic: `electron/ipc/workspace.ts`
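As an illustration of how `settings.json` scopes the registry, a hypothetical file might look like this. Only the `enabledModels` key is documented here; the surrounding shape is an assumption.

```json
{
  "enabledModels": [
    "openai-codex/gpt-5.3-codex"
  ]
}
```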
## Scoped models vs all models
Pi distinguishes between:
- All models: what `pi --list-models` returns
- Scoped models: the subset defined in `settings.json > enabledModels`
Model key convention: `provider/modelId`, for example `openai-codex/gpt-5.3-codex`.
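A key of this form splits on the first `/`. A small helper sketch (the function name is hypothetical, and treating only the first slash as the separator is a simplifying assumption, in case model ids themselves contain slashes):

```typescript
// Split a model key of the form "provider/modelId" into its parts.
// Only the first "/" is treated as the separator (an assumption).
export function parseModelKey(key: string): { provider: string; modelId: string } {
  const slash = key.indexOf("/");
  if (slash === -1) throw new Error(`invalid model key: ${key}`);
  return { provider: key.slice(0, slash), modelId: key.slice(slash + 1) };
}
```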
## Expected dashboard behavior
- the default selector shows scoped models
- `more` shows all models
- starring a model updates `settings.json > enabledModels`
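Starring can be modeled as toggling a key in `enabledModels`. A sketch of that update step, assuming a flat JSON settings file; the function name, `settingsPath` parameter, and synchronous file I/O are illustration choices, not the app's actual wiring:

```typescript
import * as fs from "fs";

// Toggle a model key in settings.json > enabledModels and persist it.
// `settingsPath` is a hypothetical parameter for illustration.
export function toggleStar(settingsPath: string, modelKey: string): string[] {
  const settings = JSON.parse(fs.readFileSync(settingsPath, "utf8"));
  const enabled: string[] = settings.enabledModels ?? [];
  settings.enabledModels = enabled.includes(modelKey)
    ? enabled.filter((k) => k !== modelKey) // unstar: remove the key
    : [...enabled, modelKey]; // star: add the key
  fs.writeFileSync(settingsPath, JSON.stringify(settings, null, 2));
  return settings.enabledModels;
}
```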