Run Coding Agents From Your Phone With Telegram
Why Telegram works as a mobile control surface for coding agents, and how TelePi and TeleCodex turn voice notes, screenshots, and short prompts into real coding workflows.
Engineering Notes
Deep dives into how Ora and htmlctl are designed, shipped, and iterated — from system prompts to deployment pipelines.
A practical comparison of the two Telegram coding-agent bridges: context model, mobile UX, ASR, attachments, launch profiles, and when each one fits better.
TeleCodex turns Codex into a mobile-first workflow: voice transcription, launch-profile switching, per-topic sessions, file exchange, and instant handback to the CLI from Telegram.
The broader essay behind TelePi and TeleCodex: why steering coding agents from your phone makes sense once most of the job becomes supervision instead of typing.
After canceling Claude Code, I doubled down on the Pi coding harness and built a Telegram bridge — bi-directional session hand-off, cross-workspace switching, and model selection, all from a chat interface I was already using.
Claude Code's shared usage limits were a bad fit for my architecture-first workflow, Google AI Pro felt siloed, and GitHub Copilot Pro+ gave me a broader multi-model bench for planner, coder, and reviewer roles.
Restoring the newsletter forced more than a backend rebuild. It led to a reusable extension model with same-origin routing, compatibility validation, and safer runtime boundaries, all while leaving static releases intact.
I tried to share an htmlctl update on social media. The preview was blank. That moment led to shipping automatic OG image generation, favicon support, robots.txt, and a sitemap generator — making discoverability effortless by default.
Ora is a good reminder of how fast the baseline is moving. If a solo developer can build a voice-first tool that understands speech, uses local or cloud models, remembers context, and acts across Mac apps, users will soon expect that kind of flexibility everywhere.
Ora registers 41 tools. Sending all their schemas to the LLM on every turn was expensive and noisy. Dynamic tool discovery added lazy loading via a client-side discovery index, shrinking the initial prompt's tool block from ~1,200 tokens to ~450.
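The pattern behind that change can be sketched as follows: the per-turn prompt carries only a compact name-and-summary index, and one meta-tool lets the model fetch full schemas on demand. This is a minimal sketch under assumptions; `FULL_SCHEMAS`, `discovery_index`, and `describe_tools` are illustrative names, not Ora's actual API.

```python
# Full schemas live client-side; only two example tools shown here.
FULL_SCHEMAS = {
    "open_app": {
        "summary": "Open a macOS application by name.",
        "parameters": {"name": {"type": "string"}},
    },
    "read_clipboard": {
        "summary": "Return the current clipboard contents.",
        "parameters": {},
    },
    # ...dozens more in a real registry
}

def discovery_index() -> str:
    """Compact block sent on every turn: tool name + one-line summary only."""
    return "\n".join(
        f"- {name}: {schema['summary']}"
        for name, schema in FULL_SCHEMAS.items()
    )

def describe_tools(names: list[str]) -> dict:
    """Meta-tool the model calls to lazily fetch full schemas by name."""
    return {n: FULL_SCHEMAS[n] for n in names if n in FULL_SCHEMAS}
```

The token savings come from the asymmetry: the index costs a line per tool, while full JSON schemas are only paid for when the model actually decides to use a tool.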