Powers /ai/audit on skill pages and the client chat. Conversations are saved server-side for your account and mirrored to this browser for fast reads.
AI lens loader-composer
Paste a prompt. Watch which .agents.md loaders fire, inspect the composed system prompt verbatim, then run it against any OpenRouter model with your own key.
How it works. This is the same loader-composer the UserPromptSubmit hook runs on every claude CLI turn. Each skill can ship a companion <slug>.agents.md with a frontmatter description: "Triggers on prompt mention of '<phrase>'". When the prompt contains that phrase, the loader body is spliced into the system prompt before the LLM runs. The AI on this page is not a skill — it is the composer itself, made visible.
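The match-and-splice step above can be sketched in a few lines. This is an illustrative sketch, not the real implementation: the `Loader` shape, the function names, and the exact trigger regex are assumptions based on the frontmatter convention described here.

```typescript
// Hypothetical sketch of the UserPromptSubmit match-and-splice step.
type Loader = { slug: string; description: string; body: string };

// Frontmatter convention assumed above: Triggers on prompt mention of '<phrase>'
const TRIGGER_RE = /Triggers on prompt mention of '([^']+)'/;

function matchedLoaders(prompt: string, loaders: Loader[]): Loader[] {
  const lower = prompt.toLowerCase();
  return loaders.filter((l) => {
    const m = TRIGGER_RE.exec(l.description);
    // A loader fires when the prompt contains its trigger phrase.
    return m !== null && lower.includes(m[1].toLowerCase());
  });
}

function composeSystemPrompt(preamble: string, prompt: string, loaders: Loader[]): string {
  // Static preamble first, then each matched loader body spliced in
  // before the LLM runs; the user prompt travels separately.
  return [preamble, ...matchedLoaders(prompt, loaders).map((l) => l.body)].join("\n\n");
}
```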
Paste a prompt
Drop a skill tile or a raw .md loader onto this card to force-include it. Drag a chip out to remove it.
loaders: none yet — Analyze to auto-match, or drop a skill here to force-include
UserPromptSubmit hook · live trace
Triggers matched
Composed system prompt · what the model sees
The three blocks below are the verbatim wire format. Shading distinguishes role: (a) static preamble, (b) spliced on trigger match, (c) your prompt.
Side-by-side
Left: with the snappy-os hook. Right: without it — the naked prompt.
what the model sees with snappy-os hook
what the model sees without the hook
Run
Send the composed system prompt plus your original user prompt to OpenRouter, using your own API key.
BYO key. The Worker forwards the request to OpenRouter and returns the response. No server-side persistence, no logging, no read-back.
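The relay can be pictured as a pure request rebuild. A hedged sketch, assuming OpenRouter's OpenAI-compatible `/api/v1/chat/completions` endpoint; the function name and payload shape are illustrative, not the Worker's actual code.

```typescript
// Hypothetical sketch of the BYO-key relay: the Worker rebuilds the
// request with the caller's key and forwards it. Nothing is stored,
// logged, or read back.
const OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions";

function buildForward(userKey: string, payload: object): Request {
  return new Request(OPENROUTER_URL, {
    method: "POST",
    headers: {
      // The user's key passes through verbatim; the Worker never keeps it.
      Authorization: `Bearer ${userKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(payload),
  });
}
```

The actual Worker would `return fetch(buildForward(key, payload))` so the OpenRouter response streams straight back to the browser.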