The definitive LLM analytics layer for your browser. Track tokens, score efficiency, and debug your prompt engineering workflow.
Granular usage tracking across every supported LLM provider. Know exactly what you're spending.
A proprietary algorithm rates prompt density: a lower score means a higher signal-to-noise ratio.
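The scoring algorithm itself is proprietary, but as a purely illustrative sketch, a naive signal-to-noise heuristic (function name, filler list, and scale are all hypothetical, not the shipped algorithm) might look like:

```javascript
// Hypothetical illustration only -- NOT the extension's proprietary algorithm.
// Scores a prompt by the fraction of filler words: a lower score means
// less filler, i.e. a higher signal-to-noise ratio.
const FILLER = new Set(["please", "basically", "just", "really", "very", "kindly"]);

function densityScore(prompt) {
  const words = prompt.toLowerCase().match(/[a-z']+/g) ?? [];
  if (words.length === 0) return 0;
  const filler = words.filter((w) => FILLER.has(w)).length;
  return filler / words.length; // 0 = pure signal, 1 = pure filler
}
```

A terse prompt like "Summarize this report" scores 0, while filler-heavy phrasing scores higher.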
Unified layer for ChatGPT, Claude, Grok, and OpenRouter. Context switching is for humans, not data.
Zero cloud dependency. Data lives in your browser's local storage. We can't see your prompts.
Standardized data export. Pipe your usage stats into your own dashboards or Excel.
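As a minimal sketch of what a standardized export could look like (the record fields `provider`, `promptTokens`, and `completionTokens` are assumptions, not the extension's actual schema), usage records serialize cleanly to CSV:

```javascript
// Minimal sketch: serialize usage records to CSV for spreadsheet import.
// The field names here are assumptions, not the extension's export schema.
function toCsv(records) {
  const header = "provider,promptTokens,completionTokens";
  const rows = records.map(
    (r) => `${r.provider},${r.promptTokens},${r.completionTokens}`
  );
  return [header, ...rows].join("\n");
}
```

The resulting string opens directly in Excel or feeds any dashboard that accepts CSV.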
Minimal footprint. No background processes when you aren't chatting. 50 kB package size.
Enable Developer Mode in Chrome, Brave, or Edge and load the unpacked extension.
Use your LLM of choice. The observer layer runs silently in the DOM.
Open the dashboard to view aggregated stats, trends, and optimization opportunities.
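The aggregation behind those stats can be sketched as a simple fold over observed events; a minimal illustration (the event shape and function name are hypothetical) of tallying per-provider token totals:

```javascript
// Hypothetical aggregation step: fold observed message events into
// per-provider token totals. The event fields are assumptions, not
// the extension's internal data model.
function aggregate(events) {
  const totals = {};
  for (const e of events) {
    totals[e.provider] = (totals[e.provider] ?? 0) + e.tokens;
  }
  return totals;
}
```

A dashboard view would then render these totals as the per-provider trend lines.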