OCP chat widget with a local AI running in your browser — zero cloud API tokens. Open the chat widget (bottom-right) to try it.
Tier 3 is the processMessage callback, consulted only when neither local tier can handle the request.

Click a query to send it to the chat widget. High-confidence queries use Tier 1 (regex); ambiguous ones use Tier 2 (browser LLM).
```js
// Option 1: Just enable it (sensible defaults)
OCP.init({
  handlers: { /* your tool handlers */ },
  widget: { browserLLM: true },
});

// Option 2: Fine-tune the behavior
OCP.init({
  handlers: { /* your tool handlers */ },
  widget: {
    browserLLM: {
      model: 'SmolLM2-360M-Instruct-q4f16_1-MLC',
      loadStrategy: 'on-widget-open',
      confidenceThreshold: 0.7,
      maxTokens: 256,
    },
    // Cloud LLM is still the Tier 3 fallback
    processMessage: async (msg, invoke) => { /* ... */ },
  },
});
```
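The three-tier routing the widget performs can be sketched as a standalone function. This is an illustrative sketch, not OCP internals: `routeMessage`, `localLLM`, and the `TIER1_INTENTS` table are hypothetical stand-ins; only `processMessage` and `confidenceThreshold` come from the config above.

```js
// Hypothetical sketch of the tiered routing described above.
// Tier 1: regex intents; Tier 2: browser LLM if confident enough;
// Tier 3: the processMessage cloud fallback.
const TIER1_INTENTS = [
  { pattern: /^\s*(hi|hello)\b/i, reply: () => 'Hello! How can I help?' },
];

async function routeMessage(msg, { localLLM, processMessage, confidenceThreshold = 0.7 }) {
  // Tier 1: a high-confidence regex match needs no model at all.
  for (const intent of TIER1_INTENTS) {
    if (intent.pattern.test(msg)) return { tier: 1, reply: intent.reply(msg) };
  }
  // Tier 2: ask the in-browser LLM, but only trust it above the threshold.
  const { reply, confidence } = await localLLM(msg);
  if (confidence >= confidenceThreshold) return { tier: 2, reply };
  // Tier 3: both local tiers failed, so fall back to the cloud callback.
  return { tier: 3, reply: await processMessage(msg) };
}
```

The key design point is that each tier is strictly cheaper than the next, so the cloud callback is only invoked for the residue of queries the local tiers cannot resolve.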