Provider Credentials
Navvy stores provider configuration in the browser extension and uses a separate token handshake for trusted page integration.
Current storage
The shipped extension has two credential-related paths:
- LLM provider settings are stored in `chrome.storage.local` under keys such as `llmConfig`, `llmProfiles`, and `activeProfileId` (see the storage-read sketch after this list).
- Page integration access uses a generated user auth token stored in `chrome.storage.local`.
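A minimal sketch of reading the active profile from extension storage; the `LLMProfile` shape and `getActiveProfile` helper are assumptions for illustration, and only the storage keys come from the list above:

```ts
// Sketch: read the active provider profile from chrome.storage.local.
// The LLMProfile shape and helper name are assumptions; only the storage
// keys (llmProfiles, activeProfileId) come from the docs above.
interface LLMProfile {
  id: string;
  baseURL: string;
  model: string;
  apiKey?: string; // omitted for local providers such as Ollama
}

async function getActiveProfile(): Promise<LLMProfile | undefined> {
  const { llmProfiles = [], activeProfileId } = await chrome.storage.local.get([
    "llmProfiles",
    "activeProfileId",
  ]);
  return (llmProfiles as LLMProfile[]).find((p) => p.id === activeProfileId);
}
```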
The settings UI lets users configure provider profiles, base URLs, models, and API keys. These values are used by the extension-side LLM client when it sends OpenAI-compatible chat completion requests.
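As a sketch of that request path, assuming the `LLMProfile` shape above and a hypothetical `sendChat` helper, an OpenAI-compatible call could look like this:

```ts
// Sketch of an OpenAI-compatible chat completion request built from a profile.
// The sendChat name and message contents are illustrative, not the actual client code.
async function sendChat(
  profile: LLMProfile,
  messages: { role: string; content: string }[],
) {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (profile.apiKey) {
    headers["Authorization"] = `Bearer ${profile.apiKey}`; // hosted providers only
  }

  const res = await fetch(`${profile.baseURL}/chat/completions`, {
    method: "POST",
    headers,
    body: JSON.stringify({ model: profile.model, messages }),
  });
  if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
  return res.json();
}
```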
Provider API keys
Provider profiles contain the API key needed for hosted providers. The extension sends that key as a bearer token when the selected profile has an `apiKey`.
```ts
{
  baseURL: "https://api.openai.com/v1",
  model: "gpt-4.1",
  apiKey: "..."
}
```

For local providers such as Ollama, the profile can omit `apiKey` and point `baseURL` at a local chat-completions-compatible server.
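For a local setup, a profile might look like the sketch below; the model name is illustrative, and the base URL assumes Ollama's default OpenAI-compatible endpoint:

```ts
{
  // No apiKey: nothing is attached as a bearer token for local servers.
  baseURL: "http://localhost:11434/v1", // Ollama's default OpenAI-compatible endpoint
  model: "llama3.1",                    // illustrative model name
}
```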
Page integration token
The background entrypoint creates the page integration auth token if it does not exist. The content script compares the extension token with the matching page-side localStorage token. If the values match, it injects the main-world bridge and exposes the page API.
This token gates page-to-extension control for the PAGE_AGENT_EXT bridge. It is separate from provider API keys.
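A minimal sketch of that gate from the content-script side, assuming hypothetical key names and an `injectMainWorldBridge` helper; only the compare-then-inject flow comes from the description above:

```ts
// Sketch of the content-script token check. The storage key, localStorage key,
// and injectMainWorldBridge() are hypothetical; only the flow is from the docs.
declare function injectMainWorldBridge(): void;

const EXT_TOKEN_KEY = "pageAuthToken";   // extension side, chrome.storage.local
const PAGE_TOKEN_KEY = "navvyPageToken"; // page side, window.localStorage

async function maybeExposePageApi(): Promise<void> {
  const stored = await chrome.storage.local.get(EXT_TOKEN_KEY);
  const extensionToken: string | undefined = stored[EXT_TOKEN_KEY];
  const pageToken = window.localStorage.getItem(PAGE_TOKEN_KEY);

  // Only pages holding the matching token get the PAGE_AGENT_EXT bridge.
  if (extensionToken && pageToken && extensionToken === pageToken) {
    injectMainWorldBridge();
  }
}
```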
What can leave the browser
Current model requests can include the following (a payload sketch follows the list):
- The user task text from the side panel composer.
- The compact browser state generated from the DOM tree.
- Tool results, observations, retries, and error details.
- The provider API key in the request header when a hosted provider profile is selected.
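Purely as an illustration of those categories (the actual message layout is not documented here), an outbound request body might carry something like:

```ts
// Illustrative only: the kinds of data that can leave the browser in one request.
const taskText = "Find the cheapest flight...";      // side panel composer input
const compactBrowserState = "<compacted DOM state>"; // generated from the DOM tree
const lastToolObservation = "<tool result / error>"; // tool results, retries, errors

const body = {
  model: "gpt-4.1",
  messages: [
    { role: "user", content: taskText },
    { role: "user", content: compactBrowserState },
    { role: "user", content: lastToolObservation },
  ],
};
// With a hosted profile, an Authorization: Bearer <apiKey> header is attached as well.
```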
For local model endpoints, configure a provider profile without an API key and point `baseURL` at the local OpenAI-compatible server, as in the Ollama example above.