Last updated: May 7, 2026

This page is a placeholder. Replace this content with your own legal copy before going live with paying customers. The notes below describe what WatchdogAI actually does so you have an accurate factual baseline.

What WatchdogAI is

WatchdogAI is a Laravel package installed inside your Laravel application. The package itself does not transmit any data to its authors: it persists events to your database, sends LLM analysis requests to your chosen provider using your own API key, and renders a dashboard on your own server.

Data collected by the package

  • Watchdog events: source name, event name, level, message, optional exception class/file/line/trace, optional context data (sanitized to strip authorization / cookie / x-api-key headers by default), correlation IDs, timestamps. Stored in your watchdog_events table.
  • AI analyses: LLM responses summarizing event windows. Stored in ai_log_analyses.
  • Token usage aggregates: per-day, per-provider counts of prompt/completion tokens and request counts. Stored in watchdog_ai_token_usage.
  • Provider settings: per-provider configuration. API keys are encrypted at rest using your application's APP_KEY. Stored in watchdog_ai_providers.
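To make the default sanitization concrete, the redaction described above could be expressed as a config fragment along the following lines. This is an illustrative sketch only; the option and key names below are assumptions, not the package's actual config schema:

```php
// config/watchdog-ai.php (illustrative fragment; option names are hypothetical)
return [
    'sanitize' => [
        // Header/context keys stripped from stored event context by default.
        'redact_keys' => [
            'authorization',
            'cookie',
            'x-api-key',
        ],
        // Value written in place of anything redacted.
        'replacement' => '[redacted]',
    ],
];
```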

Data sent to LLM providers

When you click "Suggest Fix" or the AI Monitor cron runs, the package sends a request to the LLM provider you configured. That request contains:

  • The event messages and metadata in the analysis window
  • For Suggest Fix: the relevant source code (~50 lines of context around the offending line) from your app/ directory
  • A system prompt and your application's name/description from your config
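To make the scope of an outbound request concrete, a "Suggest Fix" request might carry a payload shaped roughly like the sketch below. Every field name and value here is illustrative only; the actual wire format depends on the provider's API and on how the package assembles its prompt:

```json
{
  "system_prompt": "Analyze these application events. App: <name/description from your config>",
  "events": [
    {
      "source": "orders",
      "level": "error",
      "message": "Undefined array key \"total\"",
      "exception": { "class": "ErrorException", "file": "app/Services/Checkout.php", "line": 42 }
    }
  ],
  "code_context": "...~50 lines of the relevant app/ source file around the offending line..."
}
```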

You should review the privacy policy of whichever LLM provider you use (Groq, Anthropic, xAI, OpenAI) to understand how they handle the data they receive. Ollama runs locally on your server and does not send data anywhere.

Data the package does NOT collect

The package authors and SICL do not receive any telemetry, analytics, or usage data from your installation. There is no "phone home". Network traffic from the package only goes to (a) the LLM provider you configured and (b) your own users via the dashboard you host.

Retention and deletion

Watchdog events and AI analyses are pruned automatically after the configured retention window (default: 30 days). You can change these retention windows or disable pruning in config/watchdog-ai.php.
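A sketch of how such a retention setting might look in config/watchdog-ai.php; the key names below are assumptions for illustration, not the package's actual schema:

```php
// config/watchdog-ai.php (illustrative fragment; key names are hypothetical)
return [
    'retention' => [
        // Days to keep rows in watchdog_events and ai_log_analyses
        // before automatic pruning (the default stated above is 30).
        'events_days'   => 30,
        'analyses_days' => 30,
        // Set to false to disable pruning entirely.
        'prune' => true,
    ],
];
```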

Contact

Questions about this policy: scgvegas@gmail.com.