Pricing & Plans

What is the BYOK model and why does it matter?

BYOK (Bring Your Own Key) means you provide your own API keys for the LLM providers powering VixPro AI.

Supported providers:

  • Anthropic Claude
  • Google Gemini
  • OpenAI
  • xAI (Grok)
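Under a BYOK setup, each provider key would typically live in your own environment rather than in VixPro's. A minimal sketch of detecting which providers are configured, assuming conventional environment-variable names (the exact names VixPro AI reads are an assumption here, not taken from this page):

```python
import os

# Hypothetical environment-variable names for each supported provider.
# Check your deployment docs for the exact keys your installation reads.
PROVIDER_KEY_VARS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "google": "GOOGLE_API_KEY",
    "openai": "OPENAI_API_KEY",
    "xai": "XAI_API_KEY",
}

def configured_providers() -> list[str]:
    """Return the providers whose API key is present in the environment."""
    return [name for name, var in PROVIDER_KEY_VARS.items()
            if os.environ.get(var)]
```

Keeping keys in the environment (or a secrets manager) means the keys, and the billing attached to them, stay under your control.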

Why it matters:

  • Cost transparency — you see exactly what AI execution costs in your provider's billing dashboard, with no VixPro margin on top
  • Cost control — you set your own spending limits directly with the LLM provider
  • Model choice — you choose which model powers each handler independently

Per-handler model selection (five configurable handlers):

  • Incident Response — reactive agent for alert analysis (highest stakes, use your most capable model)
  • Chat Queries — natural language server queries
  • Health Checks — L1, L2, L3 and circle group checks (high volume, cost-sensitive)
  • Maintenance — scheduled maintenance analysis
  • Daily Summary — daily digest generation
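The handler-to-model mapping above can be pictured as a simple lookup table. This is an illustrative sketch, not VixPro AI's actual configuration format; the handler keys mirror the five handlers listed, and the model identifiers are placeholders you would replace with real model names from your chosen providers:

```python
# Hypothetical per-handler model map. Model names are placeholders.
HANDLER_MODELS = {
    "incident_response": "most-capable-model",  # highest stakes
    "chat_queries":      "general-purpose-model",
    "health_checks":     "small-fast-model",    # high volume, cost-sensitive
    "maintenance":       "general-purpose-model",
    "daily_summary":     "small-fast-model",    # digest generation
}

def model_for(handler: str) -> str:
    """Resolve the model configured for a handler, failing loudly on typos."""
    if handler not in HANDLER_MODELS:
        raise ValueError(f"unknown handler: {handler!r}")
    return HANDLER_MODELS[handler]
```

The point of per-handler selection is exactly this asymmetry: route the rare, high-stakes incident analysis to your strongest model while the high-volume health checks run on something cheap.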

Cost tracking uses published per-token API rates by default. If you have negotiated enterprise pricing, VixPro AI lets you override rates per model so your cost dashboard reflects your actual spend.
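The override logic can be sketched as follows. The rate figures and model name are illustrative placeholders, not real pricing, and this is a sketch of the idea rather than VixPro AI's implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rate:
    input_per_mtok: float   # USD per million input tokens
    output_per_mtok: float  # USD per million output tokens

# Published list rates (illustrative numbers only).
PUBLISHED_RATES = {"example-model": Rate(3.00, 15.00)}

# Negotiated enterprise rates; when present, these win.
RATE_OVERRIDES = {"example-model": Rate(2.40, 12.00)}

def cost_usd(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate spend for one call, preferring an override rate if set."""
    rate = RATE_OVERRIDES.get(model) or PUBLISHED_RATES[model]
    return (input_tokens * rate.input_per_mtok
            + output_tokens * rate.output_per_mtok) / 1_000_000
```

With an override in place, the dashboard figure tracks what your provider will actually invoice rather than the public list price.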

The same BYOK principle applies to PagerDuty, Twilio, and cloud provider credentials. VixPro AI is a coordination and execution layer — you own the relationships with every underlying service.

Ready to get started?

Try the live demo or explore pricing for your team.