Kong works with any OpenAI API-compatible endpoint. Run analysis with local models (Ollama, vLLM), third-party providers (OpenRouter), or your own infrastructure.

Supported platforms

| Platform | Example URL | Notes |
|---|---|---|
| Ollama | http://localhost:11434/v1 | Local models, no API key needed |
| vLLM | http://localhost:8000/v1 | High-performance local serving |
| OpenRouter | https://openrouter.ai/api/v1 | Multi-provider gateway, requires API key |
| LocalAI | http://localhost:8080/v1 | Local models |
| LM Studio | http://localhost:1234/v1 | Desktop app for local models |

Setup

Option 1: Interactive setup

```shell
kong setup
# Choose option 3: Custom endpoint
# Enter your endpoint URL, model name, and optional API key
```

Option 2: CLI flags

```shell
kong analyze ./binary \
  --provider custom \
  --base-url http://localhost:11434/v1 \
  --model mistral
```

Limit overrides

Local models often have smaller context windows than cloud APIs. Use these flags to prevent prompt truncation:
| Flag | Default (cloud) | Suggested for local |
|---|---|---|
| --max-prompt-chars | 400,000 | 50,000 - 150,000 |
| --max-chunk-functions | 120 | 20 - 40 |
| --max-output-tokens | 16,384 | 4,096 - 8,192 |
A typical invocation with these overrides:

```shell
kong analyze ./binary \
  --provider custom \
  --base-url http://localhost:11434/v1 \
  --model mistral \
  --max-prompt-chars 100000 \
  --max-chunk-functions 30 \
  --max-output-tokens 4096
```
You can also set these during kong setup so they persist as defaults for your custom endpoint.
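To pick a value for `--max-prompt-chars`, a common rule of thumb is roughly four characters per token: subtract your output budget from the model's context window, then multiply the remainder by four. A minimal sketch (the 32k context size is an assumed example, not a Kong default; check your model's actual window):

```shell
# Rough sizing for --max-prompt-chars (heuristic: ~4 characters per token).
# CTX_TOKENS is an assumption for illustration; use your model's real context window.
CTX_TOKENS=32768      # e.g. a 32k-context local model
MAX_OUTPUT=4096       # matches --max-output-tokens above
PROMPT_TOKENS=$((CTX_TOKENS - MAX_OUTPUT))
MAX_PROMPT_CHARS=$((PROMPT_TOKENS * 4))
echo "$MAX_PROMPT_CHARS"   # 114688
```

For this example the result lands inside the 50,000 - 150,000 range suggested in the table.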

Endpoint probing

Kong validates that the endpoint is reachable before starting analysis. If your local server is down, you’ll get an error immediately instead of waiting for Ghidra to finish loading the binary:
```
Could not connect to LLM endpoint.
Ensure your server is running at http://localhost:11434/v1
```
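You can run the same reachability check by hand before starting a long analysis. `/v1/models` is the standard model-listing route on OpenAI-compatible servers; the URL below is an example, so substitute your own:

```shell
# Manual reachability check against an OpenAI-compatible server.
# -s: silent, -f: fail on HTTP errors, --max-time: don't hang on a dead host.
if curl -sf --max-time 5 "http://localhost:11434/v1/models" > /dev/null; then
  STATUS="reachable"
else
  STATUS="unreachable"
fi
echo "endpoint $STATUS"
```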

Cost tracking

Cost tracking is disabled for custom providers since Kong has no pricing data for arbitrary models. Token counts are still recorded and shown in the final stats:
```
Tokens: 1,234,567 in / 456,789 out / 1,691,356 total
  mistral: 234 calls, 1,234,567 in / 456,789 out
Cost tracking disabled for custom provider
```
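If you still want a cost figure, you can compute one yourself from the reported token counts and your provider's per-million-token prices. A sketch using the counts above; the $0.50 and $1.50 rates are placeholders, not real pricing:

```shell
# Estimate spend from Kong's reported token counts.
# in_price / out_price are placeholder $-per-million-token rates.
IN_TOKENS=1234567
OUT_TOKENS=456789
awk -v in_t="$IN_TOKENS" -v out_t="$OUT_TOKENS" \
    -v in_price=0.50 -v out_price=1.50 \
    'BEGIN { printf "$%.2f\n", in_t/1e6*in_price + out_t/1e6*out_price }'
# prints $1.30
```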

Last modified on March 20, 2026