## Synopsis

```
kong analyze [OPTIONS] BINARY
```
## Description

Analyze a binary with Kong's autonomous agent. Opens the binary in Ghidra, runs the five-phase pipeline (triage, analysis, cleanup, synthesis, export), and writes results to the output directory.

Requires `kong setup` to be run first.
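A minimal headless run can be sketched as a small wrapper script. The binary and output paths are illustrative, and `echo` stands in for the real invocation so the assembled command can be inspected before execution:

```shell
# Build a headless `kong analyze` invocation; BINARY and OUT are
# illustrative paths. echo makes this a dry run.
BINARY=./bin/target
OUT=./results
set -- kong analyze "$BINARY" --headless -o "$OUT" -f json
echo "$@"
# To execute for real, replace the echo line with: "$@"
```

Running the script prints `kong analyze ./bin/target --headless -o ./results -f json`, which is what would be executed.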
## Arguments

| Argument | Type | Required | Description |
|---|---|---|---|
| `BINARY` | Path | Yes | Path to the binary file to analyze. Must exist. |
## Options

| Flag | Type | Default | Description |
|---|---|---|---|
| `--headless` | Flag | `false` | Run without the TUI; events are printed to stdout. Useful for CI/Docker. |
| `-o, --output` | Path | `./kong_output` | Output directory for results. |
| `-f, --format` | Choice | `source`, `json` | Output format(s). Choices: `source`, `json`, `ghidra`. Can be specified multiple times. |
| `--ghidra-dir` | Path | Auto-detected | Override the Ghidra installation directory. |
| `-p, --provider` | Choice | From setup | LLM provider. Choices: `anthropic`, `openai`, `custom`. |
| `-m, --model` | String | Provider default | Override the LLM model name. |
| `--base-url` | String | — | Custom OpenAI-compatible endpoint URL. Implies `--provider custom`. |
| `--max-prompt-chars` | Integer | Model default | Override the maximum prompt size in characters. |
| `--max-chunk-functions` | Integer | Model default | Override the maximum number of functions per LLM batch. |
| `--max-output-tokens` | Integer | Model default | Override the maximum number of output tokens. |
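When targeting a small-context local model, the limit overrides usually need to be sized together. One rough way to pick `--max-prompt-chars` is from the model's context window, assuming about 4 characters per token (a common English-text heuristic, not something Kong documents):

```shell
# Size --max-prompt-chars from a model's context window.
# Assumption: ~4 characters per token (heuristic, not from Kong's docs).
CONTEXT_TOKENS=8192      # e.g. a small local model
RESERVED_OUTPUT=1024     # tokens reserved for the model's response
PROMPT_CHARS=$(( (CONTEXT_TOKENS - RESERVED_OUTPUT) * 4 ))
echo "$PROMPT_CHARS"     # → 28672
```

The computed value would then be passed as `--max-prompt-chars "$PROMPT_CHARS"`, possibly alongside a lower `--max-chunk-functions`.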
## Global Options

| Flag | Type | Default | Description |
|---|---|---|---|
| `-v, --verbose` | Flag | `false` | Enable verbose debug logging. Inherited by all subcommands. |
| `--version` | Flag | — | Show the Kong version and exit. |
## Examples

```shell
# Basic analysis with the default provider
kong analyze ./stripped_binary

# Headless mode for CI
kong analyze ./binary --headless -o ./results -f json

# Use a specific provider and model
kong analyze ./binary --provider openai --model gpt-4o-mini

# Custom endpoint (Ollama)
kong analyze ./binary --provider custom \
  --base-url http://localhost:11434/v1 \
  --model mistral

# All output formats with verbose logging
kong analyze ./binary -v -f source -f json -f ghidra

# Override model limits for small-context models
kong analyze ./binary --provider custom \
  --base-url http://localhost:11434/v1 \
  --model phi3 \
  --max-prompt-chars 50000 \
  --max-chunk-functions 20
```
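The JSON export (`-f json`) can be post-processed with standard tools. The filename and structure below are hypothetical stand-ins, since the exact output layout is not specified here; `python3 -m json.tool` validates and pretty-prints any JSON file regardless of schema:

```shell
OUT=./kong_output
mkdir -p "$OUT"
# Stand-in file: the real export's name and schema may differ.
printf '{"functions": 12}\n' > "$OUT/results.json"
python3 -m json.tool "$OUT/results.json"
```

Any JSON-aware tool (e.g. `jq`) would work equally well here.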
## Further reading