## Synopsis
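The usage line below is a sketch inferred from the argument and options documented on this page; the `analyze` subcommand name is an assumption, not confirmed by the source.

```
kong analyze [OPTIONS] BINARY
```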
## Description
Analyze a binary with Kong's autonomous agent. Opens the binary in Ghidra, runs the five-phase pipeline (triage, analysis, cleanup, synthesis, export), and writes the results to the output directory. Requires `kong setup` to have been run first.
## Arguments
| Argument | Type | Required | Description |
|---|---|---|---|
| BINARY | Path | Yes | Path to the binary file to analyze. Must exist. |
## Options
| Flag | Type | Default | Description |
|---|---|---|---|
| --headless | Flag | false | Run without the TUI. Events are printed to stdout. Useful for CI/Docker. |
| -o, --output | Path | ./kong_output | Output directory for results. |
| -f, --format | Choice | source, json | Output format(s). Choices: source, json, ghidra. Can be specified multiple times. |
| --ghidra-dir | Path | Auto-detected | Override the Ghidra installation directory. |
| -p, --provider | Choice | From setup | LLM provider. Choices: anthropic, openai, custom. |
| -m, --model | String | Provider default | Override the LLM model name. |
| --base-url | String | — | Custom OpenAI-compatible endpoint URL. Implies --provider custom. |
| --max-prompt-chars | Integer | Model default | Override the maximum prompt size in characters. |
| --max-chunk-functions | Integer | Model default | Override the maximum number of functions per LLM batch. |
| --max-output-tokens | Integer | Model default | Override the maximum output tokens. |
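As a hedged illustration of how the endpoint-override flags compose (the binary path, endpoint URL, and model name below are placeholders, not defaults from this page):

```shell
# Point the analysis at a local OpenAI-compatible server.
# --base-url implies --provider custom, so -p can be omitted.
kong analyze ./firmware.bin \
  --base-url http://localhost:8080/v1 \
  --model my-local-model \
  --max-output-tokens 4096 \
  --headless
```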
## Global Options
| Flag | Type | Default | Description |
|---|---|---|---|
| -v, --verbose | Flag | false | Enable verbose debug logging. Inherited by all subcommands. |
| --version | Flag | — | Show the Kong version and exit. |
## Examples
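The invocations below are illustrative sketches assembled from the flags documented above; the sample paths and the `analyze` subcommand name are assumptions.

```shell
# Interactive analysis with the default output directory (./kong_output)
kong analyze ./samples/target.bin

# Headless run for CI, emitting both decompiled source and JSON
kong analyze ./samples/target.bin --headless -f source -f json -o ./results

# Verbose run with an explicit Ghidra install and provider
kong analyze ./samples/target.bin --verbose --ghidra-dir /opt/ghidra --provider anthropic
```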
## Further reading
- Analyzing a Binary — workflow guide
- Output Formats — details on each output format
- Custom Endpoints — setting up local models

