Getting Started with GhidrAssist

This guide helps you install GhidrAssist, configure an LLM provider, and run your first analysis in Ghidra.

Prerequisites

Before installing GhidrAssist, ensure you have a working Ghidra installation, since GhidrAssist runs inside Ghidra.

Installation

GhidrAssist is installed as a Ghidra extension.

Step 1: Install the Extension

Option A: Extension Manager (Recommended)

  1. Download the GhidrAssist release ZIP
  2. Open Ghidra
  3. Go to File → Install Extensions
  4. Click the + button and select the ZIP
  5. Enable the extension and restart Ghidra

Option B: Manual Install

  1. Copy the release ZIP into:
    • Ghidra_Install/Extensions/Ghidra/
  2. Restart Ghidra
  3. Enable the extension in File → Install Extensions

Step 2: Enable the Plugin

  1. Open or create a project
  2. Launch CodeBrowser
  3. Go to File → Configure → Miscellaneous
  4. Check Enable GhidrAssist

Step 3: Open GhidrAssist

  1. In CodeBrowser, open Window → GhidrAssist
  2. The GhidrAssist panel appears with its tabbed interface

Initial Configuration

You need to configure at least one LLM provider.

Accessing Settings

  1. In the GhidrAssist panel, click the Settings tab
  2. The LLM Providers section appears at the top

Setting Up an LLM Provider

GhidrAssist supports multiple providers. Choose the one that fits your needs:

Option 1: Ollama (Local, Free, Private)

Ollama runs models locally, so your binaries and queries never leave your machine.

Step 1: Install Ollama

# Linux/macOS
curl -fsSL https://ollama.ai/install.sh | sh

# Windows: Download from https://ollama.ai/download

Step 2: Pull a Model and Start the Server

# General purpose model
ollama pull llama3.1:8b

# Reasoning model (recommended for complex analysis)
ollama pull gpt-oss:20b

# Start the server
ollama serve
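Before pointing GhidrAssist at Ollama, it can help to confirm the server is actually answering. A quick sketch, assuming the default port (11434) and that curl is installed; /api/tags is Ollama's endpoint for listing pulled models:

```shell
# Probe the local Ollama server; -sf makes curl silent and
# fail on HTTP errors, so the branch taken tells you the state.
if curl -sf http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is not reachable; is 'ollama serve' running?"
fi
```

If this reports the server as unreachable, fix that before continuing; the Test button in GhidrAssist will fail for the same reason.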

Step 3: Configure in GhidrAssist

  1. In Settings, click Add in LLM Providers
  2. Fill in:
    • Name: Ollama Local
    • Type: Ollama
    • Model: gpt-oss:20b
    • URL: http://localhost:11434
    • API Key: Leave empty
    • Max Tokens: 16384
  3. Click Save
  4. Click Test

Option 2: OpenAI Platform API

Use OpenAI models with a paid API key.

Step 1: Get an API Key

  1. Go to platform.openai.com
  2. Create an API key from the dashboard
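If you want to sanity-check the key before pasting it into GhidrAssist, you can query the platform's model-listing endpoint. A sketch, assuming curl is installed; the key value below is a placeholder:

```shell
# List models with your key; a successful response means the key
# is accepted. Replace the placeholder with your real key (sk-...).
OPENAI_API_KEY="sk-placeholder"
if curl -sf https://api.openai.com/v1/models \
     -H "Authorization: Bearer $OPENAI_API_KEY" > /dev/null; then
  echo "key accepted"
else
  echo "key rejected or network error"
fi
```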

Step 2: Configure in GhidrAssist

  1. Click Add in LLM Providers
  2. Fill in:
    • Name: OpenAI
    • Type: OpenAI Platform API
    • Model: gpt-5.2-codex
    • URL: Leave empty (default)
    • API Key: Paste your API key
    • Max Tokens: 20000
  3. Click Save
  4. Click Test

Option 3: Anthropic Platform API

Use Claude models with a paid API key.

Step 1: Get an API Key

  1. Go to console.anthropic.com
  2. Create an API key
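As with OpenAI, you can verify the key from the command line before configuring GhidrAssist. A sketch, assuming curl is installed; the key value is a placeholder, and the anthropic-version header is required by the API:

```shell
# List available models; a successful response means the key works.
# Replace the placeholder with your real key.
ANTHROPIC_API_KEY="sk-ant-placeholder"
if curl -sf https://api.anthropic.com/v1/models \
     -H "x-api-key: $ANTHROPIC_API_KEY" \
     -H "anthropic-version: 2023-06-01" > /dev/null; then
  echo "key accepted"
else
  echo "key rejected or network error"
fi
```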

Step 2: Configure in GhidrAssist

  1. Click Add in LLM Providers
  2. Fill in:
    • Name: Anthropic Claude
    • Type: Anthropic Platform API
    • Model: claude-sonnet-4-5
    • URL: Leave empty (default)
    • API Key: Paste your API key
    • Max Tokens: 20000
  3. Click Save
  4. Click Test

Option 4: OAuth Providers (Claude Pro/Max or ChatGPT Pro/Plus)

If you have a Claude Pro/Max or ChatGPT Pro/Plus subscription, use OAuth instead of an API key.

Claude Pro/Max:

  1. Click Add in LLM Providers
  2. Select Type: Anthropic OAuth
  3. Enter Name and Model (e.g., claude-sonnet-4-5)
  4. Click Authenticate
  5. A browser window opens for login
  6. After authorization, credentials are saved automatically
  7. Click Save

ChatGPT Pro/Plus:

  1. Click Add in LLM Providers
  2. Select Type: OpenAI OAuth
  3. Enter Name and Model (e.g., gpt-5.2-codex)
  4. Click Authenticate
  5. A browser window opens for login
  6. After authorization, credentials are saved automatically
  7. Click Save

Setting the Active Provider

  1. Use the Active Provider dropdown at the bottom of the LLM Providers section
  2. Select the provider you want to use

Your First Analysis

Step 1: Load a Binary

  1. Open a binary in Ghidra
  2. Wait for auto-analysis to complete

Step 2: Navigate to a Function

  1. In the Functions window, click a function
  2. Or press G and enter an address

Step 3: Explain the Function

  1. Open the GhidrAssist panel
  2. Click the Explain tab
  3. Click Explain Function
  4. Wait for the explanation to stream in

Step 4: Ask a Question

  1. Switch to the Query tab
  2. Type a question, for example:
    • “What does this function do?”
    • “Are there any security concerns here?”
    • “What functions does this call?”
  3. Click Submit
  4. Watch the response stream in

Next Steps

Explore these guides:

Troubleshooting

“Connection failed” when testing provider

  • Verify the URL and that the server is reachable (for Ollama, confirm ollama serve is running on port 11434)
  • For hosted providers, confirm the API key is valid and your network allows outbound HTTPS

No response from LLM

  • Check that the model name exactly matches a model available on the provider
  • Confirm the provider you configured is selected in the Active Provider dropdown

Plugin not appearing

  • Confirm the extension is enabled in File → Install Extensions and that Ghidra was restarted afterward
  • Confirm Enable GhidrAssist is checked under File → Configure → Miscellaneous

Slow responses

  • Local models are limited by your hardware; try a smaller model (e.g., llama3.1:8b)
  • Large functions produce large prompts; responses stream in, so partial output appears while generation continues