MUDFLOOD AI

Multi-LLM AI Assistant for Unreal Engine 5

A native Slate-based plugin that integrates the world's most advanced language models directly into the Unreal Editor. Seamlessly introspect your projects, analyze blueprints, search code, and accelerate development with real-time AI assistance powered by Claude, GPT-4, Gemini, or local models.

UE 5.4+ · Claude · OpenAI · Gemini · 33 Source Files · 11 Tools · Native Slate UI
[Editor preview: the Mudflood AI chat panel docked in Unreal Editor 5.4, answering "What blueprint variables are in BP_Character?" with the blueprint's variable list (Health, MaxHealth, Speed, ...)]

LLM Providers

Choose from the world's most advanced language models, or run locally for complete privacy

Claude

Best-in-class reasoning with native tool use. The Anthropic Messages API provides the most robust streaming, structured tool responses, and content block architecture. Claude consistently delivers the most accurate project analysis and code understanding across complex codebases. Default model: claude-sonnet-4-5-20250514 with extended thinking capabilities for advanced problem-solving.

OpenAI GPT-4

Powerful function calling with Chat Completions API. Full support for GPT-4o with multimodal capabilities and advanced reasoning. Streaming responses with chunked message delivery. Perfect balance of capability and speed for rapid iteration and creative problem-solving in development workflows.

Google Gemini

Fast inference with Generative AI API and function declarations. Excellent for context understanding and project introspection. Competitive pricing with strong performance on code analysis and technical documentation tasks. Supports long context windows for comprehensive project analysis.

Local Models

Run any OpenAI-compatible endpoint: Ollama, LM Studio, or your own inference server. Zero API costs, complete privacy, and offline capability. Perfect for proprietary projects or teams with data sovereignty requirements. Full streaming support with any local model.

Powerful Features

Everything you need to revolutionize your development workflow

🔍
Deep Project Introspection

Access 11 specialized tools that introspect your entire project. Read asset metadata, analyze blueprint graphs, enumerate world actors, search source code, and retrieve class hierarchies. Complete visibility into your project structure.

🔗
Multi-Provider Architecture

Unified interface for Claude, GPT-4, Gemini, and local models. Hot-swap providers without closing the editor. Extensible provider pattern makes adding custom endpoints effortless. Choose the best model for each task.

⚡
Real-time Streaming

See responses appear instantly as they stream. Native implementation for Claude/OpenAI with chunked Gemini support. Progressive UI updates create a responsive, interactive experience. No waiting for complete responses.

🛠️
Native Tool Use

Integrated function calling system. The AI automatically selects and chains tools to answer complex questions. Supports parameter inference and structured JSON responses. Tools work seamlessly with conversation context.

📌
Dockable Chat Panel

Native Slate widget docks anywhere in the editor interface. Open with Ctrl+Shift+M or from the Tools menu. Persists position and state across editor sessions. Fully integrated with UE5's UI paradigms.

📊
Blueprint Analysis

Deep inspection of blueprint variables, functions, event graphs, node connections, components, and interfaces. Understand blueprint logic, identify issues, and get optimization suggestions from the AI assistant.

🔎
Code Search & Read

Full-text search across all .h and .cpp files in your project. Read source files with configurable line limits for efficient context. Automatically scoped to your project directory for safety and performance.

🧩
Extensible Design

Add new LLM providers, create custom tools, build custom UI panels, and hook into delegates. Well-documented extension points and a clean architecture make extending functionality straightforward for C++ developers.

11 Specialized Tools

Comprehensive introspection capabilities for complete project understanding

search_assets

Search the Asset Registry by name and class. Returns matched assets with paths and metadata. Essential for finding blueprints, materials, and other project assets by partial name matching or filter criteria.

get_blueprint_details

Retrieve complete information about a blueprint including variables, functions, event graph connections, components, and interfaces. Essential for understanding blueprint structure and logic flow.

read_source_file

Read C++ source files (.h, .cpp) with configurable line limits. Returns code with line numbers and proper context. Automatically enforces safety boundaries to prevent reading outside the project.

search_code

Full-text search across all C++ source files in your project. Returns matching lines with file paths and line numbers. Perfect for finding functions, classes, and specific code patterns across large codebases.
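The result shape described above can be sketched in a few lines of standard C++. The function name and `Match` struct are illustrative, and the "file" is an in-memory string so the sketch stays self-contained; the real tool walks `.h`/`.cpp` files on disk.

```cpp
#include <sstream>
#include <string>
#include <vector>

// One search hit: a 1-based line number plus the matching line's text.
struct Match { int Line; std::string Text; };

// Scans Source line by line and collects every line containing Needle,
// mirroring the "matching lines with line numbers" output of search_code.
std::vector<Match> SearchLines(const std::string& Source, const std::string& Needle) {
    std::vector<Match> Out;
    std::istringstream In(Source);
    std::string Line;
    for (int N = 1; std::getline(In, Line); ++N)
        if (Line.find(Needle) != std::string::npos)
            Out.push_back({N, Line});
    return Out;
}
```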

get_project_info

Retrieve high-level project information including name, version, target engine version, and paths. Get summary statistics about blueprints, C++ classes, and project plugins.

list_world_actors

Enumerate all actors in the current level with their classes, locations, and components. Understand the level composition and actor relationships for the current working context.

get_class_hierarchy

Retrieve inheritance hierarchies for C++ classes and blueprints. Understand parent-child relationships and see what classes inherit from a given base. Essential for API exploration.

get_material_details

Inspect material instances and master materials. Retrieve parameters, scalar values, vector values, and texture assignments. Understand material shader graphs and parameter relationships.

get_selected_actors

Get information about currently selected actors in the editor. Retrieve properties, components, and transforms. Provides context about what you're actively working on in the viewport.

get_level_info

Retrieve information about the current level including name, actor count, static mesh count, and other statistics. Get a comprehensive overview of level composition and streaming volumes.

execute_console_command

Execute editor console commands safely with whitelist validation. Useful for gameplay debugging, profiling commands, and editor state manipulation. Prevents dangerous or destructive commands.

Architecture & Design

A clean, extensible architecture built for maximum flexibility

ULLMProvider (Abstract)
├─ UClaudeProvider
│  └─ Anthropic Messages API
├─ UOpenAIProvider
│  └─ Chat Completions API
├─ UGeminiProvider
│  └─ Generative AI API
└─ ULocalProvider
   └─ OpenAI-compatible

class FConversationMessage
├─ Role (User/Assistant)
├─ Content (TArray<FContentBlock>)
├─ Tools
└─ Metadata

class FLLMTool
├─ Name
├─ Description
├─ InputProperties
└─ RequiredProperties

Two-Module Architecture

Mudflood AI is structured as two complementary modules working in perfect harmony. The MudfloodAICore runtime module contains all provider implementations, conversation management, streaming logic, and tool execution systems. The MudfloodAIEditor module provides the native Slate UI, tool handler, project introspection systems, and editor integration hooks.

Provider Pattern

The abstract ULLMProvider base class defines the interface for all LLM integrations. Each provider implements streaming, tool handling, and response parsing. The plugin ships with four providers (Claude, OpenAI, Gemini, Local), but you can easily add custom providers by subclassing ULLMProvider.
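The shape of this pattern, reduced to plain standard C++ so it compiles on its own (the class and callback names here are illustrative stand-ins, not the plugin's actual Unreal types):

```cpp
#include <functional>
#include <string>
#include <vector>

// Illustrative stand-ins for the plugin's message types.
struct ChatMessage { std::string Role; std::string Content; };
using TokenCallback = std::function<void(const std::string&)>;

// Abstract base: each provider implements transport, streaming, and parsing.
class LLMProviderBase {
public:
    virtual ~LLMProviderBase() = default;
    virtual bool SupportsStreaming() const = 0;
    // Delivers the reply chunk-by-chunk through OnToken.
    virtual void SendMessage(const std::vector<ChatMessage>& History,
                             const TokenCallback& OnToken) = 0;
};

// A stub provider that "streams" a canned reply, showing the shape a real
// Claude/OpenAI/Gemini/Local subclass would take.
class EchoProvider : public LLMProviderBase {
public:
    bool SupportsStreaming() const override { return true; }
    void SendMessage(const std::vector<ChatMessage>& History,
                     const TokenCallback& OnToken) override {
        // Echo the last user message back in two chunks.
        const std::string& Last = History.back().Content;
        OnToken(Last.substr(0, Last.size() / 2));
        OnToken(Last.substr(Last.size() / 2));
    }
};
```

Because the chat panel only talks to the abstract base, swapping providers at runtime is just a matter of pointing at a different subclass instance.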

Tool System

Tools are registered in the MudfloodToolHandler as structured metadata. When the LLM requests a tool call, the handler validates parameters and executes the appropriate function. Results are serialized to JSON and returned to the conversation context for multi-turn reasoning.

Conversation Management

All messages are tracked in a conversation history with automatic context management. The system includes configurable message limits, system prompt generation, and conversation state persistence. Tools are automatically included in the system context so the AI knows what it can do.
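A minimal sketch of the message-limit behavior, assuming a simple drop-oldest policy (the plugin's actual trimming strategy may differ):

```cpp
#include <deque>
#include <string>
#include <utility>

struct Message { std::string Role; std::string Text; };

// Keeps at most MaxMessages entries, dropping the oldest first,
// mirroring what a MaxContextMessages-style setting would do.
class ConversationHistory {
public:
    explicit ConversationHistory(size_t MaxMessages) : Max(MaxMessages) {}
    void Add(Message M) {
        History.push_back(std::move(M));
        while (History.size() > Max) History.pop_front();
    }
    size_t Size() const { return History.size(); }
    const Message& Oldest() const { return History.front(); }
private:
    size_t Max;
    std::deque<Message> History;
};
```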

Mudflood AI Plugin Structure (33 Files)
Source/
  MudfloodAICore/ (Runtime Module)
    Public/
      LLMProvider.h
      ClaudeProvider.h
      OpenAIProvider.h
      GeminiProvider.h
      LocalProvider.h
      ConversationManager.h
      StreamingManager.h
      MudfloodAICoreModule.h
    Private/
      ClaudeProvider.cpp
      OpenAIProvider.cpp
      GeminiProvider.cpp
      LocalProvider.cpp
      ConversationManager.cpp
      StreamingManager.cpp
      MudfloodAICoreModule.cpp
  MudfloodAIEditor/ (Editor Module)
    Public/
      MudfloodChatPanel.h
      MudfloodStyle.h
      MudfloodCommands.h
      ProjectContextGatherer.h
      MudfloodToolHandler.h
      MudfloodEditorSettings.h
      MudfloodAIEditorModule.h
    Private/
      MudfloodChatPanel.cpp
      MudfloodStyle.cpp
      MudfloodCommands.cpp
      ProjectContextGatherer.cpp
      MudfloodToolHandler.cpp
      MudfloodEditorSettings.cpp
      MudfloodAIEditorModule.cpp
Mudflood.uplugin

Documentation

Quick Start

Step 1: Install the Plugin
Clone the Mudflood AI repository into your project's Plugins folder. Regenerate Visual Studio project files and recompile. The plugin will appear in the Installed Plugins list.

Step 2: Configure Your Provider
Go to Edit → Project Settings → Mudflood AI. Select your preferred LLM provider (Claude recommended) and enter your API key. You can change providers anytime without restarting.

[/Script/MudfloodAIEditor.MudfloodEditorSettings]
SelectedProvider=Claude
ClaudeAPIKey=sk-ant-...
OpenAIAPIKey=
GeminiAPIKey=
LocalBaseURL=http://localhost:11434

Step 3: Open the Chat Panel
Press Ctrl+Shift+M or go to Tools → Mudflood AI → Open Chat. The panel will dock in your editor interface. Pin it to keep it visible while you work.

Step 4: Start Asking Questions
Type questions about your project naturally. The AI will automatically use its tools to inspect your code, analyze blueprints, search assets, and provide detailed answers. Examples:

  • "What blueprints inherit from Character?"
  • "Show me the implementation of the Jump function"
  • "List all materials using the M_Metallic master"
  • "Find todos in my C++ code"

Step 5: Extend with Custom Tools (Optional)
Add custom tools by subclassing FMudfloodToolHandler and registering them in the constructor. The AI will automatically include your tools in its context and use them when appropriate.

Configuration Settings

All settings are configured in Project Settings → Mudflood AI. Changes take effect immediately without restarting.

Provider Selection

Select which LLM provider to use: Claude (recommended), OpenAI, Gemini, or Local.

Setting          | Type         | Default                | Description
SelectedProvider | ELLMProvider | Claude                 | Active LLM provider for conversations
ClaudeAPIKey     | FString      | Empty                  | API key from console.anthropic.com
OpenAIAPIKey     | FString      | Empty                  | API key from platform.openai.com
GeminiAPIKey     | FString      | Empty                  | API key from aistudio.google.com
LocalBaseURL     | FString      | http://localhost:11434 | Base URL for local/custom OpenAI-compatible endpoints

Model Configuration

Setting     | Type    | Default                       | Description
ModelName   | FString | Empty (uses provider default) | Specific model to use (e.g., gpt-4o for OpenAI)
MaxTokens   | int32   | 8192                          | Maximum tokens per response (256-128000)
Temperature | float   | 0.7                           | Response randomness (0.0-2.0, higher = more creative)

Behavior Settings

Setting                    | Type  | Default | Description
bEnableStreaming           | bool  | true    | Show responses in real-time as they stream
bEnableToolUse             | bool  | true    | Allow the AI to use introspection tools
bAutoIncludeProjectContext | bool  | true    | Automatically include project info in system prompt
MaxContextMessages         | int32 | 50      | Maximum conversation history to maintain (5-200)

LLM Providers

Claude (Anthropic) - RECOMMENDED

Claude provides the best reasoning capability, native tool use, and most reliable code analysis. Go to console.anthropic.com, create an API key, and enter it in the settings.

SelectedProvider=Claude
ClaudeAPIKey=sk-ant-xxxxxxxxxxxx
// Uses claude-sonnet-4-5-20250514 by default
// 200K context window, native tool use, extended thinking

OpenAI GPT-4

Get your API key from platform.openai.com. Supports GPT-4o with multimodal capabilities and function calling.

SelectedProvider=OpenAI
OpenAIAPIKey=sk-proj-xxxxxxxxxxxx
ModelName=gpt-4o
// Or use gpt-4-turbo for lower cost

Google Gemini

Get a free API key from aistudio.google.com. Fast inference with strong code understanding.

SelectedProvider=Gemini
GeminiAPIKey=AIzaSyxxxxxxxxxxxxxx
// Uses latest Gemini model by default
// Excellent for rapid iteration and testing

Local Models

Run any OpenAI-compatible endpoint locally. Use Ollama or LM Studio for complete privacy and offline capability.

SelectedProvider=Local
LocalBaseURL=http://localhost:11434
// For Ollama: run `ollama serve` then pull a model
// Ollama models: ollama pull mistral, ollama pull llama2, etc.
// Zero API costs, complete data privacy

Custom OpenAI-Compatible Endpoints

Use any OpenAI-compatible endpoint. Many services offer this (Groq, Together, Fireworks, etc.).

SelectedProvider=Local
LocalBaseURL=https://api.fireworks.ai/inference/v1
// Configure with your provider's API key in the endpoint URL
// Works with any OpenAI-compatible API

Introspection System

The ProjectContextGatherer scans your entire Unreal project and provides detailed information to the LLM. This enables the AI to understand your project structure, architecture, and code.

Asset Registry Integration

The introspection system uses Unreal's Asset Registry to enumerate all assets in your project. This includes blueprints, materials, meshes, textures, sounds, and all other project assets. The system extracts metadata like asset type, class, and path.

Blueprint Reflection

For blueprint assets, the system reflects on the blueprint graph structure, variables (with types and defaults), functions, event implementations, and component hierarchies. This allows the AI to understand your blueprint logic and architecture.

class UBlueprint
├─ UEdGraph for EventGraph
├─ UEdGraph for ConstructionScript
├─ FBlueprintVariableDescription []
├─ FProperty [] for Components
└─ FImplementedInterface []

World Actor Enumeration

When a level is open, the system enumerates all actors in the world using TActorIterator. For each actor, it retrieves the class, location, rotation, components, and other relevant properties. This provides context about what's in the current level.

Source File Access

The system has safe, validated access to all C++ source files in your project's Source folder. Files are read with configurable line limits to prevent excessive context sizes. All file paths are validated to ensure they're within the project folder.
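The path check can be sketched with `std::filesystem`: normalize both paths and accept the candidate only if the project root is a prefix of the resolved result. This is an illustrative sketch of the safety boundary, not the plugin's actual validation code (which uses Unreal's path utilities).

```cpp
#include <algorithm>
#include <filesystem>
#include <string>

namespace fs = std::filesystem;

// Returns true only if Candidate resolves to a location inside ProjectRoot,
// rejecting "../" escapes after normalization.
bool IsInsideProject(const fs::path& ProjectRoot, const std::string& Candidate) {
    fs::path Root = ProjectRoot.lexically_normal();
    fs::path Full = (Root / Candidate).lexically_normal();
    // Root must be an element-wise prefix of the resolved path.
    auto [RootEnd, FullIt] =
        std::mismatch(Root.begin(), Root.end(), Full.begin(), Full.end());
    return RootEnd == Root.end();
}
```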

System Prompt Generation

All gathered context is intelligently formatted into a system prompt that tells the LLM about your project. This includes project metadata, available assets, blueprints, classes, and tools. The system prompt is automatically updated as you work.
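In outline, the prompt assembly is string formatting over the gathered context. A minimal sketch (function and field names are illustrative, not the plugin's API):

```cpp
#include <string>
#include <vector>

// Concatenates project metadata and the tool list into one system prompt,
// in the spirit of the plugin's automatic prompt generation.
std::string BuildSystemPrompt(const std::string& ProjectName,
                              const std::vector<std::string>& ToolNames) {
    std::string Prompt =
        "You are assisting with the Unreal project \"" + ProjectName + "\".\n";
    Prompt += "Available tools:\n";
    for (const auto& T : ToolNames)
        Prompt += "- " + T + "\n";
    return Prompt;
}
```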

Tool System

The tool system allows the LLM to call specialized functions to introspect your project. Tools are defined as structured metadata and executed by the tool handler.

Tool Structure

struct FLLMTool
{
    FString Name;
    FString Description;
    TArray<FToolProperty> InputProperties;
    TArray<FString> RequiredProperties;
};

struct FToolProperty
{
    FString Name;
    FString Type;
    FString Description;
    TArray<FString> EnumValues;
};

Tool Registration

Tools are registered in the MudfloodToolHandler constructor. Each tool maps a name to an execution function. When the LLM requests a tool, the handler validates the request, executes the function, and returns the result as JSON.

void FMudfloodToolHandler::RegisterTools()
{
    RegisterTool(
        "search_assets",
        "Search the Asset Registry by name",
        [this](const FString& Query) { return HandleSearchAssets(Query); }
    );
}

Tool Execution Flow

When the LLM decides to use a tool, this sequence occurs:

  1. LLM returns tool_use content block with name and parameters
  2. Handler validates tool exists and parameters are valid
  3. Handler executes the tool function with provided parameters
  4. Result is serialized to JSON
  5. Tool result is added to conversation history
  6. LLM sees result and continues reasoning to answer your question
  7. Multi-tool chaining: LLM can request multiple tools in sequence
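Steps 2-4 of this sequence reduce to a name-to-handler lookup. A plain C++ sketch of that core (the registry class is hypothetical; the plugin's FMudfloodToolHandler carries richer metadata and parameter validation):

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>

// Maps tool names to handlers that take a JSON argument string and
// return a JSON result string.
class ToolRegistry {
public:
    using Handler = std::function<std::string(const std::string&)>;

    void Register(const std::string& Name, Handler H) {
        Tools[Name] = std::move(H);
    }

    // Validate the tool exists, execute it, and return its serialized result.
    std::string Execute(const std::string& Name, const std::string& ArgsJson) const {
        auto It = Tools.find(Name);
        if (It == Tools.end()) return "{\"error\":\"unknown tool\"}";
        return It->second(ArgsJson);
    }

private:
    std::map<std::string, Handler> Tools;
};
```

The returned JSON string is what gets appended to the conversation history in step 5, so the LLM sees it as a tool-result message on the next turn.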

Adding Custom Tools

To add a custom tool, subclass FMudfloodToolHandler and register your tool:

void FMyCustomToolHandler::RegisterTools()
{
    Super::RegisterTools();
    RegisterTool(
        "get_actor_count",
        "Get total actor count in level",
        [this]()
        {
            int32 Count = 0;
            for (TActorIterator<AActor> It(World); It; ++It)
            {
                Count++;
            }
            return FString::FromInt(Count);
        }
    );
}

Editor UI Architecture

Chat Panel Layout

The SMudfloodChatPanel is a native Slate widget with this hierarchy:

SVerticalBox
├─ SHorizontalBox (Header)
│  ├─ Title
│  ├─ Provider selector
│  └─ Settings button
├─ SScrollBox (Messages)
│  └─ SMudfloodChatMessage []
├─ SHorizontalBox (Input)
│  ├─ SMultiLineEditableTextBox
│  └─ Send button
└─ STextBlock (Status)

Message Display

SMudfloodChatMessage handles displaying both user and assistant messages. It automatically detects code blocks and renders them with syntax highlighting. Streaming messages update in real-time as tokens arrive.

Streaming Text Display

Text is displayed progressively as it streams from the LLM. The widget uses an FString that grows with each update and automatically reformats for word wrapping. Streaming status is shown to the user.
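The grow-and-redraw loop can be sketched in standard C++ (the class name is illustrative; the actual widget works with an FString and Slate invalidation):

```cpp
#include <functional>
#include <string>
#include <utility>

// Accumulates streamed chunks into one growing buffer and notifies a
// UI-update callback on every delta, as the chat widget does.
class StreamingText {
public:
    explicit StreamingText(std::function<void(const std::string&)> OnUpdate)
        : Notify(std::move(OnUpdate)) {}

    void Append(const std::string& Chunk) {
        Buffer += Chunk;
        Notify(Buffer);  // progressive redraw with the text so far
    }

    const std::string& Text() const { return Buffer; }

private:
    std::string Buffer;
    std::function<void(const std::string&)> Notify;
};
```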

Tool Use Visualization

When the AI uses a tool, the panel shows which tool is being called and with what parameters. Tool results are displayed in a collapsible format so you can understand what data the AI is working with.

Styling System

FMudfloodAIStyle defines all colors, fonts, and visual properties. Extend or override the style by subclassing FMudfloodAIStyle and registering your custom style.

const FLinearColor AccentBlue   = FLinearColor(0.302f, 0.545f, 1.0f);
const FLinearColor AccentPurple = FLinearColor(0.486f, 0.361f, 0.988f);
const FLinearColor AccentCyan   = FLinearColor(0.0f, 0.851f, 1.0f);

Keyboard Shortcuts

FMudfloodAICommands defines editor shortcuts:

  • Ctrl+Shift+M - Toggle Mudflood AI panel
  • Ctrl+Enter - Send message from input box
  • Shift+Enter - New line in input box

Extending Mudflood AI

Creating a New LLM Provider

Subclass ULLMProvider and implement the required virtual functions:

class UMyCustomProvider : public ULLMProvider
{
    virtual void SendMessage(...) override;
    virtual void ProcessResponse(...) override;
    virtual bool SupportsStreaming() override;
};

Creating Custom Tools

Register tools by subclassing the tool handler or by adding to the global tool registry:

void RegisterMyTool()
{
    FMudfloodToolHandler::GetInstance()->RegisterTool(
        "my_tool",
        "My custom tool description",
        []() { return "Tool result"; }
    );
}

Customizing the UI

Create a custom Slate widget that inherits from SMudfloodChatPanel. Override the layout, styling, or message rendering to match your preferences.

Hooking into Delegates

The system provides delegates for key events:

// Called when a message is sent
FMudfloodAI::OnMessageSent.AddDynamic(...);

// Called when a tool is executed
FMudfloodToolHandler::OnToolExecuted.AddDynamic(...);

// Called when response streaming completes
FStreamingManager::OnStreamingComplete.AddDynamic(...);

Extending Project Context

Override ProjectContextGatherer to add custom project information to the system prompt. This allows the AI to know about custom project structures or metadata.