
This package contains art, text, or software code produced using generative AI.

RPGX AI Assistant for Foundry VTT

An Add-on Module for Foundry Virtual Tabletop

Author: RPGX Studios
Foundry Compatibility: Version 13+
Last Updated: 4/19/2026 - v2.0.0 Core Release

The RPGX AI Assistant brings a fully local, customizable AI into Foundry Virtual Tabletop. It connects directly to your locally hosted Ollama instance and any compatible large language model (LLM) — allowing Game Masters and players to integrate intelligent conversation, rule/lore checks, or creative narration directly into their tabletop sessions.

This module is designed to work independently (requiring only Foundry VTT and Ollama), or it can be paired with the RPGX Proton application, which manages a RAG server so the assistant can retain world-specific rules, lore, and other contextual data.
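
Under the hood, talking to Ollama amounts to POSTing JSON to its local REST API (port 11434 by default). The sketch below shows what such a request looks like; it is illustrative, not this module's actual code, and the model name and prompt are placeholders:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address

def build_chat_payload(model, user_message, stream=True):
    """Construct the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,
    }

def ask_ollama(model, user_message):
    """Send one chat turn to a locally running Ollama instance
    and return the assistant's reply text (non-streaming mode)."""
    payload = build_chat_payload(model, user_message, stream=False)
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Setting `"stream": true` instead makes Ollama return the reply incrementally, one JSON object per line, which is how a chat client can display text as it is generated.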

Overview

The RPGX AI Assistant provides seamless AI integration for Foundry, offering a responsive in-game assistant that can help with story generation, NPC dialogue, rules clarifications, and scene narration, all through a private, locally hosted model.

With your preferred LLM running via Ollama (such as Qwen 2.5, Llama 3, or Mistral), the module brings these capabilities directly into your sessions.

Core Features

- Local AI Integration via Ollama
- Foundry Chat Integration
- Customizable Model Settings
- Optional World Knowledge Support

Example Prompts

Troubleshooting & FAQ

Q: The chat command doesn’t respond.
A: Make sure Ollama is running and reachable at the configured address.
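
A quick way to confirm reachability is to query Ollama's /api/tags endpoint (a GET to http://localhost:11434/api/tags), which lists the models you have installed. A small helper, separate from the module itself, that parses that response:

```python
import json

def list_model_names(tags_response_text):
    """Parse the JSON returned by Ollama's /api/tags endpoint
    and return the names of the installed models."""
    data = json.loads(tags_response_text)
    return [m["name"] for m in data.get("models", [])]

# Abbreviated example of the /api/tags response shape:
sample = '{"models": [{"name": "llama3:latest"}, {"name": "qwen2.5:7b"}]}'
print(list_model_names(sample))  # ['llama3:latest', 'qwen2.5:7b']
```

If that endpoint does not answer, the problem is with Ollama (not running, or bound to a different address/port), not with the module.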

Q: The model runs too slowly or times out.
A: Requests should not time out, because responses are streamed as they are generated. If the model is "thinking" too long, close unneeded background applications on the machine running Ollama, or try a smaller, simpler model.
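
For context, Ollama's streaming mode sends one JSON object per line, each carrying a small chunk of the reply, so a client can display text as it arrives rather than waiting on a single long request. A minimal sketch of assembling such a stream (illustrative, not the module's own code):

```python
import json

def collect_stream(ndjson_lines):
    """Concatenate the incremental message content from Ollama's
    line-delimited streaming /api/chat output until "done": true."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)
```

Because each chunk arrives almost immediately, the connection stays active for the whole generation, which is why a slow model manifests as slow text rather than a timeout.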

Q: Do I need the Librarian module?
A: No. The Librarian module is obsolete and is no longer compatible with the latest version of the RPGX AI Assistant.

Q: Can I connect to an online API model?
A: The Assistant is optimized for Ollama, but it should also work with any locally hosted LLM that exposes a compatible REST endpoint.

Q: My current setup works the way I want. Should I update? 
A: v2.0.0 is a ground-up rebuild, and its data structures are not compatible with previous versions. If you have a custom RAG setup that works, back up your module setup into a separate folder before updating so you can roll back if needed.
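
A backup can be as simple as copying the module's folder aside before updating. A small sketch of that step (the modules path and the module id `rpgx-ai-assistant` are assumptions; substitute your own Foundry user-data location and folder name):

```python
import shutil
from pathlib import Path

def backup_module(modules_dir, module_id, suffix="-backup"):
    """Copy a module folder to a sibling backup folder so the
    old version can be restored if the update misbehaves."""
    src = Path(modules_dir) / module_id
    dst = Path(modules_dir) / (module_id + suffix)
    shutil.copytree(src, dst)
    return dst
```

To roll back, disable the updated module, delete its folder, and rename the backup folder to the original module id.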

 

Supported Game Systems

License

Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) License

Changelog (v2.0.0 Core Release)

Available Versions

  1. Version v2.00 (1 week ago)
     Foundry Version 10.00+ (Verified 13.351)
  2. Version v2.0.2 (2 days, 21 hours ago)
     Foundry Version 11+ (Verified 13.531)
  3. Version v1.50 (5 months ago)
     Foundry Version 10.00 - 13.351 (Verified 13.350)