
ArtRevolt XR · In Development

AGI Art Gallery

Virtual · Mixed Reality · Augmented Reality

AGI — Art Gallery Interactive

An XR experience placing the visitor inside a virtual art gallery built around Vasil's original works. The project spans three development chapters — from fully immersive 360° rendered walkthroughs, through a Mixed Reality interactive game-mode for headset users, to an Augmented Reality mobile demo that brings individual artworks into the real world. Developed independently under ArtRevolt.

Unreal Engine 5 Unity (AR Foundation) 360° Rendering Mixed Reality Augmented Reality After Effects Meta Quest iOS / Android AR C++
ROLE
Solo Developer
XR Artist
STATUS
In Development
STUDIO
ArtRevolt B.V.
Chapter 01 of 03
01 360° Render Complete

360° Rendered Animations

// EQUIRECTANGULAR · UNREAL ENGINE · IMMERSIVE WALKTHROUGH

The first entry point into the AGI experience. High-resolution equirectangular 360° animations rendered in Unreal Engine 5 capture a complete VR world based on a canvas that embodies Pythagorean Cosmic Morphology and the five Platonic solids.
Guided by an AI narrator, participants embark on narrative journeys through these 360° environments, which are visually and thematically expanded from the original paintings.

Unreal Engine 5 360° Monoscopic LLM curator NPCs Lumen GI Nanite Geometry Cinematic Camera
Render Type Monoscopic 360° — easily accessible on any mobile device
Resolution 8096 × 4048 render sequence, compressed to 4K at 30 fps
Lighting Unreal Engine 5 Lumen Global Illumination + Sky Atmosphere
Output AI-agent narrator integrated as an NPC
Post DaVinci Resolve — grading, audio, transitions
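
To show how such a capture can be driven from C++, here is a minimal sketch of a hypothetical capture rig (APanoCaptureRig, FaceSize and the 2048-pixel face resolution are invented for illustration) built only from stock UE5 classes. It renders the scene into a cubemap each frame; a separate export step would unwrap that cubemap into the equirectangular 2:1 frames described above. Treat it as a schematic, not the project's actual render pipeline.

```cpp
// PanoCaptureRig.h - hypothetical illustration, not the project's actual pipeline.
// Captures the scene into a cubemap render target each tick; the cubemap can be
// unwrapped to an equirectangular (2:1) frame for 360-degree video in post.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponentCube.h"
#include "Engine/TextureRenderTargetCube.h"
#include "PanoCaptureRig.generated.h"

UCLASS()
class APanoCaptureRig : public AActor
{
	GENERATED_BODY()

public:
	APanoCaptureRig()
	{
		PrimaryActorTick.bCanEverTick = true;

		// The cube capture component renders all six faces from the rig's location.
		CubeCapture = CreateDefaultSubobject<USceneCaptureComponentCube>(TEXT("CubeCapture"));
		RootComponent = CubeCapture;

		// Capture on demand so a sequencer or the tick below can drive the cadence.
		CubeCapture->bCaptureEveryFrame = false;
		CubeCapture->bCaptureOnMovement = false;
	}

	virtual void BeginPlay() override
	{
		Super::BeginPlay();

		// High-resolution cubemap target; each face is FaceSize x FaceSize pixels.
		RenderTarget = NewObject<UTextureRenderTargetCube>(this);
		RenderTarget->Init(FaceSize, PF_FloatRGBA);
		CubeCapture->TextureTarget = RenderTarget;
	}

	virtual void Tick(float DeltaSeconds) override
	{
		Super::Tick(DeltaSeconds);

		// One cubemap capture per tick; an export step would read this back and
		// unwrap it to an equirectangular frame for the final video sequence.
		CubeCapture->CaptureScene();
	}

private:
	UPROPERTY()
	USceneCaptureComponentCube* CubeCapture = nullptr;

	UPROPERTY()
	UTextureRenderTargetCube* RenderTarget = nullptr;

	// Face resolution chosen purely for illustration.
	int32 FaceSize = 2048;
};
```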
Gallery Walkthrough
Full 360° Shot — Gallery Walkthrough
Opening Shot
Intro shot from the 360° World of Color Canvas
Full 360° Animation — Trailer
UV Walkthrough
Full 360° Shot — Gallery Walkthrough in Fluorescent Post-Processing Effect
// Dev Note 360° renders are the cinematic backbone of the project — they let anyone experience the gallery without a headset.
Chapter 02 of 03
02 Mixed Reality In Development

MR Interactive Game-Mode

// PASSTHROUGH · META QUEST · REAL-WORLD ANCHORED · INTERACTIVE

The second chapter layers the virtual gallery over the user's real physical space using Meta Quest passthrough. Visitors wearing the headset see their actual room augmented with world portals, each linked to a corresponding painting, and with interactive elements that alter the environment's biodiversity. A game-mode layer introduces interaction mechanics — talking to an AI-agent NPC to acquire information about the artworks, reading information panels, and navigating between Canvas worlds — making the experience exploratory rather than passive.

Unreal Engine 5 Meta XR SDK Mixed Reality Passthrough Scene Understanding Spatial Anchors Hand Tracking Meta Quest 2 / 3 Blueprint / C++
Platform PC-tethered, running on Meta Quest 2 / 3S
SDK Meta XR SDK — Mixed Reality Utility Kit (MRUK)
Room Mapping Scene Understanding API — walls, floor, ceiling, furniture
Interaction Hand tracking + controller fallback — grab, inspect, navigate
Rendering UE5 Deferred Renderer — higher-quality rendering settings
Target FPS 60 fps on PC tether
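
To give a feel for the navigation side of the game-mode layer, the sketch below implements a bare-bones teleport check using only core engine calls (no Meta XR specifics). TryTeleport, the 8-metre range and the capsule offset are illustrative assumptions; in the MR build the hit would additionally be validated against the floor mapped by Scene Understanding.

```cpp
// TeleportHelper.cpp - hypothetical sketch of the teleport mechanic using only
// core UE5 calls; names and tuning values are assumptions for illustration.
#include "GameFramework/Pawn.h"
#include "Engine/World.h"
#include "Engine/HitResult.h"
#include "CollisionQueryParams.h"

// Traces along the controller's aim direction and, if the hit surface is
// roughly horizontal (i.e. a floor), teleports the pawn onto it.
bool TryTeleport(APawn* Pawn, const FVector& AimOrigin, const FVector& AimDirection)
{
	if (!Pawn || !Pawn->GetWorld())
	{
		return false;
	}

	const float MaxTeleportDistance = 800.f; // roughly 8 m, illustration value
	const FVector TraceEnd = AimOrigin + AimDirection * MaxTeleportDistance;

	// Ignore the pawn itself so the trace cannot hit its own collision.
	FCollisionQueryParams Params(FName(TEXT("TeleportTrace")), /*bTraceComplex=*/false, Pawn);

	FHitResult Hit;
	const bool bHit = Pawn->GetWorld()->LineTraceSingleByChannel(
		Hit, AimOrigin, TraceEnd, ECC_Visibility, Params);
	if (!bHit)
	{
		return false;
	}

	// Accept only near-horizontal surfaces so users cannot teleport onto walls.
	if (Hit.ImpactNormal.Z < 0.8f)
	{
		return false;
	}

	// Keep the pawn's capsule above the floor; the half-height offset is assumed.
	const float PawnHalfHeight = 90.f;
	const FVector Destination = Hit.ImpactPoint + FVector(0.f, 0.f, PawnHalfHeight);

	return Pawn->TeleportTo(Destination, Pawn->GetActorRotation());
}
```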
Worlds of Color
Worlds of Color - Overview
UV Effect
Fluorescent-UV post-process style
Interaction
World diversity - Interaction Walkthrough
Teleportation
Teleportation mobility
Loop of Life
Loop of Life - Work in progress
Beyond the Consciousness
Beyond the Consciousness - Coming Soon

For a solo developer and aspiring artist, the challenge is balancing the technical implementation of MR features with the artistic vision of the gallery. The focus is on creating a compelling, immersive experience that captures the essence of the original artworks while leveraging the unique capabilities of Mixed Reality.

// Dev Note This prototype is a playground for experimenting with Mixed Reality interactions and visual styles. The goal is to find a sweet spot where the MR features enhance the artistic experience without overwhelming it, while also ensuring that the technical implementation is feasible for a solo developer. The future plan includes refining the interactions, optimizing performance, and expanding the content to cover more artworks and themes from the gallery.
In addition, a BCI (brain-computer interface) layer is being explored as a long-term vision; it would allow users to interact with the gallery using neural inputs, creating a deeper connection between the art and the audience.
Chapter 03 of 03
03 AR Mobile Test Demo

AR Test Demo — Mobile

// ARKIT / ARCORE · iOS / ANDROID · SURFACE DETECTION · PORTABLE

The third chapter is an accessible AR demo deployable on any modern iOS or Android smartphone — no headset required. Using Unreal's ARKit/ARCore integration, individual artworks from the gallery are placed onto real-world surfaces detected by the phone camera. Users can walk around the virtual artwork, scale it, and read contextual information overlaid alongside it — a portable taste of the full gallery experience.

Unreal (ARCore) ARKit (iOS) ARCore (Android) Plane Detection World Anchors Touch Interaction Blueprints iOS / Android Build
Platform iOS 14+ (ARKit 4) & Android 8.0+ (ARCore 1.x)
Framework Unreal Engine — unified ARKit / ARCore backend
Detection Horizontal & vertical plane detection — floor, table, wall
Interaction Tap to place · Pinch to scale · Rotate with two fingers
Content Individual artworks as optimised real-time 3D objects
Distribution TestFlight (iOS) / APK sideload (Android) — test demo
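
As a sketch of the tap-to-place flow, the function below uses Unreal's ARKit/ARCore abstraction (UARBlueprintLibrary from the AugmentedReality module) to trace a screen touch against detected planes and spawn an artwork actor at the hit. PlaceArtworkAtTouch and ArtworkClass are illustrative names; pinch-to-scale and two-finger rotation are omitted for brevity.

```cpp
// ArtworkPlacer.cpp - hypothetical tap-to-place sketch against Unreal's unified
// ARKit / ARCore backend; function and class names are placeholders.
#include "ARBlueprintLibrary.h"
#include "ARTraceResult.h"
#include "Engine/World.h"
#include "GameFramework/Actor.h"
#include "Templates/SubclassOf.h"

// Called with the screen-space position of a touch; traces against detected
// planes and spawns the chosen artwork actor at the first hit.
AActor* PlaceArtworkAtTouch(UWorld* World, TSubclassOf<AActor> ArtworkClass, const FVector2D& ScreenTouch)
{
	if (!World || !ArtworkClass)
	{
		return nullptr;
	}

	// Trace only against plane extents and boundary polygons so artworks land on
	// detected floors, tables and walls rather than on loose feature points.
	const TArray<FARTraceResult> Hits = UARBlueprintLibrary::LineTraceTrackedObjects(
		ScreenTouch,
		/*bTestFeaturePoints=*/false,
		/*bTestGroundPlane=*/false,
		/*bTestPlaneExtents=*/true,
		/*bTestPlaneBoundaryPolygon=*/true);

	if (Hits.Num() == 0)
	{
		return nullptr;
	}

	// Use the first reported hit's tracked-to-world transform as the spawn pose.
	const FTransform SpawnTransform = Hits[0].GetLocalToWorldTransform();

	FActorSpawnParameters SpawnParams;
	SpawnParams.SpawnCollisionHandlingOverride = ESpawnActorCollisionHandlingMethod::AlwaysSpawn;

	return World->SpawnActor<AActor>(ArtworkClass, SpawnTransform, SpawnParams);
}
```

Restricting the trace to plane geometry keeps placements on surfaces ARKit / ARCore has actually classified, which is generally more stable than anchoring to individual feature points.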
📱
AR Plane Detection — Surface Placement
Screen recording coming soon
🖼️
AR Artwork in Real World — Walk-around
Video coming soon
▶️
AR Demo — Full Mobile Test Recording
Full mobile test recording coming soon

// Dev Note AR demo recordings are captured with the phone's built-in screen recorder while the AR session is running, which preserves both the camera feed and the overlaid 3D content. A TestFlight (iOS) or APK (Android) download link will be added here once the test build is ready, so the demo can be tried directly.