// HERO RENDER PLACEHOLDER
🥽
Replace with 360° render or key visual

ArtRevolt XR · In Development

AGI Art Gallery

Virtual · Mixed Reality · Augmented Reality

AGI — Art Gallery Interactive

An XR experience placing the visitor inside a virtual art gallery built around Vasil's original works. The project spans three development chapters — from fully immersive 360° rendered walkthroughs, through a Mixed Reality interactive game-mode for headset users, to an Augmented Reality mobile demo that brings individual artworks into the real world. Developed independently under ArtRevolt.

Unreal Engine 5 · Unity (AR Foundation) · 360° Rendering · Mixed Reality · Augmented Reality · After Effects · Meta Quest · iOS / Android AR · C#
ROLE
Solo Developer
XR Artist
STATUS
In Development
STUDIO
ArtRevolt B.V.
Chapter 01 of 03
01 · 360° Render · Complete

360° Rendered Animations

// EQUIRECTANGULAR · UNREAL ENGINE · IMMERSIVE WALKTHROUGH

The first entry point into the AGI experience. High-resolution equirectangular 360° animations rendered in Unreal Engine 5 capture a full walkthrough of the virtual gallery space — from the entrance foyer through the main halls and into individual artwork rooms. These renders serve both as a standalone immersive presentation and as the visual development reference for the full VR build.

Unreal Engine 5 · 360° Equirectangular · After Effects · Lumen GI · Nanite Geometry · Cinematic Camera
Render Type: Equirectangular 360° — stereo pair (left/right eye)
Resolution: 4096 × 2048 per eye — 60 fps target
Lighting: Unreal Engine 5 Lumen Global Illumination + Sky Atmosphere
Output: MP4 / YouTube 360 / Meta Horizon compatible
Post: Adobe After Effects — grading, titles, transitions
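The walkthroughs themselves are rendered in Unreal Engine 5, but the stereo-equirect capture idea is easy to show in C#, the language used elsewhere in this project. Below is a minimal, hedged Unity sketch built on Camera.RenderToCubemap and RenderTexture.ConvertToEquirect; the class name, field names, and the 4096 × 4096 over-under layout are illustrative assumptions, not the project's actual UE5 pipeline.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Illustrative sketch only: the AGI walkthroughs are rendered in UE5.
// This shows the equivalent stereo 360 capture concept in Unity C#.
public class Stereo360Capture : MonoBehaviour
{
    public Camera captureCamera;     // any scene camera on the walkthrough path
    public int cubemapSize = 2048;   // per-face cubemap resolution (assumption)
    RenderTexture cubeLeft, cubeRight, equirect;

    void Start()
    {
        cubeLeft  = NewCubemap();
        cubeRight = NewCubemap();
        // 4096 x 4096 over-under layout: each eye occupies 4096 x 2048,
        // matching the "per eye" resolution listed above.
        equirect = new RenderTexture(4096, 4096, 0);
        captureCamera.stereoSeparation = 0.064f; // eye separation in metres
    }

    RenderTexture NewCubemap()
    {
        var rt = new RenderTexture(cubemapSize, cubemapSize, 24);
        rt.dimension = TextureDimension.Cube;
        return rt;
    }

    void LateUpdate()
    {
        // 63 = bitmask selecting all six cubemap faces.
        captureCamera.RenderToCubemap(cubeLeft, 63, Camera.MonoOrStereoscopicEye.Left);
        captureCamera.RenderToCubemap(cubeRight, 63, Camera.MonoOrStereoscopicEye.Right);

        // Stitch each eye's cubemap into its half of the equirect target.
        cubeLeft.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Left);
        cubeRight.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Right);
        // `equirect` now holds one stereo 360 frame, ready for an encoder.
    }
}
```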
🎥
360° Walkthrough — Gallery Entrance
Add video or embed src here
🖼️
360° Still — Main Hall
Add image src here
▶️
Full 360° Animation — Gallery Walkthrough
Embed YouTube 360 or local video file
// Dev Note: 360° renders are the cinematic backbone of the project — they let anyone experience the gallery without a headset. Add your rendered video files above. For YouTube 360, use an iframe embed. For local files, use a <video> tag with the controls attribute.
Chapter 02 of 03
02 · Mixed Reality · In Development

MR Interactive Game-Mode

// PASSTHROUGH · META QUEST · REAL-WORLD ANCHORED · INTERACTIVE

The second chapter layers the virtual gallery over the user's real physical space using Meta Quest passthrough. Visitors wearing the headset see their actual room augmented with gallery walls, paintings, and interactive elements anchored to real-world surfaces. A game-mode layer introduces interaction mechanics — picking up artworks, reading information panels, and navigating between gallery rooms — making the experience exploratory rather than passive.

Unreal Engine 5 · Meta XR SDK · Mixed Reality Passthrough · Scene Understanding · Spatial Anchors · Hand Tracking · Meta Quest 2 / 3 · Blueprint / C++
Platform: Meta Quest 2 / Quest 3 (standalone, no PC tether)
SDK: Meta XR SDK — Mixed Reality Utility Kit (MRUK)
Room Mapping: Scene Understanding API — walls, floor, ceiling, furniture
Interaction: Hand tracking + controller fallback — grab, inspect, navigate
Rendering: UE5 Forward Renderer — optimised for standalone GPU budget
Target FPS: 90 fps (Quest 3) / 72 fps (Quest 2)
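This chapter is built in UE5 with Blueprints/C++, so purely as a hedged illustration, here is how the two core ideas (pinning content to the real room, and pinch-based hand interaction) look in Unity C# with the Meta XR SDK's OVRSpatialAnchor and OVRHand components. PlaceOnWall, paintingPrefab, and leftHand are hypothetical names for this sketch, not project code.

```csharp
using System.Collections;
using UnityEngine;

// Hedged sketch: the MR chapter is Blueprint/C++ in UE5; this Unity C#
// version only illustrates the anchoring + hand-interaction concepts.
public class AnchoredPainting : MonoBehaviour
{
    public GameObject paintingPrefab;  // hypothetical artwork prefab
    public OVRHand leftHand;           // Meta XR hand-tracking component
    GameObject painting;

    // Called with a surface point/rotation, e.g. from a Scene Understanding hit.
    public IEnumerator PlaceOnWall(Vector3 wallPoint, Quaternion wallRotation)
    {
        painting = Instantiate(paintingPrefab, wallPoint, wallRotation);

        // Pin the painting to the physical room so it stays registered to
        // the real wall even as head tracking refines the room map.
        var anchor = painting.AddComponent<OVRSpatialAnchor>();
        while (!anchor.Created) yield return null;
    }

    void Update()
    {
        // Index-finger pinch stands in for the grab/inspect interaction.
        if (leftHand != null && leftHand.IsTracked &&
            leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            // Hypothetical hook: begin moving/inspecting the grabbed painting.
        }
    }
}
```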
🥽
MR Passthrough — Room Anchoring
Add gameplay capture here
🖐️
Hand Tracking Interaction Demo
Add video capture here
▶️
MR Game-Mode — Full Interaction Walkthrough
Add headset capture / screen recording here
📋

Add development notes here — scene understanding setup, interaction blueprint logic, performance profiling results, or design decisions about the game-mode mechanics.

// Dev Note: For the MR chapter, capture gameplay directly from the Quest headset using the Meta Quest Developer Hub screen recording, or use the built-in cast to Chromecast for live recording. Add those captures above to show the passthrough anchoring and hand-tracking interactions in context.
Chapter 03 of 03
03 · AR Mobile · Test Demo

AR Test Demo — Mobile

// AR FOUNDATION · iOS / ANDROID · SURFACE DETECTION · PORTABLE

The third chapter is an accessible AR demo deployable on any modern iOS or Android smartphone — no headset required. Using AR Foundation (Unity) or Unreal's ARKit/ARCore integration, individual artworks from the gallery are placed onto real-world surfaces detected by the phone camera. Users can walk around the virtual artwork, scale it, and read contextual information overlaid alongside it — a portable taste of the full gallery experience.

Unity (AR Foundation) · ARKit (iOS) · ARCore (Android) · Plane Detection · World Anchors · Touch Interaction · C# · iOS / Android Build
Platform: iOS 14+ (ARKit 4) & Android 8.0+ (ARCore 1.x)
Framework: Unity AR Foundation — unified ARKit / ARCore backend
Detection: Horizontal & vertical plane detection — floor, table, wall
Interaction: Tap to place · Pinch to scale · Rotate with two fingers
Content: Individual artworks as optimised real-time 3D objects
Distribution: TestFlight (iOS) / APK sideload (Android) — test demo
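Because this chapter is the one authored in Unity C#, a minimal sketch of the core interaction loop is worth showing: raycast from a tap against detected planes to place the artwork, then scale it with a two-finger pinch. The class and field names below are illustrative (the demo's actual scripts may differ), and it assumes a standard AR Foundation scene with an ARRaycastManager in place.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch of the demo's tap-to-place and pinch-to-scale loop.
public class TapToPlaceArtwork : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // from the AR Session Origin
    [SerializeField] GameObject artworkPrefab;        // hypothetical artwork asset
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject placed;
    float lastPinchDistance;

    void Update()
    {
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Tap to place: raycast against detected planes (floor, table, wall).
            if (raycastManager.Raycast(Input.GetTouch(0).position, hits,
                                       TrackableType.PlaneWithinPolygon))
            {
                Pose pose = hits[0].pose;
                if (placed == null)
                    placed = Instantiate(artworkPrefab, pose.position, pose.rotation);
                else
                    placed.transform.SetPositionAndRotation(pose.position, pose.rotation);
            }
        }
        else if (Input.touchCount == 2 && placed != null)
        {
            // Pinch to scale: compare this frame's two-finger spread to the last.
            float distance = Vector2.Distance(Input.GetTouch(0).position,
                                              Input.GetTouch(1).position);
            if (lastPinchDistance > 0f)
                placed.transform.localScale *= distance / lastPinchDistance;
            lastPinchDistance = distance;
        }
        else
        {
            lastPinchDistance = 0f;
        }
    }
}
```

Two-finger rotation would follow the same pattern, comparing the angle of the vector between the two touch points across frames.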
📱
AR Plane Detection — Surface Placement
Add phone screen recording here
🖼️
AR Artwork in Real World — Walk-around
Add video or screenshot here
▶️
AR Demo — Full Mobile Test Recording
Add screen recording from iPhone or Android here
📱

Add notes about the AR test demo here — what artworks are available to place, how the plane detection performs, build instructions for the APK/TestFlight, or known limitations of the demo version.

// Dev Note: For the AR mobile chapter, record the demo using your phone's built-in screen recorder while running the AR session — this captures both the camera feed and the overlaid 3D content accurately. Add those recordings above. You can also link to a TestFlight or APK download here for recruiters to try it directly.