An XR experience placing the visitor inside a virtual art gallery built around Vasil's original works. The project spans three chapters — from fully immersive 360° rendered walkthroughs, through a Mixed Reality interactive game mode for headset users, to an Augmented Reality mobile demo that brings individual artworks into the real world. Developed independently under ArtRevolt.
// EQUIRECTANGULAR · UNREAL ENGINE · IMMERSIVE WALKTHROUGH
The first entry point into the AGI experience. High-resolution equirectangular 360° animations rendered in Unreal Engine 5 capture a full walkthrough of the virtual gallery space — from the entrance foyer through the main halls and into individual artwork rooms. These renders serve both as a standalone immersive presentation and as the visual development reference for the full VR build.
| Render Type | Equirectangular 360° — stereo pair (left/right eye) |
| Resolution | 4096 × 2048 per eye — 60 fps target |
| Lighting | Unreal Engine 5 Lumen Global Illumination + Sky Atmosphere |
| Output | MP4 / YouTube 360 / Meta Horizon compatible |
| Post | Adobe After Effects — grading, titles, transitions |
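As a rough illustration of how the equirectangular output is laid out, the mapping from a camera-space view direction to a pixel in one 4096 × 2048 eye frame can be sketched in a few lines of pure math. This is engine-independent; the function name and the +Z-forward axis convention are illustrative assumptions, not part of the Unreal pipeline described above.

```python
import math

def dir_to_equirect(x, y, z, width=4096, height=2048):
    """Map a unit view direction (x, y, z) to pixel coordinates in an
    equirectangular frame: longitude drives u, latitude drives v."""
    lon = math.atan2(x, z)                    # -pi..pi, 0 = forward (+Z)
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2..pi/2, +pi/2 = up
    u = (lon / (2 * math.pi) + 0.5) * width   # wrap longitude across width
    v = (0.5 - lat / math.pi) * height        # top of frame = straight up
    return u, v

# Forward lands at frame centre; straight up lands on the top edge.
print(dir_to_equirect(0, 0, 1))  # (2048.0, 1024.0)
print(dir_to_equirect(0, 1, 0))  # v -> 0.0 (top row)
```

The same mapping is what a 360 player (YouTube 360, Meta Horizon) inverts at playback time to sample the frame for the viewer's current head direction.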
// PASSTHROUGH · META QUEST · REAL-WORLD ANCHORED · INTERACTIVE
The second chapter layers the virtual gallery over the user's real physical space using Meta Quest passthrough. Visitors wearing the headset see their actual room augmented with gallery walls, paintings, and interactive elements anchored to real-world surfaces. A game-mode layer introduces interaction mechanics — picking up artworks, reading information panels, and navigating between gallery rooms — making the experience exploratory rather than passive.
| Platform | Meta Quest 2 / Quest 3 (standalone, no PC tether) |
| SDK | Meta XR SDK — Mixed Reality Utility Kit (MRUK) |
| Room Mapping | Scene Understanding API — walls, floor, ceiling, furniture |
| Interaction | Hand tracking + controller fallback — grab, inspect, navigate |
| Rendering | UE5 Forward Renderer — optimised for standalone GPU budget |
| Target FPS | 90 fps (Quest 3) / 72 fps (Quest 2) |
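One core step the scene data enables is anchoring a painting flush against a detected wall. A minimal sketch of that projection math, assuming the wall comes back as a point plus unit normal — the function name and tuple-based vectors are illustrative, not the Meta XR SDK API:

```python
def anchor_to_wall(point, plane_point, plane_normal, offset=0.02):
    """Project a desired hang position onto a detected wall plane, then
    push it `offset` metres along the wall normal so the frame sits just
    in front of the surface. All vectors are (x, y, z) tuples; the
    plane normal is assumed to be unit length."""
    # Signed distance from the point to the plane along the normal.
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    # Drop the point onto the plane, then offset outward.
    projected = tuple(p - d * n for p, n in zip(point, plane_normal))
    return tuple(p + offset * n for p, n in zip(projected, plane_normal))

# Wall at x = 0 facing +X: a grab released at x = 0.5 snaps to x = 0.02.
print(anchor_to_wall((0.5, 1.6, 2.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```

In the actual build the plane point and normal would come from the Scene Understanding wall anchors listed above rather than being hand-supplied.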
Add development notes here — scene understanding setup, interaction blueprint logic, performance profiling results, or design decisions about the game-mode mechanics.
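For performance profiling, the target frame rates above translate directly into per-frame time budgets: roughly 11.1 ms at 90 fps on Quest 3 and 13.9 ms at 72 fps on Quest 2. A trivial, purely illustrative helper (the headroom margin is an assumed rule of thumb, not an SDK requirement):

```python
def frame_budget_ms(target_fps):
    """Per-frame time budget in milliseconds for a given refresh rate."""
    return 1000.0 / target_fps

def fits_budget(frame_ms, target_fps, headroom=0.9):
    """True if a profiled frame time leaves ~10% headroom under budget,
    to absorb spikes without dropping below the display refresh rate."""
    return frame_ms <= frame_budget_ms(target_fps) * headroom

print(round(frame_budget_ms(90), 1))  # 11.1 ms on Quest 3
print(round(frame_budget_ms(72), 1))  # 13.9 ms on Quest 2
```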
// AR FOUNDATION · iOS / ANDROID · SURFACE DETECTION · PORTABLE
The third chapter is an accessible AR demo deployable on any modern iOS or Android smartphone — no headset required. Using AR Foundation (Unity) or Unreal's ARKit/ARCore integration, individual artworks from the gallery are placed onto real-world surfaces detected by the phone camera. Users can walk around the virtual artwork, scale it, and read contextual information overlaid alongside it — a portable taste of the full gallery experience.
| Platform | iOS 14+ (ARKit 4) & Android 8.0+ (ARCore 1.x) |
| Framework | Unity AR Foundation — unified ARKit / ARCore backend |
| Detection | Horizontal & vertical plane detection — floor, table, wall |
| Interaction | Tap to place · Pinch to scale · Rotate with two fingers |
| Content | Individual artworks as optimised real-time 3D objects |
| Distribution | TestFlight (iOS) / APK sideload (Android) — test demo |
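The pinch-to-scale interaction reduces to the ratio of current finger spacing to the spacing at the start of the gesture, applied to the artwork's scale and clamped to sane bounds. A minimal sketch of that gesture math — the function name and the 0.25×–4× clamp range are assumptions for illustration, not AR Foundation API:

```python
import math

def pinch_scale(start_a, start_b, now_a, now_b,
                current=1.0, lo=0.25, hi=4.0):
    """New uniform scale for a placed artwork during a two-finger pinch.
    Each argument is a 2D touch position (x, y) in screen pixels."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # Fingers spreading apart -> factor > 1 -> artwork grows.
    factor = dist(now_a, now_b) / dist(start_a, start_b)
    return max(lo, min(hi, current * factor))

# Doubling the finger spacing doubles the artwork's scale.
print(pinch_scale((0, 0), (100, 0), (0, 0), (200, 0)))  # 2.0
# Extreme spreads are clamped so the piece stays readable.
print(pinch_scale((0, 0), (100, 0), (0, 0), (1000, 0)))  # 4.0
```

Tap-to-place and two-finger rotation follow the same pattern: raw touch deltas mapped onto the placed object's transform each frame.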
Add notes about the AR test demo here — what artworks are available to place, how the plane detection performs, build instructions for the APK/TestFlight, or known limitations of the demo version.