AR Sandbox with water simulation using hand gestures

Description

The goal of this work was to design and implement an interactive learning system that lets the user explore and understand weather conditions in real time. The implementation builds on the existing AR Sandbox concept.
I researched the Lattice Boltzmann method for simulating water, but in the end decided to use Smoothed Particle Hydrodynamics (SPH). The liquid is represented as spherical particles, which are then rendered as a liquid-like surface with a shader in Unity.
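
As a rough illustration of the SPH idea (not the project's actual solver), the sketch below performs one simulation step in C# with Poly6/Spiky kernels and a naive O(n²) neighbour search; all constants and class names are placeholders.

```csharp
using UnityEngine;

// Minimal SPH step with Poly6/Spiky kernels and a naive O(n^2) neighbour search.
// All constants (h, mass, restDensity, stiffness, viscosity) are placeholder values.
public class SphParticle
{
    public Vector3 position;
    public Vector3 velocity;
    public float density;
    public float pressure;
}

public static class SphSolver
{
    const float h = 0.1f;            // smoothing radius
    const float mass = 0.02f;        // particle mass
    const float restDensity = 1000f; // rest density of water
    const float stiffness = 200f;    // equation-of-state constant
    const float viscosity = 0.1f;

    static float Poly6(float r)
    {
        if (r >= h) return 0f;
        float x = h * h - r * r;
        return 315f / (64f * Mathf.PI * Mathf.Pow(h, 9f)) * x * x * x;
    }

    static float SpikyGradient(float r)   // dW/dr of the spiky kernel (negative inside h)
    {
        if (r >= h) return 0f;
        float x = h - r;
        return -45f / (Mathf.PI * Mathf.Pow(h, 6f)) * x * x;
    }

    public static void Step(SphParticle[] p, float dt)
    {
        // 1) density and pressure from neighbours
        foreach (var pi in p)
        {
            pi.density = 0f;
            foreach (var pj in p)
                pi.density += mass * Poly6(Vector3.Distance(pi.position, pj.position));
            pi.pressure = stiffness * (pi.density - restDensity);
        }

        // 2) pressure, viscosity and gravity forces
        var forces = new Vector3[p.Length];
        for (int i = 0; i < p.Length; i++)
        {
            Vector3 f = new Vector3(0f, -9.81f, 0f) * p[i].density; // gravity
            for (int j = 0; j < p.Length; j++)
            {
                if (i == j) continue;
                Vector3 d = p[i].position - p[j].position;
                float r = d.magnitude;
                if (r <= 0f || r >= h) continue;
                // symmetric pressure force (SpikyGradient is negative, so this repels)
                f += -d.normalized * mass * (p[i].pressure + p[j].pressure) / (2f * p[j].density) * SpikyGradient(r);
                // crude viscosity term (Poly6 reused as the weight for brevity)
                f += viscosity * mass * (p[j].velocity - p[i].velocity) / p[j].density * Poly6(r);
            }
            forces[i] = f;
        }

        // 3) integrate
        for (int i = 0; i < p.Length; i++)
        {
            p[i].velocity += dt * forces[i] / p[i].density;
            p[i].position += dt * p[i].velocity;
        }
    }
}
```

The rendered particles can then be drawn as spheres and post-processed by the liquid shader mentioned above.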
The terrain is captured with a Kinect v2 depth sensor, and the terrain game object in Unity is built from this depth array. A texture is applied based on the depth, with several designs available: realistic, volcanic (with lava instead of water), colorful, colorful with contour lines, and exotic.
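
A minimal sketch of how a depth frame can be turned into a Unity mesh with height-based colouring is shown below; the depth range, scale factors and colour bands are assumed values, not the ones used in the project.

```csharp
using UnityEngine;

// Rough sketch of turning a Kinect v2 depth frame (512x424, millimetres) into a
// height-field mesh with elevation-based vertex colours (a "realistic"-style design).
// Depth range, scales and colour bands are placeholder values, not the project's.
[RequireComponent(typeof(MeshFilter))]
public class DepthTerrain : MonoBehaviour
{
    const int Width = 512, Height = 424;  // Kinect v2 depth resolution
    public ushort minDepthMm = 900;       // reading at the highest sand point (placeholder)
    public ushort maxDepthMm = 1200;      // reading at the sandbox floor (placeholder)
    public float horizontalScale = 0.01f; // world metres per depth pixel
    public float heightScale = 2f;        // world height of the full depth range

    Mesh mesh;

    void Awake()
    {
        // 512*424 vertices exceed the 16-bit index limit, so use 32-bit indices.
        mesh = new Mesh { indexFormat = UnityEngine.Rendering.IndexFormat.UInt32 };
        GetComponent<MeshFilter>().mesh = mesh;
    }

    public void Rebuild(ushort[] depth)
    {
        var vertices = new Vector3[Width * Height];
        var colors = new Color[Width * Height];

        for (int y = 0; y < Height; y++)
        for (int x = 0; x < Width; x++)
        {
            int i = y * Width + x;
            ushort d = depth[i] == 0 ? maxDepthMm : depth[i];       // 0 = invalid reading
            float h = Mathf.InverseLerp(maxDepthMm, minDepthMm, d); // 0 = floor, 1 = peak
            vertices[i] = new Vector3(x * horizontalScale, h * heightScale, y * horizontalScale);
            colors[i] = ColorForHeight(h);
        }

        // two triangles per grid cell
        var triangles = new int[(Width - 1) * (Height - 1) * 6];
        int t = 0;
        for (int y = 0; y < Height - 1; y++)
        for (int x = 0; x < Width - 1; x++)
        {
            int i = y * Width + x;
            triangles[t++] = i; triangles[t++] = i + Width; triangles[t++] = i + 1;
            triangles[t++] = i + 1; triangles[t++] = i + Width; triangles[t++] = i + Width + 1;
        }

        mesh.Clear();
        mesh.vertices = vertices;
        mesh.colors = colors;
        mesh.triangles = triangles;
        mesh.RecalculateNormals();
    }

    static Color ColorForHeight(float h)
    {
        if (h < 0.25f) return new Color(0.2f, 0.4f, 0.8f); // water level
        if (h < 0.50f) return new Color(0.9f, 0.8f, 0.5f); // sand
        if (h < 0.75f) return new Color(0.2f, 0.6f, 0.2f); // grass
        return Color.white;                                // snow
    }
}
```

Swapping the colour bands (e.g. lava instead of water, or contour-line bands) is what distinguishes the different terrain designs.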
Another module is the calibration UI, where the user can calibrate the platform so that the projection lines up correctly with the terrain.
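
One simple way to implement such a calibration is to shift and scale the projected content until it matches the physical sand and then persist the values; the sketch below assumes arrow-key controls and PlayerPrefs storage, which may differ from the actual UI.

```csharp
using UnityEngine;

// Sketch of the calibration idea: the projected output is shifted and scaled until it
// lines up with the physical sand surface, and the values persist between sessions.
// The target transform, key bindings and PlayerPrefs keys are all assumptions.
public class ProjectionCalibration : MonoBehaviour
{
    public Transform projectionRoot;   // root of everything that gets projected
    public float step = 0.005f;        // adjustment per key press

    void Start()
    {
        // restore the last saved calibration
        var pos = new Vector3(PlayerPrefs.GetFloat("cal_x", 0f), PlayerPrefs.GetFloat("cal_y", 0f), 0f);
        float scale = PlayerPrefs.GetFloat("cal_scale", 1f);
        projectionRoot.localPosition = pos;
        projectionRoot.localScale = Vector3.one * scale;
    }

    void Update()
    {
        // arrow keys move the projection, keypad +/- resize it, S saves
        Vector3 delta = Vector3.zero;
        if (Input.GetKey(KeyCode.LeftArrow))  delta.x -= step;
        if (Input.GetKey(KeyCode.RightArrow)) delta.x += step;
        if (Input.GetKey(KeyCode.DownArrow))  delta.y -= step;
        if (Input.GetKey(KeyCode.UpArrow))    delta.y += step;
        projectionRoot.localPosition += delta;

        if (Input.GetKey(KeyCode.KeypadPlus))  projectionRoot.localScale *= 1f + step;
        if (Input.GetKey(KeyCode.KeypadMinus)) projectionRoot.localScale *= 1f - step;

        if (Input.GetKeyDown(KeyCode.S)) Save();
    }

    void Save()
    {
        PlayerPrefs.SetFloat("cal_x", projectionRoot.localPosition.x);
        PlayerPrefs.SetFloat("cal_y", projectionRoot.localPosition.y);
        PlayerPrefs.SetFloat("cal_scale", projectionRoot.localScale.x);
        PlayerPrefs.Save();
    }
}
```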
Hand gesture recognition works on the depth data (third picture), because the projection overlays the hand and makes it hard to recognize with a regular camera. I reached 60% recognition accuracy, which unfortunately was not enough for the final version.
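
The underlying segmentation step can be sketched as comparing the live depth frame against a baseline frame captured without hands over the sandbox, and treating pixels that are clearly closer to the sensor as part of a hand; the thresholds below are illustrative, not the values used in the project.

```csharp
// Sketch of depth-based hand segmentation: pixels that are significantly closer to the
// sensor than the stored terrain baseline are marked as "hand". Thresholds and the
// blob-size filter are illustrative values.
public static class HandSegmentation
{
    const ushort HandThresholdMm = 60; // hand must be at least this much above the sand
    const int MinHandPixels = 400;     // reject tiny blobs of sensor noise

    // baseline = depth frame captured with no hands over the sandbox
    public static bool[] Segment(ushort[] depth, ushort[] baseline, out int handPixelCount)
    {
        var mask = new bool[depth.Length];
        handPixelCount = 0;
        for (int i = 0; i < depth.Length; i++)
        {
            bool valid = depth[i] != 0 && baseline[i] != 0; // 0 = no reading from the Kinect
            if (valid && baseline[i] - depth[i] > HandThresholdMm)
            {
                mask[i] = true;
                handPixelCount++;
            }
        }
        if (handPixelCount < MinHandPixels)
        {
            handPixelCount = 0;
            return new bool[depth.Length]; // treat as noise: empty mask
        }
        return mask;
    }
}
```

The resulting mask would then be fed to the gesture classifier, which is the stage where the 60% accuracy was measured.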