Ideal VR Training

Ideal Year One VR Training is a supplemental training app built as an educational tool for electrical engineering students. With several narrated lessons, users are walked through various installation scenarios in which they can freely walk around and interact with equipment to get the job done. In addition to UI implementation, creating the tutorial, and some art pipeline support, one of my major tasks on this project was to manage the look we were aiming for while maintaining performance on the Oculus Quest.

Because the user experience was so important, it was crucial to provide basic visual feedback for interactions. This included both tools meant to be picked up (of various shapes and sizes) and visual guidance to direct users to interaction points.

Tool Shader


I created a shader that could support the PBR textures given to us by the art team, so the look would stay accurate to the branded tools the client wanted featured. On top of that, I built in the three effects shown above: a pulse (to indicate a tool is selectable), an outline (to indicate a tool is being hovered over), and a dissolve (used when users dropped tools, teleporting them back to the toolbench).
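The per-pixel logic behind those three effects can be sketched roughly as follows. This is an illustrative Python sketch of the math only, not the actual Shader Graph; every function name, parameter, and constant here is an assumption.

```python
import math

# Hypothetical sketch of the three tool effects' per-pixel math.
# Names and defaults are illustrative, not the production shader.

def pulse_emission(time, speed=2.0, strength=0.5):
    """Selectable tools glow on a sine wave, 0..strength over time."""
    return strength * 0.5 * (1.0 + math.sin(time * speed))

def outline_mask(view_dot_normal, width=0.3):
    """Fresnel-style hover outline: lit where the surface grazes the view."""
    rim = 1.0 - abs(view_dot_normal)  # 1 at the silhouette, 0 face-on
    return 1.0 if rim > 1.0 - width else 0.0

def dissolve_alpha(noise_value, progress):
    """Dropped tools dissolve: pixels whose noise value falls below the
    animated progress threshold are clipped (alpha 0)."""
    return 0.0 if noise_value < progress else 1.0
```

Animating `progress` from 0 to 1 eats the tool away along the noise pattern, which is what sells the dissolve before the tool snaps back to the toolbench.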

The image below shows the nodes used for the three effects (the PBR inputs are not shown here, as they are fairly straightforward texture samples with intensity multipliers).


Pulse Reveal Shader


Instead of building out a whole system for fitting conduit pieces together and feeding wires through them, we took a shortcut (for us and for users): a straight 'end-piece' that faded out at the end, and when that tool was placed in a valid area, the finished conduit would animate into place.

The gif above illustrates the reveal effect I came up with, as well as the pulse I added to signal the flow of electricity. Below is the relevant node graph.
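Conceptually, both effects reduce to simple functions of position along the conduit. The sketch below is a minimal Python approximation of that math, assuming a 0..1 coordinate along the conduit's length; the function names and constants are mine, not the shipped graph.

```python
# Illustrative sketch of the conduit reveal and electricity pulse,
# assuming u is a normalized 0..1 position along the conduit.

def reveal_alpha(u, reveal_front, fade_width=0.1):
    """Geometry behind the animated front is opaque; a soft band of
    width fade_width fades out toward the tip."""
    edge = (reveal_front - u) / fade_width
    return max(0.0, min(1.0, edge))  # clamp to 0..1

def electricity_pulse(u, time, speed=1.5, width=0.05):
    """A bright band scrolling along the conduit to suggest current flow."""
    head = (time * speed) % 1.0      # pulse position wraps around
    dist = abs(u - head)
    return max(0.0, 1.0 - dist / width)
```

Animating `reveal_front` from 0 to 1 plays the finished conduit on, while the pulse term is added on top as emission.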


Environment Optimization

With the many performance demands the project placed on us, our environment poly budget couldn't be very high, and we struggled to get the environment's poly count under control until we found Google Seurat. Seurat is a fantastic (though unfortunately no longer supported) command-line tool that takes a number of 360° panoramic captures from inside a walkable environment and constructs a single 3D mesh and texture for that entire environment.

There was no need to worry about poly count (the tri count of the generated mesh could be specified; we ended up at 36k) or draw calls (one mesh with one material means one draw call for the whole environment!). Below is a gif of the resulting geometry. It looks like an exploded mesh outside of the predetermined volume, but within it, it holds up quite nicely.
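Because Seurat reconstructs the scene from captures taken inside a bounded, walkable volume, the capture step amounts to laying out sample positions within that box. The sketch below just illustrates one way such positions might be spaced; the real captures were driven by an engine-side capture tool, and the function name and numbers here are assumptions.

```python
import itertools

# Hedged sketch: evenly spaced capture positions spanning a walkable
# "headbox" volume, from which panoramic shots would be taken.

def capture_positions(box_min, box_max, samples_per_axis=2):
    """Return a grid of (x, y, z) capture points covering the volume."""
    axes = [
        [lo + (hi - lo) * i / (samples_per_axis - 1)
         for i in range(samples_per_axis)]
        for lo, hi in zip(box_min, box_max)
    ]
    return list(itertools.product(*axes))

positions = capture_positions((-1.0, 0.0, -1.0), (1.0, 2.0, 1.0))
# 2 samples per axis -> the 8 corners of the headbox
```

Outside that predetermined volume the reconstruction falls apart (hence the exploded look), but any viewpoint inside it sees a convincing environment for a single draw call.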
