
Wave XR

I've compiled case studies of select projects I had the pleasure of helping ship during my time at Wave, from early 2021 through 2023. Please note that I am only including publicly available material; many of the projects I worked on (particularly more recent ones) were never released, so I am not at liberty to share details about them.

 

Below is a sizzle reel from Wave's YouTube channel, where many of the videos linked on this page (and more) can be found. The case studies here don't represent all the work (including shipped work) I contributed at Wave; I selected them as interesting ones to dive into.

Case Study: Justin Bieber

Project Roles and Responsibilities
  • Set up the interactions for the show, where viewer votes would contribute to visuals throughout

  • Configured crowd sims in select moments of the show 

  • Created initial prototypes for certain systems that were polished by others later (VFX avatars, dynamic environmental rigs, cloud system for finale)

I served primarily in an R&D and technical-support role for the art team throughout this show's production. Much of my work was early proof-of-concept that was later polished with final content and system tweaks.

Challenges
  • Short production time relative to scope meant we needed to repurpose tech from previous shows, which required me to clean up bespoke logic from former setups (like the crowd and parent-constraint tech)

  • We needed a VFX system that would emit particles off a hidden skeletal mesh driven by live mocap, to create VFX characters

Approach
  • For crowds, we used the in-house solution we built for Pentakill, with a few quality-of-life improvements. I also created a custom shader that randomized crowd members' colors based on their world position

  • For the VFX characters, I created a script that would take a skeletal mesh reference, convert it into a static mesh snapshot, and let Shuriken spawn particles off of it. A VFX artist then came in and polished the particle system
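A minimal sketch of how such a snapshot-and-emit script might look in Unity (component and field names here are hypothetical; the actual production script differed):

```csharp
using UnityEngine;

// Hypothetical sketch: each frame, bake the hidden mocap-driven skinned mesh
// into a static snapshot and hand it to a Shuriken system's shape module, so
// particles emit from the performer's current pose.
public class MocapParticleEmitter : MonoBehaviour
{
    public SkinnedMeshRenderer mocapRenderer;  // hidden mesh driven by live mocap
    public ParticleSystem vfxSystem;           // Shuriken system forming the "VFX avatar"

    private Mesh snapshot;

    void Awake()
    {
        snapshot = new Mesh();
        var shape = vfxSystem.shape;           // ShapeModule is a struct wrapper
        shape.shapeType = ParticleSystemShapeType.Mesh;
        shape.mesh = snapshot;
    }

    void LateUpdate()
    {
        // BakeMesh writes the current skinned pose into a static mesh,
        // which the particle system then emits from.
        mocapRenderer.BakeMesh(snapshot);
    }
}
```

One caveat with this approach: the shape module emits in the particle system's local space, so the system's transform needs to track the performer's transform for the emitted shape to line up.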

Case Study: Teflon Sega

Project Roles and Responsibilities

*We did two full shows for the artist Teflon Sega; the thumbnail is from Volume 1, but click the highlight-moment links for specific sections!

Zoetrope Effect: the brief was to generate snapshot meshes of the performer in a ring around the primary object and spin them at just the right speed to create the illusion of stationary movement on the instances.

Challenges
  • The show was performed completely live, so we couldn't pre-bake a sequence outside the engine and import it

  • Even though we were shipping on PC, we couldn't afford the naive approach of simply duplicating a skeletal mesh around a ring ahead of time and delaying each copy's frame playback

 
Approach
  • I created a C# tool, along with a custom shader for the performer's clones, that let us easily configure the effect with art direction and keyframe a sequence in Timeline
  • The core logic used Unity's built-in method for baking a skinned mesh into a static snapshot. With some radial math and time-based offset logic at runtime, it periodically takes a fresh snapshot of the live mocap performer in the center and swaps it onto a mesh on the ring, with the custom shader applied.
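The radial and time-offset bookkeeping can be sketched independently of Unity. This is an illustrative reconstruction, not the production tool; all names and parameters are hypothetical:

```csharp
using System;

// Sketch of the zoetrope bookkeeping: N clones sit evenly on a ring, and at a
// fixed snapshot interval the next slot is overwritten with a fresh snapshot,
// so each clone holds a progressively older pose while the ring spins.
public static class ZoetropeRing
{
    // Position of clone i on a ring of `count` clones with the given radius,
    // rotated by `spinDegrees` around the center.
    public static (double x, double z) ClonePosition(
        int i, int count, double radius, double spinDegrees)
    {
        double angleDeg = (360.0 / count) * i + spinDegrees;
        double rad = angleDeg * Math.PI / 180.0;
        return (radius * Math.Cos(rad), radius * Math.Sin(rad));
    }

    // Index of the ring slot whose mesh should be replaced with the newest
    // snapshot, cycling through the ring at a fixed interval.
    public static int SlotForSnapshot(
        double elapsedSeconds, double snapshotInterval, int count)
    {
        return (int)(elapsedSeconds / snapshotInterval) % count;
    }
}
```

In the engine, each slot's static mesh would come from a skinned-mesh bake of the central performer, with the custom clone shader applied.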

Endless Tunnel hallway effect: creative direction wanted a moving hallway for a section of the concert whose speed, shape, and direction we could configure

Challenges
  • With a live show, we can't predict exactly how long the performer will stay in a section before they're ready to move on, so we couldn't rely on a custom mesh moving at a fixed pace

  • The tunnel needed to be highly configurable, including the ability to turn, pulse down its length, and change the static mesh's shape, size, rotation, etc.

 
Approach
  • A spline was used to spawn instances of a specified mesh at the start point and move them along to the end of the path. Along with other parameters controlling speed and size, artists could create a variety of endless-tunnel looks by modifying the spline

  • I integrated object pooling to avoid constantly destroying and creating objects when they completed their path
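The pooling idea can be sketched as a small generic class (names are hypothetical; in Unity, creating a segment would be an Instantiate call and releasing one would typically deactivate its GameObject):

```csharp
using System;
using System.Collections.Generic;

// Minimal object-pool sketch: tunnel segments that finish the spline path are
// returned to the pool and reused at the start point, instead of being
// destroyed and re-instantiated every cycle.
public class SegmentPool<T>
{
    private readonly Stack<T> inactive = new Stack<T>();
    private readonly Func<T> create;   // e.g. () => Instantiate(prefab) in Unity
    public int CreatedCount { get; private set; }

    public SegmentPool(Func<T> create) { this.create = create; }

    // Reuse an inactive segment if one exists; otherwise create a new one.
    public T Get()
    {
        if (inactive.Count > 0) return inactive.Pop();
        CreatedCount++;
        return create();
    }

    // Called when a segment reaches the end of the spline.
    public void Release(T segment) => inactive.Push(segment);
}
```

Once the pool warms up to the number of segments visible at once, steady-state playback allocates nothing, which matters for a live show where a garbage-collection hitch is visible on stream.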

Portal Room: creative wanted a section of the concert to contain portals behind the live performer, showing giant versions of him in pre-built posed animations but with live facial capture

Challenges
  • We wanted face cap on these giant characters, as well as partial motion capture data

  • The windows needed to show other scenes from inside the room, but an exterior background around the room

 
Approach
  • I configured a special version of the character that played back the animation on certain joints but applied live data to others (as well as playing back facial capture)

  • To create the portal illusion, I set up shots in different parts of the scene, placed render-texture cameras framing those shots, and synchronized their relative rotation and position with the main camera looking at the room
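A minimal sketch of that camera synchronization, assuming a window transform in the room and a matching anchor transform in the remote shot (all names hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch: the render-texture camera framing a remote "portal
// shot" mirrors the main camera's pose relative to the portal window, so the
// rendered texture parallaxes correctly as the main camera moves.
public class PortalCameraSync : MonoBehaviour
{
    public Camera mainCamera;        // camera looking at the room
    public Transform portalWindow;   // window surface inside the room
    public Transform remoteAnchor;   // matching anchor in the remote scene
    public Camera portalCamera;      // renders into the window's RenderTexture

    void LateUpdate()
    {
        // Express the main camera's pose in the window's local space...
        Vector3 localPos = portalWindow.InverseTransformPoint(mainCamera.transform.position);
        Quaternion localRot = Quaternion.Inverse(portalWindow.rotation)
                              * mainCamera.transform.rotation;

        // ...and reapply that same relative pose at the remote anchor.
        portalCamera.transform.position = remoteAnchor.TransformPoint(localPos);
        portalCamera.transform.rotation = remoteAnchor.rotation * localRot;
    }
}
```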

Case Study: Pentakill

Project Roles and Responsibilities
  • Helped create our own custom crowd sim tech, to render ~100k crowd members with different animations

  • Sequenced many of the camera shots for the timeline of the show

  • Set up the interactions for the show, where viewer votes would contribute to visuals throughout

Challenges
  • At the time of this concert, there were no out-of-the-box massive crowd sim solutions we could use, and there were plenty of shots where we needed upwards of 100k crowd members, along with a fully dynamic rendered scene and hero characters

  • Due to pandemic restrictions, we had to prepare a number of contingencies: not just to account for the live nature of concerts, but also for talent that might not be able to be physically present at all times

  • The sequence of the concert experience involved many shots of characters driven by live mocap on moving environments throughout the world

 
Approach
  • I worked closely with a graphics engineer to build a proprietary crowd system

    • A custom HLSL shader to render a massive number of crowd members, with features for animation offsetting, energy levels to switch between animation sets, and LOD levels

    • Helped integrate optimizations such as frustum culling, dynamically configurable LOD distances, and impostor cards for far LODs

    • Created a simple Vertex Animation Texture generation tool, which processed animations and baked vertex positions into a texture

  • We created a parent-constraint script that let us dynamically switch characters' parent transforms at runtime to other objects such as environments, or even joints on other characters (e.g., the Mordekaiser scene)
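A minimal sketch of runtime parent switching along those lines (hypothetical names, not the production script): the key detail is capturing the character's current offset in the new parent's space, so the handoff doesn't pop.

```csharp
using UnityEngine;

// Hypothetical sketch: follow an arbitrary "parent" transform (an environment
// piece, or a joint on another character) without actually reparenting in the
// hierarchy, preserving world pose at the moment of the switch.
public class RuntimeParentConstraint : MonoBehaviour
{
    private Transform parent;
    private Vector3 localOffset;
    private Quaternion localRotation;

    // Switch parents without popping: record the current world pose as an
    // offset in the new parent's local space.
    public void SetParent(Transform newParent)
    {
        parent = newParent;
        localOffset = parent.InverseTransformPoint(transform.position);
        localRotation = Quaternion.Inverse(parent.rotation) * transform.rotation;
    }

    void LateUpdate()
    {
        if (parent == null) return;
        // Reapply the stored offset relative to wherever the parent is now.
        transform.position = parent.TransformPoint(localOffset);
        transform.rotation = parent.rotation * localRotation;
    }
}
```

Doing this as a script rather than true hierarchy reparenting avoids scale inheritance surprises and lets the constraint target a joint driven by live mocap.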

Portfolio of Tashkeel Shah
