Schematics inspection gameplay of Neon Atelier
Over the previous weeks of the proof of concept, our team implemented the core tube-bending mechanics of Neon Atelier. From now on, we can direct our effort to other, no less important, parts of the gameplay.
One important but not-yet-mentioned element is the inspection procedure before hands-on light tube making. Instead of being thrown directly into each neon light quest, players are first presented with a brief outline of the neon light schematic on a VR canvas. Players use a VR cursor to navigate the schematic and mark significant steps for the upcoming neon light crafting. This gives players an early impression of the crafting procedure, relieving cognitive load during the actual hands-on crafting.
The schematics inspection gameplay is built on a VR canvas object. While the XR Interaction Toolkit has a TrackedDeviceGraphicRaycaster that can easily be attached to a conventional UI Canvas object, VR cursor detection is only partially supported. The TrackedDeviceGraphicRaycaster is able to detect VR controllers pointing toward the canvas, and can check for hovering over its child UI components. What it strangely lacks, though, is the ability to return a definite coordinate for the VR cursor, despite this seemingly being a straightforward function. Fortunately, a workaround is feasible with a collider mapped to the canvas area. As the VR controller casts a ray onto the collider, the hit position can be compared against the collider's dimensions, yielding a coordinate relative to the canvas. This point is then used to display a cursor on the canvas, and also to implement the section bracket explained below.
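The workaround described above can be sketched roughly as follows. This is a minimal illustration, assuming a BoxCollider sized to match the canvas RectTransform; class and field names here are hypothetical, not our actual implementation.

```csharp
using UnityEngine;

// Sketch of the collider workaround: a BoxCollider is mapped to the canvas
// area, and the controller's ray hit is converted into a normalized (0..1)
// coordinate on the canvas. All names are illustrative.
public class CanvasCursorLocator : MonoBehaviour
{
    [SerializeField] RectTransform canvasRect;    // the VR canvas
    [SerializeField] BoxCollider canvasCollider;  // collider matching the canvas area

    // Returns true when the controller ray hits the canvas collider,
    // writing the cursor position in normalized canvas coordinates.
    public bool TryLocateCursor(Ray controllerRay, out Vector2 normalized)
    {
        normalized = Vector2.zero;
        if (!canvasCollider.Raycast(controllerRay, out RaycastHit hit, 10f))
            return false;

        // Convert the world-space hit into the collider's local space,
        // then compare it against the collider's half-extents.
        Vector3 local = canvasCollider.transform.InverseTransformPoint(hit.point);
        Vector3 extents = canvasCollider.size * 0.5f;
        normalized = new Vector2(
            Mathf.InverseLerp(-extents.x, extents.x, local.x),
            Mathf.InverseLerp(-extents.y, extents.y, local.y));
        return true;
    }

    // The normalized point can then be mapped back to canvas units
    // to place a cursor graphic on the canvas.
    public Vector2 ToCanvasPosition(Vector2 normalized)
    {
        Rect r = canvasRect.rect;
        return new Vector2(
            Mathf.Lerp(r.xMin, r.xMax, normalized.x),
            Mathf.Lerp(r.yMin, r.yMax, normalized.y));
    }
}
```

The same normalized coordinate can feed both the visible cursor and the section bracket described in the next part.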
While our gameplay expects players to traverse the neon light schematic on the canvas, the steps are predefined for them so that the process is done in order. As the player begins from the start, progress is shown with a ruler-like bracket, as if the player were measuring the section while traversing it. This bracket serves well both for indicating the direction to the player and for an extra sense of immersion.
In order to implement the section bracket, we once again enlist the help of the Unity Splines package. A SplineContainer object is used to map the curve of a path section, then coupled with a SplineExtrude component to visualize the curve with volume. A custom SplineHandles component is then implemented to make one end of the path follow the VR cursor, while detecting progress along the section. A set of SplineHandles objects is then grouped together to form the schematics inspection gameplay of a quest.
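A rough sketch of how such a SplineHandles component could work, assuming the Splines package's SplineUtility.GetNearestPoint is used to project the cursor onto the section's curve; the method and property names are illustrative, not the shipped code:

```csharp
using Unity.Mathematics;
using UnityEngine;
using UnityEngine.Splines;

// Sketch of the custom SplineHandles behaviour: the spline parameter
// nearest to the VR cursor is treated as the player's progress along the
// section, and the bracket's moving end can be snapped to that point.
[RequireComponent(typeof(SplineContainer))]
public class SplineHandles : MonoBehaviour
{
    SplineContainer container;

    // 0 at the start of the section, 1 at its end.
    public float Progress { get; private set; }

    void Awake() => container = GetComponent<SplineContainer>();

    // Called each frame with the cursor's world position on the canvas.
    public void UpdateFromCursor(Vector3 cursorWorldPos)
    {
        // Evaluate in the spline's local space.
        float3 local = container.transform.InverseTransformPoint(cursorWorldPos);
        SplineUtility.GetNearestPoint(container.Spline, local,
            out float3 nearest, out float t);

        // Only allow forward progress, so the section is traversed in order.
        Progress = Mathf.Max(Progress, Mathf.Clamp01(t));
    }

    public bool Completed => Progress >= 0.99f;
}
```

A quest's inspection gameplay would then group several of these components and unlock each section once the previous one reports completion.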
This is our progress so far. We are looking forward to bringing the upcoming gameplay implementations into Neon Atelier 🙂
___
Tracked Device Graphic Raycaster component | XR Interaction Toolkit | 3.0.8. (n.d.). https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@3.0/manual/tracked-device-graphic-raycaster.html