## Description

Experimental project that enables unofficial runtime worldspace support for Unity's UIToolkit with XR interaction.

This project was born from this Reddit post, where you can find some detailed information on how to set up a new project with the WorldSpaceUIDocument script.

I also posted some feedback on the official Unity UIToolkit forums about my findings on the XR use case for the package.

## Documentation

The project is currently built with Unity 2021.2.13f1.

### XRInteractionExample scene

This scene contains a simple example of how to set up the XR Interaction Toolkit package with a traditional worldspace canvas.
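The scene is configured in the editor, but the setup roughly amounts to the following minimal sketch (`WorldSpaceCanvasXRSetup` is a hypothetical helper written here for illustration; it is not part of this project):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Hypothetical helper mirroring what the XRInteractionExample scene sets up
// in the editor: a world-space canvas that XR ray interactors can hit, plus
// an XR-aware input module on the EventSystem.
[RequireComponent(typeof(Canvas))]
public class WorldSpaceCanvasXRSetup : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // XRI's raycaster lets ray interactors raycast against the canvas.
        if (GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();

        // The EventSystem needs the XR input module instead of the
        // StandaloneInputModule to receive events from XR controllers.
        var eventSystem = EventSystem.current;
        if (eventSystem != null && eventSystem.GetComponent<XRUIInputModule>() == null)
            eventSystem.gameObject.AddComponent<XRUIInputModule>();
    }
}
```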

### WorldSpace_UIToolkit scene

This scene contains the experimental worldspace UIToolkit panel that can be interacted with in VR.

The first thing to notice is the WorldSpaceUIDocument script, attached to the WorldSpace_UIToolkit_Panel0 and WorldSpace_UIToolkit_Panel1 game objects. This script takes advantage of UIToolkit's ability to render into a render texture to create a worldspace panel, and it also handles some required operations to make this panel work with the EventSystem. The script can be used on its own in any non-XR project.
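In case it helps to understand the description above, here is a minimal sketch of that core trick, based on Unity's render texture and screen-to-panel-space APIs (the names and details are illustrative; the actual WorldSpaceUIDocument script in this repo handles more cases):

```csharp
using UnityEngine;
using UnityEngine.UIElements;

// Minimal sketch: a UIDocument whose PanelSettings renders into a
// RenderTexture that is displayed on a quad. A MeshCollider is required
// because RaycastHit.textureCoord only works with mesh colliders.
[RequireComponent(typeof(UIDocument), typeof(MeshCollider))]
public class WorldSpacePanelSketch : MonoBehaviour
{
    [SerializeField] Camera m_Camera;

    void OnEnable()
    {
        var document = GetComponent<UIDocument>();
        var targetTexture = document.panelSettings.targetTexture;

        // Redirect the panel's screen-space picking into panel space:
        // raycast from the camera through the pointer position, and if
        // this quad is hit, convert the hit UV into panel pixels.
        document.rootVisualElement.panel.SetScreenToPanelSpaceFunction(screenPosition =>
        {
            var invalidPosition = new Vector2(float.NaN, float.NaN);

            // Panel positions use a top-left origin while Unity screen rays
            // use a bottom-left one, so flip y before raycasting.
            screenPosition.y = Screen.height - screenPosition.y;
            var ray = m_Camera.ScreenPointToRay(screenPosition);

            if (!Physics.Raycast(ray, out var hit, 100f) ||
                hit.collider.gameObject != gameObject)
                return invalidPosition; // pointer is not over this panel

            var uv = hit.textureCoord;
            return new Vector2(uv.x * targetTexture.width,
                               (1f - uv.y) * targetTexture.height);
        });
    }
}
```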

The second part of the experimental work is a set of modifications to some original scripts from the XR Interaction Toolkit package:

- The XRUIInputModuleFix component, which must be added to the EventSystem object instead of the original XRUIInputModule. This fixes the dragging events that don't work with the original module.
- The IUIInteractorRegisterer component, which must be added to any ray interactor object for which you want to enable interactions with your UI panel. This works together with the XRUIInputModuleFix module (see the sketch after this list).
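For illustration, a registerer component along these lines could look like the sketch below. It assumes XRUIInputModuleFix keeps the RegisterInteractor / UnregisterInteractor API of the stock XRUIInputModule; the stock module is used here so the snippet compiles on its own:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Rough sketch of what a registerer component like IUIInteractorRegisterer
// does: tell the UI input module about the interactor on this object.
public class UIInteractorRegistererSketch : MonoBehaviour
{
    XRUIInputModule m_InputModule;
    IUIInteractor m_Interactor;

    void OnEnable()
    {
        m_Interactor = GetComponent<IUIInteractor>(); // e.g. an XRRayInteractor
        m_InputModule = EventSystem.current != null
            ? EventSystem.current.GetComponent<XRUIInputModule>()
            : null;

        if (m_InputModule != null && m_Interactor != null)
            m_InputModule.RegisterInteractor(m_Interactor);
    }

    void OnDisable()
    {
        if (m_InputModule != null && m_Interactor != null)
            m_InputModule.UnregisterInteractor(m_Interactor);
    }
}
```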

The XRUIInputModuleFix component has a new field called "Use Pen Pointer Id Base". This is a hack on how the UIToolkit pointer events are handled that slightly changes how interactions with multiple XR controllers work together. If enabled, you will be able to trigger hover events with both controllers (although only one of them will work if both pointers are hovering over the same panel at the same time). You will also be able to drag the background to scroll with both controllers.
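As far as I understand the behavior described above, the essence of the hack is the pointer-id mapping sketched below: with the option disabled, every controller maps to UIToolkit's single mouse pointer id, while with it enabled, each controller gets its own id derived from PointerId.penPointerIdBase (ForDevice is a hypothetical helper, not code from this repo):

```csharp
using UnityEngine.UIElements;

// Hypothetical illustration of the "Use Pen Pointer Id Base" idea: give each
// XR controller its own UIToolkit pointer id instead of the shared mouse id,
// so hover state can be tracked per controller.
static class XRPointerIds
{
    public static int ForDevice(int deviceIndex, bool usePenPointerIdBase)
    {
        return usePenPointerIdBase
            ? PointerId.penPointerIdBase + deviceIndex // distinct id per controller
            : PointerId.mousePointerId;                // all controllers share one id
    }
}
```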

## Conclusions

Unity's current EventSystem for UI interactions (either with the traditional canvas or the new UIToolkit) looks too complex and convoluted to me. It is also definitely not well suited for new use cases like VR, where we can have multiple mouse-like pointers that can trigger hover events. The main pain point of the whole system is the fact that all the events are hardcoded as projections from screenspace pixels, which is definitely not the case for some kinds of worldspace interactions.

I hoped the new UIToolkit project would refactor Unity's old EventSystem, but that is not the case right now. I hope they refactor it some day and write better documentation, so that it is actually feasible to make your own custom UI input modules.