Hi all,
The development of an OpenXR module has started. The goal is, at first, to imitate the OpenVR module, and then to factor the VR code shared between OpenVR and OpenXR into a GenericVR module. Note that we could also share some code with the zSpace plugin (available in ParaView, and surely later in VTK directly), such as the InteractorStyle and RenderWindowInteractor.
The branch on which I am working is still a WIP branch, but we can already render a VTK scene in any headset that supports the OpenGL backend, on Windows (and Linux too, but not tested yet), with head and controller tracking. An OpenXRInteractor is in charge of converting inputs (e.g. "/user/hand/right/input/trigger") to an action (e.g. "showmenu"). Then, an OpenXRInteractorStyle maps an action to an EventId (or a function).
The available VTK actions are specified in a file named vtk_openxr_actions.json, and the bindings between actions and inputs are specified in a file that depends on the type of headset you have (for example, vtk_openxr_binding_htc_vive_controller.json uses the available inputs of the HTC Vive controller).
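To make this concrete, here is a hypothetical sketch of what these two files could contain (the schema below is illustrative, not the exact format used by the module; only the OpenXR input paths and the interaction profile path are standard OpenXR):

vtk_openxr_actions.json (hypothetical):

```json
{
  "actions": [
    { "name": "showmenu", "type": "boolean" },
    { "name": "grabobject", "type": "boolean" },
    { "name": "movement", "type": "vector2float" }
  ]
}
```

vtk_openxr_binding_htc_vive_controller.json (hypothetical):

```json
{
  "interaction_profile": "/interaction_profiles/htc/vive_controller",
  "bindings": [
    { "action": "showmenu", "path": "/user/hand/right/input/trigger/click" },
    { "action": "movement", "path": "/user/hand/left/input/trackpad" }
  ]
}
```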
There is also a concept of ActionSets, which are application-defined collections of actions. Bindings in a binding_***.json file are linked to an action set, so that you can remap all bindings to other actions by selecting another action set. From the OpenXR specification:

> For example, a game may have one set of actions that apply to controlling a character and another set for navigating a menu system.
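Under the hood, action sets and bindings map to the OpenXR C API (xrCreateActionSet, xrCreateAction, xrSuggestInteractionProfileBindings). Here is a minimal sketch of what the JSON above boils down to at runtime, assuming a valid XrInstance and with all error handling omitted:

```cpp
#include <openxr/openxr.h>
#include <cstring>

// Create the "showmenu" action inside a "vtk-actions" action set and
// suggest a binding for the HTC Vive controller interaction profile.
void SetupActions(XrInstance instance)
{
  XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
  std::strcpy(setInfo.actionSetName, "vtk-actions");
  std::strcpy(setInfo.localizedActionSetName, "VTK Actions");
  XrActionSet actionSet;
  xrCreateActionSet(instance, &setInfo, &actionSet);

  XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
  std::strcpy(actionInfo.actionName, "showmenu");
  std::strcpy(actionInfo.localizedActionName, "Show Menu");
  actionInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
  XrAction showMenu;
  xrCreateAction(actionSet, &actionInfo, &showMenu);

  // Suggest a physical input for this action, per interaction profile.
  XrPath profilePath, triggerPath;
  xrStringToPath(instance, "/interaction_profiles/htc/vive_controller", &profilePath);
  xrStringToPath(instance, "/user/hand/right/input/trigger/click", &triggerPath);

  XrActionSuggestedBinding binding{showMenu, triggerPath};
  XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
  suggested.interactionProfile = profilePath;
  suggested.countSuggestedBindings = 1;
  suggested.suggestedBindings = &binding;
  xrSuggestInteractionProfileBindings(instance, &suggested);
}
```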
All communication with the OpenXR runtime passes through a singleton class named vtkOpenXRManager. It handles the initialization and destruction of all the needed structures (the Instance and its Extensions, the Session) and stores the rendering resources and the view / projection given by the OpenXR runtime. Note that, unlike OpenVR, it is the runtime that creates the OpenGL / DirectX textures into which we need to render each frame.
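To illustrate that difference, here is a rough sketch of one frame with the raw OpenXR C API (OpenGL case): we do not allocate the texture ourselves, but acquire one of the swapchain images owned by the runtime, render into it, and release it. This assumes the swapchain images were enumerated beforehand with xrEnumerateSwapchainImages and GL function loading is set up; the xrWaitFrame / xrBeginFrame / xrEndFrame plumbing around it is omitted:

```cpp
#define XR_USE_GRAPHICS_API_OPENGL
#include <GL/glew.h>                // or any other GL loader
#include <openxr/openxr.h>
#include <openxr/openxr_platform.h> // XrSwapchainImageOpenGLKHR
#include <vector>

// Render one frame into a swapchain texture owned by the OpenXR runtime.
// 'fbo' is a framebuffer object we created ourselves; 'images' was filled
// once with xrEnumerateSwapchainImages.
void RenderIntoRuntimeTexture(XrSwapchain swapchain,
  const std::vector<XrSwapchainImageOpenGLKHR>& images, GLuint fbo)
{
  // Ask the runtime which of its images we may render into.
  uint32_t imageIndex = 0;
  XrSwapchainImageAcquireInfo acquireInfo{XR_TYPE_SWAPCHAIN_IMAGE_ACQUIRE_INFO};
  xrAcquireSwapchainImage(swapchain, &acquireInfo, &imageIndex);

  XrSwapchainImageWaitInfo waitInfo{XR_TYPE_SWAPCHAIN_IMAGE_WAIT_INFO};
  waitInfo.timeout = XR_INFINITE_DURATION;
  xrWaitSwapchainImage(swapchain, &waitInfo);

  // Attach the runtime-created GL texture to our FBO and draw into it.
  glBindFramebuffer(GL_FRAMEBUFFER, fbo);
  glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
    GL_TEXTURE_2D, images[imageIndex].image, 0);
  // ... render the VTK scene here ...
  glBindFramebuffer(GL_FRAMEBUFFER, 0);

  // Hand the image back to the runtime for display.
  XrSwapchainImageReleaseInfo releaseInfo{XR_TYPE_SWAPCHAIN_IMAGE_RELEASE_INFO};
  xrReleaseSwapchainImage(swapchain, &releaseInfo);
}
```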
Considering the support of the HoloLens 2 headset, here are some useful thoughts:
As @ken-martin said in this post, OpenXR doesn't require support for all graphics libraries, and the HoloLens supports OpenXR but not the OpenGL backend. The problem here is that the HoloLens has an ARM architecture and needs a UWP app to work.
What we could do to support the HoloLens:
- Use a remote app to render VTK in OpenGL and then blit into a shared DirectX texture to stream it to the HoloLens (see the interop sketch after this list). For that, we can follow this link to connect the HoloLens to our OpenXR-based app using the Holographic Remoting package.
- If we don't want to / can't do streaming (with at least 50 ms of delay it could be impractical), we need to render all of VTK in DirectX using a DirectXRenderWindow and build VTK as a UWP app. However, that would mean rewriting all of our shaders, so we can forget this. What it means is that we need to build VTK as a UWP app (it should be OK?), render VTK using OpenGL, and then use a shared OpenGL / DirectX buffer inside the HoloLens. Here is the BIG BIG problem: how can we render using OpenGL if the HoloLens hardware does not support OpenGL? Does that mean that we need to build Mesa (or another OpenGL implementation) inside the headset? But is it even possible?
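Regarding the shared OpenGL / DirectX texture mentioned in both options: on desktop Windows, one existing mechanism is the WGL_NV_DX_interop / WGL_NV_DX_interop2 extension, which lets an OpenGL context render directly into a D3D texture. A minimal sketch, assuming a D3D11 device and texture from the DirectX side, and with the wglDX* entry points already loaded through wglGetProcAddress (whether this extension is usable in the HoloLens / UWP context is exactly the open question above):

```cpp
#include <windows.h>
#include <d3d11.h>
#include <GL/gl.h>
#include <GL/wglext.h> // wglDX* declarations (WGL_NV_DX_interop2)

// Expose a D3D11 texture as an OpenGL texture, render into it from GL,
// then hand it back to DirectX. Error handling omitted.
void BlitThroughSharedTexture(ID3D11Device* d3dDevice,
  ID3D11Texture2D* d3dTexture, GLuint glTexture)
{
  // Open the interop device (done once in real code).
  HANDLE interopDevice = wglDXOpenDeviceNV(d3dDevice);

  // Register the D3D texture so that 'glTexture' aliases it.
  HANDLE sharedTexture = wglDXRegisterObjectNV(interopDevice, d3dTexture,
    glTexture, GL_TEXTURE_2D, WGL_ACCESS_READ_WRITE_NV);

  // OpenGL may only touch the object while it is locked.
  wglDXLockObjectsNV(interopDevice, 1, &sharedTexture);
  // ... blit / render the VTK OpenGL output into 'glTexture' here ...
  wglDXUnlockObjectsNV(interopDevice, 1, &sharedTexture);

  wglDXUnregisterObjectNV(interopDevice, sharedTexture);
  wglDXCloseDeviceNV(interopDevice);
}
```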