OpenXR support in VTK

Hi all,

Development of an OpenXR module has started. The goal is, at first, to mirror the OpenVR module, and then to factor the shared VR code between OpenVR and OpenXR into a GenericVR module. Note that we could also factor out some code shared with the zSpace plugin (available in ParaView, and likely later in VTK directly), such as the InteractorStyle and RenderWindowInteractor.

The branch on which I am working on OpenXR is still a WIP branch, but we can already render a VTK scene in any headset that supports the OpenGL backend, on Windows (and presumably Linux, though not tested yet), with head and controller tracking. An OpenXRInteractor is in charge of converting inputs (e.g. “/user/hand/right/input/trigger”) to an action (e.g. “showmenu”). An OpenXRInteractorStyle then maps each action to an EventId (or a function).

The available VTK actions are specified in a file named vtk_openxr_actions.json, and the bindings between actions and inputs are specified in a file that depends on the type of headset you have (for example, there is a file vtk_openxr_binding_htc_vive_controller.json that uses the inputs available on the HTC Vive controller).
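As a rough sketch of the two kinds of files (the field names below are illustrative guesses, not the exact schema used in the branch), an actions file could declare actions and their types:

```json
{
  "actions": [
    { "name": "showmenu", "type": "boolean" },
    { "name": "trackpadposition", "type": "vector2float" }
  ]
}
```

and a per-headset binding file such as vtk_openxr_binding_htc_vive_controller.json could tie those actions to input paths:

```json
{
  "bindings": [
    { "action": "showmenu", "path": "/user/hand/right/input/trigger" },
    { "action": "trackpadposition", "path": "/user/hand/left/input/trackpad" }
  ]
}
```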

There is also a concept of ActionSets, which are application-defined collections of actions. Bindings in a binding_***.json file are linked to an action set, so you can remap all bindings to other actions by selecting another action set. Taken from the OpenXR specification:

For example, a game may have one set of actions that apply to controlling a character and another set for navigating a menu system.
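Following that example from the specification, a hypothetical action-set layout (again, illustrative field names only) could look like:

```json
{
  "action_sets": [
    { "name": "character_controls", "actions": [ "jump", "attack" ] },
    { "name": "menu_controls", "actions": [ "navigate", "select" ] }
  ]
}
```

Switching the active action set from character_controls to menu_controls would then remap the same physical inputs to a different group of actions.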

All communication with the OpenXR runtime passes through a singleton class named vtkOpenXRManager. It handles the initialization / destruction of all the needed structures (the Instance and its Extensions, the Session) and stores the rendering resources and the view / projection information given by the OpenXR runtime. Note that, unlike OpenVR, it is the runtime that creates the OpenGL / DirectX textures into which we need to render each frame.

Regarding support for the HoloLens 2 headset, here are some useful thoughts:

As @ken-martin said in this post, OpenXR doesn’t require runtimes to support every graphics API, and the HoloLens supports OpenXR but not the OpenGL backend. The problem here is that the HoloLens has an ARM architecture and needs a UWP app to work.

What we could do to support the HoloLens:

  • Use a remote app to render VTK with OpenGL and then blit into a shared DirectX texture to stream it to the HoloLens. For that, we can follow this link to connect the HoloLens to our OpenXR-based app using the Holographic Remoting package.

  • If we don’t want to / can’t do streaming (with at least 50 ms of latency it could be impractical), we need to render all of VTK on the device. One option would be a DirectXRenderWindow, building VTK as a UWP app; however, that would mean rewriting all of our shaders, so we can rule that out. The alternative is to build VTK as a UWP app (that should be OK?), render with OpenGL, and then use a shared OpenGL/DirectX buffer on the HoloLens itself. Here is the BIG BIG problem: how can we render with OpenGL if the HoloLens hardware does not support OpenGL? Does that mean we would need to build Mesa (or another OpenGL implementation) for the headset? Is that even possible?


I’m just throwing this out there.

Would it be possible to render VTK with OpenGL ES 2.0 piped to DirectX using MS ANGLE?
https://chromium.googlesource.com/angle/angle

The original GitHub repository also had some experimental support for HoloLens.


It’s great to see OpenXR support coming in!

Just FYI, OpenVR has a limited shelf life, so I wouldn’t invest too much effort in making it work. Valve is moving, or has already moved, to OpenXR as a backend.

FYI, we currently have a grant to implement remote rendering to the HoloLens. Although this listing is currently expired, we will be re-listing it shortly.

https://uwo.bonfirehub.ca/opportunities/39136


@Paul Lafoix I was recently facing the same problem. I tried many approaches but all in vain. Thanks for doing this. You shared very useful info. :slightly_smiling_face:


Hi everyone,
Thanks for the support! I saw that you implemented a ParaView plugin to make it work with zSpace, so I have questions about how to do the same with 3D Slicer. Do you know if I can use this plugin, or some part of it, to display my 3D models with the zSpace? Since, as you said, it is not currently usable directly from VTK, I thought using the plugin would be my best shot. Thanks for the help!

Hi @MarineC,

Unfortunately, you can’t use the ParaView plugin directly inside 3D Slicer. You could, however, import the classes from the ParaView plugin (the zSpace SDK manager that interacts with the zSpace SDK, the Interactor + InteractorStyle, the Camera, and the RayActor) and adapt them for use inside a RenderWindow, but that would need some code :slight_smile:

Hi,

I have another question about your plugin; I’ll post it here but can create a new topic if you prefer!
Why didn’t you choose to use a Qt widget? That class seems to have everything needed to make a render window work with crystal eyes. I thought it would be easier this way at first, but as a newbie, I may have completely missed something :sweat_smile:

Thanks !