OpenXR and the Oculus Quest 2

So I’ve revisited using VTK in VR on the Quest, this time inspired by developments in the recent blog post and some support from @LucasGandel. I’ve produced a small example to test the VR capabilities. In general the functionality is really good, but the interaction style could do with some tweaking to properly conform to modern VR conventions.


import sys

import vtk

class bcolors:
    HEADER = '\033[95m'
    OKBLUE = '\033[94m'
    OKCYAN = '\033[96m'
    OKGREEN = '\033[92m'
    WARNING = '\033[93m'
    FAIL = '\033[91m'
    ENDC = '\033[0m'
    BOLD = '\033[1m'
    UNDERLINE = '\033[4m'

try:
    from vtkmodules.vtkRenderingOpenXR import (
        vtkOpenXRRenderer,
        vtkOpenXRRenderWindow,
        vtkOpenXRRenderWindowInteractor,
    )
except ImportError:
    print(bcolors.FAIL + "vtkOpenXR modules not found. Please ensure you have the vtkOpenXR module installed." + bcolors.ENDC)
    print(bcolors.WARNING + "These modules are currently only found on Windows" + bcolors.ENDC)
    print(bcolors.WARNING + "and require VTK to be built with OpenXR support." + bcolors.ENDC)
    print(bcolors.WARNING + "pip install --extra-index-url https://wheels.vtk.org vtk --pre --no-cache" + bcolors.ENDC)
    sys.exit(1)

def create_VR_renderer(background_color=(0.03, 0.03, 0.15)):

    # Create the renderer, render window and interactor
    ren = vtkOpenXRRenderer()
    renwin = vtkOpenXRRenderWindow()
    iren = vtkOpenXRRenderWindowInteractor()

    renwin.SetInteractor(iren)
    renwin.AddRenderer(ren)
    ren.SetBackground(*background_color)  # Dark blue by default
    ren.SetShowFloor(True)
    ren.SetShowLeftMarker(True)
    ren.SetShowRightMarker(True)
    renwin.SetPhysicalViewUp(0, 0, 1)  # Set the physical view up direction
    renwin.SetPhysicalViewDirection(0, 1, 0)  # Set the physical view direction
    renwin.RenderModels()  # Render the controller models
    renwin.SetPhysicalScale(1.0)  # Set the physical scale for the scene, seems to get overridden later
    renwin.SetBaseStationVisibility(True)
    iren.SetDesiredUpdateRate(72.0)  # Matches the 72 Hz refresh rate of the Quest 2
    renwin.SetDesiredUpdateRate(72.0)

    # Set the path to the directory containing openxr_controllermodels.json
    renwin.SetModelsManifestDirectory('C:/VR/')

    # Set the path to the directory containing vtk_openxr_actions.json
    iren.SetActionManifestDirectory('C:/VR/')
    iren.SetRenderWindow(renwin)

    return ren, renwin, iren

def main():
 
    ren, renwin, iren = create_VR_renderer(background_color=[0.03, 0.03, 0.15])

    menu = make_vrmenu_widget(ren, iren)
    panel = make_VR_panel_widget(ren, iren)

    ren.AddActor(create_cone_actor())

    # Initialize the OpenXR renderer and interactor
    iren.Initialize()
    renwin.SetPhysicalScale(1.0)  # Re-set the physical scale; it seems to get reset after adding actors
    renwin.Render()

    iren.Start()

def make_vrmenu_widget(ren, iren):

    vrmenu_widget = vtk.vtkVRMenuWidget()

    rep = vtk.vtkVRMenuRepresentation()
    rep.VisibilityOn()
    rep.SetRenderer(ren)
    rep.SetNeedToRender(True)

    vrmenu_widget.SetRepresentation(rep)

    vrmenu_widget.SetInteractor(iren)
    vrmenu_widget.SetCurrentRenderer(ren)
    vrmenu_widget.SetDefaultRenderer(ren)
    vrmenu_widget.SetEnabled(True)
    vrmenu_widget.On()

    # print(vrmenu_widget, dir(vrmenu_widget))
    # print("VR Menu Widget created with representation:", rep)
    # print(sorted(dir(vrmenu_widget)), sorted(dir(rep)))

    return vrmenu_widget


def make_VR_panel_widget(ren, iren):

    VR_panel = vtk.vtkVRPanelWidget()
    rep = vtk.vtkVRPanelRepresentation()
    rep.VisibilityOn()
    rep.SetRenderer(ren)
    rep.SetNeedToRender(True)
    rep.SetAllowAdjustment(True)
    rep.SetText("VR Panel test")

    rep.PlaceWidget([0, 1, 0.5, 1.5, -3, -2])  # Set the position of the panel in the VR space
    VR_panel.SetRepresentation(rep)
    VR_panel.SetInteractor(iren)
    VR_panel.SetCurrentRenderer(ren)
    VR_panel.SetDefaultRenderer(ren)
    VR_panel.SetEnabled(True)
    VR_panel.ProcessEventsOn()
    VR_panel.SetManagesCursor(True)  # Manage the cursor visibility in VR
    VR_panel.SetDebug(True)
    VR_panel.On()

    # print("VR Panel Widget created with representation:", rep)
    # print(sorted(dir(VR_panel)), sorted(dir(rep)))

    return VR_panel



def create_cone_actor():

    coneSource = vtk.vtkConeSource()
    coneSource.SetHeight(3.0)
    coneSource.SetRadius(1.0)
    coneSource.SetResolution(600)
    coneSource.Update()
    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(coneSource.GetOutputPort())
    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    # actor.GetProperty().SetColor(colors.GetColor3d("Banana").GetData())

    return actor

if __name__ == "__main__":
    main()

Currently I’m working on understanding how the VR interaction works. The existing implementation is a little dated (it seems designed for Vive wands and one-handed control rather than knuckle-style controllers), so it would be good to have more intuitive interactions that make better use of the grip buttons and joystick.

My current use case revolves around using both a vtkBoxWidget2 and a vtkImagePlaneWidget to control a reconstruction in 3D space. The box widget works great in the ‘Grab mode’ of the VR interaction, but interaction with the vtkImagePlaneWidget is relatively poor (no cursor support, slicing, setting the scale, etc.)
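
For reference, here’s roughly how the box widget side of this is set up (a minimal sketch; the bounds are placeholders for the reconstruction volume, and ren/iren are the objects returned by create_VR_renderer() above):

def make_box_widget(ren, iren):

    box = vtk.vtkBoxWidget2()
    rep = vtk.vtkBoxRepresentation()
    rep.SetPlaceFactor(1.0)
    rep.PlaceWidget([-1, 1, -1, 1, -1, 1])  # Placeholder bounds for the reconstruction volume
    box.SetRepresentation(rep)
    box.SetInteractor(iren)
    box.SetCurrentRenderer(ren)
    box.On()

    return box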

It’d be nice to have the grab action bound only to the grip button, whilst the (X/Y/A/B) buttons could emulate left/right mouse click events and the click of the joystick could count as a middle-click event. Left/right on the joystick currently does nothing, and neither does the dedicated menu button on the left controller.

It looks like the movement style can be changed; there’s a class named GroundMovement3D, but I still need to figure out how to actually use it.

It’s nice that you can enable the floor in the VR scene to help with VR comfort, but I’d suggest that lighting be disabled for that object: at oblique angles the floor goes completely black, which reduces its usefulness.

For some reason I had to re-set the physical world scale after adding the actors; not a major issue though.

I still need to figure out how best to add UI elements into the scene; VRMenu and VRPanel look promising. Interestingly, 2D objects in VTK like vtkBorderWidget are still displayed, which might work as a HUD with some tweaking, though I’m not sure they handle the left/right eye perspective properly. If I can figure out how to attach a VRPanel to the controller positions then I’ll try to make a UI that way.
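
As a quick test of the HUD idea, something like this displays a border overlay (a minimal sketch; the normalized viewport coordinates are arbitrary, and I haven’t checked how it’s composited per-eye):

def make_hud_widget(iren):

    border = vtk.vtkBorderWidget()
    rep = vtk.vtkBorderRepresentation()
    rep.SetPosition(0.05, 0.85)   # Normalized viewport coordinates of the lower-left corner
    rep.SetPosition2(0.25, 0.1)   # Width and height, also normalized
    rep.SetShowBorderToOn()
    border.SetRepresentation(rep)
    border.SetInteractor(iren)
    border.SelectableOff()
    border.On()

    return border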

In terms of hardware I’m using a Quest 2 with a PrismXR Puppis S1 router to reduce latency. The GPU is an RTX 5000, but I can run with an A6000 if needed. Air Link works reasonably well, but I’ve found Virtual Desktop to be more stable with their VDXR implementation. Virtual Desktop also allows for hand tracking, which is neat, though not super useful without the joystick.

Just thought I’d document my initial findings in case it was useful to improving the functionality in the future.

Thanks for taking the time to write this up and providing your complete sample application; it will be useful as we work on improvements in the future!

Regarding a “walk mode” for scenes without verticality: vtkVRInteractorStyle.h does have some API to set the movement style. The options are “FLY_STYLE” and “GROUNDED_STYLE”; I’m guessing the grounded one may behave like you want, though I haven’t tried it.

Scott

So I just figured that part out and you’re completely right: there is already grounded and flying support. You can set the style by calling iren.GetInteractorStyle().SetStyle(iren.GetInteractorStyle().GROUNDED_STYLE).

I’m also looking into how the json file for the Oculus controllers can add the events I need; I likely just need to set up the json events to correspond to some appropriate observers in the interactor style.

> So I just figured that part out and you’re completely right: there is already grounded and flying support. You can set the style by calling iren.GetInteractorStyle().SetStyle(iren.GetInteractorStyle().GROUNDED_STYLE).

Good to know, thanks for reporting.

> I’m also looking into how the json file for the Oculus controllers can add the events I need; I likely just need to set up the json events to correspond to some appropriate observers in the interactor style.

Yes, you may need to update the actions.json and then update or add new controller bindings json. Please continue to post here regarding your successes or obstacles 🙂

Scott

So I’ve made progress on a few fronts. Adding some custom actions to vtk_openxr_actions.json and changing the corresponding bindings in the Oculus controller json allows the interactor to register a set of new actions using iren.AddAction(); I then added some dummy observers to check that the actions were being processed.
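
For anyone curious, the Python side looks roughly like this (a sketch; the action name is my own convention and has to match an entry in vtk_openxr_actions.json, and I’m assuming the AddAction(path, eventId) overload is the one exposed in Python):

def add_dummy_action(iren):

    def on_primary(caller, event):
        print("PrimaryButtonAction fired:", event)  # Dummy observer just to confirm the action arrives

    # Map the manifest action onto an existing VTK event id...
    iren.AddAction("primarybuttonaction", vtk.vtkCommand.LeftButtonPressEvent)
    # ...then watch for that event on the interactor.
    iren.AddObserver(vtk.vtkCommand.LeftButtonPressEvent, on_primary)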

Currently I’m just using a convention (PrimaryButtonAction, SecondaryButtonAction) to describe the (B/A and Y/X) button pairs; having the top buttons (B & Y) be the primary action felt more natural, if a little confusing in the bindings. I moved the showmenu action to /user/hand/left/input/menu, which works great. In general the right-hand menu button should be reserved for the Oculus system, so I’ve left that one unbound.

Something that’s not clear to me is how to properly emulate the LeftButtonReleaseEvent for the buttons. Using a dummy observer I can see that VR button releases also just trigger the LeftButtonPressEvent. Looking at the C++ method HandleBooleanAction, it seems like the press state is recorded, but I don’t know how to pass that information along.

I added the thumbstick click as a middle button press; that was actually pretty easy to do. @Lucas.Givord it looks like there’s some discussion in one of the pull requests for the VTK bindings about the difference between input/a/click and input/a/touch. I’m pretty sure these two come from the fact that the Oculus controllers have capacitive buttons that sense when your thumb rests on them without pressing. There’s also a small capacitive pad on each controller that is sometimes used to show where your thumb is in some games; largely the /touch actions are there to give the user a little visual feedback rather than to provide actual input.

Fixing the issue with the floor lighting was simple: I just looked at which actors were added to the renderer immediately after calling the SetShowFloor() method and set their ambient and specular lighting to something more appropriate.
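
In code, the workaround looks something like this (a sketch; it assumes the floor actor(s) are whatever SetShowFloor() appends to the renderer’s actor collection, and the property values are just what looked reasonable to me):

def enable_floor_with_flat_lighting(ren):

    n_before = ren.GetActors().GetNumberOfItems()
    ren.SetShowFloor(True)
    actors = ren.GetActors()
    for i in range(n_before, actors.GetNumberOfItems()):
        prop = actors.GetItemAsObject(i).GetProperty()
        prop.SetAmbient(1.0)   # Rely on ambient light so oblique angles don't go black
        prop.SetDiffuse(0.0)
        prop.SetSpecular(0.0)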

For the controller models I did something similar to what was mentioned in the blog post: I took the models from SteamVR and converted them to glTFs. I did this in Blender (more familiar for me) and needed to delete the hierarchy to get them to display properly. I get a warning about KHR_materials_specular & ior not being supported, but they seem to work fine.
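
Roughly what I did in Blender’s Python console (a sketch from memory; the paths are placeholders, and the import operator depends on your Blender version — older versions use bpy.ops.import_scene.obj):

import bpy

# Import one of the SteamVR controller models
bpy.ops.wm.obj_import(filepath='C:/VR/models/controller.obj')

# Flatten the hierarchy: select everything and clear the parents, keeping transforms
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.parent_clear(type='CLEAR_KEEP_TRANSFORM')

# Export as glTF for VTK to load
bpy.ops.export_scene.gltf(filepath='C:/VR/models/controller.gltf', export_format='GLTF_SEPARATE')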

> So I’ve made progress on a few fronts. Adding some custom actions to vtk_openxr_actions.json and changing the corresponding bindings in the Oculus controller json allows the interactor to register a set of new actions using iren.AddAction(); I then added some dummy observers to check that the actions were being processed.
>
> Currently I’m just using a convention (PrimaryButtonAction, SecondaryButtonAction) to describe the (B/A and Y/X) button pairs; having the top buttons (B & Y) be the primary action felt more natural, if a little confusing in the bindings. I moved the showmenu action to /user/hand/left/input/menu, which works great. In general the right-hand menu button should be reserved for the Oculus system, so I’ve left that one unbound.

Ok, good to know you found a way to verify the new actions are working. If you feel like sharing the changes to the actions and your new bindings, it could help to illustrate the clunky things better.

> Something that’s not clear to me is how to properly emulate the LeftButtonReleaseEvent for the buttons. Using a dummy observer I can see that VR button releases also just trigger the LeftButtonPressEvent. Looking at the C++ method HandleBooleanAction, it seems like the press state is recorded, but I don’t know how to pass that information along.

It might just be a limitation currently that VTK can’t properly detect release? I’m not sure, but we’ll keep it in mind next time we’re in there working on things.

> I’m also unsure how the joystick click is handled; presumably its entry is missing in the json.

IIRC there is some custom C++ code handling joystick events in a special way, though I’m not sure why that was needed. But it probably means you can’t configure joystick events/handling using the json, unfortunately.

> Fixing the issue with the floor lighting was simple: I just looked at which actors were added to the renderer immediately after calling the SetShowFloor() method and set their ambient and specular lighting to something more appropriate.

So you have something looking ok for now, but it still seems desirable to have some easier way to enable/disable lighting on the floor.

> For the controller models I did something similar to what was mentioned in the blog post: I took the models from SteamVR and converted them to glTFs. I did this in Blender (more familiar for me) and needed to delete the hierarchy to get them to display properly. I get a warning about KHR_materials_specular & ior not being supported, but they seem to work fine.

That sounds familiar; I think I saw something similar when adding the existing models. I think I just removed those bits from the glTF manually to get rid of the warning, but maybe my memory is failing me.

Thanks again for all of the valuable feedback!

Cheers,
Scott

So these are my current bindings; I added in the thumbstick click as a tertiary event pretty easily. My main thinking behind the conventions is that the (A/B) and (X/Y) buttons should act more like a mouse whilst the trigger and possibly the grip buttons handle the more specific VR movements. Moving the menu event to the dedicated menu button made sense, and the thumbstick is already treated like a scroll wheel, so mapping its click to a middle press felt pretty natural.

I think the thumbstick values might be expressible as separate x & y values (it was on the Oculus page), but it’s not a huge dealbreaker if I can’t add in camera turning.

I think my main concern is probably what you said here:

> It might just be a limitation currently that VTK can’t properly detect release?

A lot of interaction in VTK relies on (click, drag, release) interactions; the main one I’m using at the moment is the vtkImagePlaneWidget, which has operations for left, middle and right click events. With how the action system is currently set up, the release from OpenXR is registered, but it’s then translated to a vtkCommand without keeping track of whether the action was a press or a release, so the press event gets triggered twice. I don’t know if there’s a workaround to build a state-recording callback that just maps that behaviour onto a proper press-release behaviour.
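
Something like this is what I have in mind, though I haven’t tried it yet (it assumes the double-firing behaviour described above, and uses the observer abort flag to stop the second press from reaching the interactor style):

class PressReleaseAdapter:

    def __init__(self, iren):
        self.iren = iren
        self.pressed = False
        # High priority so this observer runs before the interactor style
        self.tag = iren.AddObserver(vtk.vtkCommand.LeftButtonPressEvent, self.on_press, 1.0)

    def on_press(self, caller, event):
        if self.pressed:
            # Second firing of the pair: swallow the bogus press and re-emit it as a release
            self.pressed = False
            caller.GetCommand(self.tag).SetAbortFlag(1)
            self.iren.InvokeEvent(vtk.vtkCommand.LeftButtonReleaseEvent)
        else:
            self.pressed = True  # First firing: let the press through as normal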

vtk_openxr_actions.json (2.7 KB)
vtk_openxr_binding_oculus_touch_controller.json (2.8 KB)