So these are my current bindings; adding the click as a tertiary event was straightforward. My main thinking behind the conventions is that the A/B and X/Y buttons should act more like a mouse, while the trigger (and possibly grip) buttons handle the more VR-specific movements. Moving the menu event to the dedicated menu button made sense, and since the thumbstick is already treated like a scroll wheel, mapping its click to a middle press felt natural.
I think the thumbstick values might be expressible as separate x and y values (they were listed that way on the Oculus page), but it's not a huge dealbreaker if I can't add camera turning.
I think my main concern is probably what you said here:
A lot of interaction in VTK relies on click-drag-release interactions; the main one I'm using at the moment is the vtkImagePlaneWidget, which has operations for left, middle and right click events. With how the action system is currently set up, the release from OpenXR is registered, but it's translated to a vtkCommand without keeping track of whether the action was a press or a release, so the press event gets triggered twice. I don't know if there's a workaround such as a state-recording callback that maps that behaviour back to a proper press-release sequence (something like the sketch below).
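The following is a minimal, untested sketch of what I mean by a state-recording callback. It assumes the controller action can be routed to an event id that nothing else consumes (I'm using vtkCommand::UserEvent purely as a stand-in for that), and that the action fires exactly once on press and once on release; the callback then toggles a flag so the first invocation is forwarded as a real press and the second as the matching release, which is what vtkImagePlaneWidget expects from the interactor.

```cpp
// Minimal sketch of a state-recording press/release callback (untested).
// Assumptions:
//  - the controller action can be routed to an event nothing else listens to
//    (vtkCommand::UserEvent below is only a stand-in for that);
//  - the action fires exactly once on press and once on release.
#include <vtkCallbackCommand.h>
#include <vtkCommand.h>
#include <vtkNew.h>
#include <vtkRenderWindowInteractor.h>

namespace
{
struct ButtonState
{
  bool Down = false; // remembers whether we last forwarded a press
};

void TogglePressRelease(vtkObject* caller, unsigned long /*eventId*/,
                        void* clientData, void* /*callData*/)
{
  auto* state = static_cast<ButtonState*>(clientData);
  auto* interactor = vtkRenderWindowInteractor::SafeDownCast(caller);
  if (!state || !interactor)
  {
    return;
  }

  // Flip the recorded state and forward the matching mouse event, so the
  // widget sees a press followed by a release instead of two presses.
  // (The interactor's event position would probably also need updating
  // from the controller pose before forwarding.)
  state->Down = !state->Down;
  interactor->InvokeEvent(state->Down ? vtkCommand::LeftButtonPressEvent
                                      : vtkCommand::LeftButtonReleaseEvent);
}
} // namespace

// Hook-up, given an existing interactor:
//   static ButtonState state;
//   vtkNew<vtkCallbackCommand> toggle;
//   toggle->SetCallback(TogglePressRelease);
//   toggle->SetClientData(&state);
//   interactor->AddObserver(vtkCommand::UserEvent, toggle);
```

If the action system can only emit the fixed mouse events, the same idea might still work by observing that event directly with a re-entrancy guard, but I haven't tried it.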
vtk_openxr_actions.json (2.7 KB)
vtk_openxr_binding_oculus_touch_controller.json (2.8 KB)