I have two renderers: one for a VR application and one for a standard 3D view in a Qt GUI. I want to visualize each renderer's vtkCamera via a vtkCameraActor and display it in the other renderer.
The two applications run in their own QThreads, and the goal is to show the VR user the location of the desktop user's camera and vice versa.
For a short time both vtkCameraActors move correctly in both renderers, but then the application crashes with an exception in vtkAbstractTransform.cxx, in the method void vtkTransformConcatenation::Concatenate(const double elements), because this->PreMatrix was nullptr.
Is there a particular technique that should be followed when displaying one renderer's camera in another renderer, especially across threads?
Thanks to all!