Hi.
I’m loading a DICOM dataset with a VTK pipeline, volume rendering it on a remote system, and serving it with vtk.web protocols, then accessing it from a different machine. When I interact with the data, GPU utilization on the remote system is very low (only 3–4%), while the local GPU is being used at around 10%. How can I make sure rendering runs entirely on the remote server’s GPU and not on the local machine’s GPU, even though a GPU exists locally?
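For context, here is a minimal sketch of how the server-side volume rendering pipeline is set up (simplified; the DICOM directory path and transfer-function values are placeholders, not my actual settings):

```python
import vtk

# Read the DICOM series (placeholder path)
reader = vtk.vtkDICOMImageReader()
reader.SetDirectoryName("/path/to/dicom/series")
reader.Update()

# GPU ray-cast volume mapper
mapper = vtk.vtkGPUVolumeRayCastMapper()
mapper.SetInputConnection(reader.GetOutputPort())

# Placeholder transfer functions
opacity = vtk.vtkPiecewiseFunction()
opacity.AddPoint(-1000, 0.0)
opacity.AddPoint(3000, 0.8)

color = vtk.vtkColorTransferFunction()
color.AddRGBPoint(-1000, 0.0, 0.0, 0.0)
color.AddRGBPoint(3000, 1.0, 1.0, 1.0)

volume_property = vtk.vtkVolumeProperty()
volume_property.SetScalarOpacity(opacity)
volume_property.SetColor(color)
volume_property.SetInterpolationTypeToLinear()

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(volume_property)

renderer = vtk.vtkRenderer()
renderer.AddVolume(volume)

# Rendering happens server-side, so the window is off-screen
render_window = vtk.vtkRenderWindow()
render_window.SetOffScreenRendering(1)
render_window.AddRenderer(renderer)
render_window.Render()
```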
Note:
Remote GPU: Nvidia Quadro 4000
Local GPU: Nvidia GeForce GTX 1650
Dataset Size: 200 MB
Mapper: vtkGPUVolumeRayCastMapper
Web protocols used: vtkWebMouseHandler, vtkWebViewPort, and an overridden vtkWebPublishImageDelivery, as in this example - web-project-templates/vtkw-server.py at master · Kitware/web-project-templates · GitHub
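For reference, the server-side protocol registration roughly follows that vtkw-server.py template. This is a simplified sketch with illustrative names; in my actual script vtkWebPublishImageDelivery is a subclass/override rather than the stock class, and the volume pipeline shown above is what gets attached to the view:

```python
import vtk
from vtk.web import protocols as vtk_protocols
from vtk.web import wslink as vtk_wslink

class _VolumeServer(vtk_wslink.ServerProtocol):
    # Shared render window for the session
    view = None

    def initialize(self):
        # Standard vtk.web protocols, as in the vtkw-server.py template
        # (the real server registers an overridden vtkWebPublishImageDelivery here)
        self.registerVtkWebProtocol(vtk_protocols.vtkWebMouseHandler())
        self.registerVtkWebProtocol(vtk_protocols.vtkWebViewPort())
        self.registerVtkWebProtocol(vtk_protocols.vtkWebPublishImageDelivery(decode=False))

        # No image encoding on the server, matching decode=False on the client side
        self.getApplication().SetImageEncoding(0)

        # Build the pipeline only once for all sessions
        if not _VolumeServer.view:
            renderer = vtk.vtkRenderer()
            render_window = vtk.vtkRenderWindow()
            render_window.AddRenderer(renderer)
            render_window.OffScreenRenderingOn()

            # ... add the volume pipeline from the sketch above to `renderer` ...

            interactor = vtk.vtkRenderWindowInteractor()
            interactor.SetRenderWindow(render_window)
            interactor.GetInteractorStyle().SetCurrentStyleToTrackballCamera()
            interactor.EnableRenderOff()

            # Expose the render window to the vtk.web protocols
            self.getApplication().GetObjectIdMap().SetActiveObject("VIEW", render_window)
            _VolumeServer.view = render_window
```

The server is launched with wslink’s `server.start_webserver(options=args, protocol=_VolumeServer)`, as in the template.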