I am trying to use ResliceCursorWidget with some additional components. Currently I display additional actors (spheres) and some polydata, nothing crazy, but it is getting slower. I realized that the GPU is not being used and it seems that all the calculations are running on the CPU. I’m no expert with GPUs; is there a way to check whether the GPU is actually being used, other than looking at Task Manager? I tried Chrome and Chrome Canary, with and without --enable-unsafe-webgpu.
Also, do I need to download any specific packages to make the calculations run on the GPU (other than importing @kitware/vtk.js/Rendering/Profiles/All)?
If you go to WebGL Report, does the unmasked vendor + unmasked renderer indicate you are using your machine’s GPU?
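You can also check this programmatically instead of visiting the WebGL Report page. A minimal sketch, assuming the browser exposes the standard WEBGL_debug_renderer_info extension (the helper name describeGPU is hypothetical):

```javascript
// Return the unmasked vendor/renderer strings for a WebGL context
// when the WEBGL_debug_renderer_info extension is available,
// falling back to the masked strings otherwise. A software renderer
// such as "SwiftShader" here would mean the GPU is NOT being used.
function describeGPU(gl) {
  const info = gl.getExtension('WEBGL_debug_renderer_info');
  if (info) {
    return {
      vendor: gl.getParameter(info.UNMASKED_VENDOR_WEBGL),
      renderer: gl.getParameter(info.UNMASKED_RENDERER_WEBGL),
    };
  }
  return {
    vendor: gl.getParameter(gl.VENDOR),
    renderer: gl.getParameter(gl.RENDERER),
  };
}

// In the browser:
// const gl = document.createElement('canvas').getContext('webgl2');
// console.log(describeGPU(gl));
```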
There is a POC that implements GPU reslicing with ResliceCursorWidget.
I checked the WebGL Report and it says that I am using the GPU. I’m sorry, I might have been unclear: my main question was more about the calculations for the reslicing than about the rendering.
Thanks for the reply! I got this to work, but the rendering felt weird: it gets blurry during modification (e.g. while the mouse button is down), then clear when done. That’s not a deal breaker, but it just didn’t feel like an improvement. I guess the problem is not the rendering on the GPU but rather doing the calculations for the reslicing on the GPU. Is that POC supposed to solve that, or only the rendering?
The POC avoids doing the reslicing on the CPU; instead it does a simple GPU reslice. The “weird” feeling is due to the existing default behavior that lowers the rendering resolution during interaction. I think you can turn it off by looking at the render window interactor.
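One way to turn that off, sketched below under the assumption that the blur comes from mappers degrading quality to hit the interactor’s desiredUpdateRate during interaction (the helper name disableInteractiveQualityDrop is hypothetical; setDesiredUpdateRate/getStillUpdateRate are real vtk.js RenderWindowInteractor accessors):

```javascript
// By default the interactor's desiredUpdateRate (used while
// interacting) is much higher than its stillUpdateRate, so mappers
// trade image quality for frame rate on mouse-down. Matching the two
// rates keeps full quality during interaction.
function disableInteractiveQualityDrop(interactor) {
  interactor.setDesiredUpdateRate(interactor.getStillUpdateRate());
}

// In a vtk.js app:
// const interactor = fullScreenRenderer.getInteractor();
// disableInteractiveQualityDrop(interactor);
//
// For volume mappers you may additionally need to pin the sampling:
// mapper.setAutoAdjustSampleDistances(false);
```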
The POC also shows great performance when the number of “slabs” is high.
There are, however, still many rendering oddities that must be tackled before it is usable/mergeable.