I need to determine whether points from one polydata project onto cells in another polydata.
I could use vtkOBBTree.IntersectWithLine() but I’m concerned about performance.
It seems that GPU-based ray casting could solve the problem with speed, and I see that VTK supports GPU ray casting via vtkGPUVolumeRayCastMapper, vtkOSPRayPolyDataMapperNode, and others, but I can’t figure out how I could leverage these classes to do what I want. Basically, I don’t want to render using ray casting; I just want to project a bunch of rays emanating from the points in one polydata, along their normals, and figure out which cell (if any) they intersect in the other polydata.
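For context, each vtkOBBTree.IntersectWithLine() call answers one ray query of exactly this kind. The pure-NumPy sketch below (Möller–Trumbore intersection, brute force over every triangle; the function names are illustrative, not from any library) shows what that per-point query computes, and why some acceleration structure matters when both meshes are large:

```python
import numpy as np

def ray_triangle_hit(origin, direction, v0, v1, v2, eps=1e-9):
    # Moller-Trumbore: solve origin + t*direction = a barycentric point on the triangle
    e1, e2 = v1 - v0, v2 - v0
    h = np.cross(direction, e2)
    a = np.dot(e1, h)
    if abs(a) < eps:                  # ray is parallel to the triangle plane
        return None
    f = 1.0 / a
    s = origin - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = f * np.dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * np.dot(e2, q)
    return t if t > eps else None     # only accept hits in front of the origin

def project_point(origin, normal, triangles):
    # Brute-force nearest-hit query: returns (cell index, distance) or (None, inf)
    best = (None, np.inf)
    for i, (v0, v1, v2) in enumerate(triangles):
        t = ray_triangle_hit(origin, normal, v0, v1, v2)
        if t is not None and t < best[1]:
            best = (i, t)
    return best
```

An OBB tree (or Embree's BVH) replaces the linear scan in `project_point` with hierarchical pruning per ray, which is where the speedup comes from.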
Any ideas, hints, or examples on how to do this will be gratefully received!
It seems super easy to do this using Intel Embree, which conveniently has Python bindings, so unless I’m missing an obvious solution in VTK I’m leaning towards this approach instead.
During my graduate years I was dealing with the same problem when I needed ray tracing for global ambient occlusion. For CPU, I had to go with IGL, which integrates Embree, and for GPU I had to go with NVIDIA OptiX. I don’t know if Embree now supports GPU as well. But it would be really interesting to add CPU support using Embree and build a structure like vtkOBBTree, or GPU support using OptiX inside VTK-m, and build a similar structure.
@cory.quammen @will.schroeder what do you think?
As far as I can tell, Embree currently only supports CPU, but will (or maybe has by now) add support for Intel Arc hardware-based ray tracing.
This effort is to accelerate vtkPointLocator with Embree. It’s not immediately obvious to me that this would directly enable point projection?
Indeed, it appears to not be useful directly, but could be a basis for implementing a ray casting filter in VTK based on Embree. I’m afraid I’m not aware of anything in or planned for VTK that provides the functionality you are looking for.
I quickly tested Embree for point projection onto polydata and it was very easy to do. I strip the points, tris, and quads out of the target polydata as numpy arrays, construct Embree geometry directly from these arrays, and add them to an Embree Scene. Tris and quads have to be in separate geometries.
For testing I use another polydata and extract the points and normals as numpy arrays, and use them to directly construct an Embree RayHit object. For my test I only use the normals, but I should really use the reversed normals too.
Then I intersect the RayHit with the Scene and the intersection results are written back to the RayHit: the index of the cell that was hit, the uv location on that cell, the global location of the intersection, and the distance from the source to the intersection.
I’m using the method that terminates after the first intersection and an infinite ray length, but there are other options, including all intersections, closest point, etc.
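As a rough picture of what this batched query returns (not the actual Embree API — the output names here just mirror the RayHit fields described above), a vectorized NumPy equivalent of casting one ray per source point along its normal and keeping the nearest hit might look like:

```python
import numpy as np

def cast_rays(origins, directions, v0, v1, v2, eps=1e-9):
    """Nearest-hit query for a batch of rays against a triangle soup.
    origins, directions: (R, 3) arrays; v0, v1, v2: (T, 3) per-triangle vertices.
    Returns per-ray arrays loosely mirroring an Embree RayHit:
    prim_id (-1 = miss), tfar (hit distance, inf = miss), and the
    barycentric u, v coordinates on the hit cell."""
    n_rays = len(origins)
    prim_id = np.full(n_rays, -1)
    tfar = np.full(n_rays, np.inf)
    u_out = np.zeros(n_rays)
    v_out = np.zeros(n_rays)
    e1, e2 = v1 - v0, v2 - v0                        # (T, 3) triangle edges
    for r in range(n_rays):                          # Moller-Trumbore, vectorized over triangles
        d = directions[r]
        h = np.cross(d, e2)
        a = (e1 * h).sum(axis=1)
        valid = np.abs(a) > eps                      # skip triangles parallel to the ray
        f = 1.0 / np.where(valid, a, 1.0)
        s = origins[r] - v0
        u = f * (s * h).sum(axis=1)
        q = np.cross(s, e1)
        v = f * (q @ d)
        t = f * (e2 * q).sum(axis=1)
        hit = valid & (u >= 0) & (v >= 0) & (u + v <= 1) & (t > eps)
        if hit.any():
            i = np.where(hit)[0][np.argmin(t[hit])]  # nearest hit = smallest t
            prim_id[r], tfar[r] = i, t[i]
            u_out[r], v_out[r] = u[i], v[i]
    return prim_id, tfar, u_out, v_out
```

Note that `tfar` is in units of the direction vectors, so normalize the normals first if you want distances in world units.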
The only issue I have at the moment is that rays originating from a location that lies ON a cell are ignored, which isn’t good for my application. There does seem to be an Embree option to disable this behavior, but it’s a compile-time option…
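One workaround that avoids recompiling Embree (assuming your binding doesn’t expose the ray’s tnear field) is to start each ray a tiny epsilon along its own direction, so an origin lying exactly on a cell no longer registers as a self-hit. A minimal sketch, with an illustrative helper name:

```python
import numpy as np

def offset_origins(origins, directions, eps=1e-6):
    """Nudge each ray origin a small distance along its (normalized) direction
    so rays starting exactly ON a cell don't get discarded as self-hits."""
    o = np.asarray(origins, dtype=float)
    d = np.asarray(directions, dtype=float)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    return o + eps * d
```

Two caveats: the reported hit distances are then measured from the offset origins (add eps back if it matters), and eps has to be tuned to the mesh scale, since too large a value can skip thin features near the source surface.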
The next release of Embree will support GPUs, starting with Intel’s.
Exactly. The PointLocator seemed to me to be an ideal first use case to demonstrate the utility of applying optimized RT capabilities for general processing in VTK, but I’ve got a couple of others on my wish list to try later. I am excited to learn about this case too and am hoping that Embree as an optional part of VTK will be helpful here.