Selecting random points of a PolyData (Stratified sampling)

Hi Charles. Thanks for the answer, but it doesn’t work. The points are clustered over the metaphysis and sparse around the diaphysis of the bone when I use the script I posted first with these options:

maskPointsFilter.SetRandomModeType(4)  # UNIFORM_SPATIAL_SURFACE
maskPointsFilter.SetMaximumNumberOfPoints(desiredNumberOfPoints)
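
For context, here is a minimal, self-contained sketch of that pipeline, assuming a VTK 9 build where random mode type 4 (UNIFORM_SPATIAL_SURFACE) is available; the input file name and number of points are placeholders:

import vtk

reader = vtk.vtkPolyDataReader()
reader.SetFileName("bone.vtk")  # placeholder path to the surface mesh

desiredNumberOfPoints = 500  # placeholder value

maskPointsFilter = vtk.vtkMaskPoints()
maskPointsFilter.SetInputConnection(reader.GetOutputPort())
maskPointsFilter.RandomModeOn()
maskPointsFilter.SetRandomModeType(4)  # UNIFORM_SPATIAL_SURFACE (VTK 9 only)
maskPointsFilter.SetMaximumNumberOfPoints(desiredNumberOfPoints)
maskPointsFilter.Update()

print(maskPointsFilter.GetOutput().GetNumberOfPoints())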

Is your mesh a clean surface with only convex polygons?

I think so, yes. Only triangles I think.

So indeed, this is not the expected output. Which version of VTK are you using?

Can you share the dataset (or any dataset on which you observe the problem)?

Here is the bone:

I think Slicer uses VTK 8, but I’m not sure.

Unfortunately, that version of VTK does not contain the UNIFORM_* modes yet :confused:

Here is a result with ParaView using this mode:

Maybe you can use ParaView 5.9.1, 5.10.0-RC1, or VTK 9 to extract these points?

That result looks good, although I think I’ll have to wait for Slicer to update its VTK version and then try it. The idea is to launch the registration script inside Slicer or in Slicer’s Python interactor.

As the image above shows (and I have experienced this outside of VTK, too), surface-weighted random sampling produces a highly non-uniform point distribution on surface meshes: points are often clumped together in small groups, with huge empty areas between these clumps.

Can someone explain this? Is it just because the number of samples is relatively small (and the distribution would eventually become uniform when you reach hundreds of thousands of samples)?

Is there a solution in VTK that would provide a more uniform distribution for a few hundred or a few thousand points? Is there a point sampler or a post-processing filter that tries to maximize the distance between sample points on the surface?

The current Slicer version uses VTK 9. The last stable release was still built with VTK 8.

You can use vtkPoissonDiskSampler if you want evenly spaced points in your output. This filter guarantees that no two points in the output are closer to each other than Radius.

BTW. I just discovered a bug that can make the filter crash… I’ll push a fix.


Thank you for the suggestion; this point distribution looks very nice: random, but uniform.

The algorithm is pretty simple. Some people call the process “dart throwing”.

Shuffle the points from the input. Then, for each point taken in the shuffled order: if any point already in the output is closer than Radius to it, you throw it away; otherwise, you keep it in the output.
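
As an illustration, here is a plain-Python sketch of that idea (not the actual VTK code; the real filter uses a point locator so the neighbor query is fast):

import math
import random

def dart_throwing(points, radius):
    # Keep a candidate only if every already-accepted point is at least
    # `radius` away from it; otherwise throw it away.
    shuffled = list(points)
    random.shuffle(shuffled)
    accepted = []
    for p in shuffled:
        if all(math.dist(p, q) >= radius for q in accepted):
            accepted.append(p)
    return accepted

# Example: thin a random 2D point cloud so no two kept points are closer than 0.05.
cloud = [(random.random(), random.random()) for _ in range(2000)]
print(len(dart_throwing(cloud, 0.05)))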


Will this algorithm work for meshes with around 2-3 million points?
Currently the code is crashing, and I am checking whether reducing the mesh size will make it work.

No, it didn’t work for a small mesh (or point set) either.

I don’t know if it helps, but try checking whether your mesh has non-manifold edges. Maybe it doesn’t work because of that.

I also tried with a point set (i.e. points only, no edges), but it crashed.

I am also checking against the sphere test case: https://gitlab.kitware.com/vtk/vtk/-/blob/v9.1.0/Filters/Points/Testing/Cxx/TestPoissonDiskSampler.cxx

If it only crashes when you have millions of points, then the problem might be that you are running out of memory. Please run your executable in a debugger (or get a stack trace) to see where the crash happens and what the error is.
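
(If you are driving the filter from Python, one quick way to get at least a Python-level trace when the process dies with a segfault is the standard-library faulthandler module; attaching gdb to the interpreter gives the full native backtrace.)

import faulthandler
faulthandler.enable()  # dump a traceback to stderr if the process crashes hard

# ... build the vtkPoissonDiskSampler pipeline and call Update() afterwards ...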

OK, I will do that. But I wanted to mention that size is not the problem: it crashes for a small mesh as well.
I also wanted to confirm whether the input should be a mesh or a point set. According to the documentation it should take a point set, but I checked and the filter works for a polydata sphere; however, if you convert the polydata to a point set, it crashes for the same input sphere.

The following example code demonstrates this:

import vtk

def readvtk(filename):
    a = vtk.vtkPolyDataReader()
    a.SetFileName(filename)
    a.Update()
    m1 = a.GetOutput()
    return m1

sphere = vtk.vtkSphereSource()
sphere.SetThetaResolution(200)
sphere.SetPhiResolution(100)
sphere.SetRadius(1.0)
sphere.Update()
sphere = sphere.GetOutput()

points = sphere.GetPoints()


pointset = vtk.vtkPointSet()
pointset.SetPoints(points)

print(pointset.GetNumberOfPoints())

f = vtk.vtkPoissonDiskSampler()

use_pointset = True  # flag to switch between the two test cases
if use_pointset:
    print('Using pointset')
    # pointset as input will fail
    f.SetInputData(pointset)
else:
    print('Using polydata')
    # polydata as input works
    f.SetInputData(sphere)

f.SetRadius(0.1)
f.Update()

output = f.GetOutput()

print(output.GetNumberOfPoints())

Your code snippet works well for me on Windows, using VTK v9.1 (20220125). There is no crash, and it prints 751.

It works for polydata but not for a point set. There is a flag in the code snippet to test the two cases.

To rule out installation issues, I also created a Colab notebook:
https://colab.research.google.com/drive/1EayhemN5FHRFaZMbTId7U8zozFY6FQb7?usp=sharing

Also, there is one line in the documentation that was included by mistake and should be removed:
https://vtk.org/doc/nightly/html/classvtkPoissonDiskSampler.html
“generate point normals using local tangent planes”

I am also uploading a small test triangle mesh to see if vtkPoissonDiskSampler works on it.
It crashes for me.
basic_gem.vtk (866 Bytes)

The VTK-9.1.0 version you are using does not include the fix for the crash that happens when point normals are missing. You can either use a more recent VTK version or add normals to your points.
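
For the second workaround, here is a minimal sketch, assuming the input is a surface mesh like basic_gem.vtk (vtkPolyDataNormals computes the per-point normals that the unpatched filter expects; the radius is a placeholder):

import vtk

reader = vtk.vtkPolyDataReader()
reader.SetFileName("basic_gem.vtk")

normals = vtk.vtkPolyDataNormals()
normals.SetInputConnection(reader.GetOutputPort())
normals.ComputePointNormalsOn()
normals.SplittingOff()  # do not duplicate points along sharp edges

sampler = vtk.vtkPoissonDiskSampler()
sampler.SetInputConnection(normals.GetOutputPort())
sampler.SetRadius(0.1)  # placeholder radius
sampler.Update()

print(sampler.GetOutput().GetNumberOfPoints())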
