Why does the FlyingEdges result have more points than the original?

Hello everyone,

I have a question and was wondering if anyone could chime in. I've been working with the vtkFlyingEdges3D class, and on closer inspection I noticed that the resulting vtkPolyData has more points than the input data. Does anyone have any idea why this is, assuming it's the expected behaviour?

Something is fishy (is your scalar field continuous, i.e., not discrete? And how many contour values do you have?). What size is your volume? Running the modified test below with a single isocontour and an input volume resolution of 30^3 yields typical results:

vtkpython TestFlyingEdges3D.py
Number of points in volume: 27000
Number of points in flying edges isocontour: 3696

When you say "the scalar field is continuous", do you mean the ComputeScalars flag? If so, I do have ComputeScalarsOff() set. To be more exact, I'm feeding a vtkImageData to the filter after it has been processed by vtkImageResample and vtkImageContinuousDilate3D, then vtkImageConvolve for smoothing, before I run FlyingEdges.

vtkImageData -> vtkImageResample -> vtkImageContinuousDilate3D -> vtkFlyingEdges3D
The issue I'm seeing is that when I run the algorithm as described above, the resulting vtkPolyData has more points than when I run the same steps but feed the result to a vtkContourFilter instead:

vtkImageData -> vtkImageResample -> vtkImageContinuousDilate3D -> vtkContourFilter

Is there something I'm missing?

Okay this is clearer now.

  • ComputeScalars simply indicates that scalar data is to be computed on the surface (this scalar data is just the isocontour values). It has nothing to do with the number of output points.

  • Scalar fields can be categorized as continuously varying, or discrete (as for example a “label map”). The algorithms for these types of scalars are different, and sometimes users try to use continuously-varying algorithms such as MC or FE on discrete data - this doesn’t work well due to the problem of interpolating across discrete data.

  • As briefly alluded to in the Flying Edges paper, the output of FE and Marching Cubes (and Synchronized Templates) may differ. This is because FE may produce degenerate (i.e., zero-area) triangles, including coincident points. In visualization applications this is typically not an issue; however, in modeling applications (e.g., generating a finite element mesh or a solid model) it can be problematic. Typically I'll use a vtkStaticCleanPolyData to remove these degeneracies if necessary.

Background: The reason for this is the computational efficiencies used in FE, especially as related to parallelization. From a high-performance perspective, MC has some problematic algorithmic features, including a centralized point locator to merge coincident points (anything centralized in a parallel algorithm is typically a computing bottleneck). MC operates voxel cell by voxel cell and incrementally inserts points and triangles: any degenerate triangles are rejected at the time of insertion. FE, on the other hand, performs a precomputation to determine how much output is to be produced, allocates output structures, and then actually computes the output. This precomputation step cannot take degeneracies into account without significant performance impacts. Since FE's primary objective was a scalable, very fast single-pass isocontouring visualization algorithm, this tradeoff was deemed acceptable. (Note: I'll update the documentation for VTK's FE to be clearer.)
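To make the tradeoff concrete, here is an illustrative plain-Python sketch (not VTK internals) of the two insertion strategies: a locator-based insertion merges shared points as they arrive, while precomputed, independently emitted output keeps each triangle's corners separate:

```python
# Two triangles sharing an edge: because corners are listed per triangle,
# the shared edge's two endpoints appear twice in the raw stream.
triangles = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)],
]

def insert_with_locator(tris):
    """MC-style: a centralized locator merges coincident points on insertion.
    The locator lookup is the serialization point that hurts parallel scaling."""
    locator, points, cells = {}, [], []
    for tri in tris:
        ids = []
        for p in tri:
            if p not in locator:
                locator[p] = len(points)
                points.append(p)
            ids.append(locator[p])
        cells.append(ids)
    return points, cells

def insert_preallocated(tris):
    """FE-style: output size is precomputed, so corners are emitted
    independently and coincident points are kept (3 points per triangle)."""
    points = [p for tri in tris for p in tri]
    cells = [[3 * i, 3 * i + 1, 3 * i + 2] for i in range(len(tris))]
    return points, cells

mc_points, _ = insert_with_locator(triangles)
fe_points, _ = insert_preallocated(triangles)
print(len(mc_points), len(fe_points))  # 4 vs 6: duplicates survive without a locator
```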

Thanks for the expeditious and detailed response; it makes a lot more sense now, as I was beginning to suspect the issue might be with one of the aforementioned filters I apply before FE. I'll retest with the suggested vtkStaticCleanPolyData to see if it makes a difference. If I'm not mistaken, in the past I had an issue where running vtkStaticCleanPolyData on some poly data input lost the normals (but I may be wrong). Thanks a lot, it's really appreciated.

If you lose normals, it’s probably a bug, let us know.

I just gave it a try now, and I'd like to say thanks a lot, Will. vtkStaticCleanPolyData resolved the issue: I now see exactly the same set of points as when I use MC (vtkContourFilter) with my previously described steps. vtkStaticCleanPolyData also doesn't tamper with the normals.