I have a question and was hoping someone could chime in. I've been working with the FlyingEdges3D class, and on closer inspection I noticed that the resulting vtkPolyData has more points than the input data. Does anyone have an idea why this is, or whether it's the expected behaviour?
Something is fishy. Is your scalar field continuous (i.e., not discrete)? How many contour values do you have, and what size is your volume? Running the modified test below with a single isocontour and an input volume resolution of 30^3 yields something typical:
vtkpython TestFlyingEdges3D.py
Number of points in volume: 27000
Number of points in flying edges isocontour: 3696
When you say "scalar field is continuous", do you mean the ComputeScalars flag? If so, I do have ComputeScalarsOff() set. To be more exact, I'm feeding vtkImageData to the filter after it has been processed by vtkImageResample, vtkImageContinuousDilate3D, and then vtkImageConvolve (for smoothing purposes) before I run Flying Edges.
vtkImageData -> vtkImageResample -> vtkImageContinuousDilate3D -> vtkImageConvolve -> vtkFlyingEdges3D
The issue I'm seeing is that when I run the pipeline described above, the resulting vtkPolyData has more points than when I run the same steps but feed the result to a vtkContourFilter instead.
ComputeScalars simply indicates that scalar data is to be computed on the surface (this scalar data is just the isocontour values). It has nothing to do with the number of output points.
Scalar fields can be categorized as continuously varying or discrete (for example, a "label map"). The algorithms for these two types of scalars are different, and sometimes users try to apply algorithms meant for continuously varying data, such as MC or FE, to discrete data. This doesn't work well due to the problem of interpolating across discrete data.
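A tiny numeric illustration of why interpolating across discrete labels fails (my own toy example, not VTK code): linear interpolation between two label ids treats them as magnitudes and finds "crossings" between unrelated labels.

```python
# Two adjacent voxels carrying discrete label ids (e.g. "liver" = 1,
# "bone" = 5). An isocontouring algorithm that linearly interpolates
# will happily locate a crossing for iso = 2.5 between them, even
# though no label 2.5 exists and the two regions are unrelated.
a, b = 1, 5
iso = 2.5
t = (iso - a) / (b - a)   # parametric crossing location along the edge
print(t)                  # 0.375 -- a geometrically valid, semantically meaningless crossing
```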
As briefly alluded to in the Flying Edges paper, the output of FE and Marching Cubes (and Synchronized Templates) may differ. This is because FE may produce degenerate (i.e., zero-area) triangles, including coincident points. In visualization applications this is typically not an issue; however, in modeling applications (e.g., generating a finite element mesh or a solid model) it can be problematic. Typically I'll use a vtkStaticCleanPolyData to remove these degeneracies if necessary.
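Conceptually, the cleaning step boils down to merging exactly-coincident points and then dropping any triangle that collapses once its points are merged. A minimal plain-Python sketch of that idea (this is an illustration of what a filter like vtkStaticCleanPolyData effectively does, not its actual implementation or API):

```python
def clean_mesh(points, triangles):
    """points: list of (x, y, z) tuples; triangles: list of (i, j, k) index triples."""
    remap = {}        # old point index -> new point index
    unique_pts = []
    seen = {}         # coordinate tuple -> new index
    for old, p in enumerate(points):
        if p not in seen:
            seen[p] = len(unique_pts)
            unique_pts.append(p)
        remap[old] = seen[p]

    cleaned = []
    for i, j, k in triangles:
        a, b, c = remap[i], remap[j], remap[k]
        # after merging, a triangle is degenerate if any two indices coincide
        if a != b and b != c and a != c:
            cleaned.append((a, b, c))
    return unique_pts, cleaned

# Points 1 and 2 are coincident; the second triangle becomes degenerate.
pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tris = [(0, 1, 3), (0, 1, 2)]
new_pts, new_tris = clean_mesh(pts, tris)
print(len(new_pts), len(new_tris))   # 3 1
```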
Background: The reason for this is the computational efficiencies used in FE, especially as related to parallelization. From a high-performance perspective, MC has some problematic algorithmic features, including a centralized point locator to merge coincident points (anything centralized in a parallel algorithm is typically a computing bottleneck). MC operates voxel cell by voxel cell and incrementally inserts points and triangles: any degenerate triangles are rejected at insertion time. FE, on the other hand, performs a precomputation to determine how much output will be produced, allocates the output structures, and then actually computes the output. This precomputation step cannot take degeneracies into account without significant performance impact. Since FE's primary objective was a scalable, very fast, single-pass isocontouring visualization algorithm, this tradeoff was deemed acceptable. (Note: I'll update the documentation for VTK's FE to be clearer.)
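The "count first, then produce" pattern can be sketched in a few lines of plain Python (a toy 1D/2D illustration of the idea only, not VTK's implementation): pass 1 counts the output each row will generate, a prefix sum turns counts into write offsets, and pass 2 writes each row's output into a preallocated array independently. Because rows never coordinate through a shared locator, duplicate points between rows cannot be merged during generation.

```python
from itertools import accumulate

def count_crossings(row, iso):
    # Pass 1: count isovalue crossings along one row of samples.
    return sum((a < iso) != (b < iso) for a, b in zip(row, row[1:]))

def extract_points(rows, iso):
    counts = [count_crossings(r, iso) for r in rows]   # pass 1 (parallelizable)
    offsets = [0] + list(accumulate(counts))           # prefix sum -> write offsets
    out = [None] * offsets[-1]                         # allocate output once
    for r, row in enumerate(rows):                     # pass 2 (parallelizable)
        w = offsets[r]
        for x, (a, b) in enumerate(zip(row, row[1:])):
            if (a < iso) != (b < iso):
                t = (iso - a) / (b - a)                # interpolate crossing location
                out[w] = (x + t, r)
                w += 1
    return out

pts = extract_points([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0]], 0.5)
print(len(pts))   # 4
```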
Thanks for the quick and detailed response; it makes a lot more sense now, as I was beginning to suspect one of the aforementioned filters I apply before FE. I'll retest with the suggested vtkStaticCleanPolyData to see if it makes any difference. If I'm not mistaken, in the past I had an issue where running vtkStaticCleanPolyData on some poly data input caused me to lose the normals (but I may be wrong). Thanks a lot, it's really appreciated.
I just gave it a try, and I'd like to say thanks a lot, Will. vtkStaticCleanPolyData resolved the issue: I now see exactly the same set of points as I expected when using MC (vtkContourFilter) with the previously described steps. The clean filter also doesn't tamper with the normals.