VTK-8.2.0 install failure MacOS Mojave VTK_USE_X=On

Although I can successfully build VTK-8.2.0 with Qt and VTK_USE_COCOA=On, I need the install to use X11. When I turn VTK_USE_X=On and VTK_USE_COCOA=Off and point VTK at /opt/local/lib for the OpenGL install, I get this error message:

Undefined symbols for architecture x86_64:
"_kCFCoreFoundationVersionNumber", referenced from:
vtkOpenGLRenderer::HaveApplePrimitiveIdBug() in vtkOpenGLRenderer.cxx.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[2]: *** [lib/libvtkRenderingOpenGL2-8.2.1.dylib] Error 1
make[1]: *** [Rendering/OpenGL2/CMakeFiles/vtkRenderingOpenGL2.dir/all] Error 2
make: *** [all] Error 2

Any suggestions?


I don’t think anyone uses the combination of macOS + X11 with VTK. In other words, “here be dragons”; you are in unexplored territory. I’m sure it can be tweaked so that it works, but I don’t know how many “tweaks” are needed. Probably lots.

This software tries to.


But I have it running on my Linux server, so I guess it’s moot now.

If it’s still of interest to get it built on the Mac: I had the same error on Sierra and found that the code wants to look up that symbol to check the OS X version before applying a bug workaround. Since the workaround should no longer be necessary on 10.12 and later, simply commenting out the whole block worked for me:

--- VTK-8.2.0-orig/Rendering/OpenGL2/vtkOpenGLRenderer.cxx      2019-01-30 18:15:13.000000000 +0100
+++ VTK-8.2.0/Rendering/OpenGL2/vtkOpenGLRenderer.cxx   2020-04-11 05:54:12.000000000 +0200
@@ -766,10 +766,10 @@
     // Apple fixed this bug in OS X 10.11 beta 15A216g.
     // kCFCoreFoundationVersionNumber10_10_Max = 1199, we use the raw number
     // because the constant isn't present in older SDKs.
-    if (kCFCoreFoundationVersionNumber <= 1199)
-    {
-      this->HaveApplePrimitiveIdBugValue = true;
-    }
+    // if (kCFCoreFoundationVersionNumber <= 1199)
+    // {
+    //   this->HaveApplePrimitiveIdBugValue = true;
+    // }
     // but exclude systems we know do not have it
     std::string renderer = (const char *)glGetString(GL_RENDERER);
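For what it’s worth, the undefined symbol `_kCFCoreFoundationVersionNumber` lives in Apple’s CoreFoundation framework, which the X11 build apparently no longer pulls in the way the Cocoa build does. So an alternative that leaves the VTK source untouched might be to link CoreFoundation explicitly. A sketch of the configure step, assuming an out-of-source build directory next to the VTK-8.2.0 source tree (untested in this exact setup, so treat the flag names as a starting point):

```shell
# Configure VTK for X11 on macOS, explicitly linking CoreFoundation so the
# kCFCoreFoundationVersionNumber symbol resolves at link time.
cmake ../VTK-8.2.0 \
  -DVTK_USE_X=ON \
  -DVTK_USE_COCOA=OFF \
  -DCMAKE_SHARED_LINKER_FLAGS="-framework CoreFoundation" \
  -DCMAKE_EXE_LINKER_FLAGS="-framework CoreFoundation"
```

If that resolves the link error, it has the advantage of keeping the 10.10-and-earlier workaround in place, whereas commenting the block out assumes you will never run the build on an older system.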