TL;DR: Does anyone have an example where a vtkFramebuffer is used to render to a texture that is then used by another actor?
I’m new to vtk.js. I would like to render my polyData to a texture, then run some filtering on that texture and display it on a square over a SliceImage.
I first tried to create my own mapper and its corresponding OpenGL mapper through the viewNodeFactory: render to a vtkFrameBuffer, then display the texture with my own shaders (a basic textured square). But I think I messed up some GL state, because once my shader program and my own vertex buffers are in use, other mappers start reporting WebGL errors about wrong uniform locations.
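For context, this is the kind of raw-WebGL state save/restore I was attempting around my own draw call (a simplified sketch; `gl`, `myProgram`, and `drawMyQuad` come from my own mapper code, and I suspect I’m not restoring everything vtk.js expects):

```javascript
// Save whatever program/buffer vtk.js currently has bound...
const prevProgram = gl.getParameter(gl.CURRENT_PROGRAM);
const prevArrayBuffer = gl.getParameter(gl.ARRAY_BUFFER_BINDING);

gl.useProgram(myProgram); // my own shader program
drawMyQuad(gl);           // binds my own vertex buffers and draws the quad

// ...and restore it afterwards, so the other mappers are not
// left running with my program/buffers still bound.
gl.useProgram(prevProgram);
gl.bindBuffer(gl.ARRAY_BUFFER, prevArrayBuffer);
```

Even with this, the uniform-location errors show up, so I assume vtk.js caches more state than just the current program and array buffer.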
Then I tried to create a custom RenderPass that renders my object, inserted in first position; the second pass is the generic ForwardPass, which renders my textured square as a PolyDataMapper.
The problem is that I don’t know how to “send” the rendered texture to the vtkOpenGLPolyDataMapper, since that mapper is not yet created when I do my whole setup.
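For reference, here is roughly what my custom pass looks like (heavily simplified; delegating to a nested ForwardPass and the `getFramebuffer()` accessor are my own guesses at how this should be wired, not verified API usage):

```javascript
import macro from '@kitware/vtk.js/macros';
import vtkRenderPass from '@kitware/vtk.js/Rendering/SceneGraph/RenderPass';
import vtkForwardPass from '@kitware/vtk.js/Rendering/OpenGL/ForwardPass';
import vtkOpenGLFramebuffer from '@kitware/vtk.js/Rendering/OpenGL/Framebuffer';

function vtkMyTexturePass(publicAPI, model) {
  model.classHierarchy.push('vtkMyTexturePass');

  publicAPI.traverse = (viewNode, parent = null) => {
    const size = viewNode.getSize();

    // Lazily create an offscreen framebuffer at the window size.
    if (!model.framebuffer) {
      model.framebuffer = vtkOpenGLFramebuffer.newInstance();
      model.framebuffer.setOpenGLRenderWindow(viewNode);
      model.framebuffer.create(size[0], size[1]);
      model.framebuffer.populateFramebuffer();
    }

    // Render into the framebuffer, then restore previous bindings.
    model.framebuffer.saveCurrentBindingsAndBuffers();
    model.framebuffer.bind();
    model.forwardPass.traverse(viewNode, parent);
    model.framebuffer.restorePreviousBindingsAndBuffers();
  };

  // Whatever consumes the texture would read it from here — this is
  // exactly the part I don't know how to connect to the
  // vtkOpenGLPolyDataMapper that the second pass creates later.
  publicAPI.getFramebuffer = () => model.framebuffer;
}

export function extend(publicAPI, model, initialValues = {}) {
  Object.assign(model, { framebuffer: null }, initialValues);
  vtkRenderPass.extend(publicAPI, model, initialValues);
  model.forwardPass = vtkForwardPass.newInstance();
  vtkMyTexturePass(publicAPI, model);
}

export const newInstance = macro.newInstance(extend, 'vtkMyTexturePass');
```

I then install it in front of the default pass with something like `openGLRenderWindow.setRenderPasses([myTexturePass, vtkForwardPass.newInstance()])`.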
Does anyone have any clue, or an example I could look at?