Render to texture, and then use this texture on another actor.

Hi everyone,

TL;DR: Does anyone have an example where a vtkFrameBuffer is used to render to a texture, which is then used on another actor?

I’m new to vtk.js, and I would like to render my polyData into a texture, then do some filtering on that texture and display it on a square over a SliceImage.

I first tried to create my own mapper, along with its corresponding OpenGL mapper registered through the viewNodeFactory. I rendered to a vtkFrameBuffer and then displayed the texture with my own shaders (a basic square with a texture). But I think I messed something up, because once I use my own shader program and my own vertex buffers, I get WebGL errors on other mappers about wrong uniform locations.

Then I tried to create a custom RenderPass that renders my object, which I insert in first position. The second pass is the generic ForwardPass, which renders my textured square through a PolyDataMapper.

The problem is that I don’t know how to “send” the rendered texture to the vtkOpenGLPolyDataMapper, since it does not exist yet when I do my whole setup.
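For reference, my pass registration looks roughly like this (a sketch: vtkMyTexturePass stands in for my custom pass, and the exact import paths depend on the vtk.js version):

import vtkForwardPass from 'vtk.js/Sources/Rendering/OpenGL/ForwardPass';
// Hypothetical custom vtkRenderPass subclass that renders the polyData
// into an offscreen framebuffer.
import vtkMyTexturePass from './MyTexturePass';

const texturePass = vtkMyTexturePass.newInstance();
const forwardPass = vtkForwardPass.newInstance();

// Render into the texture first, then let the generic ForwardPass draw
// the rest of the scene (including the textured square).
openGLRenderWindow.setRenderPasses([texturePass, forwardPass]);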

Does anyone have any clue, or an example I could look at?

Thanks,
Etienne

If you want to render into a texture, check out vtkHardwareSelector, which renders the scene into a framebuffer for color picking. beginSelection() is where we bind the framebuffer, and captureBuffers() is where we render and capture the buffer contents. You should then be able to use that as a texture for whatever else you need.
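A minimal sketch of that framebuffer setup, modeled on beginSelection() (import path and method names may vary across vtk.js versions):

import vtkOpenGLFramebuffer from 'vtk.js/Sources/Rendering/OpenGL/Framebuffer';

const size = openGLRenderWindow.getSize();

// Create an offscreen framebuffer matching the window size and bind it.
const framebuffer = vtkOpenGLFramebuffer.newInstance();
framebuffer.setOpenGLRenderWindow(openGLRenderWindow);
framebuffer.saveCurrentBindingsAndBuffers();
framebuffer.create(size[0], size[1]);
framebuffer.populateFramebuffer();

// ... render here: everything is drawn into the framebuffer ...

// Restore the default framebuffer so on-screen rendering resumes.
framebuffer.restorePreviousBindingsAndBuffers();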

Thanks a lot for the answer !

I got the pixel data with:

const fbSize = model.renderedFrameBuffer.getSize();
// getPixelData takes inclusive corner coordinates, hence the "- 1";
// it returns the framebuffer contents as an RGBA Uint8Array.
const pixelData = model.openGLRenderWindow.getPixelData(0, 0, fbSize[0] - 1, fbSize[1] - 1);

Then I created a canvas and set its image data:

canvas.width = fbSize[0];
canvas.height = fbSize[1];
const ctx = canvas.getContext('2d');
// The RGBA bytes from getPixelData map directly onto an ImageData buffer.
// Note: readPixels rows run bottom-to-top, so the result may be vertically flipped.
const idata = ctx.createImageData(fbSize[0], fbSize[1]);
idata.data.set(pixelData);
ctx.putImageData(idata, 0, 0);

Then I created an Image from this canvas and set the actor’s texture:

const img = new Image();
img.onload = () => {
  actor.getTextures()[0].setImage(img);
};
// Round-trip through a data URL so the texture receives a plain HTMLImageElement.
img.src = canvas.toDataURL();

Definitely not the most efficient way, but it’s a good start.
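If your vtk.js version allows it, one way to shave off the data-URL round trip would be to hand the canvas to the texture directly. This assumes vtkTexture exposes setCanvas() in your build (newer releases do); otherwise the Image approach above works everywhere:

// Assumption: vtkTexture.setCanvas() is available in this vtk.js build.
const texture = actor.getTextures()[0];
texture.setCanvas(canvas);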

I have the same problem here: render to a texture, then use this texture on another actor. We can use vtkWindowToImageFilter to finish the task. However, I took a look at the source code and found that vtkWindowToImageFilter copies the buffer from GPU memory to CPU memory; we can then use this CPU image on another actor (which, I believe, copies the buffer back from CPU memory to GPU memory in the background). Is there any way to skip the GPU-to-CPU step and directly use the GPU image buffer on another actor?
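For reference, the CPU round trip I am describing looks roughly like this (a sketch; the calls follow the vtk.js examples but may differ between versions):

import vtkWindowToImageFilter from 'vtk.js/Sources/Filters/General/WindowToImageFilter';
import vtkTexture from 'vtk.js/Sources/Rendering/Core/Texture';

// Capture the current render window contents as vtkImageData (GPU -> CPU copy).
const windowToImage = vtkWindowToImageFilter.newInstance();
windowToImage.setInput(renderWindow);
const imageData = windowToImage.getOutputData();

// Feed the captured image to a texture on another actor (CPU -> GPU copy).
const texture = vtkTexture.newInstance();
texture.setInputData(imageData);
actor.addTexture(texture);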

Hello, what method do you use to customize shaders in vtk.js?
I replace shaders through addShaderReplacements on the PolyDataMapper.
Can you share your method? This problem has been plaguing me for a long time.
I look forward to your answer!
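For context, here is roughly what I am doing (a sketch; the structure follows the view-specific-properties mechanism of the vtk.js OpenGL PolyDataMapper, and the red tint is purely illustrative, so please verify the field names against your vtk.js version):

// Sketch of shader replacement through the mapper's view-specific properties.
mapper.getViewSpecificProperties().OpenGL = {
  ShaderReplacements: [
    {
      shaderType: 'Fragment',
      originalValue: '//VTK::Light::Impl',
      replaceFirst: true,
      replacementValue: `
        //VTK::Light::Impl
        gl_FragData[0].rgb = vec3(1.0, 0.0, 0.0); // illustrative: tint red
      `,
      replaceAll: false,
    },
  ],
};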