How to use ImageCPRMapper to get a CPR on the transverse plane

Hello, I saw the addition of the CPR feature a while ago. I imitated the example and tried to reconstruct the spine on the sagittal plane, and it seems to have been successful.
But when I attempted a tooth reconstruction in the transverse plane, it failed.

I use the vector of each line segment as the first column of the matrix, the normal of the current plane as the second column, the cross product of those two vectors as the third column, and the point as the fourth column, and then use this matrix as the orientation for the points.
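For reference, the column layout described above can be assembled like this. This is a plain-JavaScript sketch with made-up vectors; vtk.js stores 4x4 matrices as flat arrays in the gl-matrix, column-major convention, which is assumed here:

```javascript
// Assemble a 4x4 frame from a segment direction, the plane normal,
// their cross product, and a point (all values here are hypothetical).
function cross(a, b) {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}

function makeFrame(segmentDir, planeNormal, point) {
  const third = cross(segmentDir, planeNormal);
  // Flat, column-major layout (the gl-matrix convention used by vtk.js):
  // columns are [segmentDir, planeNormal, third, point].
  return [
    ...segmentDir, 0,
    ...planeNormal, 0,
    ...third, 0,
    ...point, 1,
  ];
}

const frame = makeFrame([0, 0, 1], [1, 0, 0], [10, 20, 30]);
```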

In fact, I am not very clear about the meaning of this call, and trying various values did not result in a successful reconstruction.

mapper.setDirectionMatrix(baseDirections);

What should I do next? Thank you in advance.

Hello,

Your centerline needs:

  • Points in model coordinates (they take spacing, origin and direction of the ImageData into account)
    This is how they are set in the example:
  // Set positions of the centerline (model coordinates)
  const centerlinePoints = Float32Array.from(centerlineJson.position);
  const nPoints = centerlinePoints.length / 3;
  centerline.getPoints().setData(centerlinePoints, 3);
  • A polyline that gives the order of the points.
    This is how they are set in the example:
  // Set polylines of the centerline
  const centerlineLines = new Uint16Array(1 + nPoints);
  centerlineLines[0] = nPoints;
  for (let i = 0; i < nPoints; ++i) {
    centerlineLines[i + 1] = i;
  }
  centerline.getLines().setData(centerlineLines);
  • A PointData that gives an orientation for each point. See the documentation for getOrientationArrayName for more information.
    This is how they are set in the example (using 4x4 matrices):
  // Create a rotated basis data array to orient the CPR
  centerline.getPointData().setTensors(
    vtkDataArray.newInstance({
      name: 'Orientation',
      numberOfComponents: 16,
      values: Float32Array.from(centerlineJson.orientation),
    })
  );
  centerline.modified();
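To generate these per-point orientations in the first place, one option is to derive each local frame from the running tangent of the centerline. The sketch below is plain JavaScript; the up-vector hint and the column layout are my own assumptions for illustration, so check the getOrientationArrayName documentation for the exact layout the mapper expects:

```javascript
// Derive a 16-component orientation per centerline point from the local
// tangent. Assumes at least two points, and that upHint is never parallel
// to a tangent.
function sub(a, b) {
  return [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
}
function cross(a, b) {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}
function normalize(v) {
  const n = Math.hypot(v[0], v[1], v[2]) || 1;
  return [v[0] / n, v[1] / n, v[2] / n];
}

function buildOrientations(points, upHint = [0, 1, 0]) {
  // points: flat [x0, y0, z0, x1, y1, z1, ...] in model coordinates
  const nPoints = points.length / 3;
  const out = new Float32Array(16 * nPoints);
  for (let i = 0; i < nPoints; ++i) {
    // Tangent from this point to the next; the last point reuses the
    // previous segment.
    const j = Math.min(i, nPoints - 2);
    const p0 = [points[3 * j], points[3 * j + 1], points[3 * j + 2]];
    const p1 = [points[3 * j + 3], points[3 * j + 4], points[3 * j + 5]];
    const tangent = normalize(sub(p1, p0));
    const bitangent = normalize(cross(upHint, tangent));
    const normal = cross(tangent, bitangent);
    // Column-major 4x4: [tangent | bitangent | normal | origin]
    out.set([...tangent, 0, ...bitangent, 0, ...normal, 0, 0, 0, 0, 1], 16 * i);
  }
  return out;
}
```

The resulting Float32Array could then be passed to `centerline.getPointData().setTensors(...)` exactly as in the snippet above.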

Now, you also have to choose between the stretched and straightened modes.
I think that for your application, the straightened mode is the way to go.
It is the default, but in case you want to make sure this mode is used:

mapper.useStraightenedMode();

Then, you can set the direction matrix. Depending on which plane you want (the transverse plane in your case) and the orientation you gave to your points, you will choose a 3x3 matrix. You can see this matrix as a transformation applied to all your points.
The most important part of this matrix is the first vector. It will determine the direction that will be used to sample voxels at each local frame. The local frame is determined by the local position and orientation given by your input centerline.

In the image below, the local frame determined by the position and orientation of the centerline is shown in dark red, and the direction matrix in green. The samples will be taken along the tangent direction. In this example, the tangent direction is [1, 0, 0], the bitangent is [0, -1, 0] and the normal is [0, 0, -1], so the direction matrix is [1, 0, 0, 0, -1, 0, 0, 0, -1].

cpr explanation
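Concretely, that matrix can be built and handed to the mapper like this (only the flattening is shown; `mapper` is assumed to be an existing ImageCPRMapper instance, so the call is left commented):

```javascript
// The frame from the explanation above: tangent first, then bitangent,
// then normal, flattened into the 9 numbers of the direction matrix.
const tangent = [1, 0, 0];
const bitangent = [0, -1, 0];
const normal = [0, 0, -1];
const directionMatrix = [...tangent, ...bitangent, ...normal];
// mapper.setDirectionMatrix(...directionMatrix);
```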


I used the normal vector of the transverse plane as the first vector in worldDirections and it worked. Thank you very much for your help!


Hello Lanranlaura,

I am currently facing the same problem as you did. Could you please share more details about how you resolved it?

I tried changing the worldDirections:

const worldBitangent = widgetPlanes[stretchViewType].normal;
const worldNormal = widgetPlanes[stretchViewType].viewUp;
widgetPlanes[crossViewType].normal = worldNormal;
widgetPlanes[crossViewType].viewUp = worldBitangent;
const worldTangent = vec3.cross([], worldBitangent, worldNormal);

But it did not succeed. Looking forward to your help. Thank you in advance.

To change the direction matrix of the ImageCPRMapper, you need to call setDirectionMatrix.
For example, if tangent, bitangent and normal are arrays of size 3:

mapper.setDirectionMatrix(...tangent, ...bitangent, ...normal);

Do you use this function?
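Applied to the worldDirections from the earlier attempt, the call could look like this. This is a sketch with hypothetical vector values, and a plain cross-product function stands in for gl-matrix's vec3.cross:

```javascript
function cross(a, b) {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}

// Hypothetical values standing in for widgetPlanes[...].normal / .viewUp
const worldBitangent = [0, -1, 0];
const worldNormal = [0, 0, -1];
const worldTangent = cross(worldBitangent, worldNormal);

// mapper.setDirectionMatrix(...worldTangent, ...worldBitangent, ...worldNormal);
```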

Hello, I have a new problem. My CPR function is ready, but during testing I found that CPR occupies a considerable amount of memory. If used together with the VR function, it requires a high level of computer performance. I think WebGPU would improve performance, so I attempted to run the CPR example in WebGPU mode, but it failed. Is there any plan for CPR to support WebGPU?

Hello,
The CPR Mapper needs to upload the whole volume to the GPU, which means that memory consumption is a lot higher than with a simple ImageMapper.
WebGPU support is not planned, but memory consumption should be about the same, since the approach would also be to upload the whole volume to the GPU.