Does VTKHDF Format Support Dynamic Meshes with Changing Topology Over Time?

Hello.
I’m currently working on a project that involves exporting time-varying unstructured grid data into a VTKHDF file. In my case, the mesh geometry (e.g., the number of cells, points, connectivity, and offsets) changes at each time step, and I need to represent these changes accurately in the VTKHDF format.

I understand that VTKHDF supports temporal data through the Steps group, which records per-step offsets into the Points, Connectivity, Offsets, and Types datasets. However, I am unsure if the format explicitly supports scenarios where the mesh topology itself changes over time (e.g., adding/removing cells or updating connectivity).
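For reference, here is the skeleton I believe the Steps group is meant to have for a temporal UnstructuredGrid (a minimal h5py sketch; the group and dataset names follow my reading of the VTKHDF specification, the sizes are illustrative, and no real geometry is written):

```python
import h5py
import numpy as np

# Minimal sketch of the temporal VTKHDF Steps skeleton (illustrative sizes only).
with h5py.File("steps_layout_sketch.hdf", "w") as f:
    root = f.create_group("VTKHDF")
    root.attrs["Version"] = (2, 2)
    steps = root.create_group("Steps")
    steps.attrs["NSteps"] = np.uint64(2)
    # One entry per step: the time values and the offsets into the
    # concatenated geometry arrays for that step.
    steps.create_dataset("Values", data=np.array([0.0, 0.5]))
    steps.create_dataset("PointOffsets", data=np.array([0, 8], dtype="i8"))
    steps.create_dataset("CellOffsets", data=np.array([[0], [1]], dtype="i8"))
    steps.create_dataset("ConnectivityIdOffsets", data=np.array([[0], [8]], dtype="i8"))
    steps.create_dataset("PartOffsets", data=np.array([0, 1], dtype="i8"))

with h5py.File("steps_layout_sketch.hdf", "r") as f:
    f["VTKHDF/Steps"].visititems(lambda name, obj: print(name, obj.shape))
```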

Specifically, I wrote a Python script to test whether VTKHDF supports dynamic meshes. The code does not work as intended when the geometry changes per step, but if I set PointOffsets, CellOffsets, and ConnectivityIdOffsets to zero at every step (effectively treating the mesh as static), it works perfectly.
I would greatly appreciate it if someone could provide guidance or help me resolve this issue. Thank you in advance!
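To make that distinction concrete (a small numpy sketch with made-up per-step point counts, not taken from the script below): a static mesh reuses one geometry block, so every step's offset is zero, while a dynamic mesh appends geometry at each step, so each offset is the exclusive prefix sum of the sizes written before it.

```python
import numpy as np

# Hypothetical number of points written at each of 5 time steps.
n_points_per_step = np.array([100, 100, 120, 120, 90])

# Static mesh: geometry stored once, every step points at offset 0.
static_point_offsets = np.zeros(len(n_points_per_step), dtype="i8")

# Dynamic mesh: geometry appended each step, so PointOffsets[s] is the
# total number of points written before step s (exclusive prefix sum).
dynamic_point_offsets = np.concatenate(([0], np.cumsum(n_points_per_step)[:-1]))

print(static_point_offsets.tolist())   # [0, 0, 0, 0, 0]
print(dynamic_point_offsets.tolist())  # [0, 100, 200, 320, 440]
```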

import pyvista as pv
import numpy as np
import h5py

def create_mesh_grid():
    """Create a simple structured grid and convert to unstructured."""
    x = np.arange(-50, 50, 5)
    y = np.arange(-150, 150, 10)
    z = np.arange(0, 70, 5)
    
    x, y, z = np.meshgrid(x, y, z, indexing='ij')
    grid = pv.StructuredGrid(x, y, z)
    return pv.UnstructuredGrid(grid)

def compute_displacement(grid, time_factor):
    """Compute displacement field for the mesh."""
    disp = np.zeros((grid.n_points, 3))
    points = grid.points
    
    # Define displacement zone
    indices = (points[:, 1] > -100) & (points[:, 1] < 100) & (points[:, 2] > 50)
    
    # Scale displacement by time factor
    disp[indices, 0] = 8.0 * time_factor
    disp[indices, 1] = 6.0 * time_factor
    disp[indices, 2] = 10.0 * time_factor
    
    return disp

def compute_stresses(grid, time_factor):
    """
    Compute stress field for the mesh and multiply by time_factor
    so that stress is zero at time=0 and maximum at time=1.
    """
    rho = 2400      # kg/m^3
    gravity = 9.81  # m/s^2
    num_gauss_points_per_cell = 8
    
    cell_centers = grid.cell_centers().points
    stress_per_gauss = np.zeros((grid.n_cells, num_gauss_points_per_cell, 6))
    
    for i, center in enumerate(cell_centers):
        depth = center[2]
        # Baseline gravity-induced stress
        sigma_zz = rho * gravity * depth
        
        # We'll store [σ_xx, σ_yy, σ_zz, σ_xy, σ_yz, σ_xz]
        stress_per_gauss[i, :, 0] = 0.4 * sigma_zz  # σ_xx
        stress_per_gauss[i, :, 1] = 0.3 * sigma_zz  # σ_yy
        stress_per_gauss[i, :, 2] = sigma_zz        # σ_zz
        # Shear components are zero here, but you could also scale them if desired
        stress_per_gauss[i, :, 3] = 0.0             # σ_xy
        stress_per_gauss[i, :, 4] = 0.0             # σ_yz
        stress_per_gauss[i, :, 5] = 0.0             # σ_xz
    
    # Scale stresses by time_factor to get time dependence
    stress_per_gauss *= time_factor
    
    # Return as a 2D array: (n_cells * num_gauss_points_per_cell, 6)
    return stress_per_gauss.reshape(grid.n_cells * num_gauss_points_per_cell, 6)

def create_vtkhdf_file(filename, grid, time_steps):
    """Create VTKHDF file with temporal data and geometry appended at each step."""
    with h5py.File(filename, 'w') as f:
        # Create root group and set attributes
        root = f.create_group('VTKHDF')
        root.attrs['Version'] = (2, 2)
        root.attrs.create('Type', 'UnstructuredGrid'.encode('ascii'),
                          dtype=h5py.string_dtype('ascii', len('UnstructuredGrid')))
        
        # -- Create main datasets to track geometry sizes at each time step
        root.create_dataset('NumberOfPoints', (0,), maxshape=(None,), dtype='i8')
        root.create_dataset('NumberOfCells', (0,), maxshape=(None,), dtype='i8')
        root.create_dataset('NumberOfConnectivityIds', (0,), maxshape=(None,), dtype='i8')
        
        # -- Create *resizable* geometry datasets (empty at first)
        # We will append geometry for each time step in the loop
        root.create_dataset('Points',       (0, 3), maxshape=(None, 3), dtype='f')
        root.create_dataset('Connectivity', (0,),   maxshape=(None,),   dtype='i4')
        root.create_dataset('Offsets',      (0,),   maxshape=(None,),   dtype='i4')
        root.create_dataset('Types',        (0,),   maxshape=(None,),   dtype='u1')
        
        # -- Create data groups for point and cell data
        point_data = root.create_group('PointData')
        point_data.create_dataset('disp',
                                  shape=(0, 3),
                                  maxshape=(None, 3),
                                  dtype='f')
        
        cell_data = root.create_group('CellData')
        cell_data.create_dataset('stress',
                                 shape=(0, 6),  # 6 components in a symmetric stress tensor
                                 maxshape=(None, 6),
                                 dtype='f')
        
        # -- Create Steps group
        steps_group = root.create_group('Steps')
        steps_group.attrs['NSteps'] = np.uint64(0)  # will be updated in the loop
        
        # -- Create temporal datasets
        values = steps_group.create_dataset('Values', (0,),
                                            maxshape=(None,),
                                            dtype='f')
        
        # "point_offsets" and "cell_offsets" describe how geometry changes over time
        point_offsets = steps_group.create_dataset('PointOffsets', (0,),
                                                   maxshape=(None,),
                                                   dtype='i8')
        cell_offsets = steps_group.create_dataset('CellOffsets', (0, 1),
                                                  maxshape=(None, 1),
                                                  dtype='i8')
        
        # Offsets for point/cell data
        point_data_offsets = steps_group.create_group('PointDataOffsets')
        disp_data_offsets = point_data_offsets.create_dataset('disp',
                                                              (0,),
                                                              maxshape=(None,),
                                                              dtype='i8')
        
        cell_data_offsets = steps_group.create_group('CellDataOffsets')
        stress_data_offsets = cell_data_offsets.create_dataset('stress',
                                                               (0,),
                                                               maxshape=(None,),
                                                               dtype='i8')
        
        # Additional offsets in the file structure
        part_offsets = steps_group.create_dataset('PartOffsets', (0,),
                                                  maxshape=(None,),
                                                  dtype='i8')
        # ConnectivityIdOffsets has the same (NSteps, 1) shape as CellOffsets
        # (one column per connectivity array; a single one for unstructured grids)
        connectivity_id_offsets = steps_group.create_dataset('ConnectivityIdOffsets', (0, 1),
                                                             maxshape=(None, 1),
                                                             dtype='i8')
        
        # -- Loop over time steps and add data
        for i, time in enumerate(time_steps):
            current_size = values.shape[0]  # how many steps we have so far
            new_size = current_size + 1
            
            # Resize "per-step" arrays
            for ds in [root['NumberOfPoints'],
                       root['NumberOfCells'],
                       root['NumberOfConnectivityIds']]:
                ds.resize((new_size,))
            
            # Resize step-based arrays
            values.resize((new_size,))
            point_offsets.resize((new_size,))
            cell_offsets.resize((new_size, 1))
            disp_data_offsets.resize((new_size,))
            stress_data_offsets.resize((new_size,))
            part_offsets.resize((new_size,))
            connectivity_id_offsets.resize((new_size, 1))
            
            # -- Append geometry for this time step --
            # Points
            points_start = root['Points'].shape[0]
            root['Points'].resize((points_start + grid.n_points, 3))
            root['Points'][points_start:] = grid.points
            
            # Connectivity
            connectivity_start = root['Connectivity'].shape[0]
            conn = grid.cell_connectivity
            root['Connectivity'].resize((connectivity_start + len(conn),))
            root['Connectivity'][connectivity_start:] = conn
            
            # Offsets
            offsets_start = root['Offsets'].shape[0]
            off = grid.offset
            root['Offsets'].resize((offsets_start + len(off),))
            root['Offsets'][offsets_start:] = off
            
            # Types
            types_start = root['Types'].shape[0]
            typ = grid.celltypes
            root['Types'].resize((types_start + len(typ),))
            root['Types'][types_start:] = typ
            
            # Record geometry sizes in the standard arrays
            root['NumberOfPoints'][current_size] = grid.n_points
            root['NumberOfCells'][current_size] = grid.n_cells
            # Number of connectivity ids actually appended for this step
            # (8 per cell for this all-hexahedron grid)
            root['NumberOfConnectivityIds'][current_size] = len(conn)
            
            # Record time value
            values[current_size] = time
            
            # Record geometry offsets for this step
            point_offsets[current_size] = points_start
            # CellOffsets counts cells (the Offsets dataset carries one extra
            # entry per appended step, which the reader accounts for), so we
            # store the cumulative cell count rather than the Offsets index
            cell_offsets[current_size, 0] = types_start
            connectivity_id_offsets[current_size] = connectivity_start
            
            # PartOffsets indexes into the per-part datasets (NumberOfPoints,
            # NumberOfCells, ...); with one partition appended per step this
            # is just the step index
            part_offsets[current_size] = current_size
            
            # --- Displacement ---
            disp = compute_displacement(grid, time)
            current_disp_size = point_data['disp'].shape[0]
            disp_data_offsets[current_size] = current_disp_size
            # Expand 'disp' dataset to accommodate new data
            point_data['disp'].resize((current_disp_size + grid.n_points, 3))
            point_data['disp'][current_disp_size:] = disp
            
            # --- Stress ---
            stress = compute_stresses(grid, time)  # scaled by "time"
            current_stress_size = cell_data['stress'].shape[0]
            stress_data_offsets[current_size] = current_stress_size
            # Expand 'stress' dataset to accommodate new data
            cell_data['stress'].resize((current_stress_size + stress.shape[0], 6))
            cell_data['stress'][current_stress_size:] = stress
            
            # Update step count
            steps_group.attrs['NSteps'] = np.uint64(i + 1)

def main():
    # Create mesh and define time steps
    grid = create_mesh_grid()
    # 5 time steps from 0 to 1.0 (stresses and displacements are zero at time=0, max at time=1)
    time_steps = np.linspace(0, 1.0, 5)
    
    # Create VTKHDF file
    create_vtkhdf_file('temporal_data_with_stress_dynamic_geometry.hdf', grid, time_steps)

if __name__ == '__main__':
    main()

Hello @amnp95,

> However, I am unsure if the format explicitly supports scenarios where the mesh topology itself changes over time (e.g., adding/removing cells or updating connectivity).

It definitely makes sense to support such a scenario.

A few months ago, I opened an issue about “varying cells” (which is exactly what you call ‘dynamic meshes’): https://gitlab.kitware.com/vtk/vtk/-/issues/19257

As you can see, another contributor and I discussed it at length; the current status is summarized here: https://gitlab.kitware.com/vtk/vtk/-/issues/19257#note_1548628. In short, I think the current spec is sufficient to support it, and the error is in the vtkHDFReader, which does not read the NumberOfCells dataset correctly; however, we still need to investigate further to confirm that.
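If anyone wants to dig in, a quick writer-side sanity check (a sketch, assuming a single-partition file with geometry appended per step, like the script above; the function name is mine) is to verify that each PointOffsets entry equals the cumulative number of points written before that step:

```python
import h5py
import numpy as np

def point_offsets_consistent(path):
    """Return True if Steps/PointOffsets matches the exclusive prefix
    sums of NumberOfPoints (single partition, geometry appended per step)."""
    with h5py.File(path, "r") as f:
        root = f["VTKHDF"]
        n_points = root["NumberOfPoints"][:]
        offsets = root["Steps"]["PointOffsets"][:]
        expected = np.concatenate(([0], np.cumsum(n_points)[:-1]))
        return bool(np.array_equal(offsets, expected))
```

The same kind of check applies to ConnectivityIdOffsets against NumberOfConnectivityIds; if the writer side is consistent, that points the finger back at the reader.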

Currently, I don’t have funding to continue investigating or to fix it. If you have any insight, do not hesitate to share it (and if you want to contribute, you’re welcome :slightly_smiling_face: )