Partitioning large meshes with uneven cell distribution

I’m trying to find a partitioning method that scales to large unstructured meshes; I need to support meshes of several hundred million cells.

I’m aware of vtkRedistributeDataSetFilter, and I’ve used it through the pyvista API, but it does not fit my needs. Its preference for power-of-two partition counts limits how I can distribute the partitions across my cores afterwards. The bigger issue is that the filter does not seem to handle meshes with uneven cell density: when I have regions of high refinement, the resulting partitions can differ by orders of magnitude in cell count.
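For reference, this is roughly how I’m driving it (a minimal sketch; the file name and partition count are placeholders, and I’m assuming pyvista’s partition filter, which wraps vtkRedistributeDataSetFilter):

import pyvista as pv

# Placeholder path for a large unstructured grid.
mesh = pv.read("mesh.vtu")

# pyvista's partition filter wraps vtkRedistributeDataSetFilter.
parts = mesh.partition(16, as_composite=True)

# With locally refined regions, these per-partition cell counts
# come out badly unbalanced.
print([p.n_cells for p in parts])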

The approach that has shown the most promise for me is metis/pymetis. However, it requires building a cell adjacency array (CSR-style xadj/adjncy), like so:

# Cell 0 is connected to: cells 1, 3
# Cell 1 is connected to: cells 0, 2, 4
# Cell 2 is connected to: cells 1, 3
# Cell 3 is connected to: cells 0, 2, 4
# Cell 4 is connected to: cells 1, 3
adjncy = [1, 3, 0, 2, 4, 1, 3, 0, 2, 4, 1, 3]
xadj = [0, 2, 5, 7, 10, 12]
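These arrays then go to pymetis roughly like this (a minimal sketch; the number of parts is arbitrary):

import pymetis

# Partition the 5-cell example graph above into 2 parts.
n_parts = 2
edge_cuts, membership = pymetis.part_graph(n_parts, xadj=xadj, adjncy=adjncy)

# membership[i] is the partition index assigned to cell i.
print(edge_cuts, membership)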

With the metis approach I get consistent partition sizes and good clustering. The problem is building the adjncy array in a way that scales.
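To make the scaling issue concrete, the kind of construction I mean is a per-cell Python loop over the whole grid, something like the sketch below (face-based neighbors for 3D cells; 2D cells would need GetEdge instead of GetFace, and the function name is just an illustration):

import vtk

def build_cell_adjacency(grid):
    """Naive CSR adjacency (xadj, adjncy) for a vtkUnstructuredGrid.

    One Python-level loop per cell and per face, which is exactly
    what does not scale to hundreds of millions of cells.
    """
    grid.BuildLinks()  # needed for the point -> cell lookups
    xadj = [0]
    adjncy = []
    nbr_ids = vtk.vtkIdList()
    for cid in range(grid.GetNumberOfCells()):
        cell = grid.GetCell(cid)
        nbrs = set()
        for f in range(cell.GetNumberOfFaces()):
            face = cell.GetFace(f)
            # Cells (other than cid) that use every point of this face,
            # i.e. the face-sharing neighbors.
            grid.GetCellNeighbors(cid, face.GetPointIds(), nbr_ids)
            for j in range(nbr_ids.GetNumberOfIds()):
                nbrs.add(nbr_ids.GetId(j))
        adjncy.extend(sorted(nbrs))
        xadj.append(len(adjncy))
    return xadj, adjncy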

Is there a way to use VTK filters to build these arrays, or otherwise get the partitioning behavior I described?