I have a large structured .pvts file, and when I read it with the following code, I was surprised to find that it always runs out of memory, even though ParaView opens the same file without problems. Getting the coordinates in particular seems to be very memory expensive:
```julia
using ReadVTK

fname = "./plt-100.pvts"
vtk = PVTKFile(fname)

# point data
p_data = get_point_data(vtk)

# mesh coordinates
x, y, z = get_coordinate_data(vtk)
Nx, Ny, Nz = size(x)

# variables
p = get_data_reshaped(p_data["p"])
u = get_data_reshaped(p_data["u"])
v = get_data_reshaped(p_data["v"])
T = get_data_reshaped(p_data["T"])
ρ = get_data_reshaped(p_data["rho"])
```
It is very likely you are the first person to use this code for a large file, so there might very well be some inefficiencies nobody has noticed yet 😅. Can you boil it down to which call exactly is the one that causes the most allocations?
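If it helps to narrow it down, here is a minimal sketch (assuming the same `plt-100.pvts` file as above) that runs each step under Julia's built-in `@time` macro, which reports the allocation count and total size for every call:

```julia
using ReadVTK

fname = "./plt-100.pvts"

# @time prints elapsed time plus the number and total size of allocations
# for each call, so the step that dominates memory use should stand out.
@time vtk = PVTKFile(fname)                # parse the parallel VTK file
@time p_data = get_point_data(vtk)         # point-data handles
@time coords = get_coordinate_data(vtk)    # suspected hot spot
x, y, z = coords
@time p = get_data_reshaped(p_data["p"])   # a single variable, for comparison
```

For a finer-grained breakdown, `@allocated expr` returns just the byte count of a single expression, and starting Julia with `--track-allocation=user` attributes allocations to individual source lines, which could point at the exact spot inside ReadVTK.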