I've noticed large graph warnings from Dask when working with reasonably sized stackstac DataArrays, like
UserWarning: Large object of size 1.73 MiB detected in task graph:
One thought: what if this is a situation like dask/dask#8008? When we turn the asset table into a Dask array, we're making one chunk per element. What if each of these embedded elements isn't actually size one, but instead references the entire memory of the asset table? That would make the serialized size of the asset table N^2!
Naw, I think that's unlikely. Serialization isn't dumb enough to copy the entire buffer even when it's not needed. There's probably some other cruft in there.
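One quick way to sanity-check the N^2 hypothesis is to pickle a single-element slice and compare it with the whole table. A minimal sketch, assuming the asset table behaves like a plain NumPy object array (the shape and URL here are made up, not what stackstac actually stores):

```python
import pickle
import numpy as np

# Hypothetical stand-in for stackstac's asset table: a small object array
# where each element would become its own one-element Dask chunk.
asset_table = np.empty((100, 5), dtype=object)
asset_table[:] = "https://example.com/asset.tif"  # dummy asset reference

# If each one-element chunk secretly referenced the whole table, pickling
# a chunk would cost roughly as much as pickling the entire table, and the
# graph would carry ~N^2 bytes in total.
full_size = len(pickle.dumps(asset_table))
chunk_size = len(pickle.dumps(asset_table[0:1, 0:1]))

print(f"full table: {full_size} bytes, one chunk: {chunk_size} bytes")
```

A chunk that pickles far smaller than the full table would support the hunch that serialization doesn't drag the whole buffer along, and that the warning comes from something else in the graph.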