Hi gzerveas,

Thank you for sharing the code. I successfully ran it on a regression dataset (TSRA). I also explored two ways to run my custom dataset:

a) I created my own .ts file, although, as you mentioned in other discussions, this might not be necessary: I can instead create a custom class to load the dataset and simply update the 'readdata' function.

b) I developed a custom class that traverses the directory and loads all the pickle files (there are multiple). However, I've noticed that it loads all the data regardless of the batch size. Since my files range from a few megabytes to gigabytes each, loading everything into memory at once can be a bottleneck. Is there a workaround for this? I was wondering about using a DataLoader for lazy loading; if this is already implemented, or if you have explored it, I would appreciate your insights.

Thanks,
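For reference, here is the kind of lazy-loading wrapper I had in mind: a sketch only, with a hypothetical `LazyPickleDataset` class name and the assumption that each pickle file holds an indexable sequence of samples. It implements `__len__` and `__getitem__`, so subclassing `torch.utils.data.Dataset` would let it plug straight into a PyTorch `DataLoader`; only one file is held in memory at a time.

```python
import os
import pickle
from functools import lru_cache


class LazyPickleDataset:
    """Loads one pickle file at a time instead of the whole directory.

    Assumes each .pkl file in root_dir contains an indexable sequence
    of samples (file layout and names are hypothetical).
    """

    def __init__(self, root_dir):
        self.paths = sorted(
            os.path.join(root_dir, f)
            for f in os.listdir(root_dir)
            if f.endswith(".pkl")
        )
        # One pass over the files to record cumulative sample counts.
        # Each file is read once up front, but none is kept in memory.
        self.offsets = []
        total = 0
        for path in self.paths:
            with open(path, "rb") as fh:
                total += len(pickle.load(fh))
            self.offsets.append(total)

    def __len__(self):
        return self.offsets[-1] if self.offsets else 0

    @lru_cache(maxsize=1)
    def _load_file(self, file_idx):
        # Cache only the most recently used file's contents.
        with open(self.paths[file_idx], "rb") as fh:
            return pickle.load(fh)

    def __getitem__(self, idx):
        # Linear scan for clarity; bisect would also work for many files.
        file_idx = next(i for i, off in enumerate(self.offsets) if idx < off)
        start = self.offsets[file_idx - 1] if file_idx else 0
        return self._load_file(file_idx)[idx - start]
```

With sequential (non-shuffled) access the single-file cache means each file is unpickled only once per epoch; with `shuffle=True` it would thrash, so a per-file shuffling scheme or a larger `maxsize` might be worth exploring.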