Processing a large dataset (300,000+ images); after about 16 hours, memory use has grown to around 25 GB. Running with `append=false` in the TOML file and `--num_chuncks 4` on the command line, pyopia version 2.5.5.
Can we try calling `gc.collect()` from the garbage collection module at the end of `process_file_list(file_list, c)`? It would release unused memory (I used this for the streaming mode of Vimba Python). The code could be modified along these lines:
```python
import gc

def process_file_list(file_list, c):
    for ...:
        try:
            ...
    gc.collect()
```
Could you try this out and report back if you saw a decrease in memory use for long running processes? If so, please open a pull request with your changes.
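For reference, a minimal, self-contained sketch of the proposed change. The function signature mirrors `process_file_list(file_list, c)` from the thread, but the per-file work is a placeholder, not PyOPIA's real pipeline, and the flow of exceptions being caught per file is an assumption:

```python
import gc

def process_file_list(file_list, c):
    """Process each file, then force a garbage-collection pass.

    The body of the loop is a stand-in for the real per-image
    processing; the explicit gc.collect() at the end is the change
    proposed above, intended to free unreferenced objects (such as
    large image arrays) between chunks of a long-running job.
    """
    results = []
    for filename in file_list:
        try:
            # Placeholder for the real processing step.
            results.append(len(filename))
        except Exception:
            # Skip files that fail, as a long batch job typically would.
            continue
    # Explicitly trigger garbage collection so cyclic or lingering
    # references are reclaimed before the next batch starts.
    gc.collect()
    return results

print(process_file_list(["a.bmp", "bb.bmp"], None))  # [5, 6]
```

Note that `gc.collect()` only reclaims objects that are no longer referenced; if memory growth comes from objects still being held (for example, results accumulated across chunks), the collection pass alone will not reduce the footprint.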