Issues with CUDA OOM #134
Hello there,

Firstly, I would like to thank you for creating this tool. I am getting the following error with CUDA and was hoping I could get some assistance in debugging it. I am running an RTX 3060 GPU on my laptop, and I am currently unsure whether my device meets the minimum requirements to run on the GPU.

I am currently running Artix Linux, in case the distro information is needed.

Once again, any assistance on this would be greatly appreciated.

Cheers,
Sachin
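Since the question of whether a laptop RTX 3060 meets the requirements comes up here, the following is a minimal sketch (not part of the original thread or of pdf_extract.py) for checking how much VRAM is actually free before running the tool. It assumes PyTorch, which the CUDA error in this thread suggests; laptop RTX 3060 cards typically ship with 6 GB, which can be tight for large models.

```python
# Hedged sketch, not from pdf_extract.py: report the GPU's total and
# free VRAM so capacity problems are visible before inference starts.
import torch

if torch.cuda.is_available():
    free, total = torch.cuda.mem_get_info()          # bytes on the current device
    name = torch.cuda.get_device_properties(0).name
    print(f"GPU: {name}")
    print(f"VRAM: {free / 1024**3:.1f} GiB free of {total / 1024**3:.1f} GiB total")
else:
    print("CUDA is not available to PyTorch on this machine.")
```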
Hello @Sachin-Bhat,

Your RTX 3060 GPU might be running out of memory due to a high batch size during inference. To resolve this, try reducing the batch size with the --batch-size parameter. For example, start with a batch size of 16:

python pdf_extract.py --pdf assets/examples/example.pdf --batch-size 16

If you still encounter memory issues, reduce the batch size further:

python pdf_extract.py --pdf assets/examples/example.pdf --batch-size 8
python pdf_extract.py --pdf assets/examples/example.pdf --batch-size 4

This should help manage GPU memory usage better.
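As an illustration of the suggestion above, here is a hedged sketch that automates the same 16 -> 8 -> 4 reduction by halving the batch size whenever CUDA reports out-of-memory. This is not code from pdf_extract.py; run_with_backoff and run_batch are hypothetical names, with run_batch standing in for whatever callable performs one inference pass.

```python
# Hypothetical sketch: halve the batch size on CUDA OOM until a run
# succeeds, mirroring the manual 16 -> 8 -> 4 advice above.
# run_batch is an assumed callable, not part of pdf_extract.py.
import torch

def run_with_backoff(run_batch, batch_size=16, min_batch_size=1):
    while batch_size >= min_batch_size:
        try:
            return run_batch(batch_size)
        except torch.cuda.OutOfMemoryError:   # plain RuntimeError on older PyTorch
            torch.cuda.empty_cache()          # release cached blocks before retrying
            batch_size //= 2
            if batch_size >= min_batch_size:
                print(f"CUDA OOM, retrying with batch size {batch_size}")
    raise RuntimeError("Out of memory even at the minimum batch size")
```

As the next comment shows, this kind of backoff does not help when even a batch size of 1 exceeds the available VRAM.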
Hey @wangbinDL,

Thank you for the prompt response. I did try what you suggested, but even setting the batch size to 1 did not fix the issue, so I am unsure whether this is a bug. Please let me know if there is anything else I can try to work around it.

Cheers,