Script doesn't use GPU #586
-
Did you set the …
-
You can also verify whether CUDA was correctly detected when building PyGeNN by running the following:

```python
import pygenn
print(pygenn.genn_model.backend_modules)
```

If the first entry in the dictionary doesn't have a key of "CUDA", then the CUDA backend was not built correctly.
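If "CUDA" does show up there but the simulation still pins the CPU, a minimal sketch along the following lines can rule out a silent fallback to another backend. It assumes a PyGeNN 4.x-style GeNNModel constructor that accepts a `backend` keyword for selecting a backend module by name; check the exact signature against your installed version.

```python
# Minimal sketch: list the detected backends and explicitly request CUDA.
# Assumes PyGeNN 4.x, where GeNNModel accepts a `backend` keyword argument;
# verify against the version you have installed.
import pygenn
from pygenn.genn_model import GeNNModel

# backend_modules is ordered; the first entry is the default backend.
print("Available backends:", list(pygenn.genn_model.backend_modules.keys()))

if "CUDA" not in pygenn.genn_model.backend_modules:
    raise RuntimeError("CUDA backend was not built - rebuild PyGeNN with CUDA available")

# Requesting the backend by name fails loudly if it cannot be loaded,
# rather than silently falling back to the single-threaded CPU backend.
model = GeNNModel("float", "backend_check", backend="CUDA")
print("GeNNModel created with the CUDA backend")
```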
-
Do you maybe have any other suggestions?
-
Apologies, I've been out of the office for a few days. Your output suggests that the CUDA backend was built and loaded correctly, so it should be in use. Could you post the output from the …
-
I run my PyGeNN script on Ubuntu. Despite following steps V and VI of the GeNN installation process (and ensuring that my GPU, CUDA and everything else are compatible and set up), the script somehow only seems to run on my CPU and doesn't connect to the GPU: it's very slow, and when checking with the top program, 100% of my CPU is used for the process, which shouldn't be the case when actually using the GPU. Nothing changes after uninstalling and reinstalling. Is this maybe a known error with a simple solution?
Maybe it helps to clarify my problem: when running the script I also get multiple messages like …
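A quick way to double-check the GPU side of this (a minimal sketch rather than anything from the thread, assuming nvidia-smi from the NVIDIA driver is on the PATH) is to list the GPU's compute processes while the script is simulating; if the script's PID never shows up there, the simulation really is running on the CPU:

```python
# Minimal sketch: list the processes nvidia-smi sees on the GPU.
# Assumes the NVIDIA driver utilities (nvidia-smi) are installed and on PATH.
import subprocess


def gpu_compute_processes() -> str:
    """Return nvidia-smi's compute-process table (PID, name, memory) as CSV text."""
    result = subprocess.run(
        ["nvidia-smi",
         "--query-compute-apps=pid,process_name,used_memory",
         "--format=csv"],
        capture_output=True, text=True, check=True)
    return result.stdout


if __name__ == "__main__":
    # Run this in a second terminal while the PyGeNN script is simulating;
    # the Python process should appear here if the CUDA backend is in use.
    print(gpu_compute_processes())
```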