
[FIX] torch can not use GPU #924

Open
3 tasks
andreiramani opened this issue Sep 26, 2024 · 1 comment
Labels
fix Fix something that isn't working as expected

Comments

@andreiramani

Describe the bug

I have successfully set up Khoj using conda, PostgreSQL 16, pytorch=2.1.0, torchvision=0.16.0, torchaudio=2.1.0, and pytorch-cuda=12.1.

How do I set torch.device to use my GPU?
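The usual pattern for this, independent of Khoj, is to check whether a CUDA device is visible and fall back to the CPU otherwise. A minimal sketch (standard PyTorch API, not Khoj-specific code):

```python
import torch

# Select the GPU when a CUDA-enabled PyTorch build can see one,
# otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Place tensors (or models, via .to(device)) on that device explicitly
# rather than relying on the default device.
x = torch.zeros(3, device=device)
print(device, x.device.type)
```

If `torch.cuda.is_available()` returns False here, the problem is with the PyTorch/CUDA install rather than with Khoj's device selection.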

Screenshots/Error message

UserWarning: Failed to initialize NumPy: DLL load failed while importing _multiarray_umath: The specified module could not be found. (Triggered internally at C:\cb\pytorch_1000000000000\work\torch\csrc\utils\tensor_numpy.cpp:84.)
device: torch.device = torch.device(torch._C._get_default_device()), # torch.device('cpu'),

Platform

  • Server:
    • Self-Hosted Python package
  • Client:
    • Web browser
  • OS:
    • Windows 11, RTX 3050, CUDA runtime 12.1
andreiramani added the fix label on Sep 26, 2024
@andreiramani
Author

Force-upgrading to numpy 2.0.1 caused an error at startup, so I downgraded to numpy=1.26.4. Khoj now relaunches successfully, but it still does not detect the GPU.
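The NumPy DLL warning and the missing GPU are likely separate issues: the GPU symptom usually means the installed wheel is a CPU-only build. A quick diagnostic, assuming nothing about the Khoj setup beyond PyTorch being importable:

```python
import torch

# torch.version.cuda is None on a CPU-only build of PyTorch;
# on a CUDA build it reports the toolkit version it was compiled against.
print("torch version:", torch.__version__)
print("built with CUDA:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # Report the GPU the runtime actually sees (e.g. the RTX 3050).
    print("device:", torch.cuda.get_device_name(0))
```

If `torch.version.cuda` prints None, reinstalling the matching CUDA build of PyTorch (rather than changing NumPy versions) is the fix to try first.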
