Hello, I have searched extensively for a solution but have not been able to resolve the issue, so I am writing this message.
I tried running the mediapipe-samples LLM Inference Android example on three devices, following the guide:
1. Galaxy S22 Ultra, GPU: Adreno 730 (818 MHz)
2. Galaxy S10e, GPU: Mali-G76 MP12
3. Android 8.1 device based on an ARM Cortex-A53 MP processor, GPU: ARM Mali-T820 MP1
Model used: gemma-2b-it-gpu-int8.bin
In environments 1 and 2, the model ran inference successfully.
In environment 3, the model failed with the error message: "internal: Failed to initialize engine: %INVALID_ARGUMENT: Unknown backend"
I believe this issue occurs because the backend cannot find the GPU.
I would like to know what is causing this issue and how to resolve it.