
Predict is aborted with a .tflite model #8426

Open
lhofinger opened this issue Oct 27, 2024 · 2 comments
Labels
type:bug Something isn't working

Comments

@lhofinger

Hello,

System information

Describe the current behavior
I am trying to use the BirdNET model https://github.com/rdz-oss/BattyBirdNET-Analyzer/raw/refs/heads/main/checkpoints/V2.4/BirdNET_GLOBAL_6K_V2.4_Model_FP32.tflite

But I get this error:
    classif.js:22 Error: RuntimeError: Aborted(). Build with -sASSERTIONS for more info.
        at abort (tflite_web_api_cc_simd.js:9:6515)
        at abort (tflite_web_api_cc_simd.js:9:60528)
        at tflite_web_api_cc_simd.wasm:0x1baad
        at tflite_web_api_cc_simd.wasm:0x2e1f87
        at tflite_web_api_cc_simd.wasm:0x2f907
        at tflite_web_api_cc_simd.wasm:0x522f0
        at tflite_web_api_cc_simd.wasm:0x32b150
        at tflite_web_api_cc_simd.wasm:0x17e14
        at TFLiteWebModelRunner.TFLiteWebModelRunner$Infer [as Infer] (eval at new (tflite_web_api_cc_simd.js:9:33059), :8:10)
        at module$exports$google3$third_party$tensorflow_lite_support$web$tflite_web_api_client.TFLiteWebModelRunner.infer (tflite_web_api_client.js:2714:188)

The same code does not abort with a simpler model.

Standalone code to reproduce the issue
Simple code with issue: https://www.paludour.net/bats/classification.html
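Since the repro page's source is not quoted in the issue, here is a minimal sketch of what loading and running this model with the tfjs-tflite API typically looks like. The function name `classify`, the dummy zero input, and the `[1, 144000]` input shape are assumptions (144,000 samples matches the input size mentioned later in this thread; BirdNET V2.4 works on 3-second chunks of 48 kHz audio):

```javascript
// Hypothetical repro sketch: load the BirdNET .tflite model with
// tfjs-tflite and run one inference on a dummy audio chunk.
const MODEL_URL =
  "https://github.com/rdz-oss/BattyBirdNET-Analyzer/raw/refs/heads/main/" +
  "checkpoints/V2.4/BirdNET_GLOBAL_6K_V2.4_Model_FP32.tflite";

// Assumed input size: 48 kHz * 3 s = 144,000 samples per chunk.
const N_SAMPLES = 48000 * 3;

// `tf` is @tensorflow/tfjs and `tflite` is @tensorflow/tfjs-tflite,
// passed in by the caller (e.g. loaded via <script> tags in the page).
async function classify(tf, tflite) {
  const model = await tflite.loadTFLiteModel(MODEL_URL);
  const input = tf.zeros([1, N_SAMPLES], "float32"); // dummy audio chunk
  const output = model.predict(input); // this infer call is where Aborted() is thrown
  return output;
}
```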

@lhofinger lhofinger added the type:bug Something isn't working label Oct 27, 2024
@shmishra99 shmishra99 self-assigned this Oct 28, 2024
@shmishra99
Contributor

Hi @lhofinger ,

I am able to reproduce the issue you're facing. I created a dummy dataset of 144,000 points and encountered the same error.


I've noticed that the model sometimes halts for a while before throwing the error. Additionally, models with fewer tensors seem to work without problems.
While the model is running, browser memory consumption becomes very high, so I suspect this could be an out-of-memory issue caused by executing large tensors.

Thank You!!

@lhofinger
Author

Hello, and thank you. It's great that you have reproduced the problem. I'm not sure it is a memory problem, though: this model works fine in Python without using much memory. Can anyone look deeper?
