Engine Reuse Fails with Different JSON Schemas - "Module has already been disposed" Error #560
Comments

CharlieFRuan added a commit referencing this issue (Sep 23, 2024):
Prior to this PR, reusing the same engine with a different schema would run into the error "Module has already been disposed". An example reproducing this is included in #560. This happens because `this.tokenTable` is a `tvmjs.TVMObject` and is disposed when its scope ends. We fix this by wrapping the call with `this.tvm.detachFromCurrentScope()` and only disposing `this.tokenTable` when the entire `LLMChatPipeline` is disposed. Co-authored-by: SMarioMan <[email protected]>
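The disposal bug the commit describes can be modeled with a tiny scope simulation. This is an illustrative sketch only: `Scope`, `DisposableObj`, and `buildTokenTable` are hypothetical stand-ins for tvmjs scope semantics, not the real tvmjs API. The `detach` method plays the role of `detachFromCurrentScope()`.

```typescript
// Hypothetical stand-in for a disposable tvmjs object (not the real API).
class DisposableObj {
  disposed = false;
  dispose(): void { this.disposed = true; }
  use(): string {
    if (this.disposed) throw new Error("Module has already been disposed");
    return "ok";
  }
}

// Hypothetical scope: everything attached to it is disposed when it ends.
class Scope {
  private owned: DisposableObj[] = [];
  attach<T extends DisposableObj>(obj: T): T {
    this.owned.push(obj);
    return obj;
  }
  // Remove the object from the scope so it outlives it, mirroring the
  // role of this.tvm.detachFromCurrentScope() in the fix.
  detach<T extends DisposableObj>(obj: T): T {
    this.owned = this.owned.filter((o) => o !== obj);
    return obj;
  }
  end(): void {
    for (const o of this.owned) o.dispose();
  }
}

// Builds a "token table" inside a scope; without detaching, it is disposed
// as soon as the scope ends, so the next use throws.
function buildTokenTable(detach: boolean): DisposableObj {
  const scope = new Scope();
  let table = scope.attach(new DisposableObj());
  if (detach) table = scope.detach(table);
  scope.end(); // scope exits: anything still attached gets disposed
  return table;
}
```

Without detaching, a later call on the returned table throws "Module has already been disposed"; with detaching, the table survives until it is disposed explicitly, which is the lifetime the fix gives `this.tokenTable`.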
This should be fixed by #571 and will be available in the next npm!

Should be fixed with npm version

Thanks for the more proper fix! The issue has been resolved on my end.
When using web-llm, there is an issue with reusing the same `MLCEngine` instance across multiple completion requests if the response schema changes between requests. Specifically, if we initialize the engine and generate a completion with a schema, a subsequent completion with a different schema causes the engine to throw the error: "Module has already been disposed".

This error occurs even though the engine can be reused across multiple requests when the schema stays the same or is not provided at all. To maintain feature parity with the OpenAI API, the same `MLCEngine` instance should be able to handle multiple schemas.
This may be related to #486.
Below is my reproducer code:
reproducer.ts:

reproducer.astro:

Example output:
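The contents of the attached reproducer files did not survive extraction. The following is a minimal sketch of the pattern that triggers the error, assuming web-llm's OpenAI-compatible API; the model id and the two JSON schemas are illustrative, not taken from the original reproducer:

```typescript
// Sketch of reusing one MLCEngine across two different response schemas.
// Requires a WebGPU-capable browser and a model download, so it is not
// runnable in a plain Node environment.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Two deliberately different JSON schemas (illustrative).
const schemaA = JSON.stringify({
  type: "object",
  properties: { name: { type: "string" } },
  required: ["name"],
});
const schemaB = JSON.stringify({
  type: "object",
  properties: { count: { type: "integer" } },
  required: ["count"],
});

async function main() {
  // Model id is an example; any prebuilt web-llm model id would do.
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f16_1-MLC");

  // First request with schema A succeeds.
  await engine.chat.completions.create({
    messages: [{ role: "user", content: "Give me a name as JSON." }],
    response_format: { type: "json_object", schema: schemaA },
  });

  // Before the fix in #571, switching to schema B on the same engine threw
  // "Module has already been disposed"; after the fix it succeeds.
  await engine.chat.completions.create({
    messages: [{ role: "user", content: "Give me a count as JSON." }],
    response_format: { type: "json_object", schema: schemaB },
  });
}

main();
```

Reusing the engine with the same schema, or with no schema at all, did not trigger the error; only switching schemas did, which points at the per-schema grammar state being scoped too narrowly.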