fix: issue with isModelGenerating when switching between multiple models #73
Conversation
-if (llamaModule != null || isFetching) {
-  promise.reject("Model already loaded", "Model is already loaded or fetching")
+if (isFetching) {
+  promise.reject("Model already loaded", "Model is already fetching")
The first argument seems wrong
ios/RnExecutorch/LLM.mm (outdated)
-if(self->runner || isFetching){
+if(isFetching){
   reject(@"model_already_loaded", @"Model and tokenizer already loaded", nil);
Is this error message correct? You're not checking if it's loaded.
@@ -148,10 +149,6 @@ class LLM(reactContext: ReactApplicationContext) :
     llamaModule!!.stop()
 }

-override fun deleteModule() {
Why are we getting rid of this?
I noticed that it doesn't really change memory usage, but let's leave it for now as it's not breaking anything
Description
There was a problem when the user switched between multiple LLMs in one component. To fix it, I removed the code handling strict-mode problems (strict mode was also causing problems with event listeners, so I think we should ignore it, since it doesn't play well with background tasks). I've also removed the deleteModule native function, as it wasn't really doing anything. Now the user can download multiple LLMs within one component and seamlessly switch between them without bugs.
The problem was mentioned in issue #42
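The switching behavior this fix enables can be sketched as follows. This is a minimal illustration, not the library's actual API: `ModelManager` and its methods are hypothetical stand-ins. The point is that after the change, only an in-flight fetch rejects a load request, so loading a second model replaces the first instead of failing.

```typescript
// Hypothetical sketch of the guard the diff describes; ModelManager
// is an illustrative stand-in, not the real native module.
class ModelManager {
  private loaded: string | null = null;
  private isFetching = false;

  load(name: string): void {
    if (this.isFetching) {
      // Mirrors the native-side reject: a concurrent fetch is the
      // only state that blocks loading. An already loaded model no
      // longer causes a rejection; it is simply replaced.
      throw new Error("Model is already fetching");
    }
    this.isFetching = true;
    try {
      this.loaded = name; // stands in for the real download + init
    } finally {
      this.isFetching = false;
    }
  }

  current(): string | null {
    return this.loaded;
  }
}
```

With the old guard (`llamaModule != null || isFetching`), the second `load` call would have been rejected; with the new one, switching models in the same component just works.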
Type of change
Tested on
Testing instructions
Screenshots
Related issues
Checklist
Additional notes