
[Feature] I can haz cancel button (on predict)? #18

Open
tlambert03 opened this issue Jan 22, 2025 · 7 comments
Labels
feature New feature or request

Comments

@tlambert03
Contributor

[image]

not sure how easy it would be to enable a "cancel" button next to the predict button.

@tlambert03 tlambert03 added the feature New feature or request label Jan 22, 2025
@jdeschamps
Member

Dude 😆

So this is one of the pain points, and it is good that you opened an issue for it. There is currently no good way I know of to interrupt a PyTorch Lightning prediction...

One possibility would be to fake a KeyboardInterrupt in the prediction thread worker. No idea how to do that, but maybe you are more inventive! 😄
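For what it's worth, "faking" a KeyboardInterrupt in another thread is possible on CPython via a C-API trick, `PyThreadState_SetAsyncExc`. A minimal sketch (the function and variable names here are illustrative, not from any of the projects discussed); the big caveat is that the exception is only delivered when the target thread returns to the Python bytecode interpreter, so it cannot break out of a long-running native call such as a large PyTorch op:

```python
import ctypes
import threading
import time

def raise_in_thread(thread: threading.Thread, exc_type=KeyboardInterrupt) -> None:
    """Asynchronously raise exc_type in another thread (CPython only).

    Caveat: the exception only lands once the target thread is back in
    the bytecode interpreter, so it will not interrupt a long native
    call (e.g. a big PyTorch forward pass on one huge tile).
    """
    tid = ctypes.c_long(thread.ident)
    res = ctypes.pythonapi.PyThreadState_SetAsyncExc(tid, ctypes.py_object(exc_type))
    if res == 0:
        raise ValueError("invalid thread id")
    elif res > 1:
        # More than one thread state was affected: undo and bail out.
        ctypes.pythonapi.PyThreadState_SetAsyncExc(tid, None)
        raise SystemError("PyThreadState_SetAsyncExc affected multiple threads")

cancelled = threading.Event()

def busy_loop():
    try:
        while True:
            time.sleep(0.01)  # pure-Python loop, so the exception can land between iterations
    except KeyboardInterrupt:
        cancelled.set()

t = threading.Thread(target=busy_loop)
t.start()
time.sleep(0.05)
raise_in_thread(t)   # "fake" Ctrl-C in the worker thread
t.join(timeout=5)
print(cancelled.is_set())
```

This is generally considered a last resort compared to cooperative cancellation, precisely because of the native-call caveat above.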

@tlambert03
Contributor Author

did somebody say "pain points"??

[image]

@mese79
Contributor

mese79 commented Jan 22, 2025

If it's a thread, then maybe we can threaten it with quit(), using a stop button?

if worker is not None:
    worker.quit()

@tlambert03
Contributor Author

It's actually harder than you think to just kill a thread in Python (I went down a somewhat deep rabbit hole on this when creating the thread_worker pattern originally). This post was somewhat relevant... but it doesn't solve anything, just provides a workaround pattern that one can implement. We did implement something like this in the superqt/napari thread_worker, and it requires that the function running in the thread be a generator that yields periodically (each yield can be leveraged as a clean stopping point). Docs here.
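The yield-as-checkpoint pattern can be sketched without any Qt machinery; this is a simplified stand-in for what superqt/napari's thread_worker does (class and function names here are made up for illustration):

```python
import threading

class GeneratorWorker:
    """Minimal sketch of the yield-as-checkpoint pattern (no Qt).

    Run a generator on a thread; at every ``yield`` we get a clean
    stopping point where a quit() request can be honored.
    """

    def __init__(self, gen):
        self._gen = gen
        self._abort = threading.Event()
        self.results = []
        self.aborted = False
        self._thread = threading.Thread(target=self._run)

    def start(self):
        self._thread.start()

    def quit(self):
        # Only a *request*: it takes effect at the next yield.
        self._abort.set()

    def join(self):
        self._thread.join()

    def _run(self):
        for item in self._gen:          # each yield is a clean stopping point
            if self._abort.is_set():
                self.aborted = True
                self._gen.close()
                return
            self.results.append(item)

def slow_job(n):
    for i in range(n):
        # ... one unit of work would happen here ...
        yield i

worker = GeneratorWorker(slow_job(1_000_000))
worker.quit()       # request cancellation (could equally happen mid-run)
worker.start()
worker.join()
print(worker.aborted)  # → True: the loop stopped at the first yield
```

The key point for this issue is the next comment: the checkpoints only help if they sit inside the loop doing the actual work.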

While you do decorate a generator in your prediction here, the yielding that happens in there isn't where we really need it, which is inside the actual work function, careamist.predict, here. I haven't dug deeper into that function yet, but presumably it loops over tiles somehow? That loop is where you really need a hook if you want to be able to call worker.quit() and have it do anything.
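To make the point concrete, here is a hypothetical sketch of what such a hook could look like. Nothing like `predict_tiled` or this `cancel` parameter exists in CAREamics today; the point is only that the check has to live inside the loop that does the real work, or `worker.quit()` has nothing to act on:

```python
import threading

def predict_tiled(tiles, cancel: threading.Event):
    """Hypothetical tile loop with a cancellation hook (not real CAREamics API)."""
    outputs = []
    for tile in tiles:
        if cancel.is_set():
            break                      # stop cleanly between tiles
        outputs.append(tile * 2)       # stand-in for the model forward pass
    return outputs

cancel = threading.Event()
cancel.set()
print(predict_tiled(range(100), cancel))  # → [] (cancelled before the first tile)
```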

@tlambert03
Contributor Author

I suspect you would need to instantiate your Trainer with a custom callback and implement
on_predict_batch_end? (assuming the tiles get broken into batches?)

Looks like CAREamist does allow you to pass in a custom callback? So I think you could implement this in napari-careamics without needing to modify upstream.
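A sketch of that callback idea. In real code the class would subclass `lightning.pytorch.callbacks.Callback`; to keep this runnable without Lightning installed, a plain class with the same `on_predict_batch_end` hook signature is driven by a toy stand-in loop (everything below except the hook name and signature is an assumption, and raising from the hook is the "inelegant" interruption discussed in the next comment):

```python
import threading

class CancelPredictionCallback:
    """Sketch only: in real code, subclass lightning.pytorch.callbacks.Callback.

    When the UI sets the event, the next on_predict_batch_end call
    raises, tearing prediction down between batches.
    """

    def __init__(self, cancel: threading.Event):
        self.cancel = cancel

    def on_predict_batch_end(self, trainer, pl_module, outputs,
                             batch, batch_idx, dataloader_idx=0):
        if self.cancel.is_set():
            raise InterruptedError("prediction cancelled by user")

# Stand-in driver so the sketch runs without Lightning:
cancel = threading.Event()
cb = CancelPredictionCallback(cancel)
done = []
try:
    for batch_idx in range(10):
        done.append(batch_idx)          # pretend: trainer ran one batch
        if batch_idx == 2:
            cancel.set()                # pretend: the user clicked Cancel
        cb.on_predict_batch_end(None, None, None, None, batch_idx)
except InterruptedError:
    pass
print(done)  # → [0, 1, 2]: stopped after the third batch
```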

@jdeschamps
Member

Ah yes, that's a very good idea. I was hoping not to have to implement our own mechanism in CAREamics... It may be possible to interrupt the tiling; we can have a look. If users pass a large image, then I believe this cannot be interrupted elegantly. We also have prediction from disk in CAREamics with a loop over files, if I remember correctly; that can also be interrupted.

I would not go down the callback route unless it is to throw an error and inelegantly interrupt the thread (though it probably runs in a different thread). I guess using an actual flag in CAREamist and drying up the Datasets is a good way. Or I am missing something about the callback mechanism.

I opened an issue here: CAREamics/careamics#370

PS: @mese79 let's be careful about @tlambert03's PRs, I have the feeling we may end up with the NDV showing AI memes instead of the images 😆 😆

@tlambert03
Contributor Author

> It may be possible to interrupt the tiling, we can have a look. If users pass a large image, then this cannot be interrupted elegantly I believe.

Yeah, that seems like a reasonable compromise. FWIW, I did have tiling on (and the tiles were probably too small) when I experienced the slow prediction that I wanted to cancel, which is what made me open this issue.


3 participants