From 13ecbb5b4656f0a224b461b0a41dc5095f02d8d0 Mon Sep 17 00:00:00 2001
From: Constantin Pape
Date: Sun, 27 Oct 2024 21:26:55 +0100
Subject: [PATCH] Update workshop (#764)

Update workshop
---
 workshops/i2k_2024/README.md          | 26 +++++++++++++-------------
 workshops/i2k_2024/download_models.py |  5 +++++
 2 files changed, 18 insertions(+), 13 deletions(-)
 create mode 100644 workshops/i2k_2024/download_models.py

diff --git a/workshops/i2k_2024/README.md b/workshops/i2k_2024/README.md
index cf3db61d..673f1e18 100644
--- a/workshops/i2k_2024/README.md
+++ b/workshops/i2k_2024/README.md
@@ -20,12 +20,11 @@ Alternatively you can also work on model finetuning or an advanced application,
 
 To prepare for the workshop, please do the following:
 - Install the latest version of `micro_sam`, see [Installation](#installation) for details.
-- Download the pre-computed embeddings for the first 3D segmentation data, see [here](#download-embeddings-for-3d-segmentation).
+- Download the models and pre-computed embeddings for the shared 3D segmentation problem, see [here](#download-embeddings-for-3d-em-segmentation).
 - Decide what you want to do in the 3rd part of the workshop and follow the respective preparation steps. You have the following options:
   - High-throughput annotation of cells (or other structures) in 2D images, see [high-throughput annotation](#high-throughput-image-annotation).
   - 3D segmentation in light or electron microscopy, see [3D LM segmentation](#3d-lm-segmentation) and [3D EM segmentation](#3d-em-segmentation).
   - Finetuning SAM on custom data, see [model finetuning](#model-finetuning).
-  - Writing your own scripts using the `micro_sam` python library, see [scripting](#scripting-with-micro_sam).
 
 You can do all of this on your laptop with a CPU, except for model finetuning, which requires a GPU. We have prepared a notebook that runs on cloud resources with a GPU for this.
 
@@ -43,24 +42,25 @@ conda install -c pytorch -c conda-forge "micro_sam>=1.1" "pytorch>=2.4" "protobu
 ```
 
 If you already have an installation of `micro_sam` please update it by running the last command in your respective environment.
 You can find more information about the installation [here](https://computational-cell-analytics.github.io/micro-sam/micro_sam.html#installation).
 
-### Download Embeddings for 3D EM Segmentation
-
-We provide a script to download the image embeddings for the 3D segmentation problem in part 2.
-The image embeddings are necessary to run interactive segmentation. Computing them on the CPU can take some time for volumetric data, but we support precomputing them and have done this for this data already.
-To run the script you first need to use `git` to download this repository:
+### Download Embeddings for 3D EM Segmentation
+We provide a script to download the models used in the workshop. To run it, you first need to use `git` to download this repository:
 ```bash
 git clone https://github.com/computational-cell-analytics/micro-sam
 ```
 then go to this directory:
-
 ```bash
 cd micro-sam/workshops/i2k_2024
 ```
+and run the script:
+```bash
+python download_models.py
+```
 
-and download the precomputed embeddings:
-
+We also provide a script to download the image embeddings for the 3D segmentation problem in part 2.
+The image embeddings are necessary to run interactive segmentation. Computing them on the CPU can take some time for volumetric data, so we have precomputed them for this dataset already.
+You can download them by running the script:
 ```bash
 python download_embeddings.py -e embeddings -d lucchi
 ```
@@ -154,13 +154,13 @@ We have prepared the notebook so that it can be run on [kaggle](ttps://www.kaggl
 
 **If you want to bring your own data for training please store it in a similar format to the example data.
 You have to bring both images and annotations (= instance segmentation masks) for training.
 If you want to use kaggle please also upload your data so that you can retrieve it within the notebook.**
 
-### Scripting with micro_sam
+### Advanced applications: scripting with `micro_sam`
 
-You can also use the [micro_sam python library](https://computational-cell-analytics.github.io/micro-sam/micro_sam.html#using-the-python-library) to implement your own functionality.
+If you want to develop applications based on `micro_sam`, you can use
+the [micro_sam python library](https://computational-cell-analytics.github.io/micro-sam/micro_sam.html#using-the-python-library) to implement your own functionality.
 For example, you could implement a script to segment cells based on prompts derived from a nucleus segmentation via [batched inference](https://computational-cell-analytics.github.io/micro-sam/micro_sam/inference.html#batched_inference).
 Or a script to automatically segment data with a finetuned model using [automatic segmentation](https://computational-cell-analytics.github.io/micro-sam/micro_sam/automatic_segmentation.html).
-Feel free to contact us before the workshop if you have an idea for what you want to implement and would like to know if this is feasible and how to get started.
 
 ### Precompute Embeddings
diff --git a/workshops/i2k_2024/download_models.py b/workshops/i2k_2024/download_models.py
new file mode 100644
index 00000000..75225df9
--- /dev/null
+++ b/workshops/i2k_2024/download_models.py
@@ -0,0 +1,5 @@
+from micro_sam.util import get_sam_model
+
+get_sam_model(model_type="vit_b")
+get_sam_model(model_type="vit_b_lm")
+get_sam_model(model_type="vit_b_em_organelles")
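
---

The scripting example the patch suggests (segmenting cells from prompts derived from a nucleus segmentation) can be sketched in its first step without `micro_sam` at all: deriving one point prompt per nucleus from a label image. This is a minimal illustration using only `numpy`; the function name and shapes are our own, and in practice the resulting points would be passed on to `micro_sam.inference.batched_inference`.

```python
import numpy as np

def centroid_point_prompts(nuclei: np.ndarray) -> np.ndarray:
    """Derive one (y, x) point prompt per nucleus from a label image.

    nuclei: 2D integer label image, 0 = background.
    Returns an array of shape (n_nuclei, 2) holding each label's centroid.
    """
    labels = np.unique(nuclei)
    labels = labels[labels != 0]  # drop the background label
    return np.array([
        [coords.mean() for coords in np.nonzero(nuclei == lab)]
        for lab in labels
    ])

# Toy label image with two "nuclei".
seg = np.zeros((8, 8), dtype=int)
seg[1:3, 1:3] = 1
seg[5:7, 4:8] = 2
points = centroid_point_prompts(seg)
# points could now serve as positive point prompts for SAM-based
# cell segmentation, one prompt per object.
```

Whether centroids are good prompts depends on the data; for non-convex nuclei a point inside the mask (e.g. via a distance transform maximum) may be a safer choice.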