Merge pull request #63 from fractal-analytics-platform/62-create-01_cardio_tiny_dataset_with_partial_execution

New example: cardio tiny dataset with partial execution

jluethi authored Aug 9, 2023
2 parents 5ebfed6 + eda79de commit f44c518
Showing 14 changed files with 96 additions and 89 deletions.
2 changes: 1 addition & 1 deletion examples/00_user_setup/install_client.sh
@@ -1,4 +1,4 @@
-VERSION="1.3.2"
+VERSION="1.3.3"

ENVNAME=fractal-client-$VERSION
conda deactivate
1 change: 1 addition & 0 deletions examples/01_cardio_tiny_dataset/.gitignore
@@ -3,3 +3,4 @@ output*
task_list
.fractal.env
proj*
+workflow.json
14 changes: 12 additions & 2 deletions examples/01_cardio_tiny_dataset/README.md
@@ -9,9 +9,19 @@ This needs to be done in each example folder you're running
2. Get the example data: `pip install zenodo_get`, then run `. ./fetch_test_data_from_zenodo.sh`
3. One can then either go through the project creation, dataset creation, workflow creation & submission one by one, or run it all at once by running: `. ./run_example.sh`

-This should complete fairly quickly (submitting the script to it being finished took 30s on my machine). One can check the status with `fractal job show ID` (where the ID is the job ID of the submitted workflow, 1 for the first workflow submitted. This is shown when submitting the workflow)
+This should complete fairly quickly (submitting the script to it being finished took 30s on my machine). One can check the status with `fractal job show ID` (where the ID is the job ID of the submitted workflow: 1 for the first workflow submitted. This is shown when submitting the workflow)

## Running partial workflows
You can use `run_example_with_partial_execution.sh` as an example of how to run only part of a workflow.
Modify its last lines if you don't want the second half of the workflow to run automatically after a short wait.
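The partial-execution pattern boils down to two `fractal workflow apply` calls, one with `--end` and one with `--start`. A minimal dry-run sketch (the IDs are placeholders, and the commands are only printed here, not executed):

```shell
# Dry-run sketch of partial execution:
# --end 1 runs the workflow up to task index 1; --start 2 resumes from index 2.
partial_apply() {
    prj=$1; wf=$2; ds_in=$3; ds_out=$4
    echo "fractal workflow apply $prj $wf $ds_in $ds_out --end 1"
    echo "fractal workflow apply $prj $wf $ds_out $ds_out --start 2"
}
partial_apply 1 1 1 2
```

Note that the second call uses the output dataset as both input and output, since the first half of the workflow already wrote the OME-Zarr data there.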


## Import a workflow, then run it
1. Create a project, add a workflow to it, and export that workflow by running `prepare_and_export_workflow.sh`
2. Create a second project, import the workflow, and apply it by running `import_and_run_workflow.sh`
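The exported workflow lands in a JSON file (the new `.gitignore` entry suggests `workflow.json`). A hedged sketch of inspecting such a file — the structure below is an assumption for illustration, not the documented export format:

```shell
# Create a stand-in for an exported workflow file (structure assumed),
# then list the task names it contains.
cat > /tmp/workflow.json <<'EOF'
{
  "name": "Workflow cardiac-test",
  "task_list": [
    {"task": {"name": "Create OME-Zarr structure"}},
    {"task": {"name": "Convert Yokogawa to OME-Zarr"}}
  ]
}
EOF
python3 -c '
import json
with open("/tmp/workflow.json") as f:
    wf = json.load(f)
for entry in wf["task_list"]:
    print(entry["task"]["name"])
'
```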


Check the client documentation for details on using the Fractal Client: https://fractal-analytics-platform.github.io/fractal/install.html
Check the Fractal Tasks Core documentation for details on the individual tasks of this example workflow: https://fractal-analytics-platform.github.io/fractal-tasks-core/

-Successfully run with `fractal-server==1.3.0a5`, `fractal-client==1.3.0a3` and `fractal-tasks-core==0.10.0a4`
+Successfully run with `fractal-server==1.3.4`, `fractal-client==1.3.3` and `fractal-tasks-core==0.10.0`
81 changes: 81 additions & 0 deletions examples/01_cardio_tiny_dataset/run_example_with_partial_execution.sh
@@ -0,0 +1,81 @@
LABEL="cardiac-test-partial-1"

###############################################################################
# IMPORTANT: This defines the location of input & output data
INPUT_PATH=`pwd`/../images/10.5281_zenodo.7059515/
OUTPUT_PATH=`pwd`/output_${LABEL}
###############################################################################

# Get the credentials: If you followed the instructions, they can be copied
# from the .fractal.env file in ../00_user_setup. Alternatively, you can write
# a .fractal.env file yourself or add --user & --password entries to all fractal
# commands below
cp ../00_user_setup/.fractal.env .fractal.env

# Set useful variables
PRJ_NAME="proj-$LABEL"
DS_IN_NAME="input-ds-$LABEL"
DS_OUT_NAME="output-ds-$LABEL"
WF_NAME="Workflow $LABEL"

# Set cache path and remove any previous file from there
export FRACTAL_CACHE_PATH=`pwd`/".cache"
rm -rv ${FRACTAL_CACHE_PATH} 2> /dev/null

###############################################################################

# Create project
OUTPUT=`fractal --batch project new $PRJ_NAME`
PRJ_ID=`echo $OUTPUT | cut -d ' ' -f1`
DS_IN_ID=`echo $OUTPUT | cut -d ' ' -f2`
echo "PRJ_ID: $PRJ_ID"
echo "DS_IN_ID: $DS_IN_ID"

# Update dataset name/type, and add a resource
fractal dataset edit --new-name "$DS_IN_NAME" --new-type image --make-read-only $PRJ_ID $DS_IN_ID
fractal dataset add-resource $PRJ_ID $DS_IN_ID $INPUT_PATH

# Add output dataset, and add a resource to it
DS_OUT_ID=`fractal --batch project add-dataset $PRJ_ID "$DS_OUT_NAME"`
echo "DS_OUT_ID: $DS_OUT_ID"

fractal dataset edit --new-type zarr --remove-read-only $PRJ_ID $DS_OUT_ID
fractal dataset add-resource $PRJ_ID $DS_OUT_ID $OUTPUT_PATH

# Create workflow
WF_ID=`fractal --batch workflow new "$WF_NAME" $PRJ_ID`
echo "WF_ID: $WF_ID"

###############################################################################

# Prepare some JSON files for task arguments (note: this has to happen here,
# because we need to include the path of the current directory)
CURRENT_FOLDER=`pwd`
echo "{
\"level\": 0,
\"input_ROI_table\": \"well_ROI_table\",
\"workflow_file\": \"$CURRENT_FOLDER/regionprops_from_existing_labels_feature.yaml\",
\"input_specs\": {
\"dapi_img\": { \"type\": \"image\", \"channel\":{ \"wavelength_id\": \"A01_C01\" } },
\"label_img\": { \"type\": \"label\", \"label_name\": \"nuclei\" }
},
\"output_specs\": {
\"regionprops_DAPI\": { \"type\": \"dataframe\", \"table_name\": \"nuclei\" }
}
}
" > Parameters/args_measurement.json

###############################################################################

# Add tasks to workflow
fractal --batch workflow add-task $PRJ_ID $WF_ID --task-name "Create OME-Zarr structure" --args-file Parameters/args_create_ome_zarr.json --meta-file Parameters/example_meta.json
fractal --batch workflow add-task $PRJ_ID $WF_ID --task-name "Convert Yokogawa to OME-Zarr"
fractal --batch workflow add-task $PRJ_ID $WF_ID --task-name "Copy OME-Zarr structure"
fractal --batch workflow add-task $PRJ_ID $WF_ID --task-name "Maximum Intensity Projection"
fractal --batch workflow add-task $PRJ_ID $WF_ID --task-name "Cellpose Segmentation" --args-file Parameters/args_cellpose_segmentation.json #--meta-file Parameters/cellpose_meta.json
fractal --batch workflow add-task $PRJ_ID $WF_ID --task-name "Napari workflows wrapper" --args-file Parameters/args_measurement.json --meta-file Parameters/example_meta.json

# Apply workflow
fractal workflow apply $PRJ_ID $WF_ID $DS_IN_ID $DS_OUT_ID --end 1
sleep 90
fractal workflow apply $PRJ_ID $WF_ID $DS_OUT_ID $DS_OUT_ID --start 2
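A note on the ID bookkeeping in the script above: `fractal --batch project new` prints the new project and dataset IDs on one line, and the script splits them with `cut`. A stand-alone reproduction with a stubbed output string (the IDs here are made up):

```shell
# Stubbed stand-in for: OUTPUT=`fractal --batch project new $PRJ_NAME`
OUTPUT="7 12"
# Field 1 is the project ID, field 2 the default (input) dataset ID
PRJ_ID=`echo $OUTPUT | cut -d ' ' -f1`
DS_IN_ID=`echo $OUTPUT | cut -d ' ' -f2`
echo "PRJ_ID: $PRJ_ID"
echo "DS_IN_ID: $DS_IN_ID"
```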

The remaining changed files were deleted (contents not shown), among them examples/01_cardio_tiny_dataset_with_import_export/README.md (18 changes: 0 additions & 18 deletions).
2 changes: 1 addition & 1 deletion examples/server/install_script.sh
@@ -1,4 +1,4 @@
-VERSION="1.3.3"
+VERSION="1.3.4"

ENVNAME=fractal-server-$VERSION
conda deactivate