diff --git a/label_studio_ml/examples/grounding_sam/README.md b/label_studio_ml/examples/grounding_sam/README.md
index 997c05534..f03d98a79 100644
--- a/label_studio_ml/examples/grounding_sam/README.md
+++ b/label_studio_ml/examples/grounding_sam/README.md
@@ -1,13 +1,13 @@
 https://github.com/HumanSignal/label-studio-ml-backend/assets/106922533/d1d2f233-d7c0-40ac-ba6f-368c3c01fd36
-# Grounding DINO backend integration
+# Grounding DINO backend integration with SAM enabled
 This integration will allow you to:
diff --git a/label_studio_ml/examples/segment_anything_2_image/README.md b/label_studio_ml/examples/segment_anything_2_image/README.md
index 002131254..dd7847745 100644
--- a/label_studio_ml/examples/segment_anything_2_image/README.md
+++ b/label_studio_ml/examples/segment_anything_2_image/README.md
@@ -1,9 +1,27 @@
-
-Segment Anything 2, or SAM 2, is a model releaed by Meta in July 2024. An update to the original Segment Anything Model,
+
+
+# Using SAM2 with Label Studio for Image Annotation
+
+Segment Anything 2, or SAM 2, is a model released by Meta in July 2024. An update to the original Segment Anything Model,
 SAM 2 provides even better object segmentation for both images and video. In this guide, we'll show you how to use SAM 2
 for better image labeling with label studio.
-## Using SAM2 with Label Studio (tutorial)
 Click on the image below to watch our ML Evangelist Micaela Kaplan explain how to link SAM 2 to your Label Studio Project.
 You'll need to follow the instructions below to stand up an instance of SAM2 before you can link your model!
diff --git a/label_studio_ml/examples/segment_anything_2_video/README.md b/label_studio_ml/examples/segment_anything_2_video/README.md
index 6c01c273f..e8a2eda4f 100644
--- a/label_studio_ml/examples/segment_anything_2_video/README.md
+++ b/label_studio_ml/examples/segment_anything_2_video/README.md
@@ -1,3 +1,23 @@
+
+
+# Using SAM2 with Label Studio for Video Annotation
+
 This guide describes the simplest way to start using **SegmentAnything 2** with Label Studio.
 This repository is specifically for working with object tracking in videos. For working with images,
@@ -51,7 +71,7 @@ For your project, you can use any labeling config with video properties. Here's
 }-->
-# Known limitiations
+# Known limitations
 - As of 8/11/2024, SAM2 only runs on GPU servers.
 - Currently, we only support the tracking of one object in video, although SAM2 can support multiple.
 - Currently, we do not support video segmentation.
diff --git a/label_studio_ml/examples/tesseract/README.md b/label_studio_ml/examples/tesseract/README.md
index d6d8939a7..aa91140ae 100644
--- a/label_studio_ml/examples/tesseract/README.md
+++ b/label_studio_ml/examples/tesseract/README.md
@@ -45,7 +45,7 @@ Launch Label Studio. You can follow the guide from the [official documentation](
 docker run -it \
   -p 8080:8080 \
   -v `pwd`/mydata:/label-studio/data \
-  humansignal/label-studio:latest
+  heartex/label-studio:latest
 ```
 Optionally, you may enable local file serving in Label Studio
@@ -56,7 +56,7 @@ Launch Label Studio. You can follow the guide from the [official documentation](
   -v `pwd`/mydata:/label-studio/data \
   --env LABEL_STUDIO_LOCAL_FILES_SERVING_ENABLED=true \
   --env LABEL_STUDIO_LOCAL_FILES_DOCUMENT_ROOT=/label-studio/data/images \
-  humansignal/label-studio:latest
+  heartex/label-studio:latest
 ```
 If you're using local file serving, be sure to [get a copy of the API token](https://labelstud.io/guide/user_account#Access-token) from Label Studio to connect the model.
diff --git a/label_studio_ml/examples/watsonx_llm/README.md b/label_studio_ml/examples/watsonx_llm/README.md
index 54224b305..ebbdce489 100644
--- a/label_studio_ml/examples/watsonx_llm/README.md
+++ b/label_studio_ml/examples/watsonx_llm/README.md
@@ -1,6 +1,23 @@
-
+
 # Integrate WatsonX to Label Studio
+
 WatsonX offers a suite of machine learning tools, including access to many LLMs, prompt refinement interfaces, and
 datastores via WatsonX.data. When you integrate WatsonX with Label Studio, you get access to these models and can
 automatically keep your annotated data up to date in your WatsonX.data tables.
@@ -13,8 +30,12 @@ on webhooks, see [our documentation](https://labelstud.io/guide/webhooks)
 See the configuration notes at the bottom for details on how to set up your environment variables to get the system to
 work.
+For a video demonstration, see [Integrating Label Studio with IBM WatsonX](https://www.youtube.com/watch?v=9iP2yO4Geqc).
+
+
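For reviewers: the two tesseract hunks above each retag the Label Studio image inside a larger `docker run` command, so the fragments shown in the diff context are only partial. Assembled from those context lines, the local-file-serving invocation this patch produces would read roughly as follows (a sketch using the `heartex/label-studio:latest` tag the patch settles on, not a verbatim copy of the README):

```shell
# Launch Label Studio with local file serving enabled,
# mounting ./mydata as the data directory (as in the tesseract README)
docker run -it \
  -p 8080:8080 \
  -v `pwd`/mydata:/label-studio/data \
  --env LABEL_STUDIO_LOCAL_FILES_SERVING_ENABLED=true \
  --env LABEL_STUDIO_LOCAL_FILES_DOCUMENT_ROOT=/label-studio/data/images \
  heartex/label-studio:latest
```

Note that both hunks must be applied together; retagging only one of the two `docker run` examples would leave the README referencing two different images.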