Addressing the Pubic Symphysis-Fetal Head Segmentation and Angle of Progression Challenge, this repository focuses on automating the segmentation of transperineal ultrasound images.
The ultimate objective is to enhance the accuracy and objectivity of fetal head descent assessment, which is crucial for optimizing obstetric practice and minimizing difficult vaginal deliveries. The challenge encourages the development and application of cutting-edge techniques for FH-PS segmentation and provides a benchmark dataset for comprehensive evaluation, fostering advancements in the field.
The JNU-IFM dataset for pubic symphysis-fetal head segmentation was originally introduced in the article.
This segmentation competition exclusively utilizes MHA (MetaImage) files for image data, comprising a dataset of 4000 samples. The original images are in a
The label pixels are categorized as follows (a small loading/inspection sketch follows this list):
- 0: Background
- 1: Pubic Symphysis
- 2: Fetal Head
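The MHA files can be loaded for a quick sanity check, for example with SimpleITK (an assumption here, not a requirement of this repository); the file names below are placeholders.

```python
# Minimal inspection sketch: SimpleITK usage and file names are illustrative only.
import SimpleITK as sitk
import numpy as np

image = sitk.GetArrayFromImage(sitk.ReadImage("image_0001.mha"))  # ultrasound frame
label = sitk.GetArrayFromImage(sitk.ReadImage("label_0001.mha"))  # segmentation mask

print(image.shape, label.shape)
print(np.unique(label))  # expected to be a subset of {0, 1, 2}
```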
- A U-Net model (PyTorch SMP library) with a `mit_b0` encoder (transformer) and `imagenet` pre-trained weights for encoder initialization, as sketched after this list.
- The file named `segmentation_model.pth` contains the weight parameters of the trained model.
- Execute the `FH_PS_AOP_Challenge.ipynb` notebook to train the designated model.
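The model construction and checkpoint loading look roughly like the sketch below; the class count and the assumption that `segmentation_model.pth` stores a plain `state_dict` are guesses, so consult the notebook for the exact settings.

```python
# Sketch of the U-Net + mit_b0 setup via segmentation_models_pytorch (SMP).
# classes=3 (background, pubic symphysis, fetal head) is an assumption.
import torch
import segmentation_models_pytorch as smp

model = smp.Unet(
    encoder_name="mit_b0",       # Mix Transformer encoder
    encoder_weights="imagenet",  # ImageNet pre-trained encoder initialization
    classes=3,
)

# Assumes the checkpoint was saved with torch.save(model.state_dict(), ...).
state_dict = torch.load("segmentation_model.pth", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()
```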
- Sensitivity score: 0.986
- Specificity score: 0.993
- Pixel accuracy: 0.991
- Jaccard score: 0.973
- Dice score: 0.986
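The scores above follow the standard pixel-wise definitions; a generic way to compute them for a binary foreground/background mask is sketched below (this is not the repository's evaluation code, and the reported values may be aggregated differently across classes).

```python
# Generic pixel-wise metric sketch for boolean masks (True = foreground).
import numpy as np

def pixel_scores(pred: np.ndarray, target: np.ndarray) -> dict:
    pred, target = pred.astype(bool), target.astype(bool)
    tp = np.sum(pred & target)    # true positives
    tn = np.sum(~pred & ~target)  # true negatives
    fp = np.sum(pred & ~target)   # false positives
    fn = np.sum(~pred & target)   # false negatives
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "pixel_accuracy": (tp + tn) / (tp + tn + fp + fn),
        "jaccard": tp / (tp + fp + fn),
        "dice": 2 * tp / (2 * tp + fp + fn),
    }
```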
The implementation details of the AOP estimation are in the file `aop_estimation.py`.
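The angle of progression is conventionally defined as the angle between the long axis of the pubic symphysis and a line drawn from its most inferior point tangentially to the fetal head contour. The sketch below illustrates that geometry directly from a label mask; it is not the code in `aop_estimation.py`, and the axis-orientation and tangent-selection heuristics are assumptions.

```python
# Rough AOP geometry sketch from a label mask (1 = pubic symphysis, 2 = fetal head).
import numpy as np

def angle_of_progression(label: np.ndarray) -> float:
    ps = np.argwhere(label == 1).astype(float)    # pubic symphysis pixel coordinates
    head = np.argwhere(label == 2).astype(float)  # fetal head pixel coordinates

    # Long axis of the symphysis via its first principal component.
    center = ps.mean(axis=0)
    _, _, vt = np.linalg.svd(ps - center, full_matrices=False)
    axis = vt[0]

    # Heuristic: the inferior (distal) end of the symphysis is assumed to be
    # the end nearer the fetal head centroid.
    if np.dot(head.mean(axis=0) - center, axis) < 0:
        axis = -axis
    distal = ps[np.argmax((ps - center) @ axis)]

    # Tangent from the distal point to the head: take the extreme ray angle,
    # assumed to graze the leading (deepest) part of the skull.
    rays = head - distal
    cos = (rays @ axis) / (np.linalg.norm(rays, axis=1) + 1e-9)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)).max()))
```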
Implement a ViT model from scratch and experiment with existing ones.