
✨칠성사이다✨

강지우 곽지윤 서지유 송나은(NaeunSong) 오재환 이준혁 전경재

Competition Overview

  • Goal: build a model that performs semantic segmentation of trash in photos
  • Dataset: roughly 4,091 photos containing 11 categories of trash such as general waste, plastic, paper, and glass (train: 2,617, valid: 665, test: 819)
  • Evaluation metric: mean Intersection over Union (mIoU) on the test dataset (a minimal sketch of the metric follows below)

    [segmentation visualization]
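The metric averages the per-class IoU scores over the classes present in an image. Below is a minimal sketch of how mIoU might be computed for a single pair of label masks, assuming integer class labels and the 11 categories described above; compute_miou is an illustrative name, not the competition's official scoring code.

import numpy as np

def compute_miou(pred, target, num_classes=11):
    # pred / target: integer label masks of the same shape (H, W).
    # Per-class IoU = |pred ∩ target| / |pred ∪ target|; classes absent
    # from both masks are skipped. Illustrative sketch only.
    ious = []
    for cls in range(num_classes):
        pred_mask = (pred == cls)
        target_mask = (target == cls)
        union = np.logical_or(pred_mask, target_mask).sum()
        if union == 0:
            continue
        intersection = np.logical_and(pred_mask, target_mask).sum()
        ious.append(intersection / union)
    return float(np.mean(ious)) if ious else 0.0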

Project Roadmap

contents

|-- datasets
|   |-- coco.py
|   |-- copy_paste.py
|   |-- dataset.py
|   `-- transform_test.py
|-- loss
|   |-- losses.py
|   `-- rmi_utils.py
|-- models
|   |-- HRNET_OCR
|   |   |-- hrnetv2.py
|   |   |-- ocrnet.py
|   |   `-- ocrnet_utils.py
|   |-- TransUnet
|   |   |-- vit_seg_configs.py
|   |   |-- vit_seg_modeling.py
|   |   `-- vit_seg_modeling_resnet_skip.py
|   `-- model.py
|-- optimizer
|   |-- optim_sche.py
|   `-- radam.py
|-- sample_data
|   |-- image
|   `-- train.json
|-- utils
|    |-- densecrf.py
|    |-- ensemble.ipynb
|    |-- img_diff.py
|    |-- labelcount.py
|    |-- new_copy_paste
|    |   |-- new_copy_paste.py
|    |   `-- new_copy_paste_dataset.py
|    `-- utils.py
|-- train.py
|-- inference.py
|-- class_dict.csv
|-- README.md

best result

  • ./runs/Transunet_SGD_1024.pt
  • ./runs/OCRNet_augmix.pt
  • ./runs/DeepLabv3_efficientb7_copypaste.pt
  • Ensemble (TransUNet+DeepLabV3)
    • Public: 0.707, Private: 0.661

simple start

environment

pip install -r requirement.txt
pip install git+https://github.com/lucasb-eyer/pydensecrf.git

Train

python train.py --model MscaleOCRNet --batch_size 10 --wandb True --custom_trs True

The --model flag also accepts DeepLabV3 or TransUnet (see the sketch below for running all three architectures).
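To train each architecture back to back, one option is a small wrapper around the command above. This is a hedged sketch, assuming the extra flags shown for MscaleOCRNet also apply to the other models; adjust or drop them as needed.

import subprocess

# Launch train.py once per architecture listed in the README.
# The batch_size / wandb / custom_trs values mirror the example command.
for model in ["MscaleOCRNet", "DeepLabV3", "TransUnet"]:
    subprocess.run(
        ["python", "train.py",
         "--model", model,
         "--batch_size", "10",
         "--wandb", "True",
         "--custom_trs", "True"],
        check=True,
    )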

Inference

python inference.py

ensemble

For the ensemble, please refer to ensemble.ipynb.
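As a rough illustration of what an ensemble of two segmentation models does, the sketch below averages per-class probability maps (e.g. softmax outputs from TransUNet and DeepLabV3) and takes the pixel-wise argmax. This is a generic soft-voting scheme, not necessarily the exact procedure in ensemble.ipynb; ensemble_soft_vote and the variable names are illustrative.

import numpy as np

def ensemble_soft_vote(prob_maps):
    # prob_maps: list of arrays shaped (num_classes, H, W), one per model.
    # Average class probabilities across models, then take the argmax
    # to obtain a single fused label mask of shape (H, W).
    mean_probs = np.mean(np.stack(prob_maps, axis=0), axis=0)
    return mean_probs.argmax(axis=0)

# Example usage (illustrative):
# fused_mask = ensemble_soft_vote([transunet_probs, deeplab_probs])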

reference

Hierarchical Multi-Scale Attention for Semantic Segmentation

Copy Paste

TransUNet
