
# BISKD

Efficient Biomedical Instance Segmentation via Knowledge Distillation (MICCAI 2022)

## Installation

This code was implemented with PyTorch 1.0.1 (later versions may also work), CUDA 9.0, Python 3.7.4, and Ubuntu 16.04.
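A minimal environment sketch matching these requirements is shown below; the environment name `biskd` and the use of conda (with `torchvision==0.2.2`, the build paired with PyTorch 1.0.1) are our assumptions for illustration, not part of the original instructions:

```bash
# Create and activate a fresh Python 3.7.4 environment (name is illustrative)
conda create -n biskd python=3.7.4
conda activate biskd

# Install the PyTorch 1.0.1 build targeting CUDA 9.0
conda install pytorch==1.0.1 torchvision==0.2.2 cudatoolkit=9.0 -c pytorch
```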

If you have a Docker environment, we strongly recommend pulling our image as follows:

```bash
docker pull registry.cn-hangzhou.aliyuncs.com/em_seg/v54_higra
```
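A typical way to start a container from this image is sketched below; the `--gpus all` flag assumes Docker 19.03+ with the NVIDIA Container Toolkit (older setups would use `nvidia-docker run` instead), and the mount paths are purely illustrative:

```bash
# Start an interactive container with GPU access and a host data directory mounted
docker run --gpus all -it \
    -v /path/to/data:/workspace/data \
    registry.cn-hangzhou.aliyuncs.com/em_seg/v54_higra /bin/bash
```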

## Pretrained Model

Pretrained models for this work are available.

## Contact

This repo focuses on knowledge distillation; for network training details without KD, please refer to the corresponding repo.

If you have any problems with the released code, please contact me by email ([email protected]).