Bug in seg_output #102
Conversation
UNet is implemented almost identically to the paper. Only the padding has been set to 'same' (instead of 'valid'), and we make use of batch normalization. The functionality of UNet is demonstrated with a road-segmentation example on a publicly available dataset. (The building blocks and the data handling are sketched right after the commit list below.)

Commits:
- added UNet + example (road segmentation)
- reduced number of plotted test images, added a plot of the road probability map
- renamed some layers of UNet to make their names more clear, added convenience function get_segmentation, changed upscaling in UNet from Deconv2DLayer to Upscale2DLayer
- code restructuring (it is no longer crammed into a single .py file). Also fixed a typo in massachusetts
- now using urlretrieve for downloading the data. Also uses multiprocessing.Pool for multithreaded download. Seems to work well!
- updated batch_iterator so that it now supports deterministic batch generation (needed for validation)
- added center crop generator
- now uses compressed numpy arrays to store the dataset
- some minor fixes
- some more minor fixes. It now runs out of the box
- switched back to uncompressed numpy arrays for saving the dataset (faster to load and supports mmap) while not being much larger (10.5 GB instead of 7.5 GB with .npz). Also swapped batch_generator for f0k's suggestion. Added AUC score computation for the validation set
- minor bugfixes, new pretrained weights
- minor bug fixes and cleanup
- minor bug fix
- forgot to comment loading of pretrained params
- renamed layers for clarity
- python code for downloading and extracting pretrained weights, added more comments, less learning rate decay, fixed some typos, added try-except for import of dnn layers
- UNet now uses relu and He's initialization scheme as in the original UNet paper. Changed Upscale2DLayer to Deconv2DLayer as in the paper. Changed patch size to 512. Default setting for mmap_mode is now 'r', as this is almost as fast as copying everything into memory
- new pretrained weights
- …ng to mostly empty segmentation maps
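As a point of reference, here is a minimal sketch of the building blocks named in the description and commits ('same' padding, batch normalization, ReLU with He initialization, transposed-convolution upsampling). Filter counts, variable names and the input shape are illustrative assumptions, not the example's actual code.

```python
# Minimal sketch of the UNet building blocks described above (filter counts,
# names and shapes are illustrative assumptions, not the example's real code).
from lasagne.layers import (InputLayer, Conv2DLayer, MaxPool2DLayer,
                            TransposedConv2DLayer, ConcatLayer, batch_norm)
from lasagne.nonlinearities import rectify
from lasagne.init import HeNormal

def conv_bn(incoming, num_filters):
    # 3x3 convolution with 'same' padding, ReLU and He initialization,
    # wrapped in batch normalization.
    return batch_norm(Conv2DLayer(incoming, num_filters, 3, pad='same',
                                  nonlinearity=rectify, W=HeNormal(gain='relu')))

net = InputLayer((None, 3, 512, 512))        # 512x512 input patches
enc = conv_bn(conv_bn(net, 64), 64)          # encoder block
down = MaxPool2DLayer(enc, 2)                # downsampling
mid = conv_bn(conv_bn(down, 128), 128)       # bottleneck block

# Upsampling with a transposed convolution (Lasagne's Deconv2DLayer is an
# alias of TransposedConv2DLayer), then a skip connection back to the encoder.
up = TransposedConv2DLayer(mid, 64, 2, stride=2)
up = ConcatLayer([up, enc], axis=1)
dec = conv_bn(conv_bn(up, 64), 64)

# 1x1 convolution giving one feature map per class (road vs. background).
out = Conv2DLayer(dec, 2, 1, nonlinearity=None)
```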
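The data-handling commits (uncompressed .npy files opened with mmap_mode='r', a batch iterator that can run deterministically for validation) boil down to something like the following sketch. File names and array shapes are assumptions, not the example's actual layout.

```python
# Illustrative sketch of the data handling from the commits above; file names
# and array shapes are assumptions, not the example's actual layout.
import numpy as np

# Uncompressed .npy files can be memory-mapped with mmap_mode='r', so batches
# are read from disk on demand instead of loading ~10.5 GB into RAM.
images = np.load('massachusetts_images.npy', mmap_mode='r')   # e.g. (N, 3, H, W)
labels = np.load('massachusetts_labels.npy', mmap_mode='r')   # e.g. (N, H, W)

def batch_generator(n_samples, batch_size, shuffle=True, seed=None):
    # With shuffle=False (or a fixed seed) the batch order is deterministic,
    # which is what a reproducible validation pass needs.
    idx = np.arange(n_samples)
    if shuffle:
        np.random.RandomState(seed).shuffle(idx)
    for start in range(0, n_samples, batch_size):
        sel = np.sort(idx[start:start + batch_size])  # sorted reads are kinder to mmap
        yield images[sel], labels[sel]                # fancy indexing copies into RAM
```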
Hi Jan,

After training, the script prints some segmentation results. These are mostly empty and suggest that the network did not train properly (which is not the case). This is due to a bug in the seg_output definition, which I fixed in this pull request.

Cheers,
Fabian

Thank you for the PR! There are two issues with this:

If they fine-tune with batch size 1, then the network will behave well (and deterministically) when testing with batch size 1. If you train or test with batch sizes larger than 1, we should probably include the running averages, or use layer normalization instead of batch normalization (if you fear that a batch size of 8 is also too small). If we want to avoid retraining and updating the parameters, we should add a clear comment on what's done here and what should probably be done instead.
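To make the running-averages point concrete, here is a generic sketch of how this works with Lasagne's batch normalization (not this example's actual training code; the layer sizes are arbitrary): the BatchNormLayer accumulates exponential moving averages of the batch statistics while the training graph is evaluated, and building the output with deterministic=True uses those stored averages instead of the current batch.

```python
# Generic sketch of batch normalization statistics in Lasagne (not this
# example's actual training code).
import theano.tensor as T
import lasagne

X = T.tensor4('X')
net = lasagne.layers.InputLayer((None, 3, 512, 512), input_var=X)
net = lasagne.layers.batch_norm(
    lasagne.layers.Conv2DLayer(net, 16, 3, pad='same',
                               nonlinearity=lasagne.nonlinearities.rectify))

# Training graph: normalizes with the statistics of the current mini-batch and
# (by default) updates the running averages stored inside the BatchNormLayer.
train_out = lasagne.layers.get_output(net, deterministic=False)

# Inference graph: deterministic=True switches to the stored running averages,
# so predictions no longer depend on which other samples are in the batch.
test_out = lasagne.layers.get_output(net, deterministic=True)
```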
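For context on what seg_output refers to: a dense per-pixel softmax output in Lasagne is typically assembled along the following lines. This is a generic sketch, not the example's actual seg_output definition and not the specific fix from this pull request; the helper name and shapes are assumptions.

```python
# Generic sketch of a per-pixel softmax output in Lasagne; NOT the example's
# actual seg_output definition and NOT the specific fix from this PR.
from lasagne.layers import DimshuffleLayer, ReshapeLayer, NonlinearityLayer
from lasagne.nonlinearities import softmax

def per_pixel_softmax(net, n_classes):
    # `net` outputs (batch, n_classes, H, W); the softmax has to run over the
    # class axis for every pixel, so move classes last and flatten the pixels.
    net = DimshuffleLayer(net, (0, 2, 3, 1))      # (batch, H, W, classes)
    net = ReshapeLayer(net, (-1, n_classes))      # (batch*H*W, classes)
    return NonlinearityLayer(net, nonlinearity=softmax)

# seg_output = lasagne.layers.get_output(per_pixel_softmax(out_layer, 2),
#                                         deterministic=True)
# Normalizing over the wrong axis (or reshaping in the wrong order) can produce
# probability maps that look almost empty even though the network trained fine.
# The road-probability column of this output can also feed
# sklearn.metrics.roc_auc_score for the validation AUC mentioned in the commits.
```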