Code repository for the UDCT project
UDCT

This repository contains the Cycle-GAN network (TensorFlow, Python 2.7+) with our histogram loss. Additionally, we provide the scripts we used to generate the synthetic datasets.

Our results can be found at https://www.biorxiv.org/content/biorxiv/early/2019/03/01/563734.full.pdf

How to use

  1. Clone or download the repository
  2. Create a synthetic dataset similar to your real dataset, or use the example dataset in ./Data/Example
  3. Execute:
    python create_h5_dataset.py <directory_of_raw_images> <directory_of_syn_images> <filename_of_hdf5_file>  
    Example:
    python create_h5_dataset.py ./Data/Example/Genuine/ \
    ./Data/Example/Synthetic/ ./Data/Example/example_dataset.h5
  4. Create the directory 'Models' in the root directory
  5. Execute:
     python main.py --dataset=./Data/..../dataset.h5 --name=name_of_model 
    Example:
     python main.py --dataset=./Data/Example/example_dataset.h5 --name=example_model 
  6. This will create a network that is saved in ./Models/ along with a parameter text file. The average loss terms for each epoch are also saved in this directory.
  7. To generate the results after training, use:
     python main.py --dataset=./Data/Example/example_dataset.h5 --name=example_model --mode=gen_B 
    The generated synthetic images can be found in ./Models/<name_of_model>_gen_B.h5
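
The generated results are stored as HDF5. As a quick sanity check after step 7, you can inspect the output file with h5py; a minimal sketch (the dataset key 'B' and the shapes are assumptions for illustration — the actual keys in the generated file may differ):

```python
import h5py
import numpy as np

# Build a small stand-in file mimicking ./Models/<name_of_model>_gen_B.h5.
# The key 'B' and the image shape are hypothetical.
with h5py.File("example_gen_B.h5", "w") as f:
    f.create_dataset("B", data=np.zeros((4, 256, 256, 1), dtype=np.float32))

# List every dataset in the file together with its shape.
with h5py.File("example_gen_B.h5", "r") as f:
    shapes = {name: f[name].shape for name in f}

print(shapes)  # {'B': (4, 256, 256, 1)}
```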

Parameters

All parameters have the form: --<parameter_name>=<value>
Below is the list of all parameters that can be set. The default value, used when a parameter is not given, is shown in parentheses.

name ('unnamed')
Name of the model. Choose a unique name so that old models are not loaded or overwritten. You must change this value from the default.

dataset ('pathtodata.h5')
Path of the h5 file to use. You must change this value from the default.

architecture ('Res6')
The network architecture of the generators. Currently, you can choose between 'Res6' and 'Res9', which correspond to 6 and 9 residual layers, respectively.
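
A residual layer adds its input back to the output of a small convolutional sub-network. Schematically (using a plain numpy transform as a stand-in for the actual conv layers, which is an assumption for illustration only):

```python
import numpy as np

def residual_block(x, weight):
    # Stand-in for conv -> norm -> relu -> conv; here a single scaled ReLU.
    fx = np.maximum(0.0, x * weight)  # hypothetical inner transform
    return x + fx                     # the skip connection makes it residual

x = np.array([1.0, -2.0, 3.0])
print(residual_block(x, 0.5))  # [ 1.5 -2.   4.5]
```

'Res6' and 'Res9' would stack 6 or 9 such blocks between the generator's downsampling and upsampling stages.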

deconv ('transpose')
Upsampling method used in the generators. You can choose either transposed convolutions ('transpose') or image resizing ('resize').

PatchGAN ('Patch70')
Different PatchGAN architectures: 'Patch34', 'Patch70', or 'Patch142'. A mixture of these is also possible: 'MultiPatch' (experimental).

mode ('training')
Decides what the network should do. You can either train it ('training') or generate images: 'gen_A' generates raw-style images from synthetic images, and 'gen_B' generates synthetic-style images from raw images.

lambda_c (10.)
The loss multiplier of the cycle consistency term used while training the generators.

lambda_h (1.)
The loss multiplier of the histogram discriminators. Set this term to 0 to disable the histogram loss.
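
Together, lambda_c and lambda_h weight the terms of the generator objective. A hedged sketch of how the weighted sum might look (the individual loss values below are placeholders, not outputs of the actual network):

```python
# Hypothetical illustration of weighting the generator loss terms.
lambda_c = 10.0   # cycle-consistency multiplier (default)
lambda_h = 1.0    # histogram-discriminator multiplier (default)

adv_loss = 0.7    # placeholder adversarial term
cyc_loss = 0.05   # placeholder cycle-consistency term
hist_loss = 0.2   # placeholder histogram term

total_gen_loss = adv_loss + lambda_c * cyc_loss + lambda_h * hist_loss
print(round(total_gen_loss, 6))  # 1.4
```

With lambda_h = 0.0 the histogram term drops out of the objective entirely.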

dis_noise (0.1)
To make training more stable, we add Gaussian noise to the discriminator inputs; the noise slowly decays over time. This value is the standard deviation of that noise.
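
A minimal sketch of such decaying input noise (the linear decay schedule and its length are assumptions for illustration, not necessarily what this repository implements):

```python
import numpy as np

def noisy_input(images, dis_noise=0.1, step=0, decay_steps=10000):
    """Add Gaussian noise whose std decays linearly from dis_noise to 0."""
    std = dis_noise * max(0.0, 1.0 - float(step) / decay_steps)
    if std == 0.0:
        return images
    return images + np.random.normal(0.0, std, size=images.shape)

x = np.zeros((4, 64, 64, 1))
print(noisy_input(x, step=10000).std())  # 0.0 once the noise has fully decayed
```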

syn_noise (0.)
Gaussian noise can optionally be added to the synthetic dataset. Default: not used.

real_noise (0.)
Gaussian noise can optionally be added to the real dataset. Default: not used.

epoch (200)
Number of training epochs.

batch_size (4)
Batch size during training.

buffer_size (50)
Size of the buffer (history) saved to train the discriminators. This makes the network more stable.
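
This buffer follows the usual CycleGAN image-history idea: the discriminator is sometimes shown an older generated image instead of the newest one. A sketch under that assumption (the 50% replacement policy is the common convention, not confirmed from this code):

```python
import random

class ImageBuffer:
    """Keep a history of generated images to stabilise discriminator training."""

    def __init__(self, buffer_size=50):
        self.buffer_size = buffer_size
        self.images = []

    def query(self, image):
        # Fill the buffer first; afterwards, with probability 0.5 return a
        # stored image (replacing it with the new one), else return the new one.
        if len(self.images) < self.buffer_size:
            self.images.append(image)
            return image
        if random.random() < 0.5:
            idx = random.randrange(self.buffer_size)
            old = self.images[idx]
            self.images[idx] = image
            return old
        return image
```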

save (1)
If the value is not 0, the network's progress is saved at the end of each epoch.

gpu (0)
If multiple GPUs exist, this parameter chooses which GPU should be used. Currently, only one GPU can be used.

verbose (0)
If the value is not 0, the network prints more verbose output.
