tractolearn


Tractography learning.

Installation

To use tractolearn, it is recommended to create a virtual environment using Python 3.10 that will host the necessary dependencies:

   virtualenv tractolearn_env --python=python3.10
   source tractolearn_env/bin/activate

tractolearn can be installed from its sources by executing, at its root:

   pip install -e .
   pip install --upgrade numpy==1.23

Torch was tested with an NVIDIA RTX 3090:

   pip install torch==1.8.1+cu111 -f https://download.pytorch.org/whl/cu111/torch_stable.html

In order to execute experiments reporting to Comet, an API key needs to be set as an environment variable named COMETML. You can add this command to your .bashrc:

   export COMETML="api_key"
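A minimal sketch of how such a variable can be read at runtime, failing fast with a helpful message when it is unset (the helper name `get_comet_api_key` is illustrative, not part of tractolearn):

```python
import os


def get_comet_api_key(env=os.environ):
    """Return the Comet API key from the COMETML environment variable.

    Raises RuntimeError when the variable is unset, so experiments fail
    fast instead of silently failing to report to Comet.
    """
    api_key = env.get("COMETML")
    if not api_key:
        raise RuntimeError(
            'COMETML is not set; add export COMETML="api_key" to your .bashrc'
        )
    return api_key
```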

Training models

To train deep learning models, launch the ae_train.py script. The script takes a configuration file with all training parameters, such as the number of epochs, dataset paths, etc. The most up-to-date configuration file is config.yaml. You can launch the training pipeline with the following command:

   ae_train.py train_config.yaml -vv
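A hedged sketch of what such a training configuration file might contain; the key names below are illustrative only, and the actual schema is defined by the repository's config.yaml:

```yaml
# Hypothetical excerpt of a training configuration; consult the
# repository's config.yaml for the real key names and values.
epochs: 100
batch_size: 128
learning_rate: 1.0e-4
dataset_path: /path/to/dataset
output_path: /path/to/experiments
```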

Data

To automatically fetch or use the tractolearn data provided, you can use the retrieve_dataset method located in the tractolearn.tractoio.dataset_fetch module, or the dataset_fetch script, e.g.:

   fetch_data contrastive_autoencoder_weights {my_path}

The datasets that can be automatically fetched and used are available in tractolearn.tractoio.dataset_fetch.Dataset.
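Since the exact signature of retrieve_dataset is not documented here, the following is a guarded sketch that only assumes the module path and names mentioned above, and degrades gracefully when tractolearn is not installed; the enum member and call pattern are assumptions:

```python
# Guarded sketch: module path and names come from this README; the
# retrieve_dataset signature and the Dataset member are assumptions.
try:
    from tractolearn.tractoio.dataset_fetch import Dataset, retrieve_dataset
    HAS_TRACTOLEARN = True
except ImportError:  # tractolearn not installed in this environment
    HAS_TRACTOLEARN = False


def fetch_weights(output_dir):
    """Fetch the contrastive autoencoder weights into output_dir."""
    if not HAS_TRACTOLEARN:
        raise RuntimeError("tractolearn is not installed")
    # Hypothetical call pattern; check dataset_fetch for the real signature
    # and the available Dataset members.
    return retrieve_dataset(
        Dataset.CONTRASTIVE_AUTOENCODER_WEIGHTS, output_dir
    )
```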

The RecoBundlesX data can also be fetched in the same way.

How to cite

If you use this toolkit in a scientific publication, or if you want to cite our previous works, we would appreciate it if you considered the following aspects:

  • If you use tractolearn, please add a link to the appropriate code, data or related resource hosting service (e.g., repository, PyPI) from where you obtained tractolearn. You may want to include the specific version or commit hash information for the sake of reproducibility.
  • Please cite the appropriate scientific works:
    • If you use tractolearn to filter implausible streamlines or you want to cite our work in tractography filtering, cite FINTA and FIESTA.
    • If you want to cite our work in tractography bundling, cite CINTA and FIESTA.
      • If you use tractolearn to bundle streamlines using a k-nearest neighbor label approach, cite CINTA.
      • If you use tractolearn to bundle streamlines using a thresholding approach, cite FINTA and FIESTA.
    • If you use tractolearn for generative purposes or you want to cite our work in generative models for tractography, cite GESTA and FIESTA.
    • If you use parts of tractolearn for other purposes, please generally cite FINTA and FIESTA.

The corresponding BibTeX files are contained in the above links.

If you use the data made available by the authors, please cite the appropriate Zenodo record.

Please reach out to us if you have related questions.

Patent

J. H. Legarreta, M. Descoteaux, and P.-M. Jodoin. “PROCESSING OF TRACTOGRAPHY RESULTS USING AN AUTOENCODER”. Filed 03 2021. Imeka Solutions Inc. United States Patent #17/337,413. Pending.

License

This software is distributed under the terms described in the LICENSE file; please see that file for details.
