STY: Correct a few more code style issues
sebastientourbier committed Jan 31, 2022
1 parent 5c86f02 commit 2319578
Showing 6 changed files with 102 additions and 108 deletions.
189 changes: 95 additions & 94 deletions README.md
@@ -23,7 +23,7 @@ You first need to have either the Docker or Singularity engine and Miniconda installed
Then, download the appropriate [environment.yml](https://github.com/connectomicslab/connectomemapper3/raw/master/conda/environment.yml) / [environment_macosx.yml](https://github.com/connectomicslab/connectomemapper3/raw/master/conda/environment_macosx.yml) and create a conda environment `py37cmp-gui` with the following command:

```bash
conda env create -f /path/to/environment[_macosx].yml
```

Once the environment is created, activate it and install Connectome Mapper 3 with `PyPI` as follows:
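The install commands themselves sit in the collapsed part of this hunk; a minimal sketch of the step, assuming the package is published on PyPI as `connectomemapper` (the exact name is not shown here):

```bash
# Activate the freshly created environment, then install from PyPI.
# The package name `connectomemapper` is an assumption; check the
# project documentation before relying on it.
conda activate py37cmp-gui
pip install connectomemapper
```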
@@ -37,10 +37,10 @@ You are ready to use Connectome Mapper 3!

### Resources

* **Documentation:** [https://connectome-mapper-3.readthedocs.io](https://connectome-mapper-3.readthedocs.io)
* **Mailing list:** [https://groups.google.com/forum/#!forum/cmtk-users](https://groups.google.com/forum/#!forum/cmtk-users)
* **Source:** [https://github.com/connectomicslab/connectomemapper3](https://github.com/connectomicslab/connectomemapper3)
* **Bug reports:** [https://github.com/connectomicslab/connectomemapper3/issues](https://github.com/connectomicslab/connectomemapper3/issues)

### New in ``v3.0.2`` 🌍🌳✨

@@ -67,94 +67,96 @@ Please check [https://ohbm-environment.org](https://ohbm-environment.org) to lea

With the previously installed `py37cmp-gui` conda environment activated, the BIDS App can easily be run using `connectomemapper3_docker`, the Python wrapper for Docker, as follows:

```output
usage: connectomemapper3_docker [-h]
[--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--session_label SESSION_LABEL [SESSION_LABEL ...]]
[--anat_pipeline_config ANAT_PIPELINE_CONFIG]
[--dwi_pipeline_config DWI_PIPELINE_CONFIG]
[--func_pipeline_config FUNC_PIPELINE_CONFIG]
[--number_of_threads NUMBER_OF_THREADS]
[--number_of_participants_processed_in_parallel NUMBER_OF_PARTICIPANTS_PROCESSED_IN_PARALLEL]
[--mrtrix_random_seed MRTRIX_RANDOM_SEED]
[--ants_random_seed ANTS_RANDOM_SEED]
[--ants_number_of_threads ANTS_NUMBER_OF_THREADS]
[--fs_license FS_LICENSE] [--coverage]
[--notrack] [-v] [--track_carbon_footprint]
[--docker_image DOCKER_IMAGE]
[--config_dir CONFIG_DIR]
bids_dir output_dir {participant,group}
Entrypoint script of the Connectome Mapper BIDS-App version v3.0.2 via Docker.
positional arguments:
bids_dir The directory with the input dataset formatted
according to the BIDS standard.
output_dir The directory where the output files should be stored.
If you are running group level analysis this folder
should be prepopulated with the results of the
participant level analysis.
{participant,group} Level of the analysis that will be performed. Multiple
participant level analyses can be run independently
(in parallel) using the same output_dir.
optional arguments:
-h, --help show this help message and exit
--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]
The label(s) of the participant(s) that should be
analyzed. The label corresponds to
sub-<participant_label> from the BIDS spec (so it does
not include "sub-"). If this parameter is not provided
all subjects should be analyzed. Multiple participants
can be specified with a space separated list.
--session_label SESSION_LABEL [SESSION_LABEL ...]
The label(s) of the session that should be analyzed.
The label corresponds to ses-<session_label> from the
BIDS spec (so it does not include "ses-"). If this
parameter is not provided all sessions should be
analyzed. Multiple sessions can be specified with a
space separated list.
--anat_pipeline_config ANAT_PIPELINE_CONFIG
Configuration .txt file for processing stages of the
anatomical MRI processing pipeline
--dwi_pipeline_config DWI_PIPELINE_CONFIG
Configuration .txt file for processing stages of the
diffusion MRI processing pipeline
--func_pipeline_config FUNC_PIPELINE_CONFIG
Configuration .txt file for processing stages of the
fMRI processing pipeline
--number_of_threads NUMBER_OF_THREADS
The number of OpenMP threads used for multi-threading
by Freesurfer (Set to [Number of available CPUs -1] by
default).
--number_of_participants_processed_in_parallel NUMBER_OF_PARTICIPANTS_PROCESSED_IN_PARALLEL
The number of subjects to be processed in parallel
(One by default).
--mrtrix_random_seed MRTRIX_RANDOM_SEED
Fix MRtrix3 random number generator seed to the
specified value
--ants_random_seed ANTS_RANDOM_SEED
Fix ANTS random number generator seed to the specified
value
--ants_number_of_threads ANTS_NUMBER_OF_THREADS
Fix number of threads in ANTs. If not specified ANTs
will use the same number as the number of OpenMP
threads (see `----number_of_threads` option flag)
--fs_license FS_LICENSE
Freesurfer license.txt
--coverage Run connectomemapper3 with coverage
--notrack Do not send event to Google analytics to report BIDS
App execution, which is enabled by default.
-v, --version show program's version number and exit
--track_carbon_footprint
Track carbon footprint with `codecarbon
<https://codecarbon.io/>`_ and save results in a CSV
file called ``emissions.csv`` in the
``<bids_dir>/code`` directory.
--docker_image DOCKER_IMAGE
The path to the docker image.
--config_dir CONFIG_DIR
The path to the directory containing the configuration
files.
```
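By way of illustration, a hypothetical participant-level run matching the usage text above (the dataset path, participant label, config-file name, and license path are all invented for the example):

```bash
# Run the anatomical pipeline for participant 01, one subject at a time.
# All paths and labels below are hypothetical.
conda activate py37cmp-gui
connectomemapper3_docker \
    /data/ds-example /data/ds-example/derivatives participant \
    --participant_label 01 \
    --anat_pipeline_config /data/ds-example/code/anat_config.txt \
    --fs_license /opt/freesurfer/license.txt \
    --number_of_participants_processed_in_parallel 1
```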

## Contributors ✨

@@ -202,7 +204,6 @@ Thanks also goes to all these wonderful people that contributed to Connectome Ma
* Alia Lemkaddem (allem)
* Xavier Gigandet


* Collaborators from Children's Hospital, Boston:

* Ellen Grant
7 changes: 5 additions & 2 deletions cmp/parser.py
@@ -19,6 +19,7 @@ def get() -> argparse.ArgumentParser:
-------
p : argparse.ArgumentParser
Instance of :class:`argparse.ArgumentParser`
"""

p = argparse.ArgumentParser(
@@ -156,11 +157,12 @@ def get_wrapper_parser() -> argparse.ArgumentParser: # pragma: no cover

def get_docker_wrapper_parser() -> argparse.ArgumentParser: # pragma: no cover
"""Return the argparse parser of the Docker BIDS App.

Returns
-------
p : argparse.ArgumentParser
Instance of :class:`argparse.ArgumentParser`
"""
p: argparse.ArgumentParser = get_wrapper_parser()
p.description = f"Entrypoint script of the Connectome Mapper BIDS-App version {__version__} via Docker."
@@ -181,11 +183,12 @@ def get_docker_wrapper_parser() -> argparse.ArgumentParser: # pragma: no cover

def get_singularity_wrapper_parser() -> argparse.ArgumentParser: # pragma: no cover
"""Return the argparse parser of the Singularity BIDS App.

Returns
-------
p : argparse.ArgumentParser
Instance of :class:`argparse.ArgumentParser`
"""
p: argparse.ArgumentParser = get_wrapper_parser()
p.description = f"Entrypoint script of the Connectome Mapper BIDS-App version {__version__} via Singularity."
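Judging from the hunk line counts, the two `cmp/parser.py` changes each add a blank line between a docstring's summary and its `Returns` heading, as the numpydoc convention expects. A self-contained sketch of the resulting layout, using a hypothetical `get_demo_parser`:

```python
import argparse


def get_demo_parser() -> argparse.ArgumentParser:
    """Return a demo argparse parser.

    Returns
    -------
    p : argparse.ArgumentParser
        Instance of :class:`argparse.ArgumentParser`
    """
    # Summary line, blank line, then the numpydoc section heading.
    p = argparse.ArgumentParser(description="demo entrypoint")
    p.add_argument("bids_dir")
    return p


args = get_demo_parser().parse_args(["/data/ds-example"])
print(args.bids_dir)  # prints /data/ds-example
```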
2 changes: 1 addition & 1 deletion cmp/stages/registration/registration.py
@@ -22,7 +22,7 @@
# Own imports
from cmp.stages.common import Stage
from cmtklib.interfaces.mrtrix3 import DWI2Tensor, MRConvert, ExtractMRTrixGrad
from cmtklib.interfaces.fsl import ApplymultipleXfm, ApplymultipleWarp
from cmtklib.interfaces.fsl import ApplymultipleXfm
import cmtklib.interfaces.freesurfer as cmp_fs
import cmtklib.interfaces.fsl as cmp_fsl
from cmtklib.interfaces.ants import MultipleANTsApplyTransforms
3 changes: 0 additions & 3 deletions cmtklib/config.py
@@ -542,7 +542,6 @@ def set_pipeline_attributes_from_config(pipeline, config, debug=False):
+ f"{sub_config}.{sub_key} to {conf_value}"
)
print_error(f" {e}")
pass
else:
if stage.name in config.keys():
if key in config[stage.name].keys():
@@ -573,8 +572,6 @@ def set_pipeline_attributes_from_config(pipeline, config, debug=False):
+ f"{stage.config}.{key} to {conf_value}"
)
print_error(f" {e}")
pass

setattr(
pipeline, "number_of_cores", int(config["Multi-processing"]["number_of_cores"])
)
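The `cmtklib/config.py` hunks drop `pass` statements that followed `print_error` calls. A `pass` is only needed when a suite would otherwise be empty; after a real statement it is dead style noise. A minimal sketch of the pattern, with a hypothetical helper and `print` standing in for cmtklib's `print_error`:

```python
class Config:
    """Hypothetical stand-in for a CMP3 stage configuration object."""
    number_of_cores = 1


def set_attr_safely(obj, key, value):
    # Mirrors the cmtklib pattern: log the failure and move on.
    try:
        setattr(obj, key, value)
    except AttributeError as e:
        # `print` stands in for cmtklib's print_error helper.
        print(f"  {e}")
        # No `pass` needed here: the suite already has a statement,
        # which is exactly why the commit deletes the old `pass`.


cfg = Config()
set_attr_safely(cfg, "number_of_cores", 4)
set_attr_safely(object(), "anything", 1)  # caught and logged, not raised
print(cfg.number_of_cores)  # prints 4
```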
7 changes: 0 additions & 7 deletions cmtklib/interfaces/dipy.py
@@ -798,7 +798,6 @@ class DirectionGetterTractography(DipyBaseInterface):
def _run_interface(self, runtime):
from dipy.tracking import utils
from dipy.direction import DeterministicMaximumDirectionGetter, ProbabilisticDirectionGetter
# from dipy.tracking.local import ThresholdStoppingCriterion, ActStoppingCriterion
from dipy.tracking.stopping_criterion import BinaryStoppingCriterion, CmcStoppingCriterion
from dipy.tracking.local_tracking import LocalTracking, ParticleFilteringTracking
from dipy.direction.peaks import peaks_from_model
@@ -1138,20 +1137,14 @@ class MAPMRI(DipyDiffusionInterface):

def _run_interface(self, runtime):
from dipy.reconst import mapmri
# from dipy.data import fetch_cenir_multib, read_cenir_multib
from dipy.core.gradients import gradient_table
# import marshal as pickle
import pickle as pickle
import gzip

img = nib.load(self.inputs.in_file)
imref = nib.four_to_three(img)[0]
affine = img.affine

data = img.get_data().astype(np.float32)

hdr = imref.header.copy()

gtab = self._get_gradient_table()
gtab = gradient_table(
bvals=gtab.bvals, bvecs=gtab.bvecs,
2 changes: 1 addition & 1 deletion cmtklib/interfaces/fsl.py
@@ -403,7 +403,7 @@ class EddyOpenMP(FSLCommand):
output_spec = EddyOutputSpec

def __init__(self, **inputs):
return super(EddyOpenMP, self).__init__(**inputs)
super(EddyOpenMP, self).__init__(**inputs)

def _run_interface(self, runtime):
if not isdefined(self.inputs.out_file):
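The `cmtklib/interfaces/fsl.py` hunk removes the `return` in `EddyOpenMP.__init__`. An `__init__` must return `None`, so `return super().__init__(...)` only works because that call happens to return `None`; the `return` wrongly suggests a meaningful value. A minimal sketch of the two forms, using hypothetical stand-in classes:

```python
class Base:
    """Hypothetical stand-in for FSLCommand."""
    def __init__(self, **inputs):
        self.inputs = inputs


class Before(Base):
    def __init__(self, **inputs):
        # Works only because Base.__init__ returns None; the `return`
        # misleadingly implies a value is produced.
        return super(Before, self).__init__(**inputs)


class After(Base):
    def __init__(self, **inputs):
        # Idiomatic form, as in the committed fix: no `return`.
        super(After, self).__init__(**inputs)


assert Before(out_file="a.nii").inputs == After(out_file="a.nii").inputs
```

Returning anything other than `None` from `__init__` raises a `TypeError` at instantiation, which is why the bare call is the safer habit.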
