Replies: 5 comments 2 replies
-
I had a similar problem, but with the 001BrainTumor dataset. I got around the issue by changing the input when calling determine_reader_writer_from_dataset_json(), since '_0000' was appended manually to the filename there (which I believe is the root of the problem). I now hit a different problem at the next step.
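For context, a minimal sketch of calling that function with a real on-disk filename instead of a manually constructed one. The dataset name, paths and example filename are illustrative assumptions, as is the exact call signature:

import json
from nnunetv2.imageio.reader_writer_registry import determine_reader_writer_from_dataset_json

# Hypothetical raw-data location; adjust to your nnUNet_raw layout.
raw_dir = '/nnUNet/dataset/nnUNet_raw/Dataset001_BrainTumour'
with open(raw_dir + '/dataset.json') as f:
    dataset_json = json.load(f)

# Pass a filename that actually exists in imagesTr, keeping its '_0000'
# channel suffix, rather than appending the suffix by hand.
reader_writer = determine_reader_writer_from_dataset_json(
    dataset_json, raw_dir + '/imagesTr/BRATS_001_0000.nii.gz')
print(reader_writer)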
-
@yejin2008 you need to use nnUNetv2_convert_MSD_dataset, not nnUNetv2_convert_old_nnUNet_dataset. The dataset you downloaded is in the MSD format. This could also be the cause of your problem @NajaJean.
-
I used nnUNetv2_convert_MSD_dataset -i /nnUNet/dataset/nnUNet_raw/Task02_Heart to convert, then ran nnUNetv2_plan_and_preprocess -d 002 --verify_dataset_integrity and still got this error:
Fingerprint extraction...
Dataset002_Heart
Traceback (most recent call last):
  File "/anaconda3/envs/nn_UNet/bin/nnUNetv2_plan_and_preprocess", line 33, in <module>
    sys.exit(load_entry_point('nnunetv2', 'console_scripts', 'nnUNetv2_plan_and_preprocess')())
  File "/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_entrypoints.py", line 182, in plan_and_preprocess_entry
    extract_fingerprints(args.d, args.fpe, args.npfp, args.verify_dataset_integrity, args.clean, args.verbose)
  File "/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_api.py", line 46, in extract_fingerprints
    extract_fingerprint_dataset(d, fingerprint_extractor_class, num_processes, check_dataset_integrity, clean,
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_api.py", line 29, in extract_fingerprint_dataset
    verify_dataset_integrity(join(nnUNet_raw, dataset_name), num_processes)
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/verify_dataset_integrity.py", line 167, in verify_dataset_integrity
    assert all(labels_present), 'not all training cases have a label file in labelsTr. Fix that. Missing: %s' % missing
AssertionError: not all training cases have a label file in labelsTr. Fix that. Missing: ['la_003', 'la_004', 'la_005', 'la_007', 'la_009', 'la_010', 'la_011', 'la_014', 'la_016', 'la_017', 'la_018', 'la_019', 'la_020', 'la_021', 'la_022', 'la_023', 'la_024', 'la_026', 'la_029', 'la_030']
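For anyone debugging this assertion, a minimal sketch (my own, assuming the standard v2 layout with .nii.gz endings and the '_0000' channel suffix) that lists which training cases lack a label file:

import os

raw = '/nnUNet/dataset/nnUNet_raw/Dataset002_Heart'  # path taken from the log above
# Case identifier = imagesTr filename minus the '_0000' channel suffix and ending.
cases = sorted(f[:-len('_0000.nii.gz')] for f in os.listdir(os.path.join(raw, 'imagesTr'))
               if f.endswith('_0000.nii.gz'))
labels = set(os.listdir(os.path.join(raw, 'labelsTr')))
missing = [c for c in cases if c + '.nii.gz' not in labels]
print('cases without a label file:', missing)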
-
Thank you so much! I got a new error after I followed your instructions:
nnUNetv2_plan_and_preprocess -d 988
Fingerprint extraction...
Dataset988_Heart
Using <class 'nnunetv2.imageio.simpleitk_reader_writer.SimpleITKIO'> as reader/writer
100%|██████████| 20/20 [00:04<00:00, 4.87it/s]
Experiment planning...
2D U-Net configuration:
{'data_identifier': 'nnUNetPlans_2d',
 'preprocessor_name': 'DefaultPreprocessor',
 'batch_size': 40,
 'patch_size': array([320, 256]),
 'median_image_size_in_voxels': array([320., 232.]),
 'spacing': array([1.25, 1.25]),
 'normalization_schemes': ['ZScoreNormalization'],
 'use_mask_for_norm': [True],
 'UNet_class_name': 'PlainConvUNet',
 'UNet_base_num_features': 32,
 'n_conv_per_stage_encoder': (2, 2, 2, 2, 2, 2, 2),
 'n_conv_per_stage_decoder': (2, 2, 2, 2, 2, 2),
 'num_pool_per_axis': [6, 6],
 'pool_op_kernel_sizes': [[1, 1], [2, 2], [2, 2], [2, 2], [2, 2], [2, 2], [2, 2]],
 'conv_kernel_sizes': [[3, 3], [3, 3], [3, 3], [3, 3], [3, 3], [3, 3], [3, 3]],
 'unet_max_num_features': 512,
 'resampling_fn_data': 'resample_data_or_seg_to_shape',
 'resampling_fn_seg': 'resample_data_or_seg_to_shape',
 'resampling_fn_data_kwargs': {'is_seg': False, 'order': 3, 'order_z': 0, 'force_separate_z': None},
 'resampling_fn_seg_kwargs': {'is_seg': True, 'order': 1, 'order_z': 0, 'force_separate_z': None},
 'resampling_fn_probabilities': 'resample_data_or_seg_to_shape',
 'resampling_fn_probabilities_kwargs': {'is_seg': False, 'order': 1, 'order_z': 0, 'force_separate_z': None},
 'batch_dice': True}
Using <class 'nnunetv2.imageio.simpleitk_reader_writer.SimpleITKIO'> as reader/writer
3D fullres U-Net configuration:
{'data_identifier': 'nnUNetPlans_3d_fullres',
 'preprocessor_name': 'DefaultPreprocessor',
 'batch_size': 2,
 'patch_size': array([ 80, 192, 160]),
 'median_image_size_in_voxels': array([115., 320., 232.]),
 'spacing': array([1.37, 1.25, 1.25]),
 'normalization_schemes': ['ZScoreNormalization'],
 'use_mask_for_norm': [True],
 'UNet_class_name': 'PlainConvUNet',
 'UNet_base_num_features': 32,
 'n_conv_per_stage_encoder': (2, 2, 2, 2, 2, 2),
 'n_conv_per_stage_decoder': (2, 2, 2, 2, 2),
 'num_pool_per_axis': [4, 5, 5],
 'pool_op_kernel_sizes': [[1, 1, 1], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [1, 2, 2]],
 'conv_kernel_sizes': [[3, 3, 3], [3, 3, 3], [3, 3, 3], [3, 3, 3], [3, 3, 3], [3, 3, 3]],
 'unet_max_num_features': 320,
 'resampling_fn_data': 'resample_data_or_seg_to_shape',
 'resampling_fn_seg': 'resample_data_or_seg_to_shape',
 'resampling_fn_data_kwargs': {'is_seg': False, 'order': 3, 'order_z': 0, 'force_separate_z': None},
 'resampling_fn_seg_kwargs': {'is_seg': True, 'order': 1, 'order_z': 0, 'force_separate_z': None},
 'resampling_fn_probabilities': 'resample_data_or_seg_to_shape',
 'resampling_fn_probabilities_kwargs': {'is_seg': False, 'order': 1, 'order_z': 0, 'force_separate_z': None},
 'batch_dice': False}
Plans were saved to /nnUNet/dataset/nnUNet_preprocessed/Dataset988_Heart/nnUNetPlans.json
Preprocessing...
Traceback (most recent call last):
  File "/anaconda3/envs/nn_UNet/bin/nnUNetv2_plan_and_preprocess", line 33, in <module>
    sys.exit(load_entry_point('nnunetv2', 'console_scripts', 'nnUNetv2_plan_and_preprocess')())
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_entrypoints.py", line 201, in plan_and_preprocess_entry
    preprocess(args.d, args.overwrite_plans_name, args.c, np, args.verbose)
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_api.py", line 127, in preprocess
    preprocess_dataset(d, plans_identifier, configurations, num_processes, verbose)
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_api.py", line 96, in preprocess_dataset
    raise RuntimeError(
RuntimeError: The list provided with num_processes must either have len 1 or as many elements as there are configurations (see --help). Number of configurations: 3, length of num_processes: 2
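The message means the -np argument must contain either one value or exactly one value per configuration being preprocessed. Assuming the current nnUNetv2_plan_and_preprocess interface (flag names as in its --help; the values here are illustrative), either form should resolve it:

nnUNetv2_plan_and_preprocess -d 988 -np 4
nnUNetv2_plan_and_preprocess -d 988 -c 2d 3d_fullres 3d_lowres -np 8 4 8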
On Wed, May 3, 2023, Fabian Isensee wrote:
It works for me:
- download task2 from here: https://drive.google.com/drive/folders/1HqEgzS8BV2c7xYNrZdEAnrHk7osJJ--2
- extract the tar archive
- nnUNetv2_convert_MSD_dataset -i Task02_Heart -overwrite_id 988 (I have to use a different dataset ID)
- nnUNetv2_plan_and_preprocess -d 988
done
-
I got the above error too (assert len(training_identifiers) == expected_num_training, 'Did not find the expected number of training cases'). In my case I just changed the file ending from .mha to .nii.gz (see the sketch below) and ran Dataset600_Abdomen_Pelvic.py again, and it works like a charm.
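For reference, a minimal SimpleITK sketch of that .mha to .nii.gz conversion (the folder path is illustrative, not from the original script):

import os
import SimpleITK as sitk

folder = '/path/to/labelsTr'  # repeat for imagesTr as needed
for name in os.listdir(folder):
    if name.endswith('.mha'):
        img = sitk.ReadImage(os.path.join(folder, name))
        # Rewrite the same image with the file ending nnU-Net expects.
        sitk.WriteImage(img, os.path.join(folder, name[:-len('.mha')] + '.nii.gz'))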
-
nnUNetv2_plan_and_preprocess -d 002 --verify_dataset_integrity
Got this error:
Fingerprint extraction...
Dataset002_Heart
dataset directory: /nnUNet/dataset/nnUNet_raw/Dataset002_Heart
Traceback (most recent call last):
  File "/nnUNet/nnUNetv2_plan_and_preprocess", line 33, in <module>
    sys.exit(load_entry_point('nnunetv2', 'console_scripts', 'nnUNetv2_plan_and_preprocess')())
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_entrypoints.py", line 173, in plan_and_preprocess_entry
    extract_fingerprints(args.d, args.fpe, args.npfp, args.verify_dataset_integrity, args.clean, args.verbose)
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_api.py", line 46, in extract_fingerprints
    extract_fingerprint_dataset(d, fingerprint_extractor_class, num_processes, check_dataset_integrity, clean,
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/plan_and_preprocess_api.py", line 29, in extract_fingerprint_dataset
    verify_dataset_integrity(join(nnUNet_raw, dataset_name), num_processes)
  File "/nnUNet/nnUNet/nnunetv2/experiment_planning/verify_dataset_integrity.py", line 158, in verify_dataset_integrity
    assert len(training_identifiers) == expected_num_training, 'Did not find the expected number of training cases '
AssertionError: Did not find the expected number of training cases (20). Found 1 instead.
Examples: ['l']
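The Examples: ['l'] output hints at the cause. If verify_dataset_integrity derives each case identifier by stripping '_0000' plus the file ending from the imagesTr filenames (my reading of the v2 naming convention, not confirmed in this thread), then files saved without the channel suffix all collapse to the same single character, which would also explain why only one identifier was found:

file_ending = '.nii.gz'
# Correct v2 naming keeps the full identifier:
print('la_003_0000.nii.gz'[:-len('_0000' + file_ending)])  # -> 'la_003'
# Without the '_0000' channel suffix the same stripping leaves one character:
print('la_003.nii.gz'[:-len('_0000' + file_ending)])       # -> 'l'

If that is what happened here, renaming the imagesTr files to la_XXX_0000.nii.gz should fix it.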