
Introspect values in failing checks #17

Open
yarikoptic opened this issue Jun 12, 2024 · 3 comments
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)

Comments

@yarikoptic (Contributor)

ATM

        [ERROR] Participant labels found in this dataset did not match the values in participant_id column
found in the participants.tsv file.
 (PARTICIPANT_ID_MISMATCH)

                ./participants.tsv
                        Evidence: schema.rules.rules.checks.dataset.ParticipantIDMismatch

                1 more files with the same issue

        Please visit https://neurostars.org/search?q=PARTICIPANT_ID_MISMATCH for existing conversations about this issue.

NB: formatting is a bit odd

Unfortunately it gives no specific information on which particular "labels found in this dataset" failed to match which particular "values in participant_id column".

Here in particular we had

(dev3) yoh@typhon:/data/yoh/1076_spacetop$ diff -Naur <(awk '/^sub-/{print $1;}' participants.tsv | sort) <(/bin/ls -1d sub-* | sort) | less
--- /dev/fd/63  2024-06-12 12:45:45.253486101 -0400
+++ /dev/fd/62  2024-06-12 12:45:45.257486117 -0400
@@ -115,3 +115,4 @@
 sub-0131
 sub-0132
 sub-0133
+sub-0147
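
For reference, the same manual check as a minimal Python sketch (plain stdlib over the dataset layout, not validator code; the root path is the one from the shell session above):

# Minimal sketch of the manual check above (plain stdlib, not validator
# code): compare participant_id values against sub-* directories on disk.
from pathlib import Path
import csv

root = Path("/data/yoh/1076_spacetop")  # dataset root from the session above
with open(root / "participants.tsv", newline="") as f:
    tsv_ids = {row["participant_id"] for row in csv.DictReader(f, delimiter="\t")}
sub_dirs = {p.name for p in root.glob("sub-*") if p.is_dir()}

print("in participants.tsv only:", sorted(tsv_ids - sub_dirs))
print("on disk only:", sorted(sub_dirs - tsv_ids))
# for this dataset the second line prints: on disk only: ['sub-0147']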

attn @jungheejung

@effigies (Contributor)

This is a feature request for the validator to introspect the assertions that it's enforcing and show the variables that led to the failure. We would need something like pytest's magic, which would be a significant undertaking.

Not to say we shouldn't do it, but this is a consequence of pushing logic into the schema so that there is less custom code for each issue in the validator.

@effigies effigies added the help wanted Extra attention is needed label Jun 12, 2024
@yarikoptic (Contributor, Author) commented Jun 12, 2024

Could it be made explicit? E.g. (assuming that syntax for set handling is added):

diff --git a/src/schema/rules/checks/dataset.yaml b/src/schema/rules/checks/dataset.yaml
index 91704d32..434fe2fc 100644
--- a/src/schema/rules/checks/dataset.yaml
+++ b/src/schema/rules/checks/dataset.yaml
@@ -11,7 +11,8 @@ SubjectFolders:
   selectors:
     - path == '/dataset_description.json'
   checks:
-    - length(dataset.subjects.sub_dirs) > 0
+    - check: length(dataset.subjects.sub_dirs) > 0
+      hint: length(dataset.subjects.sub_dirs)
 
 # 49
 ParticipantIDMismatch:
@@ -24,7 +25,8 @@ ParticipantIDMismatch:
   selectors:
     - path == '/participants.tsv'
   checks:
-    - allequal(sorted(columns.participant_id), sorted(dataset.subjects.sub_dirs))
+    - check: allequal(set(columns.participant_id), set(dataset.subjects.sub_dirs))
+      hint: set(columns.participant_id).difference(dataset.subjects.sub_dirs)
 
 # 51
 PhenotypeSubjectsMissing:

Although indeed, while doing a few of those I started to think about pytest's approach ;-)

edit: 1st example I did was quite dumb ;-)
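
A hypothetical sketch of how a validator could surface such a hint field: evaluate it in the same context as the failed check and append the value to the error message. Here evaluate() is a trivial stand-in (plain Python eval over a dict context), not the schema expression language, and the context shapes are simplified from the diff above:

# Hypothetical sketch of rendering the proposed check/hint pair.
# evaluate() is a stand-in (Python eval over a dict context), not the
# real schema expression-language interpreter.
def evaluate(expr, context):
    return eval(expr, {"__builtins__": {}}, context)

def run_check(check, context):
    if evaluate(check["check"], context):
        return None  # assertion holds, nothing to report
    msg = "check failed: " + check["check"]
    if "hint" in check:
        msg += "\n  hint: %s = %r" % (check["hint"], evaluate(check["hint"], context))
    return msg

# Simplified shapes from the mismatch above:
context = {
    "participant_ids": {"sub-0131", "sub-0132", "sub-0133"},
    "sub_dirs": {"sub-0131", "sub-0132", "sub-0133", "sub-0147"},
}
print(run_check(
    {"check": "participant_ids == sub_dirs",
     "hint": "sub_dirs - participant_ids"},
    context,
))
# -> check failed: participant_ids == sub_dirs
#      hint: sub_dirs - participant_ids = {'sub-0147'}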

@effigies effigies changed the title deno validator: needs more hints for a user upon PARTICIPANT_ID_MISMATCH Introspect values in failing checks Jun 12, 2024
@effigies effigies added the enhancement New feature or request label Jun 12, 2024
@effigies (Contributor)

Whoever implements this gets to decide whether it makes more sense to modify the schema to allow hints to be specified there or to do introspection.

Personally, I don't want to do either. If we do this, I would probably do it in the Python validator, since I've already written a parser for the expression language. With an AST, we will be able to easily introspect every check.
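
To illustrate the AST route, a rough sketch using Python's own ast module (the validator's expression language has its own parser; this only shows the pytest-style introspection pattern of reporting each name and call that fed a failed check):

# Rough sketch of AST-based introspection: when a check fails, walk its
# AST and report the value of every name and call that fed into it.
# Python's ast module stands in for the expression-language parser.
import ast

def introspect_failure(expr, context):
    tree = ast.parse(expr, mode="eval")
    if eval(compile(tree, "<check>", "eval"), {}, dict(context)):
        return  # check passed
    print("failed:", expr)
    for node in ast.walk(tree):
        if isinstance(node, ast.Name) and node.id not in context:
            continue  # skip builtins like sorted()
        if isinstance(node, (ast.Name, ast.Call)):
            frag = ast.unparse(node)  # Python >= 3.9
            try:
                value = eval(frag, {}, dict(context))
            except Exception:
                continue
            print("  %s = %r" % (frag, value))

introspect_failure(
    "sorted(participant_ids) == sorted(sub_dirs)",
    {"participant_ids": ["sub-0131"], "sub_dirs": ["sub-0131", "sub-0147"]},
)
# failed: sorted(participant_ids) == sorted(sub_dirs)
#   sorted(participant_ids) = ['sub-0131']
#   sorted(sub_dirs) = ['sub-0131', 'sub-0147']
#   participant_ids = ['sub-0131']
#   sub_dirs = ['sub-0131', 'sub-0147']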

@effigies effigies transferred this issue from bids-standard/legacy-validator Nov 4, 2024