
Design matrix has nvols + 1 rows after convolution #355

Closed
effigies opened this issue Jan 25, 2019 · 8 comments · Fixed by #357

@effigies (Collaborator)

For a dataset with 451 volumes, running Split() and then Convolve(model='spm') creates a variable collection for which get_design_matrix() returns an output 452 rows long.

We have a bizarre theory that an odd number of TRs is hitting a weird rounding issue, but in trying to create a minimal example, we've brought my computer to its knees running Convolve.

@tyarkoni I think this is probably your fault.

cc @yarikoptic @AdinaWagner
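
A sanity check along these lines should surface the extra row; the helper below is hypothetical (not part of pybids), with dm assumed to be the DataFrame returned by get_design_matrix():

import nibabel as nib

def check_design_rows(dm, bold_file):
    """Compare design-matrix rows to the number of volumes in the BOLD run.

    dm is assumed to be the DataFrame returned by get_design_matrix();
    this helper is illustrative only.
    """
    n_vols = nib.load(bold_file).shape[-1]   # number of volumes in the 4D image
    if len(dm) != n_vols:
        raise ValueError("design matrix has %d rows, expected %d"
                         % (len(dm), n_vols))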

@effigies (Collaborator, Author)

See #357 for a partial fix. The other issue here is that a slight increase in TR can cause get_design_matrix to return a design matrix with a different number of rows than the time series, even when sampling_rate='TR' (the default).

@effigies (Collaborator, Author)

Working on a minimal replication, but with RepetitionTime == 2.000001 and 451 volumes, something like SparseRunVariable.to_dense().resample(1 / RepetitionTime) produces a column 452 rows long.
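
For intuition, here is a sketch of the kind of floating-point rounding that could explain the extra row; the ceil-of-duration-times-rate formula is an assumption about how the output length gets computed, not the exact pybids code:

import math

TR = 2.000001
n_vols = 451

duration = n_vols * TR        # ~902.000451 s, already inexact as a binary float
sampling_rate = 1 / TR        # target rate when resampling back "to TR"

# Assumed (not pybids-exact) row count: round the product up to an integer.
n_rows = int(math.ceil(duration * sampling_rate))
print(duration * sampling_rate, n_rows)
# The product should be exactly 451, but float error can land a hair above it,
# in which case ceil() yields 452, i.e. nvols + 1.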

@yarikoptic (Collaborator)

452 != 451, but then why the "but", since it sounds like you replicated it? Or not?

@effigies (Collaborator, Author)

Haven't yet, just wanted to hit send on my note before I lost the tab.

@adswa (Contributor) commented Jan 27, 2019

I've been trying to move on to higher-level analysis over the weekend, and I'm just sharing a premature traceback from the current crash of the wf_loader Node.

Traceback (most recent call last):
  File "/home/adina/env/fitlins/local/lib/python3.5/site-packages/nipype/pipeline/plugins/multiproc.py", line 69, in run_node
    result['result'] = node.run(updatehash=updatehash)
  File "/home/adina/env/fitlins/local/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 473, in run
    result = self._run_interface(execute=True)
  File "/home/adina/env/fitlins/local/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 557, in _run_interface
    return self._run_command(execute)
  File "/home/adina/env/fitlins/local/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 637, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/home/adina/env/fitlins/local/lib/python3.5/site-packages/nipype/interfaces/base/core.py", line 369, in run
    runtime = self._run_interface(runtime)
  File "/home/adina/Repos/fitlins/fitlins/interfaces/bids.py", line 181, in _run_interface
    self._load_level1(runtime, analysis)
  File "/home/adina/Repos/fitlins/fitlins/interfaces/bids.py", line 195, in _load_level1
    for sparse, dense, ents in step.get_design_matrix():
  File "/home/adina/Repos/pybids/bids/analysis/analysis.py", line 263, in get_design_matrix
    for n in nodes]
  File "/home/adina/Repos/pybids/bids/analysis/analysis.py", line 263, in <listcomp>
    for n in nodes]
  File "/home/adina/Repos/pybids/bids/analysis/analysis.py", line 402, in get_design_matrix
    sampling_rate=sampling_rate, **kwargs)
  File "/home/adina/Repos/pybids/bids/variables/kollekshuns.py", line 345, in to_df
    in_place=False).values())
  File "/home/adina/Repos/pybids/bids/variables/kollekshuns.py", line 276, in resample
    kind=kind)
  File "/home/adina/Repos/pybids/bids/variables/variables.py", line 439, in resample
    var.resample(sampling_rate, True, kind)
  File "/home/adina/Repos/pybids/bids/variables/variables.py", line 454, in resample
    f = interp1d(x, self.values.values.ravel(), kind=kind)
  File "/usr/local/lib/python3.5/dist-packages/scipy/interpolate/interpolate.py", line 433, in __init__
    _Interpolator1D.__init__(self, x, y, axis=axis)
  File "/usr/local/lib/python3.5/dist-packages/scipy/interpolate/polyint.py", line 60, in __init__
    self._set_yi(yi, xi=xi, axis=axis)
  File "/usr/local/lib/python3.5/dist-packages/scipy/interpolate/polyint.py", line 125, in _set_yi
    raise ValueError("x and y arrays must be equal in length along "
ValueError: x and y arrays must be equal in length along interpolation axis.

To me it's pretty bizarre, and I hope to follow up soon with updates that shed more light on what is happening (unfortunately, the analyses are extremely slow). It seems that interpolation during the Convolve transformation, which we altered on Friday in #358, now fails because the onsets and amplitudes (at least I'm assuming it's those columns) passed to the function are of unequal length. (Previously, it failed a bit earlier due to a dimension mismatch, which is fixed by #358 together with changing a TR of 2.0000-something in the nifti files and .jsons to 2.0.)

What I've been breaking my head over for the past two days is that it does not fail for every subject and run. @effigies and I ran it successfully on Friday for two runs of one subject, and I've been able to run a full level 1 model without the transformations blowing up for certain subjects, but for others it breaks.
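
For reference, the ValueError at the bottom of the traceback is just scipy's interp1d refusing inputs of unequal length; a toy off-by-one between the time grid and the values (array sizes chosen for illustration, not taken from my data) reproduces the same message:

import numpy as np
from scipy.interpolate import interp1d

x = np.arange(452)        # time grid with one extra sample (nvols + 1)
y = np.random.rand(451)   # values for 451 volumes

f = interp1d(x, y, kind='linear')
# ValueError: x and y arrays must be equal in length along interpolation axis.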

I have double-checked that all TRs are consistent and changed the TRs in the nifti headers from 2.00000-something to 2.0. I keep looking into the specifics of my files to find the culprit responsible for this weird behavior.
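
In case it helps anyone else auditing their data, this is a sketch of the consistency check I mean; the paths are placeholders, and it assumes the TR shows up as the fourth zoom of the 4D NIfTI header and as RepetitionTime in the JSON sidecar:

import json
import nibabel as nib

bold_file = 'sub-01_task-xyz_bold.nii.gz'   # placeholder path
sidecar_file = 'sub-01_task-xyz_bold.json'  # placeholder path

header_tr = nib.load(bold_file).header.get_zooms()[3]   # TR from pixdim[4]

with open(sidecar_file) as fp:
    sidecar_tr = json.load(fp)['RepetitionTime']        # TR from the sidecar

print(header_tr, sidecar_tr)
# A mismatch, or a value like 2.000001 where 2.0 is expected, is the kind of
# inconsistency suspected here.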

Tbh I bet this is something extremely specific to my directory, and the likeliest explanation is that I've messed something up, so I don't think this comment of mine requires (or even enables) any action. Sorry for this intermediate mess.

@effigies (Collaborator, Author) commented Jan 28, 2019

@AdinaWagner Is there a path on smaug where I can fetch this dataset?

@adswa (Contributor) commented Jan 28, 2019

@effigies Sorry, I missed that! It's on hydra; do you have an account there by any chance?

@effigies (Collaborator, Author)

I can't log in with my smaug credentials, so I assume not.
