
FIX: Convolve zero-duration (impulse) events when variable contains multiple events #645

Merged
merged 6 commits into bids-standard:master on Aug 12, 2020

Conversation

effigies
Collaborator

@effigies effigies commented Aug 5, 2020

Fixing #643.

Starting with a test.

To bring some context over, convolving a single, zero-duration event produces expected results. If a second event is present, zero duration or otherwise, then the logic currently uses the minimum duration to estimate a reasonable oversampling rate. When duration is 0, the rate is infinite, and problems are had.
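The failure mode can be sketched in a few lines. This is a simplified illustration, not the actual pybids code; `naive_oversampling` and its `target_rate` parameter are made up for this example:

```python
import numpy as np

def naive_oversampling(durations, target_rate=50.0):
    # Simplified stand-in for the current logic: derive the oversampling
    # rate from the shortest event duration.
    min_duration = np.min(durations)
    with np.errstate(divide="ignore"):
        return target_rate / min_duration  # inf when min_duration == 0

# A lone impulse event is handled elsewhere, but once a second event is
# present this path is taken, a zero duration makes the rate infinite,
# and the subsequent resampling step cannot proceed.
```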

@codecov

codecov bot commented Aug 5, 2020

Codecov Report

Merging #645 into master will increase coverage by 0.28%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master     #645      +/-   ##
==========================================
+ Coverage   83.81%   84.10%   +0.28%     
==========================================
  Files          27       27              
  Lines        3331     3335       +4     
  Branches      843      842       -1     
==========================================
+ Hits         2792     2805      +13     
+ Misses        346      341       -5     
+ Partials      193      189       -4     
Flag         Coverage Δ
#unittests   84.10% <100.00%> (+0.28%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                              Coverage Δ
bids/analysis/transformations/compute.py    91.05% <100.00%> (+2.26%) ⬆️
bids/layout/layout.py                       84.89% <0.00%> (+0.50%) ⬆️
bids/variables/variables.py                 89.43% <0.00%> (+0.81%) ⬆️
bids/analysis/hrf.py                        55.55% <0.00%> (+2.02%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 76172ab...2ac570d.

@effigies
Collaborator Author

@tyarkoni Just bumping this for review, when you get a chance.

@adelavega
Collaborator

Looks pretty good to me but what I'm trying to figure out (haven't tested it with data), is how much the behavior is changed for non-impulse events.

Here's the original PR: #411

As an aside, what's up with Travis?

@effigies
Collaborator Author

Will look at Travis tomorrow. I did see #411, but I'm not sure I understand the intent behind the specific oversampling rates that were originally chosen, so if you're able to reproduce that reasoning, I would appreciate it.

If not, I can update these values, and try to put in an explanation of why they are (now) what they are.

@effigies
Collaborator Author

What was the Travis issue you were seeing, btw? Was it just the test failures?

@adelavega
Collaborator

The Travis issue was just a failed build. Looks good now.

@effigies
Collaborator Author

@tyarkoni This is ready for review. Revisited the oversampling factor of 2 that was suggested in #411 (comment) and implemented in 24646c8. The idea here was to add a factor to avoid aliasing.

In the case of a dense time series, the frequency components we have access to are already capped at sampling_rate / 2 (the Nyquist frequency), so oversampling beyond what is needed to give the HRF a reasonable shape is unlikely to be helpful. There's an odd effect in that we sparsify dense variables before convolving, which gives them durations of 1/SR and onsets equal to the cumulative sum of durations. Our "minimum interval" is thus always 1/SR, so any safety factor we throw in will just pop right back out and do us no good in the process. As an aside, we may want to reconsider how we convolve dense time series, as sparsifying, upsampling, densifying, convolving and downsampling seems like an inefficient alternative to upsampling, convolving and downsampling.
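The more direct route mentioned in the aside (upsample, convolve, downsample, with no sparsify/densify round trip) might look roughly like the numpy sketch below. `convolve_dense` is a hypothetical name and the box `kernel` is a toy stand-in for an HRF; this is a sketch of the idea, not pybids code:

```python
import numpy as np

def convolve_dense(values, sr, oversampling=2, kernel=None):
    # Direct pipeline: upsample -> convolve -> downsample.
    # `kernel` should be sampled at sr * oversampling; a trivial box
    # kernel is used here in place of a real HRF.
    n = len(values)
    t = np.arange(n) / sr
    t_hi = np.arange(n * oversampling) / (sr * oversampling)
    hi = np.interp(t_hi, t, values)            # linear upsampling
    if kernel is None:
        kernel = np.ones(oversampling)         # toy box kernel
    conv = np.convolve(hi, kernel)[: len(hi)]  # convolve, trim the tail
    return conv[::oversampling]                # downsample back to sr
```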

For sparse variables, I have preserved the factor. It may be overkill in many cases, as 1 / (minimum interval) is an upper bound on the frequencies actually present in the data, but a factor of 2 is relatively cheap.
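Concretely, the guarded estimate for sparse variables might look like the following sketch. `safe_oversampling` is a hypothetical helper illustrating the approach described above, not the actual pybids implementation:

```python
import numpy as np

def safe_oversampling(onsets, durations, sampling_rate=50.0, factor=2.0):
    # Estimate the shortest interval from durations and inter-onset gaps,
    # ignoring zero-duration (impulse) events so the rate stays finite.
    intervals = np.concatenate([np.diff(np.sort(onsets)), durations])
    intervals = intervals[intervals > 0]
    if intervals.size == 0:
        # Only coincident impulses: fall back to a multiple of the target
        # sampling rate (an arbitrary but finite choice for this sketch).
        return factor * sampling_rate
    # 1 / min interval bounds the frequencies present; the factor of 2 is
    # the cheap anti-aliasing safety margin discussed above.
    return factor / intervals.min()
```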

@adelavega
Collaborator

LGTM. I'd rather err on the side of caution, but let's keep an eye on performance. It hasn't been much of an issue for me so far.

@effigies
Collaborator Author

Okay, let's do it.

@effigies effigies merged commit afa82b0 into bids-standard:master Aug 12, 2020
@effigies effigies deleted the fix/zero_duration_events branch November 19, 2020 22:38