FIX: Convolve zero-duration (impulse) events when variable contains multiple events #645
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master     #645      +/-   ##
==========================================
+ Coverage   83.81%   84.10%   +0.28%
==========================================
  Files          27       27
  Lines        3331     3335       +4
  Branches      843      842       -1
==========================================
+ Hits         2792     2805      +13
+ Misses        346      341       -5
+ Partials      193      189       -4
@tyarkoni Just bumping this for review, when you get a chance.
Looks pretty good to me, but what I'm trying to figure out (I haven't tested it with data) is how much the behavior changes for non-impulse events. Here's the original PR: #411. As an aside, what's up with Travis?
Will look at Travis tomorrow. I did see #411, but I'm not sure I understand the reasoning that led to the specific oversampling rates originally, so if you're able to reconstruct that logic, I would appreciate it. If not, I can update these values and try to put in an explanation of why they are (now) what they are.
What was the Travis issue you were seeing, btw? Was it just the test failures?
The Travis issue was just a failed build. Looks good now.
@tyarkoni This is ready for review. I revisited the oversampling factor of 2 that was suggested in #411 (comment) and implemented in 24646c8. The idea there was to add a safety factor to avoid aliasing. In the case of a dense time series, the frequency components we have access to are already limited to sampling_rate / 2, so oversampling beyond what is needed to give the HRF a reasonable shape is unlikely to be helpful. There's also an odd effect: we sparsify dense variables before convolving, which means they have durations of 1/SR and onsets given by the cumulative sum of durations. Our "minimum interval" is thus always 1/SR, so any safety factor we throw in will just pop right back out and do us no good in the process. As an aside, we may want to reconsider how we convolve dense time series, as sparsifying, upsampling, densifying, convolving, and downsampling seems like an inefficient alternative to upsampling, convolving, and downsampling. For sparse variables, I have preserved the factor. It may be overkill in many cases, as
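The cancellation described above for the dense path can be checked with a few lines of arithmetic. This is an illustrative sketch only: the variable names and the 50 Hz rate are made up for the example, not taken from the pybids codebase.

```python
sampling_rate = 50.0                  # Hz; illustrative value only
min_interval = 1.0 / sampling_rate    # every duration of a sparsified dense variable is 1/SR
safety = 2.0                          # hypothetical anti-aliasing safety factor

# Target rate needed to resolve the shortest event, with the safety factor applied:
target_rate = safety / min_interval            # == safety * sampling_rate
oversampling = target_rate / sampling_rate     # == safety, regardless of SR

# After convolution the series is downsampled back to sampling_rate, so the
# extra resolution bought by `safety` is discarded again on the way out.
print(oversampling, target_rate)
```

In other words, because the minimum interval is pinned to 1/SR, the safety factor determines the oversampling directly and is then undone by the final downsampling step.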
LGTM. I'd rather err on the side of caution, but let's keep an eye on performance. It hasn't been much of an issue for me so far.
Okay, let's do it. |
Fixing #643.
Starting with a test.
To bring some context over: convolving a single zero-duration event produces the expected result. If a second event is present, zero-duration or otherwise, the logic currently uses the minimum duration to estimate a reasonable oversampling rate. When the minimum duration is 0, the derived rate is infinite and the convolution fails.
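One way to guard against this is to exclude impulse (zero-duration) events when estimating the minimum interval, and fall back to a base factor when every event is an impulse. The sketch below is illustrative only; `pick_oversampling` and its defaults are hypothetical names and values, not the pybids API.

```python
import math

def pick_oversampling(durations, sampling_rate, safety=2, cap=50):
    """Choose an oversampling factor that is robust to zero-duration events.

    Hypothetical sketch: signature and defaults are illustrative.
    """
    # Ignore impulse events; otherwise min(durations) == 0 and the
    # derived rate safety / (0 * sampling_rate) is infinite.
    nonzero = [d for d in durations if d > 0]
    if not nonzero:
        # All events are impulses: no duration to resolve, use the base factor.
        return safety
    min_dur = min(nonzero)
    # Oversample enough to place `safety` samples in the shortest event,
    # capped to keep the upsampled grid from exploding.
    needed = math.ceil(safety / (min_dur * sampling_rate))
    return max(1, min(needed, cap))
```

With this guard, a variable containing a mix of impulse and non-impulse events derives its rate from the shortest non-zero duration, and an all-impulse variable degrades gracefully instead of dividing by zero.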