Deconvolution problems #157
The last change to spike deconvolution I remember was more than 5 months ago, and it was a change in the running baseline correction. If it's not that, have you checked that your timescale and sampling rate are the same? By the way, the new version looks better to me; the old one seems to have been losing some smaller transients, which is exactly what you'd expect if the baselining was not good. The Python version also has running baseline correction, but some of the parameters might be treated differently; for example, the sampling rate is per plane, not cumulative. Other than that, I think the baselining might be slightly different; I could double-check. It would be good to figure this out, as we'd like the Python and Matlab versions to be as similar as possible.
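For concreteness, a minimal sketch of the per-plane vs. cumulative distinction (the variable names are placeholders for illustration, not actual ops fields):

```matlab
% Illustrative sketch only: converting a cumulative (volume) frame rate to the
% per-plane rate the Python version expects. Variable names are placeholders.
fsCumulative = 30;                        % frames per second summed over all planes
nPlanes      = 4;                         % number of imaging planes
fsPerPlane   = fsCumulative / nPlanes;    % per-plane rate, here 7.5 Hz
```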
Thanks for the fast reply! Timescale / sampling rate are the same, yes. I will do some more tests and hope to get back to you soon.
By the way, I get quite a few warnings during baseline / coefficient estimation, pointing to Suite2P/SpikeDetection/wrapperDECONV.m, line 79 (commit 73456d2).
(This has not changed; it has happened ever since I started using suite2p.)
Sorry, I didn't see this last one. Could you double-check your neuropil subtraction coefficients? We have not yet implemented an adaptive coefficient per neuron in the Python version (it is fixed at 0.7). The warnings could mean that some neurons get poor estimates of the coefficient.
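For reference, a minimal sketch of what I mean (illustrative variable names, not the pipeline's actual code):

```matlab
% Fixed-coefficient neuropil subtraction, as in the Python version (fixed at 0.7).
% F and Fneu are assumed to be [nROIs x nFrames] ROI and neuropil traces.
coef = 0.7;
Fsub = F - coef * Fneu;

% With per-neuron coefficients (Matlab version), values far outside a plausible
% range are worth inspecting. coefNeu is a hypothetical [nROIs x 1] vector of
% estimated coefficients; the bounds below are only an illustrative sanity check.
suspect = find(coefNeu < 0.3 | coefNeu > 1.2);
```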
Hello! I have a problem with the output from deconvolution (OASIS). I noticed in recent analysis output that my spike matrices were much denser than the ones I analyzed about half a year ago. Since then I have updated both the OASIS Matlab package and the suite2p code. I am thresholding the deconvolution output with
mean + 2 * standard deviation
for every cell, which seems to give stable results (across sessions / animals), meaning results that look reasonable "by eye". Now the same threshold does not filter the output adequately. I am showing an example of 4 different sessions, analyzed with old and new code, below. The problem is that even if I set the thresholds for spikes higher, I do not seem to be able to clean up the deconvolution results (I end up discarding spikes that belong to obvious transients).
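For clarity, this is roughly what my thresholding step looks like (a sketch; sp is assumed to be my cells-by-frames deconvolved spike matrix):

```matlab
% Per-cell thresholding at mean + 2 * standard deviation.
% sp is assumed to be [nCells x nFrames] deconvolved spike amplitudes.
thr     = mean(sp, 2) + 2 * std(sp, 0, 2);   % one threshold per cell
spClean = sp .* (sp > thr);                  % zero out sub-threshold events
```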
I tried to revert the OASIS package to the code version I used about 5 months ago. With all suite2p parameters the same (no differences in "ops"), there are still visible differences (OLD: old analysis result; NEW: the same cell, with the old OASIS package and up-to-date suite2p code).
There are also apparent differences in the neuropil-corrected traces in the examples above. I also tried running the suite2p Python version, which, with the thresholds I usually use, leads to good results! Since my pipeline still depends on the Matlab version for now, I would appreciate any input on what might be going on.