-
Is this a high-dimensional measurement, or a very precise measurement?
Those are the two main reasons for hitting very low ESS. In the latter
case, one can ask whether the measurement model implies more precision than
the actual system justifies. If these problems are unavoidable for your
system, a regular particle filter will suffer from so-called "particle
depletion", which arises when few particles are consistent with the data. In
that case, one can consider using a guided particle filter, e.g.,
Park, J., and Ionides, E. L. (2020). A guided intermediate resampling
particle filter for inference on high dimensional systems. *Statistics and
Computing* 30: 1497–1522.
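To see the depletion mechanism concretely, here is a toy base-R sketch (not pomp code; the numbers are made up for illustration) showing how the ESS collapses as the measurement model becomes more precise relative to the spread of the particles:

```r
## Toy illustration: an overly precise measurement model collapses the ESS.
set.seed(1)
n_particles <- 2000
particles   <- rnorm(n_particles, mean = 0, sd = 1)   # prior spread of the particles
y_obs       <- 0.5                                    # a single observation

ess_for_sd <- function(obs_sd) {
  ## unnormalized weights: likelihood of the observation under each particle
  w <- dnorm(y_obs, mean = particles, sd = obs_sd)
  w <- w / sum(w)                 # normalize
  1 / sum(w^2)                    # ESS = 1 / sum of squared normalized weights
}

sapply(c(1, 0.1, 0.01, 0.001), ess_for_sd)
## ESS drops toward 1 as the measurement sd shrinks relative to the spread
## of the particles, i.e., as fewer particles are consistent with the data.
```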
-
Yes, the particle filter breaks down surprisingly quickly as the dimension p
grows. It sounds like you might need spatPomp
https://kidusasfaw.github.io/spatPomp/
Possibly, if there is a meaningful way to decouple your system, you can
make your life a bit easier by fitting it into a panel structure and using
panelPomp
https://github.com/cbreto/panelPomp
We have been having success with the block particle filter recently, e.g.,
https://kidusasfaw.github.io/spatPomp/vignettes/ibpf.pdf
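As a rough illustration of the block particle filter idea (a toy base-R sketch only, not the spatPomp implementation; the setup with U units and local Gaussian measurements is invented for illustration): weighting particles by the product of all unit likelihoods degenerates, while resampling each unit (block) with only its own local likelihood keeps the per-block ESS healthy.

```r
set.seed(2)
U  <- 20       # number of spatial units
Np <- 1000     # number of particles
X  <- matrix(rnorm(U * Np), U, Np)   # particle states, one row per unit
y  <- rnorm(U, sd = 0.3)             # one observation per unit

## local (per-unit) likelihood of each observation under each particle
loc_lik <- t(sapply(seq_len(U), function(u) dnorm(y[u], mean = X[u, ], sd = 0.3)))

ess <- function(w) { w <- w / sum(w); 1 / sum(w^2) }

## global weighting (ordinary particle filter): product over all units
ess(apply(loc_lik, 2, prod))        # collapses toward 1 as U grows

## block weighting: each unit keeps only its own local weights
summary(apply(loc_lik, 1, ess))     # per-unit ESS stays healthy

## resample each block independently with its local weights
X_resampled <- X
for (u in seq_len(U)) {
  idx <- sample.int(Np, Np, replace = TRUE, prob = loc_lik[u, ])
  X_resampled[u, ] <- X[u, idx]
}
```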
If your model is sufficiently close to linear and Gaussian, an ensemble
Kalman filter could be the way to go. These things are model-dependent,
unlike in the low-dimensional case, where the basic particle filter is
widely applicable.
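For the ensemble Kalman filter, a minimal stochastic (perturbed-observation) update step looks like the sketch below. This is plain base R for illustration, not pomp's enkf; the dimensions, H, and R are made up. The point is that the data enter through ensemble means and covariances rather than particle weights, so there is nothing to collapse:

```r
set.seed(3)
d  <- 50        # state dimension
N  <- 100       # ensemble size
H  <- diag(d)   # observe every state component directly (assumed here)
R  <- diag(0.5^2, d)                       # measurement error covariance

X  <- matrix(rnorm(d * N, sd = 2), d, N)   # forecast ensemble (one column per member)
y  <- rnorm(d)                             # observation vector

P  <- cov(t(X))                            # ensemble (forecast) covariance, d x d
K  <- P %*% t(H) %*% solve(H %*% P %*% t(H) + R)   # Kalman gain

## update every member against a perturbed copy of the observation
E  <- matrix(rnorm(d * N, sd = 0.5), d, N) # perturbations with covariance R
Xa <- X + K %*% (y + E - H %*% X)          # analysis ensemble

rowMeans(Xa)   # analysis mean, pulled toward y component-wise
```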
-
I am wondering if anyone else has encountered this general problem and resolved it. When I run mif2 or pfilter on my real dataset I get log likelihoods that are reasonable (not -Inf) and no filter failure warnings, but the effective sample size (ESS) at each time point is 1. I tried increasing the number of particles (Np) from Np=2000 to 10000 but ESS was still very low (<2).
I also generated simulated datasets of 2000 time points from my model and ran pfilter with the true parameters. ESS had slightly higher values (between 1 and 6) but was still mostly 1s.
ESS at time t is calculated as the inverse of the sum of the squared normalized weights over the particles i, ESS_t = 1 / sum_i(w_{i,t}^2). If a single weight dominates and the rest are negligible, ESS approaches 1. Is there a way to extract the number of particle rejections from a mif2 or pfilter object?
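For concreteness, this is the definition I am using, as a small base-R check (illustrative only, not code from pomp):

```r
ess <- function(w) {
  w <- w / sum(w)   # normalize the weights
  1 / sum(w^2)      # equals Np when all weights are equal, 1 when one weight dominates
}

ess(rep(1, 1000))           # 1000: every particle equally consistent with the data
ess(c(1, rep(1e-8, 999)))   # ~1: a single particle carries essentially all the weight
```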
Such a small ESS indicates, for example, that the quality of the estimate using Np=1000 particles is about the same as if we had used only ESS direct samples from the target distribution. So ESS=1 is very bad. Is that necessarily the case here?
Thank you, I appreciate any insight.