Predictive Coding vs. HGF? #255
Comments
This probably warrants a deeper technical and conceptual investigation, and I hesitate to claim much expertise in predictive coding in general. That said, I would say that the generalized HGF can be seen as one instance of a predictive coding network, with some differences from the traditional types of predictive coding networks. Finally, pyhgf as a package is designed so that it should, in principle, be possible to implement other kinds of predictive coding networks than the HGF. In general, it would be exciting to compare these different variations of predictive coding properly. Thanks for writing!
Thanks for the clarification! From your explanation and the linked paper, the main distinction seems to be that HGFs are inherently dynamic, modeling time series as guided random walks with a predicted mean and variance. The hierarchical aspect then allows higher layers to model higher-order dynamics of the data. By contrast, predictive coding typically works on static data, indeed by minimizing an energy function, as you said. I can see a connection when static data is treated as an unchanging time series (each energy-minimization step would then correspond to an HGF update step). But I guess it's a superficial link, given the fundamentally different focus: predictive coding doesn't emphasize time dynamics, and HGFs are not meant for static data. Regardless, it would be cool to see a unifying framework for these two models! I wonder whether such a thing could exist and what it could tell us about how the brain might operate.
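To make the "unchanging time series" idea concrete, here is a purely illustrative toy sketch in Python (not pyhgf's actual API, and only a single Gaussian node without volatility parents; all names and values are made up): the same static observation is replayed at every "time step", so each precision-weighted update plays a role loosely analogous to one energy-minimization step on a static datum in a predictive coding network.

```python
# Toy sketch (not pyhgf's API): a static observation treated as a constant
# "time series", updated by repeated precision-weighted Gaussian steps.
# All parameter values are illustrative.

observation = 0.7            # a single static datum, replayed at every step
observation_precision = 1.0  # assumed known precision of the observation

mu = 0.0                     # belief mean (posterior expectation)
pi = 1.0                     # belief precision (inverse variance)

for step in range(20):
    # prediction error between the static datum and the current expectation
    delta = observation - mu
    # precision-weighted update, as in a single-level Gaussian filter
    pi = pi + observation_precision
    mu = mu + (observation_precision / pi) * delta
    print(f"step {step:2d}: mu={mu:.4f}, pi={pi:.4f}")
```

The belief mean converges toward the static observation while the precision grows, much like an iterative scheme settling into a fixed point, which is roughly where the two pictures seem to meet.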
(Sorry for the late response here!) We haven't shown this formally, especially because I'm not familiar enough with the classic predictive coding networks, but I think they might be special cases of the new version of the HGF. If the lambda parameter (in the article I sent before) is set to 0 in the generalized HGF, the Gaussian random walk turns into a stationary Gaussian distribution whose mean is determined by the node above (the value parent). It would be nice to compare this numerically or formally - I don't think it should be too hard - but we haven't had the time yet. Do let us know if this is something you'd like to do, and we'd be happy to collaborate on it :) In any case - the interest is well received :)
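Schematically, that special case could be written as follows (my own paraphrase of the coupling described above, not the exact notation of the paper; $f$ stands for whatever coupling function links a node $i$ to its value parent $\mathrm{pa}(i)$):

$$
x_i^{(k)} \sim \mathcal{N}\!\left(\lambda \, x_i^{(k-1)} + f\!\left(x_{\mathrm{pa}(i)}^{(k)}\right),\; \sigma_i^2\right)
$$

With $\lambda > 0$ the node performs a (drifting) Gaussian random walk around its previous state; with $\lambda = 0$ this reduces to

$$
x_i^{(k)} \sim \mathcal{N}\!\left(f\!\left(x_{\mathrm{pa}(i)}^{(k)}\right),\; \sigma_i^2\right),
$$

i.e. a stationary Gaussian whose mean is set entirely by the value parent, which is the regime that looks closest to a node in a classic predictive coding network.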
I found this repo while looking for a Predictive Coding library, inspired by this paper. However, the focus here seems to be more on Generalized Hierarchical Gaussian Filters (HGF).
Could you clarify how HGFs relate to Predictive Coding? Are HGFs a specific instance of Predictive Coding, or do they represent a broader framework?