Neural representation similarities #683
Will you consider adding to this library some similarity measures between neural network representations, such as CCA or CKA?

Comments
Can you provide links to the relevant papers or code? I'm wondering if it's outside the scope of this library, or if there is an existing library that already serves this purpose well.
CKA and CCA similarities have been implemented by jayroxis and moskomule. The second link is a library with other neural network representation similarities; I'm not sure it is still maintained, but it has tests and a PyTorch implementation that a future implementation here could be compared against in its tests.
Does it make sense to apply CKA and CCA to embeddings?
I think it depends on the task. They may be useful in some NAS applications, or in other models that could benefit from these similarities.
I mean, can they be used as a drop-in replacement for any existing losses, like the contrastive loss?
Probably not. For CKA, one takes the data matrix X (num_examples x num_features), computes a Gram matrix from it (the linked implementations use linear or RBF kernels, but I think any other similarity between embeddings could be adopted), and then compares it with the Gram matrix computed from another representation of the same examples (e.g. another layer or another model). I think it qualifies best as a network comparison method rather than a similarity measure in the sense intended by this library.
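For reference, here is a minimal sketch of linear CKA along those lines in PyTorch. The function names (`linear_cka`, `center_gram`) are illustrative only and not part of this library or the linked repositories; it assumes the Gram-matrix formulation described above, with a linear kernel.

```python
# Minimal sketch of linear CKA (centered kernel alignment) in PyTorch.
# Assumption: both inputs are activations of the SAME batch of examples,
# possibly from different layers or different networks.
import torch

def center_gram(gram):
    # Double-center the Gram matrix: K <- H K H, with H = I - (1/n) 11^T
    n = gram.size(0)
    unit = torch.ones(n, n, device=gram.device, dtype=gram.dtype) / n
    h = torch.eye(n, device=gram.device, dtype=gram.dtype) - unit
    return h @ gram @ h

def linear_cka(x, y):
    # x: (num_examples, num_features_1), y: (num_examples, num_features_2)
    k = center_gram(x @ x.t())  # linear-kernel Gram matrix of representation x
    l = center_gram(y @ y.t())  # linear-kernel Gram matrix of representation y
    # HSIC-style alignment, normalized so that linear_cka(x, x) == 1
    hsic = (k * l).sum()
    norm = torch.sqrt((k * k).sum() * (l * l).sum())
    return hsic / norm

if __name__ == "__main__":
    torch.manual_seed(0)
    acts_a = torch.randn(128, 512)  # e.g. activations of layer/model A
    acts_b = torch.randn(128, 256)  # e.g. activations of layer/model B
    print(linear_cka(acts_a, acts_b).item())
```

Note that the output is a scalar comparing two whole sets of representations, which is why it fits network comparison better than a per-pair similarity for a loss.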
Ok, let's leave it out for now, unless other people express interest.