
Softmax regression, weights going to infinity in deep HMEs #6

Open
AmazaspShumik opened this issue Jul 19, 2015 · 0 comments
In deep trees there are cases where the data reaching a node is completely separable. In that case softmax regression (the non-overparameterised version, with K-1 weight vectors) suffers from the same problem as logistic regression on separable data: the likelihood keeps increasing as the weights grow, so they diverge to infinity.
Should use the overparameterised version instead (at the cost of slower convergence, since the solution is non-unique for short trees).
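A minimal sketch of the divergence itself, using a hypothetical toy dataset (not code from this repo): on linearly separable data, unregularised gradient descent on the logistic/softmax negative log-likelihood never converges to a finite weight vector; the weight norm grows without bound.

```python
import numpy as np

# Toy linearly separable 1-D data: class 0 for x < 0, class 1 for x > 0.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(1)
norms = []
for step in range(20000):
    p = sigmoid(X @ w)                 # predicted P(y = 1)
    grad = X.T @ (p - y) / len(y)      # gradient of mean negative log-likelihood
    w -= 1.0 * grad                    # plain gradient step, no regularisation
    if step % 5000 == 0:
        norms.append(np.linalg.norm(w))

# norms is strictly increasing: the likelihood is maximised
# only in the limit ||w|| -> infinity.
print(norms)
```

Adding an L2 penalty (or, as suggested above, switching parameterisations combined with regularisation) makes the optimum finite again, since the penalty term dominates once the weights get large.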
