Implement gradient for QR decomposition #1099

Open
jessegrabowski opened this issue Nov 22, 2024 · 0 comments
Comments

@jessegrabowski (Member) commented Nov 22, 2024

Description

QR is one of the few remaining linalg Ops that is missing a gradient. JAX code for the jvp is here, which also includes this derivation. This paper also claims to derive the gradients for QR, but I find it unreadable.
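For reference, the reverse-mode gradient (VJP) corresponding to that derivation can be sketched in plain NumPy. This is an illustration of the formula, not PyTensor code (the actual Op would go through `L_op`); `qr_vjp` is a hypothetical name, and the sketch assumes tall or square full-rank `A` (m ≥ n) with the reduced decomposition:

```python
import numpy as np

def qr_vjp(A, Q_bar, R_bar):
    """VJP of the reduced QR decomposition A = QR, for full-rank A with m >= n.

    Uses the standard formula (the same one JAX's derivation arrives at):

        M     = R R_bar^T - Q_bar^T Q
        A_bar = (Q_bar + Q copyltu(M)) R^{-T}

    where copyltu(M) copies the lower triangle of M (including the
    diagonal) onto the upper triangle.
    """
    Q, R = np.linalg.qr(A)                   # reduced mode: Q (m, n), R (n, n)
    M = R @ R_bar.T - Q_bar.T @ Q
    copyltu = np.tril(M) + np.tril(M, -1).T  # symmetrize from the lower triangle
    # X @ R^{-T} computed via a triangular-style solve rather than forming inv(R)
    return np.linalg.solve(R, (Q_bar + Q @ copyltu).T).T
```

A quick way to sanity-check any implementation is to compare it against central finite differences of a scalar function of `Q` and `R`.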

Relatedly, though perhaps worthy of a separate issue, this paper derives gradients for the LQ decomposition, $A = LQ$, where $L$ is lower triangular and $Q$ is orthonormal ($Q^T Q = I$). Compare this to QR, which gives $A = QR$ with $Q$ again orthonormal but $R$ upper triangular, and you can see why I mention it in this issue. It wouldn't be hard to offer LQ as well.
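In fact the LQ gradient need not be derived from scratch: $A = LQ$ is equivalent to $A^T = \tilde{Q}\tilde{R}$ with $L = \tilde{R}^T$ and $Q = \tilde{Q}^T$, so its VJP reduces to the QR one applied to $A^T$ with the cotangents transposed along with the factors. A self-contained NumPy sketch (again a hypothetical `lq_vjp`, not PyTensor API; assumes wide or square full-rank `A`, m ≤ n):

```python
import numpy as np

def lq_vjp(A, L_bar, Q_bar):
    """VJP of the LQ decomposition A = L Q via the QR decomposition of A.T.

    A = L Q  <=>  A.T = Qr R  with  L = R.T  and  Q = Qr.T,
    so the LQ cotangent is the QR cotangent of A.T with the factor
    cotangents swapped and transposed. Assumes full-rank A with m <= n.
    """
    Qr, R = np.linalg.qr(A.T)           # reduced QR of A.T: Qr (n, m), R (m, m)
    R_bar, Qr_bar = L_bar.T, Q_bar.T    # cotangents transpose with the factors
    M = R @ R_bar.T - Qr_bar.T @ Qr
    copyltu = np.tril(M) + np.tril(M, -1).T
    At_bar = np.linalg.solve(R, (Qr_bar + Qr @ copyltu).T).T
    return At_bar.T
```

So an LQ Op could share essentially all of its gradient code with QR, which is one more reason to treat them together.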
