
Automatic layer construction + initialization #70

Open
Ebanflo42 opened this issue Apr 4, 2024 · 1 comment
@Ebanflo42
Contributor

We should have utility functions for constructing dense/convolutional layers (and eventually more complex layers like LSTM or multihead attention). These should take a context, input node identifiers, and initialization instructions, and return node identifiers for both the layer output and its parameters.

A basic example of this is visible in the mnist_xla example.

Initializers should use XLA RNGs.
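A possible shape for such a constructor, sketched here in plain NumPy rather than the library's actual context/node-identifier API (all names below are hypothetical illustrations, not the real interface):

```python
import numpy as np

def dense_layer(rng, in_dim, out_dim):
    """Hypothetical dense-layer constructor.

    In the real library this would take a graph context and an input
    node id and return node ids for the output and the parameters;
    here we return raw arrays just to show the shapes and a typical
    Glorot-uniform initialization.
    """
    limit = np.sqrt(6.0 / (in_dim + out_dim))
    weights = rng.uniform(-limit, limit, size=(in_dim, out_dim))
    bias = np.zeros(out_dim)
    return weights, bias

def dense_forward(x, weights, bias):
    # The "layer output" node: x @ W + b
    return x @ weights + bias

# Usage: a 784 -> 128 layer, as in an MNIST-style model
rng = np.random.default_rng(0)
w, b = dense_layer(rng, 784, 128)
y = dense_forward(np.ones((32, 784)), w, b)
```

The key point is that construction and initialization happen in one call, with the RNG passed in explicitly so that (in the real implementation) it can be an XLA RNG rather than a host-side one.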

@BradenEverson BradenEverson self-assigned this Apr 4, 2024
@Ebanflo42
Contributor Author

relevant

As for construction, the main thing missing is convolutions, which should be easy to finish up. The other piece is incorporating XLA RNGs, which is a bit more math-heavy but still doable.
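For the convolutional case, the constructor would follow the same pattern as the dense one: allocate and initialize the kernel, then wire up the convolution. A minimal NumPy sketch with a naive "valid" convolution and He-normal initialization (again, hypothetical names standing in for the real context/node API):

```python
import numpy as np

def conv2d_layer(rng, in_ch, out_ch, k):
    """Hypothetical conv-layer constructor.

    He-normal initialization, a common choice for conv kernels:
    fan_in = in_ch * k * k.
    """
    fan_in = in_ch * k * k
    kernel = rng.normal(0.0, np.sqrt(2.0 / fan_in),
                        size=(out_ch, in_ch, k, k))
    bias = np.zeros(out_ch)
    return kernel, bias

def conv2d_valid(x, kernel, bias):
    """Naive 'valid' 2D cross-correlation.

    x: (in_ch, H, W) -> output: (out_ch, H - k + 1, W - k + 1).
    A real implementation would emit a single XLA convolution op.
    """
    out_ch, in_ch, k, _ = kernel.shape
    _, H, W = x.shape
    out = np.zeros((out_ch, H - k + 1, W - k + 1))
    for o in range(out_ch):
        for i in range(H - k + 1):
            for j in range(W - k + 1):
                out[o, i, j] = np.sum(x[:, i:i+k, j:j+k] * kernel[o]) + bias[o]
    return out
```

This is only a reference for shapes and initialization; the actual layer builder would return node identifiers for the convolution output and parameters instead of arrays.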
