Right now, the Flux and Lux layers are just thin wrappers around the forward-pass implementations that live in GNNlib.jl. While it is nice to share the code, I think the price we pay in code readability is too high.
Maybe we should just have duplicated implementations in GraphNeuralNetworks.jl and GNNLux.jl.
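For illustration, here is a minimal sketch of the current split versus the proposed duplication. The names `GNNlibSketch`, `my_conv`, and `MyConv` are made up for the example and are not the real GNNlib.jl / GraphNeuralNetworks.jl API; only `GNNGraphs`, `rand_graph`, and `adjacency_matrix` are actual exports.

```julia
using Flux, GNNGraphs

# GNNlib.jl side (sketched): a framework-agnostic forward pass, written once.
module GNNlibSketch
    using GNNGraphs
    # Hypothetical stand-in for the shared forward-pass helpers in GNNlib.jl:
    my_conv(weight, g::GNNGraph, x::AbstractMatrix) =
        weight * (x * adjacency_matrix(g))   # simple sum-over-neighbors aggregation
end

# GraphNeuralNetworks.jl side: the Flux layer is only a thin wrapper.
struct MyConv{W}
    weight::W
end
Flux.@layer MyConv
(l::MyConv)(g::GNNGraph, x) = GNNlibSketch.my_conv(l.weight, g, x)

# What this issue proposes instead: inline the forward pass in the Flux layer
# (and likewise in the GNNLux.jl layer), accepting the duplication:
#
# (l::MyConv)(g::GNNGraph, x) = l.weight * (x * adjacency_matrix(g))

# Usage sketch
g = rand_graph(10, 30)                # 10 nodes, 30 edges
x = rand(Float32, 3, 10)              # 3 features per node
layer = MyConv(rand(Float32, 4, 3))
y = layer(g, x)                       # 4 × 10 output
```

The trade-off is exactly as stated above: the wrapper version keeps one copy of the math shared by Flux and Lux, while the duplicated version lets each package's layer be read top to bottom without jumping into GNNlib.jl.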
CarloLucibello changed the title from "duplicate layer's implementations in GNN.jl and GNNLux.jl" to "opt for duplicate layer's implementations in GNN.jl and GNNLux.jl" on Dec 17, 2024.