
LearnableSqueezer #94

Open
wants to merge 18 commits into master
3 changes: 2 additions & 1 deletion Project.toml
@@ -1,11 +1,12 @@
name = "InvertibleNetworks"
uuid = "b7115f24-5f92-4794-81e8-23b0ddb121d3"
-authors = ["Philipp Witte <[email protected]>", "Ali Siahkoohi <[email protected]>", "Mathias Louboutin <[email protected]>", "Gabrio Rizzuti <g.rizzuti@umcutrecht.nl>", "Rafael Orozco <[email protected]>", "Felix J. herrmann <[email protected]>"]
+authors = ["Philipp Witte <[email protected]>", "Ali Siahkoohi <[email protected]>", "Mathias Louboutin <[email protected]>", "Gabrio Rizzuti <rizzuti[email protected]>", "Rafael Orozco <[email protected]>", "Felix J. herrmann <[email protected]>"]
version = "2.2.6"

[deps]
CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
+ExponentialUtilities = "d4d017d3-3776-5f7e-afef-a10c40355c18"
Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
JOLI = "bb331ad6-a1cf-11e9-23da-9bcb53c69f6f"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
2 changes: 1 addition & 1 deletion README.md
@@ -135,7 +135,7 @@ This package uses functions from [NNlib.jl](https://github.com/FluxML/NNlib.jl),

- Philipp Witte, Georgia Institute of Technology (now Microsoft)

-- Gabrio Rizzuti, Utrecht University
+- Gabrio Rizzuti, Georgia Institute of Technology (now Shearwater Geoservices)

- Mathias Louboutin, Georgia Institute of Technology

2 changes: 1 addition & 1 deletion docs/src/api.md
@@ -28,7 +28,7 @@ Pages = ["dimensionality_operations.jl"]
```@autodocs
Modules = [InvertibleNetworks]
Order = [:type]
-Filter = t -> t<:NeuralNetLayer
+Filter = t -> t<:InvertibleNetwork
```

## Networks
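The filter change above is the visible tip of this PR's refactor: layer structs throughout the codebase move off the old `NeuralNetLayer` supertype onto a two-level hierarchy. A minimal sketch of that hierarchy, inferred from the supertypes appearing in the diffs below (the actual definitions live in `src/utils/neuralnet.jl`, which this page does not show):

```julia
# Inferred from the diffs below -- the real definitions are not shown on this page.
abstract type NeuralNetwork end                       # any trainable block
abstract type InvertibleNetwork <: NeuralNetwork end  # blocks with an exact inverse

# Invertible layers (ActNorm, Conv1x1, the coupling layers, ...) now subtype
# InvertibleNetwork, while non-invertible building blocks (ResidualBlock,
# FluxBlock, ConditionalResidualBlock) subtype NeuralNetwork directly, so the
# docs filter `t -> t<:InvertibleNetwork` selects exactly the invertible types.
```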
5 changes: 4 additions & 1 deletion src/InvertibleNetworks.jl
@@ -9,6 +9,7 @@ using LinearAlgebra, Random
using Statistics, Wavelets
using JOLI
using NNlib, Flux, ChainRulesCore
+using ExponentialUtilities

# Overloads and reexports
import Base.size, Base.length, Base.getindex, Base.reverse, Base.reverse!, Base.getproperty
@@ -38,12 +39,13 @@ end

# Utils
include("utils/parameter.jl")
include("utils/neuralnet.jl")
include("utils/objective_functions.jl")
include("utils/dimensionality_operations.jl")
include("utils/activation_functions.jl")
include("utils/test_distributions.jl")
include("utils/neuralnet.jl")
include("utils/invertible_network_sequential.jl")

# AD rules
include("utils/chainrules.jl")

@@ -60,6 +62,7 @@ include("layers/invertible_layer_irim.jl")
include("layers/invertible_layer_glow.jl")
include("layers/invertible_layer_hyperbolic.jl")
include("layers/invertible_layer_hint.jl")
include("layers/learnable_squeezer.jl")

# Invertible network architectures
include("networks/invertible_network_hint_multiscale.jl")
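The new `include("layers/learnable_squeezer.jl")` is the layer this PR is named after. Its source is not shown on this page, but the new `ExponentialUtilities` dependency (fast matrix-exponential routines) suggests the layer's channel mixing is parameterized through the exponential of a skew-symmetric matrix, which is orthogonal and therefore invertible by construction. A toy sketch of that idea; every name below is hypothetical, not the PR's actual code:

```julia
using LinearAlgebra

# Toy "learnable squeezer": a 2x2 space-to-channel squeeze followed by a
# learned orthogonal channel mixing.
struct ToySqueezer
    A::Matrix{Float32}  # unconstrained parameter of size (4c, 4c)
end

# exp(A - A') is orthogonal because A - A' is skew-symmetric.
mixing(S::ToySqueezer) = exp(S.A - S.A')

function forward(S::ToySqueezer, X::Array{Float32,4})
    nx, ny, c, b = size(X)
    # squeeze 2x2 spatial patches into channels: (nx, ny, c, b) -> (nx÷2, ny÷2, 4c, b)
    Y = reshape(permutedims(reshape(X, 2, nx÷2, 2, ny÷2, c, b),
                            (2, 4, 1, 3, 5, 6)), nx÷2, ny÷2, 4c, b)
    Q  = mixing(S)
    Ym = reshape(Y, :, 4c, b)
    Z  = similar(Ym)
    for i in 1:b
        Z[:, :, i] = Ym[:, :, i] * Q'  # Q'Q == I, so this step is exactly invertible
    end
    return reshape(Z, nx÷2, ny÷2, 4c, b)
end
```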
2 changes: 1 addition & 1 deletion src/conditional_layers/conditional_layer_glow.jl
@@ -58,7 +58,7 @@ or

See also: [`Conv1x1`](@ref), [`ResidualBlock`](@ref), [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct ConditionalLayerGlow <: NeuralNetLayer
+struct ConditionalLayerGlow <: InvertibleNetwork
C::Conv1x1
RB::ResidualBlock
logdet::Bool
2 changes: 1 addition & 1 deletion src/conditional_layers/conditional_layer_hint.jl
@@ -51,7 +51,7 @@ export ConditionalLayerHINT, ConditionalLayerHINT3D

See also: [`CouplingLayerBasic`](@ref), [`ResidualBlock`](@ref), [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-mutable struct ConditionalLayerHINT <: NeuralNetLayer
+mutable struct ConditionalLayerHINT <: InvertibleNetwork
CL_X::CouplingLayerHINT
CL_Y::CouplingLayerHINT
CL_YX::CouplingLayerBasic
2 changes: 1 addition & 1 deletion src/conditional_layers/conditional_layer_residual_block.jl
@@ -46,7 +46,7 @@ or

See also: [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct ConditionalResidualBlock <: NeuralNetLayer
+struct ConditionalResidualBlock <: NeuralNetwork
W0::Parameter
W1::Parameter
W2::Parameter
2 changes: 1 addition & 1 deletion src/layers/invertible_layer_actnorm.jl
@@ -39,7 +39,7 @@ export ActNorm, reset!

See also: [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-mutable struct ActNorm <: NeuralNetLayer
+mutable struct ActNorm <: InvertibleNetwork
k::Integer
s::Parameter
b::Parameter
2 changes: 1 addition & 1 deletion src/layers/invertible_layer_basic.jl
@@ -58,7 +58,7 @@ or

See also: [`ResidualBlock`](@ref), [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-mutable struct CouplingLayerBasic <: NeuralNetLayer
+mutable struct CouplingLayerBasic <: InvertibleNetwork
RB::Union{ResidualBlock, FluxBlock}
logdet::Bool
activation::ActivationFunction
6 changes: 3 additions & 3 deletions src/layers/invertible_layer_conv1x1.jl
@@ -39,7 +39,7 @@ export Conv1x1

See also: [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct Conv1x1 <: NeuralNetLayer
+struct Conv1x1 <: InvertibleNetwork
k::Integer
v1::Parameter
v2::Parameter
@@ -159,8 +159,8 @@ function conv1x1_grad_v(X::AbstractArray{T, N}, ΔY::AbstractArray{T, N},
n_in, batchsize = size(X)[N-1:N]
prod_res = cuzeros(X, size(dV1, 1))
for i=1:batchsize
-Xi = -2f0*reshape(selectdim(X, N, i), :, n_in)
-ΔYi = reshape(selectdim(ΔY, N, i), :, n_in)
+Xi = -2*reshape(selectdim(X, N, i), :, n_in)
+ΔYi = 1*reshape(selectdim(ΔY, N, i), :, n_in) # 1* force conversion from ReshapedArray to array
broadcast!(+, dv1, dv1, mat_tens_i(prod_res, Xi, dV1, ΔYi))
broadcast!(+, dv2, dv2, mat_tens_i(prod_res, Xi, dV2, ΔYi))
broadcast!(+, dv3, dv3, mat_tens_i(prod_res, Xi, dV3, ΔYi))
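A note on the `1*` in the new line above: `reshape(selectdim(...))` returns a lazy `Base.ReshapedArray` wrapper rather than a plain array, and multiplying by one is a cheap way to materialize it before it reaches the in-place accumulation kernels. A standalone illustration of the same behavior, independent of this package:

```julia
X = rand(Float32, 4, 4, 3, 2)                # (nx, ny, n_in, batchsize)

Xi_view = reshape(selectdim(X, 4, 1), :, 3)  # lazy Base.ReshapedArray, no copy
Xi      = 1 * Xi_view                        # materializes a Matrix{Float32}

# copy(Xi_view) or collect(Xi_view) would also work; `1*` is terse and behaves
# the same for CPU and GPU arrays.
```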
2 changes: 1 addition & 1 deletion src/layers/invertible_layer_glow.jl
@@ -60,7 +60,7 @@ or

See also: [`Conv1x1`](@ref), [`ResidualBlock`](@ref), [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct CouplingLayerGlow <: NeuralNetLayer
+struct CouplingLayerGlow <: InvertibleNetwork
C::Conv1x1
RB::Union{ResidualBlock, FluxBlock}
logdet::Bool
6 changes: 3 additions & 3 deletions src/layers/invertible_layer_hint.jl
@@ -50,7 +50,7 @@ export CouplingLayerHINT, CouplingLayerHINT3D

See also: [`CouplingLayerBasic`](@ref), [`ResidualBlock`](@ref), [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-mutable struct CouplingLayerHINT <: NeuralNetLayer
+mutable struct CouplingLayerHINT <: InvertibleNetwork
CL::AbstractArray{CouplingLayerBasic, 1}
C::Union{Conv1x1, Nothing}
logdet::Bool
@@ -72,8 +72,8 @@ function get_depth(n_in)
end

# Constructor for given coupling layer and 1 x 1 convolution
-CouplingLayerHINT(CL::AbstractArray{CouplingLayerBasic, 1}, C::Union{Conv1x1, Nothing};
-    logdet=false, permute="none", activation::ActivationFunction=SigmoidLayer()) = CouplingLayerHINT(CL, C, logdet, permute, false)
+CouplingLayerHINT(CL::AbstractArray{CouplingLayerBasic, 1}, C::Union{Conv1x1, Nothing}; logdet=false, permute="none") =
+    CouplingLayerHINT(CL, C, logdet, permute, false)

# 2D Constructor from input dimensions
function CouplingLayerHINT(n_in::Int64, n_hidden::Int64; logdet=false, permute="none",
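The first constructor above simply drops its unused `activation` keyword; the dimension-based constructor at the end of the hunk keeps its keywords. For orientation, a hedged usage sketch of that 2D constructor (sizes are made up; the `H.forward`/`H.inverse` call style relies on the package's `getproperty` overload imported in `src/InvertibleNetworks.jl`):

```julia
using InvertibleNetworks

n_in, n_hidden = 8, 32
H = CouplingLayerHINT(n_in, n_hidden; logdet=true)

X = randn(Float32, 64, 64, n_in, 4)  # (nx, ny, channels, batchsize)
Y, logdet = H.forward(X)             # logdet=true makes forward also return log|det J|
X_ = H.inverse(Y)                    # recovers X up to floating-point error
```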
2 changes: 1 addition & 1 deletion src/layers/invertible_layer_hyperbolic.jl
@@ -53,7 +53,7 @@ Create an invertible hyperbolic coupling layer.

See also: [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct HyperbolicLayer <: NeuralNetLayer
+struct HyperbolicLayer <: InvertibleNetwork
W::Parameter
b::Parameter
α::Float32
2 changes: 1 addition & 1 deletion src/layers/invertible_layer_irim.jl
@@ -57,7 +57,7 @@ or

See also: [`Conv1x1`](@ref), [`ResidualBlock!`](@ref), [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct CouplingLayerIRIM <: NeuralNetLayer
+struct CouplingLayerIRIM <: InvertibleNetwork
C::Conv1x1
RB::Union{ResidualBlock, FluxBlock}
end
2 changes: 1 addition & 1 deletion src/layers/invertible_layer_template.jl
@@ -18,7 +18,7 @@ export AffineLayer
# parameters of our layer (in this case S and B) or any other building blocks that
# you want to use in your layer (for example 1x1 convolutions). However, in this
# case we only have parameters S and B.
-struct AffineLayer <: NeuralNetLayer
+struct AffineLayer <: InvertibleNetwork
S::Parameter # trainable parameters are defined as Parameters.
B::Parameter # both S and B have two fields: S.data and S.grad
logdet::Bool # bool to indicate whether you want to compute the logdet
2 changes: 1 addition & 1 deletion src/layers/layer_affine.jl
@@ -36,7 +36,7 @@ export AffineLayer

See also: [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct AffineLayer <: NeuralNetLayer
+struct AffineLayer <: InvertibleNetwork
s::Parameter
b::Parameter
logdet::Bool
2 changes: 1 addition & 1 deletion src/layers/layer_flux_block.jl
@@ -29,7 +29,7 @@ export FluxBlock

See also: [`Chain`](@ref), [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-mutable struct FluxBlock <: NeuralNetLayer
+mutable struct FluxBlock <: NeuralNetwork
model::Chain
params::Array{Parameter, 1}
end
2 changes: 1 addition & 1 deletion src/layers/layer_residual_block.jl
@@ -64,7 +64,7 @@ or

See also: [`get_params`](@ref), [`clear_grad!`](@ref)
"""
-struct ResidualBlock <: NeuralNetLayer
+struct ResidualBlock <: NeuralNetwork
W1::Parameter
W2::Parameter
W3::Parameter