
[Question] Train a multi-input multi-output model with coregionalization. #63

Open
david-vicente opened this issue Sep 24, 2020 · 3 comments


david-vicente commented Sep 24, 2020

I managed to train a multi-input single-output model with no issues. Consider this fake data:

x1 = -5:0.8:5
x2 = -5:0.8:5

xx1 = vec([i for i in x1, j in x2])
xx2 = vec([j for i in x1, j in x2])

X = [xx1 xx2]

h(x, y) = 10 * (sin(x^2 + y^2) / (x^2 + y^2)) + 10
Y = [h(i,j) for (i,j) in zip(xx1, xx2)];

I simply used this model:

kernel = KernelFunctions.SqExponentialKernel()
m = GP(X, Y, kernel, opt_noise=false)
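For completeness, training and prediction with this model might look roughly like the following. This is a sketch only: `train!` and `predict_y` are the package's exported training/prediction functions as I understand them, and keyword names may differ between versions.

```julia
using AugmentedGaussianProcesses, KernelFunctions

kernel = KernelFunctions.SqExponentialKernel()
m = GP(X, Y, kernel, opt_noise=false)

# Optimise hyperparameters for a fixed number of iterations
train!(m, 100)

# Predict the outputs at the training inputs
y_pred = predict_y(m, X)
```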

Now imagine that I want to move to a multi-output setting, so that my data is now:

x1 = -5:0.8:5
x2 = -5:0.8:5

xx1 = vec([i for i in x1, j in x2])
xx2 = vec([j for i in x1, j in x2])

X = [xx1 xx2]

h(x, y) = 10 * (sin(x^2 + y^2) / (x^2 + y^2)) + 10
g(x, y) = -1 / (x^2 + y^2 + 1) + 10


y1 = [h(i,j) for (i,j) in zip(xx1, xx2)];
y2 = [g(i,j) for (i,j) in zip(xx1, xx2)];

Y = [y1 y2]

How should I proceed? There aren't any examples for this case in the docs.


david-vicente commented Sep 25, 2020

I also noticed that the signatures of the constructors for MOARGP, MOSVGP, and MOVGP require that the labels argument `y` be an `AbstractVector{<:AbstractArray}`, so we can't pass a `Matrix`.
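As a workaround, a `Matrix` of outputs can be split column-wise into the expected vector-of-vectors with plain Julia:

```julia
y1 = [h(i, j) for (i, j) in zip(xx1, xx2)]
y2 = [g(i, j) for (i, j) in zip(xx1, xx2)]
Y = [y1 y2]                            # n × 2 Matrix

# Split the matrix into a Vector of column Vectors,
# matching the AbstractVector{<:AbstractArray} signature
ys = [Y[:, k] for k in 1:size(Y, 2)]   # == [y1, y2]
```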


theogf commented Sep 25, 2020

Hi, you're right, there's unfortunately no example for multiple outputs.
Given your example you can do:

MOVGP(X, [y1, y2], kernel, GaussianLikelihood(), AnalyticVI(), 2)

If you want an analytic solution, however, I would recommend having a look at JuliaGaussianProcesses/AbstractGPs.jl#30
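Spelled out with the data from the question, the suggested call would look like the sketch below (note that, as the follow-up comment shows, this hit a `BoundsError` in the package version available at the time):

```julia
using AugmentedGaussianProcesses, KernelFunctions

kernel = KernelFunctions.SqExponentialKernel()
likelihood = GaussianLikelihood()

# Two outputs modelled through 2 shared latent GPs (the last argument)
m = MOVGP(X, [y1, y2], kernel, likelihood, AnalyticVI(), 2)

train!(m, 100)
```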

david-vicente commented:
Trying your suggestion gives me:

BoundsError: attempt to access 1-element Array{Int64,1} at index [2]

Stacktrace:
 [1] getindex at ./array.jl:809 [inlined]
 [2] #76 at /home/myuser/.julia/packages/AugmentedGaussianProcesses/93OMX/src/inference/analyticVI.jl:49 [inlined]
 [3] ntuple(::AugmentedGaussianProcesses.var"#76#77"{Float64,Array{Int64,1},Descent}, ::Int64) at ./ntuple.jl:18
 [4] AnalyticVI{Float64,1}(::Float64, ::Bool, ::Array{Int64,1}, ::Array{Int64,1}, ::Array{Int64,1}, ::Int64, ::Descent) at /home/myuser/.julia/packages/AugmentedGaussianProcesses/93OMX/src/inference/analyticVI.jl:49
 [5] tuple_inference(::AnalyticVI{Float64,1}, ::Int64, ::Array{Int64,1}, ::Array{Int64,1}, ::Array{Int64,1}) at /home/myuser/.julia/packages/AugmentedGaussianProcesses/93OMX/src/inference/analyticVI.jl:106
 [6] MOVGP(::Array{Float64,2}, ::Array{Array{Float64,1},1}, ::SqExponentialKernel, ::GaussianLikelihood{Float64,Nothing,Array{Float64,1}}, ::AnalyticVI{Float64,1}, ::Int64; verbose::Int64, optimiser::ADAM, atfrequency::Int64, mean::ZeroMean{Float64}, variance::Float64, Aoptimiser::ADAM, ArrayType::UnionAll) at /home/myuser/.julia/packages/AugmentedGaussianProcesses/93OMX/src/models/MOVGP.jl:147
 [7] MOVGP(::Array{Float64,2}, ::Array{Array{Float64,1},1}, ::SqExponentialKernel, ::GaussianLikelihood{Float64,Nothing,Array{Float64,1}}, ::AnalyticVI{Float64,1}, ::Int64) at /home/myuser/.julia/packages/AugmentedGaussianProcesses/93OMX/src/models/MOVGP.jl:76
 [8] top-level scope at In[9]:4
