@ffg macro #131
Conversation
…Added proper check for whether or not options are defined.
Alternative proposal for the options specifier syntax (by @bvdmitri): y = x + 1 where { id = :y }
Nice! The specification of the graph is more relaxed. However, I don't like the proposed specifier syntax.
I am absolutely not a code expert, but here are some thoughts I had while reading this proposal.
I prefer the where { id = :y } notation over the ∥ { id = :y } specifier. The former does not need an explanation of what ∥ means.
Thanks for your comment, @bertdv. I agree that the where notation is easier to understand. As for the arrow assignment - originally I wanted a construction that works the same way as an assignment under the hood.
…Added capture-based rewrites.
This branch now supports the where { ... } options syntax.
Hi @ivan-bocharov, looks cool, I like how this enables the user to pass hyper-parameters to the model constructor. I tried to rewrite the Kalman smoother demo with the @ffg macro. However, the variables that are created by assignment statements do not get meaningful ids.
Hi @ThijsvdLaar, thanks a lot for the feedback. Truth is that it is very hard (I think impossible if you consider all the possible cases) to figure out whether an assignment statement results in a construction of a new variable at parse time. So, if you want to have meaningful ids for variables that are created with assignment statements, you should use the options specifier.

In the past week I have realized, however, that we can let the user return anything they want from the function. The graph will always be returned first, and the rest will be concatenated to it. So, the demo could look something like:

@ffg function stateSmoother(n_samples)
    # Prior statistics
    m_x_0 = placeholder(:m_x_0)
    v_x_0 = placeholder(:v_x_0)

    # State prior
    x_0 ~ GaussianMeanVariance(m_x_0, v_x_0)

    # Transition and observation model
    x = Vector{Variable}(undef, n_samples)
    y = Vector{Variable}(undef, n_samples)
    x_t_min = x_0
    for t = 1:n_samples
        n_t ~ GaussianMeanVariance(0.0, 200.0) # observation noise
        x[t] = x_t_min + 1.0 where { id = :x_*t } # state transition with unit drift
        y[t] = x[t] + n_t # noisy observation

        # Data placeholder
        placeholder(y[t], :y, index=t)

        # Reset state for next step
        x_t_min = x[t]
    end

    return x
end
...
(graph, x) = stateSmoother(10)
messagePassingAlgorithm(x, ...)
I see, nice - simply returning the variables of interest from the model function solves this.

To me, the where { ... } syntax also reads more naturally than the ∥ specifier.
@ffg macro

This PR implements the @ffg macro for defining Forney-style factor graphs. Example:
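The example snippet itself is not shown here; below is a minimal sketch of what an @ffg model definition looks like, based on the demo in the conversation above (the model and placeholder ids are illustrative, not the PR's original example):

```julia
@ffg function model()
    # A variable attached to a Gaussian prior node
    x ~ GaussianMeanVariance(0.0, 1.0)
    # A noisy observation of x
    y ~ GaussianMeanVariance(x, 1.0)
    # Bind y to data supplied at inference time
    placeholder(y, :y)
end

graph = model()
```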
The @ffg macro rewrites the function body, replacing statements that result in a variable construction with ForneyLab-specific code.

Model specification language description
The model specification macro supports the following constructions:
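First, the tilde construction for attaching a variable to a factor node; a sketch mirroring the demo above:

```julia
x ~ GaussianMeanVariance(0.0, 1.0)
```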
This construction behaves the same way as in the current version of ForneyLab.
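Next is the plain assignment construction; a sketch based on @bvdmitri's example from the thread:

```julia
y = x + 1
```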
This expression always creates a new variable. By default it uses autogenerated variable ids (variable_1, variable_2, etc.). If you want to override the variable's id, use the options specifier ∥ (the "parallel to" Unicode symbol) in the following way:
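A sketch, reusing the same assignment (the id value is illustrative):

```julia
y = x + 1 ∥ { id = :y }
```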
This behaviour allows execution of arbitrary Julia code inside the @ffg macro. You can type the options specifier in Julia mode using the LaTeX command \parallel.
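The third construction is the arrow assignment; a sketch (the right-hand side is illustrative):

```julia
x ← y + 1.0
```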
This expression always creates a new variable. By default it uses the extracted variable id (x for the example above). If you want to override the variable's id, use the same options specifier (∥) as in the previous construction:
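A sketch (the id value is illustrative):

```julia
x ← y + 1.0 ∥ { id = :x_new }
```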
You can type the arrow in Julia mode using the LaTeX command \leftarrow.

Inference algorithm and posterior factorization construction
Since defined variables no longer belong to the global scope, the InferenceAlgorithm and PosteriorFactorization constructors now accept variable ids as inputs. For example:
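The original example is not shown; here is a sketch under the assumption that ids are passed as Symbols (the call patterns and id names are hypothetical, and the exact signatures on this branch may differ):

```julia
(graph, x) = stateSmoother(10)

# Hypothetical id-based calls - ids are illustrative
q = PosteriorFactorization(graph, :x_1, :x_2)
algo = InferenceAlgorithm(graph, :x_1)
```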
Demo

The nonlinear Kalman filter demo has been adapted to showcase the application of the new model specification language.
Known caveats