dependency updates (#95)
* dependency updates

* reverted changes in notebook

* cleared notebook to enable build process for docu

* clean up dead references in docu

* updated readme

* removed dor

* adjustments because of dependency changes

* check for ReverseDiffAdjoint()

* added testing benchmark for different sensitivities

* Quadrature Adjoint test

* removed dead line

* cleaned up tests

* more decimals for gradient printing

* added different gradient tests

* fixed runtest

* test fix

* test new sensitivity
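The gradient-testing bullets above ("added different gradient tests", "more decimals for gradient printing") describe cross-checking sensitivities. A dependency-free sketch of that kind of check, comparing a hand-derived gradient against a central finite difference (all names here are illustrative, not FMIFlux test code):

```julia
# Illustrative gradient cross-check in base Julia.
f(x) = sum(x .^ 2) + 3.0 * x[1] * x[2]

# hand-derived gradient of f
grad_f(x) = [2.0 * x[1] + 3.0 * x[2], 2.0 * x[2] + 3.0 * x[1]]

# central finite difference, one partial derivative per component
function fd_gradient(f, x; h=1e-6)
    g = similar(x)
    for i in eachindex(x)
        xp = copy(x); xp[i] += h
        xm = copy(x); xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2h)
    end
    return g
end

x = [1.0, 2.0]
@assert isapprox(grad_f(x), fd_gradient(f, x); atol=1e-6)
```

FMIFlux's actual tests compare gradients across AD backends (ForwardDiff, ReverseDiff, different SciMLSensitivity adjoints) in the same spirit.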
ThummeTo committed Aug 8, 2023
1 parent e530e69 commit 52a0294
Showing 19 changed files with 301 additions and 1,994 deletions.
8 changes: 3 additions & 5 deletions Project.toml
```diff
@@ -1,9 +1,8 @@
 name = "FMIFlux"
 uuid = "fabad875-0d53-4e47-9446-963b74cae21f"
-version = "0.10.5"
+version = "0.10.6"
 
 [deps]
-ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
 Colors = "5ae59095-9a9b-59fe-a467-6f913c188581"
 DiffEqCallbacks = "459566f4-90b8-5000-8ac3-15dfb0a30def"
 DifferentiableEigen = "73a20539-4e65-4dcb-a56d-dc20f210a01b"
@@ -18,13 +17,12 @@ Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
 ThreadPools = "b189fb0b-2eb5-4ed4-bc0c-d34c51242431"
 
 [compat]
-ChainRulesCore = "1.16.0"
 Colors = "0.12.8"
 DiffEqCallbacks = "2.26.0"
 DifferentiableEigen = "0.2.0"
 DifferentialEquations = "7.8.0"
-FMIImport = "0.15.7"
-Flux = "0.13.17"
+FMIImport = "0.15.8"
+Flux = "0.13, 0.14"
 Optim = "1.7.0"
 ProgressMeter = "1.7.0"
 Requires = "1.3.0"
```
26 changes: 17 additions & 9 deletions README.md
````diff
@@ -2,7 +2,12 @@
 # FMIFlux.jl
 
 ## What is FMIFlux.jl?
-[*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl) is a free-to-use software library for the Julia programming language, which offers the ability to set up NeuralFMUs just like NeuralODEs: You can place FMUs ([fmi-standard.org](http://fmi-standard.org/)) simply inside any feed-forward ANN topology and still keep the resulting hybrid model trainable with a standard (or custom) FluxML training process.
+[*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl) is a free-to-use software library for the Julia programming language, which offers the ability to place your FMU ([fmi-standard.org](http://fmi-standard.org/)) anywhere inside your ML topologies and still keep the resulting models trainable with a standard (or custom) FluxML training process. This includes, for example:
+- NeuralODEs including FMUs, so-called *Neural Functional Mock-up Units* (NeuralFMUs):
+  You can place FMUs inside of your ML topology.
+- PINNs including FMUs, so-called *Functional Mock-up Unit informed Neural Networks* (FMUINNs):
+  You can evaluate FMUs inside of your loss function.
+
 
 [![Dev Docs](https://img.shields.io/badge/docs-dev-blue.svg)](https://ThummeTo.github.io/FMIFlux.jl/dev)
 [![Test (latest)](https://github.com/ThummeTo/FMIFlux.jl/actions/workflows/TestLatest.yml/badge.svg)](https://github.com/ThummeTo/FMIFlux.jl/actions/workflows/TestLatest.yml)
@@ -12,40 +17,43 @@
 [![Run PkgEval](https://github.com/ThummeTo/FMIFlux.jl/actions/workflows/Eval.yml/badge.svg)](https://github.com/ThummeTo/FMIFlux.jl/actions/workflows/Eval.yml)
 [![Coverage](https://codecov.io/gh/ThummeTo/FMIFlux.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/ThummeTo/FMIFlux.jl)
 [![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor's%20Guide-blueviolet)](https://github.com/SciML/ColPrac)
 
+[![FMIFlux Downloads](https://shields.io/endpoint?url=https://pkgs.genieframework.com/api/v1/badge/FMIFlux)](https://pkgs.genieframework.com?packages=FMIFlux)
 
 ## How can I use FMIFlux.jl?
 
 1\. Open a Julia-REPL, switch to package mode using `]`, activate your preferred environment.
 
 2\. Install [*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl):
 ```julia-repl
-(@v1.x) pkg> add FMIFlux
+(@v1) pkg> add FMIFlux
 ```
 
 3\. If you want to check that everything works correctly, you can run the tests bundled with [*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl):
 ```julia-repl
-(@v1.x) pkg> test FMIFlux
+(@v1) pkg> test FMIFlux
 ```
 
 4\. Have a look inside the [examples folder](https://github.com/ThummeTo/FMIFlux.jl/tree/examples/examples) in the examples branch or the [examples section](https://thummeto.github.io/FMIFlux.jl/dev/examples/overview/) of the documentation. All examples are available as Julia-Script (*.jl*), Jupyter-Notebook (*.ipynb*) and Markdown (*.md*).
 
 ## What is currently supported in FMIFlux.jl?
-- building and training ME-NeuralFMUs (event-handling is supported) with the default Flux-Front-End
-- building and training CS-NeuralFMUs with the default Flux-Front-End
-- different AD-frameworks: ForwardDiff.jl, ReverseDiff.jl (default setting) and Zygote.jl
+- building and training ME-NeuralFMUs (NeuralODEs) with support for event-handling (*DiffEqCallbacks.jl*) and discontinuous sensitivity analysis (*SciMLSensitivity.jl*)
+- building and training CS-NeuralFMUs
+- building and training NeuralFMUs consisting of multiple FMUs
+- building and training FMUINNs (PINNs)
+- different AD-frameworks: ForwardDiff.jl (CI-tested), ReverseDiff.jl (CI-tested, default setting), FiniteDiff.jl (not CI-tested) and Zygote.jl (not CI-tested)
 - ...
 
 ## What is under development in FMIFlux.jl?
 - performance optimizations
 - improved documentation
 - more examples
 - FMI3 integration
 - ...
 
 ## What Platforms are supported?
-[*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl) is tested (and testing) under Julia versions *1.6* (LTS) and *1.8* (latest) on Windows (latest) and Ubuntu (latest). MacOS should work, but untested.
+[*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl) is tested (and testing) under Julia versions *v1.6* (LTS) and *v1* (latest) on Windows (latest) and Ubuntu (latest). macOS should work, but is untested.
 [*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl) currently only works with FMI2-FMUs.
-All shipped examples are automatically tested under Julia version *1.8* (latest) on Windows (latest).
+All shipped examples are automatically tested under Julia version *v1* (latest) on Windows (latest).
 
 ## What FMI.jl-Library should I use?
 ![FMI.jl Family](https://github.com/ThummeTo/FMI.jl/blob/main/docs/src/assets/FMI_JL_family.png?raw=true "FMI.jl Family")
````
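The NeuralFMU idea from the README changes can be illustrated without any packages: a first-principles model (standing in for the FMU) plus a trainable correction term, fitted by plain gradient descent. This is a dependency-free analogy under stated assumptions, not the FMIFlux API; `physical`, `truth`, `w`, and `b` are all hypothetical names.

```julia
# "FMU": a slightly wrong first-principles model of the real system.
physical(x) = 0.9 * x
# The unknown real system we want the hybrid model to match.
truth(x) = 1.0 * x + 0.5

# Trainable correction: y = physical(x) + w*x + b, starting from zero.
w, b = 0.0, 0.0
xs = collect(0.0:0.5:5.0)
ys = truth.(xs)

for _ in 1:2000                   # simple gradient descent on the mean squared error
    global w, b
    pred = physical.(xs) .+ w .* xs .+ b
    err  = pred .- ys
    w -= 0.01 * 2 * sum(err .* xs) / length(xs)
    b -= 0.01 * 2 * sum(err) / length(xs)
end

loss = sum((physical.(xs) .+ w .* xs .+ b .- ys) .^ 2)
@assert loss < 1e-3               # hybrid model now matches the real system closely
```

FMIFlux does the same thing with a real FMU in place of `physical` and Flux layers in place of the linear correction, with sensitivities propagated through the FMU evaluation.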
3 changes: 2 additions & 1 deletion docs/make.jl
```diff
@@ -19,8 +19,9 @@ makedocs(sitename="FMIFlux.jl",
             "Simple CS-NeuralFMU" => "examples/simple_hybrid_CS.md"
             "Simple ME-NeuralFMU" => "examples/simple_hybrid_ME.md"
             "Advanced ME-NeuralFMU" => "examples/advanced_hybrid_ME.md"
+            "JuliaCon 2023" => "examples/julicon_2023.md"
+            "MDPI 2022" => "examples/mdpi_2022.md"
             "Modelica Conference 2021" => "examples/modelica_conference_2021.md"
-            "Physics-enhanced NeuralODEs in real-world applications" => "examples/mdpi_2022.md"
         ]
         "FAQ" => "faq.md"
         "Library Functions" => "library.md"
```
3 changes: 3 additions & 0 deletions docs/src/examples/overview.md
```diff
@@ -16,3 +16,6 @@ The examples show how to combine FMUs with machine learning ("NeuralFMU") and il
 - [__JuliaCon 2023: Using NeuralODEs in real life applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/juliacon_2023/): An example for a NeuralODE in a real world engineering scenario.
 - [__MDPI 2022: Physics-enhanced NeuralODEs in real-world applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/mdpi_2022/): An example for a NeuralODE in a real world modeling scenario (Contribution in *MDPI Electronics 2022*).
 - [__Modelica Conference 2021: NeuralFMUs__](https://thummeto.github.io/FMIFlux.jl/dev/examples/modelica_conference_2021/): Showing basics on how to train a NeuralFMU (Contribution for the *Modelica Conference 2021*).
+
+## Archived
+- ...
```
1,918 changes: 87 additions & 1,831 deletions examples/src/juliacon_2023.ipynb

Large diffs are not rendered by default.

24 changes: 12 additions & 12 deletions src/batch.jl
```diff
@@ -4,34 +4,34 @@
 #
 
 import FMIImport: fmi2Real, fmi2FMUstate, fmi2EventInfo, fmi2ComponentState
-import ChainRulesCore: ignore_derivatives
+import FMIImport.ChainRulesCore: ignore_derivatives
 using DiffEqCallbacks: FunctionCallingCallback
 using FMIImport.ForwardDiff
 import FMIImport: unsense
 
-struct FMU2Loss{T}
+struct FMULoss{T}
     loss::T
     step::Integer
     time::Real
 
-    function FMU2Loss{T}(loss::T, step::Integer=0, time::Real=time()) where {T}
+    function FMULoss{T}(loss::T, step::Integer=0, time::Real=time()) where {T}
         inst = new{T}(loss, step, time)
         return inst
     end
 
-    function FMU2Loss(loss, step::Integer=0, time::Real=time())
+    function FMULoss(loss, step::Integer=0, time::Real=time())
         loss = unsense(loss)
         T = typeof(loss)
         inst = new{T}(loss, step, time)
         return inst
     end
 end
 
-function nominalLoss(l::FMU2Loss{T}) where T <: AbstractArray
+function nominalLoss(l::FMULoss{T}) where T <: AbstractArray
     return unsense(sum(l.loss))
 end
 
-function nominalLoss(l::FMU2Loss{T}) where T <: Real
+function nominalLoss(l::FMULoss{T}) where T <: Real
     return unsense(l.loss)
 end
 
@@ -45,7 +45,7 @@ mutable struct FMU2SolutionBatchElement <: FMU2BatchElement
     initialState::Union{fmi2FMUstate, Nothing}
     initialComponentState::fmi2ComponentState
    initialEventInfo::Union{fmi2EventInfo, Nothing}
-    losses::Array{<:FMU2Loss}
+    losses::Array{<:FMULoss}
     step::Integer
 
     saveat::Union{AbstractVector{<:Real}, Nothing}
@@ -65,7 +65,7 @@ mutable struct FMU2SolutionBatchElement <: FMU2BatchElement
 
         inst.initialState = nothing
         inst.initialEventInfo = nothing
-        inst.losses = Array{FMU2Loss,1}()
+        inst.losses = Array{FMULoss,1}()
         inst.step = 0
 
         inst.saveat = nothing
@@ -83,7 +83,7 @@ mutable struct FMU2EvaluationBatchElement <: FMU2BatchElement
     tStart::fmi2Real
     tStop::fmi2Real
 
-    losses::Array{<:FMU2Loss}
+    losses::Array{<:FMULoss}
     step::Integer
 
     saveat::Union{AbstractVector{<:Real}, Nothing}
@@ -102,7 +102,7 @@ mutable struct FMU2EvaluationBatchElement <: FMU2BatchElement
         inst.tStart = -Inf
         inst.tStop = Inf
 
-        inst.losses = Array{FMU2Loss,1}()
+        inst.losses = Array{FMULoss,1}()
         inst.step = 0
 
         inst.saveat = nothing
@@ -335,7 +335,7 @@ function loss!(batchElement::FMU2SolutionBatchElement, lossFct; logLoss::Bool=tr
 
     ignore_derivatives() do
         if logLoss
-            push!(batchElement.losses, FMU2Loss(loss, batchElement.step))
+            push!(batchElement.losses, FMULoss(loss, batchElement.step))
         end
     end
 
@@ -370,7 +370,7 @@ function loss!(batchElement::FMU2EvaluationBatchElement, lossFct; logLoss::Bool=
 
     ignore_derivatives() do
         if logLoss
-            push!(batchElement.losses, FMU2Loss(loss, batchElement.step))
+            push!(batchElement.losses, FMULoss(loss, batchElement.step))
         end
     end
 
```
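The `FMU2Loss` → `FMULoss` rename above uses a common Julia pattern: a parametric container with one inner constructor that takes an explicit type parameter and one that infers it, plus methods dispatching on that parameter (as `nominalLoss` does). A standalone sketch of the same pattern, with hypothetical names `Loss` and `total`:

```julia
# Parametric container mirroring the FMULoss layout (without the FMI types).
struct Loss{T}
    loss::T
    step::Int

    # explicit type parameter, like FMULoss{T}(...)
    Loss{T}(loss::T, step::Int=0) where {T} = new{T}(loss, step)
    # inferring constructor, like FMULoss(...)
    Loss(loss, step::Int=0) = new{typeof(loss)}(loss, step)
end

# dispatch on the type parameter, mirroring nominalLoss
total(l::Loss{T}) where {T<:AbstractArray} = sum(l.loss)
total(l::Loss{T}) where {T<:Real} = l.loss

@assert total(Loss([1.0, 2.0])) == 3.0   # array losses are summed
@assert total(Loss(4.5)) == 4.5          # scalar losses pass through
```

The real `FMULoss` additionally calls `unsense` (from FMIImport) to strip AD tracking before storing the value.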
2 changes: 1 addition & 1 deletion src/flux_overload.jl
```diff
@@ -4,7 +4,7 @@
 #
 
 import Flux
-import ChainRulesCore
+import FMIImport.ChainRulesCore
 import Flux.Random: AbstractRNG
 import Flux.LinearAlgebra: I
```
2 comments on commit 52a0294

@ThummeTo
Owner Author


@JuliaRegistrator


Registration pull request created: JuliaRegistries/General/89275

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the GitHub interface, or via:

```shell
git tag -a v0.10.6 -m "<description of version>" 52a0294279bd6e85b13d921e18c045d51bed928e
git push origin v0.10.6
```
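For the automatic route mentioned above, the TagBot GitHub Action is installed as a workflow file. At the time of writing, the JuliaRegistries/TagBot README suggests a configuration along these lines (saved as `.github/workflows/TagBot.yml`; check the TagBot repository for the current recommended version):

```yaml
name: TagBot
on:
  issue_comment:
    types:
      - created
  workflow_dispatch:
jobs:
  TagBot:
    if: github.event_name == 'workflow_dispatch' || github.actor == 'JuliaTagBot'
    runs-on: ubuntu-latest
    steps:
      - uses: JuliaRegistries/TagBot@v1
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
```

With this in place, TagBot creates the `v0.10.6` tag and a GitHub release automatically once the registry pull request is merged.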
