Lighthouse.jl

Lighthouse.jl is a Julia package that standardizes and automates performance evaluation for multiclass, multirater classification models. By implementing a minimal interface, your classifier automagically gains a thoroughly instrumented training/testing harness (Lighthouse.learn!) that computes and logs tons of meaningful performance metrics to TensorBoard in real time, including:

  • test set loss
  • inter-rater agreement (e.g. Cohen's Kappa)
  • PR curves
  • ROC curves
  • calibration curves

Lighthouse itself is framework-agnostic; end users should use whichever extension package matches their desired framework (e.g. LighthouseFlux.jl for Flux models: https://github.com/beacon-biosignals/LighthouseFlux.jl).
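
For illustration, here is a minimal sketch of what implementing the interface might look like. The method names used below (Lighthouse.classes, Lighthouse.train!, Lighthouse.loss_and_prediction) and the struct layout are assumptions for illustration only; consult the API documentation and the extension packages for the authoritative interface:

    using Lighthouse

    # Hypothetical classifier wrapper; the fields are entirely up to you.
    struct MyClassifier <: Lighthouse.AbstractClassifier
        model::Any                 # your framework-specific model
        classes::Vector{String}    # ordered class labels
    end

    # Return the ordered list of possible classes.
    Lighthouse.classes(c::MyClassifier) = c.classes

    # Run a training epoch over `batches`, logging metrics via `logger`.
    function Lighthouse.train!(c::MyClassifier, batches, logger)
        # framework-specific training loop goes here
    end

    # Return `(loss, soft_label_predictions)` for a single batch.
    function Lighthouse.loss_and_prediction(c::MyClassifier, input_batch, other_batch_args...)
        # framework-specific forward pass goes here
    end

With methods like these defined, Lighthouse.learn! can drive the training/testing loop and emit the metrics listed above.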

This package follows the YASGuide (https://github.com/jrevels/YASGuide).

Installation

To install Lighthouse for development, run:

julia -e 'using Pkg; Pkg.develop(PackageSpec(url="https://github.com/beacon-biosignals/Lighthouse.jl"))'

This will install Lighthouse to the default package development directory, ~/.julia/dev/Lighthouse.
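If you just want to use Lighthouse rather than develop it, and assuming it is available in a registry you have configured, a plain add works instead:

julia -e 'using Pkg; Pkg.add("Lighthouse")'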

TensorBoard

Note that Lighthouse's LearnLogger logs metrics to a user-specified path in TensorBoard's logdir format. TensorBoard can be installed via python3 -m pip install tensorboard (note: if you have tensorflow>=1.14, you should already have tensorboard). Once TensorBoard is installed, you can view Lighthouse-generated metrics via tensorboard --logdir path where path is the path specified by Lighthouse.LearnLogger. From there, TensorBoard itself can be used/configured however you like; see https://github.com/tensorflow/tensorboard for more information.
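For example, assuming a LearnLogger constructor that takes a log directory and a run name (check the API documentation for the exact signature):

    using Lighthouse

    # Hypothetical constructor arguments: a logdir path and a run name.
    logger = LearnLogger("/tmp/lighthouse_logs", "my_run")

and then, in a shell:

tensorboard --logdir /tmp/lighthouse_logs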

Alternative loggers can be used in place of LearnLogger, as long as they comply with Lighthouse's logging interface.
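
As a rough sketch, an alternative logger could simply collect values in memory. The methods extended below (Lighthouse.log_value! and Lighthouse.log_event!) are assumptions about that interface, so consult the documentation for the full set of required methods:

    using Lighthouse

    # Hypothetical in-memory logger, e.g. for tests.
    struct DictLogger
        logged::Dict{String,Vector{Any}}
    end
    DictLogger() = DictLogger(Dict{String,Vector{Any}}())

    # Record a single `value` under the metric name `field`.
    function Lighthouse.log_value!(logger::DictLogger, field::AbstractString, value)
        push!(get!(Vector{Any}, logger.logged, field), value)
        return value
    end

    # Record a free-form event message.
    Lighthouse.log_event!(logger::DictLogger, message::AbstractString) =
        Lighthouse.log_value!(logger, "events", message)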