Mixture of experts layer for Keras

This repository contains a Keras layer implementing a dense mixture of experts model: a learned gating network produces weights that mix the outputs of several dense expert networks.
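A standard dense mixture-of-experts formulation, which this layer presumably follows (see MixtureOfExperts.py for the exact expression used), is

```math
y \;=\; \sum_{i=1}^{n} g_i(x)\, f_i(x), \qquad
f_i(x) = \sigma_{\text{expert}}\!\left(W_i x + b_i\right), \qquad
g(x) = \sigma_{\text{gate}}\!\left(W_g x + b_g\right),
```

where `n` is the number of experts (`n_experts`), each expert output `f_i(x)` has dimensionality `units`, `σ_expert` is `expert_activation`, and `σ_gate` is `gating_activation`.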

Some of the main arguments are as follows:

  • units: the output dimensionality
  • n_experts: the number of experts
  • expert_activation: activation function for the expert model
  • gating_activation: activation function for the gating model

Please see MixtureOfExperts.py for the full list of arguments. The file moe_demo.py contains an example demonstrating how to use this layer; it essentially reproduces the simulations reported in this blog post.
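For orientation, here is a minimal usage sketch. It assumes the layer class exported by MixtureOfExperts.py is named `MixtureOfExperts`, that it behaves like a standard Keras layer, and that tf.keras is used; the actual class name, import, and defaults may differ, so treat this as illustrative rather than canonical:

```python
# Minimal sketch of using the mixture-of-experts layer in a Keras model.
# Assumption: MixtureOfExperts.py exposes a Keras layer class named
# `MixtureOfExperts`; adjust the import and class name to match the repository.
import numpy as np
from tensorflow import keras

from MixtureOfExperts import MixtureOfExperts  # assumed class name

model = keras.Sequential([
    keras.Input(shape=(32,)),
    MixtureOfExperts(
        units=1,                      # output dimensionality
        n_experts=4,                  # number of experts
        expert_activation="relu",     # activation of each expert model
        gating_activation="softmax",  # activation of the gating model
    ),
])
model.compile(optimizer="adam", loss="mse")

# Toy regression data, just to show the layer training end to end.
x = np.random.randn(256, 32).astype("float32")
y = np.random.randn(256, 1).astype("float32")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
```

moe_demo.py in the repository remains the authoritative usage example.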
