computations: Support ELU and SELU activations
Part of #12 (SNNs)
mdraw committed Jul 15, 2017
1 parent cd93ad8 commit e27276a
Showing 1 changed file with 14 additions and 2 deletions.
16 changes: 14 additions & 2 deletions elektronn2/neuromancer/computations.py
@@ -79,11 +79,23 @@ def apply_activation(x, activation_func, b1=None):
        func = T.tanh
    elif activation_func=='relu': # rectified linear unit, range = [0, inf]
        func = T.nnet.relu

    elif activation_func=='prelu': # parameterised relu
        # T.nnet.relu also implements prelu (with extra "alpha" parameter)
        func = T.nnet.relu

    elif activation_func=='elu': # exponential linear unit
        func = T.nnet.elu
    elif activation_func=='selu': # scaled exponential linear unit (for SNNs)
        def selu(y):
            """
            SELU for Self-Normalizing Networks (https://arxiv.org/abs/1706.02515)
            T.nnet.selu() is only available in Theano 0.10+, so we implement
            it ourselves.
            """
            alpha = 1.6732632423543772848170429916717
            scale = 1.0507009873554804934193349852946
            return scale * T.nnet.elu(y, alpha)
        func = selu
    elif activation_func=='abs': # abs unit, range = [0, inf]
        func = T.abs_
    elif activation_func in ['sig', 'logistic', 'sigmoid']: # range = [0,1]
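For reference, the new 'selu' branch computes scale * T.nnet.elu(y, alpha), i.e. scale * y for y > 0 and scale * alpha * (exp(y) - 1) otherwise, using the fixed-point constants from the paper linked in the docstring. Below is a minimal NumPy sketch of that formula for checking values outside of Theano; the name selu_ref and the test inputs are illustrative and not part of the commit.

import numpy as np

def selu_ref(y):
    # Piecewise form of scale * elu(y, alpha), matching the Theano expression above.
    alpha = 1.6732632423543772848170429916717
    scale = 1.0507009873554804934193349852946
    return np.where(y > 0, scale * y, scale * alpha * (np.exp(y) - 1))

print(selu_ref(np.array([-2.0, 0.0, 2.0])))
# -> approximately [-1.5202  0.  2.1014]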
