
Commit

updating recurrent transfer mechanism and transfer mechanism documentation
KristenManning committed Dec 1, 2017
1 parent ad8b98b commit 84b60a8
Showing 4 changed files with 127 additions and 59 deletions.
16 changes: 3 additions & 13 deletions psyneulink/library/mechanisms/processing/transfer/kwta.py
@@ -100,12 +100,12 @@
(including positive offsets). If `inhibition_only <KWTA.inhibition_only>` is `True`, then any positive offset
selected is "clipped" at (i.e re-assigned a value of) 0. This ensures that the values of the elements of the KWTA's
`variable <KWTA.variable>` are never increased.
COMMENT:
.. note::
If the `inhibition_only <KWTA.inhibition_only>` option is set to `True`, the number of elements at or above the
`threshold <KWTA.threshold>` may fall below `k_value <KWTA.k_value>`; and, if the input to the KWTA is sufficiently
low, the value of all elements may decay to 0 (depending on the value of the `decay <KWTA.decay>` parameter).
COMMENT
In all other respects, a KWTA has the same attributes and is specified in the same way as a standard
`RecurrentTransferMechanism`.
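The offset-and-clip rule described above can be sketched numerically. This is a simplified sketch, not PsyNeuLink's actual implementation: the helper name `kwta_offset` is hypothetical, and moving the k-th largest element exactly to the threshold is one choice among the range of valid offsets.

```python
import numpy as np

def kwta_offset(values, k, threshold=0.0, inhibition_only=True):
    # Pick an additive offset that leaves exactly k elements at or above
    # threshold (here: move the k-th largest element exactly to threshold).
    sorted_vals = np.sort(np.asarray(values, dtype=float))[::-1]
    offset = threshold - sorted_vals[k - 1]
    # With inhibition_only, a positive offset is clipped to 0 so the
    # elements of the KWTA's variable are never increased.
    if inhibition_only and offset > 0:
        offset = 0.0
    return offset

vals = [0.5, 0.3, -0.2, -0.4]
print(kwta_offset(vals, k=2))
```

Adding the returned offset to `vals` leaves exactly two elements at or above the threshold of 0.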
@@ -185,7 +185,6 @@ class KWTA(RecurrentTransferMechanism):
auto=None, \
hetero=None, \
initial_value=None, \
decay=1.0, \
noise=0.0, \
time_constant=1.0, \
k_value=0.5, \
@@ -249,9 +248,6 @@ class KWTA(RecurrentTransferMechanism):
Transfer_DEFAULT_BIAS SHOULD RESOLVE TO A VALUE
COMMENT
decay : number : default 1.0
specifies the amount by which to decrement its `previous_input <KWTA.previous_input>` each time it is executed.
noise : float or function : default 0.0
specifies a stochastically-sampled value added to the result of the `function <KWTA.function>`:
if it is a float, it must be in the interval [0,1] and is used to scale the variance of a zero-mean Gaussian;
@@ -326,10 +322,6 @@ class KWTA(RecurrentTransferMechanism):
an `AutoAssociativeProjection` that projects from the Mechanism's `primary OutputState <OutputState_Primary>`
back to its `primary inputState <Mechanism_InputStates>`.
decay : float : default 1.0
determines the amount by which to multiply the `previous_input <KWTA.previous_input>` value
each time it is executed.
COMMENT:
THE FOLLOWING IS THE CURRENT ASSIGNMENT
COMMENT
@@ -447,7 +439,6 @@ def __init__(self,
auto: is_numeric_or_none=None,
hetero: is_numeric_or_none=None,
initial_value=None,
decay: tc.optional(tc.any(int, float)) = 1.0,
noise: is_numeric_or_none = 0.0,
time_constant: is_numeric_or_none = 1.0,
integrator_mode=False,
@@ -495,7 +486,6 @@ def __init__(self,
hetero=hetero,
integrator_mode=integrator_mode,
initial_value=initial_value,
decay=decay,
noise=noise,
time_constant=time_constant,
clip=clip,
@@ -582,7 +572,7 @@ def _kwta_scale(self, current_input, context=None):
return np.atleast_2d(new_input)

def _validate_params(self, request_set, target_set=None, context=None):
"""Validate shape and size of matrix and decay.
"""Validate shape and size of matrix.
"""

super()._validate_params(request_set=request_set, target_set=target_set, context=context)
@@ -22,22 +22,48 @@
A RecurrentTransferMechanism is a subclass of `TransferMechanism` that implements a single-layered recurrent
network, in which each element is connected to every other element (instantiated in a recurrent
`AutoAssociativeProjection` referenced by the Mechanism's `matrix <RecurrentTransferMechanism.matrix>` parameter).
It allows its previous input to be decayed, can report the energy and, if appropriate, the entropy of its output,
and can be configured to implement autoassociative (e.g., Hebbian) learning.
It can report the energy and, if appropriate, the entropy of its output, and can be configured to implement
autoassociative (e.g., Hebbian) learning.
.. _Recurrent_Transfer_Creation:
Creating a RecurrentTransferMechanism
-------------------------------------
A RecurrentTransferMechanism can be created directly by calling its constructor, or using the `mechanism` command and
specifying RECURRENT_TRANSFER_MECHANISM as its **mech_spec** argument. The recurrent projection is automatically
created using the **matrix** (or **auto** and **hetero**) argument of the Mechanism's constructor, and assigned to
its `recurrent_projection <RecurrentTransferMechanism.recurrent_projection>` attribute. If the **matrix** argument is used,
it must specify either a square matrix or an `AutoAssociativeProjection` that uses one (the default is
`FULL_CONNECTIVITY_MATRIX`). Alternatively, **auto** and **hetero** can be specified: these set the diagonal and
off-diagonal terms, respectively. In all other respects, a RecurrentTransferMechanism is specified in the same way as a
standard `TransferMechanism`.
A RecurrentTransferMechanism is created directly by calling its constructor::

    import psyneulink as pnl
    my_linear_recurrent_transfer_mechanism = pnl.RecurrentTransferMechanism(function=pnl.Linear)
    my_logistic_recurrent_transfer_mechanism = pnl.RecurrentTransferMechanism(function=pnl.Logistic(gain=1.0,
                                                                                                   bias=-4.0))
The recurrent projection is automatically created using (1) the **matrix** argument or (2) the **auto** and **hetero**
arguments of the Mechanism's constructor, and is assigned to the mechanism's `recurrent_projection
<RecurrentTransferMechanism.recurrent_projection>` attribute.
If the **matrix** argument is used to create the recurrent projection, it must specify either a square matrix or an
`AutoAssociativeProjection` that uses one (the default is `FULL_CONNECTIVITY_MATRIX`)::

    recurrent_mech_1 = pnl.RecurrentTransferMechanism(default_variable=[[0.0, 0.0, 0.0]],
                                                      matrix=[[1.0, 2.0, 2.0],
                                                              [2.0, 1.0, 2.0],
                                                              [2.0, 2.0, 1.0]])

    recurrent_mech_2 = pnl.RecurrentTransferMechanism(default_variable=[[0.0, 0.0, 0.0]],
                                                      matrix=pnl.AutoAssociativeProjection)
If the **auto** and **hetero** arguments are used to create the recurrent projection, they set the diagonal and
off-diagonal terms, respectively::

    recurrent_mech_3 = pnl.RecurrentTransferMechanism(default_variable=[[0.0, 0.0, 0.0]],
                                                      auto=1.0,
                                                      hetero=2.0)
.. note::
   In the examples above, recurrent_mech_1 and recurrent_mech_3 are identical.
In all other respects, a RecurrentTransferMechanism is specified in the same way as a standard `TransferMechanism`.
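The equivalence between the **matrix** and the **auto**/**hetero** specifications follows from how the two scalars combine into one square matrix; a minimal numpy sketch (the helper name is hypothetical, not part of PsyNeuLink's API):

```python
import numpy as np

def auto_hetero_to_matrix(size, auto, hetero):
    # auto fills the diagonal, hetero fills every off-diagonal entry
    return hetero * np.ones((size, size)) + (auto - hetero) * np.eye(size)

# auto=1.0, hetero=2.0 reproduces the explicit matrix given to recurrent_mech_1
print(auto_hetero_to_matrix(3, 1.0, 2.0))
```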
.. _Recurrent_Transfer_Learning:
@@ -82,41 +108,48 @@
InputState <InputState_Primary>`. This can be parametrized using its `matrix <RecurrentTransferMechanism.matrix>`,
`auto <RecurrentTransferMechanism.auto>`, and `hetero <RecurrentTransferMechanism.hetero>` attributes, and is
stored in its `recurrent_projection <RecurrentTransferMechanism.recurrent_projection>` attribute.
A RecurrentTransferMechanism also has a `decay` <RecurrentTransferMechanism.decay>' attribute, that multiplies its
`previous_input <RecurrentTransferMechanism.previous_input>` value by the specified factor each time it
is executed. It also has two additional `OutputStates <OutputState>: an *ENERGY* OutputState and, if its `function
<RecurrentTransferMechanism.function>` is bounded between 0 and 1 (e.g., a `Logistic` function), an *ENTROPY*
A RecurrentTransferMechanism also has two additional `OutputStates <OutputState>`: an *ENERGY* OutputState and, if its
`function <RecurrentTransferMechanism.function>` is bounded between 0 and 1 (e.g., a `Logistic` function), an *ENTROPY*
OutputState. Each of these reports the respective value of the vector in its *RESULTS* (`primary
<OutputState_Primary>`) OutputState. Finally, if it has been `specified for learning <Recurrent_Transfer_Learning>`,
it is associated with a `AutoAssociativeLearningMechanism` that is used to train its `AutoAssociativeProjection`.
<OutputState_Primary>`) OutputState.
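As a rough illustration, the quantities these two OutputStates report can be sketched as follows. This is a hedged sketch of the underlying formulas: PsyNeuLink's exact sign and normalization conventions may differ.

```python
import numpy as np

def energy(output, matrix):
    # Hopfield-style energy of the output pattern under the recurrent weights
    output = np.asarray(output, dtype=float)
    return -float(output @ np.asarray(matrix, dtype=float) @ output)

def entropy(output):
    # only meaningful when the function's output lies in (0, 1], e.g. a Logistic
    p = np.asarray(output, dtype=float)
    return -float(np.sum(p * np.log(p)))
```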
Finally, if it has been `specified for learning <Recurrent_Transfer_Learning>`, the RecurrentTransferMechanism is
associated with an `AutoAssociativeLearningMechanism` that is used to train its `AutoAssociativeProjection`.
The `learning_enabled <RecurrentTransferMechanism.learning_enabled>` attribute indicates whether learning
is enabled or disabled for the Mechanism. If learning was not configure when the Mechanism was created, then it cannot
is enabled or disabled for the Mechanism. If learning was not configured when the Mechanism was created, then it cannot
be enabled until the Mechanism is `configured for learning <Recurrent_Transfer_Learning>`.
In all other respects the Mechanism is identical to a standard `TransferMechanism`.
.. _Recurrent_Transfer_Execution:
Execution
---------
When a RecurrentTransferMechanism executes, it includes in its input the value of its
`primary OutputState <OutputState_Primary>` (after multiplication by the `matrix` of the recurrent projection) from its
last execution.
When a RecurrentTransferMechanism executes, its variable, as is the case with all mechanisms, is determined by the
projections the mechanism receives. This means that a RecurrentTransferMechanism's variable is determined in part by the
value of its own `primary OutputState <OutputState_Primary>` on the previous execution, and the `matrix` of the
recurrent projection.
COMMENT:
Previous version of sentence above: "When a RecurrentTransferMechanism executes, it includes in its input the value of
its `primary OutputState <OutputState_Primary>` from its last execution."
8/9/17 CW: Changed the sentence above. Rationale: If we're referring to the fact that the recurrent projection
takes the previous output before adding it to the next input, we should specifically mention the matrix transformation
that occurs along the way.
12/1/17 KAM: Changed the above to describe the RecurrentTransferMechanism's variable on this execution in terms of
projections received, which happens to include a recurrent projection from its own primary output state on the previous
execution
COMMENT
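A minimal numeric sketch of the point above, assuming the recurrent contribution is the previous output passed through the projection's matrix and summed with the external input (the variable names here are illustrative, not PsyNeuLink attributes):

```python
import numpy as np

matrix = np.array([[1.0, 2.0, 2.0],
                   [2.0, 1.0, 2.0],
                   [2.0, 2.0, 1.0]])      # recurrent projection's matrix
prev_output = np.array([0.1, 0.2, 0.3])   # primary OutputState on the last execution
external_input = np.array([1.0, 1.0, 1.0])

# the mechanism's variable combines both afferent projections
variable = external_input + prev_output @ matrix
print(variable)
```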
Like a `TransferMechanism`, the function used to update each element can be assigned using its `function
<RecurrentTransferMechanism.function>` parameter. When a RecurrentTransferMechanism is executed, if its `decay
<RecurrentTransferMechanism.decay>` parameter is specified (and is not 1.0), it decays the value of its `previous_input
<RecurrentTransferMechanism.previous_input>` parameter by the specified factor. It then transforms its input
<RecurrentTransferMechanism.function>` parameter. It then transforms its input
(including from the recurrent projection) using the specified function and parameters (see `Transfer_Execution`),
and returns the results in its OutputStates. If it has been `configured for learning <Recurrent_Transfer_Learning>`
and returns the results in its OutputStates.
If it has been `configured for learning <Recurrent_Transfer_Learning>`
and is executed as part of a `System`, then its associated `LearningMechanism` is executed during the `learning phase
<System_Learning>` of the `System's execution <System_Execution>`.
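A minimal sketch of the kind of Hebbian update such a learning mechanism applies to the recurrent matrix; the learning_rate value and the zeroed diagonal are assumptions here, not PsyNeuLink's exact rule.

```python
import numpy as np

def hebbian_update(matrix, activity, learning_rate=0.05):
    # strengthen weights between co-active elements: delta_w[i, j] ~ a[i] * a[j]
    delta = learning_rate * np.outer(activity, activity)
    np.fill_diagonal(delta, 0.0)   # assume self-connections are not trained
    return np.asarray(matrix, dtype=float) + delta
```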
@@ -222,12 +255,12 @@ class RecurrentTransferMechanism(TransferMechanism):
auto=None, \
hetero=None, \
initial_value=None, \
decay=None, \
noise=0.0, \
time_constant=1.0, \
clip=(float:min, float:max), \
learning_rate=None, \
learning_function=Hebbian, \
integrator_mode=False, \
params=None, \
name=None, \
prefs=None)
@@ -287,15 +320,11 @@ class RecurrentTransferMechanism(TransferMechanism):
initial_value : value, list or np.ndarray : default Transfer_DEFAULT_BIAS
specifies the starting value for time-averaged input (only relevant if
`time_constant <RecurrentTransferMechanism.time_constant>` is not 1.0).
`integrator_mode <RecurrentTransferMechanism.integrator_mode>` is True).
COMMENT:
Transfer_DEFAULT_BIAS SHOULD RESOLVE TO A VALUE
COMMENT
decay : number : default 1.0
specifies the amount by which to decrement its `previous_input <RecurrentTransferMechanism.previous_input>`
each time it is executed.
noise : float or function : default 0.0
a stochastically-sampled value added to the result of the `function <RecurrentTransferMechanism.function>`:
if it is a float, it must be in the interval [0,1] and is used to scale the variance of a zero-mean Gaussian;
@@ -305,8 +334,8 @@ class RecurrentTransferMechanism(TransferMechanism):
the time constant for exponential time averaging of input when `integrator_mode
<RecurrentTransferMechanism.integrator_mode>` is set to True::
result = (time_constant * current input) +
(1-time_constant * result on previous time_step)
        result = (time_constant * variable) +
                 ((1 - time_constant) * input to mechanism's function on the previous time step)
clip : Optional[Tuple[float, float]]
specifies the allowable range for the result of `function <RecurrentTransferMechanism.function>`:
@@ -362,10 +391,6 @@ class RecurrentTransferMechanism(TransferMechanism):
an `AutoAssociativeProjection` that projects from the Mechanism's `primary OutputState <OutputState_Primary>`
back to its `primary inputState <Mechanism_InputStates>`.
decay : float : default 1.0
determines the amount by which to multiply the `previous_input <RecurrentTransferMechanism.previous_input>`
value each time it is executed.
COMMENT:
THE FOLLOWING IS THE CURRENT ASSIGNMENT
COMMENT
@@ -488,7 +513,6 @@ def __init__(self,
auto=None,
hetero=None,
initial_value=None,
decay: is_numeric_or_none=None,
noise=0.0,
time_constant: is_numeric_or_none=1.0,
integrator_mode=False,
@@ -521,7 +545,6 @@ def __init__(self,
params = self._assign_args_to_param_dicts(input_states=input_states,
initial_value=initial_value,
matrix=matrix,
decay=decay,
integrator_mode=integrator_mode,
learning_rate=learning_rate,
learning_function=learning_function,
@@ -554,7 +577,7 @@ def __init__(self,
context=context)

def _validate_params(self, request_set, target_set=None, context=None):
"""Validate shape and size of auto, hetero, matrix and decay.
"""Validate shape and size of auto, hetero, matrix.
"""
from psyneulink.library.projections.pathway.autoassociativeprojection import AutoAssociativeProjection

@@ -630,12 +653,12 @@ def _validate_params(self, request_set, target_set=None, context=None):
raise RecurrentTransferError(err_msg)

# Validate DECAY
if DECAY in target_set and target_set[DECAY] is not None:

decay = target_set[DECAY]
if not (0.0 <= decay and decay <= 1.0):
raise RecurrentTransferError("{} argument for {} ({}) must be from 0.0 to 1.0".
format(DECAY, self.name, decay))
# if DECAY in target_set and target_set[DECAY] is not None:
#
# decay = target_set[DECAY]
# if not (0.0 <= decay and decay <= 1.0):
# raise RecurrentTransferError("{} argument for {} ({}) must be from 0.0 to 1.0".
# format(DECAY, self.name, decay))

# FIX: validate learning_function and learning_rate here (use Hebbian as template for learning_rate

1 change: 0 additions & 1 deletion tests/mechanisms/test_kwta.py
@@ -425,7 +425,6 @@ def test_kwta_size_10_k_3_threshold_1(self):
size=10,
k_value=3,
threshold=1,
decay=0.3,
time_scale=TimeScale.TIME_STEP
)
p = Process(pathway=[K], prefs=TestKWTALongTerm.simple_prefs)
56 changes: 56 additions & 0 deletions tests/mechanisms/test_recurrent_transfer_mechanism.py
@@ -10,6 +10,62 @@
from psyneulink.globals.preferences.componentpreferenceset import REPORT_OUTPUT_PREF, VERBOSE_PREF
from psyneulink.globals.utilities import UtilitiesError
from psyneulink.library.mechanisms.processing.transfer.recurrenttransfermechanism import RecurrentTransferError, RecurrentTransferMechanism
from psyneulink.library.projections.pathway.autoassociativeprojection import AutoAssociativeProjection
class TestMatrixSpec:
    def test_recurrent_mech_matrix(self):

        T = TransferMechanism(default_variable=[[0.0, 0.0, 0.0]])
        recurrent_mech = RecurrentTransferMechanism(default_variable=[[0.0, 0.0, 0.0]],
                                                    matrix=[[1.0, 2.0, 3.0],
                                                            [2.0, 1.0, 2.0],
                                                            [3.0, 2.0, 1.0]])
        p = Process(pathway=[T, recurrent_mech])

        s = System(processes=[p])

        results = []
        def record_trial():
            results.append(recurrent_mech.value)
        s.run(inputs=[[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]],
              call_after_trial=record_trial)

    def test_recurrent_mech_auto_associative_projection(self):

        T = TransferMechanism(default_variable=[[0.0, 0.0, 0.0]])
        recurrent_mech = RecurrentTransferMechanism(default_variable=[[0.0, 0.0, 0.0]],
                                                    matrix=AutoAssociativeProjection)
        p = Process(pathway=[T, recurrent_mech])

        s = System(processes=[p])

        results = []
        def record_trial():
            results.append(recurrent_mech.value)
        s.run(inputs=[[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]],
              call_after_trial=record_trial)
        print(results)

    def test_recurrent_mech_auto_auto_hetero(self):

        T = TransferMechanism(default_variable=[[0.0, 0.0, 0.0]])
        recurrent_mech = RecurrentTransferMechanism(default_variable=[[0.0, 0.0, 0.0]],
                                                    auto=3.0,
                                                    hetero=-7.0)

        print(recurrent_mech.recurrent_projection)
        p = Process(pathway=[T, recurrent_mech])

        s = System(processes=[p])

        results = []
        def record_trial():
            results.append(recurrent_mech.value)
        s.run(inputs=[[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]],
              call_after_trial=record_trial)
        print(results)

class TestRecurrentTransferMechanismInputs:
