6.1 Episodic Memory#
In the previous chapters, learning was based on gradually adjusting weights in a neural network. However, humans have the ability to learn from a single experience without the need for repetition. One could argue that this “one-shot” learning can be achieved by a high learning rate. However, a high learning rate can lead to catastrophic forgetting, where the network forgets previously learned associations.
Consider, for example, the Rumelhart semantic network from the previous chapter. If we first train the network to associate various birds with flying, and then train the network with a single example of a penguin using a very high learning rate, the network will forget the previous association between birds and flying. In general, we can mitigate this problem by using interleaved training, where we mix examples from the bird category with the penguin example. However, this doesn’t reflect human learning, where we can learn from a single example without forgetting previous associations.
McClelland et al. (1995) proposed two complementary learning systems: a slow learning system that learns gradually from repetition (in the form of weight adjustments in the neocortex) and a fast learning system that learns from single experiences (in the form of episodic memory in the hippocampus). Here, we will explore how such an episodic memory system can be modeled in PsyNeuLink.
Installation and Setup
If the following code fails, you might have to restart the kernel/session and run it again. This is a known issue when installing PsyNeuLink in Google Colab.
%%capture
%pip install psyneulink
import psyneulink as pnl
import numpy as np
# from torch import nn
# import torch
import matplotlib.pyplot as plt
Episodic Memory - Python Implementation#
We start by implementing an episodic memory as a Python dictionary. The memory stores key-value pairs that can be retrieved by querying the memory with a key:
em_dict = {
'morning': 'breakfast',
'afternoon': 'lunch',
'evening': 'dinner'
}
We can retrieve a memory by accessing the dictionary with a key:
food = em_dict['morning']
print(food)
breakfast
However, this implementation is limited in that the key has to be an exact match. For example, we cannot query with “late afternoon” and expect a sensible, interpolated answer:
food = em_dict['late afternoon']
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
Cell In[4], line 1
----> 1 food = em_dict['late afternoon']
KeyError: 'late afternoon'
Also, to make this more general, we can use (one-hot encoded) vectors instead of strings as keys and values. Let’s build our own episodic memory that stores key-value pairs as a list:
morning_key = np.array([1, 0, 0]) # morning
morning_value = np.array([0, 0, 1]) # breakfast
afternoon_key = np.array([0, 1, 0]) # afternoon
afternoon_value = np.array([1, 0, 0]) # lunch
evening_key = np.array([0, 0, 1]) # evening
evening_value = np.array([0, 0, 1]) # dinner
em = np.array([(morning_key, morning_value), (afternoon_key, afternoon_value), (evening_key, evening_value)])
print(em)
[[[1 0 0]
[0 0 1]]
[[0 1 0]
[1 0 0]]
[[0 0 1]
[0 0 1]]]
def plot(memory):
    """
    Plot the episodic memory as a matrix
    """
    def flatten(el):
        x = el[0]
        for i in el[1]:
            x = np.append(x, i)
        return x

    # Concatenate each (key, value) pair into one row, giving a 3x6 matrix
    episodic_memory_matrix = [flatten(el) for el in memory]
    plt.imshow(episodic_memory_matrix, cmap='Reds', aspect='auto')
    labels = [f"Key {i}" for i in range(len(episodic_memory_matrix[0])//2)] + [f"Value {i}" for i in range(len(episodic_memory_matrix[0])//2)]
    plt.xticks(ticks=range(len(episodic_memory_matrix[0])),
               labels=labels, rotation=45)
    plt.yticks(ticks=range(len(episodic_memory_matrix)),
               labels=[f"Memory Entry {i}" for i in range(len(episodic_memory_matrix))])
    # Add a vertical dashed line to separate the keys from the values
    plt.axvline(x=(len(labels)-1)/2, color='black', linestyle='--', linewidth=1)
    plt.grid(visible=True)
    plt.title('Episodic Memory')
    plt.xlabel('Columns')
    plt.ylabel('Rows')
    plt.show()

plot(em)

To retrieve a value, we can first search for the key that is most similar to the query key and return the corresponding value. A good measure for similarity between two vectors is the dot product.
# We use a query key that the model has never seen before (for example, an interpolation between the morning and afternoon keys)
query_key = np.array([.9, .1, 0]) # between morning and afternoon
matches = [np.dot(query_key, key) for key, _ in em]
# our matches are:
print('Matches: ',matches)
# and the index with the highest match is:
max_match_index = matches.index(max(matches))
print('Index where key matches the most:', max_match_index)
# matched key:
key = em[max_match_index][0]
print('Matched key:', key)
# so the value is:
value = em[max_match_index][1]
print('Matched value:', value)
Matches: [0.9, 0.1, 0.0]
Index where key matches the most: 0
Matched key: [1 0 0]
Matched value: [0 0 1]
As expected, the closest match to “between morning and afternoon” is “morning”, and we get the value “breakfast”.
We can go a step further: instead of taking the argmax of the matches, we can weigh the values by their respective match. To do that, we first need to normalize the matches (here we use the softmax function):
import math
import numpy as np
def softmax(x):
    # calculate the normalizing sum
    _sum = 0
    for el in x:
        _sum += math.exp(el)
    res = []
    for el in x:
        res.append(math.exp(el) / _sum)
    return res
query_key = [1., 1., 0.] # half way between morning and afternoon
matches = [np.dot(query_key, key) for key, _ in em]
print('Matches:', matches)
soft_maxed_matches = softmax(matches)
print('Soft maxed Matches:', soft_maxed_matches)
weighted_values = np.sum(np.array([soft_maxed_matches[i] * np.array(value) for i, (_, value) in enumerate(em)]), axis=0)
print('Weighted Values:', weighted_values)
Matches: [1.0, 1.0, 0.0]
Soft maxed Matches: [0.4223187982515182, 0.4223187982515182, 0.15536240349696362]
Weighted Values: [0.4223188 0. 0.5776812]
Here, instead of choosing the single best-matching key in memory, we calculate a weighted sum of the values stored in memory. The weights, in this case, are the (normalized) similarity scores between the query key and each stored key.
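To make this pattern reusable, here is a minimal sketch that bundles the three steps into a single function (the function name retrieve and its gain argument are our own additions for illustration, not part of the tutorial code): compute the dot-product matches, normalize them with a softmax, and return the similarity-weighted sum of the stored values:
def retrieve(query_key, memory, gain=1.0):
    """Similarity-weighted retrieval from a list of (key, value) pairs."""
    # 1. Dot-product similarity between the query and every stored key
    matches = np.array([np.dot(query_key, key) for key, _ in memory])
    # 2. Normalize the similarities with a softmax (gain scales the logits)
    weights = np.exp(gain * matches)
    weights /= weights.sum()
    # 3. Return the similarity-weighted sum of the stored values
    values = np.array([value for _, value in memory])
    return weights @ values

print(retrieve(np.array([1., 1., 0.]), em))  # same weighted values as printed above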
Instead of the softmax implemented above, in practical use cases we use a softmax that masks values below a certain threshold. Why is this necessary? What would happen if we didn’t mask values?
💡 Hint
In our toy example we have only a few memory slots. However, in most scenarios, we want to model a large number of episodic memory slots. In this case, the match_weight vector will have a large number of entries that are zero (or near zero). Think about why that is problematic.
✅ Solution 1
The “flattening” effect of the softmax function depends on the length of the vector. For example, try running the following code:
res = softmax([1] + [0.01] * 10)
res_2 = softmax([1] + [0.01] * 100)
print(res[0])
print(res_2[0])
Try implementing a larger memory with more key-value pairs. How does memory retrieval hold up without masking?
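One way to try this is sketched below (our own illustration: the masked_softmax helper and the randomly generated one-hot memory are assumptions, and the helper is only a simplified stand-in for the thresholded softmax that EMComposition uses). With many near-zero matches, the plain softmax spreads probability mass across all slots, so the weight on the best-matching slot collapses; masking low matches before normalizing keeps retrieval sharp:
def masked_softmax(x, threshold=0.1):
    # Softmax that zeroes out entries below `threshold` before normalizing
    x = np.asarray(x, dtype=float)
    exps = np.where(x >= threshold, np.exp(x), 0.0)
    return exps / exps.sum()

rng = np.random.default_rng(0)
n_slots, dim = 200, 50
keys = np.eye(dim)[rng.integers(0, dim, size=n_slots)]  # random one-hot keys

query = keys[0]                    # query with a key that is in memory
matches = keys @ query             # mostly zeros, a few exact matches

plain = np.array(softmax(matches))
masked = masked_softmax(matches)
print('weight on best-matching slot, unmasked:', plain[matches.argmax()])
print('weight on best-matching slot, masked:  ', masked[matches.argmax()])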
PsyNeuLink - EMComposition#
PsyNeuLink provides built-in support for episodic memory through the EMComposition. Here, we explain the most important parameters of the EMComposition. It can be used to easily build more complex memory structures, and we will use it in the next tutorial in the EGO model.
em = pnl.EMComposition(name='EM',                 # name of the composition
                       memory_capacity=1000,      # maximum number of entries in memory
                       memory_template=[[0, 0], [0, 0, 0, 0], [0, 0, 0]],
                       # template for a single memory entry. Note: here we use 3 fields
                       # (instead of just one key and one value, we can store as many fields as we want)
                       fields={'1':
                                   {pnl.FIELD_WEIGHT: .33,
                                    # weight of the field; determines how much this field influences retrieval
                                    pnl.LEARN_FIELD_WEIGHT: False,  # whether the weight is learned via backpropagation
                                    pnl.TARGET_FIELD: False
                                    # if this is a target field, the error is calculated here and backpropagated
                                    },
                               '2': {pnl.FIELD_WEIGHT: .33,
                                     pnl.LEARN_FIELD_WEIGHT: False,
                                     pnl.TARGET_FIELD: False},
                               '3': {pnl.FIELD_WEIGHT: .33,
                                     pnl.LEARN_FIELD_WEIGHT: False,
                                     pnl.TARGET_FIELD: False},
                               },
                       memory_fill=.001,          # fill empty memory slots with this value
                       normalize_memories=True,   # normalize entries before computing matches
                       softmax_gain=1.,           # gain of the softmax function
                       softmax_threshold=0.1,     # matches below this threshold are masked out of the softmax
                       memory_decay_rate=0,       # memories can be decayed over time
                       )
em.show_graph(output_fmt='jupyter')
The graph produced by show_graph may seem complicated at first, but it follows the same principles as the Python implementation above. Reading it from bottom to top:
The arrows from the 1, 2, and 3 query nodes to the “STORE” node represent that these values are stored in memory.
All of them are also passed through a “MATCH” node, which calculates the similarity between the query and the stored values (just as described above for the keys).
The “MATCH” outputs are then weighted and combined (here they are also passed through a softmax).
Then the result is used to retrieve the memory by multiplying the combined matches with the stored values.
In our implementation, we specified input node 1 as having 2 entries (1x2 vector), input node 2 as having 4 entries (1x4 vector), and input node 3 as having 3 entries (1x3 vector). Yet, in the explanation above, I talked about adding weighted vectors together. How can that be?
💡 Hint
We are not adding the query vectors together but the matched weights. What is the shape of these weights?
✅ Solution 4
The match weights have the shape of the number of memory slots. Their entries don’t represent the query vectors themselves; the i-th entry signifies how similar the memory in slot i is to the query vector.
This is why a weighted sum makes sense: we are literally weighing how similar fields 1, 2, and 3 are and then combining them. This way the retrieval searches for the highest combined (weighted) similarity.
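To make this concrete, here is a plain-NumPy sketch of the bottom-to-top flow described above (our own illustration, not PsyNeuLink code; it uses random memory contents, mirrors the field sizes and weights of the constructor call above, and omits storage and memory normalization). Note that all three match vectors have length memory_capacity, which is why they can be weighted and summed:
# MATCH -> WEIGHT/COMBINE -> SOFTMAX -> RETRIEVE, in plain NumPy
capacity = 4                              # small memory for illustration
field_sizes = [2, 4, 3]                   # sizes of fields 1, 2, and 3
field_weights = [0.33, 0.33, 0.33]

rng = np.random.default_rng(1)
memory_fields = [rng.random((capacity, size)) for size in field_sizes]
queries = [rng.random(size) for size in field_sizes]

# MATCH: one similarity vector of length `capacity` per field
matches = [field @ query for field, query in zip(memory_fields, queries)]

# WEIGHT + COMBINE: field-weighted sum of the match vectors
combined = sum(w * m for w, m in zip(field_weights, matches))

# SOFTMAX over the memory slots
slot_weights = np.exp(combined)
slot_weights /= slot_weights.sum()

# RETRIEVE: weighted sum over the stored entries, separately for each field
retrieved = [slot_weights @ field for field in memory_fields]
print([r.shape for r in retrieved])       # [(2,), (4,), (3,)]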
Marking a Field as Value (non-query)#
If we don’t want specific fields to be taken into account during retrieval (for example, if they are the “target” fields that the model is supposed to predict), we can set their field weight to None:
em.memory
em = pnl.EMComposition(name='EM with Target',     # name of the composition
                       memory_capacity=1000,      # maximum number of entries in memory
                       memory_template=[[0, 0], [0, 0, 0, 0], [0, 0, 0]],
                       # template for a single memory entry (again three fields)
                       fields={'1':
                                   {pnl.FIELD_WEIGHT: .5,
                                    # weight of the field; determines how much this field influences retrieval
                                    pnl.LEARN_FIELD_WEIGHT: False,  # whether the weight is learned via backpropagation
                                    pnl.TARGET_FIELD: False
                                    # if this is a target field, the error is calculated here and backpropagated
                                    },
                               '2': {pnl.FIELD_WEIGHT: .5,
                                     pnl.LEARN_FIELD_WEIGHT: False,
                                     pnl.TARGET_FIELD: False},
                               '3': {pnl.FIELD_WEIGHT: None,   # excluded from the match/retrieval weighting
                                     pnl.LEARN_FIELD_WEIGHT: False,
                                     pnl.TARGET_FIELD: True},  # errors are computed (and backpropagated) at this field
                               },
                       memory_fill=.001,          # fill empty memory slots with this value
                       normalize_memories=True,   # normalize entries before computing matches
                       softmax_gain=1.,           # gain of the softmax function
                       softmax_threshold=0.1,     # matches below this threshold are masked out of the softmax
                       memory_decay_rate=0,       # memories can be decayed over time
                       )
em.show_graph(output_fmt='jupyter')
As you can see, this way field 3 is stored (and retrieved) but is not taken into account when calculating the match similarity (it is not “used” to query memory).
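Continuing the NumPy sketch from above (again our own illustration, not PsyNeuLink internals), excluding a field from retrieval simply means dropping its match vector from the combine step while still reading its contents out at the end:
# Field 3 no longer contributes to the match, but its contents are still retrieved
field_weights = [0.5, 0.5, None]

combined = sum(w * m for w, m in zip(field_weights, matches) if w is not None)
slot_weights = np.exp(combined)
slot_weights /= slot_weights.sum()

retrieved = [slot_weights @ field for field in memory_fields]  # field 3 is still read out
print([r.shape for r in retrieved])                            # [(2,), (4,), (3,)]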