From 564479a094e7feb3e1e91072db943d477f0fee48 Mon Sep 17 00:00:00 2001
From: Jim Martens
Date: Thu, 24 May 2018 11:48:26 +0200
Subject: [PATCH] [NN] Added info about Hebbian learning to localized learning
 section

Signed-off-by: Jim Martens
---
 neural-networks/seminarpaper.tex | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/neural-networks/seminarpaper.tex b/neural-networks/seminarpaper.tex
index d150627..f91946f 100644
--- a/neural-networks/seminarpaper.tex
+++ b/neural-networks/seminarpaper.tex
@@ -455,6 +455,14 @@
 is decreasing with further distance from the source. The sources are the second
 environmental feedback loop in this example as they tell the network or a part
 of it when to learn.
+How does the actual learning happen? The weight change between two neurons
+depends on the activations of both neurons, the learning rate, and the
+concentration of neuromodulators. In short, Hebbian learning is employed:
+
+\[
+  \Delta w_{ij} = \eta \cdot m_i \cdot a_i \cdot a_j
+\]
+
 This explanation should suffice for the general understanding of their method.
 The neurons within the vicinity of these sources only update their weights in
 one of the seasons. Therefore they only learn for one season and are unaffected
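
The modulated Hebbian rule added by the patch can be sketched in a few lines of NumPy. This is an illustrative reading of the formula only, not code from the paper; the function name, the matrix shapes, and the learning-rate value are all assumptions.

```python
import numpy as np

def hebbian_update(w, a, m, eta=0.01):
    """One modulated Hebbian step: delta_w[i, j] = eta * m[i] * a[i] * a[j].

    w   -- (n, n) weight matrix (w[i, j] connects neuron j to neuron i)
    a   -- (n,) neuron activations
    m   -- (n,) local neuromodulator concentrations
    eta -- learning rate (illustrative value)
    """
    # Outer product: rows scaled by m_i * a_i, columns by a_j.
    delta = eta * (m * a)[:, None] * a[None, :]
    return w + delta

# Only neurons with a nonzero local modulator concentration m_i
# change their incoming weights, matching the "sources tell the
# network when to learn" behaviour described in the paper.
w = hebbian_update(np.zeros((2, 2)),
                   a=np.array([1.0, 2.0]),
                   m=np.array([1.0, 0.0]),
                   eta=0.1)
```

Note how the modulator vector `m` gates the update row-wise: rows with `m[i] == 0` stay unchanged, so learning stays localized around the neuromodulator sources.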