RE: Would you love to see how I trained my small, tiny, powerful neural network?

in #steemstem 6 years ago (edited)

Hm. I tend to disagree with that statement:

The idea about weights started in the 1940s with Hebb's learning law (Donald Olding Hebb, a Canadian neuropsychologist), also known as "cells that fire together wire together", which says that if neuron A causes activity in neuron B, then their connection gets stronger; otherwise, it weakens.

This is somewhat misleading. As far as I remember, Hebb did not write that two neurons need to fire together to increase the efficiency of their connection.
To understand this clearly, it is important to be aware of the concepts of causality and consistency.
He stated that neuron A needs to repeatedly (consistency) take part in firing (causality) neuron B. That is an important distinction. The presynaptic neuron needs to fire repeatedly just before the postsynaptic one in order to have a potentiating effect on the synapse. This mechanism is strongly connected to spike-timing-dependent plasticity (STDP), a neurological process that adjusts the strength of connections between neurons based on the relative timing of their output and input potentials (spikes).
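To make the asymmetry concrete, here is a minimal sketch of a pair-based STDP update in Python (the parameter values are purely illustrative, not taken from any specific model):

```python
import math

# Minimal sketch of a pair-based STDP rule (illustrative parameters only):
# the weight change depends on dt = t_post - t_pre, the time difference
# between a postsynaptic and a presynaptic spike.
A_PLUS = 0.01    # maximum potentiation when pre fires just before post
A_MINUS = 0.012  # maximum depression when pre fires just after post
TAU = 20.0       # width of the plasticity window, in ms

def stdp_delta_w(t_pre, t_post):
    """Weight change for a single pre/post spike pair."""
    dt = t_post - t_pre
    if dt > 0:
        # causal order: pre fired before post -> strengthen the synapse
        return A_PLUS * math.exp(-dt / TAU)
    # anti-causal order: pre fired after (or with) post -> weaken the synapse
    return -A_MINUS * math.exp(dt / TAU)

# Repeated, consistent "pre just before post" pairings drive the weight up;
# the reversed timing would drive it down.
w = 0.5
for _ in range(100):
    w += stdp_delta_w(t_pre=10.0, t_post=15.0)  # pre leads post by 5 ms
print(round(w, 3))
```

The point of the sketch is only the sign of the change: presynaptic spikes that consistently precede postsynaptic ones potentiate the synapse, while the reversed order depresses it.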


Hi @egotheist,

Thank you for your insight. You are right, it is a little bit misleading, but only if it's taken too literally.

Hebb never actually said "cells that fire together wire together", nor did I write that it's stated in his law - it's just part of the "slang" in academic circles because it's easy to remember.

The fact is that those two neurons can't fire at exactly the same time, so the presynaptic neuron has to fire a bit earlier (because of the causality, as you said), and that consistency is a big factor in setting the weights.
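Just to illustrate what I mean by consistency (a toy sketch, not the network from the original post): with the plain rate-based Hebbian rule used in artificial networks, every consistent pairing of presynaptic and postsynaptic activity nudges the weight a bit higher.

```python
# Toy rate-based Hebbian update (a sketch, not the network from the post):
# delta_w = eta * pre * post, applied over repeated, consistent pairings.
eta = 0.1   # learning rate
w = 0.2     # initial connection strength

for _ in range(20):
    pre = 1.0              # presynaptic activity on this trial
    post = w * pre         # postsynaptic activity driven through the connection
    w += eta * pre * post  # Hebb: correlated activity strengthens the weight

print(round(w, 3))  # grows with every consistent pairing
```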

I just wanted to adapt it for the average Steemit reader. Personally, I never thought about it that way, that neurons could literally fire at the same time, so that's probably why I didn't see it as an oversight.

Again, you got the point,

Regards:)

I just wanted to adapt it for the average Steemit reader. Personally, I never thought about it that way, that neurons could literally fire at the same time, so that's probably why I didn't see it as an oversight.

I see. The problem is, this kind of imprecise thinking leads to other mistakes. I have already read this exact example in books. It can't hurt to put it right when writing here.
