You are viewing a single comment's thread from:

RE: Would you love to see how I trained my small, tiny, powerful neural network?

in #steemstem (6 years ago)

Hi @egotheist,

Thank you for your insight, you are right - it is a little bit misleading, but only if it's taken too literally.

Hebb never actually said "cells that fire together wire together", nor did I write that it's stated in his law - it's just part of the "slang" in academic circles because it's easy to remember.

The fact is that those two neurons can't fire at exactly the same time, so the presynaptic neuron has to fire a bit earlier (because of causality, like you said), and that consistent timing is a big factor in how the weights get set.
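
Just to make the timing point concrete, here is a minimal sketch of a pair-based, timing-dependent weight update - not the exact rule from my post, and the function name and constants (a_plus, a_minus, tau) are made-up illustrative values. It only shows the idea that "pre fires a bit before post" consistently pushes the weight up, while the reverse order pushes it down.

```python
import numpy as np

def stdp_weight_change(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Toy pair-based timing rule (illustrative values only).

    t_pre, t_post: spike times in ms.
    If the presynaptic spike comes first (causal order), the weight grows;
    if it comes after the postsynaptic spike, the weight shrinks.
    """
    dt = t_post - t_pre
    if dt > 0:        # pre fired before post -> strengthen (causal pairing)
        return a_plus * np.exp(-dt / tau)
    elif dt < 0:      # pre fired after post -> weaken (anti-causal pairing)
        return -a_minus * np.exp(dt / tau)
    return 0.0        # exactly simultaneous spikes: no change in this toy rule

# Consistent "pre just before post" pairings push the weight up over time.
w = 0.5
for t_pre, t_post in [(10.0, 12.0), (30.0, 33.0), (50.0, 51.0)]:
    w += stdp_weight_change(t_pre, t_post)
print(round(w, 4))
```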

I just wanted to adapt it for the average Steemit reader. Personally, I never thought about it that way - that neurons could fire literally at the same time - so that's probably why I didn't see it as an oversight.

Again, you got the point,

Regards :)


I just wanted to adapt it for the average Steemit reader. Personally, I never thought about it that way - that neurons could fire literally at the same time - so that's probably why I didn't see it as an oversight.

I see. The problem is that this kind of imprecise thinking leads to other mistakes. I have already come across this exact example in books. It can't hurt to put it right when writing here.
