EMP is a god, now I understand NN



Wow... thanks for the praise

me be blushin' quite mightily hereabouts.

::emp::
 
For a start: THIS
[ame=http://www.amazon.com/Building-Neural-Networks-David-Skapura/dp/0201539217/ref=sr_1_1?ie=UTF8&qid=1365772686&sr=8-1&keywords=skapura]Amazon.com: Building Neural Networks (9780201539219): David M. Skapura: Books[/ame]
is the hands-down best book on neural networks for beginners.

In simple terms:
An Artificial Neural Network tries to simulate how neurons work.

Each neuron gets inputs from several others.
If all that input sums up high enough, it fires a signal itself.

The fun thing is that the firing between cells goes through synapses - connections between the nodes (neurons).
All the training is done by adjusting weights on those connections.

So the signal could go like this:

Fire 1 --> Synapse weight .5 --> received .5
Fire .7 --> Synapse weight .2 --> received .14
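In code, a single neuron like that could look something like this (a toy sketch - the weights and the 0.5 firing threshold are made-up values for illustration, not anything from the post):

```python
def neuron(inputs, weights, threshold=0.5):
    # each incoming signal is scaled by its synapse weight, then summed
    total = sum(i * w for i, w in zip(inputs, weights))
    # the neuron "fires" (outputs 1) if the weighted sum is high enough
    return 1 if total >= threshold else 0

# the two signals from above: 1 * .5 = .5 and .7 * .2 = .14, so total = .64
print(neuron([1, 0.7], [0.5, 0.2]))  # fires: prints 1
```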

So what the network then does (in training) is:
Use a function to look at how far the output was from the desired output and try to determine which weights have to be adjusted.
Adjust all weights, then try the next training task.
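The post doesn't say which adjustment rule is used; the classic perceptron rule is about the simplest example of "nudge each weight based on how wrong the output was" (multi-layer nets normally use backpropagation instead). A minimal sketch, with made-up numbers:

```python
def train_step(weights, inputs, desired, lr=0.1):
    # forward pass: weighted sum, step activation
    total = sum(i * w for i, w in zip(inputs, weights))
    output = 1 if total >= 0.5 else 0
    # how far off were we from the desired output?
    error = desired - output
    # nudge each weight in proportion to its input and the error
    return [w + lr * error * i for w, i in zip(weights, inputs)]

# output was 0 but we wanted 1, so both weights get bumped up
print(train_step([0.1, 0.1], [1, 1], desired=1))  # -> [0.2, 0.2]
```

Repeat that over many training examples and the weights settle into values that produce the desired outputs.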

So in the end, you'll have a set of weights between the neurons that make sense of any input you throw at it (with a margin of error).
Fun thing is that it's quite resilient.

If I trained a network on scanned images of letters with a bit of noise, the network still fires the right letter even if there is more noise on the image.

But.. if we put all the biological analogies aside... this is a network of weighted connections... and the optimization is a statistical one.

eliquid: so you give a desired output, put in your input and it tries to match. If off, readjust until near match?

Yep... in the training, you give it (quite a few) sets of input where you know the desired output. Normally, you take the data and put aside a percentage (say 10%) for testing.

In the betting example: say I have 10 years of game data. I would take 10% of EACH YEAR and put it to the side, then train the network with the rest.
Once training is complete, I disable the "learning" (the weights are not adjusted anymore) and test it with the 10% I put aside.

And then I can use it for prediction
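That per-year 90/10 holdout could be sketched like this (the function and variable names are made up for illustration; taking the 10% from each year keeps every season represented in the test set):

```python
import random

def split_by_year(games_by_year, test_frac=0.10, seed=42):
    """Hold out test_frac of EACH year's games; train on the rest."""
    rng = random.Random(seed)
    train, test = [], []
    for year, games in games_by_year.items():
        games = games[:]              # copy so we don't shuffle the caller's list
        rng.shuffle(games)
        n_test = max(1, int(len(games) * test_frac))
        test.extend(games[:n_test])   # 10% of this year, set aside
        train.extend(games[n_test:])  # the rest goes into training
    return train, test
```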

Now, there are a lot of tricks and tweaks and stuff, but this is the basics.

::emp::
 
what the fuck am I reading???

I'm not hating on this; in fact I'm hating that I don't understand.

What applications are you using neural networks for?
 
The weighted nodes sound similar to the A* pathfinding algorithm in games. I'm not working on NNs but am trying to swallow the SURF algorithm for tracking objects in a video stream. Lots of crazy shit out there to learn.
 



I have no idea what the fuck is going on either. Either they are trying to break captchas, recreate life/brain activity, or predict the future (maybe create Skynet, since Cardine and Bofu seem to be involved to some degree - I'm deleting my Twitter account)
 
How did you manage to understand HTML, bro? Any book you'd recommend?

I'm going to get this now:

[ame="http://www.amazon.com/Building-Neural-Networks-David-Skapura/dp/0201539217/ref=wl_it_dp_o_pC_S_nC"]Amazon.com: Building Neural Networks (9780201539219): David M. Skapura: Books[/ame]