For a start: THIS
Amazon.com: Building Neural Networks (9780201539219): David M. Skapura: Books
http://www.amazon.com/Building-Neural-Networks-David-Skapura/dp/0201539217/ref=sr_1_1?ie=UTF8&qid=1365772686&sr=8-1&keywords=skapura
is the hands-down best book on neural networks for beginners.
In simple terms:
An Artificial Neural Network tries to simulate how neurons work.
Each neuron gets inputs from several others.
If all that input sums up high enough, it fires a signal itself.
The fun thing is that the firing between cells goes through synapses - connections between the nodes (neurons).
All the training is done by adjusting the weights on those connections.
So the signal could go like this:
Fire 1 --> Synapse weight .5 --> received .5
Fire .7 --> Synapse weight .2 --> received .14
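That signal flow can be sketched in a few lines of Python. This is a minimal sketch of one artificial neuron, assuming a simple hard threshold as the activation (real networks usually use smoother functions like a sigmoid); all names and values here are illustrative:

```python
# One artificial neuron: weighted sum of inputs, then a threshold.
# The threshold value 0.5 is an arbitrary illustration.

def neuron(inputs, weights, threshold=0.5):
    # Each incoming signal is scaled by its synapse weight, then summed.
    total = sum(x * w for x, w in zip(inputs, weights))
    # Fire (output 1) only if the combined input is high enough.
    return 1 if total >= threshold else 0

# The two signals above: 1 * 0.5 = 0.5 and 0.7 * 0.2 = 0.14,
# so the receiving neuron sees 0.64 in total and fires.
print(neuron([1.0, 0.7], [0.5, 0.2]))  # 1
```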
So what the network then does (in training) is:
Use a function to look at how far the output was from the desired output and try to determine which weights have to be adjusted.
Adjust all weights, then try the next training task.
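Those two steps can be sketched with the delta rule, a simple error-driven weight update (real multi-layer training uses backpropagation, which is more involved); the learning rate here is an arbitrary choice:

```python
# One training step: compare output to the desired output, then nudge
# every weight in the direction that shrinks the error.

def train_step(inputs, weights, desired, learning_rate=0.1):
    output = sum(x * w for x, w in zip(inputs, weights))
    error = desired - output  # how far off were we?
    # Adjust each weight proportionally to the input that flowed through it.
    return [w + learning_rate * error * x for x, w in zip(inputs, weights)]

# Repeatedly train on one example until the output is close to 1.0.
weights = [0.5, 0.2]
for _ in range(50):
    weights = train_step([1.0, 0.7], weights, desired=1.0)
```

After enough passes, the weighted sum for that input lands very close to the desired output of 1.0.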
So in the end, you'll have a set of weights between the neurons that makes sense of any input you throw at it (with a margin of error).
Fun thing is that it is quite resilient.
If I train a network on scanned images of letters with a bit of noise, the network still fires the right letter even if there is more noise on the image.
But.. if we put all the biological analogies aside... this is a network of weighted connections... and the optimization is a statistical one.
eliquid: so you give a desired output, put in your input and it tried to match. If off, readjust until near match?
Yep.. in training, you give it (quite a few) sets of input where you know the desired output. Normally, you take your data and put aside a percentage (10%) for testing.
In the betting example: say I have 10 years of game data. I would take 10% of EACH YEAR to the side and train the network with the rest.
Then I test it with the 10% I put aside. Once training is complete, I disable the "learning" (the weights are not adjusted anymore).
And then I can use it for prediction.
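The hold-out split described above could look roughly like this. The data layout (a dict mapping year to a list of games) and the function name are my assumptions for illustration, not anyone's actual betting setup:

```python
import random

def split_by_year(data_by_year, test_fraction=0.1, seed=42):
    # Hold out test_fraction of EACH year's games, train on the rest.
    rng = random.Random(seed)
    train, test = [], []
    for year, games in data_by_year.items():
        games = list(games)
        rng.shuffle(games)
        n_test = max(1, int(len(games) * test_fraction))
        test.extend(games[:n_test])   # 10% of each year, set aside
        train.extend(games[n_test:])  # the rest is for training
    return train, test

# Fake stand-in data: 10 years, 20 games per year.
data = {year: [f"game-{year}-{i}" for i in range(20)]
        for year in range(2003, 2013)}
train, test = split_by_year(data)
print(len(train), len(test))  # 180 20
```

Splitting within each year (rather than holding out whole years) keeps the test set representative of every season.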
Now, there are a lot of tricks and tweaks and stuff, but this is the basics.
::emp::