Hi, I'm David Miller, and I'm going to show you how easy it is to make a classic back-propagation neural net simulator in C++. This tutorial is for the beginning-to-intermediate C++ programmer: if you can write a simple console program with some simple classes, then you can certainly make a neural net in C++. In addition to showing how the neural net works, we'll talk about some basic C++ programming concepts such as class design, prototyping, data hiding, encapsulation, and things like that. The tutorial does not require any prior experience with more advanced topics like exception handling, class inheritance, or threads, so this will be a simple program, and you can download the resulting source code at my blog, which you can reach from our website.
A neural net, at the highest level of abstraction, the black-box level, is just super simple. All it does is eat the numbers you put on its inputs and emit numbers on its outputs, so it's just a mathematical transform. What's the big deal about that?
Well, the big deal is that sometimes you want to get from your inputs to your outputs, but you're not sure of the formula, and you're not even sure how to mathematically derive the formula. But if you have a lot of real-world training data, where you know what the output should be given certain inputs,
then you can throw all that training data at your neural net, and if it's successful, it will learn what the transform is. Then it will give you a reasonable result even on inputs that it has never seen before.
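To make that "mathematical transform" idea concrete, here's a minimal sketch of a single neuron, the building block inside the box: it eats input numbers and emits one output number via a weighted sum squashed by an activation function. The function name `neuronOutput` and the use of `tanh` are illustrative choices for this sketch, not the tutorial's final code; the weights here are arbitrary, not trained.

```cpp
#include <cmath>
#include <vector>

// A single neuron as a tiny "black box": numbers go in, one number
// comes out. The output is the tanh of the weighted sum of the inputs,
// which squashes the result into the range (-1, 1).
double neuronOutput(const std::vector<double> &inputs,
                    const std::vector<double> &weights)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < inputs.size(); ++i) {
        sum += inputs[i] * weights[i];
    }
    return std::tanh(sum);
}
```

Training, which we'll get to later, is just the process of adjusting those weights until the transform matches the training data.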
Now, if you look inside the black box at the next level of detail, it's pretty clear that the net contains just a bunch of these neurons that are tightly connected and arranged in columns called layers. There's always an input layer of neurons that accepts the input numbers, and there's always a layer of output neurons, in this case just one,
but however many you need for the problem you're solving. The outputs of the output-layer neurons become the outputs of the entire neural net. In the middle, in between the input and output layers, you have one or more hidden layers of neurons.
And they're all connected: each neuron is fully connected to all the neurons in the next layer to the right. You could have neural nets that are sparsely connected, or that have connections feeding back to earlier layers, which would give your neural net a little bit of memory. But for this tutorial we're going to keep things simple and just assume that the net is fully connected and feed-forward.
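The layered, fully connected structure just described can be sketched in C++ as a vector of layers, where each layer is a vector of neurons and each neuron holds one connection weight per neuron in the layer to its right. The names `Neuron`, `Layer`, and `buildLayers`, and the example topology `{2, 4, 1}` (2 inputs, one hidden layer of 4, 1 output), are illustrative placeholders, not the tutorial's final API.

```cpp
#include <cstddef>
#include <vector>

// One neuron: its current output value, plus one connection weight
// for each neuron in the next layer to the right.
struct Neuron {
    std::vector<double> outputWeights;
    double outputVal = 0.0;
};

typedef std::vector<Neuron> Layer;

// topology lists the neuron count of each layer from left to right,
// e.g. {2, 4, 1} for 2 inputs, 4 hidden neurons, and 1 output.
std::vector<Layer> buildLayers(const std::vector<unsigned> &topology)
{
    std::vector<Layer> layers;
    for (std::size_t i = 0; i < topology.size(); ++i) {
        // Fully connected: each neuron gets a weight for every neuron
        // in the next layer. The output layer has no layer to its
        // right, hence no weights.
        unsigned numOutputs =
            (i == topology.size() - 1) ? 0 : topology[i + 1];
        Layer layer;
        for (unsigned n = 0; n < topology[i]; ++n) {
            Neuron neuron;
            neuron.outputWeights.resize(numOutputs, 0.0);
            layer.push_back(neuron);
        }
        layers.push_back(layer);
    }
    return layers;
}
```

For example, `buildLayers({2, 4, 1})` yields three layers in which each input-layer neuron holds four weights, one per hidden neuron, and the lone output neuron holds none.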