Train Neural Network

noise 101 Jul 12, 2004 at 11:53

Hello people,
I have a 3-layer neural network (input, hidden, output) using sigmoid functions.
I can train the network to learn the XOR function perfectly.
However, when I try to train the network to learn the sine function, it fails.
The training dataset is a [41 rows x 2 columns] array.
The values in the first column run from -1 to +1, and the corresponding sine values are stored in the second column.
The network error remains the same (around 0.5) no matter how many iterations I run.
Does anyone have an idea what is wrong with the network?
Thank you in advance… really waiting for your feedback…

10 Replies


anubis 101 Jul 12, 2004 at 17:51

try changing the number of neurons. there might be unlucky constellations where the network gets locally stuck

noise 101 Jul 13, 2004 at 00:54

@anubis

try changing the number of neurons. there might be unlucky constellations where the network gets locally stuck

Thank you for your reply.
Well, I did try, but it is still not working… After changing the parameters, I found out that the network fails with any training dataset other than the 4x2 matrix I used for training the XOR function.
Do you have any idea what the problem could be?
Thank you…

noise 101 Jul 13, 2004 at 01:52

I took out the bias update and it works… I really don't understand why I must not update the bias…
Now the problem is: it cannot learn negative values… and the network error cannot go below 0.04…
I tried changing the number of neurons, the learning rate, and the momentum, but it still cannot go below 0.04…
Does anyone have an idea…
Any feedback will be very much appreciated…
Thank you…
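The "cannot learn negative values" symptom again points at the output activation. A self-contained sketch (not the poster's code; the hidden-layer size and learning rate are assumptions) of a 1-hidden-layer network with tanh hidden units and a linear output, trained by plain per-sample gradient descent on the sine dataset, with the bias updates kept in:

```python
import math
import random

random.seed(0)

H = 8      # hidden units (an assumption; the posts don't say how many)
LR = 0.1   # learning rate (also an assumption)

# input -> hidden weights/biases, hidden -> output weights/bias
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [random.uniform(-1, 1) for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = random.uniform(-1, 1)

def forward(x):
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    y = sum(w2[i] * h[i] for i in range(H)) + b2  # linear output: can go negative
    return h, y

# the 41-sample sine dataset from the first post
data = [(-1 + i * 0.05, math.sin(-1 + i * 0.05)) for i in range(41)]

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

before = mse()
for epoch in range(2000):
    for x, t in data:
        h, y = forward(x)
        err = y - t                                   # dE/dy for E = (y - t)^2 / 2
        for i in range(H):
            grad_h = err * w2[i] * (1 - h[i] ** 2)    # backprop through tanh
            w2[i] -= LR * err * h[i]
            w1[i] -= LR * grad_h * x
            b1[i] -= LR * grad_h
        b2 -= LR * err
after = mse()

print(before, after)      # the error drops well below the 0.04 floor
print(forward(-1.0)[1])   # a negative prediction is now reachable
```

Note the bias updates are kept here; removing them (as in the post) masks the real bug rather than fixing it. Sigmoid hidden units would also work, as long as the output unit stays linear or tanh so negative targets are reachable.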

anubis 101 Jul 13, 2004 at 06:41

i saw your code but am reluctant to debug it :D
i can't think of any typical error right now, and small errors in your code can cause havoc in the network

noise 101 Jul 13, 2004 at 14:05

@anubis

i saw your code but am reluctant to debug it :D
i can't think of any typical error right now, and small errors in your code can cause havoc in the network

Thank you for viewing the code…
I modified the code and posted it here.
Could you have a look?
Thank you very much

anubis 101 Jul 13, 2004 at 15:07

“i’m reluctant to debug it” means i don’t enjoy browsing through other people’s code to find the one error that kills everything

noise 101 Jul 16, 2004 at 18:58

@anubis

“i’m reluctant to debug it” means i don’t enjoy browsing through other people’s code to find the one error that kills everything

I see…
OK, please give me some ideas on these:
*the smallest network error when training the sine function is around 0.04, and when training the function 1/x^2 it is around 0.01… no matter how many iterations I set… Why is this so? Is it true that the more iterations, the smaller the network error? I use mean squared error.
*the error is not just going down… it goes up and down, very unstable…
Thank you very much…
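Both observations are expected with per-sample (online) gradient descent and a model of finite capacity: there is an error floor the model cannot go below, and each sample's update can raise the error on the rest of the set. A toy illustration (hypothetical numbers, one weight, two conflicting targets):

```python
# One-weight model y = w * x, trained per-sample on conflicting targets:
# no single w fits both, so an irreducible MSE floor of 0.04 remains
# (at w = 0.5), and each per-sample update can push the overall error
# back up before the next one pulls it down again.
samples = [(1.0, 0.3), (1.0, 0.7)]

def mse(w):
    return sum((w * x - t) ** 2 for x, t in samples) / len(samples)

def train(lr, epochs=10):
    w, history = 0.0, []
    for _ in range(epochs):
        for x, t in samples:
            w -= lr * 2 * (w * x - t) * x   # gradient of (w*x - t)^2
            history.append(mse(w))
    return w, history

w_big, hist_big = train(lr=0.8)      # big steps: error bounces up and down
w_small, hist_small = train(lr=0.1)  # small steps: settles near the 0.04 floor
```

So more iterations do not guarantee a smaller error: past the floor set by the model's capacity and the data, extra iterations just move the weights around it. A smaller learning rate, or averaging the updates over the whole batch before applying them, damps the up-and-down behaviour.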

anubis 101 Jul 16, 2004 at 19:48

yeah… that’s normal. you see, when training a network you don’t have the usual kind of approximation that gets better and better with each iteration. because of this, your training algorithm might get locally stuck (imagine it like a car running out of fuel between two hills). also, a small error might always remain. after all, it’s only an approximation.
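The "car between two hills" picture can be made concrete with a few lines of gradient descent on a function that has two valleys of different depth (the function here is made up purely for illustration):

```python
def f(x):
    # two valleys: a shallow one near x = +1, a deeper one near x = -1
    return (x * x - 1) ** 2 + 0.3 * x

def df(x):
    # derivative of f
    return 4 * x * (x * x - 1) + 0.3

x = 0.8                  # start on the right-hand slope
for _ in range(1000):
    x -= 0.01 * df(x)    # plain gradient descent

# descent settles in the nearby shallow valley (x near +1) even though
# the valley near -1 is lower: the "car" stops in the first valley it
# rolls into, with no fuel to climb the hill between them
print(x, f(x))
```

A network's error surface works the same way, just in many more dimensions: gradient descent only ever rolls downhill from where it currently is, which is why restarts with different initial weights (or different numbers of neurons, as suggested above) can land in a better valley.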

noise 101 Jul 20, 2004 at 18:55

@anubis

yeah… that’s normal. you see, when training a network you don’t have the usual kind of approximation that gets better and better with each iteration. because of this, your training algorithm might get locally stuck (imagine it like a car running out of fuel between two hills). also, a small error might always remain. after all, it’s only an approximation.

I have another question: if I build a network which correctly approximates the XOR function, can I then say that my network is correctly built?
Thank you.

anubis 101 Jul 20, 2004 at 21:19

there is no “correct” approximation for a network. imagine it like driving a car through hills and valleys. at some point you get stuck in a valley because you run out of fuel. there is no guarantee, however, that the valley you found is the optimal valley (the one giving the correct answer); in fact, there is no guarantee that the optimal valley even exists.
read up a bit on neural networks and you will understand my rather simple example