Neural networks: anger!

Filed under Neural networks

I really love neural networks – they can do almost anything, and I am convinced they will become a bigger and bigger part of our future; from big data to site recommendations, I see them turning up everywhere…

But I can’t help being annoyed sometimes when working with nets. Some time ago I managed to write Annfid entirely with neural networks (using Encog), which I think is great for forensic investigations, but sometimes you get stuck on little things, and that is where the pain starts.

So, I wanted to write a little article here on how to get going with neural networks, and that is where the disappointment came. I fed the net data like:

1 + 1 = -4

2 + 2 = -2

3 + 3 = 0

4 + 4 = 2

5 + 5 = 4

On 6 it becomes spooky, but here is more or less the baseline: every number actually stands for n − 3. This means that 3 maps to 0, and the results turn out normal: (n − 3) + (n − 3) = 2n − 6, which reproduces every line above. Strangely enough, the neural network could not resolve this simple pattern! Now, I have to admit that for this code I used a new version of Encog, but nevertheless the net gets stuck during training.
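A quick sanity check of that rule (a little Python sketch, not part of the original post): decoding each printed symbol n as n − 3 reproduces the strange sums exactly.

```python
# Each printed symbol n actually stands for the value n - 3.
def decode_symbol(n):
    return n - 3

# Verify that "n + n" reproduces every result from the post.
for n, expected in [(1, -4), (2, -2), (3, 0), (4, 2), (5, 4)]:
    result = decode_symbol(n) + decode_symbol(n)
    assert result == expected
    print(f"{n} + {n} = {result}")
```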
The code is below, in case someone reads it and has some ideas…
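The original Encog code is not reproduced here, so as a stand-in, here is a minimal pure-Python sketch of the same kind of setup: a single sigmoid output unit trained by gradient descent on the raw pairs from the post. It stalls in just the way described — the output never gets anywhere near targets like −4.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Training pairs exactly as in the post: inputs (n, n), raw targets.
data = [((1, 1), -4), ((2, 2), -2), ((3, 3), 0), ((4, 4), 2), ((5, 5), 4)]

random.seed(0)
w1, w2, b = random.uniform(-1, 1), random.uniform(-1, 1), 0.0
lr = 0.1

for epoch in range(2000):
    for (a, c), t in data:
        y = sigmoid(w1 * a + w2 * c + b)
        g = 2 * (y - t) * y * (1 - y)  # d(squared error)/d(pre-activation)
        w1 -= lr * g * a
        w2 -= lr * g * c
        b -= lr * g

# However long we train, the sigmoid output stays inside (0, 1).
pred = sigmoid(w1 * 1 + w2 * 1 + b)
print(pred)
```

This is only an illustration of the symptom, not the author's actual network topology or Encog's training code.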

By the way… new RequiredImprovementStrategy() is really a great idea (it resets the net if there is no improvement of more than 1% after the specified number of cycles)!
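To illustrate the idea (this is a sketch of the concept in Python, not Encog's actual implementation): track the best error seen so far, count epochs without a >1% improvement, and re-initialize when the count hits the limit.

```python
def train_with_reset(step, reset, cycles=10, min_improvement=0.01, max_epochs=100):
    """Call reset() whenever `cycles` consecutive epochs pass without the
    error improving by more than `min_improvement` (1% by default) --
    the idea behind Encog's RequiredImprovementStrategy, not its code."""
    best, stagnant = float("inf"), 0
    for _ in range(max_epochs):
        error = step()
        if error < best * (1 - min_improvement):
            best, stagnant = error, 0  # real progress: remember it
        else:
            stagnant += 1
            if stagnant >= cycles:
                reset()  # stuck: re-initialize the net
                best, stagnant = float("inf"), 0
    return best

# Toy usage: a trainer whose error never improves gets reset repeatedly.
resets = []
best = train_with_reset(step=lambda: 1.0,
                        reset=lambda: resets.append(1),
                        cycles=10, max_epochs=50)
print(len(resets))
```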

After this post I wrote on the Encog forum. The project owner (Jeff Heaton – he actually answers on the forum!) pointed out that the sigmoid activation requires input between 0 and 1.
So, here is the corrected code, where 1 becomes 0.1 and 9 becomes 0.9; the results are much better (sorry, I wrote it in C# this time)!
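The corrected C# code is not reproduced here either, but the fix itself is easy to sketch (in Python, as an illustration of the stated mapping 1 → 0.1 … 9 → 0.9, i.e. dividing by 10): scale the digits into the range a sigmoid can actually handle, and scale the net's answer back afterwards.

```python
def scale(n):
    """Map a digit 1..9 into the (0, 1) range a sigmoid works with."""
    return n / 10.0

def unscale(x):
    """Map a value in (0, 1) back to a digit."""
    return round(x * 10)

# 1 becomes 0.1, 9 becomes 0.9, exactly as in the corrected code.
print(scale(1), scale(9))
print(unscale(scale(7)))
```

How the negative results like −4 were encoded is not stated in the post, so this sketch only covers the digit mapping it mentions.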
