
Author Topic: artificial neural networks | back propagation  (Read 4155 times)

cpalx

  • Hero Member
  • *****
  • Posts: 533
artificial neural networks | back propagation
« on: November 02, 2012, 01:32:58 pm »
Is there any place where I can download the source code of a back-propagation artificial neural network? (Delphi or Lazarus)

Leledumbo

  • Hero Member
  • *****
  • Posts: 7680
  • Programming + Glam Metal + Tae Kwon Do = Me
Re: artificial neural networks | back propagation
« Reply #1 on: November 02, 2012, 05:30:07 pm »
None that I know of (I implemented a part of it when I took the class a few years back, but I lost the code); however, there's a Delphi binding for FANN.
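A minimal sketch of what using FANN from Pascal could look like, assuming the binding simply exposes the plain C API names (fann_create_standard, fann_train_on_file, fann_save, fann_destroy) and a PFann pointer type; check the binding's own unit for the real declarations:

Code: Pascal  [Select]
program FannXorSketch;
uses
  fann; // hypothetical unit name for the FANN binding
var
  ann: PFann;
begin
  // 3 layers: 2 inputs, 3 hidden neurons, 1 output
  ann := fann_create_standard(3, 2, 3, 1);
  // train on a data file in FANN's plain-text training format
  fann_train_on_file(ann, 'xor.data', 500000, 1000, 0.001);
  fann_save(ann, 'xor.net');
  fann_destroy(ann);
end.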

schuler

  • New member
  • *
  • Posts: 35
Re: artificial neural networks | back propagation
« Reply #2 on: August 09, 2017, 11:13:26 pm »
 :) Hello  :)
Just to let you know that I've just implemented a backpropagation algorithm in Lazarus:

https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ubackpropagation.pas

 :) Have fun :)

avra

  • Hero Member
  • *****
  • Posts: 1123
    • Additional info
Re: artificial neural networks | back propagation
« Reply #3 »
Try here: http://forum.lazarus.freepascal.org/index.php/topic,32620.msg210473.html#msg210473
ct2laz - Easily convert components and projects between Lazarus and CodeTyphon

Thaddy

  • Hero Member
  • *****
  • Posts: 4517
Re: artificial neural networks | back propagation
« Reply #4 on: August 10, 2017, 10:29:24 am »
Try here: http://forum.lazarus.freepascal.org/index.php/topic,32620.msg210473.html#msg210473
Nope, you missed the point. That is actually very nice and concise code... Short is better than long. Playing with it...
"Logically, no number of positive outcomes at the level of experimental testing can confirm a scientific theory, but a single counterexample is logically decisive."

avra

  • Hero Member
  • *****
  • Posts: 1123
    • Additional info
Re: artificial neural networks | back propagation
« Reply #5 on: August 10, 2017, 09:06:49 pm »
Try here: http://forum.lazarus.freepascal.org/index.php/topic,32620.msg210473.html#msg210473
Nope, you missed the point
Would you be so kind as to explain why? I don't get it. What's wrong with the simple back-propagation Pascal code that you get after following the first link:
http://wayback.archive.org/web/20100926121015/http://richardbowles.tripod.com:80/neural/source/simple1.htm
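For context, the heart of that kind of simple tutorial code is the classic delta-rule update for a sigmoid neuron. A minimal single-neuron sketch (variable names are mine, not taken from the linked page):

Code: Pascal  [Select]
program DeltaRuleDemo;
const
  LearningRate = 0.5;
var
  W: array[0..2] of single = (0.1, -0.2, 0.05); // W[0] is the bias weight
  Inputs: array[1..2] of single = (1.0, 0.0);
  Target, Output, Sum, Delta: single;
  i: integer;
begin
  Target := 1.0;
  // forward pass: weighted sum squashed by a sigmoid
  Sum := W[0];
  for i := 1 to 2 do
    Sum := Sum + W[i] * Inputs[i];
  Output := 1.0 / (1.0 + Exp(-Sum));
  // delta rule: error scaled by the sigmoid derivative Output*(1-Output)
  Delta := (Target - Output) * Output * (1.0 - Output);
  W[0] := W[0] + LearningRate * Delta;
  for i := 1 to 2 do
    W[i] := W[i] + LearningRate * Delta * Inputs[i];
  WriteLn('output before update: ', Output:0:4);
end.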
ct2laz - Easily convert components and projects between Lazarus and CodeTyphon

schuler

  • New member
  • *
  • Posts: 35
Re: artificial neural networks | back propagation
« Reply #6 on: August 10, 2017, 09:52:50 pm »
 :) HELLO  :)

Just added a new constructor. It allows you to create a network of any size.

Code: Pascal  [Select]
const
  aNetworkSize: array[0..3] of integer = (3, 1000, 1000, 3);
var
  B: TBackPropagation;
begin
  B := TBackPropagation.Create(aNetworkSize);
  // ...
end;

The above means that you are creating a network with:
1 input layer with 3 nodes.
2 hidden layers with 1000 nodes each.
1 output layer with 3 nodes.

I just found that using "0.9" as the target output was saturating the weights on networks with more than 10 layers. Using "0.1" seems to work well on these networks. The reason is the derivative: it vanishes as the output approaches 1 or -1, so the weights are forced to extreme values to reach targets that close to the limits.
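To see why, note that the derivative of tanh, written in terms of its output o, is 1 - o*o. A quick numerical check of the two target choices:

Code: Pascal  [Select]
program TanhSaturation;
var
  o: single;
begin
  // tanh derivative expressed through its output: f'(o) = 1 - o*o
  o := 0.9;
  WriteLn('gradient factor at 0.9: ', (1.0 - o * o):0:2); // prints 0.19
  o := 0.1;
  WriteLn('gradient factor at 0.1: ', (1.0 - o * o):0:2); // prints 0.99
end.

At an output of 0.9 the factor is only 0.19, and those factors multiply as the error travels back through many layers, so the weights have to blow up to compensate; at 0.1 the factor stays near 0.99.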

Code: Pascal  [Select]
const outputs : TBackInputOutput =
  (// XOR, AND,  OR
    (-0.1,-0.1,-0.1),
    ( 0.1,-0.1, 0.1),
    ( 0.1,-0.1, 0.1),
    (-0.1, 0.1, 0.1)
  );

My next step: I'll create a class specific to data classification, extending the backpropagation class.

 :) Have Fun :)

Leledumbo

  • Hero Member
  • *****
  • Posts: 7680
  • Programming + Glam Metal + Tae Kwon Do = Me
Re: artificial neural networks | back propagation
« Reply #7 on: August 12, 2017, 01:01:54 am »
:) Hello  :)
Just to let you know that I've just implemented a backpropagation algorithm in Lazarus:

https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ubackpropagation.pas

 :) Have fun :)
Ah... if only you had made it 7 years earlier, I might have gotten an A in that class :P

schuler

  • New member
  • *
  • Posts: 35
Re: artificial neural networks | back propagation
« Reply #8 on: September 06, 2017, 06:10:15 am »
:) Hello :)

It's interesting how far you can push backpropagation in a convolutional neural network.

I've just finished coding a convolutional neural network in plain Object Pascal. It's now time for a long testing phase.

In case you have never heard of convolutional neural networks, there is a good example here:
http://cs.stanford.edu/people/karpathy/convnetjs/demo/cifar10.html

This is the example from ConvNetJS:
Code: Javascript  [Select]
layer_defs = [];
layer_defs.push({type:'input', out_sx:32, out_sy:32, out_depth:3});
layer_defs.push({type:'conv', sx:5, filters:16, stride:1, pad:2, activation:'relu'});
layer_defs.push({type:'pool', sx:2, stride:2});
layer_defs.push({type:'conv', sx:5, filters:20, stride:1, pad:2, activation:'relu'});
layer_defs.push({type:'pool', sx:2, stride:2});
layer_defs.push({type:'conv', sx:5, filters:20, stride:1, pad:2, activation:'relu'});
layer_defs.push({type:'pool', sx:2, stride:2});
layer_defs.push({type:'softmax', num_classes:10});

This is a similar implementation in Pascal using the brand-new code:
Code: Pascal  [Select]
  NumClasses := 10;
  NN := TNNet.Create();
  NN.AddLayer( TNNetInput.Create(32,32,3) );           // 32x32 RGB input
  NN.AddLayer( TNNetConvolutionRelu.Create(16,5,2,0) );
  NN.AddLayer( TNNetMaxPool.Create(2) );               // 2x2 max pooling
  NN.AddLayer( TNNetConvolutionRelu.Create(20,5,2,0) );
  NN.AddLayer( TNNetMaxPool.Create(2) );
  NN.AddLayer( TNNetConvolutionRelu.Create(20,5,2,0) );
  NN.AddLayer( TNNetMaxPool.Create(2) );
  NN.AddLayer( TNNetLayerFullConnect.Create(NumClasses) );
  NN.Randomize();

The pascal source code can be found here:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/uconvolutionneuralnetwork.pas
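In case someone wants to try it before any documentation arrives, a single training step could look roughly like this (Compute, GetOutput, and Backpropagate are my reading of the unit's API; the actual method names may differ, so check the source above):

Code: Pascal  [Select]
var
  Input, Output, Target: TNNetVolume;
begin
  // one supervised step on a single 32x32x3 sample
  NN.Compute(Input);        // forward pass
  NN.GetOutput(Output);     // read the 10-class prediction
  NN.Backpropagate(Target); // push the error back through all layers
end;

Looping that over the dataset is, in spirit, the whole training phase.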

 :) I wish everyone happy coding  :)

elxanders

  • Newbie
  • Posts: 1
Re: artificial neural networks | back propagation
« Reply #9 on: September 08, 2017, 05:07:37 am »
Hello. I was going to experiment with neural networks, did not want to write one myself, and found your library (ubackpropagation). I haven't tried it yet, but a library with no saving/loading routines does not look very promising for real work.
P.S. Your convolutional module still has a notice that it is unfinished and should not be used.
« Last Edit: September 08, 2017, 05:10:54 am by elxanders »

schuler

  • New member
  • *
  • Posts: 35
Re: artificial neural networks | back propagation
« Reply #10 on: September 12, 2017, 10:17:16 am »
elxanders, sorry for taking so long to reply.

The current state: I'm coding a self-adapting learning rate. When both the hyperbolic tangent and ReLU approaches :o are stable and users don't need to spend a weekend tuning the learning rate, it will be time for load/store methods. This might be ready in a month or two.
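For the curious, one classic self-adaptation scheme is the "bold driver" heuristic: speed the rate up while the error keeps falling, and cut it back hard when the error rises. A sketch of the idea (purely illustrative, not necessarily what CAI will do):

Code: Pascal  [Select]
{ Bold-driver style learning-rate adaptation (illustrative only). }
procedure AdaptLearningRate(CurrentError: single;
  var PreviousError, LearningRate: single);
begin
  if CurrentError > PreviousError then
    LearningRate := LearningRate * 0.5    // overshoot: back off hard
  else
    LearningRate := LearningRate * 1.05;  // still improving: speed up gently
  PreviousError := CurrentError;
end;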

 :) happy coding to all pascal lovers :)

schuler

  • New member
  • *
  • Posts: 35
Re: artificial neural networks | back propagation
« Reply #11 on: September 29, 2017, 10:19:39 pm »
:) Hello Pascal Lovers :)

@elxanders

The hyperbolic tangent and ReLU approaches both seem to be numerically stable (no overflow or underflow) as long as the input values are transformed as in the example and initialization calls the same Randomize function used in testing.
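By "transformed as in the example" I mean scaling the raw pixel bytes into a small symmetric range before feeding them to the network; something along these lines (just the shape of the idea, the exact transform is in the test program linked below):

Code: Pascal  [Select]
// map a raw CIFAR-10 pixel byte [0..255] into a small symmetric range
function PixelToInput(B: byte): single;
begin
  Result := (B - 128) / 256.0; // roughly [-0.5 .. +0.5)
end;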

Numerical stability seems to hold in plain Pascal code, 32-bit AVX, and 64-bit AVX (see the uvolume unit).

Also, there is no longer any need to spend hours finding the proper learning rate. The default behavior seems to work well in all 6 test algorithms. The test file is located here:

https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/testcnnalgo/testcnnalgo.lpr

All tests so far have been done with CIFAR10 classification.

Although more 48-hour test runs are required, the first 16-hour runs look promising.

So it looks like I'll soon be coding the load/save functions.

:) I wish happy coding to all pascal lovers :)

 
