Author Topic: artificial neural networks | back propagation  (Read 29288 times)

cpalx

  • Hero Member
  • *****
  • Posts: 753
artificial neural networks | back propagation
« on: November 02, 2012, 01:32:58 pm »
Is there any place where I can download the source code of a backpropagation artificial neural network? (Delphi or Lazarus)

Leledumbo

  • Hero Member
  • *****
  • Posts: 8746
  • Programming + Glam Metal + Tae Kwon Do = Me
Re: artificial neural networks | back propagation
« Reply #1 on: November 02, 2012, 05:30:07 pm »
None that I know of (I implemented part of it when I took the class a few years back, but I lost the code); however, there's a Delphi binding for FANN.

schuler

  • Full Member
  • ***
  • Posts: 223
Re: artificial neural networks | back propagation
« Reply #2 on: August 09, 2017, 11:13:26 pm »
 :) Hello  :)
Just to let you know that I've just implemented a backpropagation algorithm in Lazarus:

https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ubackpropagation.pas

 :) Have fun :)

avra

  • Hero Member
  • *****
  • Posts: 2514
  • Additional info
Re: artificial neural networks | back propagation
« Reply #3 »
Try here: http://forum.lazarus.freepascal.org/index.php/topic,32620.msg210473.html#msg210473
ct2laz - Conversion between Lazarus and CodeTyphon
bithelpers - Bit manipulation for standard types
pasettimino - Siemens S7 PLC lib

Thaddy

  • Hero Member
  • *****
  • Posts: 14204
  • Probably until I exterminate Putin.
Re: artificial neural networks | back propagation
« Reply #4 on: August 10, 2017, 10:29:24 am »
Quote from: avra
Try here: http://forum.lazarus.freepascal.org/index.php/topic,32620.msg210473.html#msg210473

Nope, you missed the point. That is actually very nice and concise code... Short is better than long. Playing with it...
Specialize a type, not a var.

avra

  • Hero Member
  • *****
  • Posts: 2514
    • Additional info
Re: artificial neural networks | back propagation
« Reply #5 on: August 10, 2017, 09:06:49 pm »
Quote from: Thaddy
Quote from: avra
Try here: http://forum.lazarus.freepascal.org/index.php/topic,32620.msg210473.html#msg210473
Nope, you missed the point

Would you be so kind as to explain why? I don't get it. What's wrong with the simple backpropagation Pascal code that you get after following the first link?
http://wayback.archive.org/web/20100926121015/http://richardbowles.tripod.com:80/neural/source/simple1.htm
ct2laz - Conversion between Lazarus and CodeTyphon
bithelpers - Bit manipulation for standard types
pasettimino - Siemens S7 PLC lib

schuler

  • Full Member
  • ***
  • Posts: 223
Re: artificial neural networks | back propagation
« Reply #6 on: August 10, 2017, 09:52:50 pm »
 :) HELLO  :)

Just added a new constructor. It allows you to create a network of any size.

Code: Pascal
var
  B: TBackPropagation;
const
  aNetworkSize: array[0..3] of integer = (3, 1000, 1000, 3);
begin
  // creates a 3 x 1000 x 1000 x 3 network
  B := TBackPropagation.Create(aNetworkSize);
end;

The above means that you are creating a network with:
1 input layer with 3 nodes.
2 hidden layers with 1000 nodes each.
1 output layer with 3 nodes.

Just found that using "0.9" as the target output was saturating the weights on networks with more than 10 layers; using "0.1" seems to work well on these networks. The reason is that the derivative of the activation function almost vanishes when the output gets too close to 1 or -1, so the weights must grow very large to push the output that far.
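To illustrate (a minimal sketch, assuming the hyperbolic tangent activation; this is not library code): the derivative of tanh at output y is 1 - y², so the gradient shrinks rapidly as y approaches ±1.

Code: Pascal
program TanhSaturation;
// Sketch: the derivative of tanh(x) at output y is 1 - y*y.
// It shrinks as |y| approaches 1, which is what saturates the weights.
var
  y, d: single;
begin
  y := 0.1;  d := 1 - y * y;
  WriteLn('derivative at y=0.1:  ', d:6:4); // 0.9900
  y := 0.9;  d := 1 - y * y;
  WriteLn('derivative at y=0.9:  ', d:6:4); // 0.1900
  y := 0.99; d := 1 - y * y;
  WriteLn('derivative at y=0.99: ', d:6:4); // 0.0199
end.

Hence the training targets below use ±0.1 rather than ±0.9: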

Code: Pascal
const outputs : TBackInputOutput =
  (// XOR,  AND,   OR
    (-0.1, -0.1, -0.1),
    ( 0.1, -0.1,  0.1),
    ( 0.1, -0.1,  0.1),
    (-0.1,  0.1,  0.1)
  );

My next step: a class specific to data classification, extending the backpropagation class.

 :) Have Fun :)

Leledumbo

  • Hero Member
  • *****
  • Posts: 8746
  • Programming + Glam Metal + Tae Kwon Do = Me
Re: artificial neural networks | back propagation
« Reply #7 on: August 12, 2017, 01:01:54 am »
Quote from: schuler
:) Hello :)
Just to let you know that I've just implemented a backpropagation algorithm in Lazarus:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ubackpropagation.pas
:) Have fun :)

Ah... if only you had made it 7 years earlier, I might have gotten an A in that class :P

schuler

  • Full Member
  • ***
  • Posts: 223
Re: artificial neural networks | back propagation
« Reply #8 on: September 06, 2017, 06:10:15 am »
:) Hello :)

It's interesting how far you can push backpropagation in a convolutional neural network.

I've just finished coding a convolutional neural network in plain Object Pascal. It's now time for a long testing phase.

In case you have never heard of convolutional neural networks, there is a good example here:
http://cs.stanford.edu/people/karpathy/convnetjs/demo/cifar10.html

This is the example from ConvNetJS:
Code: Javascript
layer_defs = [];
layer_defs.push({type:'input', out_sx:32, out_sy:32, out_depth:3});
layer_defs.push({type:'conv', sx:5, filters:16, stride:1, pad:2, activation:'relu'});
layer_defs.push({type:'pool', sx:2, stride:2});
layer_defs.push({type:'conv', sx:5, filters:20, stride:1, pad:2, activation:'relu'});
layer_defs.push({type:'pool', sx:2, stride:2});
layer_defs.push({type:'conv', sx:5, filters:20, stride:1, pad:2, activation:'relu'});
layer_defs.push({type:'pool', sx:2, stride:2});
layer_defs.push({type:'softmax', num_classes:10});

This is a similar implementation in Pascal using the brand new code:
Code: Pascal
NumClasses := 10;
NN := TNNet.Create();
NN.AddLayer( TNNetInput.Create(32,32,3) );
NN.AddLayer( TNNetConvolutionRelu.Create(16,5,2,0) );
NN.AddLayer( TNNetMaxPool.Create(2) );
NN.AddLayer( TNNetConvolutionRelu.Create(20,5,2,0) );
NN.AddLayer( TNNetMaxPool.Create(2) );
NN.AddLayer( TNNetConvolutionRelu.Create(20,5,2,0) );
NN.AddLayer( TNNetMaxPool.Create(2) );
NN.AddLayer( TNNetLayerFullConnect.Create(NumClasses) );
NN.Randomize();

The Pascal source code can be found here:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/uconvolutionneuralnetwork.pas

 :) I wish everyone happy coding  :)

elxanders

  • Newbie
  • Posts: 2
Re: artificial neural networks | back propagation
« Reply #9 on: September 08, 2017, 05:07:37 am »
Hello. I was going to experiment with neural networks, did not want to write one myself, and found your library (ubackpropagation). I haven't tried it yet, but so far a library with no saving/loading routines does not seem very promising for real work.
P.S. Your convolutional module still carries a notice that it is unfinished and should not be used.
« Last Edit: September 08, 2017, 05:10:54 am by elxanders »

schuler

  • Full Member
  • ***
  • Posts: 223
Re: artificial neural networks | back propagation
« Reply #10 on: September 12, 2017, 10:17:16 am »
elxanders, sorry for taking so long to reply.

The current state is: coding a self-adapting learning rate. When both the hyperbolic tangent and ReLU approaches :o are stable and users don't need to spend a weekend tuning the learning rate, it will be time for the load/store methods. This might be ready in a month or two.

 :) happy coding to all pascal lovers :)

schuler

  • Full Member
  • ***
  • Posts: 223
Re: artificial neural networks | back propagation
« Reply #11 on: September 29, 2017, 10:19:39 pm »
:) Hello Pascal Lovers :)

@elxanders

The hyperbolic tangent and ReLU approaches both seem to be numerically stable (no overflow or underflow) as long as input values are transformed as in the example and initialization calls the same Randomize function used in testing.

Numerical stability is observed in the plain Pascal code as well as in the 32-bit and 64-bit AVX versions (see the uvolume unit).

Also, there is no longer any need to spend hours finding the proper learning rate; the default behavior seems to work well on all 6 test algorithms. The testing file is located here:

https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/testcnnalgo/testcnnalgo.lpr

All tests so far have been done with CIFAR10 classification.

Although more 48-hour test runs are required, the first 16-hour runs look promising.

So it seems that I'll soon be coding the load/save functions.

:) I wish happy coding to all pascal lovers :)

schuler

  • Full Member
  • ***
  • Posts: 223
Re: artificial neural networks | back propagation
« Reply #12 on: November 04, 2017, 02:19:28 am »
@elxanders

Just finished coding load/save of the neural network. You can use these methods:
Code: Pascal
function TNNet.SaveToString(): string;
procedure TNNet.SaveToFile(filename: string);

procedure TNNet.LoadFromString(strData: string);
procedure TNNet.LoadFromFile(filename: string);

You can also load/save data and/or structure only:

Code: Pascal
function TNNet.SaveDataToString(): string;
procedure TNNet.LoadDataFromString(strData: string);

function TNNet.SaveStructureToString(): string;
procedure TNNet.LoadStructureFromString(strData: string);
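
A minimal round trip with these methods could look like this (a sketch only; 'mynet.nn' is an illustrative filename, and training is elided):

Code: Pascal
var
  NN, NN2: TNNet;
begin
  NN := TNNet.Create();
  // ... add layers, call NN.Randomize() and train as in the examples above ...
  NN.SaveToFile('mynet.nn');    // writes structure and weights
  NN2 := TNNet.Create();
  NN2.LoadFromFile('mynet.nn'); // restores an identical network
  // NN2 can now Compute() without retraining.
  NN.Free;
  NN2.Free;
end;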

A general example of how to use the TNNet class can be found here:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/testcnnalgo/testcnnalgo.lpr
« Last Edit: November 04, 2017, 02:29:02 am by schuler »

elxanders

  • Newbie
  • Posts: 2
Re: artificial neural networks | back propagation
« Reply #13 on: November 05, 2017, 01:00:49 am »
Thank you. That was nice to hear. So, it is time to play with it.

A short example of non-convolutional network training would be great, like the one you had in the backpropagation module.

After you create a network:

Code: Pascal
NN := TNNet.Create();
NN.AddLayer( TNNetInput.Create(InputSize) );
NN.AddLayer( TNNetLayerFullConnectReLU.Create(LayerSize) );
NN.AddLayer( TNNetLayerFullConnectReLU.Create(LayerSize) );
...
NN.Randomize();
NN.SetLearningRate(0.08,0.9);

What are the further steps to organize the training process?
I can see the input requires a TNNetVolume. Will it be fast enough to create and pass objects each time instead of a simple dataset? What about the output and managing it?

Will this TNNet be useful for tuning numeric output, or for classification only?

Does it support batches (and accumulate errors), or does Backpropagate() need to be called after each Compute()? Can SetLearningRate be changed later during the training process without any structural problems?

Sorry for the lazy questions, but exploring somebody else's source code takes a lot of effort :)

schuler

  • Full Member
  • ***
  • Posts: 223
Re: artificial neural networks | back propagation
« Reply #14 on: November 05, 2017, 01:32:03 am »
Hello elxanders,
Here we go with a short example. In this example, the NN learns 3 logic operations: XOR, AND, and OR.

Code: Pascal
type TBackInputOutput = array[0..3] of array[0..2] of TNeuralFloat;

const outputs : TBackInputOutput =
  (// XOR,  AND,   OR
    (-0.1, -0.1, -0.1),
    ( 0.1, -0.1,  0.1),
    ( 0.1, -0.1,  0.1),
    (-0.1,  0.1,  0.1)
  );

const inputs : TBackInputOutput =
  ( // x1,   x2, bias
    (-0.9, -0.9, 1),
    (-0.9,  0.9, 1),
    ( 0.9, -0.9, 1),
    ( 0.9,  0.9, 1)
  );

procedure TForm1.BitBtn6Click(Sender: TObject);
var
  NN: TNNet;
  I: integer;
  Cnt: integer;
  pOutPut: TNNetVolume;
  vInputs, vOutput: TBackInputOutput;
  totalTimeSeconds, startTime, finishTime: double;
  shouldPrint: boolean;
begin
  shouldPrint := true;

  vInputs := inputs;
  vOutput := outputs;

  pOutPut := TNNetVolume.Create(3,1,1,1);
  NN := TNNet.Create();

  NN.AddLayer( TNNetInput.Create(3) );
  NN.AddLayer( TNNetLayerFullConnect.Create(3) );
  NN.AddLayer( TNNetLayerFullConnect.Create(3) );
  NN.AddLayer( TNNetLayerFullConnect.Create(3) );

  NN.Randomize();
  startTime := now();
  for I := 1 to 3000 do
  begin
    if shouldPrint then WriteLn();
    for Cnt := Low(inputs) to High(inputs) do
    begin
      NN.Compute(vInputs[cnt]);
      NN.GetOutput(pOutPut);
      NN.Backpropagate(vOutput[cnt]);
      if shouldPrint then
      WriteLn
      (
        I:7,'x',Cnt,' Output:',
        pOutPut.Raw[0]:5:2,' ',
        pOutPut.Raw[1]:5:2,' ',
        pOutPut.Raw[2]:5:2,' - Training data:',
        vOutput[cnt][0]:5:2,' ',
        vOutput[cnt][1]:5:2,' ' ,
        vOutput[cnt][2]:5:2,' '
      );
    end;
  end;
  finishTime := now();
  totalTimeSeconds := (finishTime - startTime) * 24 * 60 * 60;
  writeln('Total run time:', (totalTimeSeconds): 10: 5, ' seconds.');

  NN.Free;
  pOutPut.Free;
end;

You have 2 questions that affect performance:
* About creating and passing objects: for big datasets, I create all the required objects before running the NN. There is an example here: testcnnalgo.lpr. For simplicity, you can also use dynamic arrays with single elements.
* About batches: although I do not have batches, and they could make the NN faster, some of my own benchmarks against some well-known APIs (I won't give names for now) show that this implementation is up to 10x faster with CPU only (that is, comparing CPU implementation against CPU implementation). The OpenCL version is in the cards. Yes, you need to call Backpropagate after each Compute.

The latest version of testcnnalgo.lpr shows an example with a decreasing learning rate. Yes, you can change the learning rate and inertia at any time without problems. I highly recommend running the testcnnalgo.lpr program via the command line to get a feel for it.
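
For instance, a decreasing schedule could be sketched like this (assuming the two-argument SetLearningRate(learningRate, inertia) call shown earlier in the thread; the 0.99 decay per epoch is only an illustrative value):

Code: Pascal
var
  NN: TNNet;
  Epoch: integer;
  LearningRate: single;
begin
  NN := TNNet.Create();
  // ... add layers and call NN.Randomize() as in the earlier examples ...
  LearningRate := 0.08;
  for Epoch := 1 to 100 do
  begin
    NN.SetLearningRate(LearningRate, 0.9); // second argument: inertia
    // ... one Compute()/Backpropagate() pass over the training set ...
    LearningRate := LearningRate * 0.99;   // decay the rate each epoch
  end;
  NN.Free;
end;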

There is a new implementation here:
https://github.com/joaopauloschuler/neural-api/tree/master/examples/XorAndOr

Feel free to share questions as they might help other users.

« Last Edit: September 27, 2019, 06:12:20 pm by schuler »

 
