* * *

Author Topic: Conscious Artificial Intelligence - Project Update  (Read 2594 times)

schuler

  • Jr. Member
  • **
  • Posts: 89
Conscious Artificial Intelligence - Project Update
« on: November 24, 2017, 12:54:49 am »
 :) Hello :)
As some of you already know, I've been developing Neural Networks with Free Pascal / Lazarus and sometimes it's good to show progress.

The newest addition to the project is a convolutional neural network Pascal unit that supports convolution and other common operations. These are the available layers:
* TNNetInput (Input/output: 1D, 2D or 3D).
* TNNetFullConnect (Input/output: 1D, 2D or 3D).
* TNNetFullConnectReLU (Input/output: 1D, 2D or 3D).
* TNNetLocalConnect (Input/output: 1D, 2D or 3D - feature size: 1D or 2D). Like a convolution, but without weight sharing: each output neuron has its own weights.
* TNNetLocalConnectReLU (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetConvolution (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetConvolutionReLU (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetMaxPool (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetConcat (Input/output: 1D, 2D or 3D) - Allows concatenating the result from previous layers.
* TNNetReshape (Input/output: 1D, 2D or 3D).
* TNNetSoftMax (Input/output: 1D, 2D or 3D).

These are the available weight initializers:
* InitUniform(Value: TNeuralFloat = 1);
* InitLeCunUniform(Value: TNeuralFloat = 1);
* InitHeUniform(Value: TNeuralFloat = 1);
* InitGlorotBengioUniform(Value: TNeuralFloat = 1);
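These initializer names usually correspond to standard uniform ranges from the literature. As an illustration only (the exact scaling CAI applies, including the Value multiplier, may differ), the usual limits can be sketched in Python:

```python
import math
import random

def lecun_uniform_limit(fan_in):
    # LeCun uniform: limit = sqrt(3 / fan_in)
    return math.sqrt(3.0 / fan_in)

def he_uniform_limit(fan_in):
    # He uniform: limit = sqrt(6 / fan_in), well suited to ReLU layers
    return math.sqrt(6.0 / fan_in)

def glorot_uniform_limit(fan_in, fan_out):
    # Glorot/Bengio uniform: limit = sqrt(6 / (fan_in + fan_out))
    return math.sqrt(6.0 / (fan_in + fan_out))

def init_weights(n, limit):
    # Draw n weights uniformly from [-limit, +limit]
    return [random.uniform(-limit, limit) for _ in range(n)]

w = init_weights(10, he_uniform_limit(25))
```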

The source code is located here:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/uconvolutionneuralnetwork.pas

There is an extensive CIFAR-10 testing script located at:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/testcnnalgo/testcnnalgo.lpr

The API allows you to create divergent/parallel and convergent layers, as in the example below:

Code: Pascal
// The Neural Network
NN := TNNet.Create();

// Network that splits into two branches that are later concatenated
TA := NN.AddLayer(TNNetInput.Create(32, 32, 3));

// First branch starting from TA (5x5 features)
NN.AddLayerAfter(TNNetConvolutionReLU.Create(16, 5, 0, 0), TA);
NN.AddLayer(TNNetMaxPool.Create(2));
NN.AddLayer(TNNetConvolutionReLU.Create(64, 5, 0, 0));
NN.AddLayer(TNNetMaxPool.Create(2));
TB := NN.AddLayer(TNNetConvolutionReLU.Create(64, 5, 0, 0));

// Second branch starting from TA (3x3 features)
NN.AddLayerAfter(TNNetConvolutionReLU.Create(16, 3, 0, 0), TA);
NN.AddLayer(TNNetMaxPool.Create(2));
NN.AddLayer(TNNetConvolutionReLU.Create(64, 5, 0, 0));
NN.AddLayer(TNNetMaxPool.Create(2));
TC := NN.AddLayer(TNNetConvolutionReLU.Create(64, 6, 0, 0));

// Concatenates both branches so the NN has a single output
NN.AddLayer(TNNetConcat.Create([TB, TC]));
NN.AddLayer(TNNetLayerFullConnectReLU.Create(64));
NN.AddLayer(TNNetLayerFullConnectReLU.Create(NumClasses));

As a simpler example, this shows how to create a small fully connected feed-forward network with three inputs followed by two fully connected layers of three neurons each:
Code: Pascal
NN := TNNet.Create();
NN.AddLayer( TNNetInput.Create(3) );
NN.AddLayer( TNNetLayerFullConnectReLU.Create(3) );
NN.AddLayer( TNNetLayerFullConnectReLU.Create(3) );
NN.SetLearningRate(0.01, 0.8);

In the example above, the learning rate is 0.01 and the inertia (momentum) is 0.8.
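These two parameters map onto the classic SGD-with-momentum update, where the inertia term carries part of the previous weight change forward. A minimal, language-neutral sketch in Python (illustrative only, not CAI's actual code):

```python
def sgd_momentum_step(weights, grads, velocity, lr=0.01, inertia=0.8):
    # SGD with momentum: the previous update (velocity) is carried
    # forward, scaled by the inertia term, before the new gradient
    # step is added.
    for i in range(len(weights)):
        velocity[i] = inertia * velocity[i] - lr * grads[i]
        weights[i] += velocity[i]
    return weights, velocity

w, v = sgd_momentum_step([1.0], [1.0], [0.0])  # first step: plain SGD
w, v = sgd_momentum_step(w, [1.0], v)          # second step: momentum kicks in
```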

The following example shows how to train your network:
Code: Pascal
// InputVolume and vDesiredVolume are of type TNNetVolume
NN.Compute(InputVolume);
NN.GetOutput(PredictedVolume);
vDesiredVolume.SetClassForReLU(DesiredClass);
NN.Backpropagate(vDesiredVolume);

Recently, an ultra-fast single-precision AVX/AVX2 API was coded so neural networks can benefit from it:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/uvolume.pas

Two videos were created showing this unit:
https://www.youtube.com/watch?v=qGnfwpKUTIQ
https://www.youtube.com/watch?v=Pnv174V_emw

In preparation for future OpenCL based Neural Networks, an OpenCL wrapper was created:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ueasyopencl.pas
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ueasyopenclcl.pas

There is an example for this OpenCL wrapper here:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/opencl/easy-trillion-test/

 :) Wish everyone happy pascal coding  :)
« Last Edit: November 25, 2017, 02:46:43 am by schuler »

DonAlfredo

  • Hero Member
  • *****
  • Posts: 930
Re: Conscious Artificial Intelligence - Project Update
« Reply #1 on: November 24, 2017, 09:51:35 am »
Very impressive. And OpenCL is a very useful feature for speeding things up.
Thanks.

schuler

  • Jr. Member
  • **
  • Posts: 89
Re: Conscious Artificial Intelligence - Project Update
« Reply #2 on: May 16, 2018, 01:28:01 am »
 :) Hello,  :)
I haven't updated this post for a long time. There are plenty of new features since the last update:

A new pooling layer:
* TNNetAvgPool (input/output: 1D, 2D or 3D)

Layers for inverse operations:
* TNNetDeLocalConnect (input/output: 1D, 2D or 3D)
* TNNetDeLocalConnectReLU (input/output: 1D, 2D or 3D)
* TNNetDeconvolution (input/output: 1D, 2D or 3D)
* TNNetDeconvolutionReLU (input/output: 1D, 2D or 3D)
* TNNetDeMaxPool (input/output: 1D, 2D or 3D)

New normalization layers:
* TNNetLayerMaxNormalization (input/output: 1D, 2D or 3D)
* TNNetLayerStdNormalization (input/output: 1D, 2D or 3D)

A dropout layer:
* TNNetDropout (input/output: 1D, 2D or 3D)

L2 regularization has been added. It's now possible to apply L2 to all layers or only to convolutional layers.
Code: Pascal
procedure SetL2DecayToConvolutionalLayers(pL2Decay: TNeuralFloat);
procedure SetL2Decay(pL2Decay: TNeuralFloat);
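Conceptually, L2 decay adds a penalty proportional to each weight to its gradient, shrinking weights toward zero on every update ("weight decay"). A minimal Python sketch of that update rule (illustrative only, not CAI's implementation):

```python
def l2_step(weights, grads, lr=0.01, l2_decay=0.001):
    # L2 regularization adds l2_decay * w to each weight's gradient,
    # so every update pulls the weights slightly toward zero.
    return [w - lr * (g + l2_decay * w) for w, g in zip(weights, grads)]

# With a zero gradient, only the decay term acts on the weight:
w = l2_step([1.0], [0.0], lr=0.1, l2_decay=0.5)
```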

A new video:
Increasing Image Resolution with Neural Networks
https://www.youtube.com/watch?v=jdFixaZ2P4w

Data Parallelism has been implemented. Experiments have been made with up to 40 parallel threads in virtual machines with up to 32 virtual cores.

Data Parallelism is implemented in TNNetDataParallelism. For example, you can create and run 40 parallel neural networks with:
Code: Pascal
NN := TNNet.Create();
...
FThreadNN := TNNetDataParallelism.Create(NN, 40);
...
ProcThreadPool.DoParallel(@RunNNThread, 0, FThreadNN.Count-1, Nil, FThreadNN.Count);
FThreadNN.AvgWeights(NN);
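The AvgWeights step at the end is the key idea of this kind of data parallelism: each replica trains on its own share of the data, then the corresponding weights are averaged back into the master network. A toy Python sketch of the averaging step (names are hypothetical):

```python
def avg_weights(replicas):
    # Average the corresponding weights of N replica networks, each
    # given here as a flat list of weights. In data-parallel training,
    # each replica computes updates on its own slice of the batch and
    # the averaged weights are then written back to the master copy.
    n = len(replicas)
    return [sum(ws) / n for ws in zip(*replicas)]

master = avg_weights([[1.0, 2.0], [3.0, 4.0]])  # two replicas, two weights
```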

 :) Wish everyone happy coding  :)
« Last Edit: July 31, 2018, 05:30:11 am by schuler »

schuler

  • Jr. Member
  • **
  • Posts: 89
Re: Conscious Artificial Intelligence - Project Update
« Reply #3 on: July 31, 2018, 05:21:45 am »
It's time for new features and examples!

Here are some new examples:

CIFAR-10 with OpenCL on convolution forward step: https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/experiments/visualCifar10OpenCL/

CIFAR-10 batch updates example:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/experiments/visualCifar10BatchUpdate/

Mario Werner independently coded and shared examples for CIFAR-10 and MNIST:
https://bitbucket.org/108bits/cai-implementations/src/c8c027b1a0d636713f7ebb70a738f1cd7117a7a4?at=master

BTW, if you have examples that you've shared in public repositories, please let me know. I love seeing what you are doing.

These are new features:

OpenCL-based convolution: on tiny images (32x32x3 or 28x28x1), you won't see much improvement over the well-optimized AVX code, but you'll find impressive gains on large input volumes. OpenCL can be enabled by calling:
Code: Pascal
procedure EnableOpenCL(platform_id: cl_platform_id; device_id: cl_device_id);

Batch Updates Support
Batch updates can be enabled/disabled via:
Code: Pascal
procedure SetBatchUpdate(pBatchUpdate: boolean);
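With batch updates enabled, per-sample gradients are typically accumulated and the weights change only once per batch, using the mean gradient. A small Python sketch of that idea (illustrative only; CAI's internals may differ):

```python
def batch_update(weights, grad_batches, lr=0.01):
    # grad_batches holds one gradient list per sample in the batch.
    # The weights are updated once, with the mean of those gradients,
    # instead of once per sample.
    n = len(grad_batches)
    mean_grad = [sum(gs) / n for gs in zip(*grad_batches)]
    return [w - lr * g for w, g in zip(weights, mean_grad)]

# Two samples with gradients 1.0 and 3.0 give a mean gradient of 2.0:
w = batch_update([1.0], [[1.0], [3.0]], lr=0.5)
```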

New Normalization layers:
* TNNetLocalResponseNorm2D: (input/output: 2D or 3D).
* TNNetLocalResponseNormDepth: (input/output: 2D or 3D).

New Concatenation / Splitting layers:
* TNNetDeepConcat (input/output: 1D, 2D or 3D) - Concatenates independent NN branches into the Depth axis.
* TNNetSplitChannels (input: 1D, 2D or 3D / output: 1D, 2D or 3D). Splits layers/channels from input. This is useful for creating divergent NN branches.

New ready to use data augmentation methods from TVolume:
* procedure FlipX();
* procedure FlipY();
* procedure CopyCropping(Original: TVolume; StartX, StartY, pSizeX, pSizeY: integer);
* procedure CopyResizing(Original: TVolume; NewSizeX, NewSizeY: integer);
* procedure AddGaussianNoise(pMul: TNeuralFloat);
* procedure AddSaltAndPepper(pNum: integer; pSalt: integer = 2; pPepper: integer = -2);
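For intuition, two of these augmentations are easy to sketch in plain Python on a 2-D image stored as a list of lists (hypothetical helper names; CAI's TVolume methods operate on its own volume type):

```python
import random

def flip_x(img):
    # Horizontal flip: reverse each row of a 2-D image.
    return [row[::-1] for row in img]

def add_salt_and_pepper(img, num, salt=2, pepper=-2):
    # Overwrite `num` random pixel pairs: one pixel with the salt
    # value and one with the pepper value (mirroring the idea of the
    # pNum/pSalt/pPepper parameters above).
    h, w = len(img), len(img[0])
    for _ in range(num):
        img[random.randrange(h)][random.randrange(w)] = salt
        img[random.randrange(h)][random.randrange(w)] = pepper
    return img

flipped = flip_x([[1, 2, 3], [4, 5, 6]])
noisy = add_salt_and_pepper([[0, 0], [0, 0]], 1)
```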

Final random thought:
As pascal is easy to teach and learn (in my opinion), pascal might be perfect for teaching/learning neural networks.

:) Wish everyone happy pascal coding :)
« Last Edit: July 31, 2018, 05:29:06 am by schuler »

Thaddy

  • Hero Member
  • *****
  • Posts: 6568
Re: Conscious Artificial Intelligence - Project Update
« Reply #4 on: July 31, 2018, 08:57:54 am »
"As pascal is easy to teach and learn (in my opinion), pascal might be perfect for teaching/learning neural networks."
Actually, that has been the case for a very long time; in the early 80's it was pretty much the standard.
That's the way I learned some of the basics and taught some of the basics at university back in the 80's.
(Basically I learned it, wrote examples and taught it - almost - in the same week, writing the curriculum on the go... <hmm..> O:-) :-[ )
We used it in teaching for problems like Traveling Salesman and Knapsack. Lots of fun. I must still have some line-printed code from those days... the green/white fanfold paper with holes on both sides.
Let's see if I can put it into your way more advanced software form... Great job.
« Last Edit: July 31, 2018, 09:05:36 am by Thaddy »
Ada's daddy wrote this:"Fools are my theme, let satire be my song."

Trenatos

  • Sr. Member
  • ****
  • Posts: 391
  • Software developer - Open source contributor
    • MarcusFernstrom.com
Re: Conscious Artificial Intelligence - Project Update
« Reply #5 on: July 31, 2018, 08:18:43 pm »
This is very cool to see, many thanks for sharing!

schuler

  • Jr. Member
  • **
  • Posts: 89
Re: Conscious Artificial Intelligence - Project Update
« Reply #6 on: August 02, 2018, 11:30:33 pm »
Hello,
I got a couple of private questions from forum members and decided to post a general reply here, as it might help others.

This link gives a general view of CNNs:
http://cs231n.github.io/convolutional-networks/

The link above mentions some well-known NN architectures. Here is how to implement some of them with CAI:

Yann LeCun LeNet-5:

Code: Pascal
NN := TNNet.Create();
NN.AddLayer( TNNetInput.Create(28, 28, 1) );
NN.AddLayer( TNNetConvolution.Create(6, 5, 0, 1) );
NN.AddLayer( TNNetMaxPool.Create(2) );
NN.AddLayer( TNNetConvolution.Create(16, 5, 0, 1) );
NN.AddLayer( TNNetMaxPool.Create(2) );
NN.AddLayer( TNNetFullConnect.Create(120) );
NN.AddLayer( TNNetFullConnect.Create(84) );
NN.AddLayer( TNNetFullConnectLinear.Create(10) );
NN.AddLayer( TNNetSoftMax.Create() );
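Assuming the usual convolution and pooling size formulas (and reading the Create parameters as features, kernel size, padding, stride), the spatial sizes flowing through LeNet-5 can be checked with a few lines of Python:

```python
def conv_out(size, kernel, padding=0, stride=1):
    # Standard convolution output size:
    # (size - kernel + 2*padding) / stride + 1
    return (size - kernel + 2 * padding) // stride + 1

def pool_out(size, pool):
    # Non-overlapping max pooling divides the spatial size by the pool size.
    return size // pool

# Trace the LeNet-5 spatial sizes from a 28x28 input:
s = 28
s = conv_out(s, 5)   # 5x5 conv, no padding -> 24
s = pool_out(s, 2)   # 2x2 pool            -> 12
s = conv_out(s, 5)   # 5x5 conv            -> 8
s = pool_out(s, 2)   # 2x2 pool            -> 4
```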

AlexNet can be implemented as follows:

Code: Pascal
NN := TNNet.Create();
NN.AddLayer( TNNetInput.Create(227, 227, 3) );

//Conv1 + ReLU
NN.AddLayer( TNNetConvolutionReLU.Create(96, 11, 0, 4) );
NN.AddLayer( TNNetMaxPool.Create(3, 2) );
NN.AddLayer( TNNetLocalResponseNormDepth.Create(5) );

//Conv2 + ReLU
NN.AddLayer( TNNetConvolutionReLU.Create(256, 5, 2, 1) );
NN.AddLayer( TNNetMaxPool.Create(3, 2) );
NN.AddLayer( TNNetLocalResponseNormDepth.Create(5) );

//Conv3,4,5 + ReLU
NN.AddLayer( TNNetConvolutionReLU.Create(384, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(384, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(256, 3, 1, 1) );
NN.AddLayer( TNNetMaxPool.Create(3, 2) );

// Dropouts and dense layers
NN.AddLayer( TNNetDropout.Create(0.5) );
NN.AddLayer( TNNetFullConnectReLU.Create(4096) );
NN.AddLayer( TNNetDropout.Create(0.5) );
NN.AddLayer( TNNetFullConnectReLU.Create(4096) );
NN.AddLayer( TNNetFullConnectLinear.Create(1000) );
NN.AddLayer( TNNetSoftMax.Create() );

The implementation above is based on:
https://medium.com/@smallfishbigsea/a-walk-through-of-alexnet-6cbd137a5637

The VGGNet has an interesting architecture: it always uses 3x3 filters, which makes its implementation easy to understand:

Code: Pascal
NN := TNNet.Create();
NN.AddLayer( TNNetInput.Create(224, 224, 3) );
NN.AddLayer( TNNetConvolutionReLU.Create(64, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(64, 3, 1, 1) );
NN.AddLayer( TNNetMaxPool.Create(2) );
//112x112x64
NN.AddLayer( TNNetConvolutionReLU.Create(128, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(128, 3, 1, 1) );
NN.AddLayer( TNNetMaxPool.Create(2) );
//56x56x128
NN.AddLayer( TNNetConvolutionReLU.Create(256, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(256, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(256, 3, 1, 1) );
NN.AddLayer( TNNetMaxPool.Create(2) );
//28x28x256
NN.AddLayer( TNNetConvolutionReLU.Create(512, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(512, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(512, 3, 1, 1) );
NN.AddLayer( TNNetMaxPool.Create(2) );
//14x14x512
NN.AddLayer( TNNetConvolutionReLU.Create(512, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(512, 3, 1, 1) );
NN.AddLayer( TNNetConvolutionReLU.Create(512, 3, 1, 1) );
NN.AddLayer( TNNetMaxPool.Create(2) );
//7x7x512
NN.AddLayer( TNNetFullConnectReLU.Create(4096) );
NN.AddLayer( TNNetFullConnectReLU.Create(4096) );
NN.AddLayer( TNNetFullConnectLinear.Create(1000) );
NN.AddLayer( TNNetSoftMax.Create() );

The above architectures have been added to the API as follows:
Code: Pascal
THistoricalNets = class(TNNet)
  public
    procedure AddLeCunLeNet5(IncludeInput: boolean);
    procedure AddAlexNet(IncludeInput: boolean);
    procedure AddVGGNet(IncludeInput: boolean);
end;

Hope it helps and happy pascal coding!
« Last Edit: August 06, 2018, 09:45:26 pm by schuler »

 
