
Conscious Artificial Intelligence - Project Update


schuler:
 :) Hello :)
As some of you already know, I've been developing Neural Networks with Free Pascal / Lazarus and sometimes it's good to show progress.

The newest addition to the project is a convolutional neural network Pascal unit supporting convolution and other common operations. These are the available layers:
* TNNetInput (Input/output: 1D, 2D or 3D).
* TNNetFullConnect (Input/output: 1D, 2D or 3D).
* TNNetFullConnectReLU (Input/output: 1D, 2D or 3D).
* TNNetLocalConnect (Input/output: 1D, 2D or 3D - feature size: 1D or 2D). Similar to a fully connected layer, but each neuron connects only to its local receptive field.
* TNNetLocalConnectReLU (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetConvolution (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetConvolutionReLU (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetMaxPool (Input/output: 1D, 2D or 3D - feature size: 1D or 2D).
* TNNetConcat (Input/output: 1D, 2D or 3D) - Allows concatenating the result from previous layers.
* TNNetReshape (Input/output: 1D, 2D or 3D).
* TNNetSoftMax (Input/output: 1D, 2D or 3D).

These are the available weight initializers (a usage sketch follows the list):
* InitUniform(Value: TNeuralFloat = 1);
* InitLeCunUniform(Value: TNeuralFloat = 1);
* InitHeUniform(Value: TNeuralFloat = 1);
* InitGlorotBengioUniform(Value: TNeuralFloat = 1);
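
Assuming the initializers above are methods on each layer, and that TNNet exposes its layers through an indexed Layers list (an assumption here, not confirmed by the declarations above), re-initializing every layer's weights might look like this sketch:

--- Code: Pascal ---
// Hedged sketch: re-initialize each layer's weights with He uniform.
// Assumes NN.Layers[i] exposes the individual layers.
var
  LayerCnt: integer;
begin
  for LayerCnt := 0 to NN.Layers.Count - 1 do
    NN.Layers[LayerCnt].InitHeUniform();
end;
--- End code ---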

The source code is located here:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/uconvolutionneuralnetwork.pas

There is an extensive CIFAR-10 testing script located at:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/testcnnalgo/testcnnalgo.lpr

The API allows you to create divergent/parallel branches that later converge, as in the example below:


--- Code: Pascal ---
// Creates the neural network
NN := TNNet.Create();

// This network splits into 2 paths and is later concatenated
InputLayer := NN.AddLayer(TNNetInput.Create(32, 32, 3));

// First branch starting from InputLayer (5x5 features)
NN.AddLayerAfter(TNNetConvolutionReLU.Create({Features=}16, {FeatureSize=}5, {Padding=}2, {Stride=}1), InputLayer);
NN.AddLayer(TNNetMaxPool.Create(2));
NN.AddLayer(TNNetConvolutionReLU.Create({Features=}64, {FeatureSize=}5, {Padding=}2, {Stride=}1));
NN.AddLayer(TNNetMaxPool.Create(2));
EndOfFirstPath := NN.AddLayer(TNNetConvolutionReLU.Create({Features=}64, {FeatureSize=}5, {Padding=}2, {Stride=}1));

// Another branch starting from InputLayer (3x3 features)
NN.AddLayerAfter(TNNetConvolutionReLU.Create({Features=}16, {FeatureSize=}3, {Padding=}1, {Stride=}1), InputLayer);
NN.AddLayer(TNNetMaxPool.Create(2));
NN.AddLayer(TNNetConvolutionReLU.Create({Features=}64, {FeatureSize=}3, {Padding=}1, {Stride=}1));
NN.AddLayer(TNNetMaxPool.Create(2));
EndOfSecondPath := NN.AddLayer(TNNetConvolutionReLU.Create({Features=}64, {FeatureSize=}3, {Padding=}1, {Stride=}1));

// Concats both branches into one branch.
NN.AddLayer(TNNetDeepConcat.Create([EndOfFirstPath, EndOfSecondPath]));
NN.AddLayer(TNNetConvolutionReLU.Create({Features=}64, {FeatureSize=}3, {Padding=}1, {Stride=}1));
NN.AddLayer(TNNetLayerFullConnectReLU.Create(64));
NN.AddLayer(TNNetLayerFullConnectReLU.Create(NumClasses));
--- End code ---
As a simpler case, this example shows how to create a small fully connected feed-forward network with 3 inputs and two fully connected layers of 3 neurons each:

--- Code: Pascal ---
NN := TNNet.Create();
NN.AddLayer( TNNetInput.Create(3) );
NN.AddLayer( TNNetLayerFullConnectReLU.Create(3) );
NN.AddLayer( TNNetLayerFullConnectReLU.Create(3) );
NN.SetLearningRate(0.01, 0.8);
--- End code ---
In the above example, the learning rate is 0.01 and the inertia (momentum) is 0.8.

The following example shows how to train the network on a single sample:

--- Code: Pascal ---
// InputVolume and vDesiredVolume are of the type TNNetVolume
NN.Compute(InputVolume);
NN.GetOutput(PredictedVolume);
vDesiredVolume.SetClassForReLU(DesiredClass);
NN.Backpropagate(vDesiredVolume);
--- End code ---
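A minimal training-loop sketch to put these calls in context. GetNextSample, MaxEpochs and SamplesPerEpoch are hypothetical placeholders, not part of the library:

--- Code: Pascal ---
// Hypothetical helper GetNextSample fills InputVolume and returns
// the desired class index for that sample.
for EpochCnt := 1 to MaxEpochs do
  for SampleCnt := 1 to SamplesPerEpoch do
  begin
    DesiredClass := GetNextSample(InputVolume);
    NN.Compute(InputVolume);                      // forward pass
    NN.GetOutput(PredictedVolume);                // read the prediction
    vDesiredVolume.SetClassForReLU(DesiredClass); // encode the target
    NN.Backpropagate(vDesiredVolume);             // backward pass
  end;
--- End code ---
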
Recently, an ultra-fast single-precision AVX/AVX2 API was coded so neural networks can benefit from it:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/uvolume.pas

Two videos were created showing this unit:
https://www.youtube.com/watch?v=qGnfwpKUTIQ
https://www.youtube.com/watch?v=Pnv174V_emw

In preparation for future OpenCL based Neural Networks, an OpenCL wrapper was created:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ueasyopencl.pas
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/libs/ueasyopenclcl.pas

There is an example for this OpenCL wrapper here:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/opencl/easy-trillion-test/

 :) Wish everyone happy pascal coding  :)

DonAlfredo:
Very impressive. And OpenCL is a very useful feature for speeding things up.
Thanks.

schuler:
 :) Hello,  :)
I haven't updated this post for a long time. There are plenty of new features since the last update:

A new pooling layer:
* TNNetAvgPool (input/output: 1D, 2D or 3D)

Layers for inverse operations:
* TNNetDeLocalConnect (input/output: 1D, 2D or 3D)
* TNNetDeLocalConnectReLU (input/output: 1D, 2D or 3D)
* TNNetDeconvolution (input/output: 1D, 2D or 3D)
* TNNetDeconvolutionReLU (input/output: 1D, 2D or 3D)
* TNNetDeMaxPool (input/output: 1D, 2D or 3D)

New normalization layers:
* TNNetLayerMaxNormalization (input/output: 1D, 2D or 3D)
* TNNetLayerStdNormalization (input/output: 1D, 2D or 3D)

A dropout layer:
* TNNetDropout (input/output: 1D, 2D or 3D)

L2 regularization has been added. It's now possible to apply L2 to all layers or only to convolutional layers.

--- Code: Pascal ---
procedure SetL2DecayToConvolutionalLayers(pL2Decay: TNeuralFloat);
procedure SetL2Decay(pL2Decay: TNeuralFloat);
--- End code ---
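A hedged usage sketch; the decay value below is an arbitrary illustration, not a recommended setting:

--- Code: Pascal ---
// Apply a small L2 weight decay to convolutional layers only.
NN.SetL2DecayToConvolutionalLayers(0.00001);
--- End code ---
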
A new video:
Increasing Image Resolution with Neural Networks
https://www.youtube.com/watch?v=jdFixaZ2P4w

Data Parallelism has been implemented. Experiments have been made with up to 40 parallel threads in virtual machines with up to 32 virtual cores.

Data parallelism is implemented in TNNetDataParallelism. As an example, you can create and run 40 parallel neural networks with:

--- Code: Pascal ---
NN := TNNet.Create();
...
FThreadNN := TNNetDataParallelism.Create(NN, 40);
...
ProcThreadPool.DoParallel(@RunNNThread, 0, FThreadNN.Count-1, Nil, FThreadNN.Count);
FThreadNN.AvgWeights(NN);
--- End code ---
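The RunNNThread callback is not shown above. A minimal sketch of such a worker, assuming the MTProcs DoParallel callback signature and that TNNetDataParallelism exposes the per-thread networks through an index (both assumptions here):

--- Code: Pascal ---
// Hedged worker sketch: each thread trains its own network copy;
// AvgWeights(NN) later averages all copies back into NN.
procedure TTrainer.RunNNThread(Index: PtrInt; Data: Pointer;
  Item: TMultiThreadProcItem);
var
  LocalNN: TNNet;
begin
  LocalNN := FThreadNN[Index];
  // Train LocalNN on this thread's share of the samples here,
  // using Compute/Backpropagate as in the earlier training example.
end;
--- End code ---
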
 :) Wish everyone happy coding  :)

schuler:
It's time for new features and examples!

Here are some new examples:

CIFAR-10 with OpenCL on convolution forward step: https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/experiments/visualCifar10OpenCL/

CIFAR-10 batch updates example:
https://sourceforge.net/p/cai/svncode/HEAD/tree/trunk/lazarus/experiments/visualCifar10BatchUpdate/

Mario Werner independently coded and shared examples for CIFAR-10 and MNIST:
https://bitbucket.org/108bits/cai-implementations/src/c8c027b1a0d636713f7ebb70a738f1cd7117a7a4?at=master

BTW, if you have examples that you've shared in public repositories, please let me know. I love seeing what you are doing.

These are new features:

OpenCL-based convolution: on tiny images (32x32x3 or 28x28x1), you won't see much improvement over the well-optimized AVX code, but the gains on large input volumes are impressive. OpenCL can be enabled by calling:

--- Code: Pascal ---
procedure EnableOpenCL(platform_id: cl_platform_id; device_id: cl_device_id);
--- End code ---
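A minimal sketch of obtaining the two IDs, assuming FPC's raw cl binding is used directly and that EnableOpenCL is called on the network instance (the ueasyopencl wrapper linked earlier likely offers a friendlier route):

--- Code: Pascal ---
uses cl; // assumption: FPC's raw OpenCL binding

var
  PlatformId: cl_platform_id;
  DeviceId: cl_device_id;
  Num: cl_uint;
begin
  // Pick the first available platform and its first GPU device.
  clGetPlatformIDs(1, @PlatformId, @Num);
  clGetDeviceIDs(PlatformId, CL_DEVICE_TYPE_GPU, 1, @DeviceId, @Num);
  NN.EnableOpenCL(PlatformId, DeviceId);
end;
--- End code ---
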
Batch update support
Batch updates can be enabled or disabled via:

--- Code: Pascal ---
procedure SetBatchUpdate(pBatchUpdate: boolean);
--- End code ---
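A hedged usage sketch; the explicit weight-update call after each batch is an assumption based on common batch-update designs, not a confirmed API detail:

--- Code: Pascal ---
NN.SetBatchUpdate(true);
// ... run Compute/Backpropagate over a batch of samples ...
NN.UpdateWeights(); // assumption: flushes the accumulated gradients
--- End code ---
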
New Normalization layers:
* TNNetLocalResponseNorm2D: (input/output: 2D or 3D).
* TNNetLocalResponseNormDepth: (input/output: 2D or 3D).

New Concatenation / Splitting layers:
* TNNetDeepConcat (input/output: 1D, 2D or 3D) - Concatenates independent NN branches into the Depth axis.
* TNNetSplitChannels (input: 1D, 2D or 3D / output: 1D, 2D or 3D). Splits layers/channels from the input. This is useful for creating divergent NN branches.

New ready-to-use data augmentation methods in TVolume (a usage sketch follows the list):
* procedure FlipX();
* procedure FlipY();
* procedure CopyCropping(Original: TVolume; StartX, StartY, pSizeX, pSizeY: integer);
* procedure CopyResizing(Original: TVolume; NewSizeX, NewSizeY: integer);
* procedure AddGaussianNoise(pMul: TNeuralFloat);
* procedure AddSaltAndPepper(pNum: integer; pSalt: integer = 2; pPepper: integer = -2);
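
A sketch of how these might be combined during training. AuxVolume is a hypothetical scratch volume, and the flip probability, noise level and crop sizes are illustrative assumptions:

--- Code: Pascal ---
// Illustrative augmentation pass over one 32x32 sample.
if Random(2) = 0 then
  InputVolume.FlipX();                // horizontal flip, 50% chance
InputVolume.AddGaussianNoise(0.05);   // small random noise
// Crop a random 24x24 region, then resize back to 32x32.
AuxVolume.CopyCropping(InputVolume, Random(8), Random(8), 24, 24);
InputVolume.CopyResizing(AuxVolume, 32, 32);
--- End code ---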

Final random thought:
As Pascal is easy to teach and learn (in my opinion), Pascal might be perfect for teaching and learning neural networks.

:) Wish everyone happy pascal coding :)

Thaddy:

--- Quote from: schuler on July 31, 2018, 05:21:45 am ---As pascal is easy to teach and learn (in my opinion), pascal might be perfect for teaching/learning neural networks.

--- End quote ---
Actually, that has been the case for a very long time; in the early 80's it was pretty much the standard.
That's the way I learned some of the basics and taught some of the basics at university back in the 80's.
(Basically I learned it, wrote examples and taught it - almost - in the same week, writing the curriculum on the go... <hmm..> O:-) :-[ )
We used it in teaching for problems like Traveling Salesman and Knapsack. Lots of fun. I must still have some line-printed code from those days... the green/white striped paper with holes on both sides.
Let's see if I can put it into your way more advanced software form... Great job.
