
Author Topic: Conscious Artificial Intelligence - How to Start  (Read 913 times)

dinmil

  • New Member
  • *
  • Posts: 44
Conscious Artificial Intelligence - How to Start
« on: November 20, 2019, 03:22:43 pm »
Hello.
I'm looking for someone to point me in the right direction to solve one of my problems.

I'm trying to use neural-api-master.

I have 50 inputs (all float numbers between 0 and 1).
I have one output with 3 possible values: 0, 0.5, and 1.
Is it better to have one output with values 0, 0.5 and 1, or is it better to have 3 outputs with 3 different combinations:
1,0,0
0,1,0
0,0,1
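For reference, mapping the single 0 / 0.5 / 1 label onto the three one-hot outputs can be sketched like this (a minimal illustration; the helper name and types are mine, not part of neural-api):

```pascal
program OneHotSketch;
{$mode objfpc}{$H+}
uses SysUtils;

type
  TOutputArray = array[0..2] of Single;

// Map a 0 / 0.5 / 1 label to a one-hot triple:
// 0 -> (1,0,0), 0.5 -> (0,1,0), 1 -> (0,0,1).
function LabelToOneHot(InLabel: Single): TOutputArray;
begin
  Result[0] := 0; Result[1] := 0; Result[2] := 0;
  if InLabel = 0 then Result[0] := 1
  else if InLabel = 0.5 then Result[1] := 1
  else Result[2] := 1;
end;

var
  V: TOutputArray;
begin
  V := LabelToOneHot(0.5);
  Assert((V[0] = 0) and (V[1] = 1) and (V[2] = 0));
  WriteLn('ok');
end.
```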

I tried to start from the XorAndOr example to get a feeling for it, but unfortunately I got no results.
I took 1000 records (50 inputs and 1 output), put them through the learning stage with default parameters (as in the XorAndOr example), and after learning I tested on the same set.
All records produced 0 as output.

After that I tried to adjust some learning parameters, like

  MyNFit.InitialLearningRate := 0.000001;
  MyNFit.LearningRateDecay := 0.001;
  MyNFit.L2Decay := 0.001;

but nothing significant happened. Then I tried changing some other parameters, like raising Epochs to 20000, and added some layers:
  MyNNet.AddLayer( TNNetInput.Create(50) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(40) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(30) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(50) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(1) );

Then something happened, but all the numbers are near zero (0.11, 0.16) and nothing is even close to 1.00.

This record set has 964 zeros and 5 ones as results; the rest are 0.5.

So the questions are:
1. How do you decide what kind of network is needed (how many layers, what kind of layers, etc.)?
2. When you start learning, how do you know whether the learning was successful (without errors), and how do you interpret the errors?
3. Is this a good way to start: train on 1000 records and test on the same data?
4. What do these parameters mean, and what are sensible values?
  MyNFit.InitialLearningRate := 0.000001;
  MyNFit.LearningRateDecay := 0.001;
  MyNFit.L2Decay := 0.001;
  BatchSize
  Epochs
5. Is there any documentation where I can start learning and reading about the neural-api-master project?

Thanks in advance, and best regards







garlar27

  • Hero Member
  • *****
  • Posts: 622
Re: Conscious Artificial Intelligence - How to Start
« Reply #1 on: November 20, 2019, 04:40:12 pm »
Don't know if this is what you are looking for:

Conscious Artificial Intelligence - Project Update

dinmil

  • New Member
  • *
  • Posts: 44
Re: Conscious Artificial Intelligence - How to Start
« Reply #2 on: November 21, 2019, 02:07:54 pm »
Thanks. I'm checking.

Thaddy

  • Hero Member
  • *****
  • Posts: 9303
Re: Conscious Artificial Intelligence - How to Start
« Reply #3 on: November 21, 2019, 05:50:19 pm »
The author is on this forum; he usually answers very quickly (schuler).
Note that a set of ~1000 samples is a bit on the low side to make it work. You need quite a bit more sample data (it grows quadratically).
Examine the image recognition example as a guideline. It is self-explanatory.
« Last Edit: November 21, 2019, 05:56:10 pm by Thaddy »
also related to equus asinus.

schuler

  • Full Member
  • ***
  • Posts: 136
Re: Conscious Artificial Intelligence - How to Start
« Reply #4 on: November 21, 2019, 08:08:30 pm »
Hello dinmil,
Given that you are using ReLUs, I would pick these as outputs:
1,0,0
0,1,0
0,0,1

The last layer should be a softmax over the 3 classes above.

Your InitialLearningRate looks too small. Give 0.001 or 0.0001 a go.

If you put your code in a public repo, I can run it at my end if you like.

BatchSize: 32 or 64 works well in most cases (unless you have a high-end Ryzen, in which case I would try a bigger number).
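Assembled as a sketch, the advice in this reply (3-class one-hot output, a softmax head, a larger learning rate, batch size 32) might look roughly like this. The hidden-layer sizes 40/30 are placeholders borrowed from dinmil's earlier attempt, not a recommendation, and the variable names follow the earlier post:

```pascal
// Sketch only: 50 inputs, two hidden ReLU layers, then one neuron per class
// followed by a softmax over the 3 classes.
MyNNet.AddLayer( TNNetInput.Create(50) );
MyNNet.AddLayer( TNNetFullConnectReLU.Create(40) );
MyNNet.AddLayer( TNNetFullConnectReLU.Create(30) );
MyNNet.AddLayer( TNNetFullConnectLinear.Create(3) );
MyNNet.AddLayer( TNNetSoftMax.Create() );

MyNFit.InitialLearningRate := 0.001;  // 0.001 or 0.0001 as suggested above
MyNFit.Fit(MyNNet, MyTrainingPairs, nil, nil, {BatchSize=}32, {Epochs=}3000);
```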

dinmil

  • New Member
  • *
  • Posts: 44
Re: Conscious Artificial Intelligence - How to Start
« Reply #5 on: November 22, 2019, 01:09:11 pm »
That is the way I'm thinking too: put 3 outputs instead of one.

The problem is that this code has too many dependencies on my other libraries and modules.

The next question is:
Is there a standard way to export the inputs and desired outputs from the neural network components, so I can make a little piece of code that everybody can compile?

dinmil

  • New Member
  • *
  • Posts: 44
Re: Conscious Artificial Intelligence - How to Start
« Reply #6 on: November 22, 2019, 04:46:31 pm »
I managed to make an example of what I am doing:
The learn sample data is 1000 rows.
The test data is 2691 rows, of which the first 1000 are the same as the learn data.
This is the program.
The folder neural must exist; it is copied from Git and contains all the neural files.
The test hits 90% of the wanted outputs. Is that good, or can it be better?

The files learn_input.csv, learn_output.csv, test_input.csv and test_output.csv must exist in the exe folder.

Code: Pascal  [Select]
program TestNeural;

{$mode objfpc}{$H+}

uses
  {$IFDEF UNIX}{$IFDEF UseCThreads}
  cthreads,
  {$ENDIF}{$ENDIF}
  Classes, SysUtils,
  dateutils,
  neuralnetwork,
  neuralvolume,
  neuralfit
  { you can add units after this };

const
  FAV_SYS_NEWLINE = #13#10;

type
  TInputArray  = array[0..50] of TNeuralFloat;
  TOutputArray =  array[0..2] of TNeuralFloat;

  TRecStatistics = record
    count_samples: integer;

    OK: integer;
    NOK: integer;
    UNKNOWN: integer;

    count_rej: Integer;
    count_neg: Integer;
    count_app: Integer;
    count_unk: Integer;

    count_output_rej: Integer;
    count_output_neg: Integer;
    count_output_app: Integer;
    count_output_unk: Integer;

    approved_but_shoud_be_rejected: Integer;
    approved_but_shoud_be_negotiated: Integer;

    miss_rejected: integer;
    miss_negotiated: integer;
    miss_approved: integer;

    unknown_is_really_rejected: integer;
    unknown_is_really_negotiated: integer;
    unknown_is_really_approved: integer;
  end;

var
  _TrainingListLoaded: Boolean;
  _TestListLoaded: Boolean;
  _InputArray: TInputArray;
  _OutputArray: TOutputArray;
  _Input  : TNNetVolume;
  _Output : TNNetVolume;
  _Pair   : TNNetVolumePair;
  _TrainingPairs: TNNetVolumePairList;
  _TestPairs: TNNetVolumePairList;
  _NFit: TNeuralFit;
  _NNet: TNNet;
  _ComputedOutput : TNNetVolume;
  _LearnCount: Integer;
  _TestCount: Integer;
  _Statistics: TRecStatistics;

const InitTRecStatistics: TRecStatistics = ({%H-});

function FormatFloat(InNumber: Extended; InNoOfDecimalPlaces: Integer = 4): String;
begin
  Result := Format('%0.' + IntToStr(InNoOfDecimalPlaces)+ 'f', [InNumber]);
end;

procedure FavLogToConsole(InString: String);
begin
  if IsConsole then begin
    try
      // Same timestamp format on all platforms. (The original non-Windows
      // branch called helpers from a private library and would not compile.)
      Writeln(StdOut, FormatDateTime('yyyy-mm-dd hh:nn:ss.zzz', Now), ': ', InString);
    except
    end;
  end;
end;

procedure FavSaveToFile(InFileName: String; const InString: String;
  InRaiseError: Boolean);
var MyFile: Text;
    MyFileAssigned: Boolean;
    MyFileOpened: Boolean;
begin
  MyFileAssigned:=False;
  MyFileOpened:=False;
  try
    AssignFile(MyFile, InFileName);
    MyFileAssigned:=True;
    Rewrite(MyFile);
    MyFileOpened:=True;
    Write(MyFile, InString);
    Flush(MyFile);
    CloseFile(MyFile);
    MyFileOpened:=False;
    MyFileAssigned:=False;
  except
    on E: Exception do begin
      if MyFileAssigned then begin
        try
          if MyFileOpened then begin
            CloseFile(MyFile);
          end;
        except
        end;
      end;
      if InRaiseError then begin
        raise;
      end;
    end;
  end;
end;

function FavLoadFromFile(InFileName: String; InRaiseErrorIfNotExists: Boolean = False): String;
var MyFile: Text;
    MyFileAssigned: Boolean;
    MyFileOpened: Boolean;
    MyBuffer: String;
    MyString: String;
begin
  MyFileAssigned:=False;
  MyFileOpened:=False;
  MyString := '';

  if InRaiseErrorIfNotExists then begin
    if FileExists(InFileName) = false then begin
      raise Exception.Create('File: "' + InFileName + '" does not exist!');
    end;
  end;
  try
    AssignFile(MyFile, InFileName);
    MyFileAssigned:=True;
    Reset(MyFile);
    MyFileOpened := true;
    while not Eof(MyFile) do begin
      ReadLn(MyFile, MyBuffer);
      if MyString <> '' then MyString := MyString + #13#10;
      MyString := MyString + MyBuffer;
    end;
    CloseFile(MyFile);
    MyFileOpened := False;
    MyFileAssigned:=False;
  except
    on E: Exception do begin
      if MyFileAssigned then begin
        try
          if MyFileOpened then begin
            CloseFile(MyFile);
          end;
        except
        end;
      end;
      raise;
    end;
  end;
  Result := MyString;
end;

procedure FavAppendToFile(InFileName: String; const InString: String);
var MyFile: Text;
    MyFileAssigned: Boolean;
    MyFileOpened: Boolean;
begin
  MyFileAssigned:=False;
  MyFileOpened := False;
  try
    AssignFile(MyFile, InFileName);
    MyFileAssigned:=True;
    if FileExists(InFileName) then begin
      Append(MyFile);
      MyFileOpened := True;
    end
    else begin
      Rewrite(MyFile);
      MyFileOpened := True;
    end;
    Write(MyFile, InString);
    Flush(MyFile);
    CloseFile(MyFile);
    MyFileOpened := False;
    MyFileAssigned:=False;
  except
    on E: Exception do begin
      if MyFileAssigned then begin
        try
          if MyFileOpened then begin
            CloseFile(MyFile);
          end;
        except
        end;
      end;
      raise;
    end;
  end;
end;

procedure CalculateLayerCounts(InNumberOfOutputs: Integer; InNumberOfSamples: Integer; var OutOneLayer,
  OutTwoLayersFirst, OutTwoLayersSecond: Integer);
begin
  // Found in some book
  // N -- number of training samples          - e.g. 4 samples
  // m -- number of output neurons            - e.g. 3 outputs
  // 1st layer: SQRT((m + 2) * N) + 2 * SQRT(N/(m+2)) = SQRT((3+2) * 4) + 2 * SQRT(4 / (3+2)) = 4.4721 + 1.7889 = 6.26 = 7
  // 2nd layer: m * SQRT(N/(m+2)) = 3 * SQRT(4/(3+2)) = 2.68 = 3

  FavLogToConsole('Number of samples: ' + IntToStr(InNumberOfSamples));
  FavLogToConsole('Number of outputs: ' + IntToStr(InNumberOfOutputs));

  OutOneLayer := Trunc(sqrt((InNumberOfOutputs + 2) * InNumberOfSamples)) + 1;

  // Note: the original code hard-coded N = 4 here (left over from the worked
  // example above); use the actual sample count instead.
  OutTwoLayersFirst := Trunc(sqrt((InNumberOfOutputs+2) * InNumberOfSamples) + 2 * sqrt(InNumberOfSamples / (InNumberOfOutputs + 2))) + 1;
  OutTwoLayersSecond := Trunc(InNumberOfOutputs * sqrt(InNumberOfSamples / (InNumberOfOutputs+2)) + 1);

  FavLogToConsole('OneLayer: ' + IntToStr(OutOneLayer));
  FavLogToConsole('TwoLayersFirst: ' + IntToStr(OutTwoLayersFirst));
  FavLogToConsole('TwoLayersSecond: ' + IntToStr(OutTwoLayersSecond));
end;

procedure LoadLearnData;
var MyStringsInput: TStrings;
    MyStringsOutput: TStrings;
    MyInputString: String;
    MyOutputString: String;
    F: Integer;
begin
  MyStringsInput  := nil;
  MyStringsOutput := nil;
  MyInputString   := '';
  MyOutputString  := '';
  try
    if FileExists('learn_input.csv') and FileExists('learn_output.csv') then begin
      FavLogToConsole('Loading training data from training files');
      MyStringsInput := TStringList.Create;
      MyStringsInput.LoadFromFile('learn_input.csv');
      MyStringsOutput := TStringList.Create;
      MyStringsOutput.LoadFromFile('learn_output.csv');

      if (MyStringsInput.Count > 0) and (MyStringsInput.Count = MyStringsOutput.Count) then begin
        _LearnCount := MyStringsInput.Count;
        for F := 0 to _LearnCount-1 do begin
          _Input  := TNNetVolume.Create();
          _Input.LoadFromString(MyStringsInput.Strings[F]);
          _Output  := TNNetVolume.Create();
          _Output.LoadFromString(MyStringsOutput.Strings[F]);
          _Pair   :=  TNNetVolumePair.Create(_Input, _Output);
          _TrainingPairs.Add(_Pair);

          MyInputString := MyInputString + _Input.SaveToString() + FAV_SYS_NEWLINE;
          MyOutputString := MyOutputString + _Output.SaveToString() + FAV_SYS_NEWLINE;
        end;
        _TrainingListLoaded := True;
      end;

      FreeAndNil(MyStringsInput);
      FreeAndNil(MyStringsOutput);

      if _TrainingListLoaded = false then begin
        FavLogToConsole('*****************************************');
        FavLogToConsole('Learn/Training files were not loaded. Maybe the record counts in the files are not the same');
      end;

      FavLogToConsole('Saving learn_input1.csv and learn_output1.csv...');
      FavSaveToFile('learn_input1.csv', MyInputString, false);
      FavSaveToFile('learn_output1.csv', MyOutputString, false);
    end
    else begin
      FavLogToConsole('*****************************************');
      FavLogToConsole('File learn_input.csv or file learn_output.csv does not exist');
    end;
  except
  end;
end;

procedure CreateNetwork;
var MyOneLayerCount: Integer;
    MyFirstLayerCount: Integer;
    MySecondLayerCount: Integer;
    MyTestCount: Integer;
begin
  MyTestCount := _TrainingPairs.Count;

  // Create network - so far 51 params
  _NNet.AddLayer(TNNetInput.Create(Length(TInputArray)));

  CalculateLayerCounts(Length(_OutputArray), MyTestCount, MyOneLayerCount, MyFirstLayerCount, MySecondLayerCount);

  // FavLogToConsole('Using One Layer');
  // _NNet.AddLayer( TNNetFullConnectReLU.Create(MyOneLayerCount) );

  FavLogToConsole('Using Two Layers');
  _NNet.AddLayer( TNNetFullConnectReLU.Create(MyFirstLayerCount) );
  _NNet.AddLayer( TNNetFullConnectReLU.Create(MySecondLayerCount) );

  // The last layer has 3 outputs
  _NNet.AddLayer( TNNetFullConnectReLU.Create(Length(TOutputArray)) );
end;

procedure Compute;
begin
  FavLogToConsole('Computing...');
  _ComputedOutput := TNNetVolume.Create(3,1,1,1);

  // call the predefined fitting algorithm
  _NFit.InitialLearningRate := 0.0001;
  _NFit.LearningRateDecay := 0;
  _NFit.L2Decay := 0;
  _NFit.Verbose := false;
  _NFit.HideMessages();
  _NFit.Fit(_NNet, _TrainingPairs, nil, nil, 32, 3000);
end;

procedure PrintStatistics( var InStatistics: TRecStatistics);
var
  MyPercentHit, MyPercentMiss, MyPercentUnknown: Extended;
begin
  FavLogToConsole(Format('OK: %s, NOK: %s, UNKNOWN: %s', [IntToStr(InStatistics.OK),IntToStr(InStatistics.NOK),IntToStr(InStatistics.UNKNOWN)]));
  FavLogToConsole(Format('Input: Reject: %s, Neg: %s, Approved: %s', [IntToStr(InStatistics.count_rej), IntToStr(InStatistics.count_neg), IntToStr(InStatistics.count_app)]));
  FavLogToConsole(Format('Output: Reject: %s, Neg: %s, Approved: %s, Unknown: %s', [IntToStr(InStatistics.count_output_rej), IntToStr(InStatistics.count_output_neg), IntToStr(InStatistics.count_output_app), IntToStr(InStatistics.count_output_unk)]));

  if InStatistics.count_samples > 0 then begin
    MyPercentHit := 100 * (InStatistics.OK / InStatistics.count_samples);
    MyPercentMiss := 100 * (InStatistics.NOK / InStatistics.count_samples);
    MyPercentUnknown := 100 * (InStatistics.UNKNOWN / InStatistics.count_samples);
  end
  else begin
    MyPercentHit := 0;
    MyPercentMiss := 0;
    MyPercentUnknown := 0;
  end;

  FavLogToConsole(Format('Hit: %s percent; Miss: %s percent; Unknown: %s percent', [
                        FormatFloat(MyPercentHit),
                        FormatFloat(MyPercentMiss),
                        FormatFloat(MyPercentUnknown)
                  ]));
  FavLogToConsole(Format('MissRejected: %s; MissNegotiated: %s; MissApproved: %s', [
                        IntToStr(InStatistics.miss_rejected),
                        IntToStr(InStatistics.miss_negotiated),
                        IntToStr(InStatistics.miss_approved)
                  ]));
  FavLogToConsole(Format('UnknownIsRejected: %s; UnknownIsNegotiated: %s; UnknownIsApproved: %s', [
                        IntToStr(InStatistics.unknown_is_really_rejected),
                        IntToStr(InStatistics.unknown_is_really_negotiated),
                        IntToStr(InStatistics.unknown_is_really_approved)
                  ]));
  FavLogToConsole(Format('ApprovedButShouldBeRejected: %s; ApprovedButShouldBeNegotiated: %s', [
                        IntToStr(InStatistics.approved_but_shoud_be_rejected),
                        IntToStr(InStatistics.approved_but_shoud_be_negotiated)
                  ]));
end;

function AnalyzeOutput(InRowNumber: Integer;
  InDesired, InComputed: TNNetVolume;
  var OutStatistics: TRecStatistics
  ): Integer;
var MyComputedOriginal: TOutputArray;
    MyDesired: TOutputArray;
    MyComputed: TOutputArray;
    MyResult: Integer;

    MyNIResult: Extended;  // 100 is approved = 0;  010 is negotiated = 0.5;  001 is rejected = 1   // desired, input
    MyNOResult: Extended;  // output
begin
  Inc(OutStatistics.count_samples);

  MyResult := -1;  // not a good result

  MyDesired[0] := InDesired.Raw[0];
  MyDesired[1] := InDesired.Raw[1];
  MyDesired[2] := InDesired.Raw[2];

  // fix output
  MyComputedOriginal[0] := InComputed.Raw[0];
  MyComputedOriginal[1] := InComputed.Raw[1];
  MyComputedOriginal[2] := InComputed.Raw[2];

  MyComputed[0] := MyComputedOriginal[0];
  MyComputed[1] := MyComputedOriginal[1];
  MyComputed[2] := MyComputedOriginal[2];

  if MyComputed[0] < 0 then MyComputed[0] := 0;
  if MyComputed[1] < 0 then MyComputed[1] := 0;
  if MyComputed[2] < 0 then MyComputed[2] := 0;

  MyComputed[0] := round(MyComputed[0]);
  MyComputed[1] := round(MyComputed[1]);
  MyComputed[2] := round(MyComputed[2]);

  if      (MyDesired[0] = 1) and (MyDesired[1] = 0) and (MyDesired[2] = 0) then MyNIResult := 0
  else if (MyDesired[0] = 0) and (MyDesired[1] = 1) and (MyDesired[2] = 0) then MyNIResult := 0.5
  else if (MyDesired[0] = 0) and (MyDesired[1] = 0) and (MyDesired[2] = 1) then MyNIResult := 1
  else begin
    MyNIResult := -1
  end;

  if      (MyComputed[0] = 1) and (MyComputed[1] = 0) and (MyComputed[2] = 0) then MyNOResult := 0
  else if (MyComputed[0] = 0) and (MyComputed[1] = 1) and (MyComputed[2] = 0) then MyNOResult := 0.5
  else if (MyComputed[0] = 0) and (MyComputed[1] = 0) and (MyComputed[2] = 1) then MyNOResult := 1
  else begin
    MyNOResult := -1
  end;

  // Do a little counting
  if MyNIResult = 0 then begin
    // 100 is approved = 0
    Inc(OutStatistics.count_app);
  end
  else if MyNIResult = 0.5  then begin
    // 010 is negotiated = 0.5
    Inc(OutStatistics.count_neg);
  end
  else if MyNIResult = 1  then begin
    // 001 is rejected = 1
    Inc(OutStatistics.count_rej);
  end
  else begin
    // This should never happen
    Inc(OutStatistics.count_unk);
  end;

  if MyNOResult = 0 then begin
    Inc(OutStatistics.count_output_app);
  end
  else if MyNOResult = 0.5 then begin
    Inc(OutStatistics.count_output_neg);
  end
  else if MyNOResult = 1 then begin
    Inc(OutStatistics.count_output_rej);
  end
  else begin
    Inc(OutStatistics.count_output_unk);
  end;

  // Analyze result
  if MyNIResult <> MyNOResult then begin
    if MyNOResult = 0 then begin
      // approved
      if MyNIResult = 1 then begin
        Inc(OutStatistics.approved_but_shoud_be_rejected)
      end
      else if MyNIResult = 0.5 then begin
        Inc(OutStatistics.approved_but_shoud_be_negotiated)
      end
    end;

    if MyNIResult = 1 then begin
      Inc(OutStatistics.miss_rejected)
    end
    else if MyNIResult = 0.5 then begin
      Inc(OutStatistics.miss_negotiated)
    end
    else begin
      Inc(OutStatistics.miss_approved)
    end;

    Inc(OutStatistics.NOK);
    MyResult := 1;  // not good
    FavLogToConsole(Format('NotGood Row %s, : ResultComputed: %s, ResultWanted: %s', [
                            IntToStr(InRowNumber),
                            FormatFloat(MyNOResult, 1),
                            FormatFloat(MyNIResult)]));
    if MyNOResult = -1 then begin
      if MyNIResult = 1 then begin
        Inc(OutStatistics.unknown_is_really_rejected)
      end
      else if MyNIResult = 0.5 then begin
        Inc(OutStatistics.unknown_is_really_negotiated)
      end
      else begin
        Inc(OutStatistics.unknown_is_really_approved)
      end;

      Inc(OutStatistics.UNKNOWN);
      MyResult := -1; // Completely missed
      FavLogToConsole(Format('NotGoodCombination Row %s, (%s; %s; %s): ResultComputed: %s, ResultWanted: %s',
             [IntToStr(InRowNumber),
              FormatFloat(MyComputedOriginal[0], 4),
              FormatFloat(MyComputedOriginal[1], 4),
              FormatFloat(MyComputedOriginal[2], 4),
              FormatFloat(MyNOResult, 1),
              FormatFloat(MyNIResult, 1)
             ])
      );
    end;
  end
  else begin
    Inc(OutStatistics.OK);
    MyResult := 0;
  end;

  Result := MyResult;
end;

procedure TestLearning;
var MyStringsInput: TStrings;
    MyStringsOutput: TStrings;
    MyInputString: String;
    MyOutputString: String;

    MyTestInput  : TNNetVolume;
    MyTestOutput : TNNetVolume;
    MyTestPair   : TNNetVolumePair;

    F: Integer;
    MyComputedString: String;
begin
  MyStringsInput := nil;
  MyStringsOutput := nil;
  MyInputString := '';
  MyOutputString:= '';
  MyComputedString := '';
  _Statistics := InitTRecStatistics;
  if FileExists('test_input.csv') and FileExists('test_output.csv') then begin
    FavLogToConsole('Loading test data from test files');
    MyStringsInput := TStringList.Create;
    MyStringsInput.LoadFromFile('test_input.csv');
    MyStringsOutput := TStringList.Create;
    MyStringsOutput.LoadFromFile('test_output.csv');

    if (MyStringsInput.Count > 0) and (MyStringsInput.Count = MyStringsOutput.Count) then begin
      // was _LearnCount here, which clobbered the training count
      _TestCount := MyStringsInput.Count;
      for F := 0 to _TestCount-1 do begin
        MyTestInput  := TNNetVolume.Create();
        MyTestInput.LoadFromString(MyStringsInput.Strings[F]);
        MyTestOutput  := TNNetVolume.Create();
        MyTestOutput.LoadFromString(MyStringsOutput.Strings[F]);
        MyTestPair   :=  TNNetVolumePair.Create(MyTestInput, MyTestOutput);
        _TestPairs.Add(MyTestPair);

        // was _Input/_Output here, which echoed the last *training* pair
        MyInputString := MyInputString + MyTestInput.SaveToString() + FAV_SYS_NEWLINE;
        MyOutputString := MyOutputString + MyTestOutput.SaveToString() + FAV_SYS_NEWLINE;
      end;
      _TestListLoaded := True;
    end;

    if _TestListLoaded then begin
      for F := 0 to _TestPairs.Count-1 do begin
        MyTestInput := TNNetVolumePair(_TestPairs.Items[F]).A;
        MyTestOutput := TNNetVolumePair(_TestPairs.Items[F]).B;

        _NNet.Compute(MyTestInput);
        _NNet.GetOutput(_ComputedOutput);
        MyComputedString := MyComputedString + _ComputedOutput.SaveToString() + FAV_SYS_NEWLINE;
        AnalyzeOutput(F, MyTestOutput, _ComputedOutput, _Statistics);
      end;
      FavLogToConsole('Saving computedoutput.csv...');
      FavSaveToFile('computedoutput.csv', MyComputedString, False);
    end
    else begin
      FavLogToConsole('*****************************************');
      FavLogToConsole('Test files were not loaded. Maybe the record counts in the files are not the same');
    end;
  end
  else begin
    FavLogToConsole('*****************************************');
    FavLogToConsole('File test_input.csv or file test_output.csv does not exist');
  end;
end;

begin
  _TrainingListLoaded := False;
  _TestListLoaded := False;
  _LearnCount := 0;
  _TestCount := 0;
  _Statistics := InitTRecStatistics;

  _NNet := TNNet.Create();
  _NFit := TNeuralFit.Create();
  _TrainingPairs := TNNetVolumePairList.Create();
  _TestPairs := TNNetVolumePairList.Create();

  LoadLearnData;
  CreateNetwork;
  Compute;
  TestLearning;
  PrintStatistics(_Statistics);

  writeln('Finished. Press any key to exit....');
  readln;

  if _NNet <> nil then FreeAndNil(_NNet);
  if _NFit <> nil then FreeAndNil(_NFit);
  if _TrainingPairs <> nil then FreeAndNil(_TrainingPairs);
  if _TestPairs <> nil then FreeAndNil(_TestPairs);

end.
« Last Edit: December 12, 2019, 10:33:35 am by dinmil »

dinmil

  • New Member
  • *
  • Posts: 44
Re: Conscious Artificial Intelligence - How to Start
« Reply #7 on: November 22, 2019, 04:49:29 pm »
One more file is missing

schuler

  • Full Member
  • ***
  • Posts: 136
Re: Conscious Artificial Intelligence - How to Start
« Reply #8 on: November 24, 2019, 08:43:31 pm »
@dinmil,
Thank you for sending your code.

I just made small changes to check the training accuracy:
Code: Pascal  [Select]
  _NFit.Verbose := true;
  _NFit.EnableMonopolarHitComparison();
  //_NFit.HideMessages();

I'm getting about 89.5% training accuracy with your code (well done for your first attempt). What accuracy are you looking for?

schuler

  • Full Member
  • ***
  • Posts: 136
Re: Conscious Artificial Intelligence - How to Start
« Reply #9 on: November 24, 2019, 08:55:28 pm »
You can increase accuracy with this super small change:
Code: Pascal  [Select]
  // The last layer has 3 outputs
  _NNet.AddLayer( TNNetFullConnectLinear.Create(Length(TOutputArray)) );
  _NNet.AddLayer( TNNetSoftMax.Create());

I got about 98.4% accuracy by changing just the 2 lines of code shown above (replacing the last ReLU with a linear layer and adding a softmax).

Happy days?

dinmil

  • New Member
  • *
  • Posts: 44
Re: Conscious Artificial Intelligence - How to Start
« Reply #10 on: November 26, 2019, 12:22:38 pm »
Here is the problem: this solution needs to be part of a betting architecture.
The system receives 700,000 bet slips per day. Of those 700,000, about 10,000 are previewed by the risk guys before they are accepted.
Of those 10,000, 9,880 are accepted, 60 are rejected (too risky) and 60 are negotiated with customers (for a smaller amount).

So far, in this example, I have an almost 100% hit rate for accepted and a 0% hit rate for negotiated and rejected.
Actually, this is already an improvement, because the algorithm can accept almost 100%, so only 1% has to be resolved manually by the guys.
That is the idea: move the job from the guys to the neural network.

Note that there are 10 risk guys who preview betslips.
It is possible for every guy to have a different opinion on every betslip: one guy wants to accept it, another wants to reject it, and some want to negotiate.


Because the percentage of rejected and negotiated slips is so small, is it too much to expect that I will ever be able to detect those 2 kinds?
« Last Edit: November 26, 2019, 01:36:58 pm by dinmil »

schuler

  • Full Member
  • ***
  • Posts: 136
Re: Conscious Artificial Intelligence - How to Start
« Reply #11 on: November 26, 2019, 12:50:29 pm »
Very interesting example!

What you are describing is sometimes called an imbalanced training data set; the usual remedies are oversampling and undersampling:
https://en.wikipedia.org/wiki/Oversampling_and_undersampling_in_data_analysis

How to fix it: give one third of the training elements to each class, given that you have 3 classes.
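To make the "one third per class" advice concrete: with dinmil's figures (9,880 approved, 60 negotiated, 60 rejected), a simple oversampling scheme re-adds each minority-class sample enough times to match the majority class. A small sketch of just that arithmetic (the counts are taken from the post above; how the extra copies are actually added to the training pair list is left open):

```pascal
program OversampleMath;
{$mode objfpc}{$H+}
uses SysUtils;

var
  // Class sample counts from the thread: approved, negotiated, rejected.
  Counts: array[0..2] of Integer = (9880, 60, 60);
  MaxCount, C: Integer;
begin
  // Find the majority class count.
  MaxCount := Counts[0];
  for C := 1 to 2 do
    if Counts[C] > MaxCount then MaxCount := Counts[C];

  // Each sample of class C is re-added (MaxCount div Counts[C]) - 1 extra
  // times so every class contributes roughly equally.
  for C := 0 to 2 do
    WriteLn('class ', C, ': add ', (MaxCount div Counts[C]) - 1,
            ' extra copies of each sample');
  // With these counts: class 0 needs 0 extra copies,
  // classes 1 and 2 need 163 extra copies of each sample.
end.
```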

dinmil

  • New Member
  • *
  • Posts: 44
Re: Conscious Artificial Intelligence - How to Start
« Reply #12 on: December 11, 2019, 05:20:25 pm »
I did not give up :-)
I noticed a potential issue in the library.
The problem is the output of 3 variables: when computing, the 3rd variable is always zero, no matter what.
Then I added a 4th output variable which is always 0.5 during learning, and after computing it is also always zero.
Is that normal? I expected some number near 0.5.

Project is now on https://github.com/dinmil/TestNeural

This is from the readme.md:

Here is the problem:

This program contains real sample data taken from a betting system.

The system accepts 800,000 betslips (tickets) per day, about 10,000 tickets are previewed by risk employees, and only about one percent of tickets are really rejected, so I want to make an AI network which will automatically approve, reject or negotiate betslips. There are about 10 risk employees, and everybody has a different opinion about approving, rejecting and negotiating, so the output is not clean or predictable.

Every betslip contains 93 parameters which somehow describe it:

is it dangerous
how many duplicates are in the system
is the customer dangerous
what is the most played sport on the ticket
how much the price differs from the competition, etc.

The output can be:

ticket is accepted
ticket is negotiated for a smaller amount or a smaller price
ticket is rejected

I first tried one output which would be:

0.1 for approve
0.5 for negotiate
0.9 for reject

but I was not successful.
Then I switched to 3 parameters: approve is 0.9, 0.1, 0.1; negotiate is 0.1, 0.9, 0.1; reject is 0.1, 0.1, 0.9.

So 0.1 means unset and 0.9 means set. I had much more success, but I noticed that the last (3rd) computed output is always 0 (I do not know the reason; maybe some library problem).

Then I introduced 4 parameters, where the last one is always 0.5 during learning. When computed, this parameter is again always 0 (because of some bug, or....).

I achieved the best result by making the network like this:

Code: Pascal  [Select]
// Create network - so far 93 params
_NNet.AddLayer(TNNetInput.Create(Length(TInputArray)));

_NNet.AddLayer( TNNetFullConnectReLU.Create(MyFirstLayerCount) );
_NNet.AddLayer( TNNetFullConnectReLU.Create(MySecondLayerCount) );
_NNet.AddLayer( TNNetSoftMax.Create());

// The last layer has 4 outputs
_NNet.AddLayer( TNNetFullConnectReLU.Create(Length(TOutputArray)) );
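For comparison, note that in the listing above the softmax sits before the final ReLU layer. schuler's earlier suggestion (reply #9) ordered the head the other way around: a linear fully connected layer last, with the softmax after it. A sketch of that ordering, reusing the names from the listing above (whether this explains the always-zero outputs is for the thread to confirm):

```pascal
_NNet.AddLayer(TNNetInput.Create(Length(TInputArray)));

_NNet.AddLayer( TNNetFullConnectReLU.Create(MyFirstLayerCount) );
_NNet.AddLayer( TNNetFullConnectReLU.Create(MySecondLayerCount) );

// Linear last layer with one neuron per class, then the softmax on top,
// as in reply #9 (instead of softmax followed by a ReLU layer).
_NNet.AddLayer( TNNetFullConnectLinear.Create(Length(TOutputArray)) );
_NNet.AddLayer( TNNetSoftMax.Create() );
```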

The network can predict approved tickets successfully, but rejected and negotiated tickets are a problem. Maybe the data is the problem. test_data.csv is a superset of train_data.csv: train_data contains every 6th approved ticket and every rejected or negotiated ticket from test_data, which makes rejected tickets 10 percent of the train_data.csv sample.

The important thing is that the network must not approve a ticket which should really be rejected!!!! The second important thing is to approve as many tickets as possible (so far about 90 percent).

Can I do better, and why is the 4th parameter always 0?

« Last Edit: December 12, 2019, 10:35:37 am by dinmil »

440bx

  • Hero Member
  • *****
  • Posts: 1298
Re: Conscious Artificial Intelligence - How to Start
« Reply #13 on: December 11, 2019, 06:22:33 pm »
@dinmil

It would be nice of you to edit your original and subsequent posts and put the Pascal code between the tags [ code=pascal ]  <your Pascal code here> [ /code ]  (don't include the spaces I had to use to "neuter" the tags). The code and the posts would be much more readable that way.

HTH.
« Last Edit: December 11, 2019, 08:08:27 pm by 440bx »
using FPC v3.0.4 and Lazarus 1.8.2 on Windows 7 64bit.

dinmil

  • New Member
  • *
  • Posts: 44
Re: Conscious Artificial Intelligence - How to Start
« Reply #14 on: December 12, 2019, 10:37:20 am »
Quote
It would be nice of you to edit your original and subsequent posts and put the Pascal code between the tags [ code=pascal ]  <your Pascal code here> [ /code ]  (don't include the spaces I had to use to "neuter" the tags). The code and the posts would be much more readable that way.

Thanks for the tip. I changed my posts (they look much better now).