
Author Topic: Conscious Artificial Intelligence - How to Start  (Read 666 times)

dinmil

  • New Member
  • *
  • Posts: 42
Conscious Artificial Intelligence - How to Start
« on: November 20, 2019, 03:22:43 pm »
Hello.
I'm looking for someone to point me in the right direction on one of my problems.

I'm trying to use neural-api-master.

I have 50 inputs (all float numbers between 0 and 1).
I have one output with 3 possible values: 0, 0.5, and 1.
Is it better to have one output with the values 0, 0.5 and 1, or is it better to have 3 outputs with 3 different combinations:
1,0,0
0,1,0
0,0,1
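For readers following along, here is a minimal sketch (not from this thread; plain Free Pascal with no library dependencies) of mapping the scalar target 0 / 0.5 / 1 to such a one-hot vector, one output neuron per class:

```pascal
program OneHotDemo;
{$mode objfpc}{$H+}

type
  TOneHot = array[0..2] of Single;

// Encode the scalar class label as a one-hot vector.
function EncodeTarget(InValue: Single): TOneHot;
begin
  Result[0] := 0; Result[1] := 0; Result[2] := 0;
  if InValue = 0 then Result[0] := 1        // class "0"   -> 1,0,0
  else if InValue = 0.5 then Result[1] := 1 // class "0.5" -> 0,1,0
  else Result[2] := 1;                      // class "1"   -> 0,0,1
end;

var
  V: TOneHot;
begin
  V := EncodeTarget(0.5);
  WriteLn(V[0]:0:0, ',', V[1]:0:0, ',', V[2]:0:0); // prints 0,1,0
end.
```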

I tried to start from the XorAndOr example to get a feeling for it, but unfortunately I got no results.
I took 1000 records (50 inputs and 1 output), put them through the learning stage with default params (as in the XorAndOr example), and after learning I tested it on the same set.
All records produced 0 as output.

After that I tried to adjust some learning params like

  MyNFit.InitialLearningRate := 0.000001;
  MyNFit.LearningRateDecay := 0.001;
  MyNFit.L2Decay := 0.001;

but nothing significant happened. Then I tried to change some other params, like raising Epochs to 20000, and put in some layers:
  MyNNet.AddLayer( TNNetInput.Create(50) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(40) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(30) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(50) );
  MyNNet.AddLayer( TNNetFullConnectReLU.Create(1) );

then something happened, but all the numbers are near zero (0.11, 0.16) and nothing is even close to 1.00.

This set of records has 964 zeros as the result, 5 ones, and the rest are 0.5.

So the questions are:
1. How do you decide what kind of network is needed (how many layers, what kind of layers, etc.)?
2. When you start learning, how do you know the learning was successful (without errors), and how do you interpret the errors?
3. Is it a good way to start: train on 1000 records and test on the same data?
4. What do these params mean, and what are sensible numbers?
  MyNFit.InitialLearningRate := 0.000001;
  MyNFit.LearningRateDecay := 0.001;
  MyNFit.L2Decay := 0.001;
  BatchSize
  Epoch
5. Is there any documentation where I can start learning and reading about the neural-api-master project?

Thanks in advance, and best regards







garlar27

  • Hero Member
  • *****
  • Posts: 621
Re: Conscious Artificial Intelligence - How to Start
« Reply #1 on: November 20, 2019, 04:40:12 pm »
Don't know if this is what you are looking for:

Conscious Artificial Intelligence - Project Update

dinmil

  • New Member
  • *
  • Posts: 42
Re: Conscious Artificial Intelligence - How to Start
« Reply #2 on: November 21, 2019, 02:07:54 pm »
Thanks. I'm checking.

Thaddy

  • Hero Member
  • *****
  • Posts: 9285
Re: Conscious Artificial Intelligence - How to Start
« Reply #3 on: November 21, 2019, 05:50:19 pm »
The author is on this forum; he usually answers very quickly (schuler).
Note that a set of ~1000 samples is a bit on the low side to make it work. You need quite a bit more sample data (the required amount grows roughly quadratically).
Examine the image recognition example as a guideline. It is self-explanatory.
« Last Edit: November 21, 2019, 05:56:10 pm by Thaddy »

schuler

  • Full Member
  • ***
  • Posts: 134
Re: Conscious Artificial Intelligence - How to Start
« Reply #4 on: November 21, 2019, 08:08:30 pm »
Hello dinmil,
Given that you are using ReLUs, I would pick this as the output:
1,0,0
0,1,0
0,0,1

The last layer should be a softmax over the 3 classes above.

Your InitialLearningRate looks too small. Give it a go with 0.001 or 0.0001.

If you put your code in a public repo, I can run it at my end if you like.

batch size: 32 or 64 works well in most cases (if you have a high-end Ryzen, I would try a bigger number).

dinmil

  • New Member
  • *
  • Posts: 42
Re: Conscious Artificial Intelligence - How to Start
« Reply #5 on: November 22, 2019, 01:09:11 pm »
That is the way I'm thinking too: put 3 outputs instead of one.

The problem is that this code has too many dependencies on my other libraries and modules.

The next question is:
Is there a standard way to export inputs and desired outputs from the neural network components, so I can make a little piece of code that everybody can compile?

dinmil

  • New Member
  • *
  • Posts: 42
Re: Conscious Artificial Intelligence - How to Start
« Reply #6 on: November 22, 2019, 04:46:31 pm »
I managed to make an example of what I am doing:
The learn sample data is 1000 rows.
The test set is 2691 rows, of which the first 1000 are the same as the learn data.
This is the program.
A folder named neural must exist - it is copied from Git and contains all the neural files.
The test hits 90% of the wanted outputs. Is that good, or can it be better?

The files learn_input.csv, learn_output.csv, test_input.csv and test_output.csv must exist in the exe folder.

program TestNeural;

{$mode objfpc}{$H+}

uses
  {$IFDEF UNIX}{$IFDEF UseCThreads}
  cthreads,
  {$ENDIF}{$ENDIF}
  Classes, SysUtils,
  dateutils,
  neuralnetwork,
  neuralvolume,
  neuralfit
  { you can add units after this };

const
  FAV_SYS_NEWLINE = #13#10;

type
  // Note: array[0..50] has 51 elements, so the input layer created below gets 51 inputs
  TInputArray  = array[0..50] of TNeuralFloat;
  TOutputArray = array[0..2] of TNeuralFloat;

  TRecStatistics = record
    count_samples: integer;

    OK: integer;
    NOK: integer;
    UNKNOWN: integer;

    count_rej: Integer;
    count_neg: Integer;
    count_app: Integer;
    count_unk: Integer;

    count_output_rej: Integer;
    count_output_neg: Integer;
    count_output_app: Integer;
    count_output_unk: Integer;

    approved_but_shoud_be_rejected: Integer;
    approved_but_shoud_be_negotiated: Integer;

    miss_rejected: integer;
    miss_negotiated: integer;
    miss_approved: integer;

    unknown_is_really_rejected: integer;
    unknown_is_really_negotiated: integer;
    unknown_is_really_approved: integer;
  end;

var
  _TrainingListLoaded: Boolean;
  _TestListLoaded: Boolean;
  _InputArray: TInputArray;
  _OutputArray: TOutputArray;
  _Input  : TNNetVolume;
  _Output : TNNetVolume;
  _Pair   : TNNetVolumePair;
  _TrainingPairs: TNNetVolumePairList;
  _TestPairs: TNNetVolumePairList;
  _NFit: TNeuralFit;
  _NNet: TNNet;
  _ComputedOutput : TNNetVolume;
  _LearnCount: Integer;
  _TestCount: Integer;
  _Statistics: TRecStatistics;


const InitTRecStatistics: TRecStatistics = ({%H-});

function FormatFloat(InNumber: Extended; InNoOfDecimalPlaces: Integer = 4): String;
begin
  Result := Format('%0.' + IntToStr(InNoOfDecimalPlaces)+ 'f', [InNumber]);
end;

procedure FavLogToConsole(InString: String);
begin
  if IsConsole then begin
    try
      // Plain Writeln avoids the non-portable StdOut variable and the
      // external Fav* date helpers that are not part of this example
      Writeln(FormatDateTime('yyyy-mm-dd hh:nn:ss.zzz', Now), ': ', InString);
    except
    end;
  end;
end;


procedure FavSaveToFile(InFileName: String; const InString: String;
  InRaiseError: Boolean);
var MyFile: Text;
    MyFileAssigned: Boolean;
    MyFileOpened: Boolean;
begin
  MyFileAssigned:=False;
  MyFileOpened:=False;
  try
    AssignFile(MyFile, InFileName);
    MyFileAssigned:=True;
    Rewrite(MyFile);
    MyFileOpened:=True;
    Write(MyFile, InString);
    Flush(MyFile);
    CloseFile(MyFile);
    MyFileOpened:=False;
    MyFileAssigned:=False;
  except
    on E: Exception do begin
      if MyFileAssigned then begin
        try
          if MyFileOpened then begin
            CloseFile(MyFile);
          end;
        except
        end;
      end;
      if InRaiseError then begin;
        raise;
      end;
    end;
  end;
end;

function FavLoadFromFile(InFileName: String; InRaiseErrorIfNotExists: Boolean = False): String;
var MyFile: Text;
    MyFileAssigned: Boolean;
    MyFileOpened: Boolean;
    MyBuffer: String;
    MyString: String;
begin
  MyFileAssigned:=False;
  MyFileOpened:=False;
  MyString := '';

  if InRaiseErrorIfNotExists then begin
    if FileExists(InFileName) = false then begin
      raise Exception.Create('File: "' + InFileName + '" does not exist!');
    end;
  end;
  try
    AssignFile(MyFile, InFileName);
    MyFileAssigned:=True;
    Reset(MyFile);
    MyFileOpened := true;
    while not Eof(MyFile) do begin
      ReadLn(MyFile, MyBuffer);
      if MyString <> '' then MyString := MyString + #13#10;
      MyString := MyString + MyBuffer;
    end;
    CloseFile(MyFile);
    MyFileOpened := False;
    MyFileAssigned:=False;
  except
    on E: Exception do begin
      if MyFileAssigned then begin
        try
          if MyFileOpened then begin
            CloseFile(MyFile);
          end;
        except
        end;
      end;
      raise;
    end;
  end;
  Result := MyString;
end;

procedure FavAppendToFile(InFileName: String; const InString: String);
var MyFile: Text;
    MyFileAssigned: Boolean;
    MyFileOpened: Boolean;
begin
  MyFileAssigned:=False;
  MyFileOpened := False;
  try
    AssignFile(MyFile, InFileName);
    MyFileAssigned:=True;
    if FileExists(InFileName) then begin
      Append(MyFile);
      MyFileOpened := True;
    end
    else begin
      Rewrite(MyFile);
      MyFileOpened := True;
    end;
    Write(MyFile, InString);
    Flush(MyFile);
    CloseFile(MyFile);
    MyFileOpened := False;
    MyFileAssigned:=False;
  except
    on E: Exception do begin
      if MyFileAssigned then begin
        try
          if MyFileOpened then begin
            CloseFile(MyFile);
          end;
        except
        end;
      end;
      raise;
    end;
  end;
end;

procedure CalculateLayerCounts(InNumberOfInputs: Integer; InNumberOfSamples: Integer; var OutOneLayer,
  OutTwoLayersFirst, OutTwoLayersSecond: Integer);
begin
  // Found in some book:
  // N -- number of training samples - 4 samples in the worked example
  // m -- number of output neurons   - 3 outputs
  // 1st layer: SQRT((m + 2) * N) + 2 * SQRT(N/(m+2)) = SQRT((3+2) * 4) + 2 * SQRT(4 / (3+2)) = 4.4721 + 1.7889 = 6.26 -> 7
  // 2nd layer: m * SQRT(N/(m+2)) = 3 * SQRT(4/(3+2)) = 2.68 -> 3

  FavLogToConsole('Number of samples: ' + IntToStr(InNumberOfSamples));
  FavLogToConsole('Number of outputs: ' + IntToStr(InNumberOfInputs));

  OutOneLayer := Trunc(sqrt((InNumberOfInputs + 2) * InNumberOfSamples)) + 1;

  // The formula uses N (the sample count); the literal 4 was left over from the worked example above
  OutTwoLayersFirst := Trunc(sqrt((InNumberOfInputs+2) * InNumberOfSamples) + 2 * sqrt(InNumberOfSamples / (InNumberOfInputs + 2))) + 1;
  OutTwoLayersSecond := Trunc(InNumberOfInputs * sqrt(InNumberOfSamples / (InNumberOfInputs+2)) + 1);

  FavLogToConsole('OneLayer: ' + IntToStr(OutOneLayer));
  FavLogToConsole('TwoLayersFirst: ' + IntToStr(OutTwoLayersFirst));
  FavLogToConsole('TwoLayersSecond: ' + IntToStr(OutTwoLayersSecond));
end;

procedure LoadLearnData;
var MyStringsInput: TStrings;
    MyStringsOutput: TStrings;
    MyInputString: String;
    MyOutputString: String;
    F: Integer;
begin
  MyStringsInput  := nil;
  MyStringsOutput := nil;
  MyInputString   := '';
  MyOutputString  := '';
  try
    if FileExists('learn_input.csv') and FileExists('learn_output.csv') then begin
      FavLogToConsole('Loading training data from training files');
      MyStringsInput := TStringList.Create;
      MyStringsInput.LoadFromFile('learn_input.csv');
      MyStringsOutput := TStringList.Create;
      MyStringsOutput.LoadFromFile('learn_output.csv');

      if (MyStringsInput.Count > 0) and (MyStringsInput.Count = MyStringsOutput.Count) then begin
        _LearnCount := MyStringsInput.Count;
        for F := 0 to _LearnCount-1 do begin
          _Input  := TNNetVolume.Create();
          _Input.LoadFromString(MyStringsInput.Strings[F]);
          _Output  := TNNetVolume.Create();
          _Output.LoadFromString(MyStringsOutput.Strings[F]);
          _Pair   :=  TNNetVolumePair.Create(_Input, _Output);
          _TrainingPairs.Add(_Pair);

          MyInputString := MyInputString + _Input.SaveToString() + FAV_SYS_NEWLINE;
          MyOutputString := MyOutputString + _Output.SaveToString() + FAV_SYS_NEWLINE;
        end;
        _TrainingListLoaded := True;
      end;

      FreeAndNil(MyStringsInput);
      FreeAndNil(MyStringsOutput);

      if _TrainingListLoaded = false then begin
        FavLogToConsole('*****************************************');
        FavLogToConsole('Learn/Training files are not loaded. Maybe the record counts in the two files are not the same');
      end;

      FavLogToConsole('Saving learn_input1.csv and learn_output1.csv...');
      FavSaveToFile('learn_input1.csv', MyInputString, false);
      FavSaveToFile('learn_output1.csv', MyOutputString, false);
    end
    else begin
      FavLogToConsole('*****************************************');
      FavLogToConsole('File learn_input.csv or file learn_output.csv does not exist');
    end;
  except
  end;
end;

procedure CreateNetwork;
var MyOneLayerCount: Integer;
    MyFirstLayerCount: Integer;
    MySecondLayerCount: Integer;
    MyTestCount: Integer;
begin
  MyTestCount := _TrainingPairs.Count;

  // Create network - note: Length(TInputArray) = 51 inputs
  _NNet.AddLayer(TNNetInput.Create(Length(TInputArray)));

  CalculateLayerCounts(Length(_OutputArray), MyTestCount, MyOneLayerCount, MyFirstLayerCount, MySecondLayerCount);

  // FavLogToConsole('Using One Layer');
  // _NNet.AddLayer( TNNetFullConnectReLU.Create(MyOneLayerCount) );

  FavLogToConsole('Using Two Layers');
  _NNet.AddLayer( TNNetFullConnectReLU.Create(MyFirstLayerCount) );
  _NNet.AddLayer( TNNetFullConnectReLU.Create(MySecondLayerCount) );

  // Last layer has 3 outputs
  _NNet.AddLayer( TNNetFullConnectReLU.Create(Length(TOutputArray)) );
end;

procedure Compute;
begin
  FavLogToConsole('Computing...');
  _ComputedOutput := TNNetVolume.Create(3,1,1,1);

  // call the predefined fitting algorithm
  _NFit.InitialLearningRate := 0.0001;
  _NFit.LearningRateDecay := 0;
  _NFit.L2Decay := 0;
  _NFit.Verbose := false;
  _NFit.HideMessages();
  _NFit.Fit(_NNet, _TrainingPairs, nil, nil, 32, 3000); // 3000);
end;

procedure PrintStatistics( var InStatistics: TRecStatistics);
var
  MyPercentHit, MyPercentMiss, MyPercentUnknown: Extended;
begin
  FavLogToConsole(Format('OK: %s, NOK: %s, UNKNOWN: %s', [IntToStr(InStatistics.OK),IntToStr(InStatistics.NOK),IntToStr(InStatistics.UNKNOWN)]));
  FavLogToConsole(Format('Input: Reject: %s, Neg: %s, Approved: %s', [IntToStr(InStatistics.count_rej), IntToStr(InStatistics.count_neg), IntToStr(InStatistics.count_app)]));
  FavLogToConsole(Format('Output: Reject: %s, Neg: %s, Approved: %s, Unknown: %s', [IntToStr(InStatistics.count_output_rej), IntToStr(InStatistics.count_output_neg), IntToStr(InStatistics.count_output_app), IntToStr(InStatistics.count_output_unk)]));

  if InStatistics.count_samples > 0 then begin
    MyPercentHit := 100 * (InStatistics.OK / (InStatistics.count_samples));
    MyPercentMiss := 100 * (InStatistics.NOK / (InStatistics.count_samples));
    MyPercentUnknown := 100 * (InStatistics.UNKNOWN / (InStatistics.count_samples));
  end
  else begin
    MyPercentHit := 0;
    MyPercentMiss := 0;
    MyPercentUnknown := 0;
  end;

  FavLogToConsole(Format('Hit: %s percent; Miss: %s percent; Unknown: %s percent', [
                        FormatFloat(MyPercentHit),
                        FormatFloat(MyPercentMiss),
                        FormatFloat(MyPercentUnknown)
                  ]));
  FavLogToConsole(Format('MissRejected: %s; MissNegotiated: %s; MissApproved: %s', [
                        IntToStr(InStatistics.miss_rejected),
                        IntToStr(InStatistics.miss_negotiated),
                        IntToStr(InStatistics.miss_approved)
                  ]));
  FavLogToConsole(Format('UnknownIsRejected: %s; UnknownIsNegotiated: %s; UnknownIsApproved: %s', [
                        IntToStr(InStatistics.unknown_is_really_rejected),
                        IntToStr(InStatistics.unknown_is_really_negotiated),
                        IntToStr(InStatistics.unknown_is_really_approved)
                  ]));
  FavLogToConsole(Format('ApprovedButShoudBeRejected: %s; ApprovedButShoudBeNegotiated: %s', [
                        IntToStr(InStatistics.approved_but_shoud_be_rejected),
                        IntToStr(InStatistics.approved_but_shoud_be_negotiated)
                  ]));
end;

function AnalyzeOutput(InRowNumber: Integer;
  InDesired, InComputed: TNNetVolume;
  var OutStatistics: TRecStatistics
  ): Integer;
var MyComputedOriginal: TOutputArray;
    MyDesired: TOutputArray;
    MyComputed: TOutputArray;
    MyResult: Integer;

    MyNIResult: Extended;  // 1,0,0 is approved = 0;  0,1,0 is negotiated = 0.5;  0,0,1 is rejected = 1   // desired, input
    MyNOResult: Extended;  // output
begin
  Inc(OutStatistics.count_samples);

  MyResult := -1;  // not good result

  MyDesired[0] := InDesired.Raw[0];
  MyDesired[1] := InDesired.Raw[1];
  MyDesired[2] := InDesired.Raw[2];

  // fix output
  MyComputedOriginal[0] := InComputed.Raw[0];
  MyComputedOriginal[1] := InComputed.Raw[1];
  MyComputedOriginal[2] := InComputed.Raw[2];

  MyComputed[0] := MyComputedOriginal[0];
  MyComputed[1] := MyComputedOriginal[1];
  MyComputed[2] := MyComputedOriginal[2];

  if MyComputed[0] < 0 then MyComputed[0] := 0;
  if MyComputed[1] < 0 then MyComputed[1] := 0;
  if MyComputed[2] < 0 then MyComputed[2] := 0;

  MyComputed[0] := round(MyComputed[0]);
  MyComputed[1] := round(MyComputed[1]);
  MyComputed[2] := round(MyComputed[2]);

  if      (MyDesired[0] = 1) and (MyDesired[1] = 0) and (MyDesired[2] = 0) then MyNIResult := 0
  else if (MyDesired[0] = 0) and (MyDesired[1] = 1) and (MyDesired[2] = 0) then MyNIResult := 0.5
  else if (MyDesired[0] = 0) and (MyDesired[1] = 0) and (MyDesired[2] = 1) then MyNIResult := 1
  else begin
    MyNIResult := -1
  end;

  if      (MyComputed[0] = 1) and (MyComputed[1] = 0) and (MyComputed[2] = 0) then MyNOResult := 0
  else if (MyComputed[0] = 0) and (MyComputed[1] = 1) and (MyComputed[2] = 0) then MyNOResult := 0.5
  else if (MyComputed[0] = 0) and (MyComputed[1] = 0) and (MyComputed[2] = 1) then MyNOResult := 1
  else begin
    MyNOResult := -1
  end;

  // Do little counting
  if MyNIResult = 0 then begin
    // 1,0,0 is approved = 0
    Inc(OutStatistics.count_app);
  end
  else if MyNIResult = 0.5  then begin
    // 010 is negotiated = 0.5
    Inc(OutStatistics.count_neg);
  end
  else if MyNIResult = 1  then begin
    // 0,0,1 is rejected = 1
    Inc(OutStatistics.count_rej);
  end
  else begin
    // This should never happen
    Inc(OutStatistics.count_unk);
  end;

  if MyNOResult = 0 then begin
    Inc(OutStatistics.count_output_app);
  end
  else if MyNOResult = 0.5 then begin
    Inc(OutStatistics.count_output_neg);
  end
  else if MyNOResult = 1 then begin
    Inc(OutStatistics.count_output_rej);
  end
  else begin
    Inc(OutStatistics.count_output_unk);
  end;

  // Analyze result
  if MyNIResult <> MyNOResult then begin
    if MyNOResult = 0 then begin
      // approved
      if MyNIResult = 1 then begin
        Inc(OutStatistics.approved_but_shoud_be_rejected)
      end
      else if MyNIResult = 0.5 then begin
        Inc(OutStatistics.approved_but_shoud_be_negotiated)
      end
    end;

    if MyNIResult = 1 then begin
      Inc(OutStatistics.miss_rejected)
    end
    else if MyNIResult = 0.5 then begin
      Inc(OutStatistics.miss_negotiated)
    end
    else begin
      Inc(OutStatistics.miss_approved)
    end;

    Inc(OutStatistics.NOK);
    MyResult := 1;  // not good
    FavLogToConsole(Format('NotGood Row %s, : ResultComputed: %s, ResultWanted: %s', [
                            IntToStr(InRowNumber),
                            FormatFloat(MyNOResult, 1),
                            FormatFloat(MyNIResult)]));
    if MyNOResult = -1 then begin
      if MyNIResult = 1 then begin
        Inc(OutStatistics.unknown_is_really_rejected)
      end
      else if MyNIResult = 0.5 then begin
        Inc(OutStatistics.unknown_is_really_negotiated)
      end
      else begin
        Inc(OutStatistics.unknown_is_really_approved)
      end;


      Inc(OutStatistics.UNKNOWN);
      MyResult := -1; // Completeley missed
      FavLogToConsole(Format('NotGoodCombination Row %s, (%s; %s; %s): ResultComputed: %s, ResultWanted: %s',
             [IntToStr(InRowNumber),
              FormatFloat(MyComputedOriginal[0], 4),
              FormatFloat(MyComputedOriginal[1], 4),
              FormatFloat(MyComputedOriginal[2], 4),
              FormatFloat(MyNOResult, 1),
              FormatFloat(MyNIResult, 1)
             ])
      );
    end;
  end
  else begin
    Inc(OutStatistics.OK);
    MyResult := 0;
  end;

  Result := MyResult;
end;

procedure TestLearning;
var MyStringsInput: TStrings;
    MyStringsOutput: TStrings;
    MyInputString: String;
    MyOutputString: String;

    MyTestInput  : TNNetVolume;
    MyTestOutput : TNNetVolume;
    MyTestPair   : TNNetVolumePair;

    F: Integer;
    MyComputedString: String;
begin
  MyStringsInput := nil;
  MyStringsOutput := nil;
  MyInputString := '';
  MyOutputString:= '';
  MyComputedString := '';
  _Statistics := InitTRecStatistics;
  if FileExists('test_input.csv') and FileExists('test_output.csv') then begin
    FavLogToConsole('Loading test data from test files');
    MyStringsInput := TStringList.Create;
    MyStringsInput.LoadFromFile('test_input.csv');
    MyStringsOutput := TStringList.Create;
    MyStringsOutput.LoadFromFile('test_output.csv');

    if (MyStringsInput.Count > 0) and (MyStringsInput.Count = MyStringsOutput.Count) then begin
      _TestCount := MyStringsInput.Count;
      for F := 0 to _TestCount-1 do begin
        MyTestInput  := TNNetVolume.Create();
        MyTestInput.LoadFromString(MyStringsInput.Strings[F]);
        MyTestOutput  := TNNetVolume.Create();
        MyTestOutput.LoadFromString(MyStringsOutput.Strings[F]);
        MyTestPair   :=  TNNetVolumePair.Create(MyTestInput, MyTestOutput);
        _TestPairs.Add(MyTestPair);

        MyInputString := MyInputString + MyTestInput.SaveToString() + FAV_SYS_NEWLINE;
        MyOutputString := MyOutputString + MyTestOutput.SaveToString() + FAV_SYS_NEWLINE;
      end;
      _TestListLoaded := True;
    end;

    if _TestListLoaded then begin
      for F := 0 to _TestPairs.Count-1 do begin
        MyTestInput := TNNetVolumePair(_TestPairs.Items[F]).A;
        MyTestOutput := TNNetVolumePair(_TestPairs.Items[F]).B;

        _NNet.Compute(MyTestInput);
        _NNet.GetOutput(_ComputedOutput);
        MyComputedString := MyComputedString + _ComputedOutput.SaveToString() + FAV_SYS_NEWLINE;
        AnalyzeOutput(F, MyTestOutput, _ComputedOutput, _Statistics);
      end;
      FavLogToConsole('Saving computedoutput.csv...');
      FavSaveToFile('computedoutput.csv', MyComputedString, False);
    end
    else begin
      FavLogToConsole('*****************************************');
      FavLogToConsole('Test files are not loaded. Maybe the record counts in the two files are not the same');
    end;
  end
  else begin
    FavLogToConsole('*****************************************');
    FavLogToConsole('File test_input.csv or file test_output.csv does not exist');
  end;
end;

begin
  _TrainingListLoaded := False;
  _TestListLoaded := False;
  _LearnCount := 0;
  _TestCount := 0;
  _Statistics := InitTRecStatistics;

  _NNet := TNNet.Create();
  _NFit := TNeuralFit.Create();
  _TrainingPairs := TNNetVolumePairList.Create();
  _TestPairs := TNNetVolumePairList.Create();

  LoadLearnData;
  CreateNetwork;
  Compute;
  TestLearning;
  PrintStatistics(_Statistics);

  writeln('Finished. Press any key to exit....');
  readln;

  if _NNet <> nil then FreeAndNil(_NNet);
  if _NFit <> nil then FreeAndNil(_NFit);
  if _TrainingPairs <> nil then FreeAndNil(_TrainingPairs);
  if _TestPairs <> nil then FreeAndNil(_TestPairs);

end.

dinmil

  • New Member
  • *
  • Posts: 42
Re: Conscious Artificial Intelligence - How to Start
« Reply #7 on: November 22, 2019, 04:49:29 pm »
One more file is missing

schuler

  • Full Member
  • ***
  • Posts: 134
Re: Conscious Artificial Intelligence - How to Start
« Reply #8 on: November 24, 2019, 08:43:31 pm »
@dinmil,
Thank you for sending your code.

Just did small changes to check the training accuracy:
Code: Pascal  [Select]
  _NFit.Verbose := true;
  _NFit.EnableMonopolarHitComparison();
  //_NFit.HideMessages();

I'm getting about 89.5% training accuracy on your code (well done for your first attempt). What is the accuracy that you are looking for?

schuler

  • Full Member
  • ***
  • Posts: 134
Re: Conscious Artificial Intelligence - How to Start
« Reply #9 on: November 24, 2019, 08:55:28 pm »
You can increase accuracy with this super small change:
Code: Pascal  [Select]
  // Last layer has 3 outputs
  _NNet.AddLayer( TNNetFullConnectLinear.Create(Length(TOutputArray)) );
  _NNet.AddLayer( TNNetSoftMax.Create());

Got about 98.4% accuracy by changing just the 2 lines of code shown above (replacing the last ReLU with a linear layer and adding a softmax).

Happy days?
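For anyone new to the softmax mentioned above: it maps the three linear outputs to probabilities that sum to 1, so the predicted class is simply the largest one. A self-contained sketch in plain Free Pascal (the raw values are made up for illustration):

```pascal
program SoftMaxDemo;
{$mode objfpc}{$H+}
uses SysUtils;

var
  Raw: array[0..2] of Double = (2.0, 0.1, -1.0); // hypothetical linear-layer outputs
  P: array[0..2] of Double;
  Sum: Double;
  I: Integer;
begin
  // softmax: exponentiate each output, then normalize so they sum to 1
  Sum := 0;
  for I := 0 to 2 do begin
    P[I] := Exp(Raw[I]);
    Sum := Sum + P[I];
  end;
  for I := 0 to 2 do
    P[I] := P[I] / Sum;
  // the largest raw value gets the largest probability
  WriteLn(Format('%.3f %.3f %.3f', [P[0], P[1], P[2]]));
end.
```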

dinmil

  • New Member
  • *
  • Posts: 42
Re: Conscious Artificial Intelligence - How to Start
« Reply #10 on: November 26, 2019, 12:22:38 pm »
Here is the problem: this solution needs to be part of a betting architecture.
The system receives 700,000 bet slips per day. Of those 700,000, about 10,000 are previewed by the risk guys before they are accepted.
Of those 10,000, 9,880 are accepted, 60 are rejected (too risky) and 60 are negotiated with the customers (for a smaller amount).

So far, in this example I have an almost 100% hit rate for accepted, and a 0% hit rate for negotiated and rejected.
Actually this is already an improvement, because the algorithm can accept almost 100% automatically, so only about 1% has to be resolved manually by the guys.
That is the idea: move the job from the guys to the neural network.

Note, there are 10 risk guys who preview bet slips.
It is possible for every guy to have a different opinion on any given bet slip: one wants to accept it, another wants to reject it, and another wants to negotiate.


Because the percentage of rejected and negotiated slips is so small, is it too much to expect that I will ever be able to detect those 2 classes?
« Last Edit: November 26, 2019, 01:36:58 pm by dinmil »

schuler

  • Full Member
  • ***
  • Posts: 134
Re: Conscious Artificial Intelligence - How to Start
« Reply #11 on: November 26, 2019, 12:50:29 pm »
Very interesting example!

What you are describing is usually called an imbalanced training data set; the usual remedies are oversampling and undersampling:
https://en.wikipedia.org/wiki/Oversampling_and_undersampling_in_data_analysis

How to fix it:
Given that you have 3 classes, just make each class contribute one third of the training elements.
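A minimal sketch (not from this thread; the class counts come from dinmil's numbers above) of the naive oversampling arithmetic: repeat each minority-class sample until every class contributes a similar share:

```pascal
program OversampleDemo;
{$mode objfpc}{$H+}
uses SysUtils;

var
  // approved, rejected, negotiated slip counts from the post above
  Counts: array[0..2] of Integer = (9880, 60, 60);
  Balanced: array[0..2] of Integer;
  MaxCount, C, R: Integer;
begin
  // find the majority class size
  MaxCount := Counts[0];
  for C := 1 to 2 do
    if Counts[C] > MaxCount then MaxCount := Counts[C];

  // repeat each class until it is roughly the size of the majority class
  for C := 0 to 2 do begin
    R := MaxCount div Counts[C];   // whole-number repeat factor
    Balanced[C] := Counts[C] * R;  // oversampled size of the class
  end;
  WriteLn(Format('%d %d %d', [Balanced[0], Balanced[1], Balanced[2]]));
end.
```

In practice this would mean adding each rejected and negotiated pair to the training list about 164 times (9880 div 60); undersampling the approved class also works, but throws data away.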