Author Topic: Strange bugs with Advanced Records - SOLVED  (Read 3553 times)

440bx

  • Hero Member
  • *****
  • Posts: 4740
Re: Strange bugs with Advanced Records - SOLVED
« Reply #45 on: October 30, 2024, 05:05:22 pm »
Long story short, you as a programmer only ever are responsible to follow the semantics of the language.
For that part, I qualify that with "most of the time" or "if convenient". 

If something falls outside of the defined behavior of the language, you should not make any assumptions about it
I definitely don't make any assumptions.  I make sure the compiler behaves as I expect it to and, not only that, as it _has_ to. 

That's the reason I only use "const" when dealing with ordinal types, and I am fully aware that some types are ordinal in 64 bit but not in 32 bit, e.g., qword.  For anything I haven't done in the past, I make it a point to look at the generated assembler to ensure it is the way I expect it to be (and the way the compiler should be coding it.)

ETA:

I should have noted that I normally place a comment pointing out that part of a "constref" record is also being passed by reference as a different parameter.  Just to eliminate any "surprise" factor.

« Last Edit: October 30, 2024, 07:06:12 pm by 440bx »
(FPC v3.0.4 and Lazarus 1.8.2) or (FPC v3.2.2 and Lazarus v3.2) on Windows 7 SP1 64bit.

Thaddy

  • Hero Member
  • *****
  • Posts: 16193
  • Censorship about opinions does not belong here.
Re: Strange bugs with Advanced Records - SOLVED
« Reply #46 on: October 30, 2024, 05:11:24 pm »
The only thing we really should be discussing is whether FPC needs a true static variable: as it is, we have the choice between a typed const in J+ state or J- state. We should have both, without switches. And it confuses people because of the unfortunate naming.
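As a minimal sketch of the difference being described (the program and procedure names are invented for illustration): with {$J+} a typed const is writable and statically allocated, so it behaves like a static local variable; with {$J-} it is a true constant.

```pascal
program TypedConstDemo;
{$mode objfpc}

procedure Count;
{$J+}   // writable typed const: statically allocated, persists across calls
const
  Calls: Integer = 0;
{$J-}
begin
  Inc(Calls);
  WriteLn('Calls = ', Calls);
end;

begin
  Count;   // prints: Calls = 1
  Count;   // prints: Calls = 2  -- the value survives between invocations
end.
```

With {$J-} the same declaration is a plain constant and Inc(Calls) would be rejected, which is exactly the naming confusion being pointed out.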
If I smell bad code it usually is bad code and that includes my own code.

Warfley

  • Hero Member
  • *****
  • Posts: 1762
Re: Strange bugs with Advanced Records - SOLVED
« Reply #47 on: October 30, 2024, 07:42:16 pm »
I definitely don't make any assumptions.  I make sure the compiler behaves as I expect it to and, not only that, as it _has_ to. 

That's the reason I only use "const" when dealing with ordinal types, and I am fully aware that some types are ordinal in 64 bit but not in 32 bit, e.g., qword.  For anything I haven't done in the past, I make it a point to look at the generated assembler to ensure it is the way I expect it to be (and the way the compiler should be coding it.)

I wholeheartedly disagree with that approach. When using a high-level language you should not make any assumptions (or expectations) about the underlying assembly. To give an example: in C, the plain "char" type is the only integer type whose signedness is implementation defined, neither guaranteed signed nor unsigned. The reason is simple: some processors are faster with signed chars, others with unsigned. So in order to produce optimal code for any CPU, C does not pin that down.
I noted earlier that, up until C++20, C++ had no defined representation for signed integer types. The reason, inherited from C, is that the language may be implemented on machines that use sign-and-magnitude, ones' complement or two's complement. This was only changed recently because two's complement is so common that it no longer makes sense to accommodate the others.

But what I'm getting at here is: if you write a program in valid C, or C++, or Pascal, or any other high-level language, it should work exactly the same on any machine and any CPU. No matter if it's a 64-bit little-endian x64 CPU or an 8-bit big-endian Motorola 6809 chip.
In C it even goes so far that things like converting bit representations (through pointers or unions) are not allowed by the standard, meaning that if you write fully standard C without any implementation-defined or undefined behavior, it will run exactly the same on any CPU.

So whenever writing code in a high-level language it's best to assume it's implemented using fairy dust and magic, and not think about what happens at the assembler level. Assumptions about the generated assembler work ok-ish with a language like Pascal, which does not have much undefined behavior and frankly rather few optimizations, but you still shouldn't bet on it: there is constant work on FPC, and, as I said previously, I'm personally very curious about the LLVM backend, as LLVM can do some crazy optimizations.

PS: with Pascal or C I'm of course talking about rather low-level languages, where there is an "obvious" way in which they map to assembly. If you go to much higher-level languages such as Haskell or something similar, thinking about your code in terms of assembly is going to give you much more trouble.

440bx

  • Hero Member
  • *****
  • Posts: 4740
Re: Strange bugs with Advanced Records - SOLVED
« Reply #48 on: October 30, 2024, 09:44:25 pm »
I wholeheartedly disagree with that approach.
I have no doubt you have plenty of company there.

When using a high level language you should not make any assumptions (or expectations) about the underlying assembly.
I will say it again: I don't make assumptions and I verify that the compiler did what I expected it to do.

Compilers don't do magic; they follow rules to generate code, and that's true of optimizers too: they follow rules.  Compilers, unlike programmers, cannot break the rules, because if they do, it's reported as a bug.  If a compiler breaks a rule I need it to enforce, I'll change the code to make it generate the code I want.  IOW, the compiler is a tool that works for _me_, I don't work for the compiler.

When a programmer breaks the rules (which I sometimes do), then it is on his/her shoulders to ensure that breaking whatever rule got broken does not have undesirable consequences; if it does, fix it and, if it doesn't, enjoy :)

One rule that should ideally never be broken is, someone who doesn't know what they're doing shouldn't be breaking rules (unless they want to learn the hard way.)
 
(FPC v3.0.4 and Lazarus 1.8.2) or (FPC v3.2.2 and Lazarus v3.2) on Windows 7 SP1 64bit.

Warfley

  • Hero Member
  • *****
  • Posts: 1762
Re: Strange bugs with Advanced Records - SOLVED
« Reply #49 on: October 31, 2024, 05:59:16 am »
Quote
I don't make assumptions and I verify that the compiler did what I expected it to do.
Assumptions that you verify are still assumptions. Note that when I googled "expectation synonym" the very first result was "assumption" :)

Compilers don't do magic, they follow rules to generate code, that's true of optimizers too, they follow rules.  Compilers, unlike programmers cannot break the rules because if they do, that's reported as a bug.  If a compiler breaks a rule I need it to enforce, I'll change the code to have it generate the code I want.  IOW, the compiler is tool that works for _me_, I don't work for the compiler.

But these rules can change with future versions of the compiler, or when used on different targets. When I write code in Pascal it should work independently of which combination of compiler and target platform is used. I neither can nor want to be bothered, every time my code is used in a different configuration, to verify that it does whatever I originally intended it to do.

Otherwise it would be completely impossible to write libraries in any way, shape or form, because when I write a library there is absolutely no way I can know how it will be used in the end, and I need to write it so that it works in pretty much all configurations, now and in the future. And this is possible because programming languages define their semantics independently of their target system or configuration. A valid, standard-conforming C program written in the 1980s for an 8080 should still work correctly today using modern GCC on x64.
« Last Edit: October 31, 2024, 06:02:18 am by Warfley »

440bx

  • Hero Member
  • *****
  • Posts: 4740
Re: Strange bugs with Advanced Records - SOLVED
« Reply #50 on: October 31, 2024, 07:06:28 am »
Assumptions that you verify are still assumptions. Note that when I googled "expectation synonym" the very first result was "assumption" :)
Personally, in my book, once something is verified it is no longer an assumption.  It's kinda like that by definition.

But these rules can change with future versions of the Compiler, or when used on different targets.
That's true.  In those cases the assumption(s), if any, need to be verified again to account for the changes that took place.

I am not suggesting that rules should be broken in a cavalier manner.  What I am "suggesting" is that there are rules the compiler _wishes_ it could enforce but cannot.  Breaking those rules is very low risk (as long as the programmer knows what he/she is doing.)  Among those: passing a large structure as a constant by reference while also passing some records that are part of it by reference (and not constant.)

The important thing when that is done is to always reference those other records using their by-reference (var) identifier, because if they are referenced through the constant identifier, the compiler could (correctly by its own rules, yet mistakenly in fact) presume the value has not changed.  Values which are not constant should therefore be referenced using the "var" identifier. That complies with the compiler's rules: the compiler cannot presume any field in those records is constant, because they are passed by reference, and since the compiler _knows_ that, it cannot indulge in optimizations that only work when the values are constant.
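A hypothetical sketch of the pattern being described (TBig, TSub and Process are invented names, not code from this thread): the record goes in by constref while one of its sub-records is also passed by var, and every access to the changing part goes through the var identifier.

```pascal
program ConstrefAliasDemo;
{$mode objfpc}

type
  TSub = record
    Counter: Integer;
  end;
  TBig = record
    Data: array[0..1023] of Byte;  // bulk that makes by-reference passing worthwhile
    Sub: TSub;
  end;

{ Big.Sub and Changing alias the same storage; the promise kept is that
  the record is only ever modified through the var identifier. }
procedure Process(constref Big: TBig; var Changing: TSub);
begin
  Inc(Changing.Counter);       // write through the var alias only
  WriteLn(Changing.Counter);   // read through the var alias too, so the
                               // compiler cannot cache a "constant" value
end;

var
  B: TBig;
begin
  B.Sub.Counter := 0;
  Process(B, B.Sub);   // B.Sub is deliberately aliased by both parameters
end.
```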

Again, if the programmer knows what he/she is doing, things will work fine.

A bit ironically, what I described follows the rules and, depending on how things are interpreted, also keeps the promise(s) made to the compiler: the values are never changed through the constref identifier, and values that are changed are changed through the "var" identifier, therefore the compiler is kept informed of what is happening through the methods it expects.

As long as it's done the right way, there is no problem there.
(FPC v3.0.4 and Lazarus 1.8.2) or (FPC v3.2.2 and Lazarus v3.2) on Windows 7 SP1 64bit.

ad1mt

  • Sr. Member
  • ****
  • Posts: 327
    • Mark Taylor's Home Page
Re: Strange bugs with Advanced Records
« Reply #51 on: November 01, 2024, 09:04:34 pm »
The reason you have this problem is quite simply that dynamic arrays are (unlike dynamic strings) not copy-on-write. Meaning, if you have multiple references to the same array, it will not be made unique on access.
For strings there is the function UniqueString, but afaik nothing comparable exists for arrays yet.

For your purposes the simplest way of doing the deep copy would be to use SetLength, as it ensures a refcount of 1, meaning it can be used to create a unique copy:
Code: Pascal  [Select][+][-]
  1. class operator REC.copy(constref v1:REC; var V2:REC);
  2. begin
  3.   v2.I:=v1.I;
  4.   SetLength(v2.I, Length(v2.I));
  5. end;

Personally I think a deepcopy intrinsic utilizing RTTI would be a very useful addition to the compiler. All that's needed to implement it is already part of the InitRTTI table used for managed types anyway.
I find all this very disappointing.

I've programmed in low-level languages like assembly and C, so I know about all these complexities.
But my view is that a high-level language should hide the low-level details so that you don't have to worry about them, and the code does what you would expect. Programmers should not have to know the deep internals of how data types are represented and copied to make their programs work.

If the programmer wants to gain the efficiency of copy-on-write and having the internals of records be pointers to external data objects, then I think that those features should be enabled by a switch, and with the switch having a warning that hidden dangers are present. Then expert programmers, who understand the implications, can turn the features on if they wish.

I had a problem several months ago with FPU exceptions on 32-bit Intel CPUs. It turns out that FPU exceptions do not work correctly on 32-bit Intel CPUs unless a special compiler switch is turned on. I argued that programmers should not have to turn on an obscure switch to make their programs work correctly. The correct behaviour should be enabled by default, and if the programmer wants to gain a small run-time efficiency from unsafe FPU-exception code, that should be enabled with a switch (rather than the other way round). The existing compiler behaviour means that code using FPU exceptions is broken by default on 32-bit Intel CPUs.
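If I recall correctly, the switch in question is the {$SAFEFPUEXCEPTIONS} directive; treat the directive name and its exact effect as an assumption. A sketch of how it would be applied:

```pascal
program SafeFpuDemo;
{$mode objfpc}
{$SAFEFPUEXCEPTIONS ON}   // assumed directive: on 32-bit x87 code it inserts a
                          // wait after FPU instructions so exceptions surface
                          // at the faulting operation (at some speed cost)
uses
  SysUtils;

var
  a, b: Double;
begin
  b := 0.0;
  try
    a := 1.0 / b;   // should raise a floating-point exception right here
    WriteLn(a);
  except
    on E: Exception do
      WriteLn('FPU exception caught: ', E.ClassName);
  end;
end.
```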

Warfley

  • Hero Member
  • *****
  • Posts: 1762
Re: Strange bugs with Advanced Records
« Reply #52 on: November 01, 2024, 10:07:57 pm »
I find all this very disappointing.

I've programmed in low-level languages like assembly and C, so I know about all these complexities.
But my view is that a high-level language should hide the low-level details so that you don't have to worry about them, and the code does what you would expect. Programmers should not have to know the deep internals of how data types are represented and copied to make their programs work.

If the programmer wants to gain the efficiency of copy-on-write and having the internals of records be pointers to external data objects, then I think that those features should be enabled by a switch, and with the switch having a warning that hidden dangers are present. Then expert programmers, who understand the implications, can turn the features on if they wish.

But I mean the problem is quite the opposite: the problem is not that arrays do hidden copy-on-write, but that they specifically do not. An array is just a dumb pointer that is copied around when you copy the record.
The problem here is specifically that FPC does not try to be smart and does not add copy-on-write (as it does, for example, with dynamic strings). If arrays had copy-on-write like strings have, your problem wouldn't exist; instead the issue is very specifically that FPC just does a dumb pointer copy and nothing more.
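The contrast can be shown in a few lines; a small self-contained demo:

```pascal
program CowDemo;
{$mode objfpc}

type
  TIntArray = array of Integer;

var
  s1, s2: AnsiString;
  a1, a2: TIntArray;
begin
  s1 := 'hello';
  s2 := s1;         // shares the buffer (refcount goes up)...
  s2[1] := 'H';     // ...but a write makes s2 unique first (copy-on-write)
  WriteLn(s1);      // prints: hello  -- s1 is untouched

  SetLength(a1, 3);
  a1[0] := 1;
  a2 := a1;         // only the pointer is copied, refcount goes up
  a2[0] := 42;      // writes straight into the shared buffer
  WriteLn(a1[0]);   // prints: 42  -- a1 sees the change
end.
```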

You could create a deepcopy function for simple records using RTTI. The problem of course arises when you have records where data is coupled to semantics, e.g. if you have a record that "owns" a pointer, just copying that pointer will not work. Meanwhile, when you have a record that holds a shared pointer, deep-copying that pointer breaks the sharing semantics.

ad1mt

  • Sr. Member
  • ****
  • Posts: 327
    • Mark Taylor's Home Page
Re: Strange bugs with Advanced Records
« Reply #53 on: November 02, 2024, 09:08:10 am »
An array is just a dumb pointer, that is copied around when you copy the record.
My point is this...

I did not understand the internals of how the compiler dealt with a record containing a dynamic array. Then when I wrote the code v1:= v2; procl(v1,v2); expecting the behaviour to be the same as for any other type, it did not work. The proc call overwrote v1 as well as v2, even though in the definition of the proc, v1 was specified as const.

This means that the behaviour of the code was breaking already established rules about the expected behaviour of const parameters in procedure calls, and is inconsistent.
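For reference, a minimal reconstruction of that surprise (the record and procedure names are illustrative, not the original code):

```pascal
program ConstSurprise;
{$mode objfpc}

type
  TRec = record
    I: array of Integer;
  end;

procedure Proc(const c: TRec; var v: TRec);
begin
  v.I[0] := 99;      // writes the buffer shared by both records
  WriteLn(c.I[0]);   // prints: 99 -- the "const" record observes the change
end;

var
  v1, v2: TRec;
begin
  SetLength(v2.I, 1);
  v2.I[0] := 1;
  v1 := v2;          // shallow copy: v1.I and v2.I point at one buffer
  Proc(v1, v2);
end.
```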

The implication of these kinds of complexities is that no-one can safely code in Pascal until they are an expert who knows everything, right down to all the obscure details of how the compiler works. I.M.O. this defeats the whole philosophy of Pascal as being a safe language.


440bx

  • Hero Member
  • *****
  • Posts: 4740
Re: Strange bugs with Advanced Records
« Reply #54 on: November 02, 2024, 09:24:47 am »
The implication of these kinds of complexities is that no-one can safely code in Pascal until they are an expert who knows everything, right down to all the obscure details of how the compiler works. I.M.O. this defeats the whole philosophy of Pascal as being a safe language.
No compiler can compensate for an inadequate level of knowledge in a programmer.

It is unfortunate that programmers are commonly misled/encouraged to believe the compiler can protect them from themselves.  There are plenty of ways to trip a compiler, any compiler.  The reason is simple: the compiler only has compile-time information; the guard rails it enforces are usually very easy to circumvent at runtime, and that does _not_ reflect badly on the compiler, it reflects on the programmer.  The programmer has to know what he/she is doing and cannot blame the compiler for not always catching code that won't work properly.
(FPC v3.0.4 and Lazarus 1.8.2) or (FPC v3.2.2 and Lazarus v3.2) on Windows 7 SP1 64bit.

Warfley

  • Hero Member
  • *****
  • Posts: 1762
Re: Strange bugs with Advanced Records
« Reply #55 on: November 02, 2024, 01:37:10 pm »
I did not understand the internals of how the compiler dealt with a record containing a dynamic array. Then when I wrote the code v1:= v2; procl(v1,v2); expecting the behaviour to be the same as for any other type, it did not work. The proc call overwrote v1 as well as v2, even though in the definition of the proc, v1 was specified as const.
That is not special knowledge, if you know how records work and you know how dynamic arrays work, it's exactly what you would expect. If you don't know either, you should read into that before using those types.

Basically a record is a simple composite type, where an assignment is equivalent to assigning all of the fields:
Code: Pascal  [Select][+][-]
  1. rec1 := rec2;
  2. // Is basically the same as
  3. rec1.Field1 := rec2.Field1;
  4. rec1.Field2 := rec2.Field2;
  5. rec1.Field3 := rec2.Field3;
  6. ...
It's a very dumb copy.

Dynamic arrays are reference counted pointers. An assignment of a dynamic array does not copy the array but just the pointer to the array:
Code: Pascal  [Select][+][-]
  1. arr1 := arr2;
  2. // Is the same as
  3. Pointer(arr1) := @arr2[0]; // Copy pointer
  4. IncrementRefcount(arr2);

So, putting these two facts together: a record which contains a dynamic array will copy all the fields of the record as if they were assigned individually, meaning that because the data of the array is not copied on assignment of an array, it also won't be copied when assigning a record.
In technical terms, it does a shallow copy.

There is absolutely nothing unexpected about the combination of those two, if you are familiar with each one individually
« Last Edit: November 02, 2024, 01:38:46 pm by Warfley »

Thaddy

  • Hero Member
  • *****
  • Posts: 16193
  • Censorship about opinions does not belong here.
Re: Strange bugs with Advanced Records - SOLVED
« Reply #56 on: November 02, 2024, 02:05:15 pm »
I have shown that many times, including in the wiki, but here again:
Code: Pascal  [Select][+][-]
  1. program reccopy;{$mode objfpc}{$modeswitch advancedrecords}
  2. type
  3.   Tmyrec = record
  4.     a,b,c:integer;
  5.     class operator copy (constref value:Tmyrec; var result:Tmyrec);
  6.   end;
  7. // in Delphi this one is called assign
  8. class operator Tmyrec.copy (constref value:Tmyrec; var result:Tmyrec);
  9. begin
  10.   // move (sizeof) takes only the fields, not the methods.
  11.   move (value,result, sizeof(tmyrec));
  12.   writeln('copy called');// <--- just to make it very clear...
  13. end;
  14. var
  15.   x,y :Tmyrec;
  16. begin
  17.   x.a := 100;
  18.   x.b := 200;
  19.   x.c := 300;
  20.   y:=x; //invokes copy
  21.   y.a := 500;// change a value, see what happens..
  22.   writeln (x.a:4, x.b:4, X.c:4);
  23.   writeln (y.a:4, y.b:4, y.c:4);  
  24. end.
You use the Copy operator to make a deep copy instead of the default shallow (pointer) copy on assignment; for records containing references, the operator body is where you would copy the referenced data rather than just the record's bytes.
I am almost sure ALL of this has been shown in this thread already.
Management operators are never called directly, they are hidden from view.
« Last Edit: November 02, 2024, 06:13:17 pm by Thaddy »
If I smell bad code it usually is bad code and that includes my own code.

 
