> Forum has [ quote ]... [ / quote ] blocks.
I know, but they are much longer to type (or require a third arm for the mouse, while the other two are on the keyboard).
> If var2 has a refcount of 0 before the call (due to some severe bug) then yes, such a watch will free the memory of var2. But it's the called function, not the debugger that does it.
Frankly, it is mostly an argument about definitions, amusing but usually meaning little. If the program died with a division-by-zero error, was it an error in the program, or an error in the function but not in the program? If the debugger calls a function that then unexpectedly destroys something, then from the user's point of view it was destroyed by the debugger. Just as prey is killed by the hunter, not by the bullet or by the rifle.
If we go really techy, then the function did nothing wrong; it was the inconsistent state of the passed parameter object that led to disaster. And then who was the actor, who gave that inconsistent value to the function? It was the watch window. On the next iteration it would be the unsavvy developer, who enabled function calls in the watch window without making sure the function works safely on incorrect data. And so forth the blame game goes.
From the application programmer's point of view, a function is the code he types. The programmer does not do refcounting; it is "Delphi" or "Lazarus" which does it automatically. The compiler developer would then shout "learn your tools", the AppDev would yell "fix your tools", and so it goes, making the question "who did it" senseless.
> of 0 before the call (due to some severe bug)
The thing is, the programmer (the program he writes) did not do any calls yet; he is tracing through the object's nested constructors (bonus points for doing it in the assembler window, not even the Pascal one). It is the debugger, then, which suddenly calls a function (like a property getter) before the object is ready for it.
So the "severe bug" is not in the program; there was just not enough understanding of the inner workings of the debugger. Or not enough forethought and sharp memory.
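A minimal sketch of that scenario (the class and member names are hypothetical, just for illustration): a watch on `Obj.Name` set while single-stepping through the constructor makes the debugger call the getter before the backing field exists.

```pascal
type
  TNameHolder = class
  private
    FData: TStringList;          // still nil during early construction
    function GetName: string;    // the getter a watch window may call
  public
    constructor Create;
    property Name: string read GetName;
  end;

function TNameHolder.GetName: string;
begin
  Result := FData[0];            // dereferences FData = nil -> crash or worse
end;

constructor TNameHolder.Create;
begin
  inherited Create;
  // Stepping here with a watch on "Name" already evaluates GetName,
  // although FData has not been created yet.
  FData := TStringList.Create;
  FData.Add('ready');
end;
```

The program itself never calls `GetName` before `FData` is assigned; only the watch evaluation does.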
> Teaching the debugger how to revert the effects of every possible call to the kernel or any library => nope
Not in Pascal, yes. It would need a different language.
But implementing intrinsics in a way that precludes side effects, and then not extending them in ways that would break that, is possible. That is what I wrote: special care has to be taken that intrinsics are free from side effects by design and by implementation.
Otherwise, the debugger's :length is worse than the RTL's Length. But if an intrinsic guarantees it will never trigger side effects (even on inconsistent program states and data structures), then that alone is a good reason to duplicate things. I think it should be a requirement for current and future intrinsics to never ever have side effects; better to fail than that.
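To illustrate why evaluating real program code from a watch is riskier than a read-only intrinsic (hypothetical names again): a perfectly legal lazy-initializing getter mutates the debuggee when the watch calls it, whereas an intrinsic that merely reads the field's memory cannot.

```pascal
type
  TConfig = class
  private
    FItems: TStringList;             // created on first use
    function GetCount: Integer;
  public
    property Count: Integer read GetCount;
  end;

function TConfig.GetCount: Integer;
begin
  if FItems = nil then
    FItems := TStringList.Create;    // lazy init: a watch on "Count" has
                                     // just changed the program under debug
  Result := FItems.Count;
end;
```

A side-effect-free debugger intrinsic would report `FItems = nil` (or a length of 0) without allocating anything, leaving the program state exactly as it was.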
> If it is a com interface.... But that must be known.
In Delphi Classic it is so by definition. But as soon as we go cross-platform, it can no longer be. At most it could have been Mozilla XPCOM, but I guess that also does not exist on every FPC-supported platform. So there is even less certainty in FPC than in Delphi/COM (where it is lacking already).
> And IIRC, fpc does not put that into the DWARF info
I don't know DWARF, but I also would not be sure FPC even could put that info there without breaking compatibility with some "old classic" tools.
Not all formats are extensible enough, and when they are - practical programs are not always forgiving...