
When fpDebug is debugging, Currency type data shows no decimal point


zhuyl:
Hello,
When debugging with fpDebug, Currency-type data shows no decimal point; for example, a = 12.48 is displayed as 1248.

Martin_fr:
Yes, it's a known issue.

It will currently happen with any debugger, as this is actually caused by the compiler.

In order for a debugger to work, the compiler adds "debug info", which among other things, describes all variables.
FPC's debug info identifies "currency" as an integer type (IIRC an 8-byte signed int).

That is also how fpc actually compiles this. Currency is stored as an int, not as a float. The value stored is multiplied by 10000, allowing for precisely 4 digits after the decimal point. Because it is not stored as a binary fraction, there is no rounding error (which is important for currency data).
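As a quick illustration (a minimal sketch, assuming FPC's standard Currency semantics), reading the raw bits of a Currency variable shows the scaled integer that the debugger sees:

```pascal
program CurrencyRaw;
{$mode objfpc}
var
  a: Currency;
begin
  a := 12.48;
  // Currency is stored as an Int64 scaled by 10000,
  // so the raw memory holds 124800 - exactly the kind of
  // value a debugger shows when the debug info says "integer".
  WriteLn(PInt64(@a)^);   // prints 124800
end.
```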

Internally FPC makes sure that the data is correctly handled and displayed.
But it does not include this info for the debugger.

Currently the only difference between the types Int64 and Currency is the name.
But the debugger cannot rely on this, since anyone can define their own type named Currency ("type Currency = Int64;"), and that would then NOT be a currency...
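To make that ambiguity concrete, here is a hedged sketch: a program can shadow the built-in type, and then a variable whose type is *named* Currency is just a plain integer with no fixed-point scaling:

```pascal
program NotACurrency;
{$mode objfpc}
type
  Currency = Int64;  // shadows the built-in fixed-point type
var
  a: Currency;
begin
  a := 1248;         // a plain integer; no implicit /10000 scaling
  WriteLn(a);        // prints 1248
end.
```

So a debugger that formatted every type named "Currency" as fixed-point would display this variable incorrectly.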

zhuyl:
I tested with the gdb debugger, and it displays the value correctly.

Martin_fr:
Just checked. Indeed my feedback was incomplete.


Fpc provides 2 types of debug info: Stabs and Dwarf (in different versions).

When Fpc writes Dwarf (version 2 or 3, and afaik also version 4), it declares an int value.
With Stabs it writes debug info that seems to be correct.
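For reference, the debug-info format is selected on the FPC command line with these flags (an assumption based on the standard FPC options; check `fpc -h` for your version):

```shell
fpc -gs  prog.pas   # Stabs debug info
fpc -gw2 prog.pas   # DWARF version 2
fpc -gw3 prog.pas   # DWARF version 3
```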

Unfortunately FpDebug only supports Dwarf.

Actually, from a brief look at the Dwarf spec, it seems it supports a proper declaration. Yet this still needs to be added to FpDebug. I will see when I can find time to add it, but even when added, it won't do anything until fpc follows up.

I tested fpc 3.2.3 64-bit Windows, and it did not use this, but described it as a plain int (using Dwarf).
Not tested with 3.3.1.


EDIT:
Too fast again.

The IDE-gdb translator actually has a hardcoded check for "currency".

But that means that if you declare your own type, it will also be shown with a decimal point. (It still depends on the debug-info version, upper/lower case, .... But it can happen; I managed to make it happen.)

--- Code: Pascal ---
type
  CURRENCY = 1..999999;
var
  a: CURRENCY;
---

Dwarf vs Stabs

This (according to what I was told) also affects how "file of ..." is displayed.

If you use "variant", then select "Dwarf 2 (with sets)" for fpdebug, as it does not display with Dwarf 3 yet.

Martin_fr:
I'll see if I can add a check to FpDebug, if I find time. It will probably be optional and default to off (because it can affect other data if Currency is re-declared).
