Literal values in hex notation are of a signed type. I believe this is documented somewhere (but I can't find it quickly right now).
As can be seen when you run this:

var
  t: qword = qword($FFFFFFFFFFFFFFFF); // without the cast the literal is int64; this is Delphi compatible
begin
  writeln(t);
end.
One reason is probably Delphi compatibility: older versions of Delphi didn't have a true uint64 type.
Then there is of course the ambiguity inherent in a hex literal. The compiler chooses signed; it could just as well have chosen unsigned, which would simply cause the reverse problem.
// this reveals it:
{$mode delphi}
begin
  writeln(GetTypeKind($FFFFFFFFFFFFFFFF)); // tkInteger ....
end.
Note that I filed a bug about GetTypeKind returning tkInteger here. It should of course return tkInt64, but that's another matter.
It is safe to say that a hex literal is a signed value by default.
[edit]
Marco informed me that since $FFFFFFFFFFFFFFFF equals -1, the compiler can encode it as any integer type that can hold -1, so the default 32-bit integer type is OK.
{$ifdef fpc}{$mode delphi}{$H+}{$endif}
program testintlit;
begin
  // all these print tkInteger, so the literal type is a signed integer
  writeln(GetTypeKind($f));                // nibble
  writeln(GetTypeKind($ff));               // byte or shortint
  writeln(GetTypeKind($ffff));             // word or smallint
  writeln(GetTypeKind($ffffffff));         // dword or integer (32 bit)
  writeln(GetTypeKind($ffffffffffffffff)); // qword or int64
end.
Note this is only the case for literals: the typekind of a variable is always determined by its declared size and sign.
But a qword-sized hex literal still needs a cast, qword($ffffffffffffffff), or you can use High(qword) instead of $ffffffffffffffff.
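To make that concrete, here is a minimal sketch (assuming FPC in objfpc mode) showing the signed literal next to the cast and High(qword):

{$mode objfpc}
var
  i: int64;
  q: qword;
begin
  i := $FFFFFFFFFFFFFFFF;        // accepted as-is: the literal is signed and equals -1
  q := qword($FFFFFFFFFFFFFFFF); // the cast is needed to get the full unsigned range
  writeln(i);                    // -1
  writeln(q);                    // 18446744073709551615
  q := High(qword);              // equivalent, and avoids the cast entirely
  writeln(q);                    // 18446744073709551615
end.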