Hello,
I just want to make sure I'm not making a wrong assumption about how the compiler bitpacks subrange types.
I expect that, when bitpacked, the following two ranges will be packed into the _same_ number of bits:
type
  RANGE_1 = 0..7;
  RANGE_2 = 0..4;
In both cases I expect 3 bits to be required: RANGE_1 covers 8 values, exactly 2^3, and RANGE_2 covers 5 values, which still needs 3 bits because 2 bits can only represent 4 values (bits 0, 1 and 2, ignoring ordering dependencies that may vary from one CPU to another). I'm seeking confirmation of that, and if it is not the case, an explanation that sheds light on the reason(s) why not.
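For reference, here is a minimal sketch of one way to check this empirically. It assumes Free Pascal: the bitpacked keyword and the BitSizeOf intrinsic are FPC-specific assumptions on my part (the question names no particular compiler), and other compilers may spell these differently or not support them at all.

program CheckBits;

type
  RANGE_1 = 0..7;
  RANGE_2 = 0..4;

  { A bitpacked record, so each field occupies only the bits it needs. }
  TBoth = bitpacked record
    a: RANGE_1;
    b: RANGE_2;
  end;

var
  r: TBoth;

begin
  { BitSizeOf reports the allocated size of a bitpacked field in bits. }
  WriteLn('RANGE_1: ', BitSizeOf(r.a), ' bits');  { expected: 3 }
  WriteLn('RANGE_2: ', BitSizeOf(r.b), ' bits');  { expected: 3 }
end.

If my assumption holds, both lines should print 3, since 5 values cannot fit in fewer than 3 bits.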
Thank you for your help.