I'd retroactively add another vote for FontForge. It's the fundamental editor for font files, in the same way that Inkscape is the fundamental (open-source) editor for vector illustrations.
I'm uncomfortable getting directly involved, but my suspicion is that Charset might mean slightly different things on different OSes, or at least in different graphical environments.
In a unix (strictly, X11) context, a font might be named (e.g. by xlsfonts) something like
-misc-fixed-bold-r-normal--0-0-100-100-c-0-iso8859-1
sometimes described (e.g. by xfontsel) as having the fields
-fndry-fmly-wght-slant-sWdth-adstyl-pxSize-ptSz-resx-resy-spc-avgWdth-rgstry-encdng
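If it helps to see those fields picked apart concretely, a trivial FPC sketch (nothing LCL-specific, and the name is just the example quoted above) would be something like:

program XlfdFields;
{$mode objfpc}{$H+}

uses
  Classes;

const
  // The example name from above, purely for illustration.
  Xlfd = '-misc-fixed-bold-r-normal--0-0-100-100-c-0-iso8859-1';

var
  Fields: TStringList;
  i: Integer;
begin
  Fields := TStringList.Create;
  try
    Fields.Delimiter := '-';
    Fields.StrictDelimiter := True;   // keep empty fields such as adstyl
    Fields.DelimitedText := Xlfd;
    // Fields[0] is the empty string in front of the leading '-', so the
    // fourteen XLFD fields end up in Fields[1]..Fields[14].
    for i := 1 to Fields.Count - 1 do
      WriteLn(i:2, ': ', Fields[i]);
  finally
    Fields.Free;
  end;
end.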
X11 widget sets will generally try to "do the right thing" if given a font specification for which they don't have a file, or if a field contains * as a wildcard.
The last two of those fields are the registry and encoding, and I've seen those two together referred to as the character set. So the example name I gave specifies the iso8859 registry and the 1 encoding.
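Pulling those two points together (wildcarding, plus the registry/encoding pair acting as the character set): the sketch below uses the xlib unit from FPC's x11 package and assumes the standard Xlib calls XOpenDisplay/XListFonts/XFreeFontNames, which is roughly what xlsfonts does under the hood. It wildcards every field except the last two, i.e. it asks the server for core fonts in a given character set:

program ListByCharset;
{$mode objfpc}{$H+}

uses
  ctypes, xlib;

var
  dpy: PDisplay;
  names, p: PPChar;
  count, i: cint;
begin
  dpy := XOpenDisplay(nil);
  if dpy = nil then
  begin
    WriteLn('Cannot open display');
    Halt(1);
  end;
  // Every field except registry and encoding is wildcarded, i.e. "give me
  // up to 50 core fonts whose character set is iso10646-1".
  names := XListFonts(dpy,
    '-*-*-*-*-*-*-*-*-*-*-*-*-iso10646-1', 50, @count);
  p := names;
  for i := 1 to count do
  begin
    WriteLn(p^);
    Inc(p);
  end;
  if names <> nil then
    XFreeFontNames(names);
  XCloseDisplay(dpy);
end.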
The implication there is that one of the registries is UTF and one of its encodings is -8. But that would appear to imply that the number of distinct codepoints in a character set is a function of the registry, and that the glyph placed at each codepoint is a function of the encoding.
However, as I've said, that could be OS-specific, and the LCL's behaviour could be influenced both by the OS and by Delphi's historic behaviour running on Windows.
In the case of X11, https://en.wikipedia.org/wiki/X_logical_font_description might help, plus the Flowers reference cited there. However, I can't find anything useful about assigning a numeric value to the character set: that might be a "Windows-ism", and might imply that the field is ignored on other OSes.
Ultimately, the "easiest" way of getting something definitive might be to trace into the LCL with a debugger, and seeing what different character sets do when they hit the underlying API.
MarkMLl