I use these two functions to convert strings to and from UTF-8, and they work fine (it's a Lazarus app, so strings hold UTF-8).
function StrUTF8ToEnc(const S: string; Enc: TSystemCodePage): string;
var
  buf: RawByteString;
begin
  if S = '' then
    exit('');
  buf := S;
  SetCodePage(buf, CP_UTF8, false); // label the bytes as UTF-8, no conversion (safe even if DefaultSystemCodePage differs)
  SetCodePage(buf, Enc, true);      // convert UTF-8 -> target code page
  SetCodePage(buf, CP_UTF8, false); // relabel so the assignment below does not convert back
  Result := buf;
  if Result = '' then
    raise EConvertError.Create('Cannot convert UTF-8 to DBCS code page');
end;
function StrEncToUTF8(const S: string; Enc: TSystemCodePage): string;
var
  buf: RawByteString;
begin
  if S = '' then
    exit('');
  buf := S;
  SetCodePage(buf, Enc, false);    // label the raw bytes with their real code page, no conversion
  SetCodePage(buf, CP_UTF8, true); // convert to UTF-8
  Result := buf;
  if Result = '' then
    raise EConvertError.Create('Cannot convert DBCS code page to UTF-8');
end;
Now I want to check, in a cross-platform way, whether an encoding is available, e.g. EUC-JP (code page 51932), EUC-KR (51949), EUC-TW (51950). I tried to detect this by checking whether my functions raise an exception for the text 'ABC' (StrUTF8ToEnc works better for this; the other one always reports 'ok'). This does detect the absence of EUC-JP on Windows 10, but it reports 'all ok' for EUC-KR and EUC-TW there, although I can see from the SynWrite editor that these two encodings are not supported on Windows 10. So my method is not reliable.
How can I test encoding availability in a cross-platform way?
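One idea I am considering (a sketch, not a verified solution): probe the code page by round-tripping a non-ASCII sample instead of 'ABC'. Plain ASCII converts losslessly under virtually every code page, so it can never expose a missing encoding; a character that the target encoding must be able to represent can. The helper below builds on my two functions above; the sample strings are my assumption:

```pascal
// Probe whether a code page is usable by round-tripping a non-ASCII sample.
// If the OS cannot convert (or silently drops/replaces characters), the
// round trip will not reproduce the original, or EConvertError is raised.
function IsEncodingSupported(Enc: TSystemCodePage; const SampleUTF8: string): boolean;
var
  Converted: string;
begin
  Result := false;
  try
    Converted := StrUTF8ToEnc(SampleUTF8, Enc);
    Result := StrEncToUTF8(Converted, Enc) = SampleUTF8;
  except
    on EConvertError do
      ; // Result stays false
  end;
end;

// Usage sketch: pick a character specific to each encoding, e.g.
//   IsEncodingSupported(51932, 'あ')  // EUC-JP, U+3042
//   IsEncodingSupported(51949, '한')  // EUC-KR, U+D55C
```

I am not sure this covers every failure mode (FPC may leave the bytes unchanged rather than returning an empty string on some platforms), which is why I compare the round-trip result instead of only catching the exception.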