And what specifically? That it's UTF-16? And doesn't MS itself state in its documentation:
"Wide characters encoded using UTF-16LE (for little-endian) are the native character format for Windows." (Support for Unicode)
Just because it's the Windows native character format doesn't make the decision good. It's also the native character format of the JVM, but Java the programming language leans on UTF-8 (source files and the class-file constant pool are UTF-8 based). High-level languages can use different concepts than their low-level machine interfaces. UTF-16 makes sense for kernels like Windows or low-level virtual machines like the JVM, because 1 char = 1 word (at least within the Basic Multilingual Plane) makes things very efficient. But it's absolutely terrible for developers of high-level applications that have to interface with users.
The Unicode space has code points for roughly 100k Chinese characters alone, not counting any other scripts. A single UTF-16 code unit can only hold 65,536 values. How is that solved? By introducing surrogate pairs, basically an additional layer of information encoding how the code points themselves are encoded... But sure, we totally solved code pages with UTF-16; we're fully international and don't have to exchange encoding meta-information anymore.
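To make that surrogate-pair workaround concrete, here's a tiny sketch (in Python, since that's where this thread ends up anyway) showing that a single code point outside the Basic Multilingual Plane needs two 16-bit code units:

```python
# A code point above U+FFFF cannot fit in one 16-bit code unit, so UTF-16
# splits it into a high/low surrogate pair.
import struct

ch = "\U00020BB7"  # a CJK Extension B ideograph, i.e. a code point above U+FFFF
units = struct.unpack(">2H", ch.encode("utf-16-be"))
print(len(ch), "code point ->", len(units), "UTF-16 code units:", [hex(u) for u in units])
# 1 code point -> 2 UTF-16 code units: ['0xd842', '0xdfb7']
```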
Delphi is not a language for low-level kernel development; it's a language for interactive, user-facing applications. And there, UTF-16 is a terrible choice.
The academic world didn't use Matlab? Strange, at my university they used Matlab. Matlab has a much better reputation than Python. We all know that there is another reason for using Python.
I mean of course, the main driving factor for the switch to Python is cost. Matlab costs universities thousands of euros per year per student; Python is free. But I studied computer science in the 2010s, and while in the early 2010s, say 2014, you'd still see a lot of Matlab, by 2016 roughly half of the chairs at least provided all the lecture material for both Matlab and Python. Like I said, Python blew up around 10 years ago; the big corporate investment came around 2018 onwards.
As for statistical calculations (and expensive packages like SPSS, Statistica), that's what R was created for (and has been available for many years).
I mean, you can use R, there are specialized languages for everything, but whenever I worked with non-computer scientists, e.g. when I did my master's thesis in a project chaired by electrical engineers, or worked during my master's on some medical research, they all used Python. Because to non-computer scientists it's easy, intuitive and provides all the tools they need.
As for the use of Python for calculations by mathematicians and physicists, I would rather disagree. There is Matlab, yes, its licenses are expensive. Octave and Scilab have existed for many years. As for Python's mathematical libraries - aren't they written in C (and C++)?
Mostly even Fortran. Python is a glue language. You don't write big algorithms or heavy number crunching in it; you use it to take libraries like numpy or pandas, which are backed by Fortran or C/C++, and just use Python to organize the data you want to perform the computations on. The libraries are the machines that do the work, and Python is the assembly line putting the pieces together.
And it's really great for that. If your problem is "I have a bunch of output files with data and need to crunch the numbers using standard tools and do some plotting", Python is absolutely perfect. That's the reason it became the language of choice for machine learning/AI. The heavy code lives in libraries like torch, and the actual models are often only around 100 lines of glue code that take some activation functions and backpropagation methods and wire them together on top of the torch machinery.
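Just to illustrate the glue pattern; this is only a sketch of that workflow, and the file name and column names are made up:

```python
# Hypothetical "crunch some output files and plot" workflow:
# pandas/numpy (C- and Fortran-backed) do the work, Python just wires it up.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("measurements.csv")                          # hypothetical output file
summary = df.groupby("sensor")["value"].agg(["mean", "std"])  # number crunching happens in compiled code
print(summary)

df.plot(x="time", y="value", kind="line")                     # quick plot of the raw data
plt.savefig("values.png")
```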
In the time it would take me to set up a Pascal project to develop some AI image recognition model, I'm probably already finished in Python, because setting up a neural network is just a few lines of code.
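For example, a minimal toy network in PyTorch really is only a handful of lines; the layer sizes, hyperparameters and fake data here are made up, it's just to show how little glue code is involved:

```python
# A minimal sketch (not a full training script) of a tiny image classifier in PyTorch.
import torch
from torch import nn

model = nn.Sequential(              # the heavy lifting lives inside torch itself
    nn.Flatten(),                   # 28x28 grayscale image -> 784-dim vector
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),             # 10 output classes
)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# one training step on a fake batch, just to show the glue pattern
images = torch.randn(32, 1, 28, 28)        # stand-in for real data
labels = torch.randint(0, 10, (32,))
loss = loss_fn(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```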
The JavaScript problem is somewhat mitigated by the presence of TypeScript. I say "somewhat" because various computer baboons try to "glue" together monsters using "Electron" and similar "inventions".
I mean, TypeScript solves some of the problems, but because it's a superset, the underlying problems aren't solved. Things like isNaN vs. Number.isNaN, two functions that look the same but behave slightly differently on IEEE edge cases (the global one coerces its argument first, so isNaN("foo") is true while Number.isNaN("foo") is false), or loose equality with == versus ===, and all that fuss. It's just inherently broken.
Also, I personally don't think Electron apps are broken. VSCode is an Electron app and by now my favorite editor. It's bad Electron apps that are broken. When someone knows what they are doing (or, in the case of VSCode, Microsoft just throwing enough money at it), it works fine. A clean VSCode starts up in around 100-200 ms. The reason most Electron apps are slow and bloated is that they are badly designed: slow, low-performance code, thousands of dependencies, etc.
That's the real JavaScript hell: the fact that 70 million people a week download a dependency called is-number, which at its core is nothing other than "!isNaN(var)" (or "!Number.isNaN(var)", because JavaScript xD).