So the problem will grow (regardless of the UTF-8 vs. UTF-16 vs. UTF-32 issue).
But there's a difference between maybe running out of codepoints in the future, and knowing while you write your standard that you don't have enough codepoints. When the UTF-16 spec was written they already knew it was not sufficient; that is why Unicode planes exist. Microsoft has an excuse: they started implementing before UTF-16 was finalized, and before UTF-8 was available. Embarcadero doesn't.
And if your goal is to create something to get rid of codepages, and you then re-introduce code pages again (granted, in a slightly less awful form), you have failed. Except for the MS .Net languages and Delphi, no other popular language uses UTF-16. C, C++, Java, Go, Swift, Rust, etc. are all UTF-8 based (often with UTF-32 compatibility). Microsoft was in a hurry and had to go for UTF-16 because UTF-8 was not finished at the time they developed their NT kernel. Embarcadero had over 10 more years to look at the situation, at a moment when everyone else had concluded that UTF-16 was a failure, and said: Failure you say? Let's go!
And it's not hard to have transparent conversion between UTF-8 and UTF-16 so you can work with the underlying APIs. Java does this (as I said, the JVM is UTF-16 based, the language is UTF-8 native), FreePascal does it, and hell, even Windows does it, because even Microsoft noticed that everyone else uses UTF-8 and, starting with Windows 10, made all their ANSI string APIs UTF-8 compatible (internally they just translate to UTF-16). So compatibility with Windows is very bad reasoning, because conversion between UTF-8 and UTF-16 is completely trivial. If Embarcadero wants, I'll give them the function for free:
CodePoint := DecodeUTF8(sequence);     // decode one UTF-8 sequence into a code point
Plane := CodePoint div (1 shl 16);     // $10000 code points per plane
u16Char := CodePoint mod (1 shl 16);   // offset within that plane
I believe if everyone else can, even the developers at Embarcadero can manage this.
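To be fair, the plane/offset split above is not the whole mapping: code points above $FFFF additionally need the surrogate-pair offset before they become UTF-16 units. A minimal sketch of the full arithmetic (written in Python purely for illustration; the same few operations translate directly to Pascal):

def utf16_units(code_point: int) -> list[int]:
    if code_point <= 0xFFFF:
        return [code_point]            # BMP: a single 16-bit unit
    v = code_point - 0x10000           # 20 remaining bits, split 10/10
    return [0xD800 + (v >> 10),        # high surrogate
            0xDC00 + (v & 0x3FF)]      # low surrogate

For example, utf16_units(0x1F600) yields [0xD83D, 0xDE00], exactly the pair of 16-bit units the Windows APIs expect for that emoji.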
I mean, you can use R, there are specialized languages for everything, but whenever I worked with non-computer-scientists, e.g. during my master's thesis in a project chaired by electrical engineers, or during my master's on some medical research, they all used Python. Because to non-computer-scientists it's easy, intuitive and provides all the tools you need.
At least in academia, yes it is. When I left academia, every engineering and medical chair had made the switch to Python. No more R or Matlab. Within about 4 years all alternatives nearly completely died out, and since I was working at some of these chairs, I can tell you it wasn't for nefarious reasons; it was just that Python was the new thing that all the students wanted to work with. Python has simply won.
And just from that, I don't think it can be a bad language: when a language completely captures multiple domains within a few years (and this was all prior to the big investments of MS and co), all by itself, there must be something to it.
For example, what bothers me the most is the lack of types
Good news for you then: this was introduced 10 years ago, with PEP 484 in 2014.
# Untyped
def Add(x, y):
    return x + y

# Typed
def Add(x: int, y: int) -> int:
    return x + y
While not enforced by the runtime, a static type checker like mypy will reject code that violates these annotations.
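For example, a mismatched call to the typed Add above gets flagged (mypy's message paraphrased from memory):

Add("1", 2)  # mypy: error: Argument 1 to "Add" has incompatible type "str"; expected "int"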
constants
Similarly, PEP 591 (accepted for Python 3.8) introduced Final types:
from typing import Final

MY_CONST: Final[int] = 42
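Reassignment is then rejected by the type checker (again paraphrasing mypy's output):

MY_CONST = 43  # mypy: error: Cannot assign to final name "MY_CONST"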
poor class emulation
Because it's not an OOP language; it was never intended to have classes, at least not in the OOP sense. Python is a templating language, which you can use to emulate OOP, but the criticism is fair: if you want to do OOP, you should use an OOP language. Complaining about it otherwise is like driving your car into the lake and then complaining to the manufacturer that it's very bad at swimming.
The need to configure the entire ecosystem on each computer where you want to run something written in Python. It wastes too much time. And no, Docker (or other such "inventions") is not a solution to this problem; it is just a clumsy and naive attempt to get around a serious problem. Problems are solved permanently and effectively, not by applying a thick layer of putty so that the cheapness does not show through.
May I remind you that on this forum, the usual answer to the question of how to deploy Lazarus applications on Linux is either: let the user install all the dependencies according to their distro,
or: use a container like Docker/Flatpak/Snap, but there is nearly no documentation on doing that, so good luck.
I can't find the last such thread, but just a few weeks ago someone asked that question and did not get a sufficient answer. Deploying software with dependencies is hard. And tbh. it's much harder for Lazarus, where you need to install native libraries, than it is for Python, where pip does all the work for you.
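To illustrate (package names here are just a hypothetical example): you ship a requirements.txt next to your program,

numpy>=1.26
matplotlib>=3.8

and on the target machine a single command pulls everything in:

pip install -r requirements.txt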
This is caused by the simple laziness of people to learn. Nothing more. I have a very unpleasant experience with using "programs" written using HTML/CSS/JS (Electron, etc.). It's a poor emulation of real software. Moreover, it is bloated and resource-hungry in its operation. I agree that writing an "application" using Electron is incomparably easier than using, for example, Qt in C++. But if we go this way, then everything will be manufactured this way in a moment: houses, household appliances, cars, etc. From cardboard, string and plastic foil. This is technological regression, not progress.
Buuuut... it works, right? Today there are more systems relying on this "bad" technology than there was total software in circulation back in the 00s, when big heavy OOP languages were the shit. And it works really well. Yes, my TV runs on web technology, but it works fine and has for the past 5 years, and I expect it to still work fine for the next 10.
Meanwhile, in the 2000s all of that stuff was running Java, a language built around the software engineering best practices of the time, and tbh. looking only at the products, not at my personal opinion about the technology, I only see improvements, not regression.
Can you really say that everything is being developed worse, when the final products just keep getting better? To me it seems that this technology is perfectly well suited for the job. My TV would not be better if the apps on it were written in C++ or Java instead.