Well, it translates to something like "millions of flies can't be wrong, so...". Just because people have some judgment (often naive) doesn't mean they're right.
In matters of taste, sure they are. In the end it's mostly that: a question of taste. Sure, we could talk about how development in some languages is more productive or less error-prone than in others. I'm well aware that weakly typed languages generally have a higher error ratio than strongly typed ones. But nobody decides to use a language because of that, otherwise we would all be using functional languages, as they have generally been shown to decrease development time and the number of bugs, but we don't.
We choose the languages we are most comfortable with, because even the best language doesn't help if the developer hates working with it.
I would argue that if there is such a thing as the perfect language, it's probably LISP. After all, it's one of the first programming languages ever created, it is still in use today with many diehard fans, and it has never gone through any major revision of its core principles. A language that's so good from the very beginning that it never needed to change is pretty much the perfect language.
Yet you will never see me program in LISP because I really hate working with that language. It's perfect, but also terrible.
So yeah, in the end the most important thing about a programming language is how much programmers like it, and people really like Python.
This is a hint to the interpreter. There is no need to declare types (like in C++ or Object Pascal).
It's not even that: the default interpreter simply ignores type hints. They are meant for additional tooling and compilers to perform the checking, though you can also add runtime checks for them. The language provides the syntax through PEP 484, but to actually make use of it you need additional tooling.
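For example (a minimal sketch; mypy is just one of several external checkers you could use), the stock CPython interpreter runs the mistyped call below without complaint and simply concatenates the strings, while a static checker would flag it:

def add(a: int, b: int) -> int:
    return a + b

# CPython ignores the annotations at runtime, so this happily
# returns the string "12" instead of raising an error.
print(add("1", "2"))

# A hand-rolled runtime check; third-party libraries can generate
# checks like this automatically from the hints.
def add_checked(a: int, b: int) -> int:
    if not isinstance(a, int) or not isinstance(b, int):
        raise TypeError("add_checked expects two ints")
    return a + b

Running mypy over the same file (pip install mypy, then mypy yourfile.py) reports the bad call as an error, which is exactly the "additional tooling" part.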
But needing additional tooling is kinda normal; FPC will also not work (at least on Unix) if you don't have a linker installed.
These problems with implementing applications (and therefore Lazarus) on Linux are (unfortunately) the result of design choices made by its creators (Torvalds et al.). This cannot be changed now (unless someone decides to revolutionize the Linux architecture, which I highly doubt, and there would also be a revolt by most of its users, mainly server users - and they would probably be right). In the case of Python, there are also various dependencies and versions of libraries and interpreter configurations. Besides, this Python problem is also present in Windows (but less severe).
I mean, that's always the problem with dependencies. Either you centralize like Linux does, which means you must trust the user to be able to figure out how to install dependencies, or you decentralize like Windows or macOS does and ship the same dependencies with each application.
Both have advantages and drawbacks, but Python allows both. You can either require the user to install dependencies globally (e.g. using pip globally or the system package manager) or to install them locally (using a virtualenv).
I find Python actually refreshingly easy: you clone a Python repo and all you need to do is:
$> virtualenv .venv
$> source .venv/bin/activate
$> pip install -r requirements.txt
and you are done.
Sure, I will have multiple copies of numpy on my machine because every project installs it separately, but that's pretty much the same as DLL hell on Windows.
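If you want to convince yourself that each project really does get its own copy (assuming numpy is listed in requirements.txt), ask the activated venv's interpreter where it imports it from; the printed path should point inside the project's .venv directory rather than a system-wide location:

$> python -c "import numpy; print(numpy.__file__)"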
Generally speaking, ecology is important. In this case, script solutions are energy-hungry. There is no forward thinking here. For now, this junk has no access to medical, aviation or military solutions. The problems will begin when some CEO decides that he can save money on expensive C++ programmers and hires cheap students and Python or JavaScript enthusiasts.
It's always a question of scale. In a TV the thing that consumes all the energy is not the CPU running inefficient scripts. It's the big screen producing light and requiring cooling.
Even if you wrote completely optimized, handcrafted assembly code for the TV software, you would probably not shave even a single-digit percentage off the energy consumption. I would even reckon that the energy spent on the additional development effort would exceed the total savings.
And the thing with medical or aviation stuff, I don't know what you are on about. That's just throwing ideas around; point me to a real problem. Because I have seen software used for aviation control, and there they use very different mechanisms. Namely, they very heavily go the route of formal verification. So it doesn't matter if it's Python code or anything else (in fact they usually use their own visual and domain-specific programming languages, as they are mostly engineers), because provably correct code is correct code.
JavaScript, on the other hand, should not be used outside websites. We all know why, even though some people like it. Besides, for today's needs, it is insufficient. Something new would be useful (WebAssembly was supposed to be, but I guess Google or Mozilla are no longer interested in it).
There is a good reason why web technologies are used in many things that are not websites: cross-platform compatibility. You only need to port a browser and everything works exactly the same. That's why they are used where Java was used previously: radios, TVs, fridges, phones, etc.
I mean, we probably all have written GUI applications with Lazarus, and I can't count how often I ran into bugs because Qt has different behavior than GTK or the Windows widgets, and don't get me started on Cocoa.
Building something that needs to run on dozens of different small OSes is very hard, and emulated environments like web technology completely circumvent that problem. Build your app in JavaScript and it will run the same on any device. Build it in Lazarus and your development time scales linearly with each platform added.