If and only if there's a full/headless browser present. So same inherent issues as the Java Runtime and the .NET runtime without adding any value.
The thing that adds the *value* of cross-compatibility adds no inherent value? Are you kidding me? Cross-compatibility is the value, and you even acknowledge that. This is exactly why languages like Java, Python or JavaScript are so popular: you install just 150 MB or so once, and from that point onwards you can use any application built with that technology.
If you look at the most popular languages according to their prevalence in open source projects on GitHub, the three most popular languages by a long shot are JavaScript, Python and Java, together making up nearly 50% of all repositories.
Also, this argument does not work at all, because if you have ever used Linux, you might have noticed that Linux does not provide any native system APIs for creating graphical user interfaces. You need at least a display server, and modern applications don't even talk to the display server directly, but go through a framework like Qt or GTK.
So any argument against a runtime environment must be able to explain why relying on a 150 MB Electron installation is bad, while relying on a 150 MB Qt 5 installation is no issue at all.
HTML was not created with this in mind. With the craze of browser stuff, it's been mangled to accommodate it, with no real sense of future-proofing or even standardization.
Originally it was not, but things change. Today the W3C consortium designs future versions of web technology with exactly this in mind. And as for the claim that there is no standardization: HTML and CSS are both standardized by the W3C. The latest version of the HTML standard, HTML 5.2, was released in 2017:
Link. The latest version of the CSS 2.1 standard was published in 2016:
Link. The CSS 3 standard is not finalized yet, but the consortium is actively working on it, with many features already stable.
JavaScript is standardized under the name ECMAScript by Ecma International, and the latest version, ECMAScript 2020, was released just last year:
Link. All these standards are developed in close cooperation with browser vendors and web developers, and are designed to serve the emerging needs of the user base.
So this point is just plainly and factually wrong, front to back. Pretty much everything on the web is standardized, and graphical capabilities are a major concern in the standardization process.
It is also a bit bold for someone who uses Pascal to criticize a lack of standardization, because Pascal is one of the few languages where standards exist but everyone decides to ignore them completely.
I still think it's a craze/fad, but I'm not enough of a futurist to see its cycle end. But granted, there is a crowd that has chosen to board that train.
You can think what you want; personally, I also don't like web technology, because everything feels like a reinvented wheel. Take storing data for offline usage: instead of providing an internal filesystem, the browser deploys a complete database system, which turns the very simple act of storing and loading files into a real pain in the neck.
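To illustrate the kind of reinvented wheel I mean, here is a minimal sketch of persisting a small settings object in a browser. The names `saveSettings` and `loadSettings` are hypothetical, but the pattern is the standard one: with no filesystem available, even a trivial "write a file" becomes a serialize-into-a-key/value-store dance via `localStorage` (and anything larger pushes you into IndexedDB, a full asynchronous database):

```javascript
// Hypothetical sketch: persisting a settings object without a filesystem.
// In the browser the store is localStorage (string-only key/value pairs);
// the tiny fallback object just lets the sketch also run outside a browser.
const store = (typeof localStorage !== "undefined") ? localStorage : {
  data: {},
  setItem(key, value) { this.data[key] = String(value); },
  getItem(key) { return key in this.data ? this.data[key] : null; },
};

function saveSettings(settings) {
  // what would be a one-line file write: serialize by hand, then store
  store.setItem("settings", JSON.stringify(settings));
}

function loadSettings() {
  const raw = store.getItem("settings");
  return raw === null ? null : JSON.parse(raw);
}
```

Note that everything must round-trip through strings by hand; there is no notion of paths, directories or streaming, which is exactly why this feels like a wheel reinvented badly.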
But a large community is a major argument for a development environment. It means more libraries, more users using and testing them, more support, and less risk of losing that support in the future.
But in the end, native only depends on the OS. It does not depend on the interpreting platform (JavaScript/Python/Perl/PHP).
If for some wacky reason the interpreting platform goes away, so will all the effort put into it.
If the OS goes away, errrmmm, well, then the computer doesn't work and the argument is moot.
You can't be serious here. Have you never updated your computer? If Windows XP goes away, I can still use the computer with Windows Vista, 7, 8 or 10, but there will be software that no longer runs. What's the difference between software that no longer works because the only OS it ran on went EOL, and a program that no longer works because the only interpreter it ran on went EOL?
Except that it is often much easier to run outdated interpreting environments on a modern machine than it is to run outdated operating systems. Windows XP simply would not work on my new PC, but from personal experience I can tell you that I was able to easily run Java 5 on my new machine, which I needed to run a very old application locally.
Adding another layer of complexity and resource-hogging software is not something I'm very interested in, even if all your bullet points were true.
Is it harder going the compiled way? Well, it's bloody easier than in the assembly days, and the IDEs help a bunch.
Yes, it is harder in certain circumstances. How about a simple experiment: we both write a calculator, I using only web technology, you fully native. The requirement: it has to run on Android, iOS, macOS, Linux and Windows. Who do you think will have to put in more work? I am pretty sure that using web tech I could do it in about 10 minutes.
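To make that concrete, here is a hedged sketch of what the web version boils down to. The function name `calculate` is hypothetical; the point is that the entire arithmetic core is a few lines of plain JavaScript, the UI is just a handful of HTML buttons wired to it, and the same file runs unchanged in any desktop browser, in an Electron shell, or in a mobile WebView:

```javascript
// Hypothetical sketch of the calculator's core logic. The UI on top would
// just be HTML buttons calling this function — identical on every platform.
function calculate(a, op, b) {
  switch (op) {
    case "+": return a + b;
    case "-": return a - b;
    case "*": return a * b;
    case "/": return b === 0 ? NaN : a / b; // avoid Infinity on divide-by-zero
    default: throw new Error("unknown operator: " + op);
  }
}

// calculate(2, "+", 3) → 5
```

The native route, by contrast, needs a separate UI layer (and usually a separate build) per platform before a single button appears.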
@Warfley: A compiled 32-bit app made in the Windows XP era can still run in all its glory on a 64-bit Windows 10 machine of today.
This is a property of Windows, which Microsoft pays a lot of money to maintain. It does not work on Android, iOS, Linux or macOS. You basically picked the single platform where this holds.
To me, your whole post reads like you are talking about Windows and Windows alone. But that's not what reality looks like. Windows is not the be-all and end-all of software development; it isn't even the most used operating system anymore, Android is. Tablets have taken over a lot of the market from home PCs, and Android and iOS together make up 56% of the market.
Even on desktops Windows is falling, especially in the US: ChromeOS already makes up nearly 7% of the market, macOS another 30%, leaving Windows with only about 60%.
Modern software must be designed to reflect this change. Most applications must run on mobile; for many, like mail clients or chat services, this might even be the most important requirement. Corporations are increasingly using Chromebooks, for which web development is the native platform.
Sure, you can say that you personally only develop for Windows, in which case you are right: there is no reason to use web technology. This is why Microsoft still develops native apps for Windows using C++ or .NET rather than web technology. But don't act surprised that people with different requirements might choose a different development platform.
If you try to run your power-hungry, memory-hogging HTML5+CSS+JS web app on an Internet Explorer of the time, you're shit out of luck, aren't you?
How can you complain about standardization earlier and then bring up Internet Explorer, which was notorious for ignoring the HTML, CSS and JavaScript standards? But that aside, this comparison is very weak, because it is exactly the reverse of your previous claim. You develop apps for the current system, not for the past. A modern 64-bit Windows 10 application won't run on 32-bit Windows XP either.
To make the analogy fit, take a website written in XHTML from 2003 and run it in a modern-day browser. It still works really well, because backwards compatibility is also a major concern for browser developers. In fact, a web app from 2010 will still work in a modern browser, while a native Android app from 2010 will not work on a modern Android device. This is again a point for web tech and against native.
This whole Web craze really looks like the way Microsoft kept pumping out new versions of Windows, just to make people upgrade their hardware...
Like Marco says, if you have a state-of-the-art gaming rig, then you can run, PROBABLY, 2 Electron apps at once.
If you still have Windows 8 and don't update your hardware, it's 1 Electron app and the computer turns into a heat exchanger!!!
Which is completely wrong. VSCode, one of the most complex Electron apps out there, runs completely fluidly even on my 7-year-old MacBook Air, much better than, for example, Xcode, which is a native app developed by Apple themselves.
In fact, I have VSCode open nearly all the time, and I don't notice it at all.
Yeap, that rando on the internet that disagrees... Sorry!!
Well, disagreement is one thing, but what bothers me about your response is that it is factually wrong on about half of your points. Your claim that the thing that provides cross-compatibility, which is a value, has no inherent value is plainly a contradiction. The claim that there are no standards and that HTML is not developed for complex graphical usage is also just wrong, and the idea that if an OS goes EOL or "disappears" you can't use your computer anymore is just comically stupid. Your last point about backwards compatibility completely missed the mark because of an analogy that simply does not work; it is even ridiculous to entertain, because backwards compatibility is one of the major concerns of web development, and browsers are in that regard even better than Windows.
I don't like webdev. I think its practices are often overcomplicated, reinvent the wheel on nearly every issue, and disregard the best practices that grew out of native development. I think JavaScript is a stupid language that should be burned to the ground and completely reworked, and the idea that you should include a whole framework just for one line of code, which is common practice in web development, is just beyond me.
But it has its merits, and to conclude here: every argument of yours that does not boil down to "I don't like it" (which is completely fine) relies on misconceptions, faulty comparisons or being factually wrong.
You seem to have a very strong opinion on the matter, but it is built upon misconceptions and an outdated view of the technology, and the points you are bringing up are, for anyone who knows even a little bit about the technology, just comically wrong.