Nowadays, we come across many websites where the pages are too slow. They take a while to load simple content like thumbnails, text, and pictures. Most of the time, these pages are slow not because of your internet speed, but because of the technology the websites use for serving content.
I mean, this post makes no valid argument against JavaScript; there are no benchmarks or anything, just an opinion.
I don’t personally like webdev and don’t like to code in JavaScript, but there are good and bad web applications out there, just like with any other software.
The author seems to know the real problem, so I don’t know why they’re blaming it on JavaScript.
Because only JS is able to do that in a web browser. Everything else is just a dependency tree.
A page could load thousands of images and thousands of tiny CSS files.
None of that is JS; all of it is just loads of extra requests.
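To make that concrete, here's a toy sketch (Node.js, with hypothetical asset names) that generates a page containing zero JavaScript yet forces the browser to make a thousand separate requests when it loads:

```typescript
// Generate a static HTML page that contains no JavaScript at all, yet
// forces the browser to make ~1000 separate requests when it loads.
// thumb-0.png .. thumb-999.png are hypothetical asset names.
import { writeFileSync } from "node:fs";

const imgs = Array.from({ length: 1000 }, (_, i) =>
  `  <img src="/thumbs/thumb-${i}.png" alt="thumb ${i}">`
).join("\n");

const html = `<!doctype html>
<html><body>
${imgs}
</body></html>`;

writeFileSync("gallery.html", html); // 1000 requests, 0 lines of JS
```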
Never mind WASM. It’s a portable compiled binary format that runs in the browser. Write that code in C#, Rust, Python, whatever.
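For what it's worth, the browser-side loading code is the same no matter what language the module was compiled from. A minimal sketch, assuming a hypothetical `add.wasm` that exports an `add(a, b)` function:

```typescript
// Instantiate a WebAssembly module in the browser. The .wasm binary could
// have been compiled from Rust, C#, C, whatever; the loader doesn't care.
// "add.wasm" and its exported add() are hypothetical names for this sketch.
async function loadAdd(): Promise<(a: number, b: number) => number> {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch("/add.wasm") // must be served as Content-Type: application/wasm
  );
  return instance.exports.add as (a: number, b: number) => number;
}

loadAdd().then((add) => console.log(add(2, 3))); // -> 5
```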
So no, JS is not the only way to poorly implement API requests.
Besides, HTTP/2 has connection reuse. If the hostnames resolve to the same IP and are covered by the same TLS certificate, additional API/file requests are sent over the already-established TLS connection, cutting out the overhead of extra handshakes.
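You can watch that reuse happen with Node's built-in `http2` client: one handshake, then every request is just a new stream on the same session. A rough sketch (example.com stands in for any server that speaks HTTP/2):

```typescript
// One HTTP/2 session = one TCP + TLS handshake. Every request below is a
// new stream multiplexed over that same established connection.
import { connect } from "node:http2";

const session = connect("https://example.com");

function get(path: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const req = session.request({ ":path": path });
    req.on("response", (headers) => console.log(path, "->", headers[":status"]));
    req.on("error", reject);
    req.on("end", resolve);
    req.resume(); // drain the body; only the status matters here
    req.end();
  });
}

// Three requests, zero additional handshakes.
Promise.all([get("/"), get("/a"), get("/b")]).then(() => session.close());
```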
Your dislike is of badly made websites and of the prevalence of the browser as a common execution platform, and it’s wrongly directed at JS.
That’s not necessarily special to JS. It’s special to client-side code. A mobile app written in Swift could do this. A CLI tool written in any language could do this.
This isn’t an argument against JS, it’s an argument against misuse of client resources.
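For example, here's the anti-pattern as a standalone Node script (the endpoint is hypothetical): tight polling, no caching, no backoff. Nothing about it needs a browser; the same loop is five lines in Swift or Go:

```typescript
// The classic client-side resource misuse: tight polling with no caching,
// no conditional requests, no backoff. Nothing here is JS-specific.
// https://example.com/api/status is a hypothetical endpoint.
async function pollForever(): Promise<never> {
  while (true) {
    const res = await fetch("https://example.com/api/status");
    console.log(new Date().toISOString(), res.status);
    await new Promise((r) => setTimeout(r, 100)); // ~10 requests/s, forever
  }
}

pollForever();
```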
Edited my comment to include the excruciatingly obvious assumption.