Web Programming seems to finally be standardizing. How did we get here?
2022-08-22

🏷️ blog 🏷️ www
entirely by accident

An old-ish trick to speed up webpages is using sendfile() to DMA files straight to a socket. Nowadays you use SSL_sendfile() and kernel TLS (optionally offloaded to a specialized TLS processor), but you get the idea. Skip the round-trip through userspace and just vomit data out the NIC.
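
For contrast, here is the path sendfile() exists to skip, sketched in Node terms (file name and port are arbitrary): every chunk of the file is copied from the kernel into a userspace buffer and then back into kernel socket buffers.

```typescript
import { createReadStream } from "node:fs";
import { createServer } from "node:http";

createServer((_req, res) => {
  res.writeHead(200, { "Content-Type": "text/html" });
  // Each chunk round-trips through userspace here. A sendfile()-based
  // server instead hands the kernel a (fd, offset, length) triple and the
  // data never leaves kernel space; kTLS extends that to encrypted sockets.
  createReadStream("index.html").pipe(res);
}).listen(8080);
```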

Couple that with the natural speed benefit of the "god algorithm" (already knowing the answer, i.e. caching) and the strength of static rendering frameworks becomes clear. That said, static renderers didn't really catch on until recently, and even now dynamically rendered pages are the overwhelming majority of the web. This is because building a progressive render pipeline that is actually fast, and that correctly invalidates caches at each step, is not an immediately obvious design.
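
A minimal sketch of that "already knowing the answer" step, assuming nothing about any particular framework: key each render on a hash of its inputs, and a second request for unchanged data becomes a dictionary lookup.

```typescript
import { createHash } from "node:crypto";

const cache = new Map<string, string>();

// template is a stand-in for whatever your render engine does.
function renderCached(template: (data: unknown) => string, data: unknown): string {
  const key = createHash("sha256").update(JSON.stringify(data)).digest("hex");
  const hit = cache.get(key);
  if (hit !== undefined) return hit; // the god algorithm: answer already known
  const html = template(data);
  cache.set(key, html);
  return html;
}
```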

Templating engines tend to encourage this approach, as they all have some kind of #include directive. The step from there to static renders requires integration with the data model, so that re-renders can detect changes in the underlying data. Just as strict typing helps optimize compiled programs, well-structured data helps template renderers reason about when to re-render. This is how imperative programs have been built and linked for decades: rebuild only what changed. It has been fun watching JS and TypeScript frameworks re-learn these old lessons the hard way as they get frustrated with long build times.
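
In miniature, this is make's modification-time rule applied to templates: re-render a page only when one of its declared dependencies (included fragments, backing data) is newer than the last output. A sketch, with hypothetical file names:

```typescript
import { statSync } from "node:fs";

function needsRebuild(output: string, deps: string[]): boolean {
  let outTime: number;
  try {
    outTime = statSync(output).mtimeMs;
  } catch {
    return true; // output missing: never rendered
  }
  // Rebuild if any template fragment or data file changed after the output.
  return deps.some((dep) => statSync(dep).mtimeMs > outTime);
}

needsRebuild("public/index.html", ["tpl/page.tpl", "tpl/header.inc", "data/posts.json"]);
```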

The trouble then comes down to how you serve this data up to the browser. You can't simply hand it two HTML documents, one static and one dynamic, without resorting to frames (inline or via frameset). The best you can do is use JavaScript to insert data into the page. Even then, inserting new DOM is slow. It is much faster to *only* flesh out missing data in a fully formed interface, and juggle visibility based on whether the data has loaded or not.
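
A sketch of that fill-and-toggle pattern, assuming elements annotated with a made-up data-field attribute naming the JSON key they display:

```typescript
// Fill data into an already-rendered interface instead of building new DOM,
// then flip visibility once the data has landed.
async function hydrate(url: string): Promise<void> {
  const data: Record<string, string> = await (await fetch(url)).json();
  for (const el of document.querySelectorAll<HTMLElement>("[data-field]")) {
    const key = el.dataset.field ?? "";
    if (key in data) {
      el.textContent = data[key]; // no new nodes inserted
      el.hidden = false;
    }
  }
}

void hydrate("/api/page-data.json"); // URL is illustrative
```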

This is obviously far from the original promise of HTML's declarative nature. Nevertheless, it means the only performant strategy is to divorce the interface from the data and fill in the blanks on the client side. If there were some standard means (say, a header on a HEAD request, or link tags) to instruct browsers to fetch JSON data and fill the innerText of various selectors with it, we could perhaps do away with nearly all XHRs and spinners on cold loads entirely. And if you could do it on cold loads, you could also do it within documents on the fly, leaving the only role for JS to be managing state transitions. Alas, that ship has probably sailed for good.
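
For illustration only, a polyfill for such a standard might look like this, inventing a link rel="data" document whose JSON maps CSS selectors to text. None of this is a real spec:

```typescript
async function applyDeclarativeData(): Promise<void> {
  // Hypothetical: <link rel="data" href="/page.json"> in the document head.
  const link = document.querySelector<HTMLLinkElement>('link[rel="data"]');
  if (!link) return;
  const fills: Record<string, string> = await (await fetch(link.href)).json();
  for (const [selector, text] of Object.entries(fills)) {
    for (const el of document.querySelectorAll<HTMLElement>(selector)) {
      el.innerText = text; // fill existing DOM, never construct it
    }
  }
}

document.addEventListener("DOMContentLoaded", () => void applyDeclarativeData());
```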

HTML has become a widget toolkit rather than a means to create documents, as it was originally envisioned. This happened because it was not openly trying to be a cross-platform widget toolkit, and thus that aspect was not actively suppressed by the OS vendors. I don't think it's a coincidence that JavaScript is now the fastest-growing programming language, despite frequently being hated more than PHP over the past 20 years. Worse-is-better works to some degree because those engaged in anti-competitive practices don't see things that are worse than their crap as a real threat. HTML/CSS/JS was a far worse widget toolkit than any of its competitors until relatively recently.

This is not to say that the browser wars and the repeated embrace-extend-extinguish attempts by Microsoft and other vendors didn't come very close to killing HTML/CSS/JS. They very much wanted their own document standards to succeed; as a result, you still see a bunch of Word and PDF documents passed around online. Things stagnated for a good long time because of this. But when it turned out the OS vendors were too dysfunctional in the wake of the dotcom crash to actually build something better, forward motion slowly resumed.

Despite the OS vendors rightly seeing the threat open web standards posed to their business, those standards proved too useful to the new titans of tech. Powering social (read: advertising) networks' reach into everyone's pockets ultimately tied the hands of the OS vendors, who had for decades prevented anything truly cross-platform from working well. The stars have finally aligned, and the OS wars are mostly over. Hooray.

This is largely what is behind some of the questionable activities of the WHATWG. The ham-fisted imposition of DRM and the slavish pursuit of what the ad networks want have sent the web down some blind alleys of late. Nevertheless, it is clearly not in their interest to deliberately kneecap the web or keep pages from performing well.

Anyways, since all this API data is going to require a hit to the CPU to stream it, it must by necessity be returned in very small chunks when it can't be delivered once and stored persistently on the client for future reference. Hopefully the entire API serving this stuff can fit inside the CPU cache. This requires a uniform design for your backing data, one that can be queried simply. Dare we say, with a standard query language.
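
Sketched below under those constraints: one uniformly shaped, parameterized query served in small pages, with an ETag so a repeat request costs a hash comparison instead of a render. The query() function here is a stand-in for whatever SQL driver you actually use.

```typescript
import { createServer } from "node:http";
import { createHash } from "node:crypto";

// Stand-in for a real driver call; swap in your database of choice.
async function query(_sql: string, _params: unknown[]): Promise<unknown[]> {
  return [{ id: 1, title: "hello" }];
}

createServer(async (req, res) => {
  const page = Number(new URL(req.url ?? "/", "http://localhost").searchParams.get("page") ?? 0);
  const rows = await query("SELECT id, title FROM posts ORDER BY id LIMIT 25 OFFSET ?", [page * 25]);
  const body = JSON.stringify(rows);
  const etag = `"${createHash("sha1").update(body).digest("hex")}"`;
  if (req.headers["if-none-match"] === etag) {
    res.writeHead(304).end(); // client already has this chunk
    return;
  }
  res.writeHead(200, { "Content-Type": "application/json", ETag: etag }).end(body);
}).listen(3000);
```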

What I am observing is that the only role left for programming languages other than JavaScript in userspace is as batch processors and API servers that are glorified proxies to SQL servers. Even then, Node is a strong contender for those jobs too. Thanks to recent developments such as Tauri, we might actually get truly cross-platform interfaces, and even window managers, out of the deal.
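
The glorified proxy, in miniature: the entire API surface is a whitelist of named statements, and Node just marshals parameters in and JSON out. As above, run() stands in for a real driver:

```typescript
import { createServer } from "node:http";

// The whole API: named, whitelisted statements keyed by URL path.
const statements: Record<string, string> = {
  posts: "SELECT id, title, published FROM posts ORDER BY published DESC LIMIT 25",
  comments: "SELECT id, body FROM comments WHERE post_id = ?",
};

async function run(_sql: string, _params: string[]): Promise<unknown[]> {
  return []; // stand-in: hand the statement and params to your driver here
}

createServer(async (req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const sql = statements[url.pathname.slice(1)];
  if (!sql) {
    res.writeHead(404).end();
    return;
  }
  const rows = await run(sql, url.searchParams.getAll("p"));
  res.writeHead(200, { "Content-Type": "application/json" }).end(JSON.stringify(rows));
}).listen(3000);
```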
