I find myself yearning for simpler computers. Not simpler as in iOS simple, but rather in the general sense. This is partly because computers are still unreasonably inefficient and unresponsive despite having the fastest chips we’ve ever designed, and partly because, to be honest, the ratio of computing power to actual useful computation these days is appalling.
As if the increasing complexity of browser engines did not already place them on an almost equal footing with full-blown operating systems, modern cross-platform desktop applications are increasingly shipping with their own private browser runtimes and a gaggle of weird custom interfaces that make a mockery of native UIs, in no small part thanks to Electron and its cousins.
So instead of “native” apps, what we’re really getting are multiple (subtly different) copies of the Chromium runtime to fill our SSDs and crummy, half-baked pseudo-UIs that often make it hard (or nigh on impossible) to handle such bare essentials as cut-and-paste and proper accessibility (or, on the Mac, that are opaque to scripting, system services, and all the other niceties that made using the Mac desktop environment a sublime experience until Apple started punting on them).
I find this trend worrisome given the web is still a smörgåsbord of incomplete implementations of convoluted standards that can only charitably be called “evolving” (in contrast to “utterly misguided”). Even if it is saving people time and money in the short term, I keep wondering how much technical debt we’re getting ourselves into as an industry.
The app economy is also to blame, I think – turning mobile phones into appliances that run sandboxed, often (frustratingly) single-purpose apps begat a mindset where it’s acceptable for apps to be largely self-contained silos. And if you consider that we’re now running increasingly deep stacks of transpiled code on what are effectively desktop-class CPUs, it’s pretty obvious that today’s computing model carries a lot of overhead.
Hailing from simpler times (my first computer was a Sinclair ZX81, with barely more computing power than a digital watch), I find it all more than a bit decadent – and I’m not even going to go into the way a good chunk of the app economy hinges on siloing and locking away your own data…
The Missing Plan
I’ve been following the Plan9 mailing lists over the past year or so (MARC is a nice enough way to peek in, if you’re curious), out of a grim fascination for its community (which is hard to peg, although it has faint echoes of the VAX mailing lists I subscribed to back in the dialup era) and the way it steadfastly hangs on – at least nominally – to the platform.
Plan9 is not a viable option for modern computing (it doesn’t support most modern hardware, follows antique UX paradigms and, most tellingly, can’t run any of the mainstream browser engines), but the idea of it is interesting. After last year’s deep dive into its little universe I had a Raspberry Pi running it in my home office for a while, and found it to be equal parts practical and infuriating.
The bits that make it the most fascinating aren’t its antiquated UI, its reliance on three-button mice or its sparse software and features, but how cohesive and straightforward it feels when compared to today’s computing environments. Putting aside Apple’s stuff, both Windows and Linux are a quagmire of inconsistent UX that has been piled on for decades, and I, for one, could do with much less clutter – albeit not without some visual flair and polish.
But bringing Plan9 up to modern standards is never going to happen, so if, like me, what you most value is a cohesive user experience and raw, unmitigated speed (the kind that makes me turn to vim to edit text and stick to Safari for browsing the web), there aren’t any real choices out there.
The Internet of Shit is not about Hardware
Like Ethernet, the web is a fast enough, easy enough, common enough technology that is insinuating itself into every aspect of computing – even, it seems, into VR, as John Carmack mentioned in his (as ever, inspiring and utterly deserving of the label “legendary”) keynote at Oculus Connect 2016. And, again, like Ethernet, it’s getting a bunch of forklift upgrades to cope with the load.
And even if we assume there’s now a somewhat broad (if rolling) agreement on the technology thanks to the rise of WebKit (and Blink), the business and content landscape is a mess – we’ve gone from banner ads to Flash ads to pop-ups and interstitials and all manner of clutter, and just as the demise of plugins and the onset of ad blockers was making the Web useful again, content producers (ever so hesitant to actually create better content instead of churning out dozens of five-paragraph junk “articles” around a single piece of news) are now trying to block the ad blockers.
Things are spiraling out of control, and everyone appears to be losing when it comes down to ease of use, performance, and user satisfaction.
On the hardware side, if you discount tablets (which, for some people, are a noncommittal limbo) and look for the simplest possible approach, one is left with the uneasy idea that Chromebooks are likely to be the future (at least in spirit) of mainstream computing, with the dismal consequences you’d expect in terms of consistency and user experience – all the advantages of the web, with all the shortcomings, caveats and poor engineering we keep heaping atop it.
This seems like a good enough reason to keep avoiding doing any sort of front-end work and focus instead on back-end stuff, as well as shifting my projects increasingly towards languages that generate native, highly performant code.
The tendency of those lower-level languages to have stable ecosystems whose age is measured in decades rather than days is just icing on the cake, but, more importantly, should give pause to those who are trying to carve out a market for themselves atop the latest and greatest tech.
Me, I’ll just keep doing what I do until they eventually figure it out by themselves.