This year I decided I was going to take a break during the holiday season, or else. As it turned out, I am sort of taking a break while recovering from another cold, dosed to the gills with antihistamines, which is neither here nor there but which at least affords me some time to jot down some cursory notes and string them together by topic.
Overall, it was an unusual year. Ignoring (for the sake of sanity) the political upheaval across the Atlantic, technology and industry still left something of a sour taste I hadn’t foreseen, and the overall feeling I have is that of a year we’d all gladly skip – if we could.
My coding output decreased markedly this year, which I blame squarely on the kind of work I’m doing these days and the constant context switching I have to muddle through. After years of having a single working environment seamlessly synced across multiple Macs, going back to switching operating systems between home and work is taking its toll, and is something I intend to revisit next year.
But most of it comes from the nature of the work itself – architecture work in “traditional” IT is rather less about building stuff from scratch than re-using existing pieces, so there are far fewer interesting problems to solve from a coding perspective.
I ended up in the rather ironic position of being unable to dive into the .NET ecosystem in any meaningful way, but then again I’m not really sure I want to, since my current skill set is tremendously useful in my role, and I’ve had a sizable amount of fun leveraging it.
Azure is now a completely different beast from what it was when I joined up.
Besides the dizzying array of features launched each week (which is par for the course, as I also read and subscribe to competitors’ announcements), the tooling and APIs are constantly improving, and my toolbox is now updated on a monthly basis.
Templating has made infrastructure deployments trivial, but moving infrastructure to the cloud is only one step in a transformational process that most organizations have trouble addressing systematically. Software architectures are changing, but it will take time, and despite hosting the Web Summit and having plenty of technical savvy, Portugal isn’t that well off that established companies can re-architect their IT solutions at the drop of a hat.
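To make the templating point concrete, here is a minimal sketch of what an ARM-style declarative template boils down to: a JSON document listing parameters and resources that the deployment tooling reconciles for you. The storage account resource, names, and location below are illustrative assumptions, not a real deployment.

```python
import json

# Illustrative sketch of a declarative (ARM-style) infrastructure template.
# The resource, parameter name, and location are hypothetical placeholders.
template = {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        # Caller supplies the account name at deployment time.
        "storageAccountName": {"type": "string"}
    },
    "resources": [
        {
            # One resource declaration; the platform figures out how to
            # create or update it to match this description.
            "type": "Microsoft.Storage/storageAccounts",
            "name": "[parameters('storageAccountName')]",
            "apiVersion": "2016-01-01",
            "location": "westeurope",
            "sku": {"name": "Standard_LRS"},
            "kind": "Storage",
            "properties": {},
        }
    ],
}

# In practice this JSON would be handed to the deployment tooling,
# not printed; serializing it here just shows the final artifact.
print(json.dumps(template, indent=2))
```

The point of the declarative shape is that the template describes the desired end state; re-running the same deployment is idempotent, which is what makes it "trivial" compared to hand-scripted provisioning.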
As with many other technology shifts in the past, the trouble is not so much the tech itself as people’s ability to change, and that’s all the more evident now that it entails removing reliance on physical assets.
For me, this is the year Apple fell off the pedestal, for three reasons:
- Their software quality has decreased markedly (witness my struggles with iOS, photos and backups – especially photos)
- Their hardware designs now completely prioritize style over substance, with polish and state-of-the-art technology taking (even more of) a backseat to profit
- Their biggest push where technical innovation is concerned is now… expansion ports and dongles.
The last point is the one that is likely to be most contentious, so I’ll expand upon it a bit: they may be driving innovation in terms of manufacturing processes and integration (all of their devices have unique characteristics that only they can deliver), but they can hardly be considered visionaries at this point – if they have a computing and user experience vision that goes beyond what the iPad currently delivers, it doesn’t show (or, indeed, seem feasible). All we see is machines being turned into sleek, expensive appliances at the expense of existing connectivity and functionality.
Apple is unlikely to move wholesale to ARM in the near future (at least not without redesigning all the glue they get essentially for free with the Intel chipsets), but nobody cares what chipset an appliance is using as long as it works.
I am atypical in that I mostly rely on my iPad for a lot of my personal computing needs and cannot envision myself using regular PCs in the future, but am still concerned with the shift to web technology.
It’s clear to me that despite my distaste for the trend and underlying technology, the future of mainstream desktop apps is now a mix of high-quality (but expensive) fully native apps and a bunch of generic, cross-platform Electron shells with captive web apps inside.
The browser hasn’t replaced the desktop yet. And given this trend, it probably never will, really – they’ll just glom together into an even bigger mess.
Being older and (arguably) wiser, I’ve learned to curb my tendency towards drafting elaborate plans, but there are a few major areas I intend to address.
One is work-life balance – I’ve always skimped on the latter, and I think it’s time to re-assess that, or at least make sure the former becomes more enjoyable and gratifying, even if it entails shifting away from technology (something I ponder every year and have so far avoided tackling). That entails a little more soul-searching and assessing which goals to reach for, so I expect it to take a long while to sort out.
Another is figuring out which subset of those goals fit into what I need to learn next to stay relevant. For instance, I have so far resisted the temptation to jump onto the deep learning bandwagon – there is so much to be done in terms of data cleansing and simple plumbing (most of what people actually need is meaningful, trustworthy data, not magical insights) that I’m left with relatively little time to do fancy stuff.
Deep learning seems interesting in that regard because it’s been too long since I’ve had truly off-the-wall stuff to do, I have the requisite background, and it seems to be what all the cool kids are doing these days.
Which obviously means there’s a sizable risk of it being over-hyped, but then again anything that is poorly understood by the tech industry falls into that category, and I at least have a notion of its suitability to task.
Onwards, then, to an arguably better future.