The Local AI Moat

Regular readers will know that I’ve spent most of the past two years shoehorning LLMs into single-board computers, partly as a learning exercise and partly because there are lots of local/“edge” applications where semantic reasoning (no matter how limited) and “interpretation” of sensor data are actually useful.

But now we’re at a point where running a decently useful open weights model on your laptop is entirely feasible.

This comes at an unfortunate time: after having started my own inference library and hacked away at @antirez’s brilliant hack within my meagre resources, I feel like a serious rift is developing between the “haves” who were lucky enough to get hardware in time (or can splurge multiple K of European Pesos on it) and the “have nots”.

The societal impact of the entire thing in the always hype-driven geek community is, of course, fascinating (especially since a very small number of people have a disproportionate amount of influence in this little echo chamber), and I sometimes feel like Jane Goodall observing packs of opinionated chimpanzees, but I digress.

Personally, after spending the day mulling on this, I find the whole thing extremely depressing, for three reasons:

  • I see computers as something inherently distributed and personal. There are a lot of latent contradictions here, yes, and I’ve learned to live with them.
  • As a European citizen, I find the geopolitics of the asymmetrical situation we are in today regarding technology and AI deeply frustrating, and yes, I have learned to deal with that too, but I really wish I could do something about it.
  • Personally, I can’t afford to keep up. People in startups, the self-employed, or those in very specific minuscule niches might be able to spend enough to do so, but I can’t.

I’m thrifty by nature and usually plan (and over-think) my purchases years in advance. I’m also at a point in my career (and the industry) where job security is very much on my mind, so saving up every dime I can for a potential rainy day has been a priority, and I now agonize over things as simple as ordering a 70 EUR battery to revive an eight-year-old laptop (because, yes, I do still use old machines).

So there is (pardon my French) absolutely no fucking way I am getting decent local inference hardware anytime soon. And I count myself lucky I built my current machine when I did, even if that was also a painful decision at the time and it is now hopelessly outdated for most things.

That’s it. I’ve vented. Now I’m going to take something for my sinuses, chase it with an antihistamine, and doze off until 4AM tomorrow.