Notes for March 13-19

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-03-13

Random fiddling day.

  • Revisited RDP connections to a domain-joined machine: Security protocol set to TLS, DOMAIN\username authentication (not the UPN), Best quality.
  • Cleaned out my homebridge configuration (also disabled automatically adding 433MHz sensors discovered by OpenMQTTGateway, which was a cute puzzle to sort out).
  • Triggered some monthly restic backups. Remember, kids, always have an off-site backup.
  • Looked at ComfyUI, which is intriguing to say the least (and a breath of fresh air after kludgy Stable Diffusion WebUIs where the actual workflow is a mess).
  • Sorted out some media archives.

Tuesday, 2023-03-14

I can never get the hang of Tuesdays. My work laptop died mid-afternoon, so I found myself with some time in between troubleshooting sessions.

  • Found it rather amusing that I serendipitously sorted out remote desktop domain authentication yesterday, almost as if I predicted this. Still can’t get Remmina to work with corporate WVD, though, so might have to turn the into a temporary “corporate” desktop.
  • Did some spelunking in OpenMQTTGateway code and MQTT topics to understand what it can decode in the 433MHz band and how it is mapped to topics.
  • Spent half an hour with WeasyPrint to generate a presentable document out of Markdown notes. Still the best PDF generation tool out there, and it has pretty decent CSS support, plus it’s trivial to automate:
MARKUP = $(wildcard *.md)

all: $(MARKUP:.md=.pdf)

%.pdf: %.html layout.css
    python -m weasyprint -e utf8 -m A4 -s layout.css $< $@

%.html: %.md
    python -m markdown < $< > $@
  • Created a ComfyUI sandbox on borg and spent a while collecting all the requisite models and going through the (maybe too whimsical) examples. Really happy with the UX so far, and with the fact that I went with a 12GB GPU.
  • Began adding docstrings to my py-sdf fork to make it easier to use with VS Code autocomplete.

Wednesday, 2023-03-15

Mid-week slump. Slept horribly, had a lot of catching up to do, still managed to have a few productive breaks:

  • Realized 4 was already in Fedora testing and grabbed it (it went into mainstream 3 days later).
  • For the first time this year, added a little bit more content navigation functionality to the site. Still very happy with the way the static page generator turned out.
  • Given my work laptop woes, tried to get a semblance of my usual environment working over RDP device redirection:

Client (Fedora)

  • Remmina, Advanced, Redirect local microphone, sys:pulse
  • Remmina, Advanced, USB device redirection, id:0fd9:006d#3564:fef4,addr:01:0b

Also make sure you can access the USB devices from your user account (some might already be accessible to dialout group members, but these udev rules make sure):

# cat /etc/udev/rules.d/70-passthrough-access.rules 
# Elgato StreamDeck
SUBSYSTEM=="usb", ATTR{idVendor}=="0fd9", ATTR{idProduct}=="006d", MODE="0666"
# Webcam - tried it just to see if it worked, here for reference
SUBSYSTEM=="usb", ATTR{idVendor}=="3564", ATTR{idProduct}=="fef4", MODE="0666"

Server (Windows 11)

Run gpedit.msc and configure this setting:

Computer Configuration:
  Administrative Templates:
    Windows Components:
      Remote Desktop Services:
        Remote Desktop Session Host:
          Device and Resource Redirection:
            - Do not allow supported Plug and Play device redirection = Disabled

I have a few other settings tweaked as well, but the above is what you need for USB pass-through.

The StreamDeck works great, the audio is passable, but I can’t get the camera to work since Remmina/FreeRDP still doesn’t support UVC camera pass-through (I already knew passing the raw USB device would be unfeasible, but I had to give it a go). For now, that only works in the Windows and Mac/iOS clients.

  • Did a little more Fedora audio tweaking, including moving to a real-time kernel on the DAW box and setting it up to use pulseaudio (just because the preset for it had slightly lower latency):
# Quick set of essentials for audio priority
echo '@audio - rtprio 90
@audio - memlock unlimited' | sudo tee -a /etc/security/limits.d/audio.conf
echo 'fs.inotify.max_user_watches=600000' | sudo tee -a /etc/sysctl.conf
sudo usermod -aG audio $USER
sudo usermod -aG realtime $USER
sudo dnf copr enable ycollet/audinux
sudo dnf install kernel-rt-mao

Thursday, 2023-03-16

Long meeting day, way into the evening.

  • Realized that a recent Raspbian update broke screen blanking on my automation dashboard, which can be worked around by reverting the X server version:
sudo apt install xserver-xorg-core=2:1.20.11-1+deb11u5
sudo apt-mark hold xserver-xorg-core
  • Spent a little while trying to get the Linux Intune client to work in Fedora, even though it is unsupported. Got it to work via… unconventional means, but it crashes when syncing an AD account.
  • Fiddled with PyTorch 2.0, but xformers hasn’t really been updated for it, so most Stable Diffusion tools can’t make proper use of it yet.

Friday, 2023-03-17

Winding down for the weekend. My work laptop was serviced, which meant doing the BitLocker dance and appeasing the Intune deities, so that took a chunk out of my day.

  • Updated my page with a more comprehensive set of tweaks that I refined while the laptop was MIA.
  • Realized the CSS font stack for this site could be improved for monospace fonts, so I re-did the entire thing while looking at modern-font-stacks, which is a very handy resource if you are designing text-intensive websites and want to deliver the best possible experience without any web fonts.
  • Investigated a possible uwsgi bug related to cron tasks.
  • Investigated how to programmatically take screenshots under Wayland using dbus (a quick example is below, after this list).
  • Fiddled with pyxel as a way to port some code one of my kids wrote in PICO-8.
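
For future reference, on GNOME the Shell exposes a screenshot method on the session bus, so something along these lines should do the trick from a script (newer GNOME releases restrict which callers are allowed, in which case the xdg-desktop-portal Screenshot API is the sanctioned route, and the output path is just an example):
# grab the whole screen without the cursor and without a flash
gdbus call --session \
  --dest org.gnome.Shell.Screenshot \
  --object-path /org/gnome/Shell/Screenshot \
  --method org.gnome.Shell.Screenshot.Screenshot \
  false false '/tmp/capture.png'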

Saturday, 2023-03-18

Family day.

  • Decided to clean up and post my LLM notes before they got too stale (had to drop a fair chunk of them because they were outdated already).
  • Brief outing to attend the local Chemistry Olympiad (kid brought home a bronze medal, yay!).
  • Decided to tackle the Docker Apocalypse and start moving all my public images to ghcr.io. Even though I have a private registry at home (and another in Azure), some of my images are in general use and need a public repository, and they’re all in GitHub anyway, so I’m starting with this GitHub Action as a baseline to build and push new images for each new tag:
# cat .github/workflows/build-image.yml
name: Build Image

on:
  push:
    tags:
      - v*

jobs:
  Build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3
      - name: Login to Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }} 
      - name: Build and Push Docker Image
        uses: docker/build-push-action@v4
        with:
          push: true
          context: . 
          tags: |
            ghcr.io/${{ github.repository }}:${{ github.ref_name }}
            ghcr.io/${{ github.repository }}:latest

Since docker buildx is now largely usable, I will be updating my cross-platform images to use a slight variation on the above.
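
For reference, the variation I have in mind is roughly the following (the QEMU and Buildx setup steps are the stock docker/setup-qemu-action and docker/setup-buildx-action, and the platform list is just an example):
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Build and Push Docker Image
        uses: docker/build-push-action@v4
        with:
          push: true
          context: .
          platforms: linux/amd64,linux/arm64
          tags: |
            ghcr.io/${{ github.repository }}:${{ github.ref_name }}
            ghcr.io/${{ github.repository }}:latest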

Sunday, 2023-03-19

Father’s Day over here, and another day impacted by machine issues.

  • Fiddled with rtl-433 a bit more, but I’m starting to realize it can’t pick up the decade-old 433MHz sensors I have.
  • One of my machines rebooted after updates to a corrupted filesystem (not sure if it’s a SATA issue or a btrfs one, but I know where I would place my bets), so I set the default boot device to the Windows NVMe drive and began reinstalling the Fedora drive as time permits:
# For later reference, this is my baseline Fedora install:
# yabridge COPR
sudo dnf copr enable patrickl/yabridge-stable
# list of essentials I need:
sudo dnf install cabextract curl fontconfig git gnome-extensions-app \
gnome-shell-extension-pop-shell gnome-shell-extension-user-theme \
gnome-tweaks godot golang htop keepassxc kvantum liberation-fonts \
lm_sensors openscad remmina rpm-build rsms-inter-fonts syncthing \
tmux vim wine xorg-x11-font-utils yabridge docker
# RPM Fusion and MS web fonts
sudo dnf install \
https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
https://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm \
https://downloads.sourceforge.net/project/mscorefonts2/rpms/msttcore-fonts-installer-2.6-1.noarch.rpm
# VAAPI and Firefox hardware acceleration
sudo dnf install ffmpeg handbrake libva-utils libva-intel-driver \
intel-media-driver igt-gpu-tools
# groups
sudo usermod -aG dialout $USER
sudo usermod -aG video $USER
sudo usermod -aG docker $USER

In the meantime Windows makes for a slightly better thin DAW box and work thin client (I get UVC camera pass-through, can run all VSTs and have WSL), but, ironically, my xrdp configurations are so fine-tuned that mstsc.exe is slower than Remmina.

I guess you just can’t have it all…

On Large Language Models

I’ve been pretty quiet about ChatGPT and Bing for a number of reasons, the most pertinent of which is that I have so much more going on in my life right now.

But I think it’s time to jot down some notes on how I feel about Large Language Models (henceforth abbreviated to LLMs) and the current hype around them.

And I’m going to try to do that from the perspective of someone who:

  • Graduated from college soon after the peak of the 90’s AI Winter (yeah, I’m old–we call it “experience” these days)
  • Actually decided not to major in AI (but rather in more networking-focused topics) because of said Winter, although I went and racked up my grade point average by acing AI coursework as optional credits.
  • Survived several hype cycles over the past 30 years.
  • Dove into analytics and data science during the “resurgence” in 2012 (as well as racking up a few ML certifications) before getting sucked into telco again.
  • Spends an unhealthy amount of time reading papers and mulling things.

Plus the field is evolving so quickly that I’ve drafted this around four times–all the while progressively shrinking it down to a quick tour of what I think are the key things to ponder.

How Smart is an LLM, anyway?

I’m going to start with an obvious fact, which is that LLMs just seem to be smart. Sometimes recklessly so.

Yes, typical outputs are vastly better than Markov chains, and there is a tendency to draw a rough parallel between the two, since both, on the surface, boil down to running probabilities for the next token.

As people like Tim Bray have pointed out, that is seriously underestimating the complexity of what is represented in model weights.

The reason the Markov analogy breaks down is that LLM output is not just probabilistic–there is randomness involved in setting up inference, sure, and sequential correlation between output tokens, but the number of factors driving the output is several dozen orders of magnitude beyond what we were used to.
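
To make that last step concrete, here is a deliberately toy sketch of what sampling the next token looks like once a model has produced its scores (the randomness lives here; the interesting part is the billions of weights that produced the scores in the first place):
import numpy as np

def sample_next_token(logits, temperature=0.8):
    # scale the raw scores and turn them into a probability distribution...
    probs = np.exp(logits / temperature)
    probs /= probs.sum()
    # ...then sample: this is the only genuinely random step
    return int(np.random.choice(len(logits), p=probs))

# toy vocabulary and scores standing in for a model's output
vocab = ["the", "cat", "sat", "on", "mat"]
logits = np.array([2.0, 0.7, 0.3, 0.1, -1.0])
print(vocab[sample_next_token(logits)])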

Random outcomes like the LLM starting to hallucinate are just par for the course of a neural network trying to go beyond the training data, or focusing attention on parts that lack enough conditioning to have a decent output.

But going back to the initial point, there is zero “knowledge” or intelligence in an LLM. There are impressive amounts of correlation, to be sure, but the core principle harks back to the first AI Winter–it’s just that we’ve crossed a quality threshold that seemed hitherto unattainable.

It may look like emergent behavior, but that is simply because we can’t trace every step that led to the output. There is no agency, nor real “understanding”.

And, as anyone who’s read Douglas Hofstadter will point out, there is also no “strange loop” or a coherent capability to self-reference–the outputs are just the result of navigating an LLM’s internal representation of massive amounts of data, and they’re entirely functional in more than one sense of the word.

Things Are Just Getting Started

Shoving all those orders of magnitude into something that can fit into an enterprise-class GPU (or, increasingly, a GPU and a hefty set of NVMe drives) takes quite a toll, and training LLMs requires massive computational power that is (for the moment) outside an individual’s reach.

But that is certain to change over time, and inference is already possible on consumer-grade hardware–as the past couple of weeks’ spate of news around llama.cpp proves, there is a lot of low-hanging fruit when it comes to optimizing how the models are run, at multiple levels1.

Although things like weight quantization degrade the output quality quite a bit, I expect more techniques to pop up as more eyes go over the papers and code that are already out there and spot more gaps and tricks to run LLMs efficiently.
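
As a trivial illustration of why quantization is lossy, this is what naive 8-bit rounding does to a handful of weights (real schemes like the block-wise ones in llama.cpp are considerably smarter, but the trade-off is the same):
import numpy as np

weights = np.random.randn(8).astype(np.float32)
scale = np.abs(weights).max() / 127        # one scale factor for the whole block
quantized = np.round(weights / scale).astype(np.int8)
restored = quantized.astype(np.float32) * scale
print("max rounding error:", np.abs(weights - restored).max())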

And despite the fact that the spotlight is on OpenAI and the massive cloud infrastructure required, I personally find it a lot more interesting to figure out how low LLMs can go and still produce coherent results.

This is because I have fairly high hopes for tailored models, and see a lot of value in having fully on-premises and even embedded solutions–I know I’m bucking the trend here, but the history of computing is one of decentralization, and you’re probably reading this on a smartphone… So my point should be obvious.

What Are LLMs Good For?

Having spent entirely too long dealing with customer support and call centers (I actually find the generic “chatbot” thing extremely annoying, and resisted getting into building those, but such is life), I’d say that, at the very least, LLMs are certain to take virtual assistants and support chatbots to the next level.

And no, this is not a new idea–it’s been hashed to death over the years, and the real problem is that most support knowledge bases are useless, even if you manually tag every snippet of information and carefully craft interaction flows. Traditional chatbots (and even summarization-driven ones) simply suck at doing the kind of basic correlation even a script-driven, barely trained human can pull off on autopilot, and hacking them together was always a brittle and unrewarding endeavor.

But an LLM is trained on other content as a baseline, which gives it a much better ability to fill in the gaps in such knowledge bases, and it certainly has better conversational skills than a goldfish–and I can see LLMs doing a decent job with highly patterned, formalized inputs like legal documents, medical reports, retail catalogues, etc.

How Reliable Are These Things?

To be honest, right now, not very. I wouldn’t rely on any publicly available LLM for decision-making of any kind (coding, advice, or even accurate summarization), although every iteration improves things noticeably.

Sure, some of the output and “style transfer” is pretty hilarious, but LLMs still have trouble with basic math, let alone writing reliable code2–they’re not even that useful at “rubber ducking” a problem.

Outputs are generally shallow, and LLMs still have trouble creating coherent long-form text without hallucinating, but I do think they can be useful as baselines for a human to improve upon, as long as that person has a good enough grasp of the problem domain to spot obvious flaws in “reasoning” (not just inaccuracies, but also gaps) and the willingness to double-check any references.

Of course, any of those sanity checks seem absent from a lot of the hype-driven discussions I’m seeing online… But, more to the point, LLMs do seem to knock things out of the park for short interactions.

Which is why I think the search market disruption gambit is going to pay off handsomely–LLMs make for a much better search experience because you get adjacent information you would otherwise be unable to get from either direct or statistical matches (and you don’t get pesky ads, keyword squatters, etc.)

How Manageable Are These Things?

This is where I have the most doubts, to be honest.

The current “programming paradigm” is hopelessly primitive, and all the early deployment shenanigans prove it–prompt stealing and prompt injection attacks (which can be much more interesting than you’d expect) remind me of all the loopholes Asimov managed to squeeze out of The Three Laws of Robotics.

Plus the ease with which the models “hallucinate” and veer off into the wild blue yonder was, until recently, being dealt with by ham-fisted tactics like limiting the number of consecutive interactions with the model.

In short, it all feels… very Sorcerer’s Apprentice, to be honest.

And I don’t think “stacking” models or just creating embeddings is going to help here–long-term curation of model inputs is going to be key.

Which means time-consuming, costly, and ever more challenging work to improve general purpose LLMs, especially those targeting search (where having non-AI generated training sets is going to be harder and harder).

Fast Iteration, But What About Fast Training?

Another important constraint that is being glossed over is that there is no easy, immediate feedback loop to improve an LLM–in the current chat-like interaction models you can add more context to a session, but:

  • It doesn’t really “stick”–sometimes not even across subsequent invocations (even if the session wrappers are continuously improving, you’re effectively adding stubs to the original prompt, and that can only go so far, as the sketch after this list illustrates).
  • Any on-the-fly corrections don’t become part of the core model (you need to have a full training iteration).
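
In other words, the “memory” in a chat session is mostly sleight of hand along these lines (a deliberately naive sketch; real wrappers trim, summarize and re-rank, but the principle is the same):
SYSTEM_PROMPT = "You are a helpful assistant."
history = []

def chat(user_message, complete):
    # every turn re-sends the instructions plus the whole transcript,
    # so nothing is ever actually learned by the model itself
    history.append(("user", user_message))
    prompt = SYSTEM_PROMPT + "\n" + "\n".join(f"{role}: {text}" for role, text in history)
    reply = complete(prompt)   # stand-in for the actual model call
    history.append(("assistant", reply))
    return reply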

These things can be worked around, but are fundamental limitations–and yet, they don’t have any real consequence for simple one-shot tasks like “summarize this webpage” and most of the “productivity boosters” we’re likely to see over the coming months.

But they do compound my notion that LLMs feel more like an impressive party trick than a broadly sweeping change in paradigm–at least for now. Their real impact lies elsewhere, and most likely beyond the obvious chatbot scenarios.

It would be nice to take away a lot of the drudgery we’ve baked into computer use (as well as several typical knowledge worker tasks), although there are interesting (and risky) implications in empowering certain kinds of people to mass-produce content3

Conclusion

So where does this leave us?

Well, we’re clearly in the upward swing of the hype cycle. And, like I pointed out at the start of this piece, I’ve been there before–the quick iteration, the optimizations, the unexpected new techniques in established domains, and the fallout (both good and bad). Those parts are easy to predict.

The big difference this time is that for users, the barrier to entry is effectively nil, and, again, the outputs are way better (and more impressive) than anything else we’ve seen before. Even if it’s still just a more elaborate Chinese Room, there is a lot more public interest and momentum than is usual in most tech hype cycles.

So yes, this one is going to be a bumpy ride, and not just for geeks. Make sure you have your metaphorical seat belt on tight.


  1. And while I was revising this, PyTorch 2 came out, with a nearly 50% performance boost for image models–I’m just waiting for xformers to fall in line to upgrade my setup… ↩︎

  2. I routinely try to get LLMs to, say, invert a heap, or even to compose SQL queries (which I hate doing), and the results are always abysmal. I can’t even imagine how badly they would fare in medicine or law. ↩︎

  3. And I don’t mean political parties or nation states here. The prospect of mass-produced A.I.-accelerated reports, presentations, memos, etc. should be enough to give any corporate knowledge worker pause. ↩︎

Notes for March 6-12

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-03-06

Cleanup day, not just physically.

  • Resumed weekly tidying of office. Considering adding a ritual sacrifice of my optimism and creativity (in the shape of a cookie) before turning on my work machine.
  • Was again stung by Fedora’s insistence on kneecapping its OS by removing media codecs in a new install.
  • Spent a while trying to get ChatGPT to help with creating a set of Azure Monitor queries in my personal subscription, which it was utterly useless for since it just… made up stuff (and confused Kusto queries with the simpler feature set in Log Analytics).

Tuesday, 2023-03-07

Some radio stuff.

  • Set up a LilyGo 433MHz board to run OpenMQTTGateway, which was childishly simple to do, but doesn’t seem to immediately pick up all the 433MHz stuff I’m interested in–I guess that will require sitting down with my SDR dongle for a while and tuning a few things. Was somewhat amused to see an Acurite Grill/Meat Thermometer 01185M pop up almost immediately, though.
  • Fixed some of my home automation, namely some overdue battery swaps and the LG Smart TV API integration that was broken by my having a few days ago.

Wednesday, 2023-03-08

Impromptu ISP veteran meetup day.

  • Walked 5Km in an attempt at doing some actual exercise.
  • Listened to another Oxide and Friends episode during the above.
  • Took a little time before dinner to check on Steam Linux updates and fool around with emulators, with mixed results.
  • Finally managed to get an Xbox controller to pair with by setting:
# head -2 /etc/bluetooth/main.conf
[General]
Privacy=device
  • Cleared out a mass of duplicates from my library.
  • Spent a fair bit of time trying to get sdf running on my iPad Pro, but scikit would have none of it.

Thursday, 2023-03-09

Woke up early again.

  • Had a stab at getting py-sdf (a more complete fork) to use CUDA with cupy, but numpy array type coverage isn’t there yet. Ended up starting a fork to experiment.
  • Tried to install my Kontakt VSTs in by resorting to an older installer, but Native Instruments’ software is just completely user hostile.
  • Learned the hard way that CAD Sketcher will only work “out of the box” in Fedora 37 if you install via flatpak, due to different runtimes (also, updated my notes regarding how to set it up in mm to edit STL files directly).
  • Played around with Steam and with plugged into my for a few minutes of glorious enjoyment. If it wasn’t for the fan noise when the GPU ramps up, I’d likely never move it to the closet.

Friday, 2023-03-10

Wound down for the weekend.

  • Tried to proactively clean up office after work to save time on Monday.
  • Spent a while reading up on CUDA development and clearing my personal backlog, drafts, random organizational chores.

Saturday, 2023-03-11

Decided to investigate 3D modeling options.

  • Personal inbox zero.
  • Got a somewhat usable modeling workflow going with py-sdf, meshview and Jupyter inside VS Code:
This works just as well on Linux, but it feels slicker on the Mac.

It can’t really replace OpenSCAD, given the massive amount of libraries for it out there, but it might be a good alternative for other things.

  • Practiced using CAD Sketcher to design a simple enclosure, which was somewhat of a failure. Realized I still remember a fair amount of how to use Blender, just not the bits I need.
  • Collated and posted my notes on .

Sunday, 2023-03-12

Low-level and electronics stuff.

  • Upgraded my Zigbee adapters to new firmware versions to see if I can fix the niggling issues I’ve been having with devices falling off the network1:
cc2538-bsl.py -p /dev/ttyUSB0 -evw CC2652R_router_20221102.hex
cc2538-bsl.py -p /dev/ttyUSB0 -evw CC2652R_coordinator_20221226.hex
  • Tried to unbrick another CC2652R1F adapter I had put aside a while back, first with an FTDI adapter and later using a Pi as an impromptu adapter, but openocd just couldn’t detect the JTAG interface no matter what I did.
  • Rebuilt one of my ESP-01 prototypes on a clean breadboard. I have a dozen of the things and might as well make use of them.
  • Looked for updated SDR applications for the Mac. Turns out there aren’t many (SDR Angel is an old fave, gqrx hasn’t been updated in a year, etc.).
  • Cleaned up and posted these notes.

  1. Right now I only have 28 devices, but reinforced concrete walls don’t help, even with routers on both sides of the “thicker” parts. ↩︎

The Beelink U59 Pro

Hot on the heels of getting borg assembled to go into my server closet, I decided to get myself a Beelink U59 Pro mini-PC to use as a beefier thin client, and these are my notes on it.

Read More...

Things You Should Read

It’s been a very tiring month (indirectly due to the layoffs), and other than my weekly notes (which have turned out to be quite useful already) and circumstantial posts on hardware, I haven’t been able to put together more in-depth pieces.

Read More...

Notes for February 27-March 5

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-02-27

Long day.

  • Did some research on image formats and AI-driven compression optimization.
  • Got reacquainted with the , which I fiddled with for a bit in an attempt to start having fun with music again.
  • As a direct result, decided to succumb to gear acquisition syndrome and get something cheap to use as a dedicated DAW workstation.

Tuesday, 2023-02-28

Early day, brief outing.

  • Realized (belatedly) that multicast just doesn’t work on Windows at all like I need it to. Scrapped parts of my code and wrote another plain UDP fallback.
  • Played around with WASM to see how far I could go with a little test project.

Wednesday, 2023-03-01

No fun whatsoever was had in this day.

Thursday, 2023-03-02

Another day with late night calls.

  • Did some fiddling with borg–upgraded the NVIDIA drivers and migrated another container over.
  • Tweaked sizing for my instance from Standard_B2s to a Standard_B1ms, since CPU load is insignificant and I’ve set “lazy mode” on it and most co-located services, so it’s all running off cache and only spawning workers as needed.

Friday, 2023-03-03

Woke up at 5AM. Took the time to reorganize a few things.

  • Made a first pass at setting up a dedicated DAW box using a Celeron N5105 and… Windows 11. Not entirely happy with the results performance-wise, especially given the OS overhead, so I installed Fedora on a secondary SSD and… it was overwhelmingly faster, even before realtime-setup and other tweaks.
  • Discovered still doesn’t work well with Wayland, which is a blocking issue for my current multi-desktop setup. Current workaround is to remember to log in to a standard X session in , which is sub-optimal.

Saturday, 2023-03-04

Moderate fun was had.

  • Personal inbox zero.
  • Spent a fair bit of time fiddling with and various VSTs. No actual music was made, but it was fun.
  • Figured out how to run Spitfire Audio’s VST installer under Wine (it needs the dxvk package, which you can get via winetricks).
  • Hacked a workaround for stale connections in my home Snapdrop instance.

Sunday, 2023-03-05

Cleaning/rest day.

  • Read The Economist.
  • Spent a while unglamorously picking fluff from vacuum cleaner rollers with tweezers, which I sort of regret not having left until a work day since it was so cathartic.
  • Did some electrical work to wire a set of chargers under a secondary desk, including an Anker 737 120W brick that can fast charge two modern iPads or one laptop.
  • Got running on our Anbernic RG351MP, which means I now have a new music travel toy.
  • Printed some 3D models for kids’ school projects and taught sanding techniques.
  • Zoned out on Bad Gear videos.
  • Realized that an update broke PAM inside one of my LXC containers, and the only fix that restored the ability for a test user to ssh in (or for root to su into it) was commenting out pam_limits:
sed -i -r 's/^(session\s+required\s+pam_limits.so)/#\1/' /etc/pam.d/*
  • Cleaned up and posted these notes (twice, because I forgot a couple of things).

Notes for February 20-26

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-02-20

US holiday. Took the day off as well.

  • Punted on tidying office, put my work machine on a shelf instead and spent the day on my Mac.
  • Reviewed list of personal projects, which is in an that currently feels very much like this:
I have trouble believing all the pending stuff in there...
  • Took some time to relax, deal with some multicast madness and create wrappers for some of my utilities:
A wrapper for my Ambilight Swift script.
  • Did some more research for a future project.
  • Fixed a Nintendo Switch joy-con.
  • Added the final piece (a missing SSD heatsink) to borg and did a few stress tests by running a set of prompts in imaginAIry. Am quite impressed that I can absentmindedly RDP to it using my iPad and things are still fast.

Tuesday, 2023-02-21

Mardi Gras. Short (non-Carnival-related) family outing.

  • Upon returning home, decided to celebrate by dressing up as a couch potato and trying to drum up the courage to dabble in music again.
  • On a whim, hooked up my to my as a second display and enjoyed a very nice 3440x1440 GPU accelerated experience, although it is clear at this point the Pi struggles a bit with both the extra display and what can throw at it–and crashes if I overclock it. Time to start researching fanless Celeron mini-PCs with dual HDMI and see if it’s worth replacing the Pi with one in six months or so.
  • Tried out pygwalker, which is a pretty neat way to explore my home automation metrics.
  • Briefly poked at building a new Klipper accelerometer module.
  • Installed the new PrusaSlicer alpha and began migrating my settings across.
  • Printed a couple of LEGO pieces in PETG.

Wednesday, 2023-02-22

Ash Wednesday, also known as “Monday” this week.

  • Filed #28 in pygwalker because I can’t really use the charting with the current default labeling.

Thursday, 2023-02-23

Mild chaos.

Friday, 2023-02-24

Relatively quiet day, spent mostly catching up on things. Hard to believe .

I checked on NEXTSPACE (which I tried a while back) and was sad to see the developer (a Ukrainian) hasn’t committed anything for a long time now.

Ended the day early and caught up on personal things as well:

  • Learned a magic incantation to reset a completely frozen, blank screen 2022 iPad that wouldn’t charge or show up on the USB bus: press volume buttons in order (moving away from power button), hold power button until Apple logo shows up. Phew.
  • Did a few test prints with Alpha 4 to try out some of the new fancy features, with spectacularly bad results (like Klipper freaking out with 200x extrusion rates). Turns out I need to scrub the configs a little more and remove some conflicting settings.
  • Realized some recent upgrade broke pulseaudio in xrdp, so I spent a while looking for systemd --user red herrings, eventually updated the xrdp-sink module and added an explicit startup item:
# cat .config/autostart/pulseaudio.desktop 
[Desktop Entry]
Version=1.0
Name=PulseAudio Sound System
Exec=/bin/sh -c "sleep 2; pulseaudio --start"
Terminal=false
Type=Application
X-GNOME-Autostart-Phase=Initialization
NotShowIn=KDE;
  • Fooled around a bit more with my Ambilight setup and the binary I use to update my office bias lighting to match what’s on my monitor. Unfortunately Wayland doesn’t make it easy to take screenshots, so I wrote a fallback that extracts the color hints from the current wallpaper instead (a rough sketch of the idea is at the end of this day’s notes).
  • Realized there is a native GNOME app for the Stream Deck, downloaded the source and wrestled with GNOME Builder to get an actual non-Flatpak binary and, for good measure, set ICON_SIZE to 72 because the default 32 is obviously meant for ants. Seems to work OK with my Stream Deck, which opens up entirely new possibilities:
A minimal setup (yes, these are my theme icons).

This reminds me I haven’t written about how I use the Elgato Stream Deck on Windows and Mac, but the above is , with that “off-by-one” feel you get when you’re porting code across in a hurry.
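
The idea behind the wallpaper fallback is roughly this (a minimal sketch assuming GNOME and Pillow):
import subprocess
from PIL import Image

# ask GNOME for the current wallpaper (returned as a quoted file:// URI)
uri = subprocess.check_output(
    ["gsettings", "get", "org.gnome.desktop.background", "picture-uri"],
    text=True).strip().strip("'")
path = uri.removeprefix("file://")

# shrink the image and average it into a single "mood" color
img = Image.open(path).convert("RGB").resize((32, 32))
pixels = list(img.getdata())
average = tuple(sum(channel) // len(pixels) for channel in zip(*pixels))
print("#%02x%02x%02x" % average)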

Saturday, 2023-02-25

Family outing.

  • Read The Economist.
  • Personal inbox zero.
  • Light piku backlog/issue grooming.

Sunday, 2023-02-26

Catch-up day, mostly devoted to personal projects and cleaning up.

  • Migrated my instance from Oracle Cloud to an Azure Standard_B2s with minimum disruption. Part of it is due to piku, since it was just a git push after I had copied the data over (and this machine currently runs five other applications), part of it to Cloudflare, and all of it just so nice and tidy (at least for now).
  • Finally replaced our old Huawei all-in-one ONT/router with the nicer (hopefully far less buggy) Sagem-designed Vodafone Smart Router, the model I infamously spent multiple months designing a replacement base for.
  • Literally screwed a Gigabit switch to a wall to make our utility cabinet tidier.
  • Cleaned up these notes for posting.

Notes for February 13-19

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-02-13

Back to the grind. Decided to go back to my setup for a few days.

  • Did my usual ritual tidying up of office before work.
  • Installed a Guacamole RDP proxy to allow remote access to my machines without a desktop RDP client. Was a bit amazed at how dog slow it is compared to a native RDP client when talking to an xrdp server.
  • Turned on Wake-on-USB in the BIOS and made sure I could power the machine on/off remotely using the PiKVM.
  • Did a little more 3D modelling stuff in the evening to relax.
  • Finished last week’s The Economist.

Tuesday, 2023-02-14

I could never get the hang of Tuesdays.

  • Added a Wi-Fi/Bluetooth module and doubled the RAM on my .
  • Figured out a hack to get mstsc.exe to upscale properly in HIDPI mode: Just copy the binary, right-click it, and fiddle with compatibility mode settings.
  • Printed the filament holder arm I’ve been designing:
Not the prettiest design in the world, but it is load bearing and saves me a ton of space.

Not particularly happy with the way standard supports print in PETG, though. Can’t wait for PrusaSlicer to be updated with tree supports.

Wednesday, 2023-02-15

Very, very early morning calls, which does not jibe with having two consecutive evenings of meetings past 10PM.

  • Cleaned up the source for my filament spool support and published it on GitHub.
  • Spent a little while trying to get xorgxrdp-glamor on borg to use the NVIDIA GPU rather than the Xe iGPU inside an unprivileged container “the right way™”. May have to run my own X server build.
  • Directly migrated one of my LXD containers (my lovely ) into , with great success:
I love this theme--computer UIs have lost so much character.
  • As an indirect outcome, had an unamusing time trying to get VA-API working, only to realize that it was Fedora’s fault (which I promptly fixed via RPM Fusion).
  • Fiddled with Bumblebee, which seemed promising but relies on a standard X server.
  • Paid bills, did paperwork of various descriptions.

Thursday, 2023-02-16

Routine medicals, some errands to run.

  • Since there is a long weekend coming up, decided to hedge my bets by setting up Windows in a secondary SSD and getting Steam installed.
  • Managed to try out Steam Link over lunch break. The performance on the Apple TV 4K is OK, and the PiKVM makes it a breeze to go back to a “work” configuration.
  • Started pulling together my notes on the new machine.

Friday, 2023-02-17

A long week of late night and breakfast-time calls finally took its toll–spent most of the day sleepily catching up on e-mail and pretty much trudged through the afternoon.

  • Tried to proactively tidy the office before the long weekend, because I just know I’m going to be leaving tools all over the place again.
  • I’ve let my Mastodon stuff fall behind, but took 2 minutes to update my instance to support importing/exporting followers.
  • To unwind after work, tried Steam Link on the , which made Horizon Zero Dawn look amazing. Back-to-back NVIDIA encoding tricks, I suppose.

Saturday, 2023-02-18

Zoned out and spent the day reading, writing, fiddling with my new machine and doing chores.

  • Read most of this week’s The Economist.
  • Investigated that nemesis of gaming parenting, Nintendo Switch joycon replacement parts.
  • Migrated more containers and VMs into borg to free up rogueone.
  • Wrote up and published my notes on the new machine.

Sunday, 2023-02-19

Finally, a slow(ish) day to clear out my backlog.

  • Personal inbox zero.
  • Migrated my RSS feed collector/translator/enricher off Oracle Cloud into Azure.
  • Began prep work to move my instance there as well.
  • Fiddled with Steam Link a bit more.
  • Did some more reading, research and a little gaming–in PICO-8.
  • Cleaned up and published these notes.

Borg, My Post-Pandemic Homelab Server

Resistance is, indeed, futile. I now have a new server and its name is borg, partly because it is a rough cube ~22 cm on a side:

Read More...

Notes for February 6-12

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-02-06

Great day to walk 5Km to lunch and back again.

  • Did one better and vacuumed the office instead of just tidying it before work.
  • Completed another Oxide and Friends podcast episode during my exercise–only twelve or so more to go until I catch up with real time!
  • Got another HDMI to CSI bridge in the post, which was a good reminder to twiddle with the CAD design for the PiKVM case again and print a set of test catches to figure out my printer’s minimal tolerances when using PETG.
  • Checked on various electronics parts still in transit.
  • Paid some bills.

Tuesday, 2023-02-07

No time to go out to lunch, despite great weather.

  • Did some research on the Intel AVX-512 instruction set.
  • Printed an arm to mount the modular filament holder I put together earlier on one of my printers, which turned out great despite some PETG warping.
  • Fine tuned the 3D model for my PiKVM case a bit more.

Wednesday, 2023-02-08

Another 5Km walk during lunchtime, albeit a rainier one.

  • Finished off the PiKVM case design with better, more resistant snap fits and board stand-offs, did a final print, and published it on GitHub:
The final PiKVM case design.

Thursday, 2023-02-09

Another round of layoffs, which impacted my organization directly this time.

  • Ended the day early for sanity’s sake.
  • Lacking the mental bandwidth to use (which seemed like a good distraction, but requires effort and focus beyond what I can currently spare), spent a very late, sleepless evening poking at the model for a new filament holder to bolt onto my ‘s 2040 extrusion with minimal flex.
  • Did some overnight doomscrolling on LinkedIn, because I’m human and some of the best people I crossed paths with were posting their goodbyes.

Friday, 2023-02-10

More doomscrolling.

Fortunately most of the remaining parts for my new server arrived (an i7-12700K, fast storage, a bunch of RAM and an RTX 3060 in a nice, compact case), so I had a ready distraction for close of business:

  • Put my work laptop on a shelf and removed Outlook from my phone temporarily. It can all wait for next Monday.
  • Took a few pictures, for when I actually blog about this:
Meet borg. Raspberry Pi for scale, since we're out of bananas.
  • Put together most of the components and got the machine to… Not POST.
  • Learned that naming a memory bank “A-1” does not mean it should be the sole DIMM slot.
  • Installed Proxmox wirelessly by downloading the .iso image directly to my little PiKVM and booting it off its OTG port:
Yes, I set up the BIOS and Proxmox using my iPad. It's 2023, after all.

This is exactly why I decided to build a PiKVM a couple of months back, although when I planned for it I had .

Allowing me to do this from my iPad while sitting in bed unable to sleep pondering the layoffs was the little thing’s moment of glory.

Saturday, 2023-02-11

Dived into setting up my new machine, mostly managing to forget about work.

Deep Linux kernel and LXC hackery ensued, because I want to run it completely headless and avoid having the GPU dedicated to a single VM (or split into virtual partitions across several).

  • Spent a while trying various GPU sharing/virtualization/pass-through approaches in . Ideally I would like to be able to run ML workloads in an LXC container and Steam in another container, avoiding Windows altogether.
  • Spent an even longer while trying to align host device driver versions with client CUDA versions in the LXC userland, to no avail.
  • Learned that it is completely possible to have nvidia-smi work fine but have CUDA device detection fail to see your GPU, so PyTorch refuses to run.
  • Changed gears and tried setting up Steam in a container, but came across bug #8916, which Valve has closed without actually fixing. TL;DR: Big Picture mode is completely broken, and so is streaming.
  • Spent an ungodly amount of time trying to find a usable armv6 Arch Linux archive to install two packages on my PiKVM so that I can find it via mDNS:
# cat /etc/pacman.d/mirrorlist 
##
## Arch Linux repository mirrorlist
## Generated on 2023-02-11
##

Server = https://alaa.ad24.cz/repos/2022/02/01/armv6h/$repo

They weren’t kidding when they said the Zero W was unsupported (Arch discontinued armv6 support in 2022), but it lives!

Sunday, 2023-02-12

I knew what I was getting into when I decided to get an NVIDIA GPU, but there is a limit to how much dkms and uid mapping one can enjoy.

Still, most things are now working the way I want.

  • After a bit more tinkering, got the NVIDIA CUDA drivers to work with LXC pretty much perfectly by installing driver version 525.85.12 on the host and in a container.
  • Tested CUDA with a known quantity: Blender. Cycles rendering works great, everything is detected, but the DRI device nodes have the wrong permissions.
  • Spent a fair amount of time figuring out how to do uid/gid mapping in Proxmox so I could access /dev/dri properly as a member of the video and render groups inside the LXC container.
  • Learned that LXC’s lxc.idmap directive is particularly obtuse when you just want to map a gid. Will definitely blog about it, but the gist of what worked is sketched below.
  • Enabled the i7-12700K‘s iGPU (it’s Xe graphics, which is nothing to sneeze at) and made sure RDP sessions were able to use that for GL acceleration using glamor. It’s way easier than .
  • That got Big Picture mode working (and game streaming to my iPad), but rendered by the iGPU since Steam (apparently) can’t use the NVIDIA card if it is started in an X session managed by another GPU. Multiple GPUs are hard, let’s go shopping!

At this point, we literally went shopping.
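
For the record, mapping a single gid (the video group, in this case) straight through boils down to something like this in the container config; treat the exact numbers as placeholders, since they depend on your allocation and the host gid also needs to be delegated in /etc/subgid:
# /etc/pve/lxc/<id>.conf (excerpt)
# map container gids 0-43 into the unprivileged range...
lxc.idmap: u 0 100000 65536
lxc.idmap: g 0 100000 44
# ...pass gid 44 (video) straight through to the host...
lxc.idmap: g 44 44 1
# ...and shift the rest as usual (also needs root:44:1 in /etc/subgid)
lxc.idmap: g 45 100045 65491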

  • Set up inside a fresh LXC container. Without any tuning and kneecapping the CPU bits by only giving it 6 virtual cores (but with no GPU restrictions yet) the 12GB RTX 3060 seems way faster than my and the 8GB I was playing with–but I’m using a completely different set of libraries on each and this is a meaningless tomatoes to apples to oranges comparison. Still, having a self-hosted ML sandbox was one of my key goals, so it’s a good outcome.
  • Played around with PRIME render offload by using __NV_PRIME_RENDER_OFFLOAD and __GLX_VENDOR_LIBRARY_NAME to tell well-behaving applications to use the NVIDIA GPU:
# This works fine (as do the Vulkan variants).
# Steam ignores it completely.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia openscad 
  • Decided to call it a night and post these notes.

Will try forcing xorgxrdp-glamor to pick the NVIDIA card sometime during the week–after last Friday, I will certainly need something to keep my mind off things after work…

Notes for January 30-February 5

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-01-30

Felt feverish, insanely tired. Unsurprisingly, realized I got a cold.

  • Tidied the office.
  • Went through my personal project backlog and pruned a few things.
  • Ended the day early and spent some time doodling with music stuff.

Tuesday, 2023-01-31

Took part of the day off to nurse my cold, ended up spending a couple of hours in my blissfully sunny living room for a change.

  • Cleaned up my personal ~/Development folder a bit, archiving older projects and making sure only the ones I should be focusing on are readily available to avoid distractions.
  • Upgraded my MacBook to macOS Ventura 13.2. Nothing blew up, but nothing improved much either. System Preferences, in particular, is a buggy mess.
  • Upgraded zigbee2mqtt to new point release. Some sensors still dropping off the network, probably need to start debugging router firmware.
  • Reinstalled to check out the new version as well.

Wednesday, 2023-02-01

Got back to work. Was pleasantly surprised to receive most of my new server parts (except for the DeskMeet B660, which is, worryingly, yet to be shipped, even as I clean up these notes).

  • Resumed investigating how to automate cgroup resource limiting without going fully medieval on /proc.
  • Ran a number of tests on my cloud-init template to set up a nice sandbox that can also be useful for isolating apps running inside Piku.

Thursday, 2023-02-02

Busy day and family evening, so not much time to tinker.

  • Since Twitter decided to turn off free API access, I fast-tracked my RSS to Mastodon bridge and got it posting. I expect it will have bugs, but as far as a quick and dirty 90-line aiohttp hack goes, it’s pretty decent.
  • As a direct result of the above, ended up fixing a long-standing bug in cron expression validation in Piku that had eluded me for a while and that was due to a reentrant call.

Friday, 2023-02-03

Nothing much to report.

  • Deployed the RSS-to-Mastodon bridge in production. Yes, I know it’s a Friday, but it’s less than 100 lines and non-critical.
  • Spent my lunch break fixing a double glazed window latch.
  • Filed some photos from 2008 I found in a relative’s old USB hard drive.
  • Went through my Drafts folder and noticed I hadn’t posted about yet–fixed that as well.

Saturday, 2023-02-04

A semblance of personal productivity, with actual tangible outputs.

  • Personal inbox zero.
  • Read most of this week’s The Economist.
  • Wasted some time chasing after a D3.js bug in an old custom chart I wanted to re-use.
  • Resumed a couple of my 3D printing projects that are in the fine adjustment stage: i.e., taking 2mm slices of pieces with incorrect dimensions, printing them out, and double-checking tolerances.
  • Published the first version of my replacement base for the Vodafone/Sagem ONT “Smart Router” that Vodafone Portugal provides to their fiber subscribers.

Sunday, 2023-02-05

CAD and 3D printing day. I think it’s the first time I had both printers going non-stop.

  • Tweaked some tolerances in yesterday’s 3D models, printed a final version of the ONT base and started planning for re-wiring the LAN closet.
  • Had another run-in with the Sunlu matte white “PLA of death”, which this time clogged my printer so badly it forced me to completely disassemble the extruder and actually drill through the clog.
  • Decided to step up replacing the extruder altogether with the Aero Titan I got last week and start designing an X carriage mount for it. Fortunately, I have all the original BQ Prusa Hephestos STL files (they’re still available online), so it’s mostly a matter of plugging holes and placing new ones.
Yellow for plugs, red for new holes.
  • Printed out some rough drafts of other printer parts (extruder mount, fan shroud, etc.)
  • Also printed out two sets of parts for a modular spool holder for which I had ordered some ball bearings a month ago. Pro tip: you can fine tune things by squirting WD-40 into the bearings and then fitting the roller with a heat gun.
  • Cleaned up and posted these notes.

Generating an RSS feed out of a Mastodon list

This is a little hack I’ve been running for nearly a month now to generate RSS feeds of some of my lists, namely the ones I want to catch up on every few days.

Read More...

Notes for January 23-29

This is an abridged list of the non-work things I accomplished this week.

Monday, 2023-01-23

Decided to cut down on social and screen time to avoid the utterly depressing stream of farewell posts from laid off colleagues and acquaintances.

  • Did not tidy the office this week, for a change.
  • Dealt with personal e-mail, including arranging for a replacement of the package Royal Mail lost.
  • Waited for a technician to come round due to the inexplicable failure of one of my landlines last Saturday (during another technician’s visit to the building). Was a no-show, got a text to the effect that the fault requires someone else to come round within 3 business days, so I’m making plans to cancel that service and route around the damage.
  • To let off some steam, I decided to reach into the parts bin, dig out a 32MB(!) SD card and a Raspberry Pi 1B from 2011 (likely the first I ever bought) and flash bmc64 on it. Not terribly useful, and not a nostalgia trip (I’m a Sinclair ZX81 veteran), but it was a good 20 minutes well spent at the end of the day to build what is essentially an educational toy. Felt tempted to port Basilisk II to circle, probably would have taken a swing at it if there was SDL support.

Tuesday, 2023-01-24

Awfully cold day. Woke up with a sore neck, spent most of it squirreled away in the office with the heater on.

  • Dealt with personal e-mail, set up a couple of mailbox categories for 2023.
  • Fiddled with runc to see if I could use it as sandboxing duct tape for uwsgi. The requirement for a rootfs is an annoyance, and I don’t want to reinvent the wheel, so I’m back to fiddling with cgrulesengined and the like. They’re still being shipped as OS packages across most distributions that matter, and I can automate their setup with minimal fuss.
  • Added cgrulesengined and cgconfigparser to my cloud-init setup (the gist of the configuration is sketched below).
  • Futzed with Proxmox a little to see if I could understand why Proxmox VE apparently does not support cloud-init for LXC containers, which is annoying.
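
For the record, the two files boil down to something like this (a minimal sketch using libcgroup’s classic v1-style syntax; the group name, user and limits are just placeholders):
# /etc/cgconfig.conf - define a group with the limits I care about
group sandbox {
    cpu    { cpu.shares = 256; }
    memory { memory.limit_in_bytes = 512M; }
}

# /etc/cgrules.conf - route processes into it (user[:process] controllers group)
piku        cpu,memory      sandbox/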

Wednesday, 2023-01-25

Added more things to my “3D printing projects” backlog.

  • Finally received my missing Royal Mail package (a Titan Aero extruder for my printer).
  • Researched suitable extruder mounts and short-listed one for printing out and doing a test fit.
  • Got my MacBook 12” main board to boot again thanks to a replacement USB-C port I ordered on eBay weeks ago. Started designing a case for it around a massive aluminum heatsink.

Thursday, 2023-01-26

Very busy day. Zero actual fun was had.

  • Handled personal e-mail, scheduled some routine maintenance.

Friday, 2023-01-27

Feeling a cold coming on, which is perfect to do research and reading.

  • Finally managed to watch the first episode of The Last of Us in the evening. Seems promising, although I pretty much loathe zombie movies, even if fungi are the new brain viruses.
  • Rebuilt the back-end for my little Preact app.
  • Spent some time investigating mini server options. The best fit for my needs seems to be the ASRock DeskMeet B660. The Ryzen version has only one M.2 slot, and I can’t get Ryzen CPUs that match what Intel can offer in this range, so I’m going to start putting together a new machine with an i7-12700.

Saturday, 2023-01-28

Couch potato day.

  • Read The Economist.
  • Re-learned that Cmd + . issues Esc in iOS, which is such a massive throwback to my Mac OS 6 days that it is absolutely insane that I forgot about it for this long.
  • Spent a while debugging a stupid sqlite3.PARSE_DECLTYPES mistake.
  • Did more hardware component research.
  • Wrestled with a few unreasonable YAML files and fixed a bug in some of my deployment scripts that could have been prevented by Kubernetes picking just about anything else as a configuration file format.

Sunday, 2023-01-29

Family day. Brief outing.

  • In an attempt to be optimistic about building a compact PC with a minimally viable discrete GPU in 2023, decided to order a bunch of parts off Amazon.
  • Helped one of my kids publish a game for his first game jam. The overall ease of the entire process is pretty amazing, but the best thing was watching him go through the end-to-end creative process.
  • Did some more research to plan for migrating LXD containers onto a new instance.
  • Tested an Azure deployment template for a new project. Am a bit annoyed at API drift over the past year, and need to upgrade a few portions of it.