An Update on Syncthing

It’s been a while since I began the process of switching away from Dropbox and onto a combination of Syncthing and OneDrive, and I’m overdue a status report of some kind, so here goes.

Migration

As before, I leveraged Synology’s Cloud Sync feature to migrate things across–I added Dropbox and OneDrive to my user account on my NAS, waited a couple of days until it had downloaded everything, then moved files across inside my user account from one folder to the other.

This freed up my iMac from doing the heavy lifting and made the migration effectively instant, although I did have to wait another couple of days for everything to propagate across (and every time I fired up another machine, OneDrive spent a good while churning through the changes, which is one of its weak points).

Then I removed Dropbox from all my Macs and installed Syncthing on the Synology–but not using the “standard” Synology Syncthing package.

Syncthing in Docker

To ensure I have full control in terms of folder isolation and networking configuration, I am running a dedicated Syncthing Docker container (the linuxserver.io one, with the usual trimmings1) under my Synology UID. Running an instance per user seems entirely feasible for 4-5 people, but right now I’m only doing it for myself (and since we all have our own cloud storage at home, that is likely to be sufficient).

This instance both re-publishes what’s left of my Dropbox folder (so that I can have a small subset of files available on iOS) and keeps a Synology-side mirror of a few specific folders (essentially this site, a couple of other large, standalone projects and a Development folder with all my active repositories).
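For the curious, the container boils down to something like this–a sketch rather than my exact setup, where the PUID/PGID/TZ variables are the usual linuxserver.io conventions and the ports are Syncthing’s defaults, but the paths and IDs are placeholders:

    # Per-user Syncthing container (docker run form; Compose works just as well).
    # UID/GID, volume paths and timezone are illustrative placeholders.
    # Ports: 8384 web UI, 22000 tcp+udp sync traffic, 21027/udp local discovery.
    docker run -d --name=syncthing-me \
      -e PUID=1026 -e PGID=100 -e TZ=Europe/Lisbon \
      -v /volume1/docker/syncthing/config:/config \
      -v /volume1/homes/me/sync:/data \
      -p 8384:8384 \
      -p 22000:22000/tcp -p 22000:22000/udp \
      -p 21027:21027/udp \
      --restart unless-stopped \
      lscr.io/linuxserver/syncthing:latest

Host networking would make local discovery simpler, but explicit port mappings are what keep a per-user, multi-instance setup feasible (each instance just gets its own set of ports).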

So as long as the NAS is on, any machine I log on to is sure to have a fresh copy of everything synced from it.

History and Data Recovery

On the NAS, I have Syncthing’s history feature switched off for most of those shared folders (especially the git ones, where it would be more hindrance than help). A notable exception is my photo inbox folder (where I sort and cull photos), which keeps local history on my iMac.
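“History” here is Syncthing’s simple file versioning, which is a per-folder, per-device setting; the web UI exposes it in each folder’s settings, and reasonably recent versions can do the same over the REST API. A rough sketch (folder ID, API key and retention count are placeholders):

    # Keep the last 10 versions of files Syncthing replaces or deletes in this folder.
    # Folder ID and API key are placeholders; this mirrors the web UI setting.
    curl -s -X PATCH \
      -H "X-API-Key: 0123456789abcdef" \
      -H "Content-Type: application/json" \
      -d '{"versioning": {"type": "simple", "params": {"keep": "10"}}}' \
      http://localhost:8384/rest/config/folders/photo-inbox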

As a safety net, I rely on Time Machine, the Synology’s #recycle folder and the fact that the entire NAS backs up to Azure daily.

Considering that I very seldom needed to use Dropbox’s 30-day history, I think that’s good enough, although (to be fair) Dropbox’s UI makes recovery a lot easier.

In Practice

So far, Syncthing has worked OK for its use case (i.e. letting me switch between my Macs and my Elementary laptop and pick up my projects exactly where I left off with minimal friction).

It also worked without any hitches when we were hanging off my 4G hotspot off in the country (I took my Elementary laptop and it was able to use STUN to talk to my Synology within seconds).

Something I’ve noticed is that Syncthing carries with it a lot more mental overhead (which was what I expected, but still). Having multiple folders (or, more accurately, “roots”) synced across multiple machines makes it harder to keep track of what is synced and what isn’t, and the lack of a Dropbox/OneDrive-like graphical “selective sync” feature is especially annoying–I am too used to “fetching” old projects by just popping into the Dropbox UI and telling it I want those synced to the local machine.
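The closest thing Syncthing itself offers is per-machine ignore patterns: a .stignore at the root of a shared folder can whitelist a few projects and ignore everything else. A minimal sketch (the project names are made up):

    # .stignore at the folder root on one machine (this file is never synced).
    # Patterns are evaluated top to bottom: keep these two, ignore the rest.
    !/this-site
    !/some-active-project
    *

It works, but it’s nowhere near as frictionless as ticking a checkbox.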

On the other hand, I’m also actively filing away old projects into a Gitea instance (also running on the Synology as a container from linuxserver.io) because my Dropbox was just too cluttered with millions of source code files, so it’s early days yet–there is a lot of change afoot.

But, again, some form of sync status indicator in Finder would be nice.

There is the occasional bit of weirdness when I do a git status on a slower machine that has just woken up and it throws up fatal: bad object HEAD until sync finishes (usually within a few minutes, even when it needs to catch up on a few days’ changes because I’ve been remote), but nothing that gets in the way–in that regard, Syncthing has been more than fit for purpose.

OneDrive, on the other hand, has been a nuisance to use, especially on iOS–the file provider still fails to sync occasionally (especially uploads), so I’ve been uploading files to my Macs using the Dropbox free tier and having the Synology shunt the files across via Syncthing.

Data Loss Events

I’ve had two instances of data loss, one inside Syncthing and another on OneDrive.

The Syncthing one was easy to figure out and solve: one of my repos failed to sync down to my Elementary machine (which only has a 64GB eMMC, and ran out of space), and I had to re-clone it since the botched sync somehow corrupted the local repo and that corruption propagated back to all the other machines. It was a bit extreme (and certainly an odd corner case), but understandable.
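Recovery amounted to checking the damage and re-cloning, roughly along these lines (repository name, paths and remote are placeholders):

    # Sanity-check a repo that sync may have mangled mid-transfer:
    cd ~/Development/some-project && git fsck --full

    # If objects are gone for good, park the broken copy outside the synced
    # folder and re-clone; Syncthing then propagates the fresh checkout around.
    cd ~/Development
    mv some-project ~/some-project.broken
    git clone git@github.com:someone/some-project.git some-project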

The OneDrive one, though, was a showcase of how spectacularly bad it can be from a UX perspective, and it happened like this: I was editing an Excel spreadsheet on my Mac, and after adding a dozen new rows to a dataset across a number of tabs I “saved” the file, closed Excel, and noticed that OneDrive was hung (with an X showing in the menu bar).

I popped the menu, and it listed the Excel file as being a conflict. It also offered to either keep both files (renaming the local one) or open both in Office to resolve the conflict (which is what I picked).

Excel opened only one file (the original one). There was nothing left of my changes (not even in cache folders, where I occasionally do some spelunking).

Now, why did this happen? Because Office applications appear to bypass OneDrive, confusing each other in the process2. I’m guessing that either Excel or OneDrive downloaded the old version and somehow deleted or overwrote the conflicted one, which is why Excel only opened one file.

Sadly, the change window was too short for Time Machine to be of any use, and version history in OneDrive (either locally or on the web) had obviously not picked up anything, so… I had to do all those calculations again.

As a result, I’m strongly considering having Syncthing keep local history for some folders, just in case.

Conclusion and Next Steps

None of these problems would happen with Dropbox, but then again one of the things I’ve noticed is that my machines are now much faster when moving files about (which I suspect is due to Dropbox not being around anymore to intercept file change events, which it notably did even outside its own folder).

But, all in all, the change was entirely manageable, and I can still use the Dropbox free tier (and the grandfathered 15GB of storage I still have on it) on my iOS devices (and indirectly through Cloud Sync).

And since I have to use iCloud Drive regardless, I am now investigating just how reliable it is (or not) for some key documents I want to have “on” me at all times.

It’s a bit of a pain to have things segmented across multiple sync systems, but the ground rules I set (Syncthing for project workspaces and OneDrive for anything “serious”) seem to be holding out.

Let’s give it a year or so.


  1. If you’ve never used any of their containers, I heartily recommend them–they’re reasonably up to date, use a sane init system, and make a point of breaking out configuration into separate volumes/environment variables whenever possible. ↩︎

  2. This is most apparent in OneDrive for Business and the way Office replaces standard file pickers with its own thing, but it seems to also happen in Personal. ↩︎
