This is why I keep telling people that third-party kernel extensions should be banned from production servers, period.
And shipping LIVE cloud updates direct to endpoints, unchecked, without any canaries? Borderline criminal (and I certainly hope that IT admins can choose not to update machines immediately, and that they didn’t just click through the setup and call it a day).
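For the record, a canary stage is not rocket science. Here is a minimal sketch (in Python, with entirely hypothetical deploy and telemetry hooks) of the kind of gate I would expect before a kernel-level update is allowed to reach the whole fleet:

```python
import time

# Hypothetical staged-rollout gate: push the update to progressively larger
# cohorts of endpoints, and halt the moment the canary cohort shows crashes.
COHORTS = [0.001, 0.01, 0.1, 1.0]   # fraction of the fleet per stage
SOAK_SECONDS = 3600                  # how long each stage bakes before promotion
MAX_CRASH_RATE = 0.001               # abort threshold

def deploy(cohort_fraction: float) -> None:
    """Placeholder for actually shipping the update to a cohort."""
    raise NotImplementedError

def crash_rate(cohort_fraction: float) -> float:
    """Placeholder for real fleet telemetry (crash/boot-loop reports)."""
    raise NotImplementedError

def staged_rollout() -> bool:
    for fraction in COHORTS:
        deploy(fraction)
        time.sleep(SOAK_SECONDS)          # let the cohort soak
        if crash_rate(fraction) > MAX_CRASH_RATE:
            print(f"Aborting rollout at {fraction:.1%} of the fleet")
            return False                  # stop before the blast radius grows
    return True
```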
Even though there is a “fix” out, recovery still consists mostly of mounting the system volume and removing \Windows\System32\drivers\CrowdStrike\C-00000291*.sys.
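If you can get the affected volume mounted offline (from a recovery environment, or by attaching the disk to another machine), the cleanup itself is just a file deletion. A rough sketch in Python, assuming the mount point is whatever you pass in:

```python
import glob
import os
import sys

# Remove the offending CrowdStrike channel file(s) from an offline-mounted
# Windows system volume. The mount point is supplied by the caller; the
# filename pattern is the one from the public remediation guidance.
def remove_bad_channel_files(mount_point: str) -> None:
    pattern = os.path.join(
        mount_point, "Windows", "System32", "drivers", "CrowdStrike", "C-00000291*.sys"
    )
    matches = glob.glob(pattern)
    if not matches:
        print("No matching channel files found")
        return
    for path in matches:
        print(f"Removing {path}")
        os.remove(path)

if __name__ == "__main__":
    remove_bad_channel_files(sys.argv[1])   # e.g. /mnt/windows or D:\
```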
But most of the affected systems are stuck in a boot loop, which may well require physical (or IPMI) access to the machine to recover. And $DIVINITY help you if you did something stupid like installing this piece of garbage on your hypervisor host.
Given the overall chaos that this update has created, it may take days to sort all of this out.
But one thing genuinely surprises me: people don’t seem to be doing any due diligence on how security software actually works before buying it and deploying it across critical systems.
My view of IT procurement was always… biased, but given the widespread reach of this, I now believe most organizations are just cargo culting and generally… winging it, doing “checkbox compliance” instead of actually considering security products as sources of risk in and of themselves.
That said, CrowdStrike does seem like a very apt name… They are never going to live this down.