Tech outage causes massive disruption worldwide, bringing major air carriers to a full stop

And healthcare :mad:

Edit: Thankfully I don’t have to worry about fax machines because I am no longer in management and don’t have to send in weekly updates. A fact for which I am eternally grateful.
I mock my spouse incessantly (in a good-natured way, of course) for her use of fax machines. She's in healthcare, of course.
 
Basically none of our staff knows how faxes work. Only the couple who support the hospital.
 
We finally calmed down maybe 2 hours ago and now we're just letting the staff coast for the day. I feel for the companies that didn't happen to have someone awake when this started and only started the entire process at open. Some of these random devices, like the display screens and piece-of-work systems, won't even have remote troubleshooting options. It may be weeks for some of them.
 
Wife flying Delta is trying to get outta Logan, has now been delayed 5x and instead of leaving at 5pm is now scheduled for 10pm...ugh. I told her to start looking for a hotel tonight.
 
Long day... Relatively easy fix if you can get into the drive, but our stuff is all BitLocker encrypted, so easier said than done. Once I got my permissions squared away and was able to pull the BitLocker keys I needed, the entire fix was about 10 minutes: boot onto a PE boot disk, unlock the drive with the 48-character recovery password, go to C:\Windows\System32\drivers\CrowdStrike, delete the C-00000291* file, reboot.

Boom Boom Pow. Once you have the BitLocker key and are ON SITE it isn't bad. Unfortunately I have 20 offices of my own and another 20ish I am covering for the retired tech in the Central region, so it takes longer to drive between sites than it does to get people back up once there. Monday is going to be another long day unless there is some sort of fix pushed over the weekend. Unfortunately, CrowdStrike seems to blue-screen things faster than any update can get installed, so...
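For anyone who ends up scripting it, here's roughly what that delete step amounts to once the volume is unlocked. This is only a sketch: it assumes the system drive is already mounted as C: from the PE environment and that you have something to run it with (in practice it was a one-line del at the recovery prompt). The folder and the C-00000291* pattern are the ones from the post above.

```python
# Minimal sketch of the delete step described above, assuming the BitLocker-protected
# system volume is already unlocked and mounted as C: from the PE boot disk.
from pathlib import Path

driver_dir = Path(r"C:\Windows\System32\drivers\CrowdStrike")

# The bad channel file(s) all start with "C-00000291"; only those get removed.
for f in driver_dir.glob("C-00000291*.sys"):
    print(f"Deleting {f}")
    f.unlink()

# Reboot afterwards and the sensor pulls down a good channel file.
```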
 
Wife flying Delta is trying to get outta Logan, has now been delayed 5x and instead of leaving at 5pm is now scheduled for 10pm...ugh. I told her to start looking for a hotel tonight.

Sucks! It could be worse - my friend is stuck in Memphis. I cannot confirm nor deny whether he has the Mobile blues.


I think the flights are all just so full right now in peak season - it’s really hard to rebook canceled flights
 
Sucks! It could be worse - my friend is stuck in Memphis. I cannot confirm nor deny whether he has the Mobile blues.


I think the flights are all just so full right now in peak season - it’s really hard to rebook canceled flights
She got out! No plane was there at re-re-re-re boarding time, but suddenly one showed up. Oddly, I think going to MKE helped because there were a lot of folks trying to get out...
 
Long day... Relatively easy fix if you can get into the drive, but our stuff is all BitLocker encrypted, so easier said than done. Once I got my permissions squared away and was able to pull the BitLocker keys I needed, the entire fix was about 10 minutes: boot onto a PE boot disk, unlock the drive with the 48-character recovery password, go to C:\Windows\System32\drivers\CrowdStrike, delete the C-00000291* file, reboot.

Boom Boom Pow. Once you have the BitLocker key and are ON SITE it isn't bad. Unfortunately I have 20 offices of my own and another 20ish I am covering for the retired tech in the Central region, so it takes longer to drive between sites than it does to get people back up once there. Monday is going to be another long day unless there is some sort of fix pushed over the weekend. Unfortunately, CrowdStrike seems to blue-screen things faster than any update can get installed, so...
The exact fix.

I still don't know if all of our 100k-plus remote workers are fixed.
 
I'm really curious to see what exactly happened. The failure is clearly (a) very obvious and (b) very widespread; it's not some subtle issue that only affects systems with an unusual configuration, so it's hard to imagine how it wasn't caught by automated QA testing before the update rolled out. But it's also hard to imagine that no one involved does that testing. So it will be revealing to find out how this happened (assuming we ever do).
They've put out a technical update now with some details:


Basically it sounds like it was a configuration update that triggered a flaw in the platform. So I'd surmise the underlying flaw was missed in platform testing, and that they don't routinely test the configuration updates themselves, because those are seen as trivial things that couldn't possibly trigger a platform flaw; any such flaw would have been caught in testing, wouldn't it.
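To make that concrete, here's the sort of pre-release gate that reasoning implies: run every new configuration/channel file through the same parsing logic the deployed sensor will use, and refuse to publish if it blows up. Everything in this sketch is hypothetical; load_channel_file is just a stand-in for whatever the real content parser is, not anything CrowdStrike has published.

```python
# Illustrative only: validate candidate channel files with the same parser the sensor uses.
import sys
from pathlib import Path


def load_channel_file(path: Path) -> None:
    """Placeholder for the real parser; here it only checks the file is non-empty."""
    if not path.read_bytes():
        raise ValueError("empty channel file")


def validate_channel_files(directory: str) -> bool:
    ok = True
    for path in sorted(Path(directory).glob("C-*.sys")):
        try:
            load_channel_file(path)
        except Exception as exc:
            print(f"REJECT {path.name}: {exc}")
            ok = False
        else:
            print(f"OK     {path.name}")
    return ok


if __name__ == "__main__":
    # Exit non-zero so a CI pipeline would block the release on a bad file.
    sys.exit(0 if validate_channel_files(sys.argv[1]) else 1)
```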
 

PULEEAASE Tell me that’s real…

Related side note: I predict SWA is going to be targeted for a cyber attack soon. Windows 3.1 and 95 were not developed with internet security in mind at all. I’m willing to bet SWA doesn’t have them all properly isolated in virtual machines. Heck, I bet many of them are on the original hardware.

Only one of my customers was impacted. Fortunately our software catalogs BitLocker keys for easy retrieval. I think we’re going to see a lot of companies that didn’t do that one simple thing.
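For anyone who wants to start doing that, here's a rough sketch of the cataloging idea: a small inventory job run on each endpoint with admin rights that captures the output of the built-in manage-bde tool and appends it to a CSV. A real RMM product would store this centrally; the output file name here is just a placeholder.

```python
# Rough sketch: append this host's BitLocker recovery protector info to a catalog file.
import csv
import socket
import subprocess


def dump_recovery_info(volume: str = "C:") -> str:
    # manage-bde prints the numerical recovery password protectors for the volume.
    result = subprocess.run(
        ["manage-bde", "-protectors", "-get", volume],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


if __name__ == "__main__":
    with open("bitlocker_catalog.csv", "a", newline="") as fh:
        csv.writer(fh).writerow([socket.gethostname(), dump_recovery_info()])
```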

Some of our customers have newer desktops with Intel vPro. They were able to script and automate the deletion of the CrowdStrike file for a quick recovery.
 
