If you think having a backup is too expensive, try not having one

  • Thread starter: nsaspook
AI Thread Summary
The South Korea data center fire has resulted in the potential loss of 858TB of government data due to a lack of backups, highlighting the critical importance of data protection strategies. A senior officer overseeing recovery efforts tragically died, underscoring the human impact of such data loss. The Ministry of Personnel Management is particularly affected, as it relied on a G-Drive system that failed to preserve eight years of work materials. Discussions emphasize the need for robust backup and restore strategies, as past experiences reveal that automated backups do not guarantee data recoverability. This incident serves as a stark reminder of the vulnerabilities in data management practices and the necessity for regular testing of recovery systems.
nsaspook
Science Advisor
https://www.datacenterdynamics.com/...-for-good-after-south-korea-data-center-fire/

858TB of government data may be lost for good after South Korea data center fire
Destroyed drive wasn't backed up, officials say
Meanwhile, a government worker overseeing efforts to restore the data center has died after jumping from a building.
As reported by local media outlet The Dong-A Ilbo, the 56-year-old man was found in cardiac arrest near the central building at the government complex in Sejong City at 10.50am on Friday, October 3. He was taken to hospital, and died shortly afterwards.
The man was a senior officer in the Digital Government Innovation Office and had been overseeing work on the data center network. His mobile phone was found in the smoking area on the 15th floor of the government building.

https://www.chosun.com/english/national-en/2025/10/02/FPWGFSXMLNCFPIEGWKZF3BOQ3M/

The Ministry of Personnel Management, where all affiliated officials use the G-Drive, is particularly affected. A source from the Ministry of Personnel Management said, “It’s daunting as eight years’ worth of work materials have completely disappeared.”
...
It provided 30GB per public official. At the time, the Ministry of the Interior and Safety also issued guidelines to each ministry stating, “All work materials should not be stored on office PCs but should be stored on the G-Drive.”

G drive as in GONE drive.
 
  • Sad
  • Like
  • Wow
Likes bhobba, Greg Bernhardt, Rive and 1 other person
Long ago, before backup utilities were so widely available, I worked in a computer lab that did backups to tape every night. The disk capacities were not big then. The administrators used a program that they wrote, which looped to copy every file to tape and finally rewind the tape for them to dismount. Every night, they started many backups for many computers and later dismounted all the tapes. They did that for a long time.
One day, an administrator noticed the tapes seemed to rewind many times in a single run. They saw that the program had the rewind inside the loop after every file copy. So only the last file remained on the backup tape and all others had been overwritten every night.
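
Just to make the failure concrete, here is a minimal Python sketch of that bug (the original was a site-written tape utility; modelling the tape as a plain file object and the rewind as seek(0) is purely for illustration):

```python
# Minimal sketch of the failure mode described above. The "tape" is a file
# object and "rewind" is seek(0) -- an illustration, not the original utility.

def buggy_backup(paths, tape):
    for path in paths:
        with open(path, "rb") as src:
            tape.write(src.read())
        tape.seek(0)        # BUG: rewinding inside the loop means the next
                            # copy overwrites everything written so far

def fixed_backup(paths, tape):
    for path in paths:
        with open(path, "rb") as src:
            tape.write(src.read())
    tape.seek(0)            # rewind once, only after the whole run completes
```

Both versions appear to run successfully every night, which is exactly why only a test restore would have caught it.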

The moral of the story is to check that archived files can actually be retrieved.
 
  • Like
  • Haha
Likes bhobba, russ_watters, jack action and 2 others
Years ago my wife and I arrived at Heathrow to find our airline was processing check-ins on paper after a massive IT failure. The story we heard was that it was due to a data center fire that had destroyed the primary system, and although they had had a backup, it had been in the next room.
 
  • Like
  • Wow
Likes sophiecentaur and FactChecker
In the latter part of my career, it became increasingly obvious that many data centre operations were dependent on the reliability of modern disk technology. They had a "backup strategy", of course, but little in the way of a "restore strategy". The first example I remember was in about 2005, where the tape storage facility was next to the disk storage units. And there was no way to manually extract tapes from this facility. A fire in that case could have destroyed everything, including all backups.

Restoring from tape became increasingly rare, with clever disk-based solutions like snapshots meaning that tape had become a last line of defence. Increasingly, the backups were completely automated, to the point where no one seemed to think very much about testing the restoration of data. It was assumed, more or less, that if the backups were running, then the restoration of data from them would be possible.
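
Even short of a full DR exercise, a small automated restore spot-check would have tested that assumption. A minimal sketch, assuming backups land under a path like /backups/latest and live data under /data (the paths and sample size are illustrative, not any particular site's layout):

```python
import hashlib
import pathlib
import random

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def spot_check(backup_root="/backups/latest", live_root="/data", samples=20):
    """Pick a few files from the latest backup, read them back, and confirm
    they match the live copies."""
    backup_root = pathlib.Path(backup_root)
    live_root = pathlib.Path(live_root)
    candidates = [p for p in backup_root.rglob("*") if p.is_file()]
    for p in random.sample(candidates, min(samples, len(candidates))):
        live = live_root / p.relative_to(backup_root)
        if not live.exists() or sha256(p) != sha256(live):
            print(f"RESTORE CHECK FAILED: {p}")  # a real job would alert someone

if __name__ == "__main__":
    spot_check()
```

It is no substitute for restoring a whole environment onto spare hardware, and files changed since the last backup will show as mismatches, but it at least proves the backup copies exist and can be read back.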

Generally, there was always a difficulty in organising a full restore from backup. If the test went wrong, then in principle the data was gone! Often a full restore was tested before a system went live - but once the system was live, it could take quite a bit of ingenuity to test a restore properly. In the old days, when a system required a single Unix server with a tape backup every night, it was relatively easy to restore onto a similar server somewhere else and test the restored system. It might take one techie less than a day to do the whole off-site restoration, using tapes that were stored off-site, where nothing from the live site was needed. In the 1990's we did this sort of thing regularly - although we always kept our fingers crossed when a restore from tape was involved.

But, as systems became increasingly complex and interdependent - with an environment of perhaps several hundred virtual servers - creating a test installation was a project in itself. In one of the last projects I worked on, it took about six weeks to configure the server infrastructure. If an additional user-test environment was required, then it took about six weeks for the various Windows, Linux, Oracle and other techies to do their thing. Then, all the application software had to be loaded. (I think it cost about £1 million per environment!). By contrast, in the old days, creating a new test environment could take a couple of hours - copy the executables, copy an existing database, do some configuration, update the backup schedule(!) and that was it. And, it came at no additional cost - as long as the server had enough memory for another Oracle database. In one case, we had a single Unix server with nine test environments on it.

I worked on a project in about 2012 where we did a major upgrade by going live on what had been the "disaster recovery" (DR) site and swapping the roles of live and DR data centres. That all went well, but it was a controlled project over several months and not a DR test as such.

In general, there seemed to be a strong reluctance to do a DR test, even where the facilities were available and an annual test was part of the contract. People always seemed to have better things to do! Also, there was a new generation of system support techies who had a different outlook on things. I was telling anyone who would listen that we were essentially dependent on the reliability of the technology. By 2014, when I retired, I strongly believed that a fire at a data centre would have been a real disaster, with little chance of full systems and data recovery.
 
  • Like
  • Informative
Likes bhobba, berkeman, Ibix and 1 other person
I've also set up systems from scratch where my immediate boss seemed to think that just RAIDing the database server was enough. My argument that the eggs were still in one basket in case of fire or other force majeure went unheeded until I went over his head. That didn't make me popular, ironically.
 
  • Like
  • Sad
Likes nsaspook and FactChecker
At my first job in 1973, on a multi-computer multi-tasking database server using HP 2100s, the system had the advantage that it was offline from 3 am to 7 am each day, allowing for offline backups. During that time, every disk pack was backed up to another disk pack (there were 10 drives, so 5 backups done in parallel) and the backup code included a check to make sure the backup was in the correct direction. I don't recall how often a set of backup drives was exchanged with an offsite location.

About once a month, all of our source files were punched into Mylar paper tape (lifespan something like 100 years), put in shoe boxes and stored in safety deposit boxes at banks.

Mainframe sites using tape backups have routinely cycled backup tapes offsite since the 1960s.

I assume most server sites now store backups at physical offsite locations or at external cloud storage servers.

Image of one side of the computer room from that 1973 job:

[attached image]
 
  • Like
Likes Rive, sbrothy and FactChecker
Now, that's a computer! Not this modern cell phone or laptop stuff! :kiss:
 
  • Haha
Likes Rive and FactChecker
sbrothy said:
Now, that's a computer! Not this modern cell phone or laptop stuff! :kiss:
(sigh) Nostalgia. The good-old days of "big iron". And a smart-phone today has a lot more capability than a room of such computers!
 
FactChecker said:
(sigh) Nostalgia. The good-old days of "big iron". And a smart-phone today has a lot more capability than a room of such computers!
Yes. 8-inch floppy disks. Magnetic tape. And this was light-years beyond punch cards! Dial-up modems. Pre-internet BBSs. Measuring your speed in baud!

As a Japanese WW2 survivor says in Archer (adult cartoon):

“There cannot be an electronic brain in this room [Archer is referring to his cell phone]. It’s simply not big enough! By gods, it’s only 8x8 meters!”

:smile:

EDIT: or something to that effect.
 
  • Haha
Likes FactChecker
  • #10
Ibix said:
Years ago my wife and I arrived at Heathrow to find our airline was processing check-ins on paper after a massive IT failure. The story we heard was that it was due to a data center fire that had destroyed the primary system, and although they had had a backup, it had been in the next room.
So many stories like this one. The trouble is that it's a catch-22 situation for making a good system. Senior staff and government ministers tend not to have a clue about the nuts and bolts, and their priorities are costs and timescales. They are hopeless about things like security and their own passwords. Designers know the essential details about projects but ignore costs and how 'dumb' the rest of the system can be; left to themselves, they design for technical perfection. A recipe for disaster, and we keep getting disasters.
 
  • #11
And, whilst on the subject: how many versions are there of my valuable (?) files which Apple keeps on the cloud?
 
  • #12
sophiecentaur said:
And, whilst on the subject: how many versions are there of my valuable (?) files which Apple keeps on the cloud?
I'd like to think they're spread out over several colocated computer centers, but who knows?
 
  • #13
sbrothy said:
I'd like to think they're spread out over several colocated computer centers, but who knows?
And heh, valuable? You mean your memes and your cat photos? :woot:
 
  • Love
  • Haha
Likes harborsparrow and sophiecentaur
  • #14
sbrothy said:
I'd like to think they're spread out over several colocated computer centers, but who knows?
I read that as chocolate centres. :wink::wink::wink:

My passwords are very important, which would be problematical. The important ones are, of course, all different and non-memorable.
 
  • Like
  • Agree
Likes sbrothy and FactChecker
  • #15
sophiecentaur said:
I read that as chocolate centres. :wink::wink::wink:

My passwords are very important, which would be problematical. The important ones are, of course, all different and non-memorable.
Absolutely! My iPhone has passwords, contact information, messages, map locations, etc. It would be a real problem if they were all gone at once. A lot of photos are easy for me to back up to my computer, but not the rest of it.
Can the data on Android phones be backed up to a PC?
 
  • #16
sophiecentaur said:
I read that as chocolate centres. :wink::wink::wink:

My passwords are very important, which would be problematical. The important ones are, of course, all different and non-memorable.

[attached image: xkcd strip on passwords]
 
  • Like
Likes sophiecentaur and Baluncore
  • #17
There are nice memory tricks for passwords that are good -- until you have a dozen to remember and you are occasionally forced to change some of them.
 
  • #18
FactChecker said:
There are nice memory tricks for passwords that are good -- until you have a dozen to remember and you are occasionally forced to change some of them.
Mnemonics.

"Mum very easily made a jam sandwich using no peanutbutter."

=

Mercury, Venus, Earth, Mars, The Asteroid Belt, Jupiter, Saturn, Uranus, Neptune, Pluto. :smile:
 
  • Like
Likes FactChecker
  • #19
sbrothy said:
Mnemonics.

"Mum very easily made a jam sandwich using no peanutbutter."

=

Mercury, Venus, Earth, Mars, The Asteroid Belt, Jupiter, Saturn, Uranus, Neptune, Pluto. :smile:
That is sort of what I do. I keep all mine in PasswordSafe. Most that I very rarely use can be any random thing and I can look them up when I need them.
The ones that I use often and need to memorize are tied to lyrics of songs I know. Each one would have the first letter of each syllable in its lyric line and some special characters/numbers scattered in a pattern.
It surprises me how long the passwords get and I can still remember them.
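A rough sketch of that kind of scheme, just to make it concrete (splitting on words rather than syllables, and the particular case and symbol pattern, are my assumptions; the point is that the song line is easy to remember while the output looks random):

```python
def password_from_lyric(line, symbols="!3#7"):
    """Take the first letter of each word in a lyric line, vary the case in a
    fixed pattern, and scatter a few symbols at fixed positions."""
    initials = [w[0] for w in line.split() if w[0].isalpha()]
    out = []
    for i, ch in enumerate(initials):
        out.append(ch.upper() if i % 3 == 0 else ch.lower())
        if i % 4 == 3:                      # every fourth initial, drop in a symbol
            out.append(symbols[(i // 4) % len(symbols)])
    return "".join(out)

# e.g. "we all live in a yellow submarine" -> "WalI!ayS"
print(password_from_lyric("we all live in a yellow submarine"))
```

The thing memorized is the lyric and the pattern; anything written down need only be a hint to which song it was.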
 
  • #21
FactChecker said:
There are nice memory tricks for passwords that are good -- until you have a dozen to remember and you are occasionally forced to change some of them.
I use basically the same pattern for all my passwords. Except for my e-mail because from there I can reset them all. So guess my password to PF and you may have a couple to other sites (if you can guess which ones), but you’re not getting into my e-mail.

Even though I don’t even follow the logic laid out in the XKCD strip above. :smile:
 
  • #22
sbrothy said:
So guess my password to PF and you may have a couple to other sites :smile:
PF? That's how I do it and still make them all different. ;-)
 
  • #23
PeroK said:
People always seemed to have better things to do!

How true it is.

At one time, I was the computer security officer for the Australian Federal Police (by law, every government agency had to have one, and I was it, until the head of IT decided, correctly IMHO, it was really his responsibility, and I reported to him on security matters while still doing the work).

Anyway, we religiously kept backups, as good computer security practice dictates (as well as DBA practice - I basically just verified that the DBA had done it).

One day, I was asked to conduct a risk evaluation of the backups stored on-site at the computer centre and determine whether they needed to be relocated.

I did the evaluation, and as was pretty obvious, concluded that they should be in a different location.

The decision was made (I conducted the evaluation; upper management makes the decisions) that they were to be kept at the police training college. That seems ok, right? The issue was that the computer centre was at the police training college. I was flabbergasted, but upper management makes those decisions, not me.
Relevant to people who always have better things to do, nothing was done. The story I heard was that someone had checked the computer centre was already at the college - problem solved.

That was over 30 years ago now, so hopefully someone has since realised a better solution was needed.

Thanks
Bill
 
  • #24
FactChecker said:
o0) PF? That's how I do it and still make them all different. ;-)
Well, I cycle between a couple of variations on patterns. Something I learned working alongside a statistician as a developer. He was so scatterbrained (or rather, so buried in his work) that he would forget to pick up his children from kindergarten. He didn't hear his phone either (even though it was right beside him and I'd turned it up to full volume). When it rang out, mine would ring, and it would be his wife asking me to ask him to get going. Sometimes I had to go over and actually touch him to get him to snap out of it. I've never seen anyone so concentrated. I dunno what it says about me that they'd usually give the jobs he left to me. I guess I should be flattered because no one else dared touch them. But that was perhaps just because they were boring.

This didn’t have much to do with passwords. So it goes.
 
  • Like
Likes bhobba and FactChecker
  • #25
bhobba said:
How true it is.

At one time, I was the computer security officer for the Australian Federal Police (by law, every government agency had to have one, and I was it, until the head of IT decided, correctly IMHO, it was really his responsibility, and I reported to him on security matters while still doing the work).

Anyway, we religiously kept backups, as good computer security practice dictates (as well as DBA practice - I basically just verified that the DBA had done it).

One day, I was asked to conduct a risk evaluation of the backups stored on-site at the computer centre and determine whether they needed to be relocated.

I did the evaluation, and as was pretty obvious, concluded that they should be in a different location.

The decision was made (I conducted the evaluation; upper management makes the decisions) that they were to be kept at the police training college. That seems ok, right? The issue was that the computer centre was at the police training college. I was flabbergasted, but upper management makes those decisions, not me.
Relevant to people who always have better things to do, nothing was done. The story I heard was that someone had checked the computer centre was already at the college - problem solved.

That was over 30 years ago now, so hopefully someone has since realised a better solution was needed.

Thanks
Bill
Yeah. That sounds like real life.
 
  • Like
Likes bhobba and FactChecker
  • #26
sbrothy said:
Yeah. That sounds like real life.

Too true.

I have had numerous similar experiences during my 30-year stint in IT. A lot were more of a laugh than anything else, but some, like the above, could have ended in disaster. Even worse were the ones that made my own life harder. Self-serving, of course :DD

Thanks
Bill
 
  • #27
So if we're gonna talk about passwords--I had to work on a project for my algae scientists that was funded by the US Geological Survey. I had to maintain a login on the USGS site, and their system required me to choose a difficult-to-remember password. But also, the password had to change every three weeks and could share no character string with any recent password. After a couple of months, I was so annoyed that I started writing the latest password on a piece of paper in giant letters and taping it to my monitor.

This is how going overboard ends up backfiring. Moderation is a virtue.
 
  • #28
harborsparrow said:
So if we're gonna talk about passwords--I had to work on a project for my algae scientists that was funded by the US Geological Survey. I had to maintain a login on the USGS site, and their system required me to choose a difficult-to-remember password. But also, the password had to change every three weeks and could share no character string with any recent password. After a couple of months, I was so annoyed that I started writing the latest password on a piece of paper in giant letters and taping it to my monitor.

This is how going overboard ends up backfiring. Moderation is a virtue.
It could be worse. As of about six months ago I'm no longer allowed to log in to any US government website. A US driver's license (or state ID) is required, and mine expired a year ago since I live overseas. Passports are not accepted. I applied to Social Security just before this happened, which was a very good thing. It took five months for the application to go through, during which I experienced sleep disturbance due to worry.

US National Park websites don't require a login but last time I tried I was forbidden access to them, presumably because I'm in Asia. They've taken the national security thing too far.
 
  • Sad
Likes harborsparrow
  • #29
bhobba said:
That was over 30 years ago now, so hopefully someone has since realised a better solution was needed.
I would hope that someone would have re-visited the problem more than once since then :wink: .

Biometric identification seems pretty secure - life's not like Tom Cruise's situation in Minority Report (iirc) with his eye transplant for changing his ID.

My Apple system seems to look after me quite well, even to the point where my Apple Watch (6-digit passcode) logs my MacBook Pro in as I sit down to work (fingerprint not needed). Then the iPhone recognises my face, and the bank does the same, but it asked me to wink to avoid the stationary-picture problem.
 
  • #30
harborsparrow said:
So if we're gonna talk about passwords--I had to work on a project for my algae scientists that was funded by the US Geological Survey. I had to maintain a login on the USGS site, and their system required me to choose a difficult-to-remember password. But also, the password had to change every three weeks and could share no character string with any recent password. After a couple of months, I was so annoyed that I started writing the latest password on a piece of paper in giant letters and taping it to my monitor.

This is how going overboard ends up backfiring. Moderation is a virtue.
That is dangerous in companies that deal with classified data. A security person will see that, try it, and you will be fired. A password safe is a better place to store passwords. In the old days, that was a physical safe. Now it is an app.
 
  • Like
Likes harborsparrow
  • #31
FactChecker said:
That is dangerous in companies that deal with classified data. A security person will see that, try it, and you will be fired. A password safe is a better place to store passwords. In the old days, that was a physical safe. Now it is an app.
Of course I would not have done it in that situation. But USGS had been very heavy-handed, coming in and demanding that we send them all our data without even understanding why our data model was more complex than theirs. So we had to do all this extra work to send them dumbed-down data because they didn't actually know anything useful about algae. And then they had this sanctimonious attitude about privacy and security. I was extremely underpaid and overworked, in a research institution that could barely keep its toilets functioning. When we needed that password, we needed it, and it was either write it on paper or send it over email, which would actually have been worse, I think.

Please understand I did admire a lot of things about USGS, but they were one of the first agencies gutted by the cutbacks of the new administration in 2016. This whole website disappeared overnight, as did the contacts we had at USGS. We couldn't even raise them on the phone and they had never given us cell phones out of an abundance of privacy zeal. And of course, a lot of our funding went with them. The algae data I worked with--a precious scientific resource reaching back to the 1960's covering streams and lakes all over the US--has now all been archived offline, whereas before the funding cuts, it was available to scientists anywhere in the world via my institution's web services.

Well, that is a vent and a digression. I knew it was boneheaded, but I'm just saying that's what people DO when password security goes overboard.
 
  • Like
Likes FactChecker
  • #32
FactChecker said:
That is dangerous in companies that deal with classified data. A security person will see that, try it, and you will be fired. A password safe is a better place to store passwords. In the old days, that was a physical safe. Now it is an app.
Demanding people change passwords in that way is a known risk. Password safes will eventually be cracked. I'm possibly a crackpot, but where it's important, I use an automatically-generated password once and then change a few characters to ones I can remember (and don't allow those to be saved).
BTW, I don't need it for online banking because they already have a second stage that serves the same purpose and is slightly more robust. (On the other hand, having the three-digit code printed on the same side of their debit/credit cards as all other details makes them even less secure than they were originally)
 
  • Agree
Likes harborsparrow
