The risks of automated deployment of software.
Sounds funny in the context, doesn't it?
I guess that some people will get a lesson in backing up their data.
I'm not a big fan of automated deployments. My company sent out an XP update last year that caused the USB ports on docking stations to go unrecognized for some laptop models. They never did figure it out - the solution was to wait for the Windows 7 upgrade a few months later.
Isn't that how SkyNet got started?
Ah, the good old days back in the 1970's when sending out a software update to our customers involved
- several visits to different computer service companies around the UK to get time to build and test the software on lots of different types of computer and operating system;
- several more visits to make copies on reels of magnetic tape in lots of different formats;
- for a few customers, transferring the binary data for the executable programs onto punched cards (maybe 20,000 cards per customer), packaging and labeling them so even an untrained gorilla couldn't get them in the wrong order, and shipping them by courier.
Sending an update to 200 customers was about 3 weeks' work for somebody - plus the time to fix the mistakes caused by wrong labeling of 200 identical-looking but incompatible reels of tape, etc.!
IIRC, didn't a box of punch cards hold 2000 cards? So, ten boxes of cards per customer to hold 20K cards?
Yup. And that was only about 2 MB of data!
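The arithmetic above checks out, assuming standard 80-column punched cards (one character per column) and the 2,000-card boxes mentioned; a quick sketch:

```python
# Back-of-the-envelope check of the punch-card figures above, assuming
# standard 80-column cards holding one byte per column and 2,000-card boxes.
CARDS_PER_CUSTOMER = 20_000
BYTES_PER_CARD = 80      # 80 columns, one character each
CARDS_PER_BOX = 2_000

total_bytes = CARDS_PER_CUSTOMER * BYTES_PER_CARD
boxes = CARDS_PER_CUSTOMER // CARDS_PER_BOX

print(f"{boxes} boxes, {total_bytes / 1_000_000:.1f} MB per customer")
# -> 10 boxes, 1.6 MB per customer
```

So 20,000 cards is ten boxes and roughly 1.6 MB - "about 2 MB" as remembered.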
And sometimes the company mailing department didn't bother to read the shipping instructions, and sent the package halfway round the world by sea to save money...
I've been following what seems like a major change in the Linux world, and while it is highly controversial (in fact I have often opposed it, at least on the desktop), it is apparently also very compelling to many, as one by one all but two of the major distros have adopted it. I am referring to systemd, which started as a replacement for the old, tried-and-true SysVinit but has turned out to be a push toward a core OS. There is now a distro called CoreOS which, partly because of extreme parallelization, can be deployed on thousands of systems in minutes. Furthermore, it employs a read-only root filesystem (partly for its resistance to both hacking and inadvertent screw-ups) and does whole-system updates on a schedule.
Whatever else it is, it is also a very big deal, and its development is worth watching if you have any interest in enterprise systems. I am mentioning this in this thread not to hijack it, but because exactly these sorts of problems propel such development, either internally, in competition, or both.
Exactly! After being burned more than once by IT-pushed updates that broke some oddball device driver for a special interface, we hired our own engineering IT person to manage our computers (with orders to leave things alone unless a change has been approved by the user or is an emergency virus or security update).
I was referring to the inability to stop them. The computers I work with are set up to match customer requirements with regard to software versions of Java, browsers, etc. The IT department at my company thinks nothing of pushing updated, corporate-specific versions of software that are nothing like the versions the customer uses. The updates also frequently reset customized settings, causing hours-long bug hunts. It's gotten so bad that the company has split its network in two - a development network that doesn't get automated updates, and the main corporate network for everyone else.
My experience of SCCM isn't good so far, but it's early days yet; maybe it's just a learning process. No disasters, just a lot of problems getting deployments to actually work.