# Please explain why cloud computing is so important now

1. May 18, 2014

### Niaboc67

I know cloud computing has been around in some form for a good while, but all of a sudden it is taking off. People are calling it the future of computing. Why is this? What does cloud computing mean for the future of computers, storage, et cetera?

Thank you

2. May 18, 2014

### DavidSnider

Owning your own hardware, colocating it in a datacenter, and maintaining it is expensive. The cloud essentially turns CPU power, disk space, backups, maintenance, etc. into a metered utility. It's just a much more scalable way of doing things.
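To make the "metered utility" point concrete, here is a rough back-of-the-envelope sketch; every price in it is a made-up illustrative number, not a real quote:

```python
# Owning hardware: big upfront cost plus ongoing maintenance.
# Cloud: pay only for the hours you actually use.
# All figures below are assumed for illustration.
OWNED_UPFRONT = 5000.0   # buy + colocate a server (assumed)
OWNED_MONTHLY = 200.0    # power, rack space, maintenance (assumed)
CLOUD_PER_HOUR = 0.10    # metered hourly rate (assumed)

def owned_cost(months):
    return OWNED_UPFRONT + OWNED_MONTHLY * months

def cloud_cost(hours):
    return CLOUD_PER_HOUR * hours

# A workload that only needs a server 100 hours a month, for a year:
print(owned_cost(12))        # 7400.0
print(cloud_cost(100 * 12))  # 120.0
```

With light, bursty usage the metered model wins by a wide margin; the gap narrows as utilization approaches 24/7.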

3. May 23, 2014

### enorbet

Its actual importance depends a great deal on how you use computing, and also on your perspective. Cloud computing is just the latest name for the server/workstation model that is utterly commonplace in the enterprise. That concept has been extended into the wider world by people who work mostly in that environment, and who often wish to control it better (another term requiring perspective). The idea and its benefits are much like mass transit: many things would likely be more efficient and smoother if everyone "rode the bus". I don't see people giving up their cars or PCs anytime soon, even though mass transit and The Cloud have both made considerable and important advances and enjoyed increased adoption.

4. May 23, 2014

### AlephZero

The importance to computer vendors is that it provides a continuous and reliable income stream from users. Hardware is getting too powerful, cheap, and reliable for the old business model of selling bigger, faster, and preferably incompatible boxes every 2 or 3 years. (From the vendor's point of view, incompatible is good, because you have to buy new versions of all your software again, and/or pay for consultancy on how to migrate your data from the old hardware to the new.)

5. May 23, 2014

### phinds

From the point of view of developers it's important because (1) it is, as has been pointed out, an increasingly important usage model, so developers are more valuable if they know how to work with it, and (2) it is, from the developer's point of view, significantly different in terms of input/output, since cloud apps do not use the user's keyboard and screen directly but rather go through a web interface. Database usage can also be different if the app uses a distributed database rather than a local hard drive.
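A minimal sketch of that I/O difference (the function name and request shape here are invented for illustration): a desktop app talks to the keyboard and screen directly, while a cloud app only ever sees a request and returns a response:

```python
# Desktop style: talk to the user's devices directly.
#   name = input("Your name? ")
#   print(f"Hello, {name}")

# Cloud style: the app never touches the user's keyboard or screen;
# it just maps a web request to a response (shapes are hypothetical).
def handle_request(request):
    name = request.get("params", {}).get("name", "world")
    return {"status": 200, "body": f"Hello, {name}"}

print(handle_request({"params": {"name": "Niaboc67"}})["body"])
```

The browser (or mobile app) owns the actual I/O devices; the cloud code only deals in requests and responses.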

6. May 23, 2014

### nsaspook

Cloud computing looks to be mainly a rehash of the X-Terminal/central-server concept of years past: Back to the Future with better networking and faster hardware. In an industrial environment, where computers interface directly with complex hardware, cloud computing is not very useful, because applications are very specific in function and response-time requirements; but that's not usually the environment the vendors sell to.

Last edited: May 23, 2014
7. May 24, 2014

### enorbet

I think you may be confusing smartphone usage of The Cloud with The Cloud as a whole. If you can't control or even see what is being run, it is useless. Keyboards, touchscreens, or whatever input devices, and especially screens, are a prerequisite. It is data and processing power that is being "shared", not I/O devices.

What is now called The Cloud used to be called "Push Technology" or "Thin Client" which simply means a relatively weak client can control part of an extremely powerful server. This gives the user access to more power but considerably less freedom. It gives the server ultimate control to isolate the client - the "Walled Garden" effect.
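A toy sketch of that thin-client split (all names here are invented): the server owns the data and the CPU cycles, and the weak client only formats requests and displays answers:

```python
def server_run(job):
    # All the heavy lifting happens here, on the powerful server.
    if job["op"] == "sum_squares":
        return sum(i * i for i in range(job["n"]))
    raise ValueError("unknown op")

def thin_client(n):
    # The client just builds a small request and shows the result;
    # it can only ask for what the server chooses to offer.
    return server_run({"op": "sum_squares", "n": n})

print(thin_client(4))  # 14  (0 + 1 + 4 + 9)
```

Note how the "Walled Garden" falls out naturally: the client can only request operations the server has decided to expose.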

The Walled Garden effect is important to those who wish to make money from their servers. Freedom is chaos. They prefer Cash Cows and no mixing of chickens, goats, ducks, and geese. :)

8. May 24, 2014

### phinds

I think you must not be a software developer.

9. May 24, 2014

### enorbet

Technically, or rather commercially and in the modern popular sense, you are correct. I do write code in several languages, but I do not develop for sale. I am a Linux devotee and an advocate of the FOSS movement. I used to struggle to write device drivers, but these days I tend to stack components like Legos and write scripts.

10. May 26, 2014

### Routaran

Back in the old days, the primary driving force behind cloud-style computing was that computers were as big as houses and stupidly expensive. It wasn't feasible to give individuals their own computers.
Solution: build a giant mainframe and give people terminals to interact with the system. Depending on how loosely you want to play with the term "cloud", it fits, and the general idea is still the same.

Computers have gotten exponentially cheaper, smaller, and faster. The vast majority of things that once needed a mainframe can now be done at your desk. But at the end of the day, if you want to calculate and predict exactly how a hundred atoms will interact and behave in a given situation, you will be sitting at your desktop for months or years waiting to find out. You still need those giant, house-sized, absurdly expensive systems for the actual heavy lifting.

These days, thanks to the internet, the concept has been extended from a cloud serving a single company to clouds available over the internet. Where in the old days the cloud your company set up would only serve your company, Google, IBM, VMware, etc. now set up clouds that are available for the entire internet to use.

By separating the user from the system that actually does the work, developers can gain almost complete control of both the hardware and the software side of the solution being offered. The cloud provider even has the option of designing both the hardware and the software.

The immediate benefit I see is that one could do away with device drivers, for example, by doing something similar to Apple's OS. By controlling/making the hardware yourself, you can build support straight into your OS and not have to rely on an extra layer of device drivers from third parties. More streamlined, faster, fewer bugs. (P.S. no, I still hate Apple.)

Then of course there's the added benefit of using a web interface, as phinds noted. The hardware employed by the end user doesn't really matter at all. Our lives as developers become quite rosy when we know we don't need to code for a gazillion different hardware builds. Plus, if your business requires some specific solution not currently offered, you can always negotiate with the cloud providers and develop your own application to leverage the computing power on offer.

Standards are still in development, as with any new development in IT, but once the dust settles, I would imagine cloud computing will become the cheaper, more economical way to work. I think everyone will still have their desktops and notebooks for everything they do now, but all the heavy lifting will move away. Why spend $4000 on a powerful workstation if you can instead spend $500 on a basic desktop and use the deal you already have with, say, IBM for server time? Right?
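Using the numbers in that comparison (the hourly server rate is an assumption, not a quoted price), the break-even point is easy to compute:

```python
WORKSTATION = 4000.0    # powerful workstation, from the post
BASIC_DESKTOP = 500.0   # basic desktop, from the post
SERVER_RATE = 1.0       # assumed $/hour of rented server time

# Hours of rented compute before the cheap-desktop route has cost
# as much as buying the workstation outright.
break_even_hours = (WORKSTATION - BASIC_DESKTOP) / SERVER_RATE
print(break_even_hours)  # 3500.0
```

At $1/hour you'd need 3500 hours of heavy lifting before the workstation paid for itself, and by then the rented hardware has probably been upgraded under you anyway.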

Now, if only everyone could get affordable gigabit internet connections. Damn ISPs are bleeding us dry!!!

11. May 26, 2014

### enorbet

While "thin client" or "Cloud" computing does hold many advantages, there are also distinct advantages to what could be called isolated clouds, that is, clusters that are not connected to the web, or only for very brief periods (and therefore offer high levels of guaranteed security). I suppose many here are aware of the cluster built by Gaurav Khanna, a professor in the Physics Department at UMass Dartmouth, consisting of eight (8) Sony PS3 gaming consoles, for the purpose of modeling gravitational waves around black holes. The investment was under $3,000 US, iirc. It was so successful that the USAF research lab clustered some 1,700 PS3 consoles. Shortly after this, Sony built all newer consoles with lockouts to prevent the installation of "alternative" operating systems, bringing an end to this incredible windfall. I would imagine this greatly pleased vendors of "real" supercomputers, since the SuperMUC, for example, cost roughly $100,000,000 US, substantially more than the $60,000,000 PS3 equivalent.
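A quick back-of-the-envelope check of those figures (the per-console price is derived from the quoted cluster cost, not an official price, and it ignores networking and power):

```python
CLUSTER_COST = 3000.0  # eight PS3s, as quoted above
CONSOLES = 8

per_console = CLUSTER_COST / CONSOLES
print(per_console)          # 375.0

# Scaling linearly to the USAF-sized cluster of ~1700 consoles:
print(per_console * 1700)   # 637500.0 -- tiny next to $100M
```

Even with generous overhead for interconnects and cooling, the order-of-magnitude gap against a conventional supercomputer is striking.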

There may be good reason to doubt that so many PS3 consoles would scale as well (keeping costs linear), but with such large sums of money involved, one might wonder why any corporation would close shop on such a business opportunity, not to mention the value to science and humanity of low-cost supercomputing.

The pressure to "go cloud" is certainly mounting, and is likely driven, like everything else, by big bucks and the concomitant desire for isolation and control. That said, such configurations allow for timesharing as a possible offset. Apparently they plan to serve the scientific community, but I don't know yet whether that means "Mohammed must come to the mountain" or vice versa. In any case, recent advances and Moore's Law point to an extremely exciting decade ahead.