Thursday, October 14, 2010

What is Cloud Computing and What Does This Stupid Buzzword Mean?

According to the National Institute of Standards and Technology, the official definition of “Cloud Computing” is this incomprehensible piece of nonsense, clearly written to be as confusing as possible:

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

So what’s a definition for real people?

Cloud Computing = Web Applications

That’s all there is to it. If you’re using a web or internet-based application from a major provider like Google or Microsoft, you’re using cloud computing. Congrats!

Every web application that you’ve ever used, like Gmail, Google Calendar, Hotmail, Salesforce, Dropbox, and Google Docs, is based on “cloud computing”, because when you connect to one of these services, you’re really connecting to a massive pool of servers somewhere out there on the internet. The client doesn’t need to be a web browser, but that’s the direction everything is heading.

So Why Cloud Computing?

We’ve already established that it’s a pointless term that simply describes web applications, which have been around for a very long time. But to get businesses to start switching from self-hosted servers to web applications, the marketing types invented a new buzzword.

The reason why they used the word “cloud” in the buzzword is simple: in network diagrams, the internet is usually represented with a cloud in the middle of the drawing. Those marketing drones are inventive, aren’t they?

So basically the term itself is just a way for consultants and companies to sell more services in a shiny new package.

How Can Cloud Computing Help Me?

Since businesses everywhere are moving their applications to the web and coming out with new and interesting features accessible through your web browser, you’ll soon be able to access virtually anything from any browser on any PC, and the line between the desktop and the internet will blur.

Now that Microsoft has finally released the beta for Internet Explorer 9, which supports new web standards like HTML5 and uses hardware acceleration to speed up the whole experience, every browser will finally be on the same footing. When Microsoft said that IE9 is going to change the web, they weren’t kidding: they were the only ones holding the web back with their anemic IE7 and IE8 browsers, not to mention the ancient IE6. Now the nightmare is finally almost over.

It’ll get even more interesting when Chrome OS is finally released. It’s basically an entire operating system built around a web browser as the primary interface, with all of your applications running as web applications instead of local ones. Hopefully it will support web integration the way IE9 does with the Windows 7 taskbar.

How Is Cloud Computing Different for Businesses?

If you’re in the IT world you’re probably scratching your head at this point and thinking that I’m oversimplifying the idea behind cloud computing, so let’s explain the real difference from the more technical side of things.

In the past, every company would run all of their applications on their own servers, hosted at their own location or data center. This obviously requires a lot of maintenance and money to keep everything running, upgraded, and secure.

From a business perspective, companies can now move much of their computing to cloud services, which provide the same applications that you would otherwise install on your own servers, but accessible over the internet to any of their customers. Have you read about companies switching to Google Docs? That’s a perfect example of companies switching from hosting their own local servers to using cloud computing instead.

But what if your company provides a service to others? You can also take advantage of cloud computing by creating applications that don’t run on your own servers, but instead use server resources provided by one of the big providers: Google has App Engine, Microsoft has Windows Azure, and Amazon has their EC2 service.
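To give you an idea of how little code that takes, here’s a minimal sketch of a Python WSGI handler, the standard interface that hosted platforms like Google App Engine build on. The greeting text and function name are just placeholders; the point is that you write the handler and the provider supplies the servers:

```python
# A minimal WSGI application. Platforms like Google App Engine run
# standard Python code like this on their own server pools -- you
# never touch the hardware.

def application(environ, start_response):
    # environ describes the incoming HTTP request; start_response
    # sends the status line and headers back to the client.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b"Hello from somebody else's servers!"]
```

Deploying that to a cloud platform is mostly a matter of uploading it along with a small config file; the provider handles routing, scaling, and keeping the machines alive.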

Most of these services operate on a pay-for-resources basis, so your application is only charged for the CPU time and network bandwidth it actually uses. When your application is small and doesn’t have a lot of users, you don’t get charged much, but the benefit is that it can scale up to 10,000 users without any trouble (though you’ll be paying a lot more for the added usage).
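To make the pay-for-resources idea concrete, here’s a back-of-the-envelope sketch. The rates are made-up placeholders for illustration, not any provider’s real prices:

```python
# Back-of-the-envelope pay-per-use estimate. Both rates below are
# invented for illustration -- check your provider's actual pricing.
CPU_RATE_PER_HOUR = 0.10      # dollars per CPU-hour (placeholder)
BANDWIDTH_RATE_PER_GB = 0.15  # dollars per GB transferred (placeholder)

def monthly_cost(cpu_hours, gigabytes_out):
    """You pay only for what the application actually consumed."""
    return (cpu_hours * CPU_RATE_PER_HOUR
            + gigabytes_out * BANDWIDTH_RATE_PER_GB)

# A small app with few users stays cheap:
small = monthly_cost(cpu_hours=50, gigabytes_out=10)     # 5.00 + 1.50 = 6.50
# The same code scaled up to thousands of users just costs more:
big = monthly_cost(cpu_hours=5000, gigabytes_out=1000)   # 500.00 + 150.00 = 650.00
```

The key contrast with self-hosted servers is that there’s no up-front hardware bill: the cost curve starts near zero and grows with usage instead.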

Web Applications are the future. Cloud Computing is a stupid buzzword.
