When the Windows 7 commercials exclaimed, "To the Cloud," many IT purists cringed. The term cloud computing first gained popularity in 2007 (though the concept dates back to the 1960s) and has come to mean many things to many people. Most commonly, anything even remotely related to virtualization, or to internal or external repositories of data, is "the cloud," don't ya know?
The generic understanding involves taking one's data and, rather than housing it locally, storing it elsewhere. Hence, the cloud. In reality, the data is hosted in a data center run by Amazon, Apple, or another hosting company.
The term is used by consumers, businesses, VARs, and analysts alike. Ironically, it's referenced in a multitude of ways yet never truly defined. The inferred meaning grew and deepened while steadily losing any clarity, leaving us where we are today: a murky term that causes more heated conversations than last week's Republican debate.
This ambiguity has many denying the cloud even exists, logically extrapolating that since it can't mean one thing to everyone, it has gone the way of grid computing, Friendster, and the dodo. The counterpoint: how can one deny something that has changed everything from how businesses perform daily tasks to how the video and music industries deliver their products? Defined or not, these technologies now affect most people's daily lives. Maybe the true definition lies in the word itself: clouds by nature are ever-changing and evolving, a stratus to a cumulus to a cirrus.