Confusion around Cloud Computing
5th Oct 2016
There seems to be more confusion (not less) mounting about what cloud computing actually represents
One of the most commonly misunderstood aspects of cloud computing is how it actually works. Let's back up a bit. In a traditional grid computing system (the current standard, and probably what you're using right now), every individual user has their own private stock, if you will, of hardware, software, storage, and operational components. While there is certainly nothing wrong with this model, it does highlight some fundamental problems in terms of energy and resource usage for us all. In grid computing, nearly everyone owns a wildly overpowered system, and most users never touch more than a fraction of their available resources over the life of their machine.
All of that extra power simply goes to waste in a grid computing scenario. In a cloud computing setup, by contrast, all hardware (and most software too) is centralized, allowing users access to as much power as they require. Here's how it works, by way of example: say you have two systems, one grid and one cloud, with 50 users on each. In the grid model, most of the users, let's say 45 of them, only use their computers for light applications and browsing the net. The other 5 might be doing resource-intensive work across a number of programs. Those 45 users have a wealth of computing power right under their fingertips that they will never use, but because the grid system doesn't allow computing resources to be shared, all of that extra capacity goes to waste.
Now, let's apply the same logic to a cloud computing system, with the same numbers. What makes the cloud superior is that it can provide each user with as much power as they need, on an individual basis no less. Users in a cloud computing network don't actually have (or need) full-fledged systems; their interface devices are often little more than a browser-based OS with a modest amount of RAM and a limited hard drive. Users on a cloud network don't need hardware or software resources to be local (local meaning 'with them' or 'on them'). The great thing about cloud computing is that it provides everything for each individual user (and I mean everything).
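To make the arithmetic behind that 50-user example concrete, here is a minimal sketch of the comparison. The per-user capacity and demand figures are hypothetical (the article only specifies the 45/5 split); the point is simply that dedicated machines sit mostly idle, while a shared pool only needs to cover the actual aggregate demand.

```python
# Hypothetical figures: each dedicated "grid" machine offers 100 units of
# compute; light users need 10 units, heavy users need 90. The 45/5 split
# comes from the example above.
LIGHT_USERS, HEAVY_USERS = 45, 5
MACHINE_CAPACITY = 100           # units per dedicated machine (assumed)
LIGHT_NEED, HEAVY_NEED = 10, 90  # per-user demand (assumed)

# Grid model: every user gets a full machine, whether they need it or not.
grid_capacity = (LIGHT_USERS + HEAVY_USERS) * MACHINE_CAPACITY
actual_demand = LIGHT_USERS * LIGHT_NEED + HEAVY_USERS * HEAVY_NEED

# Cloud model: one shared pool sized to the aggregate demand (plus headroom).
cloud_capacity = int(actual_demand * 1.2)  # 20% headroom, also assumed

print(f"Grid provisions  {grid_capacity} units for {actual_demand} of demand "
      f"({grid_capacity - actual_demand} wasted)")
print(f"Cloud provisions {cloud_capacity} units for the same demand "
      f"({cloud_capacity - actual_demand} held as headroom)")
```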
In a cloud computing model you have an extensive set of centralized resources, often called the 'hardware stack'. This hardware pool is capable of housing and running the contents of many individual PCs. The centralized hardware can literally create virtualized machines for individual users, which they can then customize further. Inside the cloud you have a total pool of power and resources that is often comparable to a set of supercomputers.
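As one concrete illustration of how a user (or a script acting on their behalf) can requisition a virtual machine from that centralized pool, here is a minimal sketch using Amazon EC2 via the boto3 library. AWS, the region, and the image and instance-type values are assumptions on my part, not anything named in this article; every major provider exposes an equivalent call.

```python
# A minimal sketch: asking a cloud provider to carve a virtual machine out of
# its centralized hardware pool. Assumes AWS credentials are configured and
# that the AMI ID below exists in the chosen region (both are placeholders).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t2.micro",          # a small, customizable slice of the pool
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Provisioned virtual machine {instance_id}")
```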
The cloud actually simulates each individual user's operating system, applications, everything they need, from their interface or access point. What's beautiful about it, however, is that if a user needs to requisition extra power or resources to accomplish something, it is done automatically by the cloud itself. What does this mean exactly? It means that every user on a cloud computing network can have near-instant access to what amounts to unlimited computing resources.
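Here is a minimal, provider-agnostic sketch of that automatic requisitioning. The pool, thresholds, and step sizes are all hypothetical stand-ins; in a real deployment the provider's own monitoring and scaling services would drive this rather than a hand-rolled loop.

```python
import random

# When a user's workload outgrows its current allocation, the cloud grants
# more resources on the fly. All numbers below are assumed for illustration.

SCALE_UP_THRESHOLD = 0.80   # assumed: grow once 80% of the allocation is busy
SCALE_STEP = 2              # assumed: add 2 more "machine" units at a time

allocations = {"alice": 4, "bob": 4}  # units of capacity currently granted

def current_utilization(user: str) -> float:
    """Stand-in for a real monitoring API: fraction of the allocation in use."""
    return random.uniform(0.0, 1.0)

def autoscale(user: str) -> None:
    """Grow a user's slice of the shared pool whenever it runs hot."""
    if current_utilization(user) >= SCALE_UP_THRESHOLD:
        allocations[user] += SCALE_STEP
        print(f"{user}: scaled up to {allocations[user]} units")

for user in allocations:
    autoscale(user)
```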
So, cloud computing really is an entirely new way of looking at computing and networking, isn't it? In many ways it is blurring the line between what is network interfacing and what is personal computing. It's essentially a system for remotely deploying virtualized systems, either on their own or in complex groups. Because cloud computing uses technologies originally designed solely for the web to perform critical tasks and services, it is, in many ways, bridging the gap between web-based technologies and more conventional hardware and software.