The cloud is a shitload of computers connected in such a way that it's far more reliable than any single computer, and so you don't need to care about which computer is doing what.
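That "don't care which computer" part is the whole point. As a minimal sketch of what it looks like in practice, assuming AWS and its boto3 Python SDK (the image ID below is a placeholder):

```python
import boto3  # AWS SDK for Python

ec2 = boto3.client("ec2", region_name="us-east-1")

# Ask the cloud for a virtual machine. Note what's absent: no hostname,
# no rack, no physical server -- the provider's scheduler picks one.
response = ec2.run_instances(
    ImageId="ami-12345678",   # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```

You get back an opaque instance ID, not a machine. Which physical box actually runs it is the provider's problem, not yours.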
Yes, those computers physically exist somewhere and are owned by someone, but saying the cloud doesn't exist is just ridiculous. May as well say clouds in the sky don't exist either because they're just water.
That's just it: the "cloud" is just a fancy name for a cluster that's owned by someone else. Everything you've described as a "cloud" already has an established name.
The term "cloud" is a marketing vapor term that loosely refers to a cluster of hypervisors. That's how hypervisors at large scale are pretty much always organized.
The hypervisors in use are not something most people have ever heard of. The most commonly known contenders are Hyper-V (the base technology Azure is built on) and VMware, but most major "cloud" providers, Azure aside, run something else entirely (typically KVM- or Xen-based stacks).
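For the curious, here's roughly what talking to one of those hypervisors directly looks like: a sketch using the libvirt Python bindings against a local KVM/QEMU host (assumes libvirt-python is installed and a hypervisor is running). A cloud control plane does essentially this, plus scheduling and billing, across thousands of hosts:

```python
import libvirt  # Python bindings for libvirt (pip install libvirt-python)

# Connect to the local KVM/QEMU hypervisor.
conn = libvirt.open("qemu:///system")

# List every VM this one host knows about, running or not.
for dom in conn.listAllDomains():
    print(dom.name(), "running" if dom.isActive() else "stopped")

conn.close()
```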
The same description you've provided can also be applied to modern supercomputers, mainframes, and pretty much anything that lives inside a datacenter.
A personal computer has a multitude of single points of failure: a single power supply on a single circuit, a single processor with all memory controllers in that same processor, a single OS drive, a single network interface. Servers generally have multiple power supplies, multiple CPUs, and multiple disk controllers connected to multiple disks in some kind of RAID or equivalent. Basically every single point of failure, with a few exceptions (such as power management/distribution and the motherboard), has been removed.
Then you take those servers and scale up to a whole cluster of them, and you get even more redundancy. A cluster, when done properly, is basically bulletproof against failures. Making it larger increases both capacity and redundancy without increasing latency. Again, when done right.
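Rough back-of-envelope math on why redundancy compounds, assuming failures are independent (they never fully are, which is what all the "when done right" caveats are carrying):

```python
# Probability that at least one of `replicas` identical, independently
# failing components is up, given each is up with probability `single`.
def availability(single: float, replicas: int) -> float:
    return 1 - (1 - single) ** replicas

print(f"{availability(0.99, 1):.6f}")  # 0.990000 -- one server, ~3.65 days down/year
print(f"{availability(0.99, 2):.6f}")  # 0.999900 -- two servers, ~53 minutes down/year
print(f"{availability(0.99, 3):.6f}")  # 0.999999 -- three servers, ~32 seconds down/year
```

Each replica multiplies the downtime probability by another factor of 0.01, which is why clusters get so reliable so fast.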
In all, "cloud" is a marketing buzzword. I don't know of anyone in tech who calls a "cloud" by that name unless they're talking to someone who doesn't know that the "cloud" is fictitious.
I'ma stop you right there. I'm a software engineer who's implemented a lot of cloud-based stuff. It's a term of art, not just a marketing word.
Call it whatever you want, just don't call it a technical term.
It can be anything, except a technical term.