“The cutting edge of innovation is on the consumer side — digital technologies for consumption activity, play, entertainment and social-networked communication — and not in corporations anymore,” observed Timothy F. Bresnahan, an economist at Stanford.
Nowhere is that more apparent than in cloud computing, the technology industry’s buzz term for accessing information held in big, remote data centers over the Internet from anywhere, as if the services lived in a cloud.
It’s hard to read any technology news that doesn’t mention an emerging technology or new product with the cloud moniker attached. We’ve all heard the term since it entered mainstream use around 2007, and many of us are still wondering: what is cloud computing? It seems worth understanding, since it appears to be the direction IT is heading.
Cloud computing, at its core, is accessing the resources and services needed to perform functions whose needs change dynamically. An application is built from the resources of multiple services, potentially in multiple locations. An application or service developer requests access from the cloud rather than from a specific endpoint or named resource. Behind the scenes, the cloud manages multiple infrastructures across multiple organizations, with one or more frameworks overlaid on top of those infrastructures to tie them together.
The cloud is a virtualization of resources that maintains and manages itself. There are, of course, people keeping the hardware, operating systems and networking in proper order, but from the perspective of a user or application developer, only the cloud is referenced.
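The idea in the preceding two paragraphs can be sketched in a few lines of code. This is a toy illustration, not any real cloud provider’s API: the `Cloud` class, `register`, and `request` names are invented here. The point is the shift it demonstrates — the caller states a need (“something with at least 4 CPUs”) instead of connecting to a specific named machine, and the cloud layer picks a matching resource from pools it manages across providers.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    """One unit of capacity in some provider's pool."""
    provider: str
    cpus: int
    in_use: bool = False

class Cloud:
    """Hypothetical overlay tying multiple providers' pools together.

    Callers never name a specific machine; they describe what they need
    and the cloud allocates any resource that satisfies the request.
    """
    def __init__(self):
        self._pool = []

    def register(self, provider, cpus, count):
        # A provider contributes `count` identical resources to the pool.
        for _ in range(count):
            self._pool.append(Resource(provider, cpus))

    def request(self, min_cpus):
        # The caller states a need, not an endpoint; the cloud decides
        # which concrete resource (and which provider) fulfills it.
        for r in self._pool:
            if not r.in_use and r.cpus >= min_cpus:
                r.in_use = True
                return r
        raise RuntimeError("no capacity available")

cloud = Cloud()
cloud.register("provider-a", cpus=2, count=2)
cloud.register("provider-b", cpus=8, count=1)

# Ask for a capability, not a named host like "host-b-01".
job = cloud.request(min_cpus=4)
print(job.provider, job.cpus)  # provider-b 8
```

In a real system the pool would span data centers and organizations, and the scheduling logic would weigh load, cost and locality, but the user-facing contract is the same: reference the cloud, not the machine.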
Dell said that it would invest $1 billion over the next two years to build 10 new data centers and expand customer support, largely for cloud offerings.