History of Cloud Computing: Marking a significant shift in its business model, IBM announced last week that it is splitting itself into two public companies. Its IT infrastructure support unit will be listed as a separate company by the end of 2021.
IBM's core business will now be high-margin cloud computing, with a focus on open hybrid cloud and AI solutions. The move has not come as a surprise to industry watchers and insiders, because IT infrastructure support is rapidly becoming a shrinking, low-margin business due to cloud computing and automation.
Moving to cloud computing as a core offering is a natural progression for IBM since it played a pivotal role in the evolution of cloud computing from time-sharing mainframes to virtual data servers. After all, IBM was the company that manufactured the first commercially viable mainframes. Let us look at the history of cloud computing briefly to appreciate this better.
What is cloud computing?
Cloud computing can be defined as the availability of shared computer resources as per users' requirements. In other words, it is the flexible availability of data storage space, computing power, technical tools, and applications, where users pay only for the resources they use. Because many users share the same set of resources, the cost of setting up, maintaining, and using those assets decreases considerably.
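The pay-only-for-what-you-use idea can be illustrated with a short calculation. The resource names and rates below are purely hypothetical, not any provider's actual pricing:

```python
# Hypothetical pay-as-you-use billing: charges accrue only for resources consumed.
RATES = {
    "compute_hours": 0.10,      # per VM-hour (assumed rate)
    "storage_gb_months": 0.02,  # per GB-month stored (assumed rate)
    "data_transfer_gb": 0.05,   # per GB transferred out (assumed rate)
}

def monthly_bill(usage: dict) -> float:
    """Sum cost across metered resources; anything not used costs nothing."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# A small workload: 200 VM-hours, 500 GB stored, 40 GB of outbound transfer.
bill = monthly_bill({"compute_hours": 200, "storage_gb_months": 500, "data_transfer_gb": 40})
print(f"${bill:.2f}")  # 200*0.10 + 500*0.02 + 40*0.05 = $32.00
```

If the workload shrinks next month, the bill shrinks with it; there is no fixed hardware cost to amortize.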
Time Sharing Mainframes - Concept behind evolution of Cloud Computing
The concept behind cloud computing has evolved over the past seven decades, starting in the 1950s, when companies like IBM and DEC proposed and implemented time sharing on their mainframe machines to help clients optimize costs.
With time sharing, an organization installed just one or two mainframes on the premises, and multiple terminals anywhere in the building could hook up to them, maximizing the return on the investment. Time sharing became popular as a means to optimize the usage of expensive physical assets, and the idea eventually evolved beyond mainframe computation.
If you look at the mainframe arrangement, what it essentially meant was that users sitting at their terminals were using infrastructure that was physically distant from them. Whenever they wanted, the resources were available as if they were the sole users, when in reality they were sharing them with many other users. This is the concept behind cloud computing.
Let's now look at how cloud computing has evolved over the decades.
Sharing computing resources over network (1955 – 2000)
In 1955, the computing time sharing theory was proposed by American computer scientist John McCarthy. It enabled businesses to make optimum use of their Mainframes.
In 1963, the Defense Advanced Research Projects Agency (DARPA) granted USD 2 million to Project MAC at MIT, whose aim was to develop high-availability time-sharing systems. It produced a primitive cloud in which computational resources were shared between two or three users.
It was not yet called cloud computing; the term used was virtualization, the software-based representation of hardware resources such as servers, storage devices, and networks.
The concept of virtualization matured through the 1970s with virtual machines, fully functional computer systems with their own operating systems running on shared physical hardware. Combined with the growth of the Internet and, later, the popularity of virtual private networks (VPNs), this eventually led to the development of modern cloud computing.
Modern use of the term cloud computing
The origin of the term cloud computing is still debated. A deeply researched MIT Technology Review article traces it to either Sean O'Sullivan, a young technologist who later applied to trademark the term "cloud computing" but failed, or George Favaloro, a Compaq marketing executive. In a 1996 meeting, the two discussed Compaq taking over O'Sullivan's start-up, which sold Internet computing.
Even though it is not clear who used the term cloud computing first in the 20th century, in its modern avatar the term was first used and explained by Google CEO Eric Schmidt at the Search Engine Strategies Conference in 2006.
Development of cloud computing platforms (2001 – till date)
Amazon was the first major company to offer cloud computing services. It launched Amazon Web Services and introduced its Elastic Compute Cloud (EC2) in August 2006, a pay-as-you-use subscription model that quickly gained popularity.
Google App Engine was launched in April 2008 as a Platform as a Service (PaaS) to capture the growing interest in web applications hosted on the cloud. It allowed developers to build and host apps on Google Cloud infrastructure. It was formally adopted as the Google Cloud Platform in 2013.
Microsoft released Windows Azure in February 2010; it was renamed Microsoft Azure in 2014.
In July 2010 Rackspace Hosting and NASA jointly launched an open-source cloud-software initiative OpenStack.
In March 2011, IBM announced the IBM SmartCloud framework.
In June 2012 Oracle announced Oracle Cloud. It provides users with access to an integrated set of IT Solutions, including applications, platforms, and infrastructure layers.
Most popular cloud service providers in 2020
To give you a glimpse into the growth achieved by these cloud platforms, here are revenue figures for some of them.
| Cloud platform | Revenue (USD billions) | Year-on-year growth |
| --- | --- | --- |
| Amazon Web Services (AWS) | 10.8 (2020 Q2) | 29% |
| Google Cloud | 3.007 (2020 Q2) | 43% |
| Microsoft Azure | 11.9 (2020 Q2) | 27% |
| IBM Cloud | 6.3 (2020 Q2) | 3% |
| Oracle Cloud | 6.8 (2020 fiscal) | 1% |
| Alibaba Cloud | 21.76 (2020 Q2) | 34% |
All these cloud platforms are high-availability, pay-as-you-use services that give their users easy access to the latest technologies, such as artificial intelligence, edge computing, IoT, big data, multi-cloud, and machine learning.
This is proving to be a great leveler in the business landscape, as organizations without high-end IT infrastructure and expertise can also leverage these technologies to drive growth.
Why cloud computing became popular at the beginning of this millennium
The primary reason behind the rising popularity and acceptance of cloud computing is the widespread digital transformation of businesses across all industries. As businesses turn digital, workloads and data volumes grow, increasing the need for data storage and related services.
Not all enterprises, big or small, have the in-house IT capabilities needed to manage this on their own. Cloud computing service providers take care of data servers, databases, networking, applications, analytics, and other related infrastructure.
Of course, some industries, such as manufacturing, healthcare, retail, banking, and entertainment, are leveraging cloud computing technologies more than others, because their operational models are easier to adapt to the cloud.
Why should businesses adopt cloud computing?
Outsourcing data storage and analysis to the cloud is a smart business decision for organizations of all sizes and capabilities. Even those with substantial in-house IT skills can benefit from it, as it offers many advantages. The time and resources saved can be devoted to core business functions.
- No need to set up infrastructure: The most significant advantage of cloud computing is that organizations do not need to install software or hardware. Besides taking care of the installation headache, the cloud service provider is also responsible for maintaining and upgrading the infrastructure. Users do not need these skills in-house, which cuts down on both time and cost.
- Automatic upgrades: Keeping up with the latest trends is the most challenging aspect of any technology. When you sign up for cloud computing services, you do not need to devote time or resources to upgrading and integrating with the latest version; the service provider takes care of it.
- Integrating legacy applications: Barring startups, most organizations come with their own legacy applications. Customizing them to meet the latest technological advancements can be both expensive and time-consuming, and one can never be sure of the outcome. Replacing them completely with new applications would be time- and resource-intensive and would waste perfectly usable infrastructure. In contrast, integrating them with cloud technologies is a better and easier proposition.
- Reduction in IT infrastructure capital expenditure: Setting up and maintaining IT infrastructure is a major expense for any organization, however big. Using a public cloud means organizations can cut down on the initial investment; all they need to worry about is the operational expense of using the cloud. As most cloud service providers use a subscription or pay-as-you-use model, expenses can be adjusted as per requirement, reducing the company's overall IT cost overhead.
- Easier maintenance of applications: Whether you are using a public cloud or a private one, maintenance of cloud computing applications is easier because there is just one instance of the application that needs to be maintained. In a traditional software model, each application is installed on multiple systems and each system must be managed.
- Easy scalability: The cloud delivery model also ensures that businesses do not need to worry about the availability of infrastructure when planning to scale up. On-premises, the capital expenditure on additional IT infrastructure, such as data servers, applications, and networking, can become a major concern during an expansion.
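The scalability advantage can be sketched with some simple arithmetic. The traffic figures and per-instance capacity below are assumptions for illustration; the point is that elastic provisioning follows the load, while on-premises capacity must be sized for the peak:

```python
import math

def instances_needed(requests_per_sec: float, capacity_per_instance: float) -> int:
    """Elastic scaling: provision just enough instances for the current load."""
    return max(1, math.ceil(requests_per_sec / capacity_per_instance))

# Hypothetical traffic samples over a day (requests/sec).
traffic = [50, 120, 800, 1500, 900, 200]
CAPACITY = 250  # requests/sec one instance can serve (assumed)

elastic = [instances_needed(r, CAPACITY) for r in traffic]
fixed = max(elastic)  # on-premises hardware must be sized for the peak at all times

print(elastic)  # [1, 1, 4, 6, 4, 1]
print(f"peak-provisioned: {fixed} instances all day; elastic average: {sum(elastic)/len(elastic):.1f}")
```

With pay-as-you-use pricing, the business pays for the elastic average rather than the fixed peak.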
Cloud delivery models
Cloud services can be delivered in three models – public, private and hybrid.
In the public cloud model, the IT services are delivered over the Internet. It is the most popular model of delivering cloud computing services because it is highly scalable and usually has low-cost subscription-based pricing.
Services may be free, freemium, or subscription-based where users pay as per the number of resources consumed. The cloud service provider is responsible for developing, managing, and maintaining the compute resources being shared between multiple users across the network.
Even though the public cloud is very cheap, because no upfront investment is required and subscription fees are typically low, it has downsides as well. It is the least secure of the three models, so it should not be used for sensitive, mission-critical workloads.
In exchange for low pricing, users also have minimal technical control, which can make ensuring compliance an issue. And although subscription costs are low, beyond a certain usage level the total cost can exceed that of owning equivalent infrastructure.
Any cloud solution used by a single organization is called a private cloud. Its resources and tools are not shared with other organizations, so the data center may be located on-premises or offsite. The resources are delivered via a secure private network and through no other channel.
The most important advantage of the private cloud is that it is highly secure and customizable to the unique business and security requirements of the organization. But that also renders it expensive. It is therefore best used where sensitive data must be stored and compliance requirements are stringent. Highly regulated industries like banking, as well as government agencies, usually use private clouds.
As the name indicates, hybrid cloud is a combination of both public and private cloud. Typically, it combines on-premises infrastructure with a private cloud in such a way that data and apps can move between the two environments seamlessly.
The decision to use hybrid cloud model is usually taken keeping in mind multiple factors like data security requirements, regulation and compliance requirements, level of data control, applications used, business goals, etc.
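The decision factors above can be sketched as a rule-of-thumb helper. The rules and inputs here are illustrative assumptions, not formal guidance; a real decision would weigh many more factors:

```python
def suggest_delivery_model(sensitive_data: bool,
                           strict_compliance: bool,
                           variable_workload: bool) -> str:
    """Map a few high-level requirements to a cloud delivery model (illustrative only)."""
    if sensitive_data and strict_compliance:
        # Regulated data stays on private infrastructure; bursty work
        # can still spill over to a public cloud in a hybrid setup.
        return "hybrid" if variable_workload else "private"
    # No special security or compliance needs: the cheap, scalable default.
    return "public"

print(suggest_delivery_model(sensitive_data=True, strict_compliance=True, variable_workload=True))   # hybrid
print(suggest_delivery_model(sensitive_data=False, strict_compliance=False, variable_workload=True)) # public
```

A bank with seasonal traffic would land on hybrid under these rules, while a content start-up with no regulated data would land on public.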
Hybrid cloud is also gaining popularity due to the growth of IoT over the past decade, as it can easily be customized to include edge computing at scale. Edge computing brings computation closer to IoT devices, drastically reducing the time and bandwidth spent communicating with distant data servers.
Data privacy and security concerns in cloud computing
No discussion of cloud computing can be complete without considering the data privacy and security concerns of using cloud applications. Like any technology, cloud computing is a double-edged sword. On the one hand, because the data is centralized, it is easier to secure.
On the other hand, because the data is stored in the cloud, that is, at a third-party location, data privacy and security depend on the policies adopted by the cloud service provider.
Businesses must research the privacy and security policies of a cloud service provider before signing up. Ideally, the service provider should be able to keep customer data safe from leaks and theft, and also customize services to implement security measures specific to any organization.
TechAhead’s expertise in cloud computing
TechAhead offers end-to-end cloud consulting services. We possess expertise in the development, deployment, migration, transformation, management, and maintenance of cloud services.
Our team can help you select the right cloud deployment model, public, private, or hybrid, for your business by sharing technical knowledge. We host apps through all three models. Our public cloud services are hosted on Amazon Web Services, Microsoft Azure, and Google Cloud Platform, the top three cloud service providers in the industry.
Cloud computing is the flexible availability of data storage space, computing power, technical tools, and applications where users pay only for those resources that they use. Since many users are utilizing the same set of resources, the cost of use decreases drastically. Also, enterprises that do not have in-house IT skills can use IT tools easily.
The decision to select the right cloud model should be taken on the basis of data security requirements and regulation and compliance requirements, applications used, business goals, etc.