A Look at the History of Cloud Computing

Over the last several years, cloud computing has emerged as one of the biggest drivers of business agility and efficiency in modern organizations. Where CIOs once struggled to determine whether cloud adoption would be worthwhile, they now focus on how soon they can deploy cloud resources and which workloads to move into this environment.

As essential as this technology has become to companies across industries, its current iteration is still fairly young. Cloud computing as we know it today has taken shape only over the last decade or so. However, it traces its roots back more than six decades, long before the Internet or the personal computer.

Read on to explore the long history of cloud computing, from early theories to current applications.

 

1950s – A preliminary vision

During the 1950s, computing was a luxury reserved for government entities, large companies, and academic institutions because of the sheer amount of equipment required to operate a machine. By the middle of the decade, the world had only about 250 computers, and acquiring one could cost millions of dollars, putting computing power largely out of reach of the masses.

In 1955, John McCarthy sought to overcome this hurdle by proposing a concept he called “time sharing.” Under this model, groups of users rented time on a single mainframe, connecting to it through simple devices known as “dumb terminals.” Many people could thus work on one machine at once, maximizing its processing power and minimizing its idle time. This groundbreaking concept represents the earliest form of shared computing resources, the cornerstone of modern cloud computing.

 

1960s – Thoughts of a computer network

Resource sharing continued to rise in popularity into the 1960s, when researchers at the UK’s National Physical Laboratory pioneered “packet switching.” The approach complemented time sharing by dividing data into small “packets” that could travel independently over shared lines, making it practical for numerous users to operate within a network at once.

McCarthy made additional strides toward the modern idea of cloud computing in 1961, when he proposed that computing might one day be sold as a utility, much as water and electricity are. In 1963, the computer scientist J. C. R. Licklider introduced his vision of an “Intergalactic Computer Network”: a system in which computers would connect with one another so that users could access information and programs from any location. Licklider’s idea ultimately led to the creation of the Advanced Research Projects Agency Network (ARPANET), which in 1969 linked computers at four research sites.

 


 

1970s – The introduction of virtual machines

Cloud computing began to take a more tangible form during the 1970s, when the first virtual machines (VMs) came into use. Virtualization updated the mainframe-sharing model popularized in the 1950s by allowing a single physical machine to host more than one computing system at a time. Different virtual computers, each with its own operating system, could now run concurrently in isolated environments on the same hardware. The cutting-edge functionality of VMs gave rise to the broader practice of virtualization, which went on to become a major driver of the technology’s progress.
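
As a present-day aside, the short sketch below (a minimal illustration assuming a Linux host, where CPU capabilities are listed in /proc/cpuinfo) checks for the hardware virtualization extensions that modern descendants of those early VMs rely on.

```python
# Minimal sketch: detect hardware virtualization support on Linux.
# "vmx" is Intel's extension (VT-x); "svm" is AMD's (AMD-V).
def supports_virtualization(cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                return bool(flags & {"vmx", "svm"})
    return False

if __name__ == "__main__":
    print("Hardware virtualization available:", supports_virtualization())
```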

 

1990s – Major leaps forward

After the introduction of supercomputing centers and smaller personal computers in the 1980s, the 1990s ushered in a new era of computing innovation. It was in this decade that telecommunications firms, which had previously offered dedicated point-to-point data connections, began shifting their focus to virtual private networks (VPNs). This not only allowed the firms to lower their prices but also made it possible for businesses to share data in unprecedented ways: using VPNs, they could transmit data over a shared network just as they would if their computers were linked by a dedicated private line.

The 1990s also saw the birth of the World Wide Web, and as bandwidth grew, organizations could connect their computing equipment ever more seamlessly. During this time, people used the word “cloud” to describe the invisible space separating providers from end users. It was not until 1997 that Emory University professor Ramnath Chellappa first used “cloud computing” to describe a “computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits alone.”

 


 

2000s to now – Expansion of cloud computing services

When the dot-com bubble burst in the early 2000s, the Internet businesses that survived began to re-evaluate their IT architecture in search of greater efficiency. Cloud computing began to flourish as these companies made Web-based service delivery part of their business models.

Amazon Web Services (AWS) was one of the first major cloud offerings to emerge during the 2000s. At its inception, this platform delivered various cloud computing services on a pay-per-use basis. In 2006 Amazon introduced Elastic Compute Cloud (EC2), which was one of the earliest Infrastructure as a Service (IaaS) offerings on the cloud market.
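
For a sense of what Infrastructure as a Service looks like in practice, here is a minimal sketch using boto3, Amazon’s current Python SDK (a modern tool that postdates the 2006 launch; the AMI ID and region below are placeholders for illustration, not values from the original service).

```python
# Minimal sketch: renting compute on demand, the core idea behind IaaS.
# Assumes boto3 is installed and AWS credentials are configured locally.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a single small instance; billing starts when it launches and
# stops when it is terminated -- the pay-per-use model.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")

# When the workload is done, release the hardware back to the pool.
ec2.terminate_instances(InstanceIds=[instance_id])
```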

Over the next several years, cloud services took off at full speed, with numerous vendors launching products of their own. In 2008, providers began offering private clouds as an alternative to the public cloud model.

As cloud computing matured in the late 2000s, an increasing number of clients at both the enterprise and small-business levels adopted the technology. The rise of smartphones propelled the movement further by giving users seamless access to the cloud from their mobile devices.

Cloud services have grown exponentially since the 2000s, expanding to encompass additional service models and deployment environments. Some of the largest companies in the world, such as Netflix and Apple, now rely on cloud computing to run their operations. As of 2018, 96 percent of organizations surveyed were using at least one cloud.