When we talk about the ‘cloud’, a term that comes up constantly in IT, we can trace its origins back to the 1990s. The underlying ideas are even older: around the 1960s, universities and research institutes laid down fundamentals for the building blocks of the internet, and with them the groundwork for what was to become possibly the most important storage solution of the 21st century and beyond. When the internet opened to the public in 1991 with the World Wide Web, more than a million machines suddenly began talking to each other via TCP/IP and client-server architectures. The term ‘cloud’ was coined in a research paper by Professor Ramnath Chellappa (Emory University), and the concept also appeared in a 1996 internal document at the then computer giant Compaq. Cloud computing was to be an evolution of ‘grid computing’, which in its primitive state just before the 2000s was a series of connected organizational resources. Grid computing, however, was not cloud computing: those resources were nowhere near ready to be accessed freely and remotely, both because the technology was immature and because internet speeds were turtle-like. The first iteration of a modern cloud service arrived in 2002, when Amazon launched AWS (Amazon Web Services) as a public cloud option. Amazon was a pioneer without competitors, although there was little practical use for the cloud at the time, even though the industry would have breathed a sigh of relief at escaping the server maintenance nightmares of the era.
With the start of the dot-com boom, more and more websites began to slowly move to the cloud, bringing in new cloud generations over the next decade. In 2005, the OpenNebula project became the first software toolkit to facilitate cloud use, allowing providers to host data that users could interact with via mobile and web applications. In 2006, Amazon released EC2 (Elastic Compute Cloud), which consolidated the use of data centers and gave users more trust in the model. Services like Dropbox followed, alongside agreements and policies such as QoS (quality-of-service) guarantees and SLAs (service level agreements) that supported the cloud movement. Another important development, ‘edge computing’, came to life in 2009. With the 2010s, and especially between 2012 and 2018, came the second generation of cloud computing and storage: improved services and a growing number of providers put cloud technology firmly on the map. Monitoring capabilities were introduced, along with the critical real-time streaming service model, and ‘container services’ arrived around 2014. Cloud technology was now fully public, integrated, and in everyday use, and private and public clouds could be combined into hybrid clouds.
Now we arrive at the 2020s. Modern cloud storage systems rely on a critical piece of software called a ‘hypervisor’. According to IBM, “By installing and configuring a piece of software called a hypervisor across multiple physical nodes, a system would present the environment’s entire resources as though those resources were in a single physical node”. IBM goes on to explain the fundamentals of cloud computing today: “To visualize that environment, technologists use terms like utility computing and cloud computing, since the sum of the parts seemed a nebulous blob of computing resources you could then segment out as needed (like telecommunications companies did in the 1990s)”. As hypervisor technology and cloud development grew in sophistication, organizations began to understand the benefits of the big pool of remotely accessible data that is the cloud.
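The resource-pooling idea IBM describes can be sketched in a few lines of Python. This is a toy model of the concept only, not any real hypervisor API; the class names, method names, and numbers below are invented for illustration.

```python
# Toy model of hypervisor-style resource pooling: several physical nodes
# are presented as one logical pool of CPU and memory, and virtual
# machines are carved out of that pool on demand.

class Node:
    """One physical machine contributing resources to the pool."""
    def __init__(self, name, cpus, mem_gb):
        self.name = name
        self.cpus = cpus
        self.mem_gb = mem_gb

class ResourcePool:
    """Aggregates the resources of many nodes into a single pool,
    so callers see one big node rather than many small ones."""
    def __init__(self, nodes):
        self.free_cpus = sum(n.cpus for n in nodes)
        self.free_mem_gb = sum(n.mem_gb for n in nodes)

    def allocate_vm(self, cpus, mem_gb):
        """Carve a VM out of the pool; return None if exhausted."""
        if cpus > self.free_cpus or mem_gb > self.free_mem_gb:
            return None
        self.free_cpus -= cpus
        self.free_mem_gb -= mem_gb
        return {"cpus": cpus, "mem_gb": mem_gb}

# Two 16-CPU/64 GB nodes appear as one 32-CPU/128 GB environment.
pool = ResourcePool([Node("node-a", 16, 64), Node("node-b", 16, 64)])
vm = pool.allocate_vm(4, 8)  # succeeds; pool drops to 28 CPUs, 120 GB
```

A real hypervisor also virtualizes devices, schedules CPU time, and isolates guests; the sketch captures only the “sum of the parts as one nebulous blob” idea from the quote above.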
We have come from primitive TCP/IP client-server protocols, early virtualization, and basic grid computing to a world where the IoT (Internet of Things) is everywhere, and billions of smart devices with on-the-go mobile internet are connected to multiple clouds at once, around the clock, at ultra-high speed. Now it is time to look at the problematic side of this wonderful cloud technology: cybersecurity. Today, even the most popular PaaS and SaaS applications (cloud-native apps like Google mail, Google Apps, and many more) are at risk from both internal and external incidents. What does this mean? It means there are two key vulnerabilities that can bring down an entire organization by breaching its cloud storage: human error and cybercrime.
As mentioned earlier, internal and external threats to a sensitive cloud storage environment are very real. Cybercrime has grown sophisticated and merciless in attacking, disrupting, and destroying an organization’s backbone: its cloud storage environment. Worse, human error and negligence can leave endpoints unsecured and expose further vulnerabilities, a worrying combination for an economy that is statistically unprepared to secure critical cloud environments. Let’s take a look at some of the foremost concerns in cloud security:
Above are several of the issues that plague cloud storage for the organizations tied to it, ranging from technical insufficiency, human error, and unpreparedness to a lack of policy and compliance readiness. In a dynamic, ever-changing, and complex multi-cloud environment (private, public, and hybrid combinations), cybersecurity incidents are bound to happen, even in 2021. Today’s cloud platforms are robust, and providers such as AWS, GCP, and Azure offer native security services from the get-go. The responsibility, however, ultimately falls on the end user, who has to think about data leaks, breaches, targeted attacks, and the misconfigurations and negligence born of human error, none of which cloud providers take responsibility for. Industry cybersecurity leaders and IT giants recommend the following approach to a natively more secure cloud environment:
The remainder of this decade will see a walling-off of cloud environments, with security controls tightening as the technology reaches even more users and the number of apps grows unchecked. As dependence on and demand for cloud storage increase, so does the attack surface available to cybercriminals. Building security into a cloud storage plan from the ground up is therefore critical: it stops unauthorized users from intruding into the system and prevents lateral movement once inside. This means a stricter, no-mercy cybersecurity approach that demands more levels of authorization from users. The extra layers of security will hurt the user experience (UX), but they are mandatory for protecting enormous cloud environments. We should also see a steep rise in demand for cybersecurity training and education within organizations, which should greatly benefit the future security of these systems.
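The misconfiguration and human-error risk discussed above is exactly the kind of problem that lends itself to automated auditing. Below is a minimal, hypothetical sketch of such a check: the bucket records and field names are invented for illustration, and a real audit would pull this data from a cloud provider’s own API rather than a hard-coded list.

```python
# Toy audit: flag storage buckets that a human error has left readable
# by the public. The "acl" and "block_public_access" fields are
# illustrative stand-ins for real provider configuration settings.

def find_public_buckets(buckets):
    """Return the names of buckets exposed to public read access."""
    return [
        b["name"]
        for b in buckets
        if b.get("acl") == "public-read"
        or not b.get("block_public_access", True)
    ]

# Example inventory: one safe bucket, one accidentally exposed.
buckets = [
    {"name": "finance-backups", "acl": "private", "block_public_access": True},
    {"name": "static-assets", "acl": "public-read", "block_public_access": False},
]
exposed = find_public_buckets(buckets)
```

Running checks like this continuously, instead of once at setup time, is one concrete way an organization can build security in “from the ground up” rather than relying on the provider to catch its own misconfigurations.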