Microservices are quickly changing the face of cloud computing, giving cloud architects the tools needed to move away from provisioning resources statically, such as with servers and virtual machines. New types of workloads, like serverless and containers, are allowing for greater operational efficiencies, and compute as a service (CaaS) is now more affordable and scalable than ever before.
Despite the advantages of microservice architectures, and the rush to implement them quickly, they require careful thought when it comes to security. Providing workload security at multiple layers helps ensure that microservice architectures remain as secure as possible, and it is vital to understand all of the moving parts involved.
Serverless and Container Components
Before discussing the challenges, we need to better understand the components that make up both serverless and container workloads, along with their security needs. From a cloud security standpoint, traditional security methodologies cannot simply be applied to serverless or container-based architectures, because each differs fundamentally from the traditional server model. With these new workload architectures, there is no longer a static server with a fixed IP address running the infrastructure and applications as an indivisible unit.
Instead, serverless is an execution-based architecture that runs application code on demand—automatically building out infrastructure whenever the function (e.g., an AWS Lambda) is invoked.
Needless to say, serverless doesn’t mean that no servers are involved.
Rather, it means that the servers providing the compute a user requires can be replaced and updated dynamically. While this spares system administrators the headache of thinking about infrastructure, it creates a nightmare for cloud security practitioners, who quickly lose visibility and control.
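To make the execution model concrete, the sketch below shows what an on-demand function might look like. This is a minimal, hypothetical example in the style of an AWS Lambda handler; the function name, event fields, and response shape are illustrative assumptions, not taken from the article or any specific deployment.

```python
import json

def handler(event, context):
    """Minimal sketch of an on-demand function handler.

    The platform provisions and scales the underlying servers on each
    invocation; the developer supplies only this function. 'event' carries
    the invocation payload and 'context' carries runtime metadata (it may
    be None when invoking the function locally for testing).
    """
    # Read an illustrative field from the payload, with a fallback default.
    name = event.get("name", "world")

    # Return a response in the common API-gateway-style shape.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

Note that nothing in the function references a host, IP address, or server process: from the security practitioner's perspective, the only stable artifact is the code itself, which is exactly why the traditional host-centric controls discussed above stop applying.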