Why implement a Cloud Platform

Category: Cloud Platform

March 14, 2022 by Andre Verheij

A common scenario

Imagine a random AWS customer: Nick’s Xmas Lights Inc. Nick has built an AWS backend for a very cool feature that controls and visualises the Christmas lights on his house. Passers-by can use a QR code or NFC tag on the mailbox to open a website that lets them select the next Christmas light sequence and song; the song then plays and the lights react to the music.

Nick has a single EC2 instance behind an Application Load Balancer (ALB). The instance runs a NodeJS website that passers-by interact with on their phones. The website sends requests via API Gateway to a backend API at Nick’s house, which triggers the lights, and it also writes a record of each selected song and the type of phone used to a DynamoDB table.
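
As a rough idea of how one piece of this could be expressed in CloudFormation (foreshadowing the Infrastructure as Code discussion below), here is a minimal sketch of the DynamoDB table; the table and attribute names are made up for illustration and are not from Nick’s actual setup:

  AWSTemplateFormatVersion: '2010-09-09'
  Description: Hypothetical table recording which song was selected and from what phone
  Resources:
    SongSelectionTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: xmas-song-selections    # illustrative name only
        BillingMode: PAY_PER_REQUEST       # on-demand capacity, no sizing required
        AttributeDefinitions:
          - AttributeName: selectionId
            AttributeType: S
        KeySchema:
          - AttributeName: selectionId
            KeyType: HASH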

The potential problem

Some issues with the current setup:

  • Single AWS account
  • Single EC2 instance, no high availability
  • No development environment. As a result, the external developer couldn’t test updates; changes were made straight in production.
  • No Infrastructure as Code (IaC). The setup was done through the AWS console.

Nick’s Xmas Lights Inc. isn’t the first AWS customer to set things up like this: get something going, see how it works, and then move on to other matters in the business.

There are some solutions for the above issues:

  • Use multiple accounts: one for development, one for production.
  • Run multiple EC2 instances, deploy the same code to all of them, and spread them across multiple Availability Zones.
  • Start using CloudFormation for Infrastructure as Code (see the sketch after this list).
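
As a minimal sketch of the multi-AZ point, assuming the website is moved onto an Auto Scaling group (the ALB target group wiring is omitted for brevity): the subnets, AMI and instance type below are placeholders, not values from Nick’s environment.

  AWSTemplateFormatVersion: '2010-09-09'
  Description: Sketch of running the website on EC2 instances spread over two Availability Zones
  Parameters:
    SubnetA:
      Type: AWS::EC2::Subnet::Id    # subnet in the first Availability Zone
    SubnetB:
      Type: AWS::EC2::Subnet::Id    # subnet in the second Availability Zone
    ImageId:
      Type: AWS::EC2::Image::Id     # AMI with the NodeJS website baked in (placeholder)
  Resources:
    WebLaunchTemplate:
      Type: AWS::EC2::LaunchTemplate
      Properties:
        LaunchTemplateData:
          ImageId: !Ref ImageId
          InstanceType: t3.micro    # illustrative instance size
    WebAutoScalingGroup:
      Type: AWS::AutoScaling::AutoScalingGroup
      Properties:
        MinSize: '2'                # at least one instance per Availability Zone
        MaxSize: '4'
        VPCZoneIdentifier:
          - !Ref SubnetA
          - !Ref SubnetB
        LaunchTemplate:
          LaunchTemplateId: !Ref WebLaunchTemplate
          Version: !GetAtt WebLaunchTemplate.LatestVersionNumber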

For some of the above issues there is a raft of other things to consider, and various possible solutions, which we’ll explore in other blog posts. In this post, we’ll discuss the multiple-account setup.

Multiple account structure

Using multiple AWS accounts is a recommended way to improve security and to contain issues in one account so they don’t impact other accounts. The idea behind this is to reduce the “blast radius”. If a dev account is misconfigured and someone turns off all EC2 instances by mistake, it doesn’t impact the production environment. The same goes for logging and auditing: all actions in the dev and prod accounts can be logged to separate accounts where nobody has access to delete the log entries, so the data in these logs stays protected even if someone is able to do damage to a dev account.

This is where AWS Control Tower comes in. AWS Control Tower is a service that makes it easy to set up a baseline, best-practice starting point for your AWS cloud environment.

Control Tower combines existing AWS services into a “self-service” management environment where new accounts are easily created. These new accounts are then set up with the required security settings, centralised logging, networking and access control. This is done in the “Account Factory”, which uses AWS Service Catalog to make this happen.
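
As an indication only, provisioning a new account through the Account Factory typically asks for a handful of parameters along these lines (the values are made up for illustration, and the exact parameter names may differ between Control Tower versions):

  # Typical Account Factory inputs (illustrative values; names may vary by version)
  AccountName: xmas-lights-dev             # name of the new workload account
  AccountEmail: aws-dev@example.com        # unique email address for the new account
  ManagedOrganizationalUnit: Development   # OU the new account is placed in
  SSOUserEmail: nick@example.com           # initial AWS SSO user for the account
  SSOUserFirstName: Nick
  SSOUserLastName: Example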

Basic setup of AWS Control Tower

To set up AWS Control Tower, you start with a brand new AWS account. This will be the management account. There are some tricks with regard to email addresses that are best sorted out before you get started; we have a blog post for this: How to set up email addresses for AWS Control Tower.

In this new account, we suggest setting up AWS Organizations and AWS SSO (Single Sign-On) first; there is another blog post coming on SSO. Once AWS SSO is in place, AWS Control Tower can be set up, and Control Tower will then use AWS SSO for access to the new AWS accounts it creates.

Control Tower will set up a Logging account and an Audit account as part of the process. After the setup completes, you could go to town creating more accounts, but it’s best to do this after a bit of a design phase, so the environment is fit for purpose.

Customisations for Control Tower

As part of deploying AWS Control Tower, we also recommend deploying an AWS solution called “Customisations for Control Tower”. As the name suggests, it allows for customisations on top of what Control Tower natively provides. The customisations process is triggered automatically when a new account is created and then deploys additional configuration into your new AWS accounts. Some ideas for customisations (a couple of which are sketched after this list) could be:

  • Block public access on S3 buckets
  • Enable EBS encryption by default
  • Enable GuardDuty automatically
  • Enable Security Hub automatically
  • Deploy a GitHub identity provider (used for CI/CD pipelines; another blog post)
  • Networking configuration
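
As a rough sketch, two of these customisations (enabling GuardDuty and Security Hub) could be expressed as a single CloudFormation template that the customisations pipeline rolls out to each new account; the logical resource names are illustrative:

  AWSTemplateFormatVersion: '2010-09-09'
  Description: Sketch of a customisation enabling GuardDuty and Security Hub in an account
  Resources:
    GuardDutyDetector:
      Type: AWS::GuardDuty::Detector
      Properties:
        Enable: true              # turns on GuardDuty in this account and region
    SecurityHub:
      Type: AWS::SecurityHub::Hub # creating the hub resource enables Security Hub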

The customisations framework uses CloudFormation templates that are deployed to the various accounts using StackSets. In a configuration file for the environment, you specify which accounts or Organisational Units you want to deploy to. It’s flexible enough to use account names as deployment targets, so there’s no need to hard-code account numbers.
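
As an indication of what that configuration file can look like, here is a sketch in the style of the solution’s manifest file; the region, template path, OU and account names are placeholders, and the exact schema depends on the version of the solution you deploy:

  region: ap-southeast-2          # home region of the Control Tower environment (placeholder)
  version: 2021-03-15             # manifest schema version (check the solution documentation)
  resources:
    - name: security-baseline
      resource_file: templates/security-baseline.template   # e.g. the sketch above
      deploy_method: stack_set    # rolled out via CloudFormation StackSets
      deployment_targets:
        organizational_units:
          - Workloads             # every account in this OU, targeted by name
        accounts:
          - xmas-lights-dev       # individual accounts can be targeted by name too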

Setup of guardrails in Control Tower

Control Tower enables and enforces policies in your AWS accounts that stop bad things from happening, or things that might be costly. For example, it can stop you from deploying anything into AWS Regions you don’t want to, or can’t, deploy into for legal reasons. Another example is to “disallow lifecycle configuration changes on S3 buckets”, which stops people from changing lifecycle settings on S3 buckets, for example a setting that moves objects into Amazon S3 Glacier for long-term storage.
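
Guardrails like these are implemented as Service Control Policies. As an illustration only (shown in YAML with comments for readability; an actual SCP is a JSON document, and the policies Control Tower generates are more complete), the two examples above might look roughly like this:

  Version: '2012-10-17'
  Statement:
    - Sid: DenyUnapprovedRegions        # illustrative region list, not the real guardrail
      Effect: Deny
      NotAction:                        # the real policy exempts more global services
        - 'iam:*'
        - 'organizations:*'
        - 'sts:*'
      Resource: '*'
      Condition:
        StringNotEquals:
          'aws:RequestedRegion':
            - ap-southeast-2
            - us-east-1
    - Sid: DenyS3LifecycleChanges
      Effect: Deny
      Action:
        - 's3:PutLifecycleConfiguration'   # blocks changes to S3 lifecycle configuration
      Resource: '*'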

Use of Organisational Units (OUs) in Control Tower

Structuring AWS accounts into Organisational Units is common practice because it allows you to set Service Control Policies (SCPs) at the OU level. Some of the above guardrails are in fact SCPs. Using OUs, you can be selective about where these SCPs are applied. So developers might be allowed to modify S3 bucket lifecycle settings in the accounts in the Development OU, where they are working out what settings their application needs. But when it comes to production, developers shouldn’t have access anyway, nor should their deployment tools be able to change the lifecycle settings.

Workload accounts

After the creation of Control Tower’s default accounts, like Logging and Audit, it’s time to create accounts for your workloads. This is the part where we “solve” the problem described earlier in this blog post. Yes, we have created accounts for logging and audit, but that isn’t where you run your workloads. We need to create a few workload accounts, such as development, test and production accounts. Not everyone needs a test account, but at a minimum you need Development (dev) and Production (prod).

Developers should only have access to the dev environment, where they develop the software for your organisation (or whatever services your company provides). Once the developers are ready to release, a copy of the development environment is deployed to the production environment. This deployment should really be done using CI/CD processes, which automate it to the point where testing and deployment are scripted and completely hands-off; a sketch of such a pipeline follows below. The production environment should be locked down; not many people need access to it.
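
As a rough sketch of what such a pipeline could look like with GitHub Actions, building on the GitHub identity provider customisation mentioned earlier: the role ARN, region, stack name and template path are placeholders, not values from a real environment.

  # Hypothetical workflow deploying to the production account on every merge to main
  name: deploy-to-prod
  on:
    push:
      branches: [main]

  permissions:
    id-token: write       # needed to obtain an OIDC token for the AWS role
    contents: read

  jobs:
    deploy:
      runs-on: ubuntu-latest
      steps:
        - uses: actions/checkout@v3
        - uses: aws-actions/configure-aws-credentials@v2
          with:
            role-to-assume: arn:aws:iam::111111111111:role/github-deploy   # placeholder prod role
            aws-region: ap-southeast-2                                     # placeholder region
        - name: Deploy the CloudFormation stack
          run: |
            aws cloudformation deploy \
              --stack-name xmas-lights-web \
              --template-file templates/web.template \
              --capabilities CAPABILITY_NAMED_IAM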

So does this solve the problem?

Control Tower helps you create and maintain the multiple accounts for dev and production. Having multiple accounts, and allowing for growth in your organisation, sets you up for the future. Creating new accounts for new ideas or for testing new features lets you keep the “business as usual” accounts clean until a deployment method has been developed for the new product or application.

Want to learn more? Contact us here!