Build Your Own AWS Innovation Lab

Introduction

As an AWS practitioner, I need a place to experiment with new AWS services, try out Infrastructure-as-Code scripts, and build small projects. Let's create one that satisfies the following needs:

  • Minimal costs during periods of no Lab activity (spending a few dollars here and there is acceptable)

  • Ability to deploy AWS resources into a bare-bones AWS account, in one or more Regions

  • Using GitHub.com repositories for source control

  • Support for multiple Terraform projects running simultaneously

  • Low data sensitivity and, as a consequence, broader security permissions

Innovation Lab Architecture

This Lab environment comprises:

  • Personal GitHub.com account and repositories for Terraform

  • Google Authenticator and a Gmail account

  • An AWS Organization with all features enabled, including IAM Identity Center

  • Optional - a development instance of the Okta Identity Provider

Since this environment will be used exclusively for experimentation on mock-up data, it is acceptable to implement fewer security safeguards than in any "real" AWS environment. This also avoids the out-of-pocket costs that come with Control Tower and similar guardrail tooling.

New AWS accounts can be created and closed through the AWS Management Console (each new account starts its own 12-month Free Tier), with root identities in the "plus" email format below, as sketched in the CLI example that follows:
<your_Gmail_username>+<project_name>@gmail.com
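
If the Lab account is a member of your AWS Organization, the same can be done from the CLI; the Gmail username and project name below are hypothetical placeholders:

aws organizations create-account \
    --email "your_gmail_username+tf-lab@gmail.com" \
    --account-name "tf-lab"

# Account creation is asynchronous - poll the request status returned by the call above
aws organizations describe-create-account-status \
    --create-account-request-id <request_id_from_previous_call>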

CI/CD Pipeline for Terraform Deployments

After some deliberation, I decided to use AWS CloudFormation to provision a CodePipeline that

  • Pulls code from the GitHub.com repository

  • Runs the terraform plan command

  • Pauses to give me time to review the Plan

  • Applies the Terraform plan to the target Region

In anticipation of having multiple Terraform projects running in parallel, the AWS resource naming convention is based on a short CloudFormation Stack name, with an occasional sprinkle of the AWS account ID to keep names distinct within global AWS namespaces.

Parameters

You need to understand the concept of CodePipeline GitHub Connections; the gist of it is:

  • It is a way to grant AWS access to your repository

  • Connections are global resources - you can create one in a single Region and use it in multiple Regions

  • When a Connection is created programmatically (through CloudFormation or the CLI), an additional step is required to activate it via the AWS Management Console. A Connection's status can be checked from the CLI, as shown below.

The rest of the Parameters should be self-explanatory.
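
If you are unsure whether a reusable Connection already exists, you can list Connections and their status from the CLI; a Connection stays in the PENDING state until it is activated in the console:

# List GitHub Connections with their activation status
aws codestar-connections list-connections \
    --provider-type-filter GitHub \
    --query "Connections[].{Name:ConnectionName,Status:ConnectionStatus,Arn:ConnectionArn}" \
    --output table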

Parameters:

  ExistingGitHubConnection:
    Type: String
    Description: Provide an existing GitHub Connection ARN or leave blank to create a new one    

  GitHubRepo:
    Type: String
    Description: Name of GitHub Repo to be used by the pipeline (user_name/repo_name)

  GitHubBranch:
    Type: String
    Description: Name of GitHub Branch to be used by the pipeline
    Default: main    

  BuildImageName:
    Type: String
    Description: Docker image for build projects - Ubuntu is recommended
    Default: aws/codebuild/standard:7.0

  TerraformVersion:
    Type: String
    Description: Version of Terraform to be used by the pipeline
    Default: 1.4.6

Conditions

An existing Connection can be shared by all services and Regions within the account, so this Condition determines whether the pipeline will create and use a new Connection or reuse the one you provide.

Conditions:

  NewGitHubConnection: !Equals
    - ""
    - !Ref ExistingGitHubConnection

Resources

This template will create the following resources:

  • GitHub Connection (I was not able to make the creation of the Connection optional, even if you decide to reuse an existing one. The good news is that it is free and can be deleted later)

  • S3 buckets for CodePipeline artifacts and the Terraform state file

  • A DynamoDB table for Terraform state locking

  • IAM roles and policies for the CodePipeline and CodeBuild projects

  • Finally, the CodePipeline itself, with the following stages:

    • Source (GitHub)

    • Build (Terraform Plan)

    • Manual Approval

    • Build (Terraform Apply)

Resources:

#####################################################################
# GitHub Connection will be created in a PENDING state
# Use AWS Management Console -> CodePipeline -> Settings to activate 
# If re-using the existing Connection, the new one's name will end with "to_be_deleted"
#####################################################################
  GitHubConnection:
    Type: 'AWS::CodeStarConnections::Connection'
    Properties:
      ConnectionName: !If [NewGitHubConnection, !Sub "${AWS::StackName}-${AWS::AccountId}-github", !Sub "${AWS::StackName}-to_be_deleted"]
      ProviderType: GitHub

#####################################################################
# S3 Bucket for pipeline artifacts 
#####################################################################
  PipelineBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Sub ${AWS::StackName}-${AWS::AccountId}-codepipeline 

#####################################################################
# S3 Bucket for Terraform state files 
#####################################################################
  TerraformStateBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Sub ${AWS::StackName}-${AWS::AccountId}-tfstate

#####################################################################
# DynamoDB table for Terraform locks
#####################################################################
  TerraformLockTable:
    Type: AWS::DynamoDB::Table
    Properties: 
      TableName: !Sub ${AWS::StackName}-${AWS::AccountId}-tflock    
      BillingMode: PAY_PER_REQUEST 
      AttributeDefinitions: 
        - 
          AttributeName: "LockID"
          AttributeType: "S"
      KeySchema: 
        - 
          AttributeName: "LockID"
          KeyType: "HASH"

#####################################################################
# IAM Role for CodePipeline 
#####################################################################
  PipelineServiceRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Sub ${AWS::StackName}-pipeline-service-role
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Action: sts:AssumeRole
            Principal:
              Service:
                - codepipeline.amazonaws.com
                - codebuild.amazonaws.com
      Policies:
        - PolicyName: !Sub ${AWS::StackName}-CodePipelineInlinePolicy
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Sid: UseGitHubConnection
                Resource: '*'
                Effect: Allow
                Action:
                  - codestar-connections:UseConnection
              - Sid: CodeBuildPermissions
                Resource: '*'
                Effect: Allow
                Action:
                  - codebuild:StartBuild
                  - codebuild:BatchGetBuilds
                  - sns:Publish
              - Sid: CloudWatchLogs
                Resource: '*'
                Effect: Allow
                Action:
                  - logs:CreateLogGroup
                  - logs:CreateLogStream
                  - logs:PutLogEvents
              - Sid: AccessPipelineBucket
                Effect: Allow
                Action:
                  - s3:Get*
                  - s3:ListBucket
                Resource:
                 - !Sub arn:aws:s3:::${PipelineBucket}
              - Sid: AccessPipelineBucketObjects
                Effect: Allow
                Action:
                  - s3:PutObject*
                  - s3:GetObject*
                Resource:
                  - !Sub arn:aws:s3:::${PipelineBucket}/*

#####################################################################
# IAM Role for CodeBuild
# This role should be able to produce all Amazon resources you need,
# thus elevated privileges are given 
# DO NOT DO IT IN HIGHER ENVIRONMENTS WITHOUT PERMISSION BOUNDARIES!
#####################################################################
  ProjectServiceRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Sub ${AWS::StackName}-codebuild-role
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Action: sts:AssumeRole
            Principal:
              Service:
                - codebuild.amazonaws.com
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/AdministratorAccess   

#####################################################################
# CodeBuild Project to run Terraform plan command
#####################################################################
  TerraformPlanProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: !Sub ${AWS::StackName}-terraform-plan
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Type: LINUX_CONTAINER
        Image: !Ref BuildImageName
      ServiceRole: !GetAtt ProjectServiceRole.Arn 
      LogsConfig:
        CloudWatchLogs:
            Status: ENABLED         
      Source:
        Type: CODEPIPELINE
        BuildSpec: !Sub |
            version: 0.2
            env:
              exported-variables:
                - BuildID
                - BuildTag
            phases:
              install:
                commands:
                  - "curl -s https://releases.hashicorp.com/terraform/${TerraformVersion}/terraform_${TerraformVersion}_linux_amd64.zip -o terraform.zip"
                  - "unzip terraform.zip -d /usr/local/bin"
                  - "chmod 755 /usr/local/bin/terraform"
                  - "rm terraform.zip"                  
              pre_build:
                commands:
                  - terraform init -input=false -backend-config="bucket=${TerraformStateBucket}" -backend-config="key=${AWS::StackName}.tfstate" -backend-config="dynamodb_table=${TerraformLockTable}" -backend-config="region=${AWS::Region}"
              build:
                commands:
                  - terraform plan -lock=true -input=false -out=${AWS::StackName}-terraform.tfplan -no-color 
            artifacts:
              name: TerraformPlan
              files:
                - ${AWS::StackName}-terraform.tfplan

#####################################################################
# CodeBuild Project to run Terraform apply command
#####################################################################     
  TerraformApplyProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: !Sub ${AWS::StackName}-terraform-apply
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Type: LINUX_CONTAINER
        Image: !Ref BuildImageName
      ServiceRole: !GetAtt ProjectServiceRole.Arn 
      LogsConfig:
        CloudWatchLogs:
            Status: ENABLED         
      Source:
        Type: CODEPIPELINE
        BuildSpec: !Sub |
            version: 0.2
            env:
              exported-variables:
                - BuildID
                - BuildTag
            phases:
              install:
                commands:
                  - "curl -s https://releases.hashicorp.com/terraform/${TerraformVersion}/terraform_${TerraformVersion}_linux_amd64.zip -o terraform.zip"
                  - "unzip terraform.zip -d /usr/local/bin"
                  - "chmod 755 /usr/local/bin/terraform"
                  - "rm terraform.zip"             
              pre_build:
                commands:
                  - terraform init -input=false -backend-config="bucket=${TerraformStateBucket}" -backend-config="key=${AWS::StackName}.tfstate" -backend-config="dynamodb_table=${TerraformLockTable}" -backend-config="region=${AWS::Region}"

              build:
                commands:
                  - cp $CODEBUILD_SRC_DIR_TerraformPlan/${AWS::StackName}-terraform.tfplan .
                  - terraform apply ${AWS::StackName}-terraform.tfplan       

#####################################################################
# Finally, the CodePipeline 
##################################################################### 
  IaCPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt PipelineServiceRole.Arn
      Name: !Sub ${AWS::StackName}-pipeline
      ArtifactStores:
        - Region: !Ref AWS::Region
          ArtifactStore:
            Type: S3
            Location: !Sub ${PipelineBucket}
      Stages:
        - Name: Get-GitHub-Source
          Actions:
            - Name: GitHub
              RunOrder: 1
              ActionTypeId:
                Category: Source
                Provider: CodeStarSourceConnection
                Owner: AWS
                Version: '1'
              Namespace: GitHubSource
              OutputArtifacts:
                - Name: GitHubCode
              Configuration:
                ConnectionArn: !If [NewGitHubConnection, !Ref GitHubConnection, !Ref ExistingGitHubConnection]
                FullRepositoryId: !Ref GitHubRepo
                BranchName: !Ref GitHubBranch
                OutputArtifactFormat: CODE_ZIP
                DetectChanges: true

        - Name: Create-Terraform-Plan
          Actions:
            - Name: terraform_plan
              RunOrder: 1
              Namespace: TFPlan
              InputArtifacts:
                - Name: GitHubCode
              OutputArtifacts:
                - Name: TerraformPlan
              ActionTypeId:
                Category: Build
                Provider: CodeBuild
                Owner: AWS
                Version: '1'
              Configuration:
                ProjectName: !Ref TerraformPlanProject
        - Name: Approve-Terraform-Plan
          Actions:
            - Name: review-plan
              RunOrder: 1
              ActionTypeId:
                Category: Approval
                Provider: Manual
                Owner: AWS
                Version: '1'

        - Name: Apply-Terraform-Plan
          Actions:
            - Name: terraform-apply
              RunOrder: 1
              Namespace: TFApply
              InputArtifacts:
                - Name: GitHubCode
                - Name: TerraformPlan
              ActionTypeId:
                Category: Build
                Provider: CodeBuild
                Owner: AWS
                Version: '1'
              Configuration:
                ProjectName: !Ref TerraformApplyProject
                PrimarySource: GitHubCode

GitHub Repository for Terraform

Notice that the pipeline passes the necessary Terraform backend configuration through command-line parameters, so the same Terraform code can run unchanged in any account and Region:

terraform init -input=false \
     -backend-config="bucket=${TerraformStateBucket}" \
     -backend-config="key=${AWS::StackName}.tfstate" \
     -backend-config="dynamodb_table=${TerraformLockTable}" \
     -backend-config="region=${AWS::Region}"

So in the Terraform code, you only need to declare the backend type; the remaining settings are supplied at init time:

terraform {
  backend "s3" {
    encrypt = true
  }
}
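
For reference, a local workstation can point at the same backend by passing the values explicitly; the bucket, key, and table names below are hypothetical and simply follow the naming convention of the CloudFormation template:

terraform init -input=false \
    -backend-config="bucket=mylab-111122223333-tfstate" \
    -backend-config="key=mylab.tfstate" \
    -backend-config="dynamodb_table=mylab-111122223333-tflock" \
    -backend-config="region=us-east-1"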

Setting Up the Pipeline

  • Create a CloudFormation Stack from this template through the AWS Management Console or the CLI (a CLI sketch follows this list). The CodePipeline will try to execute but fail immediately because the GitHub Connection is in a PENDING state

  • Activate the GitHub Connection (AWS Management Console -> CodePipeline -> Settings)

  • "Release Changes" in CodePipeline

  • Once the Create-Terraform-Plan stage completes, open "Details" to review the output of the terraform init and terraform plan commands

  • If satisfied, review and approve the Manual Approval step

  • Check the creation of AWS resources upon completion of the Apply-Terraform-Plan stage.
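
Here is a minimal CLI sketch of the first step, assuming the template is saved locally as pipeline.yaml and the Stack is named mylab (both names are placeholders); CAPABILITY_NAMED_IAM is required because the template creates named IAM roles:

aws cloudformation create-stack \
    --stack-name mylab \
    --template-body file://pipeline.yaml \
    --capabilities CAPABILITY_NAMED_IAM \
    --parameters ParameterKey=GitHubRepo,ParameterValue=my-user/my-terraform-repo \
                 ParameterKey=GitHubBranch,ParameterValue=main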

Cleaning Up

Since both Terraform and CloudFormation are involved, cleaning up the resources takes several steps (a CLI sketch follows this list):

  • Update your Terraform code to remove the resources and re-run the pipeline

  • If you plan to use the AWS Management Console to delete the CloudFormation Stack, keep in mind that the pipeline artifact and Terraform state buckets need to be emptied first. With the AWS CLI, you can force-delete S3 buckets that still contain objects.

  • Delete CloudFormation Stack

  • Delete the CloudWatch Log Groups that were created for the CodePipeline and CodeBuild projects
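
Here is a cleanup sketch using the CLI, again assuming the Stack name mylab and account ID 111122223333 (substitute your own values); aws s3 rb --force empties and removes a bucket in one step, and the log group names follow the CodeBuild default of /aws/codebuild/<project-name>:

# Empty and delete the pipeline artifact and Terraform state buckets
aws s3 rb s3://mylab-111122223333-codepipeline --force
aws s3 rb s3://mylab-111122223333-tfstate --force

# Delete the CloudFormation Stack
aws cloudformation delete-stack --stack-name mylab

# Remove the CodeBuild log groups
aws logs delete-log-group --log-group-name /aws/codebuild/mylab-terraform-plan
aws logs delete-log-group --log-group-name /aws/codebuild/mylab-terraform-apply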

Conclusion

The ability to quickly launch and later destroy a temporary cloud environment without breaking the bank can make the difference between taking on an innovation or learning project and passing on it.