2018-02-19

Peanut Butter and Chocolate: Azure Functions CI/CD Pipeline with AWS CodeCommit (Part 1 of 6)

Source: 1981 Reese's Peanut Butter Cup Advertisement

Intro

This blog series will cover a Proof of Concept (POC) project for creating a PowerShell-based Azure Functions CI/CD pipeline where the code is stored in the AWS CodeCommit git-based version control system. The pipeline will be created and deployed using Windows PowerShell 5.1. Every step of the pipeline deployment process will be verified with Pester tests. The result of the project will be the ability to push changes to an AWS CodeCommit repository and have those changes automatically deployed to Azure Functions.

This blog series is targeted at intermediate-level PowerShell users, so basic PowerShell concepts will not be described in detail. The series also requires some basic understanding of both the Azure and AWS clouds and their PowerShell-based management. Git and C# .NET Core are also leveraged in this project, but they will not be covered in depth as this is a PowerShell-centric blog. Readers need only be familiar with the basic concepts of git and C#.


Series Table of Contents


Part 1

In this part of the series, I will cover why this project was done, why the project was named the way it was, the project prerequisites, the primary components of the project, and how the CI/CD pipeline flows.

Disclaimer: this is a Proof of Concept only. None of this has been production tested. Deploying cloud resources may incur costs, and those costs are the responsibility of the cloud account and subscription owner. The PowerShell-based Azure Functions are considered experimental. Do not deploy this to production environments. Mark Kraus is not responsible for any physical or financial harm, damages, loss of life, or financial remuneration as a result of using this project. "Don't try this at home!"


Why Was "Peanut Butter and Chocolate" Chosen for the Project Name?

The name choice for this project might be odd for younger readers or for those not familiar with American culture. There is a candy product named Reese's Peanut Butter Cups which is essentially chocolate-covered peanut butter in the shape of a miniature pie or cup. In the 1970s and 1980s, Reese's ran a series of television advertisements where two passing strangers, one with peanut butter and one with chocolate, accidentally collide. The chocolate and peanut butter accidentally mix, resulting in a serendipitously delicious treat.

The phrase "You put peanut butter in my chocolate!" taken from this series of advertisements has become a kind of catch phrase of mine. Years ago I began using it to describe disparate or incompatible technologies when used together.  I use the phrase "Putting peanut butter in chocolate"  to describe the process of integrating seemingly incompatible technologies. I used this often when dealing with Windows and Linux cross-platform solutions and integrations. Now that we live in a world where "cloud" is becoming the platform, I find applying this "peanut butter and chocolate" phraseology to cross-cloud solutions to be just as apropos.

For this project, Peanut Butter refers to AWS and Chocolate to Azure. Both alone are tasty, but together they are even better! Since this is a cross-cloud POC and my first adventure into cross-cloud solutions, I wanted a fun name for this project. Thus "Peanut Butter and Chocolate", or "PBnC" for short.


More Importantly, Why The $Expletive Would You do This?

"Why would anyone architect something like this?"
- David O'Brien

This is a fair question. However, I have a real-world use case for this. My company, Mitel, has selected AWS for its cloud infrastructure. AWS is awesome! But so is Azure! Due to several confidential circumstances, we will have an Azure commitment indefinitely. We will never be a one-cloud company, and that is perfectly OK, in my opinion.

I want to leverage serverless technologies as much as possible for my team's automation. I'm a bit enamored with serverless. For years now, servers have been nothing more to me than a place to run code. They are a pain to maintain, keep updated, and secure, and they sit around doing nothing for long periods of time.

I reached out to AWS and had a nice long meeting with a certain member of the PowerShell community famous for their AWS PowerShell skills. It was great and informative, but the ultimate answer was that Windows PowerShell-based serverless was not an option in AWS. I could do cool things with EC2 and/or ECS to make execution containers for PowerShell dynamically available; however, these are not "serverless" solutions. They would also be more expensive than leveraging Azure Automation or Azure Functions.

Also, I can't use VSTS. The decision was made for IT to store all of its code in AWS CodeCommit. This leaves me with either using a "serverfull" solution in AWS or going cross-cloud.

This POC shows how we can leverage both our AWS and Azure commitments to realize cost savings while gaining all the benefits of serverless solutions and version-controlled source code.

Why Windows PowerShell? Why not PowerShell Core or Python? Because my team already has a very significant amount of automation and tooling written in PowerShell (running on a single server under Task Scheduler), and many of the APIs required (on-prem AD and SharePoint) are not yet available in .NET Core or have insufficient tooling available. Sure, we could spend several years getting up to speed on Python, working through feature parity limitations, and converting all of our code to Python to leverage AWS Lambda Functions, but that's just not realistic. To be clear, there are other options on the table to accomplish our needs, but this one is in the running to be selected.

Finally, I needed to learn the AWSPowerShell module. This provided me an excellent opportunity to get acquainted with it. In the process I also expanded my knowledge of the AzureRM cmdlets; I had never done a template-based resource deployment before. I also wanted to validate Jeffrey Snover's claim that PowerShell can manage any cloud by managing two clouds at once.


Prerequisites

First, you will need the project code cloned via git from https://github.com/markekraus/PeanutButterChocolate. Next, you will need both an Azure subscription and an AWS account. Cloud pricing is not my forte, but the resources deployed in this project will either be free or very cheap depending on your accounts and subscriptions. Through the 10 or so test deployments of this project, I have accrued a total of US $3.00 in usage fees across both AWS and Azure. Be advised that this may cost you a small amount, and I am not responsible for any charges you incur as a result of deploying this project.
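
For example, from a PowerShell prompt (the destination folder is up to you):

    # Clone the PBnC project code to the local machine
    git clone https://github.com/markekraus/PeanutButterChocolate.git
    Set-Location .\PeanutButterChocolate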

This project is intended to be deployed from a Windows system with Windows PowerShell 5.1. You will also need to install the .NET Core SDK and Git for Windows. The deployment will need to be done by accounts with admin-level access in both the Azure and AWS clouds (instructions not provided). I have not worked out exactly which fine-grained permissions are required to create all the resources, so this project assumes the AWS and Azure accounts used to deploy cloud resources are full admins.
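
Once those are installed, a quick sanity check from a PowerShell prompt might look like this (the exact version output will vary):

    # Verify the local tooling is in place
    $PSVersionTable.PSVersion   # should report 5.1.x
    dotnet --version            # confirms the .NET Core SDK is on the PATH
    git --version               # confirms Git for Windows is on the PATH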

Additionally, you will need to follow the instructions for associating and configuring an SSH key for use in Git to push/pull to/from AWS CodeCommit for your AWS Admin account.

Finally, this project makes use of Pester 4.2.0 features. You will need to update to at least Pester 4.2.0 to avoid errors in the Pester tests.
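
If you are still on the Pester version that ships in-box with Windows, something like this will bring you current (-SkipPublisherCheck is needed because the in-box module is signed by a different publisher than the gallery version):

    # Update Pester from the PowerShell Gallery to at least 4.2.0
    Install-Module -Name Pester -MinimumVersion 4.2.0 -Force -SkipPublisherCheck
    Get-Module -ListAvailable -Name Pester | Select-Object -Property Name, Version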


Primary Components

This project is made up of several parts. First is Configuration.ps1, the deployment script used to deploy all of the cloud resources. It is intended to be run interactively, as there is one part which can currently only be done in the AWS Web Console and cannot be done ahead of time through the deployment. This script includes Pester tests every step of the way to ensure a smooth deployment.
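
To give a feel for that deploy-then-verify pattern, here is a minimal sketch (not the actual Configuration.ps1 code; the resource group name is illustrative):

    # Illustrative deploy-then-verify step; not taken from Configuration.ps1
    $ResourceGroupName = 'PBnC-Example'
    New-AzureRmResourceGroup -Name $ResourceGroupName -Location 'West US 2'

    Describe 'Resource Group deployment' {
        It 'created the resource group' {
            $RG = Get-AzureRmResourceGroup -Name $ResourceGroupName
            $RG.ResourceGroupName | Should -Be $ResourceGroupName
        }
    }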

Next is the PBnC Azure Function demo code, located in the src folder. It contains a single PowerShell Azure Function, "PBnC", which takes a "name" parameter and returns a string.
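
In the v1 experimental PowerShell model, HTTP-triggered functions read and write through file paths that the runtime supplies in the $req and $res variables. A minimal sketch of such a function (not the actual PBnC code) might look like this:

    # run.ps1 -- sketch of a v1 experimental PowerShell HTTP function
    # $req and $res are file paths supplied by the Functions runtime
    $requestBody = Get-Content $req -Raw | ConvertFrom-Json
    $name = $requestBody.name
    Out-File -Encoding Ascii -FilePath $res -InputObject "Hello $name!"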

There is a C# Lambda Function .NET Core solution and project in the CSLambda folder. This Lambda is invoked by CodeCommit when any action is taken on the master branch of the repository. It then triggers an external git deployment in Azure, and that trigger causes the Azure Web App to pull from the CodeCommit repository.

The Azure resources that will be deployed are a Resource Group, a consumption plan Azure Function Web App, and the storage account used by the Azure Function Web App.
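
Deploying those with the AzureRM cmdlets follows the usual template pattern. A hedged sketch (the names, location, and template file are illustrative placeholders):

    # Illustrative AzureRM template deployment; names are placeholders
    Login-AzureRmAccount
    New-AzureRmResourceGroup -Name 'PBnC-RG' -Location 'West US 2'
    New-AzureRmResourceGroupDeployment -ResourceGroupName 'PBnC-RG' -TemplateFile '.\AzureDeploy.json'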

The AWS resources deployed are a CodeCommit repository, an IAM user (used for HTTPS Git credentials), a Lambda function (the C# one mentioned earlier), a role for the Lambda to assume, and a KMS key used to encrypt and decrypt secrets.
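
With the AWSPowerShell module, creating those resources looks roughly like this (a sketch only; the names and region are placeholders, $TrustPolicyJson is an assumed Lambda trust policy document, and publishing the Lambda itself is omitted):

    # Illustrative AWSPowerShell resource creation; names are placeholders
    Set-DefaultAWSRegion -Region 'us-east-1'
    New-CCRepository -RepositoryName 'PBnC' -RepositoryDescription 'PBnC POC repository'
    New-IAMUser -UserName 'PBnC-GitUser'
    $KmsKey = New-KMSKey -Description 'PBnC secret encryption key'
    # $TrustPolicyJson is an assumed policy allowing lambda.amazonaws.com to assume the role
    $Role = New-IAMRole -RoleName 'PBnC-LambdaRole' -AssumeRolePolicyDocument $TrustPolicyJson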

The project will also generate a cc2af.yml file that will be added to the CodeCommit repository. This YAML file is used by the AWS Lambda Function to generate the Azure Web App deployment trigger. It contains the Azure Web App deployment credentials and the CodeCommit IAM user's HTTPS Git credentials. The passwords are encrypted with the KMS key created by the project, and the Lambda decrypts them at run time. This is similar to how AppVeyor allows you to store encrypted strings in its YAML file. "cc2af" stands for CodeCommit to Azure Functions. The general idea is that the Lambda is reusable for multiple CodeCommit repositories and Azure Functions, and the YAML file makes that possible.
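
For a sense of how a secret ends up encrypted in the YAML file, here is a hedged sketch using the AWSPowerShell KMS cmdlets ($KmsKeyId and the secret value are placeholders; the actual cc2af.yml handling is covered later in the series):

    # Encrypt a secret with KMS and Base64 encode it for storage in cc2af.yml
    $Secret = 'SuperSecretDeploymentPassword'   # placeholder value
    $Bytes = [System.Text.Encoding]::UTF8.GetBytes($Secret)
    $PlainStream = [System.IO.MemoryStream]::new($Bytes)
    $Encrypted = Invoke-KMSEncrypt -KeyId $KmsKeyId -Plaintext $PlainStream
    [System.Convert]::ToBase64String($Encrypted.CiphertextBlob.ToArray())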


CI/CD Pipeline Flow

OK, I'll be honest here: I have no idea whether this qualifies as just CI, just CD, or CI/CD. The terms are kind of vague, and I don't actually work in a DevOps role, so they are not at all clear to me. This pipeline is a Proof of Concept and does not include testing steps before deployment.

Anyway, here is the general flow of the completed project's pipeline (a sketch of the developer-side git commands follows the list):

  1. Azure Function "developer" clones the CodeCommit repository to their local computer
  2. The developer branches from master in their local git repository
  3. The developer applies code changes to the Azure Function
  4. The developer commits the change to their local repository
  5. The developer pushes the local branch to a new CodeCommit branch
  6. The developer submits a CodeCommit Pull Request to merge their branch into master
  7. A code review is performed by CodeCommit Repository reviewers
  8. After a successful review a CodeCommit repository maintainer merges the developer's branch into master
  9. A CodeCommit trigger is activated by activity on the master branch
  10. CodeCommit invokes the C# Lambda Function and sends it a CodeCommit event
  11. The C# Lambda assumes the IAM role
  12. The C# Lambda Function receives the CodeCommit event then:
    1. Retrieves the location of cc2af.yml from either a default or an environment variable
    2. Retrieves the CodeCommit repository information
    3. Retrieves the blob of the cc2af.yml from the repository
    4. Parses the cc2af.yml
    5. Verifies the correct branch is being acted on.
    6. Decrypts the passwords from cc2af.yml using KMS
    7. Generates and submits the Azure Web App deployment trigger
    8. Writes the execution environment information and trigger results to CloudWatch Logs.
  13. Azure Functions Web App receives the deployment trigger
  14. Azure Functions Web App git pulls from the CodeCommit repository
  15. Azure Functions Web App performs the deployment from the cloned repo
  16. The new code is now live on Azure Functions
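
The developer-side steps (1 through 5) are plain git. A sketch, with the repository SSH URL and branch name as placeholders:

    # Illustrative developer workflow; the URL and branch name are placeholders
    git clone ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/PBnC
    Set-Location .\PBnC
    git checkout -b my-feature
    # ...edit the Azure Function code under src...
    git add .
    git commit -m 'Update the PBnC function'
    git push origin my-feature
    # The Pull Request, review, and merge then happen in the CodeCommit console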

It's a somewhat simple pipeline. The intersection of the two clouds is fairly minimal. AWS Lambda makes an HTTPS call to the Azure Web App deployment trigger URL. The Azure Web App makes a git pull to the AWS CodeCommit repository. That is the extent of the cross-cloud activity.
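
For a sense of what that trigger call looks like, here is a hedged PowerShell equivalent of what the C# Lambda does over HTTPS (the site name and credentials are placeholders; the Kudu SCM /deploy endpoint is the standard one for externally triggered deployments):

    # Illustrative PowerShell equivalent of the Lambda's HTTPS trigger call
    $User = 'deploymentUser'       # placeholder deployment credential
    $Pass = 'deploymentPassword'   # placeholder; the real Lambda decrypts this via KMS
    $Pair = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes("${User}:${Pass}"))
    $Headers = @{ Authorization = "Basic $Pair" }
    Invoke-RestMethod -Method Post -Uri 'https://pbnc-example.scm.azurewebsites.net/deploy' -Headers $Headers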

This is simple, but powerful. It bootstraps anything that Azure could possibly want to do with code stored in AWS CodeCommit. An Azure Function could be written to be triggered after the deployment to then perform Azure-related tasks. Steps could also be added between any of these to include testing or resource deployment/deprovisioning, in either cloud. This project is just the initial connecting of the dots needed to make more robust pipelines possible.


Part 1 End

I know there is no actual PowerShell in this part. I needed to lay the groundwork first. In Part 2, I will dive into Configuration.ps1 and the first few steps of the pipeline deployment. I'm hoping to have Part 2 out in a week.

Join the conversation on Reddit!