Amazon Web Services Solutions Architect Associate Certification

Yesterday I took my Amazon Web Services Solutions Architect Associate certification exam and I’m stoked to have passed with 90%.

I found the Solutions Architect exam to be much harder than the Developer exam; the questions are much more verbose - you have to make much more of a conscious decision as to which information in a given scenario is relevant and which isn’t.

The major themes in my exam were:

  • VPC (ACL, security groups, private subnet connectivity, VPC peering)
  • S3 (hybrid cloud storage, ACL/bucket policies/IAM permissions and storage classes)
  • EBS (moving volumes between regions/AZs, encrypting volumes, setting up RAID)

If you’re looking to take the Solutions Architect exam I’d also recommend understanding the AWS Shared Responsibility Model.

As with the AWS Developer Associate exam I took last year, I found the whole experience an excellent test of the knowledge I’ve picked up over two and a half years of building and maintaining multiple production environments on AWS.

I can’t recommend the A Cloud Guru courses enough; whilst I am very familiar with day-to-day operations across the vast majority of AWS services, the courses really helped me better understand the services aimed at hybrid cloud environments, which I’m unlikely to use as much.

Re-Introducing Jenkins: Automated Testing with Pipelines

I’ve written an article for SitePoint about using Jenkins and pipelines to test and build your code. Check it out here -

Using AWS CodePipeline and CodeBuild to update a Jekyll website

This week at the AWS re:Invent 2016 event in Las Vegas a new CodeBuild service was introduced. CodeBuild is essentially a build service: given an input (generally code), it will process it in some way and then output a build artifact.

My blog is a static website created with Jekyll and hosted with GitHub Pages. I’ve been wanting for some time to move away from GitHub Pages and store the site in an S3 bucket; however this would require some sort of mechanism to update the S3-hosted version whenever I publish a new post (which GitHub Pages does automatically). There are a few methods for doing this with CircleCI and Travis CI out there, but I wanted to see if CodeBuild could be used instead.

My first step was to setup a new S3 bucket (using the very speedy new S3 console):

The bucket needs to be configured for static website hosting which is trivial to enable in just a few clicks:
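The same two console steps can be sketched with the AWS CLI; the index and error document names here are assumptions, not taken from my actual setup:

```shell
# Create the bucket (name as used later in this post)
aws s3 mb s3://alexbilbiecom

# Enable static website hosting; index/error document names are assumptions
aws s3 website s3://alexbilbiecom --index-document index.html --error-document 404.html
```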

Finally the bucket policy needs updating to allow anonymous reads:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::alexbilbiecom/*"
            }
        ]
    }

Then I navigated to the CodeBuild console and created a new project using the following settings:

  • Project name: alexbilbie_com
  • Source provider: GitHub
  • Repository:
  • Environment image: Use an image managed by AWS CodeBuild
  • Operating System: Ubuntu (currently the only available option)
  • Runtime: Ruby
  • Version: ruby:2.3.1
  • Build specification: Use the buildspec.yml in the source code root directory
  • Artifact type: No artifact
  • Service role: Create a service role in your account

Through some experimentation I discovered that if you let CodeBuild upload the Jekyll build output automatically then you can’t have the site in the root of the bucket (it ends up under an <artifact name>/_site/ prefix). The CodeBuild Ruby image comes with the AWS CLI pre-installed, so I used a build command to upload the result directly to the bucket root with the aws s3 sync command.
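The direct-upload workaround boils down to two commands run inside the build container:

```shell
# Generate the static site into _site/
jekyll build

# Mirror the build output to the bucket root rather than letting
# CodeBuild upload it under an artifact prefix
aws s3 sync _site/ s3://alexbilbiecom
```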

Next up I edited the service role that the CodeBuild wizard created to allow write access to the website S3 bucket. Normally I would create the service role manually, however at the time of writing you can’t attach a pre-existing service role to CodeBuild (at least through the console).

I added the following to the generated policy:

    {
        "Effect": "Allow",
        "Action": [
            "s3:PutObject"
        ],
        "Resource": [
            "arn:aws:s3:::alexbilbiecom/*"
        ]
    }

The final step was to add a buildspec.yml file to the root of the Github repository:

    version: 0.1

    phases:
      install:
        commands:
          - gem install jekyll jekyll-paginate jekyll-sitemap jekyll-gist
      build:
        commands:
          - echo "******** Building Jekyll site ********"
          - jekyll build
          - echo "******** Uploading to S3 ********"
          - aws s3 sync _site/ s3://alexbilbiecom

Inside the buildspec.yml you can hook into the install, pre_build, build and post_build lifecycle events as well as specify artifacts to upload to S3. In this case I just needed to hook into install and build events.
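As a sketch of the full set of hooks, a version 0.1 buildspec with all four lifecycle events and an artifacts section looks something like this (the pre_build/post_build commands and the artifacts paths here are illustrative assumptions, not part of my actual file):

```yaml
version: 0.1

phases:
  install:
    commands:
      - gem install jekyll            # install build dependencies
  pre_build:
    commands:
      - echo "Starting build"         # illustrative only
  build:
    commands:
      - jekyll build
  post_build:
    commands:
      - echo "Build complete"         # illustrative only

# Optional: let CodeBuild upload artifacts to S3 itself
artifacts:
  files:
    - _site/**/*
```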

From here you can start a new build, I used the default settings and clicked Start Build.

A few minutes later…success!

At this point I could navigate to the bucket static site URL to verify the generated site was working correctly.

CodeBuild is set up but it currently requires builds to be triggered manually. Fortunately last year’s new service, CodePipeline, can help out here.

In the CodePipeline console I created a new pipeline linked to the Github repository:

Under the build setting I selected AWS CodeBuild as the build provider and selected the CodeBuild project I’d created.

Note: CodePipeline currently requires the build provider to produce an output artifact, which the CodeBuild project as configured doesn’t have. I went back to the CodeBuild settings and set them to this:

I skipped the deployment section of the wizard as a deployment isn’t required.

Finally I let the console create a CodePipeline service IAM role for me and completed the wizard.

Now whenever I make a change in the Github repository CodePipeline will automatically trigger CodeBuild and my website will be updated.

From my brief look, CodeBuild seems like the start of a really useful service. Usually I set clients up with a Jenkins build server, but after this short experiment and some reading of the documentation I’m keen to explore the service further to see to what extent it could replace Jenkins in the future.

Can't Eat That - An app for food allergy sufferers

This year I’ve been fortunate to travel a lot more than I ever have done - from Denmark to Japan to France to Canada and the US as well.

I have a severe allergy to tree nuts and legumes (such as peanuts) and this often presents a problem when expressing my dietary requirements both abroad and in new restaurants at home due to language barriers.

There are a few different companies out there that will happily sell you credit card-sized allergy translation cards at £5-10 per language but in a world where I’m more likely to have my iPhone on me than my wallet I decided an app makes more sense.

About 18 months ago I took a short iOS Swift course with Treehouse but I never really got into it. This time, armed with an actual end goal in mind (always useful when learning new things), I sat down and refreshed my understanding and built an app I’ve called Can’t Eat That.

Working with a fantastic translation agency in Cardiff (hi Rob!) the app has phrases describing 14 allergies in 18 languages.

The allergies included in the app are based on the EU’s food allergen label requirements (generally the items highlighted on a food wrapper’s ingredients label):

  • Eggs
  • Fish
  • Gluten
  • Milk
  • Soy and soybeans
  • Nuts and nut oil
  • Peanuts and peanut oil
  • Crustaceans (such as prawns and crabs)
  • Molluscs (such as mussels and squid)
  • Celery and celeriac
  • Mustard
  • Sesame
  • Sulfur
  • Lupin

The lovely team at Applingua expertly translated these allergies into the following languages:

  • Arabic
  • Chinese
  • French
  • German
  • Greek
  • Hindi
  • Indonesian
  • Italian
  • Japanese
  • Korean
  • Malaysian
  • Polish
  • Portuguese
  • Russian
  • Spanish
  • Thai
  • Turkish

If you have an iOS device please do have a play with the app - both the app and the French language pack are free to download.

Likewise if you yourself have an allergy or have any friends or family who have allergies please do let them know about it.

I will write about my experiences building an iOS app and learning Swift in a future post.

Starting a Jenkins multi-branch pipeline build from a Bitbucket commit

After some experimentation I’ve finally worked out how to start a Jenkins multi-branch pipeline build via a notification from Bitbucket when someone pushes a commit.

The first step is to disable CSRF protection (I know…I know…but it’s necessary to allow remote access). You can do this by selecting Manage Jenkins, Configure Global Security then unchecking the Prevent Cross Site Request Forgery exploits option.

Next step, navigate into your job from the main page and select Branch Indexing from the menu on the left. From this page right click on the link that says Run Now, copy the link address and paste it into a text editor.

Now go back to the Jenkins root menu and select People, then choose your user (or preferably a dedicated Jenkins user), choose Configure, and then reveal the API token. Copy it and head back to your text editor.

Assuming the branch indexing Run Now URL is the one you copied, the Jenkins user is jenkins and the API token is RdFrCiEwgs9boUsJVHoi, modify the URL so it looks like this:
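The modification is just embedding the user and API token as HTTP basic-auth credentials into the URL’s host part. A minimal sketch of that transformation (the ci.example.com host and job name are placeholders, not my real server):

```python
from urllib.parse import urlsplit, urlunsplit

def with_credentials(url: str, user: str, token: str) -> str:
    """Embed user:token basic-auth credentials into a URL's host part."""
    parts = urlsplit(url)
    netloc = f"{user}:{token}@{parts.netloc}"
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))

# Hypothetical branch indexing "Run Now" URL copied from Jenkins:
run_now = "https://ci.example.com/job/my-site/build?delay=0"
print(with_credentials(run_now, "jenkins", "RdFrCiEwgs9boUsJVHoi"))
# https://jenkins:RdFrCiEwgs9boUsJVHoi@ci.example.com/job/my-site/build?delay=0
```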

Next go into the settings of your Bitbucket repository and create a new webhook, paste in the modified link and save the form.

Commit something and you should see that a moment later Jenkins will run a branch index on the job and then run the build for the branch you committed to.

Multi-branch pipelines are a very new feature in Jenkins 2.0 and are still quite half-baked in my opinion. As I get more proficient I will write a tutorial because, whilst frustrating to get set up and going, they’re a huge time saver.