Can't Eat That - An app for food allergy sufferers

This year I’ve been fortunate to travel more than ever before: to Denmark, Japan, France, Canada and the US.

I have a severe allergy to tree nuts and legumes (such as peanuts), and this often makes it difficult to communicate my dietary requirements, whether abroad because of language barriers or simply in new restaurants at home.

There are a few different companies out there that will happily sell you credit card-sized allergy translation cards at £5-10 per language, but in a world where I’m more likely to have my iPhone on me than my wallet, I decided an app made more sense.

About 18 months ago I took a short iOS Swift course with Treehouse, but I never really got into it. This time, armed with an actual end goal (always useful when learning new things), I sat down, refreshed my understanding and built an app I’ve called Can’t Eat That.

Working with a fantastic translation agency in Cardiff (hi Rob!), I’ve filled the app with phrases describing 14 allergies in 17 languages.

The allergies included in the app are based on the EU’s food allergen labelling requirements (generally the items highlighted on a food wrapper’s ingredients label):

  • Eggs
  • Fish
  • Gluten
  • Milk
  • Soy and soybeans
  • Nuts and nut oil
  • Peanuts and peanut oil
  • Crustaceans (such as prawns and crabs)
  • Molluscs (such as mussels and squid)
  • Celery and celeriac
  • Mustard
  • Sesame
  • Sulphur dioxide and sulphites
  • Lupin

The lovely team at Applingua expertly translated these allergies into the following languages:

  • Arabic
  • Chinese
  • French
  • German
  • Greek
  • Hindi
  • Indonesian
  • Italian
  • Japanese
  • Korean
  • Malay
  • Polish
  • Portuguese
  • Russian
  • Spanish
  • Thai
  • Turkish

If you have an iOS device please do have a play with the app - both the app and the French language pack are free to download.

Likewise, if you have an allergy yourself, or have friends or family who do, please do let them know about the app.

I will write about my experiences building an iOS app and learning Swift in a future post.

Starting a Jenkins multi-branch pipeline build from a Bitbucket commit

After some experimentation I’ve finally worked out how to start a Jenkins multi-branch pipeline build via a notification from Bitbucket when someone pushes a commit.

The first step is to disable CSRF protection (I know… I know… but it’s necessary to allow remote access). You can do this by selecting Manage Jenkins, then Configure Global Security, and unchecking the Prevent Cross Site Request Forgery exploits option.

Next, navigate into your job from the main page and select Branch Indexing from the menu on the left. On this page, right-click the link that says Run Now, copy the link address and paste it into a text editor.

Now go back to the Jenkins root menu and select People, then choose your user (or preferably a dedicated Jenkins user), choose Configure, and then reveal the API token. Copy it and head back to your text editor.

Assuming the Jenkins user is jenkins and the API token is RdFrCiEwgs9boUsJVHoi, modify the Run Now URL you copied so that it embeds these credentials as HTTP basic auth (user:token@host).
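As a sketch of what the final webhook URL looks like - every value below is a placeholder, and the host, job name and path should come from the Run Now link you actually copied:

```shell
# Hypothetical values - substitute the host and job name from the
# Run Now URL you copied, plus your own Jenkins user and API token.
JENKINS_HOST="jenkins.example.com"
JOB_NAME="my-multibranch-job"
JENKINS_USER="jenkins"
API_TOKEN="RdFrCiEwgs9boUsJVHoi"

# Embed the user and API token in the URL as HTTP basic auth
# credentials so Bitbucket can call it without a logged-in session.
WEBHOOK_URL="http://${JENKINS_USER}:${API_TOKEN}@${JENKINS_HOST}/job/${JOB_NAME}/build?delay=0"
echo "$WEBHOOK_URL"
```

This is the URL you’ll paste into Bitbucket’s webhook form in the next step. The build?delay=0 path here is only an assumption about what the Run Now link points at - trust whatever path your copied link contains.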

Next, go into the settings of your Bitbucket repository, create a new webhook, paste in the modified link and save the form.

Commit something and you should see that a moment later Jenkins will run a branch index on the job and then run the build for the branch you committed to.

Multi-branch pipelines are a very new feature in Jenkins 2.0 and are still rather half-baked in my opinion. As I get more proficient I will write a tutorial, because whilst they’re frustrating to set up they’re a huge time saver.

Updated: a guide to OAuth 2.0 grants

One of the most popular articles on this site is my guide to OAuth 2.0 grants.

I’ve spent some time updating it (I first wrote it in 2013).

Check out the updated article here →

Laravel Passport and league/oauth2-server

Taylor Otwell, the creator of the Laravel framework, has announced a new Laravel project - Laravel Passport - which uses my league/oauth2-server project to “build an OAuth 2.0 backed API server in five minutes”.

This will be the second League of Extraordinary Packages project with native Laravel support; the first being the amazing Flysystem by Frank de Jonge, which powers Laravel’s Filesystem APIs.

I will hopefully have a tutorial available shortly demonstrating how to use Passport once I’ve got my hands on the code.

Amazon Web Services Developer Associate Certification

Yesterday I took my Amazon Web Services Developer Associate certification exam and I’m very happy to have passed with 92%.

I’ve been using AWS services in production for over 18 months now and I’ve been wanting to validate my skills and extend my knowledge for a while by studying for one of the AWS certification exams.

There was a flash sale in January, so I picked up a few of the courses by Ryan Kroonenburg and I’ve been watching them on and off since then. After booking my exam in February I started to really knuckle down on the revision. This past week I also attended a DynamoDB fundamentals session at the AWS London Loft, and I added to my revision notes by reading all of the relevant AWS service FAQs.

The 80-minute exam has 55 multiple-choice questions covering IAM, EC2, S3, Elastic Beanstalk, DynamoDB, SWF, SNS and SQS. The questions either test your general knowledge of AWS services or ask you to pick the best option in a given scenario.

My exam featured lots of questions about S3 and only two on DynamoDB (which surprised me, because I’d heard this particular exam is heavy on DynamoDB questions). I had three questions about ELB and three on the AWS SDKs. There was also a really confusingly worded question about IAM SAML federation which required me to choose two options from five very similar ones. Amusingly, I also recognised a few of the example questions from the AWS Certification website. I highly recommend reading the service FAQs, as there were a few questions about the limitations of services in specific regions.

All in all I feel it was a valuable experience; the additional studying means I know many of the AWS services far better - I certainly wouldn’t have been able to answer many of the questions on practical knowledge alone. Not having taken any sort of exam in over five years, it’s also opened my mind to studying for further professional accreditation in the future - not necessarily for the right to give myself a fancy title or put letters after my name, but to really focus on learning and understanding a tool or service at a greater depth than I’d absorb through general day-to-day usage.