May 23, 2015 - Know Your Tools


A few weeks ago I was given the opportunity to give a presentation to the dev team at work on Docker and how it could help us from a development perspective. It was timely: our team recently began transitioning from one version of our flagship product to another, where many of the underlying technologies have changed drastically. However, because we still need to support the currently-released version of the product, our developers would frequently need to reset their development environments in order to diagnose customer issues. The prospect of having to pivot our development environments presented a serious challenge, since setup for each environment is non-trivial.

Now, I’ve been a fan of Docker since I first heard about it in 2013, and have actively used it since then. But because of how stable the platform is, and the nature of my job, I never had many chances to dive into the more recent updates to the Docker ecosystem. When I did some research for my presentation, I discovered Docker Machine, Docker Compose, and Docker Swarm, as well as a number of other Docker-related tools. It made sense that I wasn’t aware of these very useful tools, since the containerized-application space has been changing rapidly over the last couple of years, but I hadn’t realized how much.

This made me realize that there are many tools out there that I use but may not fully understand or be up to date with. So, how does one keep current on developments for some of the most useful tools? I’m not sure, but I would suggest subscribing to GitHub projects, mailing lists, and RSS feeds to be notified of changes. Or, whenever you run a tool, check for updates and read the changelog, if there is one.

Finally, don’t settle for “that’s just the way it is.” Always look for new ways to do something. For instance, for a long time I used standard Docker --link arguments to link containers, but with Docker Compose, I no longer need to deal with linking containers at runtime.
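As a sketch of that difference (the service names and images here are hypothetical, not from my actual setup), the `--link` flags I used to pass to every `docker run` can instead be declared once in a `docker-compose.yml`, using the v1-era syntax current at the time of writing:

```yaml
# docker-compose.yml -- hypothetical two-container setup
# The `links` entry replaces `docker run --link app:app ...` at the command line.
web:
  image: nginx
  ports:
    - "8080:80"
  links:
    - app
app:
  image: ghost
```

With this file in place, a single `docker-compose up` starts both containers and wires the link automatically, instead of repeating the `--link` arguments by hand every time.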

This post is about two weeks overdue. I had the first paragraph written and a really good post in mind, but I got sidetracked, forgot what my point was, and didn’t return to finish this post until much later. I’m sorry it isn’t as useful as it could be.

Apr 16, 2015 - My Reading List


Over the last few years, I’ve amassed a pretty large repository of knowledge about the technologies, tools, and techniques of my profession. People have often wondered how I came to know so much, but the truth is, it’s actually really simple: I mostly follow certain Twitter users, keep up with a few subreddits on Reddit, and keep an ear out for any mention of cool things in news articles. It definitely helps to maintain a natural curiosity about all things related to technology, since inspiration for techniques and solutions can come from a variety of sources, but I also try to apply those techniques in imagined - but likely - scenarios. This point is important, as it helps you answer the questions “What situations is this technology/solution well-suited for?” and “Is this technology appropriate for this kind of problem?”

Anyway, I wanted to share my list of “resources” on the Internet, and I highly recommend taking a look, maybe even adding it to your own bookmarks:

More to come, but these are just a couple off the top of my head.

Apr 12, 2015 - Blog Relaunch (Again)

A few weeks ago, I revisited how I had set up my blog and my rationale for running it all from a VM on DigitalOcean with a couple of Docker containers, one running Ghost and the other running Nginx. At the time, I wanted a powerful yet flexible solution, and hosting my own site through Docker containers let me easily set up and connect various services. It also gave me a “useful” reason for using Docker. For the past few months, this solution worked well and I was able to claim 100% uptime, though that victory was likely owed to my low site traffic. And so, given all the flexibility and power I had, I asked myself whether I really needed such a complex setup for a low-traffic, personal blog that I rarely have time to update.

My inevitable conclusion was no. It did not make sense to maintain such a complicated system just to serve a simple blog. Not only was it overkill, but it cost me $60 per year to run. That is a very small price for a VM in the cloud with full control over it, but for a simple low-traffic blog, it wasn’t very cost-effective.

Moreover, while I liked that Ghost was in active development and made Markdown blog management easy, it did not let me write posts in separate Markdown text files, and upgrades were somewhat painful (due to the theme I was using - not really the fault of Ghost).

So I began searching for a new alternative, and came across GitHub Pages. I had known about the service for quite some time, but I had never thought to use it - mainly because I was unfamiliar with Jekyll and Ruby. I did not want to have to learn how to set up Ruby on my laptop, then get Jekyll running in order to start writing posts and just get a simple blog working.

Then I took a step back and thought, “Wait a minute, I don’t have to!” Thinking like a Docker user again, I found gekkie/gh-pages, a full-fledged Jekyll container that would let me develop locally. Paired with Vagrant, I had everything I needed to set up my new GitHub Pages-hosted blog without having to quickly learn Ruby and everything along those lines - and I didn’t need to install anything.

So I forked the Jekyll-clean theme into my own GitHub repo, cloned it to my laptop, created a Vagrant box (in a separate directory) with the following Vagrantfile, and I was off to the races:


Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.box = "williamyeh/ubuntu-trusty64-docker"
  config.vm.network "forwarded_port", guest: 4000, host: 4000

  config.vm.synced_folder "../blog", "/blog"

  config.vm.provision "docker" do |d|
    d.run "gekkie/docker-gh-pages",
      args: "-p 4000:4000 -v /blog:/app"
  end
end

After a quick vagrant up, I browsed to http://localhost:4000 and saw the blog. Now I could start editing the site’s config and posts. Easy peasy.

I considered bringing back my old posts, but I think this time around I won’t. I will leave them in the past. If Google ever decides to shut down Blogger, then I’ll move them somewhere else. My intent is to leave this space for more programming and tech-related posts.