A case for minimizing moving parts in a build

Original Tweet Thread

Unrolled Tweet Thread

If we take the chance that a tool (compiler, linker, batch, whatever) remains working for a particular codebase after one year as a given probability p, then the chance that the build remains working after x years is p^(x·n), where n is the number of tools used in the build.
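To make the formula concrete, here is a quick back-of-the-envelope check in Python (my own sketch, not part of the original thread):

```python
# Probability that a build still works after `years`, given `n_tools` tools
# that each have probability `p` of still working on the codebase after one year.
def build_survival(p: float, years: float, n_tools: int) -> float:
    return p ** (years * n_tools)

print(build_survival(0.99, 5, 1))   # ~0.95: one 99%-reliable tool, five years
print(build_survival(0.99, 5, 10))  # ~0.60: ten 99%-reliable tools, five years
print(build_survival(0.90, 5, 3))   # ~0.21: three 90%-reliable tools, five years
```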

That’s “just math”. Even if we assume a 99% chance that a tool still works on the codebase after a year (an extreme rarity these days!), that graph looks like this:

With just one tool, the build has an over 95% chance of working after five years. With ten tools, it has only a 60% chance! And that’s with every tool having a _99%_ chance of remaining working (meaning no breaking changes to the tool that affect the build in question).

If we assume a slightly lower p of 90% (still probably favorable given today’s environment), the graph now looks like this:

This is, of course, an epic disaster. With just 3 tools used, after 5 years there is _very little chance_ that your codebase will still build correctly. And that’s at 90% and just 3 tools!

If you look at 90%/10 tools, heaven forbid, that bottom line says your build almost certainly doesn’t work after only 2 years… and in fact has barely a 30% chance of working after just 1 year!

Now imagine that we don’t say “tool”. We just say “dependency”. The equation _remains the same_. Modern codebases often have 10s, 100s, or even 1000s of dependencies! What does that do to this graph?

Here is the graph for 10, 100, and 1000 dependencies, assuming a never-happens-on-GitHub 99% chance that a dependency doesn’t break your build:

10 dependencies sort-of works. It has a 60% chance of still working after 5 years. 100 dependencies doesn’t work. It’s less than 40% after just 1 year. 1000 dependencies breaks with almost complete certainty after a mere _four months_.
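Those readings fall straight out of the same formula (again, my own arithmetic, not part of the original thread):

```python
# Same formula as above, applied to dependency counts at p = 0.99 per year.
p = 0.99
print(p ** (5 * 10))         # ~0.60: 10 dependencies after 5 years
print(p ** (1 * 100))        # ~0.37: 100 dependencies after 1 year
print(p ** (4 / 12 * 1000))  # ~0.035: 1000 dependencies after 4 months
```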

All of this is already something you know intuitively. Projects with lots of dependencies never work out-of-the-box. You are constantly updating, patching, and struggling to get their builds working, because every time something downstream changes, somebody has to fix it.

The “dependency culture” of modern programming has put us into a state where software requires perpetual, constant maintenance. No longer can we take a build and say “this works” and come back to it in a year. Great for job security, horrible for software quality.

As for speculation, I wonder if this at least partly answers a question that I and other people like me have asked: how do companies like Twitter employ thousands of developers while seemingly producing almost no additional software or improvements?

Well, if you assume that Twitter’s collective codebase is a 1000+ dependency nightmare, as I assume it probably is, then the math kind of tells us the answer: the vast, vast majority of their time will have to be spent simply keeping their existing code working.

Casey Muratori (@cmuratori)

One SSH Key to rule them all

I have searched high and low for a better way to use SSH key-based authentication than what you learn in the default Linux tutorials. Those tutorials would have you generate one key per machine/account and then, on every box you SSH into, add that key to the authorized_keys file.
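For reference, the per-machine dance those tutorials describe looks roughly like this (the user and host names here are just placeholders):

```sh
# On each new machine: generate yet another key pair...
ssh-keygen -t ed25519 -C "me@this-machine"

# ...then push its public half to every box you want to SSH into.
ssh-copy-id user@some-server
```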

What I want is a single key (that’s easy to rotate) that I can use on any machine quickly and easily to get up and running. This is because I reformat my computers a couple times a year and set them up fresh, and because I use all 3 major platforms for coding.

When you search for single-key SSH authentication, the most common results tell you to use a YubiKey. So I finally gave in and bought a couple of YubiKeys to see what all the fuss was about in the DevOps community. I did manage to work through all the tutorials on using a YubiKey as a kind of secure single sign-on, where the YubiKey is your one true universal key for SSH login. After hours and hours of setting up special daemons for forwarding SSH authentication to PGP, I’ve come to the following conclusion: YubiKeys suck.

They suffer from the same problems that all PGP and SSL encryption suffers from: archaic tools with WAY too many options, no explanation of what any of the options do, and absolutely NO guidance on whether flipping a particular set of switches makes you more or less secure.

There is a better way, and I’m here to tell you about it today. The answer, my friends, is Krypton.

Krypton is a phone app for iOS and Android that implements the U2F protocol and, in general, is a saner approach to using a physical second factor (your phone) across all of the platforms I use: Mac, Linux, and Windows¹.

There are two modes:

  1. General Web 2FA using U2F
  2. Developer Mode, which lets you use Krypton for your SSH authentication and, as a bonus, handles automatic signing of Git commits if you want it to

For the first mode, you just add the Krypton plugin to your browser of choice and then pair your phone to that specific browser using a QR code, similar to how WhatsApp desktop works. After that you can use Krypton anywhere U2F is supported².

For Developer Mode: install the kr daemon/tool and the rest is easy peasy. You type kr pair into a terminal and a QR code appears in the console. Scan this with the Krypton app on your phone and you’re all set. The kr tool takes care of setting up the SSH side of things as well as the GitHub signing if you’d like. You still need to add an entry to authorized_keys on every computer you’re going to be using for SSH authentication, but this time… magically… it’s the SAME key on every computer you pair Krypton with, without having to manually move things around or figure out where to securely store the key (it lives in secure storage on your phone).

To get your SSH public key just type kr me and voila!
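Put together, the Developer Mode flow is roughly this sketch (the user and host names are just placeholders; only kr pair and kr me come from the tool itself):

```sh
# One-time setup on a workstation: pair the kr daemon with the Krypton phone app.
kr pair   # prints a QR code in the terminal; scan it with the app

# Show the single SSH public key that Krypton manages for you.
kr me

# Add that same key to any box you want to log into, for example:
kr me | ssh user@some-server 'cat >> ~/.ssh/authorized_keys'
```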

Krypton is focused on security, and as such the default settings ask you to confirm every login on your phone. I found that a little annoying, especially when running Terraform jobs (more on this later), but luckily they allow you to customize your paranoia level per host right from the app. I still have it ask me to confirm every 3 hours or so, but within that window it doesn’t bother me with additional prompts.

This is the first post in a series about my home lab, where I develop and prototype my applications. The next installment will cover how to use Terraform at home with your own setup that’s not a cloud provider.

Footnotes

  1. Because of the SSL library Krypton uses, it doesn’t play nicely with Windows out of the box. It works perfectly with the Windows Subsystem for Linux (WSL), though, so I use WSL for all my git and ssh work on Windows and have had zero problems.
  2. Not all platforms that support U2F seem to work. I use Firefox because I care about speed and privacy, and while Krypton generally works just fine with sites like Gandi, I have had trouble getting it recognized by Google and I have no idea why. YubiKeys work fine on Google, *shrug*.