DNSControl is good software

So like, Google Domains.

I liked Google Domains. I had a bunch of domains registered there. I registered them with Google Domains because, whatever Google's faults, they have an outstanding record on security; they hire some of the best people in the world. It was a very well-designed, easy-to-use service that did what I wanted it to and nothing more, which is exactly the sort of thing Google likes shutting down.

Of course, I was notified that Google had sold it to Squarespace, which I was sad about. I intended to do something before my domains were moved to Squarespace, but that was effort, so I procrastinated and forgot about it, i.e. didn't do it. Because I didn't do it, I was automatically moved onto Squarespace, which didn't work out well.

I used Google Domains because of Google's security posture, and because the domain registrar industry is a security tyre-fire. Ironically, the sale to Squarespace resulted in a massive security problem which allowed anyone to hijack any account. This proved past-me right, but only in the most irrelevant way possible; using Google Domains directly resulted in a time window where anyone could have hijacked all of my domains.

Squarespace also lacked basic functionality for DNS. I know why they did this; their target market is non-technical people, a category which does not include me. But I do not consider a complete inability to set a TTL on DNS records to be normal, nor the inability to export a zone file.

Squarespace does what it needs to do for the market it targets. As a DNS management tool, it is nerfed. That, plus a catastrophic security hole on day one, made me want to switch to something else, which I did, eventually.

So, I made the obvious error of being on Google Domains in the first place, despite Google's history of randomly killing shit off even if you are paying for it. And I didn't migrate to something else before the Squarespace switchover, because I couldn't be bothered and/or procrastinated. Both of those things were entirely self-inflicted.

Why did I procrastinate over moving somewhere else? Because moving DNS records over was effort. That seems like an insanely trivial reason, because it is. But I have plenty else to do, and that one thing fell by the wayside. Maybe I could have solved this for the longer term by divorcing my DNS management from my domain registrations, but that too would be effort, and then I would be paying for two things, either of which could disappear like Google Domains did. How, then, could I make moving DNS records between registrars easier?

What would be really neat is if I could manage my DNS configuration as a text file on my computer. That text file would be the source of truth, to be pushed to whatever DNS provider I am using at the time, so I could swap out providers at will. As with all text files on my computer, it could be managed as a Git repository, giving me its complete history, so if I break something I can easily roll back to an earlier version. Wouldn't that be cool?

WELL IT JUST SO HAPPENS

...that this exists, and also that I eventually get to the point. It's called DNSControl, and I like it.

I make a text file called dnsconfig.js, which looks like this:

// These two things are defined in my other config file, `creds.json`, which
// contains my API keys.
//
// The names in the calls below do not have to correspond to provider names;
// they map to entries in `creds.json`. It's less effort to make them match,
// though.
var REG_GANDI = NewRegistrar("gandi");
var DSP_GANDI = NewDnsProvider("gandi");

D(
    "lewiscollard.com",
    REG_GANDI,
    DnsProvider(DSP_GANDI),

    // Records for the apex domain and www.
    A("@", "138.68.161.203"),
    A("www", "138.68.161.203"),
    // all my other records go here
)

OK, not a text file; it's actually JavaScript with some stuff shoved into the global namespace to make it look like a DSL. Someone could mumble something about Turing-complete configuration files, and I'd mumble something back like "yeah innit". I mean, all programmers dislike the idea of writing configuration files in a programming language in theory, and then we're all secretly thankful that we get variables and loops and conditionals and stuff when someone else implements their configuration file with a programming language. And then we all go around with guilty consciences! I recommend sitting on your hands till they go numb before writing your config file; that way it'll feel like someone else is doing it.
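
To make the variables-and-loops point concrete, here's a sketch of the sort of thing a real programming language buys you. It is not my actual config: example.com is made up, and it reuses the REG_GANDI and DSP_GANDI variables from above. The point is that the server's IP is defined once, so moving to a new server means changing one line.

// A sketch, not my real config: the IP is defined once, so a server move is
// a one-line change.
var SERVER_IP = "138.68.161.203";

D(
    "example.com",
    REG_GANDI,
    DnsProvider(DSP_GANDI),

    A("@", SERVER_IP),
    A("www", SERVER_IP)
    // loops and helper functions work too, because it's just JavaScript
)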

Anyway, there's also a creds.json which looks like this:

{
  "gandi": {
    "TYPE": "GANDI_V5",
    "token": "XXXX",
    "sharing_id": "YYYY"
  }
}

This tells DNSControl which provider I want to use, and holds the API keys for it. I moved my domains and DNS to Gandi, because they exist, they were easy to migrate to, and they're probably as good as anyone for all I know. The awesome part is that I can switch this to use any one of over 50 supported providers, meaning that moving my DNS records somewhere else is entirely automated (other than creating API keys for them). Awesome!
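
To illustrate, switching my DNS hosting away from Gandi would look something like this. This is a hedged sketch rather than something I've actually done: "cloudflare" here is just a hypothetical name for a new entry in creds.json, and the real credential fields for any given provider are whatever DNSControl's documentation says they are.

// A sketch of switching DNS hosting while keeping the registration at Gandi.
// The "cloudflare" name is hypothetical; it would need a matching entry in
// creds.json with that provider's credentials.
var REG_GANDI = NewRegistrar("gandi");
var DSP_CLOUDFLARE = NewDnsProvider("cloudflare");

D(
    "lewiscollard.com",
    REG_GANDI,
    DnsProvider(DSP_CLOUDFLARE),

    A("@", "138.68.161.203"),
    A("www", "138.68.161.203")
    // all my other records, unchanged
)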

Squarespace is notably absent from that list of providers, because they don't have an API for managing DNS records. If it wasn't for this I might have even stayed with Squarespace for my domain registrations, because it is less effort. The security hole was forgivable; all software has flaws. But now that I know DNSControl exists, I don't want to manage my DNS records via a web interface ever again, so I won't.

After I change my configuration, I run dnscontrol preview to see what will be changed, then dnscontrol push to push the changes to Gandi. I could make things even fancier than this. I could set up CI, so that pushing changes to my remote Git repository automatically pushes changes to Gandi, which would save me having to run DNSControl manually after each change. That would eliminate the risk of my Git repo not reflecting reality. But for now I am happy enough that I can manage my DNS with version-controlled text files; maybe I'll do the CI thing some other day (and here I was going to write "this is a magic spell which makes things disappear forever", then I realised I'd used that joke before).
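
For the record, the day-to-day workflow really is just those two commands:

dnscontrol preview    # show what would change, without changing anything
dnscontrol push       # actually apply the changes via the provider's API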

I like that DNSControl stops you from making mistakes. It could have been a straight translator from zone files to API calls, and fortunately it is not. It has opinions, and those opinions are good because they stop you doing stupid things.

I like that DNSControl is a single binary, which is also a thing I like about Hugo. That means that I can check the binary into the repo via LFS and it'll probably work for as long as I'm using an x86-64 Linux system, which might be as close to "forever" as I care to look.

Also, finally, the documentation. It has it! Lots of it! I never had to Google anything, nor copy-paste magic spells from random GitHub issues and Stack Exchange posts as I do with most other config file formats.

Anyway, this post is really about me and things I like, because that's what this site is, rather than being any kind of useful introduction to DNSControl. But if someone somewhere who hasn't heard of DNSControl reads this and considers using it, I wouldn't mind that at all.

I migrated this site to Hugo

On and off over the last few weeks I have been migrating this site from a Django app written by me to a static site generator called Hugo. If you don't recognise at least one of the technical words in the previous sentence, then you won't find anything interesting to read here. But if you find anything that is broken on here, it's probably something I broke during the migration, and you might want to let me know.

Migration wasn't hard

I had to move all my old content into the new site while not breaking any links. I used Markdown for all the content in the old Django app (in case I wanted to do something like this some day), so there was not much conversion required there. I had my own special snowflake markup for handling images and YouTube videos, which required a little work to convert.

The core of the migration was 161 lines of Python code (and then another 20 lines of throwaway Python to fix my breaking all the OpenGraph images). If I had had entire days to spend on it and an unlimited attention span, I might have finished the exporter in a day. I didn't have that, because I have a day job and a stupid car project and a mind that says "well, I'm going to have to Google this" about thoughts unrelated to whatever I'm actually doing; but it was much more work to think about than to do.

The slightly harder part was learning what I needed to learn about Hugo and doing a basic theme for it, but all of that might have been only a couple of long days - if, again, I had been focused on it.

After the migration, I rather wish I had used Hugo from the start. This site was a Django app, because Django is my day job and it is what I know; I figured I would spend more time fiddling with Hugo (or some other static site generator) than I would writing my own Django app. But I definitely spent far more time writing tests for my Django app than I did migrating this to Hugo!

Authoring is nicer

I like writing things in my favourite text editor. It seems a more obvious way to write, because that is what I do with most of the rest of my time at a computer. Actually, that was partly the way I was writing already; I was writing posts in a text editor, then copy-pasting into the Django admin interface when I was finished. And if I needed to do non-trivial edits I was copy-pasting it into my text editor, editing it, and then pasting it back.

This is not because Django's admin interface is bad. Although it wins no beauty contests, it's rather better than most web-based interfaces, because text boxes in the Django admin behave as text boxes in your operating system have behaved for about 30 years, rather than like the fallout of React meeting some UX designer's reinvention of "type text into a computer". But for those people for whom "favourite text editor" is a valid sequence of words, writing text into a web browser is not as natural as working on files in that favourite text editor.

As with all static site generators, Hugo's master copy is just a bunch of text files. A bunch of text files can be version controlled with Git, which is what I do with everything else I write at a computer, and so it's a better way of working for me.

It's much less maintenance

This is the real reason for the migration; I wanted less work for myself in the long run. As I said, Django is what I know. I like it a lot! But I have little interest in maintaining a Django app when I am not being paid for it.

It's easier to secure a static site than it is to secure a Django app. My web server is just serving some files from disk; it's nothing the unattended upgrades of my OS can't handle. Being a responsible citizen required keeping the various non-OS dependencies of a Django app up to date. That became a chore I didn't want, and if I'm honest I ended up with Dependabot email alert blindness.

OS version updates were a source of dread. Serving static files does not change much, if at all, between major operating system versions. Serving a Django app does. That requires, invariably, a major Python version update and a PostgreSQL update as well.

Backups are easier

My old site was backed up in multiple ways. Snapshots of the entire virtual machine (what DigitalOcean calls "droplets") were made daily, which cost money. That they would restore to a usable site was a matter of faith. I also periodically pulled the database and media files (such as images) to my local machine, which in turn is backed up elsewhere.

The entirety of the source code of this site is in a Git repository. That includes the theme, and all of the site's text and images. Simply by virtue of how this is deployed, that makes at least three up-to-date copies of the site in normal circumstances: one on my computer, one on GitHub, and one on the server. I'm getting those "for free", but then my local machine gets backed up, which makes four.

I also have the statically-linked Hugo binary checked in to the repo. This gives me some faith that, so long as I am on an x86-64 Linux machine, I will always be able to rebuild the entire site from source in a few seconds and re-deploy to a new server very quickly. This means I am more likely to have backups that can be restored.

Dear God it's fast

Really really fast.

It should load just about instantly on any device and any Internet connection that is likely to be in use. The technical people out there will say "of course it does", because of course it does; it's serving small files from a solid state drive.

The old site wasn't slow. I know how to make Django sites fast, and I did. But, without adding more piles of caching on top, it will never be as fast as directly serving files from a disk.

I decided to make that even faster by inlining all of the site's CSS on every page. It adds less than two kilobytes per page and saves a render-blocking HTTP request. Fast internet connections won't notice the difference, but slow mobile ones will. I also added some magic tricks which cause every link within this site to be prefetched, so once you actually click one it'll load even faster.
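
The prefetching looks something like the following. This is a sketch of one common way to do it, not necessarily the exact trick this site uses: when you hover over an internal link, the browser gets a prefetch hint for it, so by the time you actually click, the page is usually already in the cache.

// A sketch of one common approach, not necessarily this site's exact code:
// when the visitor hovers over a same-site link, add a <link rel="prefetch">
// hint for it, once.
document.addEventListener('mouseover', function (event) {
  var link = event.target.closest('a[href^="/"]');
  if (!link || link.dataset.prefetched) {
    return;
  }
  var hint = document.createElement('link');
  hint.rel = 'prefetch';
  hint.href = link.href;
  document.head.appendChild(hint);
  link.dataset.prefetched = 'true';
});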

"Faster" - of the "just serving file from disk kind" - also means I need fewer computing resources to serve it, which means...

It's slightly cheaper

...I could run this on the same host as a bunch of other static sites. Previously it was running on its own virtual server, because I figured a Django app would need it. Virtual servers are so cheap these days that they may as well give them away with Happy Meals, but that's still a few quid saved. I could reduce that to near-zero if I used S3, and actual zero if I used GitHub Pages or some other free host, but I like having a server I control.

Arguably, the reason the Django app was maintenance in the first place is that I like having servers, rather than some Docker container running on some platform-as-a-service that I don't entirely understand. As it was, when it was time to upgrade the operating system on my server, it was also time to upgrade the database along with it, and the Python version too. The catch is that when I upgraded the OS on my desktop I had to do the same thing on the server, and vice-versa. That could have been made easier by using a managed database and a Dockerised Python. If I were doing this as a job, I probably would. For something I am having fun with, I do not like that loss of control.

I did a bunch of other things while I was there

I have some JavaScript on this site. It's used to handle the privacy-respecting, no-script-friendly YouTube embeds, such as this one. It was originally written in Vue, version 2. That version of Vue went out of support a while ago.

I didn't really want to maintain a JavaScript build system anymore. It felt like effort, and it turned out I could replace it with not many more lines of plain JavaScript than the Vue version had; it's so small (like 500 bytes gzipped) that I didn't even bother minifying it. I can count on plain JS working in browsers for more-or-less ever. I can't count on a JS build system still working on my machine next year, so I do not feel much like maintaining one for a personal project. This fits the plan of making less long-term work for myself.
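
For the curious, the general shape of a no-script-friendly embed in plain JavaScript is something like this. It's a sketch of the idea, not this site's actual code, and the data-youtube-id attribute is made up for illustration: the page ships a plain link, and the iframe (and therefore any request to YouTube) only appears if someone clicks it.

// A sketch of the idea, not this site's actual code; the data-youtube-id
// attribute is made up for illustration. The page ships a plain link, and the
// iframe only gets created (and YouTube only gets contacted) on click.
document.querySelectorAll('a[data-youtube-id]').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    var iframe = document.createElement('iframe');
    iframe.src = 'https://www.youtube-nocookie.com/embed/' + link.dataset.youtubeId;
    iframe.width = '560';
    iframe.height = '315';
    iframe.allowFullscreen = true;
    link.replaceWith(iframe);
  });
});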

Not installing anything from NPM means I don't have any linters anymore. I don't feel good about this, but I feel far worse about the utterly insane dependency tree of ESLint and stylelint, and only slightly less bad about some of the (non-dependency) insanities of JSLint.

I did a minor visual revamp. The home page now lists every post, which means you can find any post by using Ctrl+F, and also go to all of the categories immediately. I also added support for high-contrast rendering, for those who have that preference set in their operating system. And the whole site now uses a monospace font, because I are programmer.

I could have done these things earlier, but I didn't. I didn't consider them meaningful enough differences to be worth doing by themselves. A rewrite of the site seemed like a good time to do that.

Hugo is really good

I like Hugo. Actually, I tend to like anything that comes out of the Go programming community; it tends to produce high-quality software.

I looked into Hugo in the past, and liked it a bit. As I recall, its templating system was not as good back then, whenever that was, as it is now. It had nothing (and I may be misremembering this) that would allow one to override blocks of a base template; instead, it only allowed including other templates, which is not remotely the same thing. That made it feel nerfed compared to Django or Jinja2 templates. Now that it supports extending from and overriding blocks of other templates, I like it a lot.

Hugo is wonderfully documented and every maintainer should aspire to writing documentation this comprehensive.

Hugo builds to a standalone executable, which should continue to work on any of my Linux boxes into the indefinite future; for as long as I have a libc6 x86-64 Linux system, I'll be able to use the same Hugo binary I'm using today. I checked it in to my repo with LFS in case it disappears.
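
In case it's useful to anyone, checking a binary in via LFS is only a handful of commands (a sketch; swap the file name for whatever binary you're stashing):

git lfs install
git lfs track hugo
git add .gitattributes hugo
git commit -m "Check the Hugo binary in via LFS"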

Anyway

It all works! Please let me know if you find something broken.

Towards something that almost looks like a blog

I withdrew from social media at the start of that whole global pandemic thing. Initially it was to bring my anxiety under control, and it turned out to be the best possible thing I could have done for my well-being.

On the other hand, I came to miss having a place to dump things that may or may not be of value to someone somewhere. That's what this is!

I also wanted a little place that was mine -- hosted on servers I pay for, with software I control. The current wave of deplatforming should terrify anyone that has content hosted by someone else. I like cars, cats, and writing Python code. It's unlikely I'll ever become a target for any of that, but that's probably what all those people thought who were posting nudity on that site that was 95% nudity, and we all know how that one worked out.

I have a site. Actually I have another site as well. Those sites do what they do, and one of them gets non-trivial amounts of traffic doing what it does. I did not want to repurpose those. They might get updated with the things they should be updated with, but sometimes a thing should be allowed to be the thing that it is.

So here it is, something resembling a blog. I'll be migrating a bunch of stuff from my Markdown diary files. I have a lot of these, which I gathered in the hope of getting them into a static site generator some day. They didn't end up going into a static site generator, because I was overthinking the solution (and I'll probably post about that soon).

I picked the name "Exhaust" because this is a bit of a vent. Not "vent" in the sense that makes so much discourse ugly, but a place to dump the occasional thought, a thing that has happened to me, a picture, maybe the occasional story and anything else that crosses my mind. And I can do that without caring how many likes or shares it gets! As Todd Snider put it,

I might share some of my opinions with you over the course of the evening. I'm not gonna share them with you 'cause I think they're smart, or 'cause I think you need to know 'em; I'm gonna share 'em with you because they rhyme.