Neo-Grotesk Crypto-Brutalist

End of March, right when I’m finally throwing together my design portfolio (I swear I resisted, and now love having one), Emile asked if I might want to hurl together something for him. Something Web 1.0, something like we’d handcode in HTML in the late ’90s, not quite something MySpace in the days of its browser-crashing gif-frenzy inferno, but definitely something in its lineage; something tuner Nissan Skyline, unassuming on the surface, but all Fast & Furious: Tokyo Drift when you pop the hood. Something Helvetica, Neo-Grotesk, what’s getting called Brutalist right now, though not traipsing behind a fashion; this is Emile, and looking through years of his work while putting his new website together, it’s clear he has a deep love and understanding of the aesthetic, and of the art and philosophy underpinning it.

First things first:

Emile: I have two websites. Can we make them one?
Frances: Yes!
Emile: Can we do all these other things?
Frances: OMG Yes!

Lucky I’d just done my portfolio, cos that gave me the framework to build on without having to bodge together fifty different functions and stuff. Saved a few hours there, which we made good use of in timezone-spanning conversations on typography, aesthetics, and usability.

First off, getting all those years of blog posts and work projects into a single database / website / organism. I used the hell out of interconnect/it’s Search & Replace DB script: merging, shuffling, shifting, getting rid of old code; jobs that would take a week or more to do by hand, done in seconds. We’d pretty much sorted out structure and functionality in a couple of afternoons; for a website that looks so simple, it was most of two weeks’ diligent work, back-and-forth conversations, picking away at details (stripping and rebuilding, stancing, slamming, tuning … we are very good at turning all this into hoonage, especially with the 24h Le Mans in the middle).

Obviously it had to be ‘Responsive’, look hella flush hectic antiseptic no matter what device, and for me (recently taking this stuff proper serious) it had to also be ‘Accessible’. I put those words in scare-quotes cos they’re kinda bullshit.

It occurred to me as I was finishing that for a website to be neither responsive nor accessible — for example, it looks crap if the screen size is too small or not ‘right’, or you can’t navigate with keyboard or screenreader — you have to actively remove this functionality. You have to break the website and override default browser behaviour. It’s a very active process to systematically remove basic functionality that’s been in web browsers since the beginning. You also have to actively not think, not empathise, intentionally not do or not know your job. Me, for probably all of my earlier websites.

The funny thing is, it’s not really any additional work to make sure basic responsive and accessible design / functionality is present; the process of testing it always, always, always brings up usability issues, things I haven’t thought of, little points that become involved discussions about expectations, interactivity, culture, philosophy. Like ‘left and down’ is back in time, and ‘right and up’ forward; 下个礼拜 / 上个礼拜. Next week / last week. Yet the character for ‘next’ is xià, down, less than, lower; and ‘last’ (in the sense of ‘previous’) is shàng, up, more than, higher. So how to navigate between previous and next posts or projects turns into an open-ended contextual exchange on meaning.

And ‘responsive’, ‘accessible’? Basic, fundamental web design. Not something tacked on at the end.

Back to the design. System fonts! Something I’ve not done in years, being all web-font focussed these days. Another trip through the wombat warren of devices, operating systems, CSS declarations. It’s crazy impressive how deep people go in exploring this stuff. Emile Blue! A bit like International Klein Blue, and a bit like Web / HTML 4.01 Blue. But not! We worked this in with a very dark grey and a very slightly off-white, bringing in and throwing out additional colours, and managing in the end to sort out all the interaction visual feedback through combinations of these three — like the white text on blue background for blockquotes. Super nice.

As usual, mad props to DreamHost for I dunno how many years of hosting (it was Emile who said to me, “Frances. Use DreamHost.”), WordPress for running Emile’s old and new sites (and all of mine), and Let’s Encrypt for awesome and free HTTPS. And to Emile for giving me the pleasure of making the website of one of my favourite artists.

Emile’s new website is here:

DreamHost & Let’s Encrypt & WordPress

Mid-last year, the Electronic Frontier Foundation announced Let’s Encrypt: a free, automated, open-source HTTPS Certificate Authority for everyone. That’s the padlock in the address bar. Supernaut and other sites of mine have either gone the parasitic paid route, or the “whole day lost and still not working” free route.

DreamHost, who has been my webhost for close to a decade (thanks Emile), announced in December they would be providing One-Click Let’s Encrypt setup for everyone, no matter what their hosting plan. Yesterday it arrived; it really is one-click! OK, two clicks, one checkbox, one select. Four things you have to do in the DreamHost panel for what used to cost tens to hundreds of dollars/euros a year and hours of pain if you didn’t want to pay.

Awesome. Best thing that’s happened to the internet in ages.

Anyway, this isn’t about all that, it’s about “I have a WordPress site and how do I make the green padlock appear?” Cos that’s a couple more steps. Call it 15 minutes if you’re paying attention; an hour if you’re drinking.

So, you’ve already added the certificate in DreamHost Panel. No? To the DreamHost Wiki! Done that, wait for the confirmation email (might take a couple of hours), and on to your webserver.

I start with getting wp-admin all https-ified. Open your site in whatever FTP client you’re using, open wp-config.php and add:

define( 'FORCE_SSL_ADMIN', true );

Re-login to wp-admin, and check for the padlock in the address bar. Open Web Inspector (command-alt-i), select the Console tab, and check for Mixed Content errors. Unless you’re doing weird things in wp-admin, that’s that side of things sorted.
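If you want to hunt stragglers in bulk rather than eyeball the Console, here’s a rough sketch (mine, not part of the original setup) of what the Mixed Content check amounts to: scanning served HTML for subresources still loaded over plain http. The function name and example markup are invented for illustration:

```python
import re

def find_mixed_content(html):
    """Find src attributes still pointing at plain http:// resources.
    On an https page, browsers flag these as Mixed Content."""
    return re.findall(r'src\s*=\s*["\'](http://[^"\']+)["\']', html, re.IGNORECASE)

page = '<img src="http://example.tld/a.png"><script src="https://example.tld/ok.js"></script>'
print(find_mixed_content(page))  # ['http://example.tld/a.png']
```

Feed it the output of View Source (or a fetched page) and anything it returns is what’s breaking your padlock.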

In your site root directory, you’ll see a directory called “.well-known”. That’s added by Let’s Encrypt. Probably don’t want to delete that.

Open your root .htaccess and add two chunks of code:

# force redirect http to https
# ---------------------------------------------------------------

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTPS} off
 RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</IfModule>

# https security headers
# http strict transport security (hsts)
# X-Xss-Protection
# X-Frame-Options
# X-Content-Type-Options
# ---------------------------------------------------------------

<IfModule mod_headers.c>
 Header always set Strict-Transport-Security "max-age=16070400; includeSubDomains"
 Header always set X-Xss-Protection "1; mode=block"
 Header always set X-Frame-Options "SAMEORIGIN"
 Header always set X-Content-Type-Options "nosniff"
 #Header always set Content-Security-Policy "default-src https://website.tld:443;"
</IfModule>

The first force-redirects any requests for http to https. The second sets some fairly obtuse security headers; worth reading up on before you enable them. (The Content-Security-Policy stuff takes a lot of back-and-forth to not cause chaos. Even wp-admin requires specific rules, as it uses Google Fonts. Expect to lose at least an hour on that if you decide to set it up.)
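As a quick sanity check that those headers are actually coming back (my own sketch, not from the original; in practice you’d grab the headers with curl -I or whatever HTTP client you like), compare a response’s headers against what the .htaccess block above should be setting:

```python
# the headers the .htaccess block above should be setting
EXPECTED = {
    'Strict-Transport-Security': 'max-age=16070400; includeSubDomains',
    'X-Xss-Protection': '1; mode=block',
    'X-Frame-Options': 'SAMEORIGIN',
    'X-Content-Type-Options': 'nosniff',
}

def missing_headers(response_headers):
    """Return the expected security headers absent (or wrong) in a response.
    Header names are matched case-insensitively, as HTTP allows."""
    lower = {k.lower(): v for k, v in response_headers.items()}
    return [name for name, value in EXPECTED.items() if lower.get(name.lower()) != value]

# e.g. a response with one header missing:
got = {
    'strict-transport-security': 'max-age=16070400; includeSubDomains',
    'x-frame-options': 'SAMEORIGIN',
    'x-content-type-options': 'nosniff',
}
print(missing_headers(got))  # ['X-Xss-Protection']
```

Anything it lists means the Apache block isn’t being applied (mod_headers off, wrong .htaccess, caching in the way).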

Other ways of doing this are possible. It’s kinda unclear what’s canonical and what’s deprecated, but the WordPress Codex and elsewhere have variations on these (definite rabbithole, this stuff):

SSLOptions +StrictRequire
SSLRequire %{HTTP_HOST} eq "yourwebsite.tld"
ErrorDocument 403 https://yourwebsite.tld

<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{HTTPS} off
 RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

 RewriteBase /
 RewriteRule ^index\.php$ - [L]
 RewriteCond %{REQUEST_FILENAME} !-f
 RewriteCond %{REQUEST_FILENAME} !-d
 RewriteRule . /index.php [L]
</IfModule>

Then go through the remainder of your .htaccess and change any specific instances of your old http://yourwebsite.tld to https://yourwebsite.tld.

Open robots.txt and do the same, http changed to https.

Now’s a good time to empty your cache and check your live site to see how it’s going. Every time I’ve done this so far it’s been all green padlock and magic S. But there’s still a couple of things to do.

Off to your Theme directory, open functions.php, and add these chunks of code:

/* content over ssl
------------------------------------------------------ */

function themename_ssl_content( $content ) {
 if ( is_ssl() ) {
  $content = str_ireplace( 'http://' . $_SERVER['SERVER_NAME'], 'https://' . $_SERVER['SERVER_NAME'], $content );
 }
 return $content;
}
add_filter( 'the_content', 'themename_ssl_content' );

/* advanced custom fields over ssl
------------------------------------------------------ */

add_filter( 'wp_get_attachment_url', 'set_url_scheme' );

The first one makes sure your old http links in Posts and Pages are spat out as https. (Can probably extend that for excerpts and other things if you’re that way inclined.) The second was for me specifically dealing with Advanced Custom Fields plugin, and is part of a larger issue that’s been bashed around on Make WordPress Core. There’s a few other bits of code floating around for issues like this and the Media Library not behaving properly over https, but the next step I think deals with that, if you want to go that far.
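For the curious, the str_ireplace() call boils down to this (a Python re-sketch of the filter, with yourwebsite.tld standing in for $_SERVER['SERVER_NAME']): a case-insensitive swap of the scheme on your own domain only, leaving links to other sites alone:

```python
import re

def ssl_content(content, host):
    """Case-insensitively rewrite http://<host> to https://<host>,
    mirroring the str_ireplace() in the PHP filter above."""
    pattern = re.compile(re.escape('http://' + host), re.IGNORECASE)
    return pattern.sub('https://' + host, content)

html = '<img src="HTTP://yourwebsite.tld/a.jpg"> <a href="http://elsewhere.tld/">x</a>'
print(ssl_content(html, 'yourwebsite.tld'))
# the image URL becomes https://yourwebsite.tld/a.jpg; elsewhere.tld is untouched
```

That “own domain only” bit matters: blanket-rewriting every http:// would mangle links to third-party sites that don’t serve https.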

Before that though, go through your theme for irksome hardcoded http strings. If you really can’t use proper WordPress functions (like get_template_directory_uri() or whatever), then change them to https.


Turns out WordPress’ responsive images support using srcset needs its own attention:

/* responsive images srcset
------------------------------------------------------ */

function ssl_srcset( $sources ) {
 foreach ( $sources as &$source ) {
  $source['url'] = set_url_scheme( $source['url'], 'https' );
 }
 return $sources;
}
add_filter( 'wp_calculate_image_srcset', 'ssl_srcset' );

If you fully commit to the next step though, these functions aren’t required.
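In the same spirit, here’s roughly what that srcset filter does, sketched in Python (the $sources array WordPress hands the filter is a list of candidate images, each with a url; the sample data below is invented):

```python
def ssl_srcset(sources):
    """Force https on every candidate URL, mirroring the set_url_scheme()
    call in the wp_calculate_image_srcset filter above."""
    for source in sources:
        if source['url'].startswith('http://'):
            source['url'] = 'https://' + source['url'][len('http://'):]
    return sources

sources = [
    {'url': 'http://yourwebsite.tld/img-300x200.jpg', 'descriptor': 'w', 'value': 300},
    {'url': 'https://yourwebsite.tld/img-600x400.jpg', 'descriptor': 'w', 'value': 600},
]
print([s['url'] for s in ssl_srcset(sources)])
# ['https://yourwebsite.tld/img-300x200.jpg', 'https://yourwebsite.tld/img-600x400.jpg']
```

Same image, every size variant, every scheme forced to https before the srcset attribute is written out.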

(End of addendum.)

Database funtime! Back up your database; it’s time to search and replace. I’ve been using interconnect/it’s brilliant Search Replace DB for ages when I need to shift localhost websites to remote. Dump it in your WordPress folder, open it in a browser, and search-replace http://yourwebsite.tld to https://yourwebsite.tld. I do a dry run first, just to get an idea of what’s going to be changed. This step isn’t strictly necessary, and if you end up going back to http (dunno why you would), you’d need to reverse this process; it’s probably just that I like everything to be all orderly.
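Why use the script rather than a plain SQL REPLACE? WordPress stores a lot of options as PHP-serialized strings, which embed a byte count (s:22:"http://yourwebsite.tld";) that goes stale the moment https adds a character; a naive replace corrupts those rows, while Search Replace DB recounts them. A rough Python sketch of the idea, good enough for simple strings (real serialized data containing "; inside a value would need a proper parser):

```python
import re

def replace_and_fix_serialized(value, old, new):
    """Do the replacement, then recount every s:<len>:"...";
    so PHP-serialized strings stay valid after http:// grows to https://."""
    value = value.replace(old, new)
    return re.sub(
        r's:\d+:"(.*?)";',
        lambda m: 's:%d:"%s";' % (len(m.group(1).encode('utf-8')), m.group(1)),
        value,
    )

row = 'a:1:{s:4:"home";s:22:"http://yourwebsite.tld";}'
print(replace_and_fix_serialized(row, 'http://yourwebsite.tld', 'https://yourwebsite.tld'))
# a:1:{s:4:"home";s:23:"https://yourwebsite.tld";}
```

If the length and the string disagree, PHP’s unserialize() silently fails and the option comes back empty, which is how themes and widgets mysteriously lose their settings after a hand-rolled migration.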

Another pass through the browser to make sure crap hasn’t fallen everywhere, and it’s cleanup time.

In wp-admin, check all the settings are showing https where they should (can even resave Permalinks if clicking buttons feels good). If you’re using a plugin like Yoast SEO, then checking the sitemaps are correct is good.

Caching plugins also need to be checked, and caches emptied. If you’re using W3 Total Cache and are manually minifying, check the file URLs are all https; for some reason this wasn’t changed even with the database search-replace. Also, under Page Cache, check “Cache SSL (https) requests”.

Then it’s checking in a couple of browsers with and without caching, particularly any pages that use plugins which embed or iframe content, or otherwise interact with stuff outside your website. Most sites like Vimeo, Twitter, YouTube etc that WordPress provides embeds for are already over https, but that doesn’t mean code in a plugin is up to date.

If you’re using Google Analytics or Webmasters or similar, you’ll need to set things up there for the new https version of your site as well.

Buncha caveats: Do some of this wrong and you will break things. At the very least, back up your database before doing this. Some/all of this might be deprecated/incorrect/incomplete/not the best way to do it. Finding definitive, single ways to do things isn’t really how code works, so try and understand what the code does so you can ask if it’s doing what you want. The reason I can do this in 15 minutes is that I’ve spent years scruffing around in stuff and breaking things horribly, and the best I can say is, this seems to work and cover the most common issues.

DreamHost. Let’s Encrypt. Excellent!

(I swear this is much quicker than it took to read.)

https:// + supernaut (& WordPress)

Every month, I get a newsletter from my webhost, DreamHost. And I read it! That’s how I found out that supernaut could be SSL, have https:// in its URL and have that fancy lock in the address bar. Up there, ^.

Why would I want to do something like that? Well, because I can. Because it’s useful for a lot of reasons, especially now. Since the Snowden NSA whistleblowing, which gets worse and more damning with each document release, it became obvious to me I should take the implementation of privacy and security as seriously as I do reading about it. Recently this has meant beginning the move off Google, which I use for so much: installing Piwik instead of using Google Analytics, installing GPGTools for email encryption (and badgering my friends to do the same); and obviously, if easy website encryption was possible, I’d give it a spin.

The first things I tried it out on were my private server (running Pydio, formerly AjaXplorer) – my self-run Dropbox – and Piwik, then a couple of small, low-traffic sites, to test how SSL would play with my standard-ish WordPress setup, which led to some rewriting of .htaccess rules and some quick/easy code cleanup. So then I thought to try it on supernaut, which gets enough traffic and is complex enough to really show the horror.

WordPress makes it reasonably simple – provided you have SFTP/shell access – to make the switch. First, the admin and login side of things can be SSL’d simply by adding:

define('FORCE_SSL_ADMIN', true);

to wp-config.php. Then in the Admin General Settings, changing WordPress Address and Site Address to https. Then in the root .htaccess, editing the standard WordPress block with two additional lines:

RewriteEngine On
RewriteCond %{SERVER_PORT} !^443$
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R,L]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]

This did the majority of the change; the remainder turned out to be annoying. Images in the content seemed very reluctant to make the jump, partly because they’re effectively hardcoded in the process of inserting them via the Media Library. I tried several methods with varying degrees of success, and finally adding this to functions.php seemed to do the job, updated a bit to use WordPress’ inbuilt is_ssl() function, which is a wrapper for both isset($_SERVER["HTTPS"]) and a check for server port 443, and to use str_ireplace() instead of the deprecated ereg_replace():

function supernaut_ssl_content( $content ) {
 if ( is_ssl() ) {
  $content = str_ireplace( 'http://' . $_SERVER['SERVER_NAME'], 'https://' . $_SERVER['SERVER_NAME'], $content );
 }
 return $content;
}
add_filter( 'the_content', 'supernaut_ssl_content' );

Which just left data in custom fields, which I use primarily for video. These look like they are best edited by hand in supernaut, though it’s possible I could use the WordPress HTTPS (SSL) plugin … I’d rather bash around in the code so I understand what’s going on. (edit) well obviously I can just wrap any code in the template with the above function, and just replace $content with $variable that pulls the post meta, no? (edit 2) And also for situations where I’m using Advanced Custom Fields (sadly not here), adding this to functions.php takes care of the rest:

add_filter( 'wp_get_attachment_url', 'set_url_scheme' );

This just leaves changing the WordPress Address and Site URLs in General Settings to https://, and that’s it (I think).

The last thing then is my own certificate. While Chrome is reasonably calm about the fact my certificate is self-signed (i.e. I haven’t spent up to $1500 on one from a trusted vendor), and Safari drops down a fairly innocuous warning – which admittedly is enough to make most people anxious – Firefox turns on all the alarms and does a mad freakout that’s impossible to simply bypass. Horrible, no? I figured that $15 a year for a signed certificate was probably worth it, for the experiment alone.

supernaut on SSL then! Most Excellent!

And for those reading for whom this was all WTFBBQ?, here’s what DreamHost said:

SNI – SSL Without a Unique IP!

“Server Name Indication,” or SNI, is a biiiig deal in the world of web hosting.

Every site on the web is tied to a specific IP (v4) address, but a single address can be shared across several different domains. In fact that’s one feature that’s helped to keep the Internet from bursting apart at the seams up ’til now.

IPv4 addresses are the Brettcoins of the shared hosting world in that they are both EXTREMELY valuable and that there are only a finite number of them gifted to humanity by the Gods.

While IPs can be shared among websites, they cannot be shared among SSL-enabled (secure) websites. If you want to handle secure web transactions on your own without the use of a specialized third party payment platform you’ll need to lease (and pay for,) a unique IP address for your own personal use.

Or at least…that’s how things USED to work.

SNI extends the protocols used to process secure web transactions to allow for the usage of a single IP address across several different SECURE websites. And, as of not too long ago, we support it!

You can still obtain a unique IP address and tie your secure hosting to it if you’d prefer – but it’ll cost ya ($3.95/month.)

To add or modify the secure hosting settings on any of your domains, visit the “Domains/Secure Hosting” section of your control panel, and click to “Add” or “Edit” services on your domains.

For a little background on setting up secure hosting in general, including some caveats of SNI, check out our wiki!

Some musings on a simple, private cloud

Yesterday Google announced they were killing Reader, which has caused wailing, gnashing of teeth, general internet meltdown, and it trended on Twitter longer and harder than the new pope. supernaut has been around for longer than all three (previous pope included) and will probably outlive the remaining two, which has given me a little to think about as I simultaneously bash the command line in search of data anti-impermanence.

Despite my current roll of 300-something feeds, which I use to stay barely coherent on everything from astronomy to China, theatre, black metal, and porn, I’m not especially sad about the demise of Reader, as there was a time – pre-multi-device syncing – when it was unnecessary, and I hope the coming weeks will make that so again; monoculture leads to ecosystem collapse and so on. What it does denote is the precariousness of the current state of things for anyone who engages with platforms that hold our data, and the urgent need for alternatives that don’t demand the open-source / GitHub mentality.

And considering I do the majority of my work in or around open-source and things that live on GitHub and other similar platforms, I can say that unless I am more useless than I can possibly imagine, the state of things is diabolical.

Some years ago, when my first webhost screwed up my domain registration and lost it, I moved (thanks to Emile) to DreamHost, where I’ve been signing up others ever since. As far as cheap, reliable, ethical webhosting goes, I think they’re about as good as it gets – yes, even with the occasional downtime and other problems; I think they take what they do seriously enough that things would have to go very wrong for me to decide to move.

Late last year they began to offer DreamObjects, which is “an inexpensive, scalable, reliable object storage service for web and app developers and tech-savvy individuals” – basically whopping great amounts of secure hard drive – to which I thought, “Ooo, I could back up my entire laptop to that! … if it were a bit cheaper …”, which they duly made it, and at 2¢ a gig it works out to under 10,-€ a month for my 500 gig hard drive, so it becomes something of a no-brainer in terms of, “Yes, this is affordable”.

Affordable but conditional clause = ‘tech-savvy’. And now we return to the open-source mentality. I’m not especially tech literate. I couldn’t write a bash script from scratch to save my life, but I can scrape things together from various similar examples and usually my “repeat until unbroken” approach works, combined with trawling stackoverflow for error message solutions. So I approached DreamObjects with some fairly specific ideas of how I wanted to use it, and was prepared for a degree of gargling the command line.

Oh dear, did I gag.

Anyone who knows me and has a Mac has been pestered to do their backups (anyone using Linux has by definition already done this, and Windows users can sod off), and over the years I’ve tried everything: all manner of syncing, local and remote backups, encrypted sparse disk images, rsync from the command line via SSH, Time Machine and SuperDuper! and more that I’ve forgotten. I’ve also attempted various “in the cloud” services, which I’ve avoided for three reasons:

First, the pricing is not commensurate with the real costs of storage (effectively free); second, I don’t trust my data with anyone – I like privacy and I’m determined to stick with it as a human right; third, as evinced by Google Reader, these things tend to die. Let’s forget for the moment the fourth: uploading 500 GB of data and syncing it requires a certain monomaniacal effort.

So, DreamObjects meets the first, and conditionally the third (they’ve been around longer than I’ve been on the internet), and for the critical second … they’ve been good so far.

And off I go into the land of “backing up your life to the cloud”. Thus far I’ve tried boto_rsync, boto, s3cmd, Duplicity, Duplicati, GoodSync, xTwin, DragonDisk, Cyberduck, ownCloud… and have currently (unsatisfactorily) settled on s3cmd. Most of these are command-line tools which require building and installing and a degree of ‘tech-savvy’ that is way beyond the average person; others are proper apps but don’t make it easy (most are designed for Amazon S3, so using them for DreamObjects is a bit of a hack), don’t allow for easy exclusion of files and folders, or are otherwise generally problematic, opaque, unintuitive, or just ugly.

Maybe to say that for most people I know on a Mac, Time Machine is about as complex as they can handle or want to handle, and so backing up remotely needs to add at most one more straightforward step to this.

And this is the problem: none of what I tried is remotely simple, and even where something was intended for S3 and I was hacking it to work on DreamObjects, the amount of suffering would cause most people to give up; the complexity of owning your own data in all instances, from Reader to backing up to Facebook and Twitter, is directly responsible for driving people to use free platforms where everything – security, privacy, longevity – is secondary to advertisers.

I looked at ownCloud also, which is close to my idea of a self-owned private cloud, and even that requires messing around with PHP; really, the whole rhetoric of self-controlled data in the cloud is just one giant elitist, chauvinist wank unless my (dead) Turkish grandmother can do it herself, without feeling confused or stupid.

Admittedly, with DreamObjects it was also a test for me to see how possible all this is, because I’d like to be able to tell my friends, “hey, you can back up your laptop remotely for under 10 euros a month!”, so I have spent an obsessive amount of time messing around, and apps like xTwin come the closest to what I think is desirable. What is crucial, though, is that there isn’t a simple way to have remote backups that aren’t bound to some company like Amazon – or a simple way to manage RSS feeds that isn’t bound to Google Reader.

Given that cloud storage is so cheap now, having this as a ‘part’ of my or anyone’s computer is going to become normal – it’s about where mobile phones were in 1998 – and I think some imagination is necessary to prevent a situation where this part of me is owned by a corporation with a specific agenda to make money off me, my data, and my privacy.

Using the cloud should be transparent; backing up to it as simple as clicking an icon – or setting it up once and then it’s automatic; my public presence, be it on the equivalent of Twitter, Facebook, G+, Reader, should be owned and controlled by me; that is to say, while the interface of these networks might be public in the way these are now, my data remains with me and is owned by me, and I am not a saleable commodity of a corporation.

Or perhaps to put it another way, if we are ever going to have true personal and private clouds and social networks, we need the demise of Reader and we need open source and other developers to make simple software that isn’t bound to these monoculture platforms.

(s3cmd just crashed, and I feel like throwing a few hours at ownCloud…)

A bit of a Mess, (continued)


… supernaut has been one-columnised. Most of the index page is done. Older stuff is broken (Oh! Images, why do you hate me?). Things have been lost/misplaced/sorted into piles for later use or discarding. Looking forward to hand-grepping the database.

(This is trying to be an aside but breaks things … Trying to use supernaut differently.)

A bit of a Mess, Really …

I was planning on slowly flopping supernaut over to a newer look … well, mostly the same old look but wrapped around WordPress’ default theme, so I could a) take advantage of all the fun new things like post formats, b) deal with some irritations of having slung supernaut through two or three platforms (WordPress, Movable Type, and I think something else) and the general cruft of eight years’ blogging, and z) just because …

Slowly became quick because supernaut got hacked again. Not looking at you, DreamHost, too hard, but it’s getting kind of irritating lately. It’s probably not the very old theme supernaut is running on that’s responsible for this, but having looked through the logs, gah! No idea!

So I swapped over to the new theme early. Severely unfinished. Much mess everywhere. Embarrassment? Yes! It’s going to take a long time before it looks all proper, like.

And this post was written using the new Aside post format. More of a Status, really.

Fuck it, I’ve got six opened bottles of wine by me feet and a … what am I doing sitting all alone in the darkness in Alte Kantine Wedding?

a bit of a test

I spent the morning upgrading my blog software to Movable Type 3.34, possibly a stupid move but it seemed like a good idea at the time. So I guess the absence or presence of this post will unequivocally indicate the success of said venture, and let me know just how much of staring-at-code I’ll be doing in the coming days.

The search is kind of working again, but I can’t find the template, so instead you get the ugly Movable Type default I-have-no-style style. I’m going to change to a different search engine though, as the MT one is like a Neanderthal bludgeoning a Louis XIV chair with a stone club. So, any weirdness is my fault, and seducing me with chocolate will assist the weirdness in going away sooner.

edit …

Mostly working. I’m also changing the search engine to Fast Search, but I am lazy and can’t imagine I’ll get it finished today. So … no search.