5-Character Dev Environment

Messing with my .bash_profile this afternoon, post-diving into Laravel and Git (which I’ve been doing much of the last week), I realised I could boot my entire dev environment with 5 letters. Fewer, if I wanted.

So instead of going to the Dock, clicking each of the icons, and faffing around in each app, I could boot them all at once and set off some commands in Terminal (or iTerm2, as I’m now using).

Weirdly, until Justine gave me an evening of command-line Git learning, and wanted my .bash_profile, “Like so,” I hadn’t realised you could do stuff like that, despite amusing myself with all manner of shell scripts. Now I know what’s possible, I’m over-achieving in efficient laziness.

What’s missing is:

  • Opening multiple windows in iTerm2 or Terminal and running a different command in each (I don’t want to boot multiple instances of an app).
  • Setting off a menu action in an opened app, e.g. in Transmit going to my work drive.
  • Extending it to boot the environment and then a specific project, e.g. “devup laravel” would open my laravel installation in each of the apps, like opening the database in Sequel Pro; cd-ing to the laravel folder after automatic SSH-ing into my Vagrant box, and so on.

Some of these are probably uncomplicated, but this was a 30-minute experiment that turned out to be very useful.
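The bodge itself is nothing more than a function in .bash_profile. A minimal sketch, assuming macOS’s `open -a` and a handful of example apps (the app names and the `devup` name here are illustrative, not my exact setup):

```shell
# in ~/.bash_profile — boot the whole dev environment with one word
# (app names are examples; swap in whatever lives in your Dock)
devup() {
  open -a "iTerm"
  open -a "Sublime Text"
  open -a "Sequel Pro"
  open -a "Transmit"
  cd ~/Sites && vagrant up   # spin up the Vagrant box while the apps launch
}
```

Then typing `devup` in a fresh Terminal does the lot.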

5-character dev environment


Why, Yes, Chinese Grinding & Boring Equipment Company Spam Email, I am Interested.

Dear Lei Zhang of Fujian Nan’an Boreway Machinery Co., Ltd. in Shuitou Town, Nan’an City, Fujian Province, China. We have never met. But you emailed me on my blog email this morning. Your email was beautiful. I sadly have no use for boring and grinding heads as I sadly have no boring or grinding equipment. I wish I did. I saw the attached photograph of bush hammer rollers and thought to myself, “This is good and useful spam. All spam should be this good and useful. Imagine if it was, I would buy things.” So, thank you, Lei Zhang, you have done the unique. In all my years of receiving spam, in all those tens or hundreds of thousands of impersonal and unwanted emails, yours is the first that’s made me say, “I wish I had a need for this, I would buy immediately.” It’s true, if I had spare cash, I’d buy some bush hammer rollers just because they look brilliant — and I’ve never even seen them before. Because of your email, I learned something new, which is what is best in life. You must think I’m being sarcastic and mocking you, but I’m not. Yours is genuinely the best unsolicited email I’ve ever received.

Bush Hammer Rollers from Fujian Nan'an Boreway Machinery Co., Ltd.

Bookmark Archaeology


I was cleaning out my browser bookmarks last night, first time in years, bookmarks going back to the early-’00s, thousands of them. I opened them in batches, every one, to see if I wanted to keep them. Hundreds, thousands of dead sites, no longer found, no longer existing. All that history and culture vanished as if it never was, only the link and title in my bookmarks proving they once existed, and once I deleted that …

Code Stupidity


I got sick of the tiny, Web1.0 images everywhere here, a hangover from the earliest days of supernaut, so I decided — ’cos I like visuality & pix — to make small, big. I thought it would be easy. Little did I know I’d also be creating and adding to the pile of Technical Debt. So: most single images in the recent past are now huge-ified, 666px wide; recent image galleries which are not full of diverse image ratios are now evenly splitting the Number of the Beast. Older images and galleries should be retaining their previous diminutiveness, but who knows, 13 years of blog is difficult to homogenise. Mostly I got distracted with how to make portrait images not blow out of the available browser window space, which turns out to be a kinda traumatising process I didn’t achieve. Plus how to Lazy Load srcsets by preg_replacing the new WordPress caption shortcode. OMFG, Frances, WTF? All of which makes me think it might be time for yet another supernaut refresh. So much code. So many images. So much …

A Year Of My Heart

A year ago, I decided to get all analytic on my training. Mainly I just like tech and pretty representations of data. So I bought a heart rate sensor. And now it’s been a year of me using it almost every time I train. Which means I can look at a year in the life of Frances training, with all the … whatever that reveals.

What does it reveal, Frances?

Well, other Frances. I trained 156 times — that I recorded, let’s say 170 because I pretty much did not train without it unless I forgot either sensor or phone. For a total of 190 hours — there’d be a few more in that for the times my phone battery died. For a measly distance of 1481 kilometres — of actual training rides, not including cross-town, Kreuzberg-Wedding type stuff, so maybe double that at least, no wonder I spend so much on my bike and it feels like it’s constantly in need of repair. Hey, just like me! (Wow, there’s a realisation, right there.) About 1/3 of that was ballet, another third cycling (mostly road at the moment, but some cyclocross), 1/6 bouldering, and the remaining 1/6th a mix of yoga and core training.

Oh, and supposedly I burned around 121,000 calories, which is about 60 days of eating 2000 calories a day. I’m not really convinced about this. I think it’s more of an imaginary number, and not the mathematical kind.

What else? Speed, both average and top, is derived from the iPhone’s GPS. I’m not sure how much dispersion there is in this, but I suspect it can easily be 5km/h or more in either direction. My next gear purchase (after … umm … new brakes and probably new rear derailleur pulley wheels) is a speed/cadence sensor — which probably means also a proper cycling head unit instead of my phone …

I seem to unintentionally train in 9-10 week blocks, then give up in despair for a couple of weeks, then, like a goldfish circling its bowl, forget all that and get right back into it. Knowing that this might be my natural rhythm though, it could make sense to train in 9 week blocks with a week off, if for nothing else than keeping my enthusiasm. Also I doubt I’ve been training like that this year, my rhythm’s all over the place.

My maximum heart rate seems to be constant at around 190 (excluding the huge jumps into the 200s that were either the battery going flat, the sensor getting jostled, or actual random heart weirdness from having stupid fun training in -10º weather). I dunno, I have no context or expertise for reading anything into these figures, other than I seem to like training if it involves a degree of discomfort and some suffering — which I didn’t need a heart rate sensor to tell me.

So, a year of data. What to do with it? No idea! Will I keep using it? For now, yes. It’s become automatic to put it on. I don’t really use it during training, though I’d use it for cycling if I could find an iPhone mount that could hold my ancient 4S. But mostly I do it on feel, and that corresponds pretty closely to the various heart rate zones. I do do regular post-training gawks, to compare how I felt with actual data — and knowing that data across sessions gives me a bit of a feeling for where I’m at on a particular day or week. And one other thing: I train a lot less than I think.

Worth it for seeing a year of training all pretty like that? Yup!

Polar Flow and H7 Heart Rate Sensor — One Year Weekly Training Report
Polar Flow and H7 Heart Rate Sensor — One Year Daily Training Report


Field Series 1

Me messing around with mediæval art, Photoshopping it until it’s far from the 3/4 of a millennium ago of its origin. It started as a visit to the Gemäldegalerie when I decided to do closeups of some of my favourite works. This is part of the Altarretabel in drei Abteilung mit dem Gnadenstuhl, from after 1250. Last night, feeling unexpectedly inspired around midnight, I realised I could mash another few score of layers into an image I was working on six months ago, and increase the density in ways that somehow appeal to my brain and eyes and emotions. I always zoom in on these images, like there’s myriad possible paintings in each. This time I took screenshots of those, and wanting to know what they might look like animated, threw them into Final Cut X and spat out 48 seconds of video.

I was asking myself if this is art. I know art and make art, but still. Maybe they’re sketches of possibilities. I like the artefacts generated from the process. I have no control over this. I have some control in which direction to push an image, but a lot of the detail is only minimally editable. Things happen, I make decisions, other things happen, possibilities open and close, I try and steer it towards a particular satisfaction, but each individual line and gradient and tone, no, that’s the software making its own decisions based on what I ask it to do. And as always, the further I get from using software as it was intended, the more interesting it becomes to me.


Land Speed Record

I know my new tires and wheels are mad fast, but kinda doubt I was the fastest thing on Tempelhofer Feld since the airport closed in 2008. Plus I’d have broken numerous Ordnungsamt and Straßenverkehrsbehörde regulations by laying down a solid hour of 217.6km/h — and not a tenth of a km/h faster or slower. Plus that would indeed be a land speed record for non-motor-paced bike on the flat by a huge margin. Then there’s my acceleration: zero to that in 1 second. The Porsche 918 Spyder can barely hit a hundred in twice that time. Takes me 3 seconds to slow to zero though.

Prepare for Takeoff

Sandy Stone at The Future Is Unmanned: Technologies for Corrupt Feminisms

Presented by the agent of slime Virginia Barratt, and Petra Kendall, at The New Centre for Research & Practice (in Grand Rapids, Michigan, US). And, that’s Sandy Stone of The Empire Strikes Back: A Posttranssexual Manifesto. It’s gonna be awesome.

Hello all,

attached please find some information and some links to a 5 week seminar entitled “The Future is Unmanned: Technologies for Corrupt Feminisms” presented by Virginia Barratt and Petra Kendall.

Guests are:
Linda Dement, Amy Ireland, Lucca Fraser, Allucquere Roseanne Stone, Rasheedah Phillips, Francesca da Rimini, Emma Wilson and others TBC or who may drop in.

The first session is on Feb 26th with a round table discussion with special guest Sandy Stone. We are super excited to have Sandy guesting for us.

The times, unless otherwise stated, are 5pm-7.30pm EST

Here are a couple of links to information:

You can make enquiries or register for the event via the New Centre site.

Please share this widely with interested people.

Virginia + Petra

The Future Is Unmanned: Technologies for Corrupt Feminisms

Website rsync Backups the Time Machine Way

Continuing my recent rash of stupid coding, after Spellcheck the Shell Way, I decided on Website rsync Backups the Time Machine Way.

For a few years now, I’ve been using a bash script I bodged together that does incremental-ish backups of my websites using the rather formidable rsync. This week I’ve been working for maschinentempel.de, helping get frohstoff.de’s WooCommerce shop from Trabant to Hoonage. Which required repeated backing up of the entire site and database, and made me realise the shoddiness of my original backup script.

I thought, “Wouldn’t it be awesome, instead of having to make those stupid ‘backup.blah’ folders, to let the script create a time-stamped folder like Time Machine for each backup, and use the most recent backup for the rsync hard links link destination?” Fukken wouldn’t it, eh?

Creating time-stamped folders was easy. Using the most recent backup folder — which has the most recent date and, in standard list view on my Mac, is the last folder in the list — was a little trickier. Especially because once a new folder was created to back up into, the previously most recent one was now second to last. tail and head feels hilariously bodgy, but works? Of course it does.
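That tail-and-head bodge, pulled out on its own with made-up folder names:

```shell
#!/bin/sh
# three date-stamped backup folders; the newest was just created,
# so the previous backup is the second-to-last name in sorted order
printf '%s\n' "supernaut-2016-01-01" "supernaut-2016-02-01" "supernaut-2016-03-01" | tail -2 | head -n 1
# prints: supernaut-2016-02-01
```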

Bare bones explaining: the script needs to be in a folder with another folder called ‘backups’ and a text file called ‘excludes.txt’. It needs to be given chmod +x to make it executable, and generally can be re-bodged to work on any server you can ssh into. Much faster, more reliable, increased laziness, time-stamped server backups.
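That one-time setup, sketched out (the exclude patterns here are just examples, not my actual list):

```shell
#!/bin/sh
# create the folder and excludes file the backup script expects, alongside it
mkdir -p backups
printf '%s\n' ".DS_Store" "wp-content/cache/" > excludes.txt
# then: chmod +x backup.sh (or whatever you named the script)
```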

#!/bin/bash
# ---------------------------------------------------------------
# A script to manually back up your entire website
# Backup will include everything from the user directory up
# excludes.txt lists files and folders not backed up
# Subsequent backups only download changes, but each folder is a complete backup
# ---------------------------------------------------------------
# get the folder we're in
this_dir="$(cd "$(dirname "$0")" && pwd)"
# set the folder in that to backup into
backup_dir="$this_dir/backups"
# cd to that folder
echo "******************"
echo "cd-ing to $backup_dir"
echo "******************"
cd "$backup_dir" || exit 1
# make a new folder with timestamp
time_stamp=$(date +%Y-%m-%d-%H%M%S)
mkdir "$backup_dir/supernaut-${time_stamp}"
echo "created backup folder: supernaut-${time_stamp}"
echo "******************"
# set link destination for hard links to previous backup
# this gets the last two folders (including the one just made)
# and then the first of those, which is the most recent backup
link_dest=$(ls | tail -2 | head -n 1)
echo "hardlink destination: $link_dest"
echo "******************"
# set rsync backup destination to the folder we just made
backup_dest=$(ls | tail -1)
echo "backup destination: $backup_dest"
echo "******************"
# run rsync to do the backup via ssh with passwordless login
rsync -avvzc --hard-links --delete --delete-excluded --progress --exclude-from="$this_dir/excludes.txt" --link-dest="$backup_dir/$link_dest" -e ssh username@supernaut.info:~/ "$backup_dir/$backup_dest"
echo "******************"
echo "Backup complete"
echo "******************"
# info on the backup commands:
# -a --archive archive mode; same as -rlptgoD (no -H)
# -r --recursive recurse into directories
# -l --links copy symlinks as symlinks
# -p --perms preserve permissions
# -t --times preserve times
# -g --group preserve group
# -o --owner preserve owner (super-user only)
# -D same as --devices --specials
# --devices preserve device files (super-user only)
# --specials preserve special files
# -v --verbose increase verbosity - can increment for more detail i.e. -vv -vvv
# -z --compress compress file data during the transfer
# -c --checksum skip based on checksum, not mod-time & size – SLOWER
# -H --hard-links preserve hard links
# --delete delete extraneous files from dest dirs
# --delete-excluded also delete excluded files from dest dirs
# --progress show progress during transfer
# --exclude-from=FILE read exclude patterns from FILE – one file or folder per line
# --link-dest=DIR hardlink to files in DIR when unchanged – set as previous backup
# -e --rsh=COMMAND specify the remote shell to use – SSH
# -n --dry-run show what would have been transferred

Spellcheck the Shell Way

I was reading this awesome book (about which I shall soon blog) and there was this moment of, “Fark! What a brilliant line!” like I actually said that ’cos it was so good, followed by, “Fark! Spelling mistake of spacecraft’s name!” And I thought wouldn’t a good way to deal with spellchecking (besides my favourite cmd-;) be to take the entire text, do something fancy command-line to it, and output all the words alphabetically by frequency. Then you could just spellcheck that file, find the weird words, go back to the original document and correct the shit out of them. So I did. Brilliant!

# take a text and output all the words sorted by frequency
# spaces replaced with line breaks, everything lowercased, punctuation stripped except apostrophes (ascii \047)
# http://unix.stackexchange.com/questions/39039/get-text-file-word-occurrence-count-of-all-words-print-output-sorted
# http://tldp.org/LDP/abs/html/textproc.html
# http://donsnotes.com/tech/charsets/ascii.html
find . -name "foo.txt" -exec cat {} \; | tr ' ' '\012' | tr 'A-Z' 'a-z' | tr -cd '\012a-z0-9\047' | grep -v "^\s*$" | sort | uniq -c | sort -bnr
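The same pipeline run on an inline sample instead of a file (with the tr character sets written as plain ranges), just to show the shape of the output — each line is a count followed by a word, most frequent first:

```shell
#!/bin/sh
# words lowercased, punctuation stripped (apostrophes kept), counted, sorted by frequency
printf 'The cat sat. The cat ran.\n' \
  | tr ' ' '\012' | tr 'A-Z' 'a-z' | tr -cd '\012a-z0-9\047' \
  | grep -v "^\s*$" | sort | uniq -c | sort -bnr
```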