Gallery

VNS Matrix / Merchants of Slime

Live on June 30th, a digital archive for Australian cyberfeminist collective VNS Matrix / Merchants of Slime.

’90s-period CRT phosphor colours, monospace fonts, highly structured and interlinked data, emerging from over a year of conversations and work with the Merchants of Slime. Deep adoration for Web 1.0 aesthetics, sliding into contemporary possibilities for accessibility, interaction, responsiveness, and clarity.

By far the largest project I’ve undertaken, handling archival data management, utter masses of PHP, JS, and CSS, and teasing out over months the design, aesthetic, and movement through hundreds of pages and thousands of media files – all while trying to keep it properly accessible, semantic, responsive, logical, even simple, while the phosphor burns the screen.

Heaps big thanks to Virginia Barratt and VNS Matrix for going, “Yeah, Frances is what we want.” And hectic reps to research assistant Clare Bartholomaeus for all the scanning and cataloguing.

Image

Slime is Live, Cunt

Phosphor burn digital archaeology slime archive for the 21st century, cunt.

Done.

Seems that keeping 3000 posts and 10,000 images updated takes about half a blog lifetime.

I moved from Movable Type to WordPress in 2009, and ditched ecto, the old blogging app, around the same time. Over the years, I wrote SQL queries, grepped the hell out of the database, redesigned the whole website (while keeping the same black and white aesthetic), recoded stuff, wrote some hella shonky redirections, and slowly went through all the posts turning images into galleries and using WordPress’ Featured Image. Then I gave up on it all a couple of years ago, before getting weirdly ‘inspired’ this weekend and doing 1000+ posts over the course of two days.

My database queries tell me all the galleries are now correct, and all the single images too. A stupid amount of work I hope I never have to do again, though I know my singular, obsessive focus will do it anyway. Legit, my wrist is going, “WTF, Frances, WTfuckingF,” and if I keep blogging like this, eventually maintenance will take longer than there are days in a year.
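
For the curious, the checking queries ran along these lines (a sketch only, assuming WordPress’ standard wp_ table prefix and WP-CLI):

# a sketch of the kind of check I mean: published posts still carrying a
# [gallery] shortcode but no Featured Image set (assumes the wp_ prefix)
wp db query "SELECT ID, post_title FROM wp_posts
    WHERE post_type = 'post' AND post_status = 'publish'
    AND post_content LIKE '%[gallery%'
    AND ID NOT IN (SELECT post_id FROM wp_postmeta WHERE meta_key = '_thumbnail_id');"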

Image

“Fucking banger, you little cunt!”

There was this downhill in Narrm, High St in Glen Iris, a long dogleg leading down to the train crossing where, if the traffic was right, I’d hit 70km/h. Apparently the speed limit now is 60. Kinda sketchy, as the road leading in already gave some speed, and the first right bend had a weird camber that pushed towards the gutter. The only way to ride it fast was to apex across both lanes. That’s the easy way to get fast, though: downhill.

I’ve been doing a particular kind of training lately, one that I enjoy in that obviously “I’m suffering here” kinda way. Every time I pass a person walking, it’s out of the saddle for 10 revolutions. If I pass another person while I’m up, another 10. A maximum of 20, but there’s a couple of sections around the airport where I might only get a few seconds sat back down before I’m up again. The randomness of it appeals to me, much more than the strict “20 seconds on, 2 minutes off, repeat” kind of interval thing. I have Emma Pooley to thank for this. Today I decided to add in one all-out sprint per lap, about 250 metres of locked-in, high-cadence, heart-fucking pain. I like high cadence work, but it’s only very recently I’ve been doing it, and in the context of high speed it’s very new to me — as much as I love fast.

’Cos I’m a slight drama bitch about keeping details, I wrote this in my notes:

I hit 50.8km/h at 00:57:48, HR 187; HR peaked at 191 at 00:57:53, five seconds later, and stayed there until 00:58:02, a duration of nine seconds; by then, 13 seconds after my speed came off its peak, I’d dropped to 41km/h. I definitely felt those ten seconds, gasping for air and all. Anyway, fun times.

When I got home and saw I’d finally got over 50km/h, and no downhill assistance, I announced rather loudly, “Fucking banger, you little cunt!”

Status

After many years of supernaut images being tiny with ragged right background captions, I’ve been slowly ditching it for supernaut teenage Instabanga big images. And finally I cleared out all the old styles and code, made new medium and large image galleries, redid the styles and scripts repeatedly, said goodbye to funky bodges. Kept the blazing deep pink tho’. 😍🤘💥❌💯‼️

生日快乐!!! Happy 14th Birthday supernaut! 🤘🎂🎉🔥💥🖤

One day late celebration of supernaut’s 7th April birthday. Supernaut is fully a teenager now, emoji-ing and posting images like she thinks this is Instabang. Emile said supernaut is a life-project now.

SecreT(uring)ly

Georg, with whom I worked on co-writing The Station, asked me if I’d like to do another piece of co-writing with him, this time an opera libretto. I said yes (duh!). Last Friday, we had a three-way chat with Henry Vega, the composer, about Alan Turing, neural networks, science fiction, queer stuff, and all, for a sharp hour (Georg’s good like that with his one-hour meetings).

Today I spent a couple of hours (after some dipping of toes last night) installing TensorFlow-Char-RNN, “a character level language model using multilayer Recurrent Neural Network,” as made wildly lovable by Janelle Shane of Letting neural networks be weird. That involved installing TensorFlow itself. I went for the direct macOS approach (after toying with either a Vagrant VM or Docker container) of the Virtualenv flavour. Plus Python 3. And pip. Dependencies. We have them.
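
For the record, and as a sketch only (versions and the repo URL from memory, so check the respective READMEs), the Virtualenv flavour went roughly like:

# roughly the Virtualenv-flavoured macOS install; exact versions will differ
virtualenv --system-site-packages -p python3 ~/tensorflow
source ~/tensorflow/bin/activate
pip install --upgrade pip
pip install --upgrade tensorflow

# then the char-rnn code itself
git clone https://github.com/crazydonkey200/tensorflow-char-rnn.git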

A bit of faffing around, and out is spat a ‘Shakespeare’:

t ‘vkdwsa avf
neu irot rS
, mvuaeea giCsouo aed renat rs
;iiweszteseooiiWhe thrr l st !htt :hsre

I mean, I was expecting a single, long ‘aaaaaaaaaaaaaaaaaaaaaa’, so this was progress.

More faffing, fans to 6000rpm, CPU to 500%, and some short while later, ‘Shakespeare’!

Before we proceed any further,
Or each doth now foul branch with thy preser’d up
Young to devise me him;
But in my jewities rebeeve me to this,
Your soul than daggers and breeding
some abrother Arms
What will be pronound with a husband; he’s beauty much or a slaughter,
But I’ll wring my false find than how ill.

Nailed it.

Hacking & Bodging a Git Hook + Vagrant + WP-CLI + Bash Local to Dev Database Transfer

Ever since I started using Git to push local website changes to a development server, I’ve been vaguely irritated about dealing with the database in the same manner. For a long time, I used interconnect/it’s Search Replace DB for this side of things, but I was always wondering if I could somehow increase laziness time by automating the process. One hungover Sunday, plus a couple of hours on Monday, and one hacked and bodged success.

This isn’t a “How to do yer Git + Vagrant + local to dev” thing, nor is it a copy-paste, “Works for me!” party. Nonetheless, provided you’re using git-push, and are comfortable with WP-CLI or MySQL command-line, Bash, and generally thrashing small bits of code around, in principle it would work in any situation. And I do feel kinda dirty throwing all this into Git post-receive, but whatever, it works.

So, here’s what I wanted to do:

  1. Do all my commits, and run git-push and throw all the file changes to the dev server.
  2. Combine that with a dump of the database, then get it to the dev server.
  3. Use something at the other end to import the database, and search-replace strings.
  4. Clean up after at both ends.

When I first looked into this, it seemed using the pre-commit Git hook was the most common approach, dumping the database and adding it to the commit. I didn’t want to do this, for a few reasons: I do a lot of commits, and the majority have no database component; I wasn’t looking to version control the database; all I wanted to do was push the local database to dev along with the changed files. Looks like a job for the pre-push hook.
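
(For anyone playing along: a hook is nothing fancy, just an executable script inside the repo’s .git/hooks directory.)

# hooks are plain executable scripts in the repo's .git directory
touch .git/hooks/pre-push
chmod +x .git/hooks/pre-push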

Earlier this year, I started using Vagrant, so the first issue was how to dump the database from there. I do commits from the local folder, rather than SSH-ing into the VM, so mysqldump is not going to work without first getting into the VM. Which brought its own set of weirdnesses, and this was the point when I decided to flop over to WP-CLI, the WordPress command-line tool.
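
(Vagrant can at least run a one-off command in the VM without an interactive session. A sketch with made-up paths, assuming WP-CLI lives in the box and you’re in the Vagrantfile’s directory:)

# run a single command inside the vagrant box, no interactive session needed
vagrant ssh -c "cd /var/www/user/domain.tld && wp db export --add-drop-table database.sql"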

I often find solutions to this sort of thing are dependent on the combination of software and commands being used. I use mysqldump on its own all the time, but here I needed to use Git to set the path for where the database would be dumped to — because git hooks live in a sub-directory of the git folder — and that, in combination with dumping the database inside the VM while within a Git command running from the local folder (yeah, I probably should just do all my git via SSH), and hurling it at a remote server, means sometimes things that work in isolation get cranky. And this is a hack/bodge, so I went with:

  1. Set up paths for the database dump with Git, ’cos Git is running this show.
  2. SSH into the Vagrant box.
  3. WP-CLI dump the database to a gzipped file.
  4. SCP that up to the dev server.
  5. Delete all that on the local server, ’cos I’m tidy.

That’s half of it done. I’ve got my pushes working, the database file is up on the dev server, the local server is all cleaned up, so now it’s time for the other end.

In this case, I was doing it for a site on DreamHost, who conveniently give all kinds of fun command-line access, plus WP-CLI on their shared servers. Once Git has finished checking out the new file changes in post-receive, it’s time to, frankly, bodge it.

My current usual setup is a bare repository on the dev server, which checks out to the development website directory (there’s a sketch of that arrangement just after the list below). This means neither the uploaded database, nor WP-CLI and the WordPress root, are in the same place as the running hook. No big deal, just use --path=. The next thing, though, is cleaning up post-import. Strings to be changed all over the place, like local URLs swapped to dev. And for that we have wp search-replace, which is an awful lot like Search Replace DB. At the dev end then:

  1. Set up paths again, this time it’s WP-CLI running the show.
  2. Unzip the database then import it.
  3. Do database stuff like search-replace strings, and delete transients.
  4. Delete that uploaded database file on the dev server, ’cos I’m tidy.
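
The bare repo arrangement mentioned a paragraph back, sketched with the same made-up paths as the hooks below:

# on the dev server: a bare repository, which post-receive checks out to the site directory
git init --bare ~/.repo.git

# locally: add the dev server as a remote, then git push dev does the rest
git remote add dev user@domain.tld:.repo.git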

I was looking at all this late last night, all those repeating lines of ‘wp search-replace’ and I thought, “That looks like a job for an array.” Which led me down the tunnel of Bash arrays, associative arrays, “How can I actually do ‘blah’, ’cos bash seems to be kinda unwilling here?” and finally settling on not quite what I wanted, but does the job. Also, bash syntax always looks like it’s cursing and swearing.
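
For what it’s worth, the version I was circling (bash 4 does do associative arrays) looks something like this sketch, using the same pairs that turn up in the post-receive hook below:

# the associative array version, needs bash 4+; a sketch only
declare -A replacements=(
    ["local.domain.tld:8443"]="dev.domain.tld"
    ["local.domain.tld"]="dev.domain.tld"
    ["/var/www/user/"]="/home/user/"
)

# loop over the keys, handing each pair to wp search-replace
for from in "${!replacements[@]}"; do
    wp search-replace --all-tables --precise "$from" "${replacements[$from]}" --path="$WP_PATH"
done

The catch: associative arrays don’t guarantee iteration order, and here the :8443 string needs replacing before the plain domain, which may well be why the two parallel arrays won out.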

The pre-push hook:

#!/bin/sh

# a pre-push hook to dump the database to a folder in the repo's root directory, upload it to the dev server, then delete when finished

echo '***************************************************************'
echo 'preparing to back up database'
echo '***************************************************************'

# set up some variables, to keep things more readable later on
# backup_dir is relative to git hooks, i.e. 2 directories higher, so use git to set it

ROOT="$(git rev-parse --show-toplevel)"
BACKUP_DIR="$ROOT/.database"
DB_NAME="database"

# make sure the database backup directory exists, then cd into it

mkdir -p "$BACKUP_DIR"
cd "$BACKUP_DIR"

# cos this is vagrant, first ssh into it. there will be a password prompt
# using EOF to write the commands in bash, rather than in ssh quotation marks

ssh -t vagrant@172.17.0.10 << EOF

# cd to the database folder. this path is absolute, 'cos we're inside the VM now, not the local folder
# (it's the synced-folder view of BACKUP_DIR, so the dump lands in the local repo's .database too)
cd "/var/www/user/domain.tld/.database"

# then export the database with wp-cli and gzip it
wp db export --add-drop-table - | gzip -9 > $DB_NAME.sql.gz

# exit ssh
exit

# bail out of eof
EOF

# scp the backup directory and database to the dev server
scp -r "$BACKUP_DIR" user@domain.tld:~/

# remove the local backup directory so it's not clogging up git changes
rm -r "$BACKUP_DIR"

echo '***************************************************************'
echo 'all done, finishing up git push stuff'
echo '***************************************************************'

The post-receive hook:

#!/bin/bash
# bash rather than plain sh, 'cos the arrays further down need it

echo '***************************************************************'
echo 'post-receive is working. checking out pushed changes.'
echo '***************************************************************'

# check out the received changes from local to the dev site
git --work-tree=/home/user/dev.domain.tld --git-dir=/home/user/.repo.git checkout -f


# import the database with wp-cli
echo '***************************************************************'
echo 'starting database import'
echo '***************************************************************'

# setting up some paths
# on some webhosts, e.g. all-inkl, setting the alias to wp-cli.phar is required, uncomment and set if needed
# alias wp='/path/to/.wp-cli/wp-cli.phar'

# the path to wp-config, needed for wp-cli
WP_PATH="/home/user/dev.domain.tld/wordpress"
# database directory, created in git pre-push
DB_DIR="/home/user/.database"

# check there is a database directory
if [ -d "$DB_DIR" ]; then

	# then check it for sql.gz files
	DB_COUNT=$(ls -1 "$DB_DIR"/*.sql.gz 2>/dev/null | wc -l)

	# if there is exactly one database, proceed
	if [ "$DB_COUNT" -eq 1 ]; then

		# grab the db name from the file itself, so it isn't hardcoded
		DB_NAME=$(basename "$DB_DIR"/*.sql.gz)

		echo 'importing the database'
		echo '***************************************************************'

		# unzip the database, then import it with wp-cli
		gunzip < "$DB_DIR/$DB_NAME" | wp db import - --path="$WP_PATH"

		# clear the transients
		wp transient delete --all --path=$WP_PATH

		# run search replace on the main strings needing to be updated
		# make an array of strings to be searched for and replaced
		search=(
			"local.domain.tld:8443"
			"local.domain.tld"
			"/var/www/user/"
		)
		replace=(
			"dev.domain.tld"
			"dev.domain.tld"
			"/home/user/"
		)

		# loop through the arrays, handing each pair to wp search-replace
		for (( i=0; i < ${#search[@]}; ++i )); do
			wp search-replace --all-tables --precise "${search[i]}" "${replace[i]}" --path="$WP_PATH"
		done

		# any other wp-cli commands to run
		wp option update blogname "blog name" --path=$WP_PATH

		# delete the backup directory, so there's no junk lying around
		rm -rf "$DB_DIR"
	
	else
	
		echo 'database was not found'
		echo '***************************************************************'
	
	fi

else

	echo 'database folder was not found'
	echo '***************************************************************'

fi

echo '***************************************************************'
echo 'all done'
echo '***************************************************************'

What else? Dunno. It’s pretty rough, but it basically proves something I didn’t find combined into one example anywhere: you can use git hooks to push the database and file changes at the same time, and automate the local-to-dev database transfer process. Is this the best way to do it? Nah, it’s majorly bodgy, would have to be tailored for each server setup, and I’m not even sure doing such things in a git hook is advisable, even if it works. It does demonstrate that each step of the process can be automated — irrespective of how shonky your setup is — and that, provided you account for that and your own coding proclivities, there’s multiple ways of doing the same thing.

(edit, a day later.)
I decided to throw this into ‘production’, testing it on a development site I had to create on a webhost I’m not so familiar with, but who do provide the necessities (like SSH and Let’s Encrypt). Two things happened.

First, WP-CLI didn’t work at all in the post-receive script, even though it did if I ran commands directly in Terminal (or iTerm, as I’m currently using). After much messing about and trying a bunch of things, it turned out this was an issue of “has to be tailored for each server setup”, in this case adding an alias to wp-cli.phar.

Second, having a preference for over-compensation while automating, it occurred to me that I’d made some assumptions, like there’d only be one database file in the uploaded directory, and that hardcoding the filename — which was one of those “I’ll fix that later” things — had morphed into technical debt. So, feeling well competent in Bash today, I went for the “make sure there’s actually a database folder, then check there’s actually a sql.gz file in it, and only one of them, then get the name of that file, and use it as a variable” approach. I often wonder how much of this is too much, but trying to cover the more obvious possible bollocks seems reasonably sensible.

Both of these have been rolled into the code above. And as always, it occurs to me already there’s better — ‘better’ — ways to do this, like in pre-push, piping the database directly to the dev server with SSH, or simultaneously creating a separate, local database backup, or doing it all in SQL commands.
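
The piping version, say, would collapse the pre-push dump/scp/delete into one line. A sketch, same made-up hosts as above, run from inside the VM:

# export straight over SSH to the dev server, no intermediate file on either side
wp db export --add-drop-table - | gzip -9 | ssh user@domain.tld "mkdir -p ~/.database && cat > ~/.database/database.sql.gz"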