Hacking & Bodging a Git Hook + Vagrant + WP-CLI + Bash Local to Dev Database Transfer

Ever since I started using Git to push local website changes to a development server, I’ve been vaguely irritated about not being able to deal with the database in the same manner. For a long time, I used interconnect/it’s Search Replace DB for this side of things, but I was always wondering if I could increase my laziness by automating the process. One hungover Sunday, plus a couple of hours on Monday, later: one hacked and bodged success.

This isn’t a “How to do yer Git + Vagrant + local to dev” thing, nor is it a copy-paste, “Works for me!” party. Nonetheless, provided you’re using git-push, and are comfortable with WP-CLI or the MySQL command line, Bash, and generally thrashing small bits of code around, in principle it would work in any situation. And I do feel kinda dirty throwing all this into Git hooks, but whatever, it seems to work.

So, here’s what I wanted to do:

  1. Do all my commits, and run git-push and throw all the file changes to the dev server.
  2. Combine that with a dump of the database, then get it to the dev server.
  3. Use something at the other end to import the database and search-replace strings.
  4. Clean up after at both ends.

When I first looked into this, using the pre-commit Git hook seemed the most common approach, dumping the database and adding it to the commit. I didn’t want to do that, for a couple of reasons: I do a lot of commits, and the majority have no database component; I wasn’t looking to version control the database; all I wanted was to push the database from local to dev along with the changed files. Looks like a job for the pre-push hook.

Earlier this year, I started using Vagrant, so the first issue was how to dump the database from there. I do commits from the local folder, rather than SSH-ing into the VM, so mysqldump is not going to work without first getting into the VM. Which brought its own set of weirdnesses, and this was the point when I decided to flop over to WP-CLI, the WordPress command-line tool.
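Worth noting: Vagrant can run a one-off command inside the VM without an interactive session, via vagrant ssh -c. A minimal sketch of that alternative, with placeholder paths standing in for real ones:

# run from the local project folder; vagrant handles the ssh-ing
vagrant ssh -c 'cd /var/www/user/domain.tld && wp db export --add-drop-table - | gzip -9 > .database/database.sql.gz'

I ended up SSH-ing in directly with a heredoc (see the hook below), but run from the project folder this would probably do the same job.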

I often find solutions to this sort of thing are dependent on the combination of software and commands being used. I use mysqldump on its own all the time, but here I needed Git to set the path for where the database would be dumped — because git hooks live in a sub-directory of the git folder — and that, combined with dumping the database inside the VM from a Git command running in the local folder (yeah, probably should just do all my git via SSH), then hurling it at a remote server, means things that work in isolation sometimes get cranky together. And this is a hack/bodge, so I went with:

  1. Set up paths for the database dump with Git, ’cos Git is running this show.
  2. SSH into the Vagrant box.
  3. WP-CLI dump the database to a gzipped file.
  4. SCP that up to the dev server.
  5. Delete all that on the local server, ’cos I’m tidy.

That’s half of it done. I’ve got my pushes working, the database file is up on the dev server, the local server is all cleaned up, so now it’s time for the other end.

In this case, I was doing it for a site on DreamHost, who conveniently give all kinds of fun command-line access, plus WP-CLI on their shared servers. Once Git has finished checking out the new file changes in post-receive, it’s time for some frank bodging.

My current usual setup is a bare repository on the dev server, which checks out to the development website directory. This means neither the uploaded database, nor WP-CLI and the WordPress root, are in the same place as the running hook. No big deal, just use --path=. The next thing, though, is cleaning up post-import: strings to be changed all over the place, like local URLs swapped to dev. And for that we have wp search-replace, which is an awful lot like Search Replace DB (there’s a dry-run sketch for it after this list). At the dev end then:

  1. Set up paths again, this time it’s WP-CLI running the show.
  2. Unzip the database then import it.
  3. Do database stuff like search-replace strings, and delete transients.
  4. Delete that uploaded database file on the dev server, ’cos I’m tidy.
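A side note on wp search-replace: it has a --dry-run flag, which reports what it would change without touching the database. Worth running, along with a backup, before letting the automated version loose. Something like this, with placeholder domains and paths:

wp db export backup-before-replace.sql --path=/home/user/dev.domain.tld/wordpress
wp search-replace "local.domain.tld" "dev.domain.tld" --all-tables --dry-run --path=/home/user/dev.domain.tld/wordpress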

I was looking at all this late last night, all those repeating lines of ‘wp search-replace’, and I thought, “That looks like a job for an array.” Which led me down the tunnel of Bash arrays, associative arrays, “How can I actually do ‘blah’, ’cos Bash seems kinda unwilling here?”, and finally settling on not quite what I wanted, but something that does the job. Also, Bash syntax always looks like it’s cursing and swearing.
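For the curious, the associative array version I was originally after looks something like this, a sketch assuming Bash 4+ on the server (declare -A doesn’t exist in older Bashes):

declare -A replacements=(
	["local.domain.tld:8443"]="dev.domain.tld"
	["local.domain.tld"]="dev.domain.tld"
	["/var/www/user/"]="/home/user/"
)
# iterate over the keys, handing each pair to wp search-replace
for from in "${!replacements[@]}"; do
	wp search-replace --all-tables --precise "$from" "${replacements[$from]}" --path="$WP_PATH"
done

The catch: Bash doesn’t guarantee the iteration order of associative array keys, and order matters here. If “local.domain.tld” gets replaced before “local.domain.tld:8443”, the second one never matches anything. Hence the two parallel indexed arrays in the hook below, which keep the order explicit.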

The pre-push hook:

#!/bin/sh

# a pre-push hook to dump the database to a folder in the repo's root directory, upload it to the dev server, then delete when finished

echo '***************************************************************'
echo 'preparing to back up database'
echo '***************************************************************'

# set up some variables, to keep things more readable later on
# the hook runs from .git/hooks, two directories below the repo root, so use git to find the root

ROOT="$(git rev-parse --show-toplevel)"
BACKUP_DIR="$ROOT/.database"
DB_NAME="database"

# check there is a database backup directory, make it if it doesn't exist, then cd to it

mkdir -p "$BACKUP_DIR"
cd "$BACKUP_DIR" || exit 1

# cos this is vagrant, first ssh into it. there will be a password prompt
# using EOF to write the commands in bash, rather than in ssh quotation marks

ssh -t vagrant@172.17.0.10 << EOF

# cd to the database folder. this path is absolute, 'cos it's inside the vm, not the local folder
cd "/var/www/user/domain.tld/.database"

# then export the database with wp-cli and gzip it
wp db export --add-drop-table - | gzip -9 > "$DB_NAME.sql.gz"

# exit ssh
exit

# bail out of eof
EOF

# scp the backup directory and database to dev server
scp -r "$BACKUP_DIR" user@domain.tld:~/

# remove that backup directory so it's not clogging up git changes
rm -r "$BACKUP_DIR"

echo '***************************************************************'
echo 'all done, finishing up git push stuff'
echo '***************************************************************'

The post-receive hook:

#!/bin/bash
# bash, not sh: the search/replace arrays further down need it

echo '***************************************************************'
echo 'post-receive is working. checking out pushed changes.'
echo '***************************************************************'

# check out the received changes from local to the dev site
git --work-tree=/home/user/dev.domain.tld --git-dir=/home/user/.repo.git checkout -f


# import the database with wp-cli
echo '***************************************************************'
echo 'starting database import'
echo '***************************************************************'

# setting up some paths
# on some webhosts, e.g. all-inkl, setting the alias to wp-cli.phar is required, uncomment and set if needed
# alias wp='/path/to/.wp-cli/wp-cli.phar'

# the path to wp-config, needed for wp-cli
WP_PATH="/home/user/dev.domain.tld/wordpress"
# database directory, created in git pre-push
DB_DIR="/home/user/.database"

# check there is a database directory
if [ -d "$DB_DIR" ]; then

	# then check it for sql.gz files
	DB_COUNT=$(ls -1 "$DB_DIR"/*.sql.gz 2>/dev/null | wc -l)

	# if there is exactly 1 database, proceed
	if [ "$DB_COUNT" -eq 1 ]; then

		# grab the db name from the file, so it isn't hardcoded
		DB_NAME=$(basename "$DB_DIR"/*.sql.gz)

		echo 'importing the database'
		echo '***************************************************************'

		# unzip the database, then import it with wp-cli
		gunzip < "$DB_DIR/$DB_NAME" | wp db import - --path="$WP_PATH"

		# clear the transients
		wp transient delete --all --path="$WP_PATH"

		# run search replace on the main strings needing to be updated
		# make an array of strings to be searched for and replaced
		search=(
			"local.domain.tld:8443"
			"local.domain.tld"
			"/var/www/user/"
		)
		replace=(
			"dev.domain.tld"
			"dev.domain.tld"
			"/home/user/"
		)

		# loop through the arrays and hand each pair to wp search-replace
		for (( i=0; i < ${#search[@]}; ++i )); do
			wp search-replace --all-tables --precise "${search[i]}" "${replace[i]}" --path="$WP_PATH"
		done

		# any other wp-cli commands to run
		wp option update blogname "blog name" --path="$WP_PATH"

		# delete the backup directory, so there's no junk lying around
		rm -rf "$DB_DIR"
	
	else
	
		echo 'database was not found'
		echo '***************************************************************'
	
	fi

else

	echo 'database folder was not found'
	echo '***************************************************************'

fi

echo '***************************************************************'
echo 'all done'
echo '***************************************************************'

What else? Dunno. It’s pretty rough, but it basically proves something I couldn’t find a combined example of anywhere: you can use git hooks to push the database and the file changes at the same time, and automate the local-to-dev database transfer. Is this the best way to do it? Nah, it’s majorly bodgy, it would have to be tailored for each server setup, and I’m not even sure doing such things in a git hook is advisable, even if it works. It does demonstrate, though, that each step of the process can be automated — irrespective of how shonky your setup is — and provided you account for that and your own coding proclivities, there’s multiple ways of doing the same thing.

(edit, a day later.)
I decided to throw this into ‘production’, testing it on a development site I had to create on a webhost I’m not so familiar with, but who do provide the necessities (like SSH and Let’s Encrypt). Two things happened.

First, WP-CLI didn’t work at all in the post-receive script, even though it did when I ran commands directly in Terminal (or iTerm2, as I’m currently using). After much messing about and trying a bunch of things, it turned out to be a case of “has to be tailored for each server setup”, in this case adding an alias to wp-cli.phar.
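One gotcha if you hit the same thing: Bash doesn’t expand aliases in non-interactive scripts unless told to, so an alias on its own may not fix a hook. A sketch of both workarounds, with a made-up path to the phar:

# either switch alias expansion on before defining the alias
shopt -s expand_aliases
alias wp='php /path/to/.wp-cli/wp-cli.phar'

# or sidestep aliases entirely with a function
wp() { php /path/to/.wp-cli/wp-cli.phar "$@"; }

The function version is probably the safer of the two.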

Second, having a preference for over-compensation while automating, it occurred to me that I’d made some assumptions, like there’d only be one database file in the uploaded directory, and that hardcoding the filename (one of those “I’ll fix that later” things) had morphed into technical debt. So, feeling well competent in Bash today, I went for: make sure there’s actually a database folder, then check there’s actually a sql.gz file in it, and only one of them, then get the name of that file and use it as a variable. I often wonder how much of this is too much, but trying to cover the more obvious possible bollocks seems reasonably sensible.

Both of these have been rolled into the code above. And as always, it occurs to me already there’s better — ‘better’ — ways to do this, like in pre-push, piping the database directly to the dev server with SSH, or simultaneously creating a separate, local database backup, or doing it all in SQL commands.
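The direct-pipe version of pre-push would collapse the dump, upload, and local cleanup into a single line. A sketch, untested, assuming the .database directory already exists on the dev server:

# dump inside the vagrant box, gzip it, and stream it straight to the dev server
ssh vagrant@172.17.0.10 "cd /var/www/user/domain.tld && wp db export --add-drop-table - | gzip -9" | ssh user@domain.tld "cat > ~/.database/database.sql.gz"

Two password prompts instead of one, but nothing ever lands on the local disk.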


Gallery

Ars Electronica

Wednesday in Linz. I had to collect some requirements for S.J. Norman’s Rest Area, and pick up Kali Rose from the train station, all the way from Amsterdam to be the person in the bed in the van of Rest Area. I had an hour to kill, and had planned Sunday for mediæval art (did not happen), so went to Ars Electronica.

I’m rethinking my museum-ing, or at least, for the moment, not taking hundreds of photos then editing and blogging scores of them. These are simply things I liked and felt motivated enough to photograph. There’s so much in the museum, and much of it is temporal, interactive, and 3-dimensional; photography doesn’t serve these well.

I sent “Your unreadable text message to +43 664 1788374” for Stefan Tiefengraber’s your unerasable text. My phone autocorrected. It was pushed to the shredder then sat there, unshredded. The pink of the Biolab was so, so, very hot, fluorescent, candy, neon pink, rendered as something less than all those and fuchsia by my camera. Markus Reibe’s Protected Areas 2 is the closest thing to recognisable, non-interactive, 2-dimensional art I saw, on a wall documenting the history of what was digital / new media / computer art, and what I just call art these days. The atrium of Ars Electronica was bus yellow and grey cement. If I’d had time, I would have spent hours here taking hundreds of photos.

Image

A Bus Stop near Linz

An attempt at the vernacular of the photographic mundane: a prefab bus stop outside Linz on the way to Ottensheim, beside some of the saddest housing architecture I have yet seen.

Ottensheim–Linz Bus Stop

5-Character Dev Environment

Messing with my .bash_profile this afternoon, post-diving into Laravel and Git (which I’ve been doing much of for the last week), I realised I could boot my entire dev environment with 5 letters. Fewer, if I wanted.

So instead of going to the Dock and clicking each of the icons, then going to each app and faffing around, I could at least boot them all at once, and set off some commands in Terminal (or iTerm2, as I’m now using).

Weirdly, until Justine gave me an evening of command-line Git learning, and wanted my .bash_profile, “Like so,” I hadn’t realised you could do stuff like that, despite amusing myself with all manner of shell scripts. Now I know what’s possible, I’m over-achieving in efficient laziness.
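The whole thing is just a function in .bash_profile. Mine’s welded to my own setup, so here’s a hypothetical macOS-flavoured sketch with invented app and folder names rather than the real one:

# boot the dev apps, then bring up the vm
devup() {
	open -a iTerm
	open -a "Sequel Pro"
	open -a Transmit
	(cd ~/Sites/project && vagrant up)
}

Five characters, devup, and everything boots.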

What’s missing is:

  • Opening multiple windows in iTerm2 or Terminal and running a different command in each (I don’t want to boot multiple instances of an app).
  • Setting off a menu action in an opened app, e.g. in Transmit going to my work drive.
  • Extending it to boot the environment and then a specific project, e.g. “devup laravel” would open my Laravel installation in each of the apps, like opening the database in Sequel Pro, cd-ing to the Laravel folder after automatically SSH-ing into my Vagrant box, and so on.

Some of these are probably uncomplicated, but this was a 30-minute experiment that turned out to be very useful.

5-character dev environment

Image

Why, Yes, Chinese Grinding & Boring Equipment Company Spam Email, I am Interested.

Dear Lei Zhang of Fujian Nan’an Boreway Machinery Co., Ltd. in Shuitou Town, Nan’an City, Fujian Province, China. We have never met. But you emailed me on my blog email this morning. Your email was beautiful. I sadly have no use for boring and grinding heads as I sadly have no boring or grinding equipment. I wish I did. I saw the attached photograph of bush hammer rollers and thought to myself, “This is good and useful spam. All spam should be this good and useful. Imagine if it was, I would buy things.” So, thank you, Lei Zhang, you have done the unique. In all my years of receiving spam, in all those tens or hundreds of thousands of impersonal and unwanted emails, yours is the first that’s made me say, “I wish I had a need for this, I would buy immediately.” It’s true, if I had spare cash, I’d buy some bush hammer rollers just because they look brilliant — and I’ve never even seen them before. Because of your email, I learned something new, which is what is best in life. You must think I’m being sarcastic and mocking you, but I’m not. Yours is genuinely the best unsolicited email I’ve ever received.

Bush Hammer Rollers from Fujian Nan'an Boreway Machinery Co., Ltd.

Bookmark Archaeology

Aside

I was cleaning out my browser bookmarks last night, first time in years, bookmarks going back to the early-’00s, thousands of them. I opened them in batches, every one, to see if I wanted to keep them. Hundreds, thousands of dead sites, no longer found, no longer existing. All that history and culture vanished as if it never was, only the link and title in my bookmarks proving they once existed, and once I deleted that …

Code Stupidity

Aside

I got sick of the tiny, Web 1.0 images everywhere here, a hangover from the earliest days of supernaut, so I decided — ’cos I like visuality & pix — to make small, big. I thought it would be easy. Little did I know I’d also be creating and adding to the pile of Technical Debt. So: most single images from the recent past are now huge-ified, 666px wide; recent image galleries which are not full of diverse image ratios are now evenly splitting the Number of the Beast. Older images and galleries should retain their previous diminutiveness, but who knows, 13 years of blog is difficult to homogenise. Mostly I got distracted with how to stop portrait images blowing out of the available browser window space, which turned out to be a kinda traumatising process I didn’t achieve. Plus how to Lazy Load srcsets by preg_replacing the new WordPress caption shortcode. OMFG, Frances, WTF? All of which makes me think it might be time for yet another supernaut refresh. So much code. So many images. So much …

A Year Of My Heart

A year ago, I decided to get all analytic on my training. Mainly I just like tech and pretty representations of data. So I bought a heart rate sensor. And now it’s been a year of me using it almost every time I train. Which means I can look at a year in the life of Frances training, with all the … whatever that reveals.

What does it reveal, Frances?

Well, other Frances. I trained 156 times that I recorded — let’s say 170, because I pretty much did not train without it unless I forgot either sensor or phone. For a total of 190 hours — there’d be a few more in that for the times my phone battery died. For a measly distance of 1481 kilometres — of actual training rides, not including cross-town, Kreuzberg–Wedding type stuff, so maybe double that at least. No wonder I spend so much on my bike and it feels like it’s constantly in need of repair. Hey, just like me! (Wow, there’s a realisation, right there.) About 1/3 of that was ballet, another third cycling (mostly road at the moment, but some cyclocross), 1/6 bouldering, and the remaining 1/6 a mix of yoga and core training.

Oh, and supposedly I burned around 121,000 calories, which is about 60 days of eating 2000 calories a day. I’m not really convinced about this. I think it’s more of an imaginary number, and not the mathematical kind.

What else? Speed, both average and top are derived from iPhone GPS. I’m not sure how much dispersion there is in this, but I suspect it can easily be 5km/h or more in either direction. My next gear purchase (after … umm … new brakes and probably new rear derailleur pulley wheels) is a speed/cadence sensor — which probably means also a proper cycling head unit instead of phone …

I seem to unintentionally train in 9-10 week blocks, then give up in despair for a couple of weeks, then, like a goldfish circling its bowl, forget all that and get right back into it. Knowing that this might be my natural rhythm though, it could make sense to train in 9 week blocks with a week off, if for nothing else than keeping my enthusiasm. Also I doubt I’ve been training like that this year, my rhythm’s all over the place.

My maximum heart rate seems to be constant around 190 (excluding the huge jumps into the 200s that were either the battery going flat, the sensor getting jostled, or actual random heart weirdness from having stupid fun training in -10° weather). I dunno, I have no context or expertise for reading anything into these figures, other than I seem to like training if it involves a degree of discomfort and some suffering — which I didn’t need a heart rate sensor to tell me.

So, a year of data. What to do with it? No idea! Will I keep using it? For now, yes. It’s become automatic to put it on. I don’t really use it during training, though I’d use it for cycling if I could find an iPhone mount that could hold my ancient 4S. But mostly I do it on feel, and that corresponds pretty closely to the various heart rate zones. I do do regular post-training gawks, to compare how I felt with actual data — and knowing that data across sessions gives me a bit of a feeling for where I’m at on a particular day or week. And one other thing: I train a lot less than I think.

Worth it for seeing a year of training all pretty like that? Yup!

Polar Flow and H7 Heart Rate Sensor — One Year Weekly Training Report
Polar Flow and H7 Heart Rate Sensor — One Year Daily Training Report

Video

Field Series 1

Me messing around with mediæval art, Photoshopping it until it’s far from its origins three-quarters of a millennium ago. It started as a visit to the Gemäldegalerie, when I decided to do closeups of some of my favourite works. This is part of the Altarretabel in drei Abteilung mit dem Gnadenstuhl, from after 1250. Last night, feeling unexpectedly inspired around midnight, I realised I could mash another few score of layers into an image I was working on six months ago, and increase the density in ways that somehow appeal to my brain and eyes and emotions. I always zoom in on these images, like there’s myriad possible paintings in each. This time I took screenshots of those, and wanting to know what they might look like animated, threw them into Final Cut X and spat out 48 seconds of video.

I was asking myself if this is art. I know art and make art, but still. Maybe they’re sketches of possibilities. I like the artefacts generated from the process. I have no control over this. I have some control in which direction to push an image, but a lot of the detail is only minimally editable. Things happen, I make decisions, other things happen, possibilities open and close, I try and steer it towards a particular satisfaction, but each individual line and gradient and tone, no, that’s the software making its own decisions based on what I ask it to do. And as always, the further I get from using software as it was intended, the more interesting it becomes to me.