Gallery

Gala & Michael Headcasts Portraits

Three Australians in Wuppertal, by way of Brussels, Madrid, and Berlin. Last time I was in Wuppertal it was for The Vase, one of three banging works I’ve seen this year. This time, Friday evening, it’s snowing to whiteout, Gala and Michael are talking about the headcasts they’ve had done for their upcoming work, New People. They want photos. Guess who brought their camera? Saturday morning, after a lazy breakfast and before lunch hamburgers, still snowing, the falling-apart printer’s workshops behind Michael’s apartment having their roofless concrete floors jackhammered by the owner, one of those old socialist tradie types who ends up with a bunch of properties and maintains them all himself. It’s proper winter cold, slush and snow and wetness, and he’s hauling shit around like Sisyphus. We bail into the one building with a roof. Milky glass-paned, rusting windows along one wall fill it with just enough light for us to get away with photography. There’s a temporary scaffolding floor erected, we tall ones are nearly smacking our heads on bits of pipe and beam. Their busts go on the ground, then on a plank, I photograph them like I would mediæval art.

Image

Inadvertently 3-Dimensionalising Gala & Michael

Completely due to a tiny shuffle of one foot in a decrepit former book printer’s workshop out the back of Michael’s apartment (where the owner was totally having at it on a Saturday morning with jackhammers into concrete in the snow and freezing slush), I managed to do one of those mad meme-y gifs that look like they’re 3-dimensional. Tick-tock, back and forth, convinces me every time.

Cie. OFEN, Gala Moody & Michael Carter headcasts
Cie. OFEN, Gala Moody & Michael Carter headcasts

Showcase Beat Le Mot — Super Collider, at HAU2

Second show of the week, and the one I’m seeing first: Showcase Beat Le Mot’s Super Collider at HAU2, with Dasniya Sommer doing rope stuff.

Showcase Beat Le Mot — Super Collider
HAU Hebbel am Ufer, Berlin (HAU2)

  • Sun 10.12.2017, 19:00
  • Mon 11.12.2017, 19:00
  • Tue 12.12.2017, 19:00
  • Wed 13.12.2017, 19:00

Laser pointers write “Game Over” in the sky. The performers of Showcase Beat Le Mot do not want to waste any electricity on apocalyptic resignation but instead to expand the frame of discourse to include outer space. That’s why in their new performance they’re sending danced messages and mumbled formulas into the outskirts of the universe. “True, beautiful and good were sought, skewed, quirky and helpful were found.” The antennas for the extra-terrestrial messages are switched from receiving to sending. This is not documentary theatre about the powerlessness to act in a surging reality but an experiment that works like a utopian slingshot. The stage setting is the message, the audience the amplifier. Space acts instead of fake facts. Always alongside trouble. Welcome to the half-truth about everything.

Concept, Space, Text, Realisation: Showcase Beat Le Mot
Shibari: Dasniya Sommer
Sound/Music: Sebastian Meissner
Costumes: Clemens Leander
Video: Alexej Tschernij
Light: Klaus Dust
Movements: Nina Kronjäger
Stage construction: Jörg Fischer
Production: Olaf Nachtwey & Johanna J. Thomas
Thanks to: Alexander Djuric and Etel Adnan

Showcase Beat Le Mot — Super Collider
Showcase Beat Le Mot — Super Collider

Isabelle Schad — Double Portrait, & Turning Solo, at HAU3

Two shows by and with friends this week: Isabelle Schad’s première at HAU of Turning Solo with the brilliant Naïma Ferré, and Double Portrait, both of which I saw in June showings in Isabelle’s studio in Wiesenburg.

Dear friends and colleagues,

We would like to invite you to the Berlin Premiere / world premiere of the new pieces Double Portrait and Turning Solo by Isabelle Schad at HAU Hebbel am Ufer, Berlin.

We would be very happy to see you.

  1. Berlin premiere / world premiere
    • Friday, 15.12.2017, 19:00, HAU Hebbel am Ufer Berlin (HAU3)
  2. More performance dates:
    • 16.12.2017, 19:00
    • 17.12.2017, 19:00
    • 18.12.2017, 19:00

With Double Portrait and Turning Solo, Isabelle Schad continues a series of works which attempt to create distinct and personal portraits through a purely physical approach, moulding respective rhythms and energies into choreographed experiences.

Double Portrait — the portrait for Przemek Kaminski and Nir Vidan — seeks to form a solo for two persons with their bodies, movements and subjective rhythms. Each of them finds his prolongation in the other. In changing interdependencies a shared space defines self and other, intimacy and care, colliding forces and violence creating a web of connectivities. The work plays with aspects of Francis Bacon’s paintings, their complexity in visual rhythm, their intensity and immediacy.

Turning Solo — the portrait for Naïma Ferré — is based on her fascination with spinning for long periods. This whirling practice is brought into dialogue with Schad’s research around axial and weight shift, around inner movement material and its extension into the world, around energetic fields that characterise oneself and others. Little by little an initially minimalist study in movement becomes a shimmering jewel, a rotating sculpture, the choreographic portrait of a dancer.

Credits Double Portrait
concept and choreography: Isabelle Schad / co-choreography and performance: Przemek Kaminski, Nir Vidan / dramaturgical support: Saša Božić / sound: Damir Šimunović / lighting: Bruno Pocheron / stage: Isabelle Schad, Bruno Pocheron, Charlotte Pistorius, Thomas Henriksson (painting) / costumes: Charlotte Pistorius / head of production: Heiko Schramm / production defacto: Andrea Remetin

Credits Turning Solo
concept and choreography: Isabelle Schad / co-choreography and performance: Naïma Ferré / dramaturgical support: Saša Božić / sound: Damir Šimunović / lighting: Bruno Pocheron & Emese Csornai / costumes: Charlotte Pistorius / head of production: Heiko Schramm

Isabelle Schad — Turning Solo
Isabelle Schad — Turning Solo
Isabelle Schad — Double Portrait
Isabelle Schad — Double Portrait

Gallery

ICE Bahn Wuppertal nach Berlin

The long train from Wuppertal to Berlin. Snow, lightless, proper winter arriving.

Gallery

Schwebebahn, Snow, ICE Wuppertal nach Berlin

Schwebebahn is still the best Bahn.

Hacking & Bodging a Git Hook + Vagrant + WP-CLI + Bash Local to Dev Database Transfer

Ever since I started using Git to push local website changes to a development server, I’ve been vaguely irritated about dealing with the database in the same manner. For a long time, I used interconnect/it’s Search Replace DB for this side of things, but I was always wondering if I could somehow increase laziness time by automating the process. One hungover Sunday, plus a couple of hours on Monday, and one hacked and bodged success.

This isn’t a “How to do yer Git + Vagrant + local to dev” thing, nor is it a copy-paste, “Works for me!” party. Nonetheless, provided you’re using git-push, and are comfortable with WP-CLI or MySQL command-line, Bash, and generally thrashing small bits of code around, in principle it would work in any situation. And I do feel kinda dirty throwing all this into Git post-receive, but whatever seems to work.

So, here’s what I wanted to do:

  1. Do all my commits, and run git-push and throw all the file changes to the dev server.
  2. Combine that with a dump of the database, then get it to the dev server.
  3. Use something at the other end to import the database, and search-replace strings.
  4. Clean up after at both ends.

When I first looked into this, it seemed using the pre-commit Git hook was the most common approach, dumping the database and adding it to the commit. I didn’t want to do this, for a couple of reasons: I do a lot of commits, and the majority have no database component; I wasn’t looking to version control the database; all I wanted was to push the local database to dev along with the changed files. Looks like a job for the pre-push hook.
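For anyone following along, a git hook is nothing magic: just an executable script at a known path inside .git/hooks/, which Git runs at the matching point in its lifecycle. A minimal sketch of wiring up a pre-push stub, done in a scratch directory here so the snippet is self-contained:

```shell
# a git hook is just an executable script at a well-known path.
# sketched in a scratch directory so it can be run anywhere; in a
# real repo you'd already be inside one
repo="$(mktemp -d)"
mkdir -p "$repo/.git/hooks"

# write a stub pre-push hook and mark it executable; git runs this
# script before every push, and a non-zero exit aborts the push
cat > "$repo/.git/hooks/pre-push" <<'HOOK'
#!/bin/sh
echo "pre-push: dumping and uploading the database"
HOOK
chmod +x "$repo/.git/hooks/pre-push"
```

The real hook below replaces the echo with the database dump and upload.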

Earlier this year, I started using Vagrant, so the first issue was how to dump the database from there. I do commits from the local folder, rather than SSH-ing into the VM, so mysqldump is not going to work without first getting into the VM. Which brought its own set of weirdnesses, and this was the point when I decided to flop over to WP-CLI, the WordPress command-line tool.

I often find solutions to this sort of thing are dependent on the particular combination of software and commands being used. I use mysqldump on its own all the time, but here I needed Git to set the path for where the database would be dumped — because git hooks live in a sub-directory of the git folder — and that, in combination with dumping the database inside the VM while within a Git command running from the local folder (yeah, I probably should just do all my git via SSH), and hurling it at a remote server, means sometimes things that work in isolation get cranky. And this is a hack/bodge, so I went with:

  1. Set up paths for the database dump with Git, ’cos Git is running this show.
  2. SSH into the Vagrant box.
  3. WP-CLI dump the database to a gzipped file.
  4. SCP that up to the dev server.
  5. Delete all that on the local server, ’cos I’m tidy.
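The SSH step works by feeding the VM its commands via a heredoc, and one detail makes that work: with an unquoted EOF delimiter, variables are expanded by the local shell before the lines ever reach the VM, which is how a locally-set $DB_NAME ends up in the remote command. A minimal simulation, with a plain `sh` standing in for the ssh session:

```shell
# with an unquoted heredoc delimiter (<< EOF rather than << 'EOF'),
# $DB_NAME is expanded by the *local* shell before the lines are
# handed to the child shell, here standing in for ssh vagrant@...
DB_NAME="database"
output="$(sh << EOF
echo "dumping to $DB_NAME.sql.gz"
EOF
)"
```

Quoting the delimiter (`<< 'EOF'`) would flip the behaviour, leaving expansion to the remote side instead.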

That’s half of it done. I’ve got my pushes working, the database file is up on the dev server, the local server is all cleaned up, so now it’s time for the other end.

In this case, I was doing it for a site on DreamHost, who conveniently give all kinds of fun command-line access, plus WP-CLI on their shared servers. Once Git has finished checking out the new file changes in post-receive, it’s time for frankly bodging it.

My current usual setup is a bare repository on the dev server, which checks out to the development website directory. This means neither the uploaded database, nor WP-CLI and the WordPress root, are in the same place as the running hook. No big deal, just use --path=. The next thing, though, is cleaning up post-import. Strings to be changed all over the place, like local URLs swapped to dev. And for that we have wp search-replace, which is an awful lot like Search Replace DB. At the dev end then:

  1. Set up paths again, this time it’s WP-CLI running the show.
  2. Unzip the database then import it.
  3. Do database stuff like search-replace strings, and delete transients.
  4. Delete that uploaded database file on the dev server, ’cos I’m tidy.

I was looking at all this late last night, all those repeating lines of ‘wp search-replace’ and I thought, “That looks like a job for an array.” Which led me down the tunnel of Bash arrays, associative arrays, “How can I actually do ‘blah’, ’cos bash seems to be kinda unwilling here?” and finally settling on not quite what I wanted, but does the job. Also, bash syntax always looks like it’s cursing and swearing.
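Stripped of the wp call, the parallel-array pattern I settled on looks like this, with printf standing in for wp search-replace so the loop can be seen working (the strings are the hypothetical local-to-dev swaps):

```shell
# two parallel arrays, indexed together: search[i] gets replaced by
# replace[i]. values are hypothetical local -> dev swaps
search=( "local.domain.tld:8443" "local.domain.tld" "/var/www/user/" )
replace=( "dev.domain.tld"       "dev.domain.tld"   "/home/user/" )

# walk the arrays by index; printf stands in for the real command,
# which would be: wp search-replace "${search[i]}" "${replace[i]}"
for (( i=0; i < ${#search[@]}; ++i )); do
  printf 'wp search-replace "%s" "%s"\n' "${search[i]}" "${replace[i]}"
done
```

Bash has no real two-column data structure, so paired arrays driven by a C-style index loop are about as close as it gets without an associative array keyed on the search string.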

The pre-push hook:

#!/bin/sh

# a pre-push hook to dump the database to a folder in the repo's root directory, upload it to the dev server, then delete when finished

echo '***************************************************************'
echo 'preparing to back up database'
echo '***************************************************************'

# set up some variables, to keep things more readable later on
# backup_dir is relative to git hooks, i.e. 2 directories higher, so use git to set it

ROOT="$(git rev-parse --show-toplevel)"
BACKUP_DIR="$ROOT/.database"
DB_NAME="database"

# make the backup directory if it doesn't exist, then cd into it

mkdir -p "$BACKUP_DIR"
cd "$BACKUP_DIR" || exit 1

# cos this is vagrant, first ssh into it. there will be a password prompt
# using EOF to write the commands in bash, rather than in ssh quotation marks

ssh -t vagrant@172.17.0.10 << EOF

# cd to the new databases folder. this is absolute, cos it's the vm, not the local folder
cd "/var/www/user/domain.tld/.database"

# then export the database with wp-cli and gzip it
wp db export --add-drop-table - | gzip -9 > $DB_NAME.sql.gz

# exit ssh
exit

# bail out of eof
EOF

# scp the backup directory and database to dev server
scp -r "$BACKUP_DIR" user@domain.tld:~/

# remove that backup directory so it's not clogging up git changes
rm -r "$BACKUP_DIR"

echo '***************************************************************'
echo 'all done, finishing up git push stuff'
echo '***************************************************************'

The post-receive hook:

#!/bin/bash
# bash rather than sh: arrays are used further down

echo '***************************************************************'
echo 'post-receive is working. checking out pushed changes.'
echo '***************************************************************'

# check out the received changes from local to the dev site
git --work-tree=/home/user/dev.domain.tld  --git-dir=/home/user/.repo.git checkout -f


# import the database with wp-cli
echo '***************************************************************'
echo 'starting database import'
echo '***************************************************************'

# setting up some paths
# on some webhosts, e.g. all-inkl, setting the alias to wp-cli.phar is required, uncomment and set if needed
# alias wp='/path/to/.wp-cli/wp-cli.phar'

# the path to wp-config, needed for wp-cli
WP_PATH="/home/user/dev.domain.tld/wordpress"
# database directory, created in git pre-push
DB_DIR="/home/user/.database"

# check there is a database directory
if [ -d "$DB_DIR" ]; then

	# then check it for sql.gz files
	DB_COUNT=$(ls -1 "$DB_DIR"/*.sql.gz 2>/dev/null | wc -l)

	# if there is exactly 1 database, proceed
	if [ "$DB_COUNT" -eq 1 ]; then

		# grab the db name, this way the db name isn't hardcoded
		DB_NAME=$(basename "$DB_DIR"/*)

		echo 'importing the database'
		echo '***************************************************************'

		# unzip the database, then import it with wp-cli
		gunzip < "$DB_DIR/$DB_NAME" | wp db import - --path="$WP_PATH"

		# clear the transients
		wp transient delete --all --path="$WP_PATH"

		# run search replace on the main strings needing to be updated
		# make an array of strings to be searched for and replaced
		search=(
			"local.domain.tld:8443"
			"local.domain.tld"
			"/var/www/user/"
		)
		replace=(
			"dev.domain.tld"
			"dev.domain.tld"
			"/home/user/"
		)

		# loop through the arrays and hand each pair to wp search-replace
		for (( i=0; i < ${#search[@]}; ++i )); do
			wp search-replace --all-tables --precise "${search[i]}" "${replace[i]}" --path="$WP_PATH"
		done

		# any other wp-cli commands to run
		wp option update blogname "blog name" --path="$WP_PATH"

		# delete the backup directory, so there's no junk lying around
		rm -rf "$DB_DIR"
	
	else
	
		echo 'database was not found'
		echo '***************************************************************'
	
	fi

else

	echo 'database folder was not found'
	echo '***************************************************************'

fi

echo '***************************************************************'
echo 'all done'
echo '***************************************************************'

What else? Dunno. It’s pretty rough, but it basically proves something I didn’t find an example of all combined into one: you can use git hooks to push the database and file changes at the same time, and automate the local-to-dev database transfer. Is this the best way to do it? Nah, it’s majorly bodgy, would have to be tailored for each server setup, and I’m not even sure doing such things in a git hook is advisable, even if it works. It does demonstrate that each step of the process can be automated — irrespective of how shonky your setup is — and provided you account for that and your own coding proclivities, there are multiple ways of doing the same thing.

(Edit, a day later.)
I decided to throw this into ‘production’, testing it on a development site I had to create on a webhost I’m not so familiar with, but which does provide the necessities (like SSH and Let’s Encrypt). Two things happened.

First, WP-CLI didn’t work at all in the post-receive script, even though it did if I ran commands directly in Terminal (or iTerm, as I’m currently using). After much messing about and trying a bunch of things, it turned out this was an issue of “has to be tailored for each server setup”, in this case adding an alias to wp-cli.phar.
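One caveat worth flagging here: bash doesn’t expand aliases in non-interactive scripts unless `shopt -s expand_aliases` is set, so depending on the shell the hook runs under, an alias may silently do nothing. A shell function does the same job more dependably. Sketched with a stand-in echo rather than the real phar:

```shell
# aliases are not expanded in non-interactive shells by default, so in
# a hook script a function is the more reliable way to point at
# wp-cli.phar. the echo is a stand-in; the real function body would be
# something like: php /path/to/wp-cli.phar "$@"
wp() {
  echo "wp-cli called with: $*"
}
result="$(wp db import -)"
```

Every later `wp ...` line in the hook then goes through the function unchanged.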

Second, having a preference for over-compensation while automating, it occurred to me that I’d made some assumptions, like there’d only be one database file in the uploaded directory, and that hardcoding the filename (one of those “I’ll fix that later” things) had morphed into technical debt. So, feeling well competent in Bash today, I went for the “make sure there’s actually a database folder, then check there’s actually a sql.gz file in it, and only one of them, then get the name of that file and use it as a variable” approach. I often wonder how much of this is too much, but trying to cover the more obvious possible bollocks seems reasonably sensible.
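That “exactly one sql.gz” check can also be done without parsing ls output at all, by letting a bash glob build an array and counting its elements. A sketch, using a scratch directory so it runs anywhere (nullglob makes an unmatched pattern expand to nothing rather than the literal string):

```shell
# scratch directory with a single fake dump in it
db_dir="$(mktemp -d)"
touch "$db_dir/database.sql.gz"

# with nullglob set, an unmatched glob expands to zero words, so the
# array length is an accurate file count (0, 1, or more)
shopt -s nullglob
dumps=( "$db_dir"/*.sql.gz )
count="${#dumps[@]}"
```

With that, `[ "$count" -eq 1 ]` replaces the ls-and-wc pipeline, and `"${dumps[0]}"` is already the full path to the dump.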

Both of these have been rolled into the code above. And as always, it occurs to me already there’s better — ‘better’ — ways to do this, like in pre-push, piping the database directly to the dev server with SSH, or simultaneously creating a separate, local database backup, or doing it all in SQL commands.
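That first alternative, piping the dump straight over SSH with no temp file on either side, would look something like the comment in the sketch below; here the ssh leg is simulated with a local redirect so the gzip pipeline itself can be verified end to end (hosts and paths are hypothetical):

```shell
# the real version would be something like:
#   wp db export --add-drop-table - | gzip -9 | ssh user@domain.tld 'cat > ~/.database/database.sql.gz'
# simulated: a one-line stand-in dump, gzipped into a scratch directory
dump_dir="$(mktemp -d)"
printf 'DROP TABLE IF EXISTS wp_options;\n' \
  | gzip -9 \
  > "$dump_dir/database.sql.gz"

# confirm the compressed file round-trips back to the original text
roundtrip="$(gunzip < "$dump_dir/database.sql.gz")"
```

The trade-off is that the push then stalls on the network for the whole dump, whereas the scp version at least keeps the dump step local.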

Git Hook + Vagrant + WP-CLI + Bash Local to Dev Database Transfer