Gallery

MacBook Pro!

😱😭😍

Currently writing this on my old MacBook Pro, which has done 8 hard years under my fingers and has had the entire top case, keyboard, and battery replaced. And is running a tiny bash script which keeps the CPUs ticking over at a minimum of 5% so they don’t decide they’re asleep and randomly shut down.
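
(That script is nothing clever, by the way. Something along these lines, a sketch of the idea rather than the actual script: burn a little CPU in short bursts so the machine never sits idle long enough to fall over.)

#!/bin/sh
# sketch only: peg a core briefly, rest, repeat, so the CPUs never drop to zero
while true; do
	yes > /dev/null &
	BUSY_PID=$!
	sleep 1
	kill "$BUSY_PID"
	sleep 15
done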

New MacBook Pro is currently updating itself.

New MacBook Pro is a 16” M1 Pro with 32GB RAM and 1TB SSD. Which is kinda insane.

I took a few weeks last year to make a decision. More RAM? M1 Max? Bigger SSD? What combo of what? And most importantly, what colour? Which was Taner’s call. “You sure you want Space Grey?” Turns out I wasn’t. Silver it is.

I picked it up from Katrin and Taner a week ago, left it unboxed until yesterday when I had some calm, and now doing the massive migration.

Things I really like: Return of MagSafe! Which is Apple’s best invention. Braided power adaptor cable with plugs at both ends! I love braided cables (not sure how dirty it will get though), and hopefully it solves one of Apple’s biggest downers — the frayed cable at the MagSafe or adaptor ends — with a replaceable cable instead of trashing the whole thing. Return of the original TiBook blocky style. Tasty.

Obvious things I really like: Fuck me that’s a big, bright screen. Very different keyboard not requiring me to retrain my finger habits. Massive trackpad (requiring … finger habits). Touch ID. Which I don’t use for critical things like logging in but there’s plenty of other situations where it’s useful as fuck.

Things I have no idea about yet: How fast is it, actually? I went for 32GB of RAM over the M1 Max because my current laptop was almost top of the line back in late-2013, and 16GB of RAM has held up well (even with terribly coded ‘modern’ websites and browsers). The 500GB SSD always felt kinda tight when even my previous, 2008 MacBook Pro had a 1TB spinny drive. Battery life is supposed to be ‘up to 21 hours’, but realistically, getting a day’s work out of it without a recharge, or an evening’s series binge, would be awesome.

And the big thing: How many years is it good for? 8 years on my current one, and the main reason I bought a new one was I needed to get rid of some money. High on the list of other reasons were: it’s feeling old and a little creaky and slow, and it might decide to actually die at any time — or keep running for years. Dasniya has my 2008 laptop, which still does occasional work and is still running, slow but ok. Also high on the list were the ditching of the Touch Bar, the return of MagSafe, and the arrival of M1. But I’ve spent on average 400–500€ a year across 4 laptops since 2002, and while they last a lot longer now, they’re also a lot more expensive. So, is 8 years realistic? Especially on this whole new M1 hardware.

Somehow I’d like to not have to buy a new laptop again. I have no idea what my life might look like where I’m doing the work I enjoy and have no need for a laptop though. Being a poor who has very carefully balanced a shite household budget for decades means while I might be able to afford to pay for this, I never know if I’ll be able to afford having it. And yes, for anyone who thinks buying this does not equal poor: it’s a very large work expense, it’s one of a number of large expenses I had to choose between — including those trans expenses which I don’t know if I’ll ever fully cross off — and for most of the last decade I’ve been wearing the same clothes and cheap underwear. It’s beautiful and a little intimidating, and I’d love to enjoy it fully without worrying about how near-future me might be in a situation where I can’t afford the basics again.

Yah, so, obligatory unboxing photos, which are not especially aesthetic, but which I took as I was seeing it for the first time.

Hacking & Bodging a Git Hook + Vagrant + WP-CLI + Bash Local to Dev Database Transfer

Ever since I started using Git to push local website changes to a development server, I’ve been vaguely irritated about dealing with the database in the same manner. For a long time, I used interconnect/it’s Search Replace DB for this side of things, but I was always wondering if I could somehow increase laziness time by automating the process. One hungover Sunday, plus a couple of hours on Monday, and one hacked and bodged success.

This isn’t a “How to do yer Git + Vagrant + local to dev” thing, nor is it a copy-paste, “Works for me!” party. Nonetheless, provided you’re using git-push, and are comfortable with WP-CLI or MySQL command-line, Bash, and generally thrashing small bits of code around, in principle it would work in any situation. And I do feel kinda dirty throwing all this into Git post-receive, but whatever seems to work.

So, here’s what I wanted to do:

  1. Do all my commits, and run git-push and throw all the file changes to the dev server.
  2. Combine that with a dump of the database, then get it to the dev server.
  3. Use something at the other end to import the database, and search-replace strings.
  4. Clean up after at both ends.

When I first looked into this, it seemed using the pre-commit Git hook was the most common approach, dumping the database and adding it to the commit. I didn’t want to do this, for a couple of reasons: I do a lot of commits, and the majority have no database component; I wasn’t looking to version control the database; all I wanted to do was push the local database to dev along with the changed files. Looks like a job for the pre-push hook.

Earlier this year, I started using Vagrant, so the first issue was how to dump the database from there. I do commits from the local folder, rather than SSH-ing into the VM, so mysqldump is not going to work without first getting into the VM. Which brought its own set of weirdnesses, and this was the point when I decided to flop over to WP-CLI, the WordPress command-line tool.

I often find solutions to this sort of thing are dependent on the combination of software and commands being used. I use mysqldump on its own all the time, but here, I needed to use Git to set the path for where the database would be dumped to — because git hooks are in a sub-directory of the git folder — and that, in combination with dumping the database inside the VM while within a Git command running from the local folder (yeah, probably should just do all my git via SSH), and hurling it at a remote server, means sometimes things that work in isolation get cranky. And this is a hack/bodge, so I went with:

  1. Set up paths for the database dump with Git, ’cos Git is running this show.
  2. SSH into the Vagrant box.
  3. WP-CLI dump the database to a gzipped file.
  4. SCP that up to the dev server.
  5. Delete all that on the local server, ’cos I’m tidy.

That’s half of it done. I’ve got my pushes working, the database file is up on the dev server, the local server is all cleaned up, so now it’s time for the other end.

In this case, I was doing it for a site on DreamHost, who conveniently give all kinds of fun command-line access, plus WP-CLI on their shared servers. Once Git has finished checking out the new file changes in post-receive, it’s time for frankly bodging it.

My current usual setup is a bare repository on the dev server, which checks out to the development website directory. This means neither the uploaded database, nor WP-CLI and the WordPress root are in the same place as the running hook. No big deal, just use --path=. The next thing, though, is cleaning up post-import. Strings to be changed all over the place, like local URLs swapped to dev. And for that we have wp search-replace, which is an awful lot like Search Replace DB. At the dev end then:

  1. Set up paths again, this time it’s WP-CLI running the show.
  2. Unzip the database then import it.
  3. Do database stuff like search-replace strings, and delete transients.
  4. Delete that uploaded database file on the dev server, ’cos I’m tidy.

I was looking at all this late last night, all those repeating lines of ‘wp search-replace’, and I thought, “That looks like a job for an array.” Which led me down the tunnel of Bash arrays, associative arrays, “How can I actually do ‘blah’, ’cos bash seems to be kinda unwilling here?”, and finally settling on not quite what I wanted, but something that does the job. Also, bash syntax always looks like it’s cursing and swearing.
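
For the record, the associative array version I wanted looks something like this (same placeholder domains and $WP_PATH as the post-receive hook below). It’s a sketch, not what’s in the hooks, partly because it needs Bash 4+, and partly because the keys don’t come back in a guaranteed order, which matters here when one search string is a substring of another:

# needs bash 4+ for declare -A; keys are the search strings, values the replacements
declare -A replacements=(
	["local.domain.tld:8443"]="dev.domain.tld"
	["local.domain.tld"]="dev.domain.tld"
	["/var/www/user/"]="/home/user/"
)
for search in "${!replacements[@]}"; do
	wp search-replace --all-tables --precise "$search" "${replacements[$search]}" --path="$WP_PATH"
done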

The pre-push hook:

#!/bin/sh

# a pre-push hook to dump the database to a folder in the repo's root directory, upload it to the dev server, then delete when finished

echo '***************************************************************'
echo 'preparing to back up database'
echo '***************************************************************'

# set up some variables, to keep things more readable later on
# backup_dir is relative to git hooks, i.e. 2 directories higher, so use git to set it

ROOT="$(git rev-parse --show-toplevel)"
BACKUP_DIR="$ROOT/.database"
DB_NAME="database"

# check there is a database backup directory, make it if it doesn't exist then cd to it

if [ ! -d "$BACKUP_DIR" ]; then
mkdir "$BACKUP_DIR"
cd "$BACKUP_DIR"
else
cd "$BACKUP_DIR"
fi

# cos this is vagrant, first ssh into it. there will be a password prompt
# using EOF to write the commands in bash, rather than in ssh quotation marks

ssh -t vagrant@172.17.0.10 << EOF

# cd to the new databases folder. this is absolute, cos is vm and not local folder
cd "/var/www/user/domain.tld/.database" 

# then export the database with wp-cli and gzip it
wp db export --add-drop-table - | gzip -9 > $DB_NAME.sql.gz

# exit ssh
exit

# bail out of eof
EOF

# scp the backup directory and database to dev server
scp -r "$BACKUP_DIR" user@domain.tld:~/

# remove that backup directory so it's not clogging up git changes
rm -r "$BACKUP_DIR"

echo '***************************************************************'
echo 'all done, finishing up git push stuff'
echo '***************************************************************'

The post-receive hook:

#!/bin/bash

echo '***************************************************************'
echo 'post-receive is working. checking out pushed changes.'
echo '***************************************************************'

# check out the received changes from local to the dev site
git --work-tree=/home/user/dev.domain.tld  --git-dir=/home/user/.repo.git checkout -f


# import the database with wp-cli
echo '***************************************************************'
echo 'starting database import'
echo '***************************************************************'

# setting up some paths
# on some webhosts, e.g. all-inkl, setting the alias to wp-cli.phar is required, uncomment and set if needed
# alias wp='/path/to/.wp-cli/wp-cli.phar'

# the path to wp-config, needed for wp-cli
WP_PATH="/home/user/dev.domain.tld/wordpress"
# database directory, created in git pre-push
DB_DIR="/home/user/.database"

# check there is a database directory
if [ -d "$DB_DIR" ]; then

	# then check it for sql.gz files
	DB_COUNT=`ls -1 $DB_DIR/*.sql.gz 2>/dev/null | wc -l` 

	# if there is exactly 1 database, proceed
	if [ "$DB_COUNT" -eq 1 ]; then

		#grab the db name, this way the db name isn't hardcoded
		DB_NAME=$(basename $DB_DIR/*)

		echo 'importing the database'
		echo '***************************************************************'

		# unzip the database, then import it with wp-cli
		gunzip < $DB_DIR/$DB_NAME | wp db import - --path=$WP_PATH

		# clear the transients
		wp transient delete --all --path=$WP_PATH

		# run search replace on the main strings needing to be updated
		# make an array of strings to be searched for and replaced
		search=(
			"local.domain.tld:8443"
			"local.domain.tld"
			"/var/www/user/"
		)
		replace=(
			"dev.domain.tld"
			"dev.domain.tld"
			"/home/user/"
		)

		#loop through the array and spit it into wp search-replace
		for (( i=0; i < ${#search[@]}; ++i )); do
			wp search-replace --all-tables --precise "${search[i]}" "${replace[i]}" --path="$WP_PATH"
		done

		# any other wp-cli commands to run
		wp option update blogname "blog name" --path=$WP_PATH

		# delete the backup directory, so there's no junk lying around
		rm -rf $DB_DIR
	
	else
	
		echo 'database was not found'
		echo '***************************************************************'
	
	fi

else

	echo 'database folder was not found'
	echo '***************************************************************'

fi

echo '***************************************************************'
echo 'all done'
echo '***************************************************************'

What else? Dunno. It’s pretty rough, but basically proves something I didn’t find an example of all combined into one: that you can use git hooks to push the database and file changes at the same time, and automate the local-to-dev database transfer process. Is this the best way to do it? Nah, it’s majorly bodgy, and would have to be tailored for each server setup, and I’m not even sure doing such things in a git hook is advisable, even if it works. It does demonstrate that each step of the process can be automated — irrespective of how shonky your setup is — and provided you account for that and your own coding proclivities, there’s multiple ways of doing the same thing.

(edit, a day later.)
I decided to throw this into ‘production’, testing it on a development site I had to create on a webhost I’m not so familiar with, but who do provide the necessities (like SSH and Let’s Encrypt). Two things happened.

First, WP-CLI didn’t work at all in the post-receive script, even while it did if I ran commands directly in Terminal (or iTerm, as I’m currently using). After much messing about and trying a bunch of things, it turned out that this was an issue of “has to be tailored for each server setup”, in this case adding an alias to wp-cli.phar.

Second, having a preference for over-compensation while automating, it occurred to me that I’d made some assumptions, like there’d only be one database file in the uploaded directory, and that hardcoding the filename — which was one of those “I’ll fix that later” things — had morphed into technical debt. So, feeling well competent in Bash today, I went for the “make sure there’s actually a database folder, then check there’s actually a sql.gz file in it, and there’s only one of them, then get the name of that file, and use it as a variable” approach. I often wonder how much of this is too much, but trying to cover the more obvious possible bollocks seems reasonably sensible.

Both of these have been rolled into the code above. And as always, it occurs to me already there’s better — ‘better’ — ways to do this, like in pre-push, piping the database directly to the dev server with SSH, or simultaneously creating a separate, local database backup, or doing it all in SQL commands.
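
For example, the pipe-it-straight-through version of the dump would be something like this single line in pre-push, skipping the local file entirely (same placeholder hosts and paths as above, and untested):

# dump and gzip inside the Vagrant box, send it over ssh straight to the dev server
ssh vagrant@172.17.0.10 "cd /var/www/user/domain.tld && wp db export --add-drop-table - | gzip -9" \
	| ssh user@domain.tld "mkdir -p ~/.database && cat > ~/.database/database.sql.gz"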

5-Character Dev Environment

Messing with my .bash_profile this afternoon, post-diving into Laravel and Git (which I’ve been doing much of the last week), I realised I could boot my entire dev environment with 5 letters. Fewer, if I wanted.

So instead of going to the Dock, clicking each of the icons, going to each and faffing around, I could at least boot them all, and set off some commands in Terminal (or iTerm2, as I’m now using).

Weirdly, until Justine gave me an evening of command-line Git learning, and wanted my .bash_profile, “Like so,” I hadn’t realised you could do stuff like that, despite amusing myself with all manner of shell scripts. Now I know what’s possible, I’m over-achieving in efficient laziness.
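
The .bash_profile bit is roughly this shape (a sketch only: the apps are the ones I use, the project path is a stand-in for whatever your environment needs):

devup() {
	# open the GUI apps; if they're already running this just brings them forward
	open -a "iTerm"
	open -a "Transmit"
	open -a "Sequel Pro"
	# placeholder project path: boot the Vagrant box too
	(cd ~/Sites/project && vagrant up)
}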

What’s missing is:

  • Opening multiple windows in iTerm or Terminal and running a different command in each (I don’t want to boot multiple instances of an app).
  • Setting off a menu action in an opened app, e.g. in Transmit going to my work drive.
  • Extending it to boot the environment and then a specific project, e.g. “devup laravel” would open my laravel installation in each of the apps, like opening the database in Sequel Pro; cd-ing to the laravel folder after automatic SSH-ing into my Vagrant box, and so on.

Some of these are probably uncomplicated, but this was a 30-minute experiment that turned out to be very useful.

A Year Of My Heart

A year ago, I decided to get all analytic on my training. Mainly I just like tech and pretty representations of data. So I bought a heart rate sensor. And now it’s been a year of me using it almost every time I train. Which means I can look at a year in the life of Frances training, with all the … whatever that reveals.

What does it reveal, Frances?

Well, other Frances. I trained 156 times — that I recorded; let’s say 170, because I pretty much did not train without it unless I forgot either sensor or phone. For a total of 190 hours — there’d be a few more in that for the times my phone battery died. For a measly distance of 1481 kilometres — of actual training rides, not including cross-town, Kreuzberg-Wedding type stuff, so maybe double that at least; no wonder I spend so much on my bike and it feels like it’s constantly in need of repair. Hey, just like me! (Wow, there’s a realisation, right there.) About 1/3 of that was ballet, another third cycling (mostly road at the moment, but some cyclocross), 1/6 bouldering, and the remaining 1/6 a mix of yoga and core training.

Oh, and supposedly I burned around 121,000 calories, which is about 60 days of eating 2000 calories a day. I’m not really convinced about this. I think it’s more of an imaginary number, and not the mathematical kind.

What else? Speed, both average and top, is derived from iPhone GPS. I’m not sure how much dispersion there is in this, but I suspect it can easily be 5km/h or more in either direction. My next gear purchase (after … umm … new brakes and probably new rear derailleur pulley wheels) is a speed/cadence sensor — which probably also means a proper cycling head unit instead of my phone …

I seem to unintentionally train in 9–10 week blocks, then give up in despair for a couple of weeks, then, like a goldfish circling its bowl, forget all that and get right back into it. Knowing that this might be my natural rhythm though, it could make sense to train in 9-week blocks with a week off, if for nothing else than keeping my enthusiasm. Also, I doubt I’ve been training like that this year; my rhythm’s all over the place.

My maximum heart rate seems to be constant at around 190 (excluding the huge jumps into the 200s that were either the battery going flat, the sensor getting jostled, or actual random heart weirdness from having stupid fun training in -10° weather). I dunno, I have no context or expertise for reading anything into these figures, other than I seem to like training if it involves a degree of discomfort and some suffering — which I didn’t need a heart rate sensor to tell me.

So, a year of data. What to do with it? No idea! Will I keep using it? For now, yes. It’s become automatic to put it on. I don’t really use it during training, though I’d use it for cycling if I could find an iPhone mount that could hold my ancient 4S. But mostly I do it on feel, and that corresponds pretty closely to the various heart rate zones. I do do regular post-training gawks, to compare how I felt with actual data — and knowing that data across sessions gives me a bit of a feeling for where I’m at on a particular day or week. And one other thing: I train a lot less than I think.

Worth it for seeing a year of training all pretty like that? Yup!

Website rsync Backups the Time Machine Way

Continuing my recent rash of stupid coding, after Spellcheck the Shell Way, I decided for Website rsync Backups the Time Machine Way.

For a few years now, I’ve been using a bash script I bodged together that does incremental-ish backups of my websites using the rather formidable rsync. This week I’ve been working for maschinentempel.de, helping get frohstoff.de’s WooCommerce shop from Trabant to Hoonage. Which required repeated backing up of the entire site and database, and made me realise the shoddiness of my original backup script.

I thought, “Wouldn’t it be awesome, instead of having to make those stupid ‘backup.blah’ folders, to let the script create a time-stamped folder like Time Machine for each backup, and use the most recent backup as the hard-link destination for rsync?” Fukken wouldn’t it, eh?

Creating time-stamped folders was easy. Using the most recent backup folder — which has the most recent date, and in standard list view on my Mac, is the last folder in the list — was a little trickier. Especially because once a new folder was created to back up into, the previously most recent one was now second to last. tail and head feels hilariously bodgy, but works? Of course it does.

Bare bones explaining: The script needs to be in a folder with another folder called ‘backups’, and a text file called ‘excludes.txt’. It needs to be given chmod +x to make it executable, and generally can be re-bodged to work on any server you can ssh into. Much faster, more reliable, increased laziness, time-stamped server backups.

#!/bin/sh
# ---------------------------------------------------------------
# A script to manually back up your entire website
# Backup will include everything from the user directory up
# excludes.txt lists files and folders not backed up
# Subsequent backups only download changes, but each folder is a complete backup
# ---------------------------------------------------------------
# get the folder we're in
this_dir="$(dirname "$0")"
# set the folder in that to backup into
backup_dir="$this_dir/backups"
# cd to that folder
echo "******************"
echo "cd-ing to $backup_dir"
echo "******************"
cd "$backup_dir" || exit 1
# make a new folder with timestamp
time_stamp=$(date +%Y-%m-%d-%H%M%S)
mkdir "$backup_dir/${backuppath}supernaut-${time_stamp}"
echo "created backup folder: supernaut-${time_stamp}"
echo "******************"
# set link destination for hard links to previous backup
# this gets the last two folders (including the one just made)
# and then the first of those, which is the most recent backup
link_dest=`ls | tail -2 | head -n 1`
echo "hardlink destination: $link_dest"
echo "******************"
# set rsync backup destination to the folder we just made
backup_dest=`ls | tail -1`
echo "backup destination: $backup_dest"
echo "******************"
# run rsync to do the backup via ssh with passwordless login
rsync -avvzc --hard-links --delete --delete-excluded --progress --exclude-from="$this_dir/excludes.txt" --link-dest="$backup_dir/$link_dest" -e ssh username@supernaut.info:~/ "$backup_dir/$backup_dest"
echo "******************"
echo "Backup complete"
echo "******************"
#------------------------------------------------
# info on the backup commands:
# -a --archive archive mode; same as -rlptgoD (no -H)
# -r --recursive recurse into directories
# -l --links copy symlinks as symlinks
# -p --perms preserve permissions
# -t --times preserve times
# -g --group preserve group
# -o --owner preserve owner (super-user only)
# -D same as --devices --specials
# --devices preserve device files (super-user only)
# --specials preserve special files
# -v --verbose increase verbosity - can increment for more detail i.e. -vv -vvv
# -z --compress compress file data during the transfer
# -c --checksum skip based on checksum, not mod-time & size – SLOWER
# -H --hard-links preserve hard links
# --delete delete extraneous files from dest dirs
# --delete-excluded also delete excluded files from dest dirs
# --progress show progress during transfer
# --exclude-from=FILE read exclude patterns from FILE – one file or folder per line
# --link-dest=DIR hardlink to files in DIR when unchanged – set as previous backup
# -e --rsh=COMMAND specify the remote shell to use – SSH
# -n --dry-run show what would have been transferred
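
And the excludes.txt it reads from is just one rsync exclude pattern per line, something like this (example patterns only, not what I actually exclude):

# excludes.txt: one file or folder pattern per line
logs/
tmp/
cache/
.DS_Store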

Ballet & Tech (A First Attempt)

Me (on and off for the last couple of years): “It would be awesome to have a power meter or something so I can go all data on my training…”

Twitter:

Has any dancer ever measured a performance with a fitbit or pedometer? How many steps? How far do they dance? PLEASE will someone do this?

Me (in Jo Siska’s ballet class on Wednesday): “OMG Jo! Look! Data!”

Inaccurate data. But that’s what this is, a test of how to get meaningful and accurate(-ish) data on what goes on when I’m dancing.

When I was living in Wedding, part of my training routine was morning cyclocross rides in the forest around Flughafen Tegel. Last year when I inherited an (old, 4s) iPhone and stuck Trails app on it, I started to see what the intangible feeling of each ride represented. A couple of things were missing though, one of which I finally prodded myself to buy this week – a Polar H7 heart rate sensor (yeah, I got the pink strap). The other is one of those crazy expenses I’m unlikely to throw euros at unless I have around four thousand of them spare for a new bike: a power meter.

Power meters tend to be the province of bike crank arms, pedals, or hubs and cost about double what normal people spend on a whole bike. And none of them are objects you can take into a dance studio. Slightly getting there is the rpm2 shoe insert power meter, still no good for dance though. Which leaves the very new Stryd – and very cheap, not much more than a Fitbit (which I’ll get to later), and about the same size as the H7 – a power meter for runners.

Before all that, Wednesday. In the studio with my heart sensor on and my iPhone beside the barre, cos it uses Bluetooth to sync. That’s several problems right there. First, doing ballet (or generally dance) training with an iPhone lodged somewhere is not so practical, which means a pedometer is going to count exactly zero steps. Second, Bluetooth is possessive, it likes quasi-line-of-sight and proximity. Bouncing around ten metres down the studio with the heart monitor facing away from it is going to generate some highly improvised heart rate info. If, for the sake of science, I slip my iPhone into my trackie pocket, I’ll get pedometer info, but any GPS-based data capture (speed, distance, location) is comically useless, having an accuracy worse than 4 metres. I was dumping my heart info into Trails, which is a fine app for cycling training, and much of the time it had my location not even in the same building, plus my altitude changed by 24 metres.

Thursday on my morning training ride around Tempelhoferfeld, I used both Trails and Polar’s Polar Beat. The data resolution of both is pretty good, Polar Beat is more fine-grained, and neither had a problem with my phone being in the back pocket of my jersey. I’ve been doing enough cycling with data recording to know what looks right.

Which leads me to Fitbit, cos my flatmate has one. It stores the data locally so no need for a live Bluetooth connection. It does heart rate, pedometer, a bunch of other useful garbage, makes pretty data, syncs to phone, laptop, or to fitbit.com, and looks like a dainty watch strap.

So, Friday, ballet again. This time with a Fitbit and my H7 going to Polar Beat.

I’m siding with Fitbit when they say their data accuracy decreases outside fairly limited activities: both heart monitor and step counter are dependent on arms not windmilling for acquisition of useful data. Perhaps it requires repeated use to find the best spot on my wrist, but compared to the H7, Fitbit reported my average heart rate at ~20bpm less – I stuck fingers to neck and what the H7 shows is a good match. As for steps – and ignoring the first 18 minutes or so where I have no idea who it thought I was – it gave around 250 for the entire 40 minutes of barre, and 2200 for the class; obviously not counting a pas de bourrée as three steps.

The H7 doesn’t do step counting – unless you pair it with their walnut-sized Stride Sensor somehow affixed to your foot. Its heart data, though, is magical. You can see every exercise through the class mirrored in my increased heart rate, and check out the centre adage starting at 40 minutes, where the curve is almost identical for both times, and the arc through the entire class, building intensity in small stages at the barre until peaking through the centre into longer and longer periods of maximum effort, before révérence-ing out. I can also look at sections, so if I select just the centre, then my average heart rate goes up to 167 and only once drops below 120. Lots of good data you can do stuff with. (And I can even assign training to Ballet, with a fancy Olympic-looking arabesque!)

But what about power? Or other stuff? Stryd for the power (and heart rate), and RunScribe for everything else? Would they even handle dancing? RunScribe would be awesome for visualising the mechanics of dancing, g-force, velocity, ground contact time, pronation – if it could handle the foot chaos. And then what to do with all this information? If it’s all just for a bit of woohoo! then Fitbit and its social network gamification of sleeping is fine. But if it’s for the purpose of improving performance, technique, being more diligent in how you train, that’s a whole other thing.

Gallery

Tanami Track & Desert

Somehow I got from trying to find my way across Berlin to several hours traipsing up the Tanami Track and across the desert in South Australia and the Northern Territory. Along the way I found a couple of impact craters, marvelled at the astoundingly and diversely complex geological processes across central Australia, followed dry, braided rivers to their inland deltas, seasonal lakes and waterways, found airport runways, a scrawl of tracks, trails, roads, paths that faded in and out, cattle stations, groups of houses, mines, diggings, scratchings, was amazed at the quantity of signs of human existence in the blankness, more amazed still by the utter beauty of the land, realised it looked a lot like my favourite kind of art and some of the stuff I was doing a while ago, and I was better just to take screenshots than a paintbrush, also that I am unlikely to ever see this land from the ground, and to see it like this, from surveillance satellites mapping the planet down to metre-resolution is something I’ll never experience.

Well, That’s Disappointing

I’d been waiting for my new MacBook Pro for weeks, and yesterday it finally arrived. I was rehearsing in the evening, so decided to postpone the unpacking until today, because I wanted to photograph it like I did six years ago, and because I was planning to install everything anew. Today being lazy, it was around 4:30pm before I got organised.

Half an hour later, just when I’m pulling the wrapper off the thing itself, getting ready to plug it in, I discover a fucking dent on the bottom case. A fucking dent. On my new, 2947.00€ MacBook Pro. I couldn’t believe it. I almost said, “aw, fuck it, whatever,” and then thought, “Wait, I just spent three thousand euros on something that I expect to last me six years, I’ve already chosen to ignore the slightly dirty cable packaging and the definitely not clean bottom case, and now a fucking dent? Fuck this.”

So I packed it back up, put the lid back on the box and called BetterWorx. Oops. Closed. Wait ’til Monday. I’m not even going to speculate on how a new laptop can arrive buggered, but it’s going back and I’m fucked if I’m accepting this one.

Mavericks? I would have called it 10.9 Anarchists myself

Mainly it was because iCal and Address Book lose that utterly vile skeuomorphic stitched leather look, and also realising my afternoon was slightly free, and I’d downloaded the 5.31GB of OS X 10.9 Mavericks, and I was dead lusty for all the new stuff, so 40 minutes later or so I was booting my venerable (but definitely alive) 2008 MacBook Pro into the first non-cat OS in 12 1/2 years. (Actually a bit more, because I was messing around with pre-release versions even before that.)

And that was easy, wasn’t it? (Besides needing to reindex my Mail which caused 20 or so emails from years ago to try and send themselves until I mashed the ‘off’ button for Wi-Fi). Tabs in the Finder? Nice! Not sure I’ll use Tags, mainly due to having a decade of junk on my laptop already organised. iCal’s new look and the Day view are especially pleasing (though adding notes is still not entirely possible with keyboard). Safari’s Inspector has been given a new set of clothes. iBooks! Awesome! Really brilliant that it’s finally on Macs. And Maps! My short play with it hasn’t revealed whether it can replace my current map choice for tracking my training rides, but the 3D view of around here makes the trees look like Krynoids from Doctor Who: Seeds of Doom.

Important stuff like Apache, PHP, and MySQL worked almost immediately: the first just needing its httpd-vhosts.conf file updated; the second needing httpd.conf edited to load the PHP module; and the last working without a problem. And that is the easiest setup for my localhost environment ever.

I also had to buy Little Snitch, as my old version no longer worked, but considering how much I use it, it’s €30 very well-spent – especially considering 10.9 was free. Oh, and iWork, Aperture, and a bunch of other stuff also updated. Pity I can’t afford one of the new MacBook Pros.

New Computer Logic Board!

Oh indeed it’s going mental around here with overuse of the word, ‘New’.

A couple of weeks ago my nearly-four-year-old laptop went completely black when I dragged an image to be edited. Panic! High blood pressure! Gnashing of teeth! Somehow it decided to restart after half a day, and I of course began my usual assault on the internet to find out wtf was that wtf.

“macbookpro4,1 black screen” gave some pretty definitive answers, and as soon as I checked I had the dreaded NVIDIA GeForce 8600 GT graphics card, I was pretty sure I was staring “ugly immediate future” in the mouth.

One large Mac shop near Technische Universität was distinctly unhelpful (unlike last year when my top case gave out), and said my laptop wasn’t covered as I had a blaaaahblllaaaa … (won’t dwell on the minutiae). So I resorted to running the fan on high, which seemed to be the only solution for temporarily deferring inevitable logic board death while I a) tried to get through a stack of work and b) alternately pondered what to do or whined.

Being not one who likes to lose an argument when the price of doing so is about half that of a new laptop, I searched for precedents. Excitement, yes. Found them and called Apple just as my screen gave out again, on the very last day of the four-year period Apple would cover replacement for free, and received a definitive, “Yes, my supervisor says you’re covered.”

All of which found me on Tuesday in BetterWorx, really hoping that previous definitive wouldn’t be graced by a conditional. And I’ve seldom been so happy to see my laptop fall apart when tested as I was when they ran the graphics test and it threw back the NVIDIA failure.

More agonising as I was informed I might not make the deadline and …

Friday arrived and so did my laptop, ahead of the deadline by a scant 8 hours. And!

New 500GB hard drive installed (been sitting on my shelf since before the floods in Thailand doubled the price). Wonky latch also fixed!

I spent last night doing a completely fresh install (ten hours of trawling through all my forgotten system tweaks that suddenly and glaringly weren’t there), and now hoping nothing more goes askew (could do with a new battery some time), and I get at least the rest of the year out of this poor, clobbered thing (pulled an enormous dustball out from under the command key when I noticed the backlight was a little askew).

(This is a thank you also to everyone at BetterWorx.)