Hacking & Bodging a Git Hook + Vagrant + WP-CLI + Bash Local to Dev Database Transfer

Ever since I started using Git to push local website changes to a development server, I’ve been vaguely irritated about dealing with the database in the same manner. For a long time, I used interconnect/it’s Search Replace DB for this side of things, but I was always wondering if I could somehow increase laziness time by automating the process. One hungover Sunday, plus a couple of hours on Monday, and one hacked and bodged success.

This isn’t a “How to do yer Git + Vagrant + local to dev” thing, nor is it a copy-paste, “Works for me!” party. Nonetheless, provided you’re using git-push, and are comfortable with WP-CLI or MySQL command-line, Bash, and generally thrashing small bits of code around, in principle it would work in any situation. And I do feel kinda dirty throwing all this into Git post-receive, but whatever seems to work.

So, here’s what I wanted to do:

  1. Do all my commits, and run git-push and throw all the file changes to the dev server.
  2. Combine that with a dump of the database, then get it to the dev server.
  3. Use something at the other end to import the database, and search-replace strings.
  4. Clean up after at both ends.

When I first looked into this, it seemed using the pre-commit Git hook was the most common approach, dumping the database and adding it to the commit. I didn’t want to do this, for a couple of reasons: I do a lot of commits, and the majority have no database component; I wasn’t looking to version-control the database; all I wanted was to push the local database to dev along with the changed files. Looks like a job for the pre-push hook.

Earlier this year, I started using Vagrant, so the first issue was how to dump the database from there. I do commits from the local folder, rather than SSH-ing into the VM, so mysqldump is not going to work without first getting into the VM. Which brought its own set of weirdnesses, and this was the point when I decided to flop over to WP-CLI, the WordPress command-line tool.

I often find solutions to this sort of thing are dependent on the combination of software and commands being used. I use mysqldump on its own all the time, but here, I needed to use Git to set the path for where the database would be dumped to — because Git hooks are in a sub-directory of the git folder — and that, in combination with dumping the database inside the VM while within a Git command running from the local folder (yeah, probably should just do all my git via SSH), and hurling it at a remote server, means sometimes things that work in isolation get cranky. And this is a hack/bodge, so I went with:

  1. Set up paths for the database dump with Git, ’cos Git is running this show.
  2. SSH into the Vagrant box.
  3. WP-CLI dump the database to a gzipped file.
  4. SCP that up to the dev server.
  5. Delete all that on the local server, ’cos I’m tidy.

That’s half of it done. I’ve got my pushes working, the database file is up on the dev server, the local server is all cleaned up, so now it’s time for the other end.

In this case, I was doing it for a site on DreamHost, who conveniently give all kinds of fun command-line access, plus WP-CLI on their shared servers. Once Git has finished checking out the new file changes in post-receive, it’s time for frankly bodging it.

My current usual setup is a bare repository on the dev server, which checks out to the development website directory. This means neither the uploaded database, nor WP-CLI and the WordPress root are in the same place as the running hook. No big deal, just use --path=. The next thing though, is cleaning up post-import. Strings to be changed all over the place, like local URLs swapped to dev. And for that we have wp search-replace, which is an awful lot like Search Replace DB. At the dev end then:

  1. Set up paths again, this time it’s WP-CLI running the show.
  2. Unzip the database then import it.
  3. Do database stuff like search-replace strings, and delete transients.
  4. Delete that uploaded database file on the dev server, ’cos I’m tidy.

I was looking at all this late last night, all those repeating lines of ‘wp search-replace’ and I thought, “That looks like a job for an array.” Which led me down the tunnel of Bash arrays, associative arrays, “How can I actually do ‘blah’, ’cos bash seems to be kinda unwilling here?” and finally settling on not quite what I wanted, but does the job. Also, bash syntax always looks like it’s cursing and swearing.
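
For the record, this is roughly the associative-array version I was angling for; a hypothetical sketch that needs Bash 4 or later, and reuses the same strings and $WP_PATH as the post-receive hook further down:

# hypothetical sketch: one associative array of search => replace pairs (needs bash 4+)
declare -A replacements=(
	["local.domain.tld:8443"]="dev.domain.tld"
	["local.domain.tld"]="dev.domain.tld"
	["/var/www/user/"]="/home/user/"
)
for search in "${!replacements[@]}"; do
	wp search-replace --all-tables --precise "$search" "${replacements[$search]}" --path="$WP_PATH"
done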

The pre-push hook:

#!/bin/sh

# a pre-push hook to dump the database to a folder in the repo's root directory, upload it to the dev server, then delete when finished

echo '***************************************************************'
echo 'preparing to back up database'
echo '***************************************************************'

# set up some variables, to keep things more readable later on
# backup_dir is relative to git hooks, i.e. 2 directories higher, so use git to set it

ROOT="$(git rev-parse --show-toplevel)"
BACKUP_DIR="$ROOT/.database"
DB_NAME="database"

# check there is a database backup directory, make it if it doesn't exist then cd to it

if [ ! -d "$BACKUP_DIR" ]; then
	mkdir "$BACKUP_DIR"
fi
cd "$BACKUP_DIR" || exit 1

# cos this is vagrant, first ssh into it. there will be a password prompt
# using EOF to write the commands in bash, rather than in ssh quotation marks

ssh -t vagrant@172.17.0.10 << EOF

# cd to the new databases folder. this is absolute, cos is vm and not local folder
cd "/var/www/user/domain.tld/.database" 

# then export the database with wp-cli and gzip it
wp db export --add-drop-table - | gzip -9 > $DB_NAME.sql.gz

# exit ssh
exit

# bail out of eof
EOF

# scp the backup directory and database to dev server
scp -r "$BACKUP_DIR" user@domain.tld:~/

# remove that backup directory so it's not clogging up git changes
rm -r "$BACKUP_DIR"

echo '***************************************************************'
echo 'all done, finishing up git push stuff'
echo '***************************************************************'

The post-receive hook:

#!/bin/bash

echo '***************************************************************'
echo 'post-receive is working. checking out pushed changes.'
echo '***************************************************************'

# check out the received changes from local to the dev site
git --work-tree=/home/user/dev.domain.tld  --git-dir=/home/user/.repo.git checkout -f


# import the database with wp-cli
echo '***************************************************************'
echo 'starting database import'
echo '***************************************************************'

# setting up some paths
# on some webhosts, e.g. all-inkl, setting the alias to wp-cli.phar is required, uncomment and set if needed
# alias wp='/path/to/.wp-cli/wp-cli.phar'

# the path to wp-config, needed for wp-cli
WP_PATH="/home/user/dev.domain.tld/wordpress"
# database directory, created in git pre-push
DB_DIR="/home/user/.database"

# check there is a database directory
if [ -d "$DB_DIR" ]; then

	# then check it for sql.gz files
	DB_COUNT=$(ls -1 "$DB_DIR"/*.sql.gz 2>/dev/null | wc -l)

	# if there is exactly 1 database, proceed
	if [ "$DB_COUNT" -eq 1 ]; then

		#grab the db name, this way the db name isn't hardcoded
		DB_NAME=$(basename "$DB_DIR"/*.sql.gz)

		echo 'importing the database'
		echo '***************************************************************'

		# unzip the database, then import it with wp-cli
		gunzip < $DB_DIR/$DB_NAME | wp db import - --path=$WP_PATH

		# clear the transients
		wp transient delete --all --path=$WP_PATH

		# run search replace on the main strings needing to be updated
		# make an array of strings to be searched for and replaced
		search=(
			"local.domain.tld:8443"
			"local.domain.tld"
			"/var/www/user/"
		)
		replace=(
			"dev.domain.tld"
			"dev.domain.tld"
			"/home/user/"
		)

		#loop through the array and spit it into wp search-replace
		for (( i=0; i < ${#search[@]}; ++i )); do
			wp search-replace --all-tables --precise "${search[i]}" "${replace[i]}" --path="$WP_PATH"
		done

		# any other wp-cli commands to run
		wp option update blogname "blog name" --path=$WP_PATH

		# delete the backup directory, so there's no junk lying around
		rm -rf $DB_DIR
	
	else
	
		echo 'database was not found'
		echo '***************************************************************'
	
	fi

else

	echo 'database folder was not found'
	echo '***************************************************************'

fi

echo '***************************************************************'
echo 'all done'
echo '***************************************************************'

What else? Dunno. It’s pretty rough, but basically proves something I didn’t find an example of all combined into one: that you can use git hooks to push the database and file changes at the same time, and automate the local-to-dev database transfer process. Is this the best way to do it? Nah, it’s majorly bodgy, and would have to be tailored for each server setup, and I’m not even sure doing such things in a git hook is advisable, even if it works. It does demonstrate that each step of the process can be automated — irrespective of how shonky your setup is — and provided you account for that and your own coding proclivities, there’s multiple ways of doing the same thing.

(edit, a day later.)
I decided to throw this into ‘production’, testing it on a development site I had to create on a webhost I’m not so familiar with, but one which does provide the necessities (like SSH and Let’s Encrypt). Two things happened.

First, WP-CLI didn’t work at all in the post-receive script, even though it did when I ran commands directly in Terminal (or iTerm2, as I’m currently using). After much messing about and trying a bunch of things, it turned out this was an issue of “has to be tailored for each server setup”, in this case adding an alias to wp-cli.phar.

Second, having a preference for over-compensation while automating, it occurred to me that I’d made some assumptions, like there’d only be one database file in the uploaded directory, and that hardcoding the filename — which was one of those “I’ll fix that later” things — had morphed into technical debt. So, feeling well competent in Bash today, I decided for the “make sure there’s actually a database folder, then check there’s actually a sql.gz file in it, and there’s only one of them, then get the name of that file, and use it as a variable”. I often wonder how much of this is too much, but trying to cover the more obvious possible bollocks seems reasonably sensible.

Both of these have been rolled into the code above. And as always, it occurs to me already there’s better — ‘better’ — ways to do this, like in pre-push, piping the database directly to the dev server with SSH, or simultaneously creating a separate, local database backup, or doing it all in SQL commands.
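
The piping version, for instance, would be something like this untested sketch, reusing the hosts and paths from the hooks above, and skipping the local temp file entirely:

# sketch only: dump inside the Vagrant box and stream it straight to the dev server
ssh vagrant@172.17.0.10 "cd /var/www/user/domain.tld && wp db export --add-drop-table - | gzip -9" \
	| ssh user@domain.tld "mkdir -p ~/.database && cat > ~/.database/database.sql.gz"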

5-Character Dev Environment

Messing with my .bash_profile this afternoon, post-diving into Laravel and Git (which I’ve been doing much of the last week), I realised I could boot my entire dev environment with 5 letters. Fewer, if I wanted.

So instead of going to the Dock, clicking each of the icons, going to each and faffing around, I could at least boot them all, and set off some commands in Terminal (or iTerm2, as I’m now using).

Weirdly, until Justine gave me an evening of command-line Git learning, and wanted my .bash_profile, “Like so,” I hadn’t realised you could do stuff like that, despite amusing myself with all manner of shell scripts. Now I know what’s possible, I’m over-achieving in efficient laziness.
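
The gist of it is nothing more than a function in .bash_profile; a hypothetical sketch with stand-in app and folder names rather than my actual list:

# boot the dev environment with five letters: open the apps, bring up the VM
# (app names and project path here are stand-ins)
devup() {
	open -a "Sequel Pro"
	open -a "Transmit"
	open -a "Sublime Text"
	cd ~/Sites/project && vagrant up
}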

What’s missing is:

  • Opening multiple windows in iTerm2 or Terminal and running a different command in each (I don’t want to boot multiple instances of an app).
  • Setting off a menu action in an opened app, e.g. in Transmit going to my work drive.
  • Extending it to boot the environment and then a specific project, e.g. “devup laravel” would open my laravel installation in each of the apps, like opening the database in Sequel Pro; cd-ing to the laravel folder after automatic SSH-ing into my Vagrant box, and so on.

Some of these are probably uncomplicated, but this was a 30-minute experiment that turned out to be very useful.

Website rsync Backups the Time Machine Way

Continuing my recent rash of stupid coding, after Spellcheck the Shell Way, I decided for Website rsync Backups the Time Machine Way.

For a few years now, I’ve been using a bash script I bodged together that does incremental-ish backups of my websites using the rather formidable rsync. This week I’ve been working for maschinentempel.de, helping get frohstoff.de’s WooCommerce shop from Trabant to Hoonage. Which required repeated backing up of the entire site and database, and made me realise the shoddiness of my original backup script.

I thought, “Wouldn’t it be awesome, instead of having to make those stupid ‘backup.blah’ folders, to let the script create a time-stamped folder like Time Machine for each backup, and use the most recent backup as the rsync hard-link destination?” Fukken wouldn’t it, eh?

Creating time-stamped folders was easy. Using the most recent backup folder — which has the most recent date, and in standard list view on my Mac, the last folder in a list — was a little trickier. Especially because once a new folder was created to back up into, the previously most recent one was now second to last. tail and head feels hilariously bodgy, but works? Of course it does.

Bare bones explaining: The script needs to be in a folder with another folder called ‘backups’, and a text file called ‘excludes.txt’. It needs chmod +x to make it executable, and generally can be re-bodged to work on any server you can SSH into. Much faster, more reliable, increased laziness, time-stamped server backups.

#!/bin/sh
# ---------------------------------------------------------------
# A script to manually back up your entire website
# Backup will include everything from the user directory up
# excludes.txt lists files and folders not backed up
# Subsequent backups only download changes, but each folder is a complete backup
# ---------------------------------------------------------------
# get the folder we're in
this_dir="$(dirname "$0")"
# set the folder in that to backup into
backup_dir="$this_dir/backups"
# cd to that folder
echo "******************"
echo "cd-ing to $backup_dir"
echo "******************"
cd "$backup_dir" || exit 1
# make a new folder with timestamp
time_stamp=$(date +%Y-%m-%d-%H%M%S)
mkdir "$backup_dir/${backuppath}supernaut-${time_stamp}"
echo "created backup folder: supernaut-${time_stamp}"
echo "******************"
# set link destination for hard links to previous backup
# this gets the last two folders (including the one just made)
# and then the first of those, which is the most recent backup
link_dest=$(ls | tail -2 | head -n 1)
echo "hardlink destination: $link_dest"
echo "******************"
# set rsync backup destination to the folder we just made
backup_dest=$(ls | tail -1)
echo "backup destination: $backup_dest"
echo "******************"
# run rsync to do the backup via ssh with passwordless login
rsync -avvzc --hard-links --delete --delete-excluded --progress --exclude-from="$this_dir/excludes.txt" --link-dest="$backup_dir/$link_dest" -e ssh username@supernaut.info:~/ "$backup_dir/$backup_dest"
echo "******************"
echo "Backup complete"
echo "******************"
#------------------------------------------------
# info on the backup commands:
# -a --archive archive mode; same as -rlptgoD (no -H)
# -r --recursive recurse into directories
# -l --links copy symlinks as symlinks
# -p --perms preserve permissions
# -t --times preserve times
# -g --group preserve group
# -o --owner preserve owner (super-user only)
# -D same as --devices --specials
# --devices preserve device files (super-user only)
# --specials preserve special files
# -v --verbose increase verbosity - can increment for more detail i.e. -vv -vvv
# -z --compress compress file data during the transfer
# -c --checksum skip based on checksum, not mod-time & size – SLOWER
# -H --hard-links preserve hard links
# --delete delete extraneous files from dest dirs
# --delete-excluded also delete excluded files from dest dirs
# --progress show progress during transfer
# --exclude-from=FILE read exclude patterns from FILE – one file or folder per line
# --link-dest=DIR hardlink to files in DIR when unchanged – set as previous backup
# -e --rsh=COMMAND specify the remote shell to use – SSH
# -n --dry-run show what would have been transferred

Spellcheck the Shell Way

I was reading this awesome book (about which I shall soon blog) and there was this moment of, “Fark! What a brilliant line!” like I actually said that ’cos it was so good, followed by, “Fark! Spelling mistake of spacecraft’s name!” And I thought wouldn’t a good way to deal with spellchecking (besides my favourite cmd-;) be to take the entire text, do something fancy command-line to it, and output all the words alphabetically by frequency. Then you could just spellcheck that file, find the weird words, go back to the original document and correct the shit out of them. So I did. Brilliant!

# take a text and output all the words alphabetically by frequency
# spaces replaced with line breaks, lowercase everything, punctuation included (apostrophe in ascii \047)
# http://unix.stackexchange.com/questions/39039/get-text-file-word-occurrence-count-of-all-words-print-output-sorted
# http://tldp.org/LDP/abs/html/textproc.html
# http://donsnotes.com/tech/charsets/ascii.html
find . -name "foo.txt" -exec cat {} \; | tr ' ' '\012' | tr A-Z a-z | tr -cd '\012[a-z][0-9]\047' | grep -v "^\s*$" | sort | uniq -c | sort -bnr
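
And if the laziness wants to go one step further, the word list could be fed straight into a spellchecker; a hypothetical extension, assuming aspell is installed:

# hypothetical extension: skip the counts and let aspell flag the suspect words
find . -name "foo.txt" -exec cat {} \; | tr ' ' '\012' | tr A-Z a-z | tr -cd '\012[a-z][0-9]\047' | grep -v "^\s*$" | sort -u | aspell list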

Yes, many things to blog …

Since returning from Zürich, I’ve mostly been embedded in Final Cut Pro X, turning four very different performances of Mars Attacks! into something approximating a single version. It’s not the kind of work that can be said to have one authoritative performance, but the video as document will become that de facto. Much forgetting of old versions of Final Cut also, as the X release is so different it’s just a hindrance to dredge up memories of how it works.

And, buying of cycling gear, so now I look the part when I go careening through the forest at speeds where the consequences of a rapid stop are best not thought about. I found a very nice pair of cycling glasses that dim when exposed to UV light – photochromic lenses! Also amazing how little I squint when the glare is cut out, and how delightful the absence of insects in my eyes is. I am enjoying cyclocross very, very much lately.

Also climbing, with the discovery earlier this year (I think) of a bouldering hall of massive size a mere 10 minute lazy bike ride away. My climbing ability has plummeted – it was never especially good on indoor holds which tend to the overhung; far from my preference of thin, balance-y edges on large amounts of vertical granite. Still, it’s incredibly inexpensive, and it’s delightful to be regularly hauling myself up stuff.

And reading! Which I was most lazy with this year, being distracted by the internet and other things, and my late-evening focus too scatty and distracted to apply itself to even medium stretches of reading. I do have many good books, though, which I’ve been enjoying lately, and which shall appear here soon-ish.

And! Once more working with Das Helmi, this time for the Dahlem Museum (ha! all my museum-ing turns out to be for a purpose!): a project on Adrian Jacobsen, who went to Northern Canada and Alaska in the 1880’s and acquired a vast number of artefacts from the Eskimos, Aleut, Yupik, and Inuit. It seems to have percolated memories of things I learned in Canada … it’s an unexpected direction for me, reading beyond a cursory level on the First Nations in the North, and a fascinating one also, not the least because mountains, snow, and glaciers everywhere.

And! Rehearsing with Dasniya for a performance in Heppenheim in July!

Which is not all that’s going on, but I think that’s enough for the moment. And I shall endeavour to blog with consistency in the coming weeks also.

Mavericks? I would have called it 10.9 Anarchists myself

Mainly it was because iCal and Address Book lose that utterly vile skeuomorphic stitched leather look, and also realising my afternoon was slightly free, and I’d downloaded the 5.31GB of OS X 10.9 Mavericks, and I was dead lusty for all the new stuff, so 40 minutes later or so I was booting on my venerable (but definitely alive) 2008 MacBook Pro into the first non-cat OS in 12 1/2 years. (Actually a bit more because I was messing around with pre-release versions even before that).

And that was easy, wasn’t it? (Besides needing to reindex my Mail which caused 20 or so emails from years ago to try and send themselves until I mashed the ‘off’ button for Wi-Fi). Tabs in the Finder? Nice! Not sure I’ll use Tags, mainly due to having a decade of junk on my laptop already organised. iCal’s new look and the Day view are especially pleasing (though adding notes is still not entirely possible with keyboard). Safari’s Inspector has been given a new set of clothes. iBooks! Awesome! Really brilliant that it’s finally on Macs. And Maps! My short play with it hasn’t revealed whether it can replace my current map choice for tracking my training rides, but the 3D view of around here makes the trees look like Krynoids from Doctor Who: Seeds of Doom.

Important stuff like Apache, PHP, and MySQL worked almost immediately: the first just needing its httpd-vhosts.conf file updated; the second needing httpd.conf edited to load the PHP module; and the last working without a problem. And that is the easiest setup for my localhost environment ever.
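
From memory, the edits amounted to roughly the following; the exact lines and paths are whatever 10.9’s bundled Apache ships with, so treat them as approximate:

# in /etc/apache2/httpd.conf, uncomment the PHP module line:
#   LoadModule php5_module libexec/apache2/libphp5.so
# re-add the virtual hosts include:
#   Include /private/etc/apache2/extra/httpd-vhosts.conf
# then restart Apache
sudo apachectl restart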

I also had to buy Little Snitch, as the version I had no longer worked, but considering how much I use it, it’s €30 very well-spent – especially considering 10.9 was free. Oh, and iWork, Aperture, and a bunch of other stuff also updated. Pity I can’t afford one of the new MacBook Pros.


mousepath

Along with upgrading my laptop to 10.6 on the weekend (I always, always do these things last at night, and know I’m going to break something and so telling myself, “Don’t do it, you’ll mess something up!”, “No, no, it’ll be alright… I’m awake this time…”) and breaking my AirPort, sundry plugins and my SQL installation, I gained 15gb of space, extra, extra fastness, (especially for opening encrypted sparse packages), a feeling of accomplishment, tiredness from staying up till 4am, and mousepath.

I also began cleaning out 8 years of bookmarks, and was rummaging through my net-art folder when I found Anatoly Zenkov’s small piece of Java code (download here). Much reminiscing on early 2000’s code-art…

So, here is around five hours of my day today, compared to some it’s quite light, mainly because I use the keyboard so much and was mostly coding.

blog stuff

I shouldn’t be on my laptop today; I am bad. Working around 12 and sometimes up to 16 hours a day since the start of February without a day off is stupid. It’s a habit of mine and ends with me burning out and getting sick. Which I did a little over a week ago, and struggled through work last week by coming home and going to bed around 9pm. And there is always as much work as hours I will give.

So I said no computer work this weekend, and spent a beautiful, warm and sunny day in Kreuzberg yesterday cycling about, lying am Engelbecken after ballet with D (who is not Daniel, but I am always uncertain when or if to introduce someone new on my blog…), and home later at dusk, my windows open full all day and some cheese and bread and a rather irresistible Ayurvedic tea that reminds me I like mountains and cold places… sleep…

Instead, I decided over breakfast to redo my blogroll. As usual since last time, some have come, some have gone, some I’ve lost interest in… Some new categories, as always imprecise, needed to be made, splitting Science & Humanities into Anthropology, Astronomy, and Language as well, somehow reflecting more closely my news feeds (which you can download here).

New arrivals, many I’ve been reading for some time and whom I am rather fond of (and who are being added as I write because I keep remembering, ‘oh, I forgot blah!’)… enjoy the reading.

FLESH WORLD read while listening to sunn 0))).
guerilla semiotics, who used to be someone else and is still one of the most interesting theatre culture writers around.
My Big Backyard queer farm life in the sub-sub-tropics.
Buck Angel’s Blog! possibly one of my favourites right now, for his Porno for Pyros video if nothing else.
Sugarbutch Chronicles butch trans* dyke porn-lit.
Let them eat pro-sm feminist safe spaces BDSM critical theory (for want of a better description).
Shenzhen Noted who used to be Shenzhen Fieldnotes.
tang dynasty times, one of my utter all-round favourite blogs these days.
an imperfect pen, more China/Asia anthropology.
Paper Republic contemporary Chinese literature and translation.
Shanghai Scrap fascinating for me documenting of China’s scrap industry.
earlyTibet, almost a pair with tang dynasty times.
Hazaristan Times, one of the few Afghanistan blogs not COIN.
Cabinet of Wonders mmm blogging on the Age of Enlightenment.
Material World visual culture anthropology.
Neuroanthropology a field I seem to be reading a lot in during the last year.
HiBlog: HiRISE Team Blog from the satellite mission, beautiful.
Mars and Me a daily blog of one of the Mars rover drivers from the mission start five years ago.
Philosophy’s Other current philosophy stuff.
Feminist Philosophers a philosophy blog I’m fond of.
The Oyster’s Garter Oceanography, my new fascination (along with volcanoes again, but I have yet to find many blogs on that).

[edit…]

Oh, and one more in a category I read a lot but never seem to mention much:

i love typography mmm… fonts and design…

cookie monster

Yeah, I wouldn’t recommend doing this. It’s old. Probably doesn’t work anymore anyway.

I don’t like cookies so much. The persistent browser types, with expiry dates of 2031 that cause a trail of my identity to be left across Google and other sites. And I don’t like how poorly Safari manages them. Even a checkbox option would be better, to keep the ones I need or don’t mind and to delete all the others instead of manually having to go through them all.

SafariPlus used to do this perfectly, from within the browser, unobtrusively. But since 10.5 and Safari 3, it hasn’t worked. So I changed to Cocoa Cookie, a separate small utility. I had to go to my Applications folder to find it, but it was still quick and… then it stopped working: it would open without the window showing, and caused much weirdness with Safari’s cookies since the latest version and…

uuuhhh… annoyance.

I found a Perl script a couple of days ago. I suppose it could also be done in AppleScript, and I should really learn how to write in that, but it’s rather perfect. Well, it runs from Terminal also, and requires some editing, but…
#!/usr/bin/perl

use strict;
use warnings;
use File::Slurp;

### Edit this to your liking (put a pipe character between two words)
my $keepCookiesWith = "gaydargirls|culturedcode|supernaut|dreamhost";

### Put your OS X short username here (there should be a directory with the same name under /Users)
my $userName = "francesdath";

### ### ### Don't edit beneath this line unless you know some Perl
my $path = "/Users/francesdath/Library/Cookies/Cookies.plist";
my @date = localtime();
my $date = sprintf("%04d%02d%02d", $date[5] + 1900, $date[4] + 1, $date[3]);
my $cookies = read_file($path);
rename ($path, $path . "." . $date);

open(WH, ">$path");
print WH <<EOF;
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array>
EOF
while ($cookies =~ m#(\s*<dict>.+?</dict>)#gs)
{
	my $cookie = $1;
	if ($cookie =~ /$keepCookiesWith/)
	{
		print WH $cookie;
	}
}
print WH <<EOF;

</array>
</plist>
EOF
close (WH);

So, copy it into your favourite text editor, save as something memorable like ‘SafariCookieCleaner.pl’ and put it somewhere out of the way but not forgettable. You can add new cookies to be saved based on any attribute that appears in the Safari cookie window. I tend to use the domain names, like ‘macosxhints’, though I have one Google cookie I like to keep, with the name ‘PREF’, so I added that also (unfortunately YouTube has one with the same name…). Set the ‘username’ to your Home Folder (probably what appears in the Sidebar in the Finder), and then set ‘my $path = “”;’ to ‘/Users/yourhomefolder/Library/Cookies/Cookies.plist’ which is the path to Safari’s cookies file.

Then open Terminal, and change to the directory where you’ve put it, change the permissions to 755 and then run it (quit Safari first).

Well like this:
cd /drag/the/folder/containing/the/script/into/Terminal
chmod 755 SafariCookieCleaner.pl
./SafariCookieCleaner.pl

Open Safari, look in cookies in Preferences and the ones you like should still be there. It makes a backup of the cookies file, so at worst nothing is irreparable. And it makes you look all UNIX by opening Terminal.


women in mac

I was watching Welcome to Macintosh a few nights ago, becoming engaged in a mindless indulgence in Apple, I do remember these old things… It was, though, a very male affair. Lots of tech geek guys. There was one woman, and she I do remember… or rather I remember her work… or rather when I think of Macs it’s her I’m thinking of.

When OSX 10.2 came along with its new startup screen, a grey-on-grey monotone bitten Apple logo, much discussion ensued about how to hack the boot loader and replace the Apple with the old, friendly pixel-art smiley-computer Happy Mac. Generally a deeply unsettling process involving editing hex addresses and .raw files, run length encoding, and exclamations of “holy crap” when it worked instead of trashing your entire system.

But what of Happy Mac? And Moof the Dogcow? Bomb? Sad Mac? The watch cursor and page of text and font suitcase, floppy disc, and all the icons of OS9 which were OS8 and 7 and… I’d never given it too much thought, and certainly not enough to imagine they were penned by someone more-or-less computer illiterate at the time (mostly due to the lack of a GUI), using graph paper and filling in the squares.

Susan Kare, whom I doubt I’d heard of until a couple of weeks ago, is possibly the biggest influence on my design aesthetics and responsible for my emotional love affair with the Apple interface. Strange to look at OSX 10.5, the Aqua design, and then return to OS9 or even earlier and see her hand is indelible still.

And the other, whom I am nearly certain I’d never heard of, though I recall the ripples of her decisions, at least as somewhat recent history by the time I discovered computers. She was responsible for trashing Copland, the operating system that was to replace OS7; for the purchase of NeXT and their operating system to replace it with, which became OSX; and for the return of Steve Jobs, who promptly ridiculed and demoted her.

Ellen Hancock, without whom I would not be using OSX. Would Apple still exist? OSX was somewhat a torment to use until 10.3, at first there wasn’t even network or printing, swarms of kernel panics, much horribleness, but within this was… mmm like seeing the future. It was a special moment when I got my first laptop, a PowerBook G4 550MHz and 256MB of RAM (20gig harddrive!) but the only question was, “Does it have OS ten?” … “uuuhhh… yeah…”

Why did I decide to write this?

It’s Apple’s 25th anniversary, and I’ve been reading Macworld’s celebrations. Of their 20 most important people in the history of the Mac, only two are women: Susan Kare and Ellen Hancock. And in Welcome to Macintosh, Susan was the only woman out of a cast of guys to receive any attention. But maybe it’s because the tech industry is so heavily skewed to being a guy place, or maybe Apple has been a bunch of guys. But…

When I read about the history of the Mac, it’s Steve Jobs, Jonathan Ive, Stephen Wozniak, others too, even Bill Gates. Yet what I loved the Mac for before OSX was Moof and Happy Mac and the interface, and what I love now is OSX, that it exists, my interaction with an operating system. I feel a little stupid somehow to say I want to write about these two people who have had a profound effect on me precisely because they are women. To do so is important, so as to remember, by saying it, that there are women who have had such a unique, extraordinary influence on Macs, on technology, on science, on culture, and it’s really good to have someone to look up to.

Oh, and Susan made the Apple Mac team Pirate Flag so of course I adore her.