Hacking & Bodging a Git Hook + Vagrant + WP-CLI + Bash Local to Dev Database Transfer

Ever since I started using Git to push local website changes to a development server, I’ve been vaguely irritated about dealing with the database in the same manner. For a long time, I used interconnect/it’s Search Replace DB for this side of things, but I was always wondering if I could somehow increase laziness time by automating the process. One hungover Sunday, plus a couple of hours on Monday, and one hacked and bodged success.

This isn’t a “How to do yer Git + Vagrant + local to dev” thing, nor is it a copy-paste, “Works for me!” party. Nonetheless, provided you’re using git-push, and are comfortable with WP-CLI or the MySQL command-line, Bash, and generally thrashing small bits of code around, in principle it would work in any situation. And I do feel kinda dirty throwing all this into Git post-receive, but whatever; it seems to work.

So, here’s what I wanted to do:

  1. Do all my commits, and run git-push and throw all the file changes to the dev server.
  2. Combine that with a dump of the database, then get it to the dev server.
  3. Use something at the other end to import the database, and search-replace strings.
  4. Clean up afterwards at both ends.

When I first looked into this, the most common approach seemed to be the pre-commit Git hook, dumping the database and adding it to the commit. I didn’t want to do this, for a couple of reasons: I do a lot of commits, and the majority have no database component; I wasn’t looking to version control the database; all I wanted was to push the local database to dev along with the changed files. Looks like a job for the pre-push hook.

Earlier this year, I started using Vagrant, so the first issue was how to dump the database from there. I do commits from the local folder, rather than SSH-ing into the VM, so mysqldump is not going to work without first getting into the VM. Which brought its own set of weirdnesses, and this was the point when I decided to flop over to WP-CLI, the WordPress command-line tool.

I often find solutions to this sort of thing are dependent on the combination of software and commands being used. I use mysqldump on its own all the time, but here I needed Git to set the path for where the database would be dumped — because git hooks live in a sub-directory of the git folder — and that, in combination with dumping the database inside the VM from a Git command running in the local folder (yeah, I probably should just do all my git via SSH), then hurling it at a remote server, means sometimes things that work in isolation get cranky. And this is a hack/bodge, so I went with:

  1. Set up paths for the database dump with Git, ’cos Git is running this show.
  2. SSH into the Vagrant box.
  3. WP-CLI dump the database to a gzipped file.
  4. SCP that up to the dev server.
  5. Delete all that on the local server, ’cos I’m tidy.

That’s half of it done. I’ve got my pushes working, the database file is up on the dev server, the local server is all cleaned up, so now it’s time for the other end.

In this case, I was doing it for a site on DreamHost, who conveniently give all kinds of fun command-line access, plus WP-CLI on their shared servers. Once Git has finished checking out the new file changes in post-receive, it’s time for frankly bodging it.

My current usual setup is a bare repository on the dev server, which checks out to the development website directory. This means neither the uploaded database, nor WP-CLI and the WordPress root, are in the same place as the running hook. No big deal, just use --path=. The next thing, though, is cleaning up post-import: strings to be changed all over the place, like local URLs swapped to dev. For that we have wp search-replace, which works an awful lot like Search Replace DB. At the dev end then:

  1. Set up paths again, this time it’s WP-CLI running the show.
  2. Unzip the database then import it.
  3. Do database stuff like search-replace strings, and delete transients.
  4. Delete that uploaded database file on the dev server, ’cos I’m tidy.

I was looking at all this late last night, all those repeating lines of ‘wp search-replace’ and I thought, “That looks like a job for an array.” Which led me down the tunnel of Bash arrays, associative arrays, “How can I actually do ‘blah’, ’cos bash seems to be kinda unwilling here?” and finally settling on not quite what I wanted, but does the job. Also, bash syntax always looks like it’s cursing and swearing.
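For anyone wanting the shape of that paired-array approach without the surrounding hook, here’s a minimal sketch — echo stands in for wp search-replace, and the strings are made-up examples:

```shell
# two indexed arrays, kept in matching order: search[i] pairs with replace[i]
search=( "http://local.dev" "/var/www/local" )
replace=( "http://dev.domain.tld" "/home/user/dev" )

# loop by index so the pairs stay lined up
for (( i=0; i < ${#search[@]}; ++i )); do
    echo "replacing '${search[i]}' with '${replace[i]}'"
done
```

An associative array would map each search string to its replacement directly, but you can’t rely on its iteration order, so two plain arrays in lockstep does the job.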

The pre-push hook:


# a pre-push hook to dump the database to a folder in the repo's root directory, upload it to the dev server, then delete when finished

echo '***************************************************************'
echo 'preparing to back up database'
echo '***************************************************************'

# set up some variables, to keep things more readable later on
# backup_dir is relative to git hooks, i.e. 2 directories higher, so use git to set it

ROOT="$(git rev-parse --show-toplevel)"
BACKUP_DIR="$ROOT/.database"
DB_NAME="database-name" # whatever you want the dump file called

# check there is a database backup directory, make it if it doesn't exist, then cd to it

if [ ! -d "$BACKUP_DIR" ]; then
	mkdir "$BACKUP_DIR"
fi
cd "$BACKUP_DIR" || exit 1

# cos this is vagrant, first ssh into it (add your VM's host after vagrant@). there will be a password prompt
# using EOF to write the commands in bash, rather than in ssh quotation marks

ssh -t vagrant@ << EOF

# cd to the new databases folder. this is absolute, cos is vm and not local folder
cd "/var/www/user/domain.tld/.database"

# then export the database with wp-cli and gzip it
wp db export --add-drop-table - | gzip -9 > $DB_NAME.sql.gz

# exit ssh
exit

# bail out of eof
EOF

# scp the backup directory and database to dev server
scp -r "$BACKUP_DIR" user@domain.tld:~/

# remove that backup directory so it's not clogging up git changes
cd "$ROOT"
rm -rf "$BACKUP_DIR"

echo '***************************************************************'
echo 'all done, finishing up git push stuff'
echo '***************************************************************'

The post-receive hook:


echo '***************************************************************'
echo 'post-receive is working. checking out pushed changes.'
echo '***************************************************************'

# check out the received changes from local to the dev site
git --work-tree=/home/user/dev.domain.tld  --git-dir=/home/user/.repo.git checkout -f

# import the database with wp-cli
echo '***************************************************************'
echo 'starting database import'
echo '***************************************************************'

# setting up some paths
# on some webhosts, e.g. all-inkl, setting the alias to wp-cli.phar is required, uncomment and set if needed
# (hooks run non-interactively, so also shopt -s expand_aliases)
# alias wp='/path/to/.wp-cli/wp-cli.phar'

# the path to wp-config, needed for wp-cli (example paths, set your own)
WP_PATH="/home/user/dev.domain.tld"

# database directory, created in git pre-push and scp-ed to ~/
DB_DIR="/home/user/.database"

# check there is a database directory
if [ -d "$DB_DIR" ]; then

	# then check it for sql.gz files
	DB_COUNT=$(ls -1 "$DB_DIR"/*.sql.gz 2>/dev/null | wc -l)

	# if there is exactly 1 database, proceed
	if [ "$DB_COUNT" -eq 1 ]; then

		# grab the db name, this way the db name isn't hardcoded
		DB_NAME=$(basename "$DB_DIR"/*)

		echo 'importing the database'
		echo '***************************************************************'

		# unzip the database, then import it with wp-cli
		gunzip < "$DB_DIR/$DB_NAME" | wp db import - --path="$WP_PATH"

		# clear the transients
		wp transient delete --all --path="$WP_PATH"

		# run search replace on the main strings needing to be updated
		# make matching arrays of strings to be searched for, and their replacements (example strings)
		search=( "http://domain.localdev" "/var/www/user/domain.tld" )
		replace=( "http://dev.domain.tld" "/home/user/dev.domain.tld" )

		# loop through the arrays and spit them into wp search-replace
		for (( i=0; i < ${#search[@]}; ++i )); do
			wp search-replace --all-tables --precise "${search[i]}" "${replace[i]}" --path="$WP_PATH"
		done

		# any other wp-cli commands to run
		wp option update blogname "blog name" --path="$WP_PATH"

		# delete the backup directory, so there's no junk lying around
		rm -rf "$DB_DIR"

	else
		echo 'database was not found'
		echo '***************************************************************'
	fi

else
	echo 'database folder was not found'
	echo '***************************************************************'
fi

echo '***************************************************************'
echo 'all done'
echo '***************************************************************'

What else? Dunno. It’s pretty rough, but it basically proves something I didn’t find a combined example of anywhere: that you can use git hooks to push the database and file changes at the same time, and automate the local-to-dev database transfer process. Is this the best way to do it? Nah, it’s majorly bodgy, would have to be tailored for each server setup, and I’m not even sure doing such things in a git hook is advisable, even if it works. It does demonstrate that each step of the process can be automated — irrespective of how shonky your setup is — and provided you account for that and your own coding proclivities, there are multiple ways of doing the same thing.

(edit, a day later.)
I decided to throw this into ‘production’, testing it on a development site I had to create on a webhost I’m not so familiar with, but who do provide the necessities (like SSH and Let’s Encrypt). Two things happened.

First, WP-CLI didn’t work at all in the post-receive script, even though the same commands worked when run directly in Terminal (or iTerm, as I’m currently using). After much messing about and trying a bunch of things, it turned out this was an issue of “has to be tailored for each server setup”, in this case adding an alias to wp-cli.phar.
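For anyone hitting the same wall: hook scripts run non-interactively, where bash doesn’t expand aliases by default, so the alias needs switching on first — the phar path here is a made-up example, use wherever your host puts it:

```shell
# aliases are off in non-interactive shells; enable them before defining one
shopt -s expand_aliases
alias wp='php /home/user/.wp-cli/wp-cli.phar'
```

A plain variable or a function does the same job without needing the shopt.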

Second, having a preference for over-compensation while automating, it occurred to me that I’d made some assumptions, like there only ever being one database file in the uploaded directory, and that hardcoding the filename — which was one of those “I’ll fix that later” things — had morphed into technical debt. So, feeling well competent in Bash today, I settled on “make sure there’s actually a database folder, then check there’s actually a sql.gz file in it, and there’s only one of them, then get the name of that file, and use it as a variable”. I often wonder how much of this is too much, but trying to cover the more obvious possible bollocks seems reasonably sensible.

Both of these have been rolled into the code above. And as always, it occurs to me already there’s better — ‘better’ — ways to do this, like in pre-push, piping the database directly to the dev server with SSH, or simultaneously creating a separate, local database backup, or doing it all in SQL commands.
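That first alternative would collapse most of the pre-push hook into a single pipe — a sketch, with host, user, and paths as placeholders, wrapped in a function so nothing runs on paste:

```shell
# dump inside the VM, compress, and land it straight on the dev server,
# with no intermediate file in the repo to clean up afterwards
push_db() {
    vagrant ssh -c "wp db export --add-drop-table -" \
        | gzip -9 \
        | ssh user@domain.tld "cat > ~/.database/db.sql.gz"
}
```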

5-Character Dev Environment

Messing with my .bash_profile this afternoon, post-diving into Laravel and Git (which I’ve been doing much of the last week), I realised I could boot my entire dev environment with 5 letters. Fewer, if I wanted.

So instead of going to the Dock, clicking each of the icons, going to each app and faffing around, I could at least boot them all, and set off some commands in Terminal (or iTerm2, as I’m now using).

Weirdly, until Justine gave me an evening of command-line Git learning, and wanted my .bash_profile, “Like so,” I hadn’t realised you could do stuff like that, despite amusing myself with all manner of shell scripts. Now I know what’s possible, I’m over-achieving in efficient laziness.
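The whole thing is just a function in .bash_profile. A sketch of the idea — the app list here is an example from my setup, swap in your own:

```shell
# boot the dev environment in one word: open each app, then bring up the VM
devup() {
    local apps=( "iTerm" "Transmit" "Sequel Pro" )
    for app in "${apps[@]}"; do
        open -a "$app"    # macOS: launch the app, or focus it if already running
    done
    cd ~/Sites && vagrant up
}
```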

What’s missing is:

  • Opening multiple windows in iTerm or Terminal and running a different command in each (I don’t want to boot multiple instances of an app).
  • Setting off a menu action in an opened app, e.g. in Transmit going to my work drive.
  • Extending it to boot the environment and then a specific project, e.g. “devup laravel” would open my Laravel installation in each of the apps: opening the database in Sequel Pro, cd-ing to the Laravel folder after automatically SSH-ing into my Vagrant box, and so on.

Some of these are probably uncomplicated, but this was a 30-minute experiment that turned out to be very useful.

Website rsync Backups the Time Machine Way

Continuing my recent rash of stupid coding, after Spellcheck the Shell Way, I decided for Website rsync Backups the Time Machine Way.

For a few years now, I’ve been using a bash script I bodged together that does incremental-ish backups of my websites using the rather formidable rsync. This week I’ve been working for maschinentempel.de, helping get frohstoff.de’s WooCommerce shop from Trabant to Hoonage. Which required repeated backing up of the entire site and database, and made me realise the shoddiness of my original backup script.

I thought, “Wouldn’t it be awesome, instead of having to make those stupid ‘backup.blah’ folders, to let the script create a time-stamped folder like Time Machine does for each backup, and use the most recent backup as the rsync hard-link destination?” Fukken wouldn’t it, eh?

Creating time-stamped folders was easy. Using the most recent backup folder — the one with the most recent date, and in standard list view on my Mac, the last folder in the list — was a little trickier. Especially because once a new folder was created to back up into, the previously most recent one was now second-to-last. tail and head feels hilariously bodgy, but works? Of course it does.
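The tail/head bit, reduced to its parts against a scratch directory of time-stamped folders (folder names here are made up), showing how the second-to-last entry becomes the hard-link source and the last becomes the destination:

```shell
# fake three backup folders in a scratch directory to show the listing trick
tmp=$(mktemp -d)
mkdir "$tmp/supernaut-2016-01-01-000000" \
      "$tmp/supernaut-2016-02-01-000000" \
      "$tmp/supernaut-2016-03-01-000000"
cd "$tmp"

# ls sorts the time-stamped names chronologically, so:
link_dest=$(ls | tail -2 | head -n 1)   # second-to-last = previous backup
backup_dest=$(ls | tail -1)             # last = the folder just created
echo "$link_dest -> $backup_dest"
```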

Bare bones explaining: the script needs to be in a folder with another folder called ‘backups’, and a text file called ‘excludes.txt’. It needs chmod +x to make it executable, and generally can be re-bodged to work on any server you can ssh into. Much faster, more reliable, increased laziness, time-stamped server backups.

# ---------------------------------------------------------------
# A script to manually back up your entire website
# Backup will include everything from the user directory up
# excludes.txt lists files and folders not backed up
# Subsequent backups only download changes, but each folder is a complete backup
# ---------------------------------------------------------------
# get the folder we're in
this_dir="$(cd "$(dirname "$0")" && pwd)"
# set the folder in that to backup into
backup_dir="$this_dir/backups"
# cd to that folder
echo "******************"
echo "cd-ing to $backup_dir"
echo "******************"
cd "$backup_dir" || exit 1
# make a new folder with timestamp
time_stamp=$(date +%Y-%m-%d-%H%M%S)
mkdir "$backup_dir/supernaut-${time_stamp}"
echo "created backup folder: supernaut-${time_stamp}"
echo "******************"
# set link destination for hard links to previous backup
# this gets the last two folders (including the one just made)
# and then the first of those, which is the most recent backup
link_dest=$(ls | tail -2 | head -n 1)
echo "hardlink destination: $link_dest"
echo "******************"
# set rsync backup destination to the folder we just made
backup_dest=$(ls | tail -1)
echo "backup destination: $backup_dest"
echo "******************"
# run rsync to do the backup via ssh with passwordless login
rsync -avvzc --hard-links --delete --delete-excluded --progress --exclude-from="$this_dir/excludes.txt" --link-dest="$backup_dir/$link_dest" -e ssh username@supernaut.info:~/ "$backup_dir/$backup_dest"
echo "******************"
echo "Backup complete"
echo "******************"
# info on the backup commands:
# -a --archive archive mode; same as -rlptgoD (no -H)
# -r --recursive recurse into directories
# -l --links copy symlinks as symlinks
# -p --perms preserve permissions
# -t --times preserve times
# -g --group preserve group
# -o --owner preserve owner (super-user only)
# -D same as --devices --specials
# --devices preserve device files (super-user only)
# --specials preserve special files
# -v --verbose increase verbosity - can increment for more detail i.e. -vv -vvv
# -z --compress compress file data during the transfer
# -c --checksum skip based on checksum, not mod-time & size – SLOWER
# -H --hard-links preserve hard links
# --delete delete extraneous files from dest dirs
# --delete-excluded also delete excluded files from dest dirs
# --progress show progress during transfer
# --exclude-from=FILE read exclude patterns from FILE – one file or folder per line
# --link-dest=DIR hardlink to files in DIR when unchanged – set as previous backup
# -e --rsh=COMMAND specify the remote shell to use – SSH
# -n --dry-run show what would have been transferred

Spellcheck the Shell Way

I was reading this awesome book (about which I shall soon blog) and there was this moment of, “Fark! What a brilliant line!” (like I actually said that, ’cos it was so good), followed by, “Fark! Spelling mistake in the spacecraft’s name!” And I thought, wouldn’t a good way to deal with spellchecking (besides my favourite cmd-;) be to take the entire text, do something fancy command-line to it, and output all the words sorted by frequency? Then you could just spellcheck that file, find the weird words, go back to the original document, and correct the shit out of them. So I did. Brilliant!

# take a text and output all the words sorted by frequency
# spaces replaced with line breaks, everything lowercased, punctuation stripped except apostrophes (ascii \047)
# http://unix.stackexchange.com/questions/39039/get-text-file-word-occurrence-count-of-all-words-print-output-sorted
# http://tldp.org/LDP/abs/html/textproc.html
# http://donsnotes.com/tech/charsets/ascii.html
find . -name "foo.txt" -exec cat {} \; | tr ' ' '\012' | tr A-Z a-z | tr -cd '\012a-z0-9\047' | grep -v "^\s*$" | sort | uniq -c | sort -bnr

wordpress security (in many small steps)

Yeah, I’d really suggest searching for newer ways of doing all this, it’s from 2010 (edited on 2015-10-08)

(This is for people who like reading code, cross-posted at thingswithbits.info)

Earlier this year supernaut got hacked. Many of my other WordPress installs were too, perhaps because they occupy the same shared hosting space. I learnt a lot about website and WordPress security very quickly – even to the point of inadvertently vanishing all but my index page for quite some time. Nothing if not clever, I am.

Because I am doing all my projects in WordPress at the moment, and also seem to have turned quite a few people on to using it, I thought to document my approach and methods. The first thing I do, then, is read. A lot.

I have a subjective and not-too-carefully analysed approach to learning, especially when it comes to finding information on a topic I know nothing about and need to know a lot about quickly. It applies to everything, not simply limited to web design or computer stuff. I search and read, and search and read, and keep repeating until the same stuff starts to come up over and over again. Then I start to think I might be on the right path, so I might try a few things. The key here is easiness. Anything requiring more than a few clicks, a few lines of text or modifications, is not a reasonable solution.

Things that break this early get thrown away. A plug-in that asks for stupid things, or doesn’t perform without me rewriting some line in php.ini is not going to stay installed long. I wondered often if this was the wrong approach, but really, basic, effective security should be as simple to understand as a household door key. You shouldn’t have to build a lathe in order to cut the key yourself.

So, having done some research and playing, I slowly put together something useful. This is a mix of things I’ve been using for a while, and new things I’m adding at the moment, in response to pissy annoying php exploits, sql injections and other clever irritations.

Installing WordPress.

The first thing to change during an install is the database table prefix wp_. If you’ve already installed WordPress, it’s possible to also change this either using a plugin, or by editing wp-config.php and changing the table prefixes in phpMyAdmin.

Once logged in, make a new user with administrator privileges and a suitably complex password (OSX Keychain Access has a very good password generator), log in as the new user, and delete the user ‘Admin’.

Now is also a good time to delete the default theme (after uploading your new one, of course). As with the user named ‘Admin’ and the wp_ table prefix, botnet code-injection methods look for these defaults as an easy place to start.

To avoid messiness, I think it’s better to leave installing plugins till last, though because information is sent in the clear unless using SSL or SSH, it’s probably a good idea to change the password again when it’s all finished.

Get rid of install.php

After your installation is finished, you don’t need this file, located in wp-admin. Delete it, change the name, or even better, log attempts to access it with this (just change the email address to receive notifications):

<?php
// install.php replacement page: http://perishablepress.com/press/2009/05/05/important-security-fix-for-wordpress
header("HTTP/1.1 503 Service Temporarily Unavailable");
header("Status: 503 Service Temporarily Unavailable");
header("Retry-After: 3600"); // 60 minutes
mail("email@domain.tld", "Database Error", "There is a problem with the database!");
?>
<title>Error Establishing Database Connection</title>
<h1>Error Establishing Database Connection</h1>
<p>We are currently experiencing database issues. Please check back shortly. Thank you.</p>

Dealing with wp-config

Every time I open this file and see the database name, user, password and host all in plain text, I get a little queasy. There are several ways to make this less painful, firstly using htaccess, which I’ll cover later. A quite elegant solution is to put all the sensitive information in a separate php file outside the root web directory, and make a call to that in the wp-config file.

First make a new file, config.php, stick it (on DreamHost) in the /home directory, chmod it to 644, and cut-paste the following from the original wp-config.php:

// ** MySQL settings - You can get this info from your web host ** //
/** The name of the database for WordPress */
define('DB_NAME', 'database-name');
/** MySQL database username */
define('DB_USER', 'username');
/** MySQL database password */
define('DB_PASSWORD', 'p@s5w0rD');
/** MySQL hostname */
define('DB_HOST', 'sqlhost.domainname.tld');
/**
 * WordPress Database Table prefix.
 *
 * You can have multiple installations in one database if you give each a unique
 * prefix. Only numbers, letters, and underscores please!
 */
$table_prefix = 'prefix_';
/** force ssl login and admin - might slow things down */
/** on dreamhost must pay for ssl cert, hence not used */
/** define('FORCE_SSL_ADMIN', true); */

Then in the original file, just put an include pointing at the new location:

include('/home/username/config.php');
For those lucky enough to have SSL on their server, using FORCE_SSL_ADMIN is an excellent idea. Changing permissions to 640 also is a good idea.

Adding Unique Authentication Keys takes about 30 seconds, and gives four separate keys to be used with your password. Copy-paste from the Secret Key online generator, it will look like this:

define('AUTH_KEY', ' ;+ Xk*Kf:y3e1L?.,r[Hx<m;rV57d>2WL#<#3[ d]!#+$79/pSAF(HrGEAfS`a4');
define('SECURE_AUTH_KEY', '.k0zMi[@f&)E>~y=ZqO6~IfHS$S SP8d>C]S@:zhxh?H]VtXEpqV?p-OJV*O~3?v');
define('LOGGED_IN_KEY', '~:b*7/m+Lx|-irCxYAHQn1t2$sYA+2}+*2c@!_,9/D2-H5cJ_:wJ8X7|-p%W&xGh');
define('NONCE_KEY', '%#T+Y*|N>cq/2m3CRqR}SCM BodKio`<x+?nMAe6,qgU:YiyKgEu,%>qS$V');


Most themes have a functions.php file which does all sorts of exciting things: writing bits to the theme templates, interacting with the WordPress admin interface… A couple of extra lines provide a little obscurity. WordPress puts its version number in the header via wp_generator, and also a link to xml-rpc.php, which for most people is unnecessary – unless they are using a blogging client like MarsEdit – and a risk. This quickly removes both, as well as hiding information about failed login attempts from the browser:

//security stuff
add_filter('login_errors', create_function('$a', "return null;"));

function removeHeadLinks() {
	remove_action('wp_head', 'rsd_link');
	remove_action('wp_head', 'wlwmanifest_link');
}
add_action('init', 'removeHeadLinks');

function no_generator() {
	return '';
}
add_filter('the_generator', 'no_generator');


.htaccess is a joyous little world unto itself, like finding a hole in your backyard that leads into a vast cave system. mmm spelunking.

Much of my learning about security has revolved around what can be done with htaccess, and in particular Perishable Press and their 4G Blacklist. And much of what I do for security takes place here.

It starts with denying everyone access to read the .htaccess file itself. Then there is the WordPress hook that allows the install to exist in a different directory location to the site url. For those again who have SSL on their server, forcing SSL for admin and login can be done here. Then there are a bunch of protections to stop access to certain important files: install.php, wp-config.php, and the WordPress readme.html.

Using gzip compression to deliver files and adding content-expires information doesn’t strictly have much to do with security, but really, the difference in load times the former can make to a site, and the general usefulness of expires tags, make these ones to automatically add.

For those on DreamHost, the DH-PHP handlers section is automatically added when using the site-specific php.ini installer, something I’ll cover a bit further down.

Hotlink prevention stops leechers sucking images and other content off your site, one of the first things I ever learnt how to do, when supernaut suddenly had massive bandwidth use as my images turned up in all manner of places.

The no-referrer section is specifically to thwart spammers circumventing your site altogether and trying to inject comment spam directly into the comments php. It’s also possible to block access to xml-rpc here, and use login passwords via httpasswd for extra security on the login page, both not included here.

Then comes the Perishable Press 4G Blacklist, a cornucopia of amazingness, which I left out for the sake of brevity (haha). I have included two lines that need to be commented out in order for the browser-based file manager AjaXplorer to function ok.

# === PROTECT .htaccess ===
<Files .htaccess>
order allow,deny
deny from all
</Files>

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>

# === FORCE SSL ===
#RewriteRule !^/wp-(admin|login|register)(.*) - [C]

# === PROTECT install.php ===
<Files install.php>
Order Allow,Deny
Deny from all
Satisfy all
</Files>

# === PROTECT readme.html ===
<Files readme.html>
Order deny,allow
deny from all
</Files>

# === PROTECT wp-config.php ===
<Files wp-config.php>
order allow,deny
deny from all
</Files>

# === DH-PHP handlers ===
AddHandler fastcgi-script fcg fcgi fpl
AddHandler php-fastcgi .php
Action php-fastcgi /cgi-bin/dispatch.fcgi

# === GZIP COMPRESSION ===
# (file types to compress, adjust to taste)
<FilesMatch "\.(html|htm|php|css|js|xml|txt)$">
SetOutputFilter DEFLATE
</FilesMatch>

# set expire dates
<IfModule mod_expires.c>
ExpiresActive on
# 60 seconds * 60 minutes * 24 hours * 7 days
ExpiresDefault A604800
# 60 seconds * 60 minutes * 24 hours
ExpiresByType text/html A86400
</IfModule>

<FilesMatch "\.(ico|pdf|flv|f4v|m4v|jpg|jpeg|png|gif|swf|js|css|ttf)$">
# configure ETag
FileETag none
# max-age set to one week as above
Header set Cache-Control "max-age=604800, public, must-revalidate"
# if you use ETags, you should unset Last-Modified
# Header unset Last-Modified
</FilesMatch>

# === PREVENT HOTLINKING ===
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{REQUEST_FILENAME} -f
RewriteCond %{REQUEST_FILENAME} \.(gif|jpe?g?|png|ico)$ [NC]
RewriteCond %{HTTP_REFERER} !^https?://([^.]+\.)?domainname\. [NC]
RewriteRule \.(gif|jpe?g?|png|ico)$ - [F,NC,L]

# === BLOCK NO-REFERRER COMMENT SPAM ===
RewriteCond %{REQUEST_URI} .wp-comments-post\. [NC]
RewriteCond %{HTTP_REFERER} !.*domainname\. [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule (.*) - [F,L]

# this line stops ajaxplorer working
# RewriteRule ^(.*)$ - [F,L]

# RedirectMatch 403 \/\/ ajaxplorer again

Equally, and probably overkill, robots.txt can ask spiders to stay out of a slew of places, particularly directories you don’t want spidered, as well as any and all WordPress directories, using Disallow: /wp*. Keep in mind it’s only a request that well-behaved spiders honour, and the file itself is publicly readable, so it’s no place to list anything secret.

User-agent: *
Disallow: /cgi-bin
Disallow: /lurking
Disallow: /phpsecinfo
Disallow: /wp-*
Disallow: /tag
Disallow: /author
Disallow: /wget/
Disallow: /httpd/
Disallow: /i/
Disallow: /f/
Disallow: /t/
Disallow: /c/
Disallow: /j/

User-agent: Mediapartners-Google
Allow: /
User-agent: Adsbot-Google
Allow: /
User-agent: Googlebot-Image
Allow: /
User-agent: Googlebot-Mobile
Allow: /
User-agent: ia_archiver-web.archive.org
Disallow: /
Sitemap: http://www.domainname.tld/sitemap.xml

php.ini and phpsecinfo

Getting deeper into the system still, and further yet from WordPress, modifying php.ini, the file that sets up what php can do, is another essential. DreamHost doesn’t make it easy to edit php.ini, but fortunately there’s a script which installs one locally. More excitement ahead.

As with htaccess, much can be done in php.ini to prevent messiness. The following seem to work rather well. I’ll leave them uncommented-upon, except to say AjaXplorer needs fopen to be on, and I shall devote a future post to elaborating on php.ini security.

open_basedir = /home/site/folder:/home/site/tmp/folder
disable_functions = exec,passthru,system,proc_open,popen,curl_multi_exec,
expose_php = Off
error_reporting = E_ALL & ~E_NOTICE
register_globals = Off

; Whether to allow HTTP file uploads.
file_uploads = On
upload_tmp_dir = /home/site/folder/tmp/php
; Maximum allowed size for uploaded files.
upload_max_filesize = 200M

; Whether to allow the treatment of URLs (like http:// or ftp://) as files.
allow_url_fopen = On

; Whether to allow include/require to open URLs (like http:// or ftp://) as files.
allow_url_include = Off

In addition to editing php.ini, and making sure there isn’t a file lying around called info.php with phpinfo() inside, phpSecInfo is an invaluable tool for assaying the security of your website, the results from which can be directly used to edit php.ini.

FTP, or rather SFTP.

As with passwords being sent in the clear, so too is FTP on its own not so great. Dreamhost allows for shell plus SFTP access with FTP disabled, which is both sensible for using desktop FTP clients (such as the amazing Transmit), and for searching out code injections. Time to open Terminal.

Command-line access is essential for a number of reasons, and instead of using the username/password combination, create passwordless login using private keys.

# generate an RSA key pair
ssh-keygen -t rsa

# copy the public key to your website
scp ~/.ssh/id_rsa.pub user@domainname.tld:~/

# ssh into your website
ssh user@domainname.tld

# make a new folder, .ssh, append the key to the authorized_keys file, then delete the key
mkdir .ssh
cat id_rsa.pub >> .ssh/authorized_keys
rm id_rsa.pub

# set all permissions
chmod go-w ~
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys

Why might this all be useful? Going back to when I was hacked earlier this year, I could have gone through all the files on all my sites looking for base64 code; instead I opened Terminal, SSH’ed in and sent this command:

find . -name "*.php" -exec grep "base64" '{}' \; -print

Which searches through all files with the extension .php for the string base64 and dumps the results on screen. I found every instance of the hack in a matter of seconds.

Plugins for security.

Leaving aside all that code now…

WordPress is amazing because of its plugins and the community around its development. The problem for any plugin, though, is twofold: does it do the task you want best (while integrating with the rest of your setup), and is it updated frequently enough not to become a liability?

After the initial hack, I had many installed, which I then uninstalled because of small irritations and annoyances. After changing all my passwords to 16+ characters, including as many of type !@£$%^&_ as allowable, WordPress File Monitor has become an installation standard.

Rather than provide security, it lets you know when any modifications to files and folders have occurred, and in which. Notification via email and/or Dashboard alert, alterable scan intervals and directory path exclusions for me make this indispensable. When a new exploit emerges, instead of panicking and manually scanning all my installs for changes (which I do anyway out of nervous boredom), I can be fairly secure they will show up here. Of course, the idea is not to get hacked in the first place.

I’ve read a lot of good things about AskApache Password Protect, but I’ve never got it working, despite adding all the .htaccess files and even chmod up to 666. I would at least play with it otherwise, but for now don’t want to spend the time on it.

In general though (and said with the caveat that I don’t really know WordPress very well), much or all of the security that can be done with plugins can be done in other ways – .htaccess, robots.txt, php.ini, wp-config, sql changes and so on. Also, so many of the plugins haven’t been updated recently, which for me is worse than no protection, due to the false sense of security.

During the course of the last two days, while I went through all the security stuff I could find, websites, pdfs, my own archives, I came across a couple of other plugins which I think are useful.

Semi Secure Login Reimagined provides about as good public and secret key encryption for passwords as possible if you don’t have access to SSL.

WP Security Scan I found useful for a post-install check to make sure all the settings were as minimally tight as could be. In the interests of not having hundreds of plugins, I uninstalled it after.

404 Notifier does just that, though I suspect getting off my ass and reading the logs (or ssh and then grepping them for 404s) would be a better idea.


Much of my information for this comes from a few places.

The WordPress Codex itself is a good place to start, and the Plugin directory also worth spending time in.
Perishable Press is invaluable, and not just for security.
Digging into WordPress, both the website and the book are the fundamental step-by-step guide for all things security and WordPress.
The WordPress community, across many blogs, forums, books, comments and bits and pieces.

Oh, and while this applies also to WordPress 2.9.x, I’m currently running the 3.0 beta on thingswithbits.info where I tested all this. (hopefully this all doesn’t add to confusion.)

ftp blah…

I have no idea why none of my FTP applications are able to run LIST -a while using Internode’s wireless connection and trying to do something simple like blog. What this means is I can’t get to the images directory and upload images, nor create new monthly folders, which makes all my entries kinda suck.

Lucky I can still upload using ecto, but the thumbnail images seem to have a rather large file size. All round kinda sucky, and I need to find a non-crappy (non-Internode) connection to find out what’s going on. Not that you really need to know; it probably looks all the same to you, just annoying my sense of order and elegance.