Archiving a (WordPress) website with wget

I needed to archive several WordPress sites as part of the process of gathering the raw data for my thesis research. I found a few recipes online for using wget to grab entire sites, but they all needed some tweaking. So, here's my recipe for posterity:

I used wget, which is available on just about any Linux-ish system (I ran it on the same Ubuntu server that hosts the sites).
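
If wget isn't installed already, on Ubuntu or Debian it's a one-line install (assuming you have sudo; other distributions have an equivalent package):

sudo apt-get install wget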

wget --mirror -p --html-extension --convert-links -e robots=off -P . http://url-to-site

That command doesn't throttle its requests, so it could cause problems if the server is already under heavy load (a gentler variant is below). Here's what each piece of that line does:

--mirror: turn on recursion and time-stamping with unlimited depth, i.e. grab the whole site.
-p: also download everything each page needs to render properly (images, stylesheets, and so on).
--html-extension: save pages with a .html extension so they open cleanly from disk (newer wget versions call this --adjust-extension).
--convert-links: rewrite links in the downloaded pages so they point at the local copies.
-e robots=off: ignore robots.txt (make sure that's appropriate for the site you're archiving).
-P .: save everything under the current directory.
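
If you do need to go easy on the server, wget can pace itself. A gentler variant might look something like this; the one-second wait and the 200k rate cap are just example values to tune for your situation:

wget --mirror -p --html-extension --convert-links -e robots=off --wait=1 --random-wait --limit-rate=200k -P . http://url-to-site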

You may also need to play around with the -D domain-list and/or --exclude-domains options if you want to control how wget handles content hosted on more than one domain, as in the sketch below.
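
For example, if the theme pulls images or stylesheets from a second domain, you'd also need -H (--span-hosts) so wget will follow links off the starting host, with -D keeping it fenced to the domains you actually want. The domain names here are just placeholders:

wget --mirror -p --html-extension --convert-links -e robots=off -H -D example.com,static.example.com -P . http://url-to-site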

It's worth noting that none of this is WordPress-specific; the same command should work fine for archiving any website.