Best Practices for Backing Up MediaWiki Installations
Why a MediaWiki Backup Is More Than Just “Copy‑Paste”
Imagine you’ve spent months curating a niche wiki about vintage arcade cabinets. One night a power surge knocks the server offline and, when it boots, the latest articles have vanished. The feeling is a bit like watching a favorite game cartridge melt – frustrating, avoidable, and painfully obvious that you should have backed up properly.
MediaWiki isn’t a single file you can zip and forget; it lives in two worlds – a relational database that stores pages, users, and history, and a file system that holds configuration, extensions, images, and uploads. If either side disappears, the whole wiki can become a ghost town.
1. Snapshot the Database First – It’s the Heartbeat
1.1 Pick the Right Tool for Your DB Engine
- MySQL / MariaDB – `mysqldump` is the workhorse. Use `--single-transaction` for InnoDB tables to avoid locking the wiki during the dump.
- PostgreSQL – `pg_dump` does the trick; add `--no-owner` if you plan to restore on a different server.
- SQLite – the entire `.sqlite` file is your dump, but you still want to lock it with `sqlite3 yourwiki.db ".backup /tmp/backup.sqlite"` to guarantee consistency.
1.2 Make the Wiki Read‑Only (Temporarily)
Before you start the dump, set $wgReadOnly = 'Backup in progress – please stand by'; in LocalSettings.php. It’s a tiny inconvenience for editors, but it guarantees the data you pull is not mid‑write. If you can’t edit the config (shared hosting, for instance), schedule the dump during a low‑traffic window and accept the minimal risk.
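MediaWiki also supports `$wgReadOnlyFile`: point that setting at a file, and the wiki goes read-only whenever the file exists (its contents become the message shown to editors). That lets a backup script toggle read-only mode without editing LocalSettings.php at all. A minimal sketch – the lock path here is a throwaway demo stand-in for whatever path you actually configure:

```shell
#!/bin/sh
set -eu
# Demo path; in real use this matches the $wgReadOnlyFile setting in
# LocalSettings.php, e.g. $wgReadOnlyFile = "$IP/readonly.txt";
LOCK="$(mktemp -d)/readonly.txt"
echo 'Backup in progress - please stand by' > "$LOCK"   # wiki is now read-only
# ... run mysqldump / rsync here ...
rm -f "$LOCK"                                           # wiki is writable again
```

The nice property is that a crashed script can be recovered by simply deleting the lock file – no config edit to undo.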
1.3 Automate with Cron or Systemd
Don’t rely on “I’ll remember tomorrow”. A typical cron line for a nightly MySQL dump looks like this:
0 2 * * * /usr/bin/mysqldump -u wikiuser -p"${MYSQL_PASS}" wiki_db | gzip > /var/backups/mediawiki/wiki_$(date +\%F).sql.gz
(One line – crontab entries can't be continued with a backslash. Define MYSQL_PASS at the top of the crontab or, better, use a .my.cnf file, since cron won't inherit your shell environment. Note the escaped \%F: a literal % has special meaning in crontab.)
Or, if you prefer Systemd timers (they’re a bit more flexible):
[Unit]
Description=MediaWiki database backup
[Service]
Type=oneshot
ExecStart=/bin/sh -c 'mysqldump -u wikiuser -p"$MYSQL_PASS" wiki_db | gzip > /var/backups/mediawiki/wiki_$(date +%F).sql.gz'
Two details a bare unit file hides: systemd does not run ExecStart through a shell, so the pipe needs the /bin/sh -c wrapper (and MYSQL_PASS must be supplied via an Environment= or EnvironmentFile= directive in the [Service] section). And the schedule lives in a companion .timer unit – give it an OnCalendar= line and WantedBy=timers.target in its [Install] section, then enable the timer, not the service.
2. Preserve the File System – Images, Extensions, Configs
2.1 What Exactly Needs Backing Up?
- `LocalSettings.php` – your wiki’s brain.
- `images/` folder – all uploaded files, including deleted ones (they sit in `images/deleted/`).
- `extensions/` and `skins/` – custom code that isn’t part of a core upgrade.
- Cache directories (if you use `$wgCacheDirectory`) – not critical, but they speed up restores.
2.2 Rsync Is Your Friend
Rsync can mirror the whole MediaWiki directory while preserving permissions. A typical one‑liner:
rsync -avz --delete /var/www/mediawiki/ backup@backupserver:/backups/mediawiki/$(date +%F)/
Notice the --delete flag – it keeps the backup clean by removing files that no longer exist on the source. If you prefer a simple tarball, remember to exclude cache/ and any temporary directories to keep the archive slim.
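The tarball variant can be sketched like this – run here against a throwaway demo tree, with paths you would swap for your real install:

```shell
#!/bin/sh
set -eu
# Build a throwaway stand-in for /var/www/mediawiki so the sketch is self-contained.
WIKI_DIR="$(mktemp -d)/mediawiki"
mkdir -p "$WIKI_DIR/images" "$WIKI_DIR/cache"
echo '<?php' > "$WIKI_DIR/LocalSettings.php"
echo 'tmp'   > "$WIKI_DIR/cache/obj.tmp"
OUT="$(mktemp -u).tar.gz"
# The actual pattern: archive the tree, leaving cache out.
tar -czf "$OUT" --exclude='cache' -C "$(dirname "$WIKI_DIR")" mediawiki
tar -tzf "$OUT"   # lists config and images, but nothing under cache/
```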
2.3 Cloud Object Stores for Large Media
If your wiki hosts thousands of high‑resolution images, consider pushing the images/ folder to S3, Wasabi, or Backblaze B2. Tools like rclone make it painless:
rclone sync /var/www/mediawiki/images remote:mediawiki-images --progress
This also gives you geographic redundancy – a nice safety net if your primary host goes down.
3. Combine the Two: Full‑Stack Backup Strategies
3.1 The “All‑in‑One” Script
Many admins like to wrap the database dump and rsync into a single Bash script. Here’s a trimmed‑down version (feel free to add logging or email alerts):
#!/bin/bash
# MediaWiki backup – simple but effective
# 1. Put wiki in read-only mode (append the flag; the marker makes it easy to strip later)
echo "\$wgReadOnly = 'Backup in progress'; # backup-lock" >> /var/www/mediawiki/LocalSettings.php
# 2. Dump the database
mysqldump -u wikiuser -p"$MYSQL_PASS" wiki_db | gzip > /backups/wiki_$(date +%F).sql.gz
# 3. Sync files
rsync -avz --delete /var/www/mediawiki/ /backups/mediawiki_$(date +%F)/
# 4. Remove read-only flag
sed -i '/# backup-lock$/d' /var/www/mediawiki/LocalSettings.php
Run it as root or a dedicated backup user, schedule it with cron, and you have a decent baseline.
3.2 Version‑Control Your Code, Not Your Data
Put LocalSettings.php (sans passwords) and any custom extensions into a Git repo. That way, a fresh server can be spun up with git clone and a simple composer install. The actual data – database and uploads – still lives in your backup archives.
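One way to keep the passwords out of the repo is to split them into a separate include that git ignores. The LocalSettings.secrets.php name is my own convention, not a MediaWiki standard – a sketch, run in a throwaway demo repo:

```shell
#!/bin/sh
set -eu
cd "$(mktemp -d)"          # demo repo; in practice this is your wiki root
git init -q .
# Keep secrets and bulky data out of version control:
printf '%s\n' 'LocalSettings.secrets.php' 'images/' 'cache/' > .gitignore
echo "<?php require __DIR__ . '/LocalSettings.secrets.php';" > LocalSettings.php
echo "<?php \$wgDBpassword = 'changeme';" > LocalSettings.secrets.php
git add .
git ls-files               # tracks .gitignore and LocalSettings.php, not the secrets
```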
4. Testing Restores – The Step Most People Skip
If you never try to restore a backup, you don’t really know if it works. Set up a throw‑away VM, restore the latest dump, point the web server at it, and verify:
- All recent pages appear.
- Images load correctly.
- Extensions initialize without fatal errors.
A quick smoke test is better than discovering corruption weeks later when the wiki is already missing critical content.
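Even before a full restore drill, a cheap sanity check catches truncated dumps: mysqldump normally ends its output with a "-- Dump completed" comment (unless comments are disabled), so you can verify both the gzip integrity and that marker. A sketch, using a fabricated demo dump in place of your real archive:

```shell
#!/bin/sh
set -eu
# Fabricate a demo dump; point DUMP at your real wiki_YYYY-MM-DD.sql.gz instead.
DUMP="$(mktemp -u).sql.gz"
printf '%s\n' 'CREATE TABLE page (page_id INT);' '-- Dump completed on 2024-01-01' \
  | gzip > "$DUMP"
gzip -t "$DUMP"                                    # archive is intact
gunzip -c "$DUMP" | tail -n 1 | grep -q '^-- Dump completed'
echo 'dump looks sane'
```

This is no substitute for an actual restore, but it is cheap enough to run after every nightly backup.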
5. Retention, Encryption, and Off‑Site Storage
Backups are only as good as the place you keep them. A solid policy looks like this:
- Retention – keep daily backups for a week, weekly for a month, monthly for a year. Adjust based on how fast your wiki changes.
- Encryption – pipe the dump through `gpg --symmetric --cipher-algo AES256` before moving it off-site.
- Off-Site – a second copy in a different data center, or a cheap cloud bucket. Even a USB drive stored at a colleague’s desk can be a lifesaver.
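The daily tier of that retention policy is easy to automate with find. A sketch, run here against a demo directory rather than your real backup path:

```shell
#!/bin/sh
set -eu
BACKUP_DIR="$(mktemp -d)"   # demo; in real use, /var/backups/mediawiki
# Simulate one stale and one fresh dump:
touch -d '10 days ago' "$BACKUP_DIR/wiki_2024-01-01.sql.gz"
touch "$BACKUP_DIR/wiki_today.sql.gz"
# Drop daily dumps older than 7 days (weekly/monthly copies live elsewhere):
find "$BACKUP_DIR" -name 'wiki_*.sql.gz' -mtime +7 -delete
ls "$BACKUP_DIR"            # only the fresh dump remains
```

The weekly and monthly tiers work the same way with longer -mtime thresholds, ideally in separate directories so one find rule can't eat another tier's copies.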
6. Common Pitfalls and How to Dodge Them
Here are a few “gotchas” that keep popping up in forum threads:
- Missing `--single-transaction` on InnoDB tables. Without it, the dump locks tables and can cause timeouts for busy wikis.
- Forgetting to back up the `images/deleted` folder. Users think deleted images are gone forever, but they’re still needed for history exports.
- Hard-coding passwords in scripts. Use environment variables or a `.my.cnf` file with restricted permissions.
- Skipping the `LocalSettings.php` change. A tiny read-only flag can prevent subtle corruption that’s hard to trace later.
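For the password pitfall, a restricted .my.cnf keeps the credential out of both the script and `ps` output. A sketch that writes into a throwaway HOME (the values are placeholders):

```shell
#!/bin/sh
set -eu
export HOME="$(mktemp -d)"   # demo only; really this is the backup user's home
cat > "$HOME/.my.cnf" <<'EOF'
[client]
user=wikiuser
password=changeme
EOF
chmod 600 "$HOME/.my.cnf"
# mysqldump now reads credentials from ~/.my.cnf, so the cron line shrinks to:
#   mysqldump wiki_db | gzip > /var/backups/mediawiki/wiki_$(date +%F).sql.gz
```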
7. A Quick Checklist for the Busy Admin
When you’re juggling a dozen tasks, a checklist can be a lifesaver:
- [ ] Enable read‑only mode in LocalSettings.php before each dump
- [ ] Run mysqldump/pg_dump with appropriate flags
- [ ] Compress and encrypt the SQL file
- [ ] Rsync or tar the MediaWiki directory (exclude cache)
- [ ] Sync images to cloud storage if > 10 GB
- [ ] Push code changes to Git
- [ ] Rotate old backups according to retention policy
- [ ] Verify restore on a test VM at least monthly
Final Thoughts
Backing up a MediaWiki installation isn’t glamorous, but it’s the safety net that keeps a community’s knowledge from vanishing overnight. By treating the database and file system as separate, yet equally critical, pieces and automating the whole flow, you’ll sleep easier – knowing that even if the server hiccups, your wiki can be resurrected with a few commands.
And remember: the best backup plan is the one you actually run, test, and update as your wiki grows. A little extra effort now saves a mountain of grief later.