Syncing a Magento instance from production to development



Submitted by markoshust Thu, 09/08/2011 - 15:19

UPDATE! Code moved to bitbucket:

Oftentimes, you need to pull down an up-to-date version of Magento from your production server to your development or staging servers. This involves:

  • Executing a mysqldump on production.
  • Zipping it up.
  • Secure copying the zipped file from production to development (or staging).
  • Unzipping the file locally.
  • Updating the base URLs in the core_config_data table to your local URLs.
  • Executing the SQL into your local database instance.
  • Deleting all the temp files from your production and development machines.
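The steps above can be sketched end to end as shell commands. In this sketch the remote transfer is simulated with cp in place of scp, and a dummy one-row dump stands in for the real mysqldump output, so everything here is illustrative (hostnames, filenames, and URLs are made up):

```shell
# Work in a scratch directory with a dummy "production" dump (illustrative data)
mkdir -p /tmp/magento-sync/prod /tmp/magento-sync/dev
cd /tmp/magento-sync/prod
printf "INSERT INTO core_config_data VALUES (1,'default',0,'web/unsecure/base_url','https://www.example.com/');\n" > shop.tmp.sql

# 1-2. Dump (simulated above), then zip it up
tar -czf shop.tmp.sql.tar.gz shop.tmp.sql

# 3. "Secure copy" to the dev machine (cp stands in for scp here)
cp shop.tmp.sql.tar.gz ../dev/

# 4. Unzip locally
cd ../dev
tar -xzf shop.tmp.sql.tar.gz

# 5. Rewrite the base URLs for the local environment
sed "s|https://www.example.com/|http://local.example.test/|g" shop.tmp.sql > shop.tmp.2.sql

# 6. Loading into the local database would then be:
#    mysql -u user -ppass localdb < shop.tmp.2.sql

# 7. Clean up temp files on both sides, keeping only the rewritten dump
rm ../prod/shop.tmp.sql ../prod/shop.tmp.sql.tar.gz shop.tmp.sql shop.tmp.sql.tar.gz
cat shop.tmp.2.sql
```

The real script below does exactly this, with ssh/scp doing the work that cp fakes here.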

What a pain. Wouldn't it be nice if you just executed a script that did all of this for you?


#!/bin/bash

# Update these constants for your environment (values below are placeholders)
REMOTE_HOST="user@example.com"
REMOTE_MYSQL_USER="remote_user"
REMOTE_MYSQL_PASS="remote_pass"
REMOTE_MYSQL_DB="remote_db"
LOCAL_MYSQL_USER="local_user"
LOCAL_MYSQL_PASS="local_pass"
LOCAL_MYSQL_DB="local_db"

if [[ `ssh $REMOTE_HOST 'test -e ~/'$REMOTE_MYSQL_DB'.tmp.sql && echo exists'` == *exists* ]]; then
  echo "Backup is currently being executed by another process. Please try again in a few moments."
  exit 1
fi

echo "Creating backup of production database"
ssh $REMOTE_HOST 'mysqldump -u '$REMOTE_MYSQL_USER' -p'$REMOTE_MYSQL_PASS' '$REMOTE_MYSQL_DB' > ~/'$REMOTE_MYSQL_DB'.tmp.sql' &> /dev/null
ssh $REMOTE_HOST 'tar -czf '$REMOTE_MYSQL_DB'.tmp.sql.tar.gz '$REMOTE_MYSQL_DB'.tmp.sql' &> /dev/null

echo "Transferring database backup to localhost"
scp $REMOTE_HOST:~/$REMOTE_MYSQL_DB.tmp.sql.tar.gz ~/
ssh $REMOTE_HOST 'rm ~/'$REMOTE_MYSQL_DB'.tmp*'

echo "Extracting backup"
tar -xzf ~/$REMOTE_MYSQL_DB.tmp.sql.tar.gz -C ~/

echo "Updating config"
sed "s/https/http/g" ~/$REMOTE_MYSQL_DB.tmp.sql > ~/$REMOTE_MYSQL_DB.tmp.2.sql

echo "Reloading localhost database (may take a few minutes)"
mysql -u $LOCAL_MYSQL_USER -p$LOCAL_MYSQL_PASS $LOCAL_MYSQL_DB < ~/$REMOTE_MYSQL_DB.tmp.2.sql &> /dev/null

# Clean local temp files
rm ~/$REMOTE_MYSQL_DB.tmp*

echo "Complete!"

You're welcome :)

Just save the file somewhere on your Mac or Linux machine (no love for Windows people, sorry; come on and get a Mac already), update the constants at the top of the file, make it executable (chmod 777), and execute it. You can easily set this up on a cron for automated syncing of a production database down to staging.
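For the cron approach, an entry along these lines would run the sync nightly (the script path and schedule are placeholders; add it with crontab -e):

```shell
# m h dom mon dow  command
# Run the production-to-staging sync every night at 2:30 AM
30 2 * * * /home/user/magento-sync.sh > /dev/null 2>&1
```

Note that an unattended cron run requires key-based SSH auth to the production host, since there is no terminal to type a password into.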

NOTE: Updated NOV-4-2011 with more configurability between remote and local mysql database names, usernames, passwords, etc.

NOTE: Updated NOV-18-2011 with multi-store support on the same root domain, and nixing https from localhost.
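To sketch what the multi-store rewrite looks like, a single sed pass can both drop https and map each store's production subdomain onto a local root domain. The store codes, domains, and rows below are made up for illustration:

```shell
# Dummy core_config_data rows for two stores on one root domain (illustrative)
cat > /tmp/stores.sql <<'EOF'
INSERT INTO core_config_data VALUES (1,'stores',1,'web/unsecure/base_url','https://store1.example.com/');
INSERT INTO core_config_data VALUES (2,'stores',2,'web/unsecure/base_url','https://store2.example.com/');
EOF

# Replace https with http, and the production root domain with a local one;
# the subdomain part of each store URL is left untouched
sed -e "s|https|http|g" -e "s|example\.com|example.test|g" /tmp/stores.sql > /tmp/stores.local.sql
cat /tmp/stores.local.sql
```

Because only the shared root domain is rewritten, every store on that domain is handled by the one expression.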

NOTE: Updated AUG-13-2012 with db hostname configurability. Also, moved to bitbucket:


Haven't tried it out yet but all the same... this is very useful. Thanks for posting.

No problem, hopefully someone finds a use for this. Have this working on a few sites and it's all dandy. Multi-store setups will require some additional scripting.

I am running this for the first time. It would be nice if it also ran without SSH when both instances are on the same server, since the SSH connection isn't necessary and I had to enter my password five times during the execution of this script. I have zero Bash scripting experience but will likely figure out the few modifications necessary to make it work locally without SSH. Thank you for the script.

You are free to modify the script as necessary, but the easiest way would be to keep the script all the same and just ssh into your local machine: username@localhost
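In other words, only the host constant at the top of the script changes (the username here is a placeholder):

```shell
# Point the "remote" host at the local machine; ssh/scp then loop back
# to this same box. Setting up key-based auth (ssh-copy-id username@localhost)
# avoids the repeated password prompts mentioned above.
REMOTE_HOST="username@localhost"
echo "$REMOTE_HOST"
```

This keeps the script's logic untouched at the cost of shuttling the dump through a local ssh/scp round trip.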

I did have to update your script to make it work when both are on the localhost. I commented out the lines that transfer, tar, and untar the database dump, as they are unnecessary in this case and actually break the script: when the file is removed from the 'remote' host, it is actually removed from the localhost as well. The error with the script as it was:

Extracting backup
tar: /home/_____/____.tmp.sql.tar.gz: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error exit delayed from previous errors
Updating config
sed: can't read /home/____/____.tmp.sql: No such file or directory

When I commented out the following, it worked:

#ssh $REMOTE_HOST 'tar -czf '$REMOTE_MYSQL_DB'.tmp.sql.tar.gz '$REMOTE_MYSQL_DB'.tmp.sql' &> /dev/null
#echo "Transferring database backup to localhost"
#scp $REMOTE_HOST:~/$REMOTE_MYSQL_DB.tmp.sql.tar.gz ~/
#ssh $REMOTE_HOST 'rm ~/'$REMOTE_MYSQL_DB'.tmp*'
#echo "Extracting backup"
#tar -xzf ~/$REMOTE_MYSQL_DB.tmp.sql.tar.gz -C ~/
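Instead of commenting lines out by hand, another option is to have the script branch on whether the "remote" host is actually this machine and skip the transfer steps itself. A sketch, reusing the script's own variable name (the host value is a placeholder):

```shell
REMOTE_HOST="user@localhost"   # placeholder; set as in the script above

# Detect a same-machine "remote" so the tar/scp/untar steps can be skipped
if [[ $REMOTE_HOST == *localhost* || $REMOTE_HOST == *127.0.0.1* ]]; then
  SAME_HOST=1
else
  SAME_HOST=0
fi

if [[ $SAME_HOST -eq 1 ]]; then
  echo "Same host: skipping tar/scp/untar, dump is already in ~/"
  # mysqldump straight to ~/, then the sed + mysql steps as in the original
else
  echo "Remote host: running the full tar/scp/untar transfer"
fi
```

This keeps one script usable for both layouts rather than maintaining a locally patched copy.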

Updated the script to conform to multiple store setups on the same root domain (e.g., multiple store subdomains under one domain). Also got rid of https in the mysql import script, as most, if not all, localhost installs don't have (and don't need) SSL support.

I found this post looking for tips on how to run mysqldump on a production Magento instance. I believe that at our scale, running mysqldump hammers the database for 2-4 minutes, and can either take the store offline or at least massively impact response times. Presumably your script applies to situations where the dump file is small and the impact of running it minor. Our dump output is ~1.5G, so piping that over SSH seems like a bad plan. Better to dump it onto disk locally and then transfer by resumable means. Thanks for sharing your script, it looks useful to many people with small stores.

The script doesn't pipe a mysqldump over ssh. It ssh's into the box, runs a mysqldump, then exits back to a local prompt and scp's the file over. If a mysqldump is crashing your server, you most likely should be using a MySQL snapshot/replication setup with partial data syncs, something completely past the technicality of this article :)

Hi! Thanks for sharing the script, I didn't know how easy executing commands on a remote host via SSH could be :) How do you usually solve the opposite task, i.e. syncing changes from staging to production? File sync is an easy task, but the database... it's necessary to merge new settings into the existing database. Do you have a good solution for this? Thanks, Andrey

Hi Andrey, that's a good question. As far as I know, there isn't a good approach for syncing from staging to production.