
Create a cron job to back up a database daily



Hi guys

 

I have been trying to create a daily backup of a MySQL DB on a Linux CentOS server.

 

I have been using the following cron syntax, but have had no results. Any suggestions?

 

What I do is put the following in a text file, then on the command line I type

 

crontab /cronfile.txt

15 2 * * * mysqldump --user root --password="password" --opt database_name > /home/myrefco/db/db-`date +%Y-%m-%d`.sql
#

 

Thank you

Put the mysqldump command in a shell script file instead of directly in the crontab. chmod the script to 777, like:

 

chmod 777 /var/myrefco/backup.sh
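
For reference, the script file itself can be as simple as the mysqldump command from the first post with a shebang line on top. A rough sketch (the path /var/myrefco/backup.sh and the database details are just the examples already used in this thread):

#!/bin/bash
# dump the database to a date-stamped file
mysqldump --user root --password="password" --opt database_name > /home/myrefco/db/db-`date +%Y-%m-%d`.sql

Save that as /var/myrefco/backup.sh with a text editor such as vi or nano.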

 

then use

 

crontab -e

 

to edit the crontab and add the script to run. Set the cron timings as normal and use just this as the command:

 

/var/myrefco/backup.sh
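
So the full crontab line would look something like this (keeping the 2:15 AM timing from the original post):

15 2 * * * /var/myrefco/backup.sh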

 

Save the crontab using :wq (the save-and-quit command in vi, which is usually the default crontab editor).

 

Then restart the cron daemon; I'm not sure if this is correct for CentOS:

 

/etc/init.d/crontab restart
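
(For what it's worth, on CentOS the cron service is normally named crond rather than crontab, so the equivalent would likely be /etc/init.d/crond restart or service crond restart, though changes made with crontab -e are usually picked up automatically without a restart.)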

Hi mate

 

thank you for your reply,

 

I am 100% new to Linux and the command line.

 

I have no idea how to do the 1st step (put the mysqldump command in a shell script file instead of directly in the crontab).

 

And I don't really know too much about crontabs, so as for editing it and making it work with backup.sh, I have no idea how to do all this :S

You don't need to make a separate script, just make sure you supply the full path to mysqldump, or set the PATH variable.

 

15 2 * * * /usr/bin/mysqldump --user root --password="password" --opt database_name > /home/myrefco/db/db-`date +%Y-%m-%d`.sql
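
If you'd rather set the PATH variable instead, you can put it at the top of the crontab and keep the plain command name. Something like this (a sketch, assuming mysqldump lives in /usr/bin as on a stock CentOS install):

PATH=/usr/local/bin:/usr/bin:/bin
15 2 * * * mysqldump --user root --password="password" --opt database_name > /home/myrefco/db/db-`date +%Y-%m-%d`.sql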

Hey, tried the above but still no luck.

 

Here is what I did. I ran the command

/usr/bin/mysqldump --user root --password="password" --opt database_name > /home/myrefco/db/db-`date +%Y-%m-%d`.sql

in the terminal and that worked fine.

 

15 2 * * * /usr/bin/mysqldump --user root --password="password" --opt database_name > /home/myrefco/db/db-`date +%Y-%m-%d`.sql

But then I put the full crontab line in a txt file and ran crontab home/filename.txt.

 

The cron entry was created, because I could list it using crontab -l,

 

but no backup files are being created :(


If your system is able to send emails, you could edit your crontab file ('crontab -e') and put 'MAILTO=youremailaddress@example.com' before the line that tries to do the backup.

All output of that command will be mailed to the specified address (if mail is set up correctly). This way you can see error messages like 'wrong password', etc.
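
For example (the address is just a placeholder):

MAILTO=youremailaddress@example.com
15 2 * * * /usr/bin/mysqldump --user root --password="password" --opt database_name > /home/myrefco/db/db-`date +%Y-%m-%d`.sql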

  • 2 weeks later...


 

Putting the script in your crontab probably isn't working because the script was never marked as executable.  You need to run the ls -l command on the file to see what the permissions are.  If you don't know how to read the output then see these two entries:

 

http://en.wikipedia.org/wiki/Ls

 

http://en.wikipedia.org/wiki/File_system_permissions#Notation_of_traditional_Unix_permissions

 

You'll probably need to run the chmod command to mark it as executable (either for your user, or for everyone).  The easiest way to do this is to simply run:

 

chmod +x your_file_name
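
To double-check, run ls -l on the file again afterwards; the permission string should now show the x bits (e.g. -rwxr-xr-x rather than -rw-r--r--).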

 

You should also put the interpreter that you want to be used at the top of your script.  For example, if you want it to be executed by the Korn shell you should put something like:

 

#!/bin/ksh

 

For BASH it would be:

 

#!/bin/bash

 

 

In any case, if you get the cron job working with your script then it's going to constantly create those files but never delete them.  Here's a little script that might help with that.  First it will determine today's date, and then it will back up your database with the date in the filename.  Finally it will look through the directory for files older than 7 days and delete them.  Thus, you always keep the most recent week of files.  If you want to keep 2 weeks' worth of files, change the argument to find from -mtime +7 to -mtime +14.

 

#!/bin/bash

backupDir=/home/myrefco/db

# back up the database to a date-stamped file
myDate=$(date '+%Y-%m-%d')
dbName=db-$myDate.sql
cd /tmp || exit 1
mysqldump -u root -ppassword database_name > "$dbName"
gzip "$dbName"
mv "$dbName.gz" "$backupDir"

# delete backup files older than 7 days (files only, never the directory itself)
find "$backupDir" -type f -mtime +7 -exec rm -f {} \;

 

 

I haven't thoroughly tested it, so you should before using it.  I'm pretty sure it will work, but run it by hand first, and maybe even add the -x option (#!/bin/bash -x) so that you can see exactly what it's doing as it runs.  Have fun.
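
Once it runs cleanly by hand, the crontab entry only needs to point at wherever you saved the script, for example (assuming it was saved as /home/myrefco/backup.sh and chmod +x'd as above):

15 2 * * * /home/myrefco/backup.sh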

 

 
