Question:
I have tried many scripts for database backups, but I couldn't get any of them to work. I want to back up my database every hour.
I added files to the /etc/cron.hourly/ folder and changed their permissions to 755, but they never ran.
Below is my pseudocode. I would be happy if you could write a script for this and tell me what else I have to do after adding the script file to the /etc/cron.hourly/ folder.
- Get the current date and store it in a variable:
  date=$(date +%d_%m_%Y_%H_%M_%S)
- Create a variable for the file name:
  filename="$date.gz"
- Dump my database like this:
  mysqldump --user=my_user --password=my_pass --default-character-set=utf8 my_database | gzip > "/var/www/vhosts/system/example.com/httpdocs/backups/$filename"
- Delete all files in the folder /var/www/vhosts/system/example.com/httpdocs/backups/ that are older than 8 days.
- Append this text to the file "/var/www/vhosts/system/example.com/httpdocs/backup_log.txt": Backup is created at $date
- Change the owner (chown) of the backup and log files from root to "my_user", because I want to open them from the "my_user" FTP account.
- I don't want an email after each cron run, so >/dev/null 2>&1 will be added to the cron entry.
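Note: a common reason scripts dropped into /etc/cron.hourly/ never run on Debian/Ubuntu systems is the file name itself: the run-parts helper that cron uses skips any file whose name contains a dot (for example backup.sh), so use only letters, digits, hyphens and underscores in the name. Assuming a Debian-based system, you can list the scripts cron would actually execute with:
run-parts --test /etc/cron.hourly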
Answer 1:
After hours and hours of work, I came up with the solution below. I'm pasting it here so that other people can benefit from it.
First, create a script file and give it executable permission:
# cd /etc/cron.daily/
# touch /etc/cron.daily/dbbackup-daily.sh
# chmod 755 /etc/cron.daily/dbbackup-daily.sh
# vi /etc/cron.daily/dbbackup-daily.sh
Then copy the following lines into the file with Shift+Ins:
#!/bin/sh
# Build a timestamped file name plus the paths used below
now="$(date +'%d_%m_%Y_%H_%M_%S')"
filename="db_backup_$now.gz"
backupfolder="/var/www/vhosts/example.com/httpdocs/backups"
fullpathbackupfile="$backupfolder/$filename"
logfile="$backupfolder/backup_log_$(date +'%Y_%m').txt"
echo "mysqldump started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
# Dump the database and compress it on the fly
mysqldump --user=mydbuser --password=mypass --default-character-set=utf8 mydatabase | gzip > "$fullpathbackupfile"
echo "mysqldump finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
# Hand the backup and the log file over to the FTP user
chown myuser "$fullpathbackupfile"
chown myuser "$logfile"
echo "file permission changed" >> "$logfile"
# Remove backups older than 8 days
find "$backupfolder" -name 'db_backup_*' -mtime +8 -exec rm {} \;
echo "old files deleted" >> "$logfile"
echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "*****************" >> "$logfile"
exit 0
Edit:
If you use InnoDB and the backup takes too much time, you can add the --single-transaction option to prevent locking. The mysqldump line then becomes:
mysqldump --user=mydbuser --password=mypass --default-character-set=utf8 --single-transaction mydatabase | gzip > "$fullpathbackupfile"
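One possible refinement, sketched here under the assumption that you can create a root-only option file such as /root/.mysqldump.cnf (the path is only an example): keeping the credentials in an option file keeps the password out of the process list and avoids the "password on the command line can be insecure" warning of newer MySQL versions.
# /root/.mysqldump.cnf, chmod 600
[mysqldump]
user=mydbuser
password=mypass
default-character-set=utf8
The mysqldump call then becomes (--defaults-extra-file must be the first option):
mysqldump --defaults-extra-file=/root/.mysqldump.cnf --single-transaction mydatabase | gzip > "$fullpathbackupfile"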
Answer 2:
Create a script similar to this:
#!/bin/sh -e
location=~/`date +%Y%m%d_%H%M%S`.db
mysqldump -u root --password=<your password> database_name > $location
gzip $location
Then you can edit the crontab of the user that the script is going to run as:
$> crontab -e
And append the entry
01 * * * * ~/script_path.sh
This will make it run on the first minute of every hour every day.
Then you just have to add your rotation of old backups (see the sketch below) and any other functionality, and you are good to go.
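A minimal sketch of that rotation step, assuming the gzipped dumps land in the home directory as in the script above and that, as in the question, anything older than 8 days should be removed:
find ~/ -maxdepth 1 -name '*.db.gz' -mtime +8 -exec rm -f {} \;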
Answer 3:
I had the same issue, but I managed to write a script. I hope this helps.
#!/bin/bash
# Database credentials
user="username"
password="password"
host="localhost"
db_name="dbname"
# Other options
backup_path="/DB/DB_Backup"
date=$(date +"%d-%b-%Y")
# Set default file permissions
umask 177
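# (A umask of 177 means newly created files get mode 600: readable and writable by the owner only)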
# Dump database into SQL file
mysqldump --user=$user --password=$password --host=$host $db_name >$backup_path/$db_name-$date.sql
# Delete backup files older than 30 days
find "$backup_path" -type f -mtime +30 -exec rm {} \;
# DB backup log
echo -e "$(date +'%d-%b-%y %r '):ALERT:Database has been backed up" >>/var/log/DB_Backup.log
Answer 4:
#!/bin/sh
#Procedures = For DB Backup
#Scheduled at : Every Day 22:00
v_path=/etc/database_jobs/db_backup
logfile_path=/etc/database_jobs
v_file_name=DB_Production
v_cnt=0
MAILTO="abc@as.in"
touch "$logfile_path/kaka_db_log.log"
#DB Backup
mysqldump -uusername -ppassword -h111.111.111.111 ddbname > $v_path/$v_file_name`date +%Y-%m-%d`.sql
if [ "$?" -eq 0 ]
then
v_cnt=`expr $v_cnt + 1`
mail -s "DB Backup has been done successfully" $MAILTO < $logfile_path/db_log.log
else
mail -s "Alert : kaka DB Backup has been failed" $MAILTO < $logfile_path/db_log.log
exit
fi
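Note that as written, the log file that gets mailed never receives any content, so the mail body will be empty. One way to fill it, sketched here as an assumption about what should be logged, is to append a timestamp and mysqldump's error output to that same file:
echo "backup started at `date`" >> "$logfile_path/kaka_db_log.log"
mysqldump -uusername -ppassword -h111.111.111.111 dbname > "$v_path/$v_file_name`date +%Y-%m-%d`.sql" 2>> "$logfile_path/kaka_db_log.log"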
Answer 5:
Here is my MySQL backup script for Ubuntu, in case it helps someone.
#!/bin/bash
# MySQL backup script
start_time="$(date -u +%s)"
now(){
date +%d-%B-%Y_%H-%M-%S
}
ip(){
/sbin/ifconfig eth0 2>/dev/null|awk '/inet addr:/ {print $2}'|sed 's/addr://'
}
filename="`now`".zip
backupfolder=/path/to/any/folder
fullpathbackupfile=$backupfolder/$filename
db_user=xxx
db_password=xxx
db_name=xxx
printf "\n\n"
printf "******************************\n"
printf "Started Automatic Mysql Backup\n"
printf "******************************\n"
printf "TIME: `now`\n"
printf "IP_ADDRESS: `ip` \n"
printf "DB_SERVER_NAME: DB-SERVER-1\n"
printf "%sBACKUP_FILE_PATH $fullpathbackupfile\n"
printf "Starting Mysql Dump \n"
mysqldump -u $db_user -p$db_password $db_name| pv | zip > $fullpathbackupfile
end_time="$(date -u +%s)"
elapsed=$(($end_time-$start_time))
printf "%sMysql Dump Completed In $elapsed seconds\n"
printf "******************************\n"
PS: Remember to install pv and zip on your Ubuntu machine:
sudo apt install pv
sudo apt install zip
Here is how I set the crontab (using crontab -e) on Ubuntu to run it every 6 hours:
0 */6 * * * sh /path/to/shfile/backup-mysql.sh >> /path/to/logs/backup-mysql.log 2>&1
The cool thing is that it creates a zip file, which is easy to unzip from anywhere.
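Restoring from such a backup is a one-liner as well; a sketch, with backup_file.zip, db_user and db_name as placeholders (-p makes mysql prompt for the password):
unzip -p /path/to/any/folder/backup_file.zip | mysql -u db_user -p db_name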
Answer 6:
#!/bin/bash
# Add your backup dir location, password, mysql location and mysqldump location
DATE=$(date +%d-%m-%Y)
BACKUP_DIR="/var/www/back"
MYSQL_USER="root"
MYSQL_PASSWORD=""
MYSQL='/usr/bin/mysql'
MYSQLDUMP='/usr/bin/mysqldump'
DB='demo'
#to empty the backup directory and delete all previous backups
rm -r $BACKUP_DIR/*
$MYSQLDUMP -u "$MYSQL_USER" --password="$MYSQL_PASSWORD" "$DB" | gzip -9 > "$BACKUP_DIR/$DB.$DATE.sql.gz"
#changing permissions of directory
chmod -R 777 $BACKUP_DIR
Answer 7:
You might consider this open-source tool, matiri (https://github.com/AAFC-MBB/matiri), a concurrent MySQL backup script that stores its metadata in Sqlite3. Features:
- Multi-server: multiple MySQL servers are supported, whether they are co-located on the same or on separate physical servers.
- Parallel: each database on the server is backed up separately, in parallel (concurrency is settable; default: 3).
- Compressed: each database backup is compressed.
- Checksummed: the SHA256 of each compressed backup file, and of the archive of all files, is stored.
- Archived: all database backups are tar'ed together into a single file.
- Recorded: backup information is stored in an Sqlite3 database.
Full disclosure: I am the original matiri author.
Answer 8:
As a DBA, you should schedule backups of your MySQL databases so that, if any issue occurs, you can recover them from a recent backup.
Here we use mysqldump to take the backup of the MySQL databases; you can put the same commands into a script.
[orahow@oradbdb DB_Backup]$ cat .backup_script.sh
#!/bin/bash
# Database credentials
user="root"
password="1Loginxx"
db_name="orahowdb"
v_cnt=0
logfile_path=/DB_Backup
touch "$logfile_path/orahowdb_backup.log"
# Other options
backup_path="/DB_Backup"
date=$(date +"%d-%b-%Y-%H-%M-%p")
# Set default file permissions
Continue Reading ....