AUTO BACKUP DATA TO GOOGLE DRIVE WITH RCLONE - LINUX

Hello everyone, continuing the Linux topic, today I'd like to share a very useful tool for backing up data and the system on Linux: Rclone.
WHAT IS RCLONE?
Rclone is an effective solution for synchronizing and backing up data to the cloud, so your data is backed up periodically and kept safe.
The advantages of using a cloud storage service are high speed (thanks to servers located around the world), data safety (no worries about hardware or network problems) and, best of all, generous free tiers. Rclone supports many popular cloud services, such as:
Google Drive
Amazon S3
Openstack Swift / Rackspace cloud files / Memset Memstore
Dropbox
Google Cloud Storage
Amazon Drive
Microsoft OneDrive
Hubic
Backblaze B2
Yandex Disk
SFTP
The local filesystem
Instead of backing up to another VPS for storage, I switched to Google Drive: 15GB of free capacity, and buying more is quite cheap, only about 45,000 VND/month for 100GB. If you have a free Google Apps account, even better.
INSTALL RCLONE
Install the latest version on a 64-bit Linux operating system:
cd /root/
wget https://downloads.rclone.org/rclone-current-linux-amd64.zip
unzip rclone-current-linux-amd64.zip
\cp rclone-v*-linux-amd64/rclone /usr/sbin/ # \cp bypasses any cp alias so it overwrites without prompting
rm -rf rclone-*
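To verify that the installation worked, check the version (exact output will vary by release):
rclone version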
Commonly used Rclone commands:
rclone config – Configure the connection to the cloud service.
rclone copy – Copy files from server to cloud, skip if data already exists.
rclone sync – Synchronize from server to cloud; only the cloud side is changed, and files deleted on the server are also removed from the cloud.
rclone move – Move files from server to cloud.
rclone delete – Delete the files in the path, keeping the directory structure.
rclone purge – Delete the path together with all of its contents.
rclone mkdir – Create folder.
rclone rmdir – Delete empty folder at path.
rclone rmdirs – Delete all empty folders under the path.
Time durations (used with flags such as --min-age and --max-age; see the example after this list) accept these suffixes:
ms – Milliseconds
s – Seconds
m – Minutes
h – Hours
d – Days
w – Weeks
M – Months
y – Years
rclone check – Check whether the server and cloud data are in sync.
rclone ls – List all data including size and path.
rclone lsd – List only the directories.
rclone lsl – List all data including modification time, size and path.
rclone size – Returns the directory size.
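As a quick illustration of how these commands and the duration suffixes fit together, here is a small sketch; the remote name backupdaily and the paths are assumptions that match the setup later in this article:
# copy a local folder to the cloud, skipping files that already exist there
rclone copy /data-backup backupdaily:/localhost/
# list the uploaded files with modification time, size and path
rclone lsl backupdaily:/localhost/
# delete cloud files older than 7 days (the d suffix from the list above)
rclone delete --min-age 7d backupdaily:/localhost/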
See https://rclone.org/commands/ for the full list of commands.
Create Backup to Cloud
Now run the rclone config command and follow the steps below.
[root@vps1 ~]# rclone config
2020/01/12 14:08:20 NOTICE: Config file "/root/.config/rclone/rclone.conf" not found - using defaults
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> backupdaily
Here, choose n to create a new remote and name it backupdaily.
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / 1Fichier
\ "fichier"
2 / Alias for an existing remote
\ "alias"
3 / Amazon Drive
\ "amazon cloud drive"
4 / Amazon S3 Compliant Storage Providers including AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, SeaweedFS, and Tencent COS
\ "s3"
5 / Backblaze B2
\ "b2"
6 / Box
\ "box"
7 / Cache a remote
\ "cache"
8 / Citrix Sharefile
\ "sharefile"
9 / Compress a remote
\ "compress"
10 / Dropbox
\ "dropbox"
11 / Encrypt/Decrypt a remote
\ "crypt"
12 / Enterprise File Fabric
\ "filefabric"
13 / FTP Connection
\ "ftp"
14 / Google Cloud Storage (this is not Google Drive)
\ "google cloud storage"
15 / Google Drive
\ "drive"
Storage> 15
I am backing up to Google Drive, so I choose 15 (the exact number can differ between rclone versions).
** See help for drive backend at: https://rclone.org/drive/ **
Google Application Client Id
Setting your own is recommended.
See https://rclone.org/drive/#making-your-own-client-id for how to create your own.
If you leave this blank, it will use an internal key which is low performance.
Enter a string value. Press Enter for the default ("").
client_id>
Google Application Client Secret
Setting your own is recommended.
Enter a string value. Press Enter for the default ("").
client_secret>
Scope that rclone should use when requesting access from drive.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / Full access all files, excluding Application Data Folder.
\ "drive"
2 / Read-only access to file metadata and file contents.
\ "drive.readonly"
/ Access to files created by rclone only.
3 | These are visible in the drive website.
| File authorization is revoked when the user deauthorizes the app.
\ "drive.file"
/ Allows read and write access to the Application Data folder.
4 | This is not visible in the drive website.
\ "drive.appfolder"
/ Allows read-only access to file metadata but
5 | does not allow any access to read or download file content.
\ "drive.metadata.readonly"
scope> 1
Choose 1 here for full access.
ID of the root folder
Leave blank normally.
Fill in to access "Computers" folders (see docs), or for rclone to use
a non root folder as its starting point.
Note that if this is blank, the first time rclone runs it will fill it
in with the ID of the root folder.
Enter a string value. Press Enter for the default ("").
root_folder_id>
Do not enter anything; press Enter to use the default.
Service Account Credentials JSON file path
Leave blank normally.
Needed only if you want use SA instead of interactive login.
Enter a string value. Press Enter for the default ("").
service_account_file>
Do not enter anything; press Enter to use the default.
Edit advanced config? (y/n)
y) Yes
n) No
y/n> n
Select n; no advanced customization is needed for basic use.
Remote config
Use auto config?
* Say Y if not sure
* Say N if you are working on a remote or headless machine
y) Yes
n) No
y/n> n
Choose n, since we are configuring on a remote/headless VPS.
If your browser doesn't open automatically go to the following link: https://accounts.google.com/o/oauth2/auth?access_type=offline&client_id=...
Log in and authorize rclone for access
Enter verification code> 4/vQEo6HRbnnZb98tetr0ZpNb28a6sf8e84fHfT65wfBTz8xvbKUAE6EI
Configure this as a Shared Drive (Team Drive)?
y) Yes
n) No (default)
y/n> n
Select n since we are not using a Shared Drive (Team Drive).
--------------------
[backupdaily]
type = drive
scope = drive
token = {"access_token":"ya29.a0ARrdaM_z8fvjB4QmGeNmNyAOgjGtRj773tnXBZMa5AiT47npTjQEBXgNxv4B8HhNEO6ADh6PEX8kTjQ_J2pWdzgYpCSLpOJDxq8EJ5DNkgP3i85YyJiu476fL6ImPzmoR1LhC9A3xl6zsTqo5aYSWMBruZPW","token_type":"Bearer","refresh_token":"1//0g6QqPuoh3HBaCgYIARAAGBASNwF-L9Ir9dhX4soUISXlchHljWi1cYF2ybXySMDW19pmWemjtj7SyCXQ6uGzh4PnbJZv4pU_KP0","expiry":"2020-01-12T15:21:11.384653782+07:00"}
team_drive =
--------------------
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y
Select Y to agree
Current remotes:
Name Type
==== ====
backupdaily drive
e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q
Select q to exit
root@vps:~#
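The configuration is saved in /root/.config/rclone/rclone.conf (the NOTICE at the start of rclone config showed this path). You can review it at any time with:
rclone config show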
That's it. Now you can test the remote connection by listing its directories:
rclone lsd backupdaily:
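A brand-new remote has nothing to list yet, so lsd may print nothing. Here is a quick round-trip test, where the folder name test-backup is just an example:
rclone mkdir backupdaily:test-backup
rclone lsd backupdaily:
rclone rmdir backupdaily:test-backup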
Create a Backup Script to the Cloud
nano /root/backup.sh
#!/bin/bash
cwd=$(pwd)
SERVER_NAME=localhost
REMOTE_NAME=backupdaily # must match the remote name created with rclone config
DATE=`date +%Y-%m-%d`
TIMESTAMP=$(date +%F)
BAK_DIR=/data-backup
BACKUP_DIR=${BAK_DIR}/${TIMESTAMP}
MYSQL_USER="root"
MYSQL=/usr/bin/mysql
MYSQL_PASSWORD=Admin@123
Mysqldump=/usr/bin/mysqldump
rclone=/usr/sbin/rclone
SECONDS=0
mkdir -p ${BAK_DIR}/logs # make sure the log directory exists before redirecting
exec >${BAK_DIR}/logs/${DATE}.log
exec 2>&1
mkdir -p "$BACKUP_DIR/mysql"
echo "Starting Backup Database";
databases=`$MYSQL -u $MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW databases;" | grep -Ev "(Database|information_schema|performance_schema|mysql|sys)"`
for db in $databases; do
echo ${db}
$Mysqldump -u $MYSQL_USER -p$MYSQL_PASSWORD --databases $db --quick --lock-tables=false | gzip > "$BACKUP_DIR/mysql/$db.gz"
done
echo "Finished";
echo '';
echo "Starting Backup Website";
mkdir -p $BACKUP_DIR/data
echo "-" /var/www/html
zip -r -y -q $BACKUP_DIR/data/$TIMESTAMP.zip /var/www/html/
echo "Finished";
echo '';
echo "Starting compress file";
size1=$(du -sh ${BACKUP_DIR} | awk '{print $1}')
cd ${BAK_DIR}
tar -czf ${TIMESTAMP}".tgz" $TIMESTAMP
cd $cwd
size2=$(du -sh ${BACKUP_DIR}.tgz | awk '{print $1}')
rm -rf ${BACKUP_DIR}
echo "File compress from "$size1" to "$size2
echo "Finished";
echo '';
echo "Starting Backup Uploading";
$rclone copy ${BACKUP_DIR}.tgz "$REMOTE_NAME:/$SERVER_NAME/"
#$rclone -q delete --min-age 7d "$REMOTE_NAME:/$SERVER_NAME" # uncomment to remove cloud backups older than 1 week
find ${BAK_DIR} -mindepth 1 -mtime +6 -delete # remove local backups older than 7 days
echo "Finished";
echo '';
duration=$SECONDS
echo "Total $size2, $(($duration/60)) minutes and $(($duration%60)) seconds elapsed."
chmod +x /root/backup.sh
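Before scheduling the script, run it once manually and check the log (the log path follows the BAK_DIR and DATE variables in the script):
/root/backup.sh
cat /data-backup/logs/$(date +%Y-%m-%d).log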
Try checking the Cloud for the new folder with the backup data:
rclone lsl backupdaily:localhost
Create an automatic cronjob for daily backups
crontab -e
0 2 * * * /root/backup.sh > /dev/null 2>&1
With this schedule, the backup to Google Drive will run automatically every day at exactly 2 AM.
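If you prefer to keep cron's own output instead of discarding it, you can redirect it to a log file; the path /var/log/backup-cron.log below is just an example:
0 2 * * * /root/backup.sh >> /var/log/backup-cron.log 2>&1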
Download backup files from Cloud to VPS
The easiest way to restore is to download the backup file from the Cloud to your computer and then upload it back to the VPS as needed. However, if you want to download the backup file directly to the VPS, you can use Rclone with the copy command.
rclone copy "remote:/backupdaily/2021-01-12" /var/www/html/
The above command will copy the folder in the backupdaily folder on the Cloud to your 2021-01-12folder/var/www/html/ in the backupdaily folder on the Cloud to the VPS
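From there you can extract the archive and restore. A minimal sketch, where the database name mydb and the staging folder /tmp/restore are placeholders you adjust to your setup:
cd /root
tar -xzf 2021-01-12.tgz
# the archive contains mysql/*.gz dumps and data/2021-01-12.zip with the web files
unzip -q 2021-01-12/data/2021-01-12.zip -d /tmp/restore/
# the dumps were made with --databases, so they already contain CREATE DATABASE and USE
gunzip < 2021-01-12/mysql/mydb.gz | mysql -u root -p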