Easy cPanel backups


droolingmnky

How many of you have automated backups of your sites? If you're one of the few who don't, and you use cPanel, then you're in luck.

There are two routes you can go. One is a set of scripts run by cron on your server that perform the necessary backups; the other is a bash script run by cron on a *nix box with wget. There is also a paid option from cpanelbackup.com, but I have never used them or even considered them because I'm a cheap ass.

Now, for the scripts stored locally on the host, you'll need an FTP server to send all the information to. Keep the scripts outside of the public_html and www directories; the last thing you want is someone getting access to them. This isn't a totally set-and-forget solution: you will have to check your backed-up accounts, remove the old full-backup files from the main directory, and remove the database backups from wherever the script is run. I didn't write these scripts, but they work, and they're a lot less hassle than manually logging into each account and entering all the information. A rough cleanup sketch follows this paragraph.
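If you want to semi-automate that cleanup, a cron-able one-liner on the backup box will do it. This is a minimal sketch: the /backups path and the 14-day window are assumptions, so adjust them to your setup, and test with -print in place of -delete before trusting it.
Code:
#!/bin/bash
# prune collected backup files older than 14 days
# /backups and the 14-day window are assumptions -- adjust to taste
find /backups -type f -name 'backup*' -mtime +14 -delete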

I don't have a *nix box available to test the bash script, so I will assume you still have to clean up after it.

Strong points of the locally hosted scripts
  • backups happen when you tell them to, even if your own machine loses power
  • the backups are stored locally on the host
  • if someone else reads the script, odds are good they already have your account information anyway
  • it's not operating-system dependent
Weaknesses
  • your account information is stored in plain text
  • if cron isn't set up right, you don't get a backup
  • if the FTP server information is wrong or the server is down, you don't get a remote backup
Bash script strengths
  • account information isn't stored locally on the host
  • there's only one file to edit, so it's easier to maintain and set up
  • it automatically sorts the backups into dated directories
  • no need for an FTP server
Weaknesses
  • if the box running the script goes down, no backup is performed
  • all the account information is stored in one plain-text file
  • again, it relies on cron to run
  • you need a bash shell and wget, or to be running cygwin
Quick guide to cron
minute hour day-of-the-month month day-of-the-week command
A * is a wildcard that matches anything, and the time fields use 24-hour time. If you need something more in-depth, try to follow this; it's old but has good info.

Running the scripts from cron is simple. Just enter php /home/<user>/<directory scripts are in>/<scriptname>.php as the command, along with the schedule you want.

Example of a script running every day at midnight
0 0 * * * php /home/noob/wicked/fire.php
Only on weekdays at midnight
0 0 * * 1-5 php /home/noob/wicked/fire.php
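To install these lines, assuming your host gives you shell access (if not, cPanel's Cron Jobs page does the same thing), edit your crontab:
Code:
# open your crontab in an editor, add the line, save, and quit
crontab -e
# list the installed entries to check it took
crontab -l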

One thing to consider when choosing a time is that many people pick midnight to run their scripts, so try to choose an oddball time, say 3:47 a.m. (47 3 * * *), when there will be less server load.

Make it a habit, whenever you clean up or archive the backups on your backup box, to check that they are valid and contain good information. Simply extract one at random and check a few key files; see the quick test below. If there is a problem, contact your host, then manually perform a backup and check it.
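A quick way to sanity-check an archive without fully extracting it (the filename here is just an example):
Code:
# test that the gzip stream itself is intact
gunzip -t backup-domain.com-01-10-2006.tar.gz && echo "gzip OK"
# list the contents and eyeball a few key files
tar -tzf backup-domain.com-01-10-2006.tar.gz | head -20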

How often you should back up depends on a few things: site activity, how often you update it, and how valuable it is to you.

If you trust your host to peform backups, PM me for my paypal address so you can send me some money.
 


Hit the character limit, so here are the scripts.

mysql backup script
Code:
#! /usr/bin/php -q
<?php
// A script produced by BrockHosting.com
// For more help visit http://www.brockhosting.com/bb/viewtopic.php?p=13
$user_name = "username"; // YOUR cpanel user name

$user_pass = "pass"; // YOUR cpanel password
$user_website = "website.com"; // YOUR website, do NOT include http://

$ftp_server_ip = "ip/domain"; // Server IP here
$ftp_server_user = "user"; // FTP login name here
$ftp_server_pass = "password"; // FTP password here


$db_list = array("database"); // List of your databases
// Must be "database1","database2"
// DO NOT ENTER "username_database1", "username_database2"


// Do not edit below this line
$dc = date('m-d-y');
foreach ($db_list as $db_name) {
 // cPanel serves a gzipped dump of each database at this URL
 $get_this = "http://".$user_name.":".$user_pass."@".$user_website.":2082/getsqlbackup/".$db_name.".gz";
 echo "\n<br><br>\nAttempting to download $db_name";
 $handle = fopen($get_this, "rb"); // binary mode; the dump is gzipped
 if ($handle === false) {
  echo "\n<br>Could NOT open $db_name!\n";
  continue;
 }
 $save_as = "backup_".$db_name."_".$dc."_.gz";
 $local_handle = fopen($save_as, "wb"); // "wb" overwrites a stale file instead of dying like "x" would
 echo "\n<br>Saving \"$db_name\" as \"$save_as\" ";
 while (!feof($handle)) { fwrite($local_handle, fread($handle, 8192)); }
 fclose($handle);
 fclose($local_handle);

 // Got the file, now FTP it home
 $ftp_conn_id = ftp_connect($ftp_server_ip) or die("Couldn't connect to $ftp_server_ip");
 ftp_login($ftp_conn_id, $ftp_server_user, $ftp_server_pass);
 $upload = ftp_put($ftp_conn_id, $save_as, $save_as, FTP_BINARY);
 if ($upload) {
  echo "\n<br>File sent!\n";
  unlink($save_as); // remove the local copy once it's safely uploaded
 } else {
  echo "\n<br>File NOT sent!\n";
 }
 ftp_close($ftp_conn_id);
}
?>
full backup
Code:
#! /usr/bin/php -q
<?php

// PHP script to allow periodic cPanel backups automatically.
// Based on script posted by max.hedroom in cpanel.net forums
//   This script contains passwords.  KEEP ACCESS TO THIS FILE SECURE!

// ********* THE FOLLOWING ITEMS NEED TO BE CONFIGURED *********

// Info required for cPanel access
$cpuser = "user"; // Username used to login to CPanel
$cppass = "pass"; // Password used to login to CPanel
$domain = "domain.com"; // Domain name where CPanel is run
$skin = "x"; // Set to cPanel skin you use (script won't work if it doesn't match)

// Info required for FTP host
$ftpuser = "user"; // Username for FTP account
$ftppass = "pass"; // Password for FTP account
$ftphost = "domain.com"; // Full hostname or IP address for FTP host
$ftpmode = "passiveftp"; // FTP mode ("ftp" for active, "passiveftp" for passive)

// Notification information
$notifyemail = "email@domain.com"; // Email address to send results

// Secure or non-secure mode
$secure = 0; // Set to 1 for SSL (requires SSL support), otherwise will use standard HTTP

// Set to 1 to have web page result appear in your cron log
$debug = 0;

// *********** NO CONFIGURATION ITEMS BELOW THIS LINE *********

if ($secure) {
   $url = "ssl://".$domain;
   $port = 2083;
} else {
   $url = $domain;
   $port = 2082;
}

$socket = fsockopen($url,$port);
if (!$socket) { echo "Failed to open socket connection... Bailing out!\n"; exit; }

// Encode authentication string
$authstr = $cpuser.":".$cppass;
$pass = base64_encode($authstr);

// Note: URL-encode the space in "Generate Backup" so the HTTP request line stays valid
$params = "dest=$ftpmode&email=$notifyemail&server=$ftphost&user=$ftpuser&pass=$ftppass&submit=Generate%20Backup";

// Make POST to cPanel
fputs($socket,"POST /frontend/x/backup/dofullbackup.html?".$params." HTTP/1.0\r\n");
fputs($socket,"Host: $domain\r\n");
fputs($socket,"Authorization: Basic $pass\r\n");
fputs($socket,"Connection: Close\r\n");
fputs($socket,"\r\n");

// Grab response even if we don't do anything with it.
while (!feof($socket)) {
  $response = fgets($socket,4096);
  if ($debug) echo $response;
}

fclose($socket);

?>
bash script
Code:
#!/bin/bash

# Nightly backup system for home directories and MySQL 
# databases for websites managed by CPanel.
# Author: Kieran O'Shea
# Author's Website: www.kieranoshea.com
# Support: kieran@kieranoshea.com
# License: GPLv2

# USAGE: Place this file inside the directory you want to
# put backups in, set it as a CRON job to run and collect
# backups from the site(s) you define below at a preset 
# time each day, or run it manually every time you want to
# take a backup. You can place an unlimited number of sites
# and databases to be backed up in the variables below so
# long as you follow the convention. It doesn't even matter
# if your different domains are hosted in different locations
# so long as they are all managed by CPanel.

# Prerequisites:
# - Linux box with bash shell
# - wget installed

# Define all the sites you wish to backup in the array below
# Domain name and tld only, eg. kieranoshea.com

domain=( domain1.com domain2.com )

# Define all the usernames for the above sites in the array 
# below - keep order consistent!

user=( user1 user2 )

# Define all the passwords for the above usernames in the 
# array below - keep order consistent! If you log on as 
# reseller or root you can enter the same password in 
# every array position, but nonetheless they must all be filled.

pass=( pass1 pass2 )

# Number of databases in each domain above - keep order 
# consistent. If you don't wish to back up any databases
# for a particular domain, or don't have any for it, place
# a 0 in its position in the array.

no_of_db=( 3 1 )

# Database names, exactly as they are in CPanel, in the order
# that the domains appear. You may like to log into CPanel
# manually and check your database names before filling this 
# variable in. Number of db names here must match the above 
# variables also! Eg. all dbs in domain 1 followed by all 
# dbs in domain 2 etc. You DO NOT need to enter any database 
# names below for a domain if you have placed a zero for it
# in the array above

databases=( domain1_database1 domain1_database2 domain1_database3 domain2_database1 )

# That's all folks: save the file, chmod it to be executable 
# by the user who will run it, and you are all set.


# Do not edit anything below this point!

echo
echo "-----------------------------------------------------------------"
echo "Kieran's nightly backup system started"
echo "-----------------------------------------------------------------"
echo ""
echo "-----------------------------------------------------------------"
echo "Creating a directory in your backup path for tonights backup..."
echo "-----------------------------------------------------------------"

FOLDER="$(date +%d-%m-%Y)"
mkdir "$FOLDER"
cd "$FOLDER"
echo "-------"
echo "Done!"
echo "-------"

echo ""
echo "-----------------------------------------------------------------"
echo "Creating a directory for every domain to be backed up into..."
echo "-----------------------------------------------------------------"
count=${#domain[*]}
i=0
while [ "$i" -lt "$count" ]
do
    mkdir ${domain[$i]}
    let "i = $i + 1"
done
echo "-------"
echo "Done!"
echo "-------"

echo ""
echo "-----------------------------------------------------------------"
echo "Backing up MySQL databases..."
echo "-----------------------------------------------------------------"
overall_db_count=0
count=${#domain[*]}
i=0
while [ "$i" -lt "$count" ]
do
    cd ${domain[$i]}
    mkdir databases
    cd databases
    start_point=$overall_db_count # start point for iterating over the array of db names
    let "end_point = $overall_db_count + ${no_of_db[$i]}" # end point to stop iterating (always the overall count!)

    n=$start_point
    while [ "$n" -lt "$end_point" ]
    do
        wget http://${user[$i]}:${pass[$i]}@www.${domain[$i]}:2082/getsqlbackup/${databases[$n]}.gz
        let "n = $n + 1"
    done
    
    let "overall_db_count = $overall_db_count + ${no_of_db[$i]}" # increase the overall count to keep a tally
    cd ..
    cd ..
    let "i = $i + 1"
done
echo "-------"
echo "Done!"
echo "-------"

echo ""
echo "-----------------------------------------------------------------"
echo "Backing up home directories; this make take some time..."
echo "-----------------------------------------------------------------"
count=${#domain[*]}
i=0
while [ "$i" -lt "$count" ]
do 
    cd ${domain[$i]}
    wget http://${user[$i]}:${pass[$i]}@www.${domain[$i]}:2082/getbackup/backup-${domain[$i]}-$FOLDER.tar.gz
    cd ..
    let "i = $i + 1"
done

echo "-------"
echo "Done!"
echo "-------"

echo ""
echo "-----------------------------------------------------------------"
echo "All backup tasks completed. Goodnight!"
echo "-----------------------------------------------------------------"
 
So I go from downloading backups weekly, to logging into FTP to delete old backups to download them weekly from FTP?

It isn't THAT hard to log in weekly to pull a backup of your sites that update on their own (no local copy).

I do like this for anyone extremely lazy. The only problem: if you never go in and actually download a backup copy every now and then, and your host has a failed RAID 0 (yeah, some hosts are IDIOTS!), you're still screwed.
 

It's really a trade-off. You either use more disk space and have current backups, or you use no disk space, forget to back up, make some major changes, and BOOM, the host loses your data. If you have the disk space, you can easily forget and let cleanup slip from every week to every month, but you still have daily backups.

I also don't doubt that the scripts could be modified to automatically remove the old files upon a successful FTP upload, so if anyone cares to try, by all means feel free; a rough sketch is below. This isn't a GREAT FANTASTIC GIMME 99.95 way to back up. It's free, relatively simple, and most importantly it works.
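For the full backups, a host-side cron job along these lines would keep the home directory from filling up. The path, filename pattern, and 7-day window are all assumptions; test with -print in place of -delete first.
Code:
# prune cPanel full-backup tarballs older than 7 days from the home dir
find /home/noob -maxdepth 1 -name 'backup-*.tar.gz' -mtime +7 -delete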
 