Naturkundemuseum-bw

Backup

On the webserver, the directory /is/htdocs/wp1090471_HRVI45VNYG/backups/ contains several scripts that write backups and copy them via SFTP to the StorageBox at HostEurope. Backups from the last 7 days are retained.

The main script is file_backup.sh; it calls all the sub-scripts.

Creating the Typo3 backup:

#!/bin/bash

# Set the path to the Typo3 installation and the backup directory
TYPO3_DIR="$HOME/live"
BACKUP_DIR="$HOME/backups/live"

# Create a timestamp
DATE=$(date +%F)

# Perform the file backup
tar -czvf "$BACKUP_DIR/file_backup_$DATE.tar.gz" -C "$TYPO3_DIR" .
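
To verify that such an archive restores cleanly, it can be unpacked into an empty directory with the inverse tar call. A self-contained roundtrip sketch using throwaway demo paths (not the live server paths):

```shell
#!/bin/bash
set -eu

# Roundtrip check for the file backup: archive, then restore into an
# empty directory. DEMO paths -- on the server these are "$HOME/live",
# "$HOME/backups/live" and a restore target of your choice.
TYPO3_DIR="demo_live"
BACKUP_DIR="demo_backups"
RESTORE_DIR="demo_restore"
DATE=$(date +%F)

mkdir -p "$TYPO3_DIR" "$BACKUP_DIR" "$RESTORE_DIR"
echo "hello" > "$TYPO3_DIR/index.txt"

# Same tar invocation as file_backup.sh (minus -v)
tar -czf "$BACKUP_DIR/file_backup_$DATE.tar.gz" -C "$TYPO3_DIR" .

# Restore: unpack into the target directory
tar -xzf "$BACKUP_DIR/file_backup_$DATE.tar.gz" -C "$RESTORE_DIR"
```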

Creating the database backup:

#!/bin/bash

CONFIG_FILE="$HOME/live/typo3conf-local/LocalConfiguration.php"
BACKUP_DIR="$HOME/backups/live"

DB_NAME=$(grep "'dbname' =>" "$CONFIG_FILE" | cut -d "'" -f 4)
DB_USER=$(grep "'user' =>" "$CONFIG_FILE" | cut -d "'" -f 4)
DB_PASSWORD=$(grep "'password' =>" "$CONFIG_FILE" | cut -d "'" -f 4)
DB_HOST=$(grep "'host' =>" "$CONFIG_FILE" | cut -d "'" -f 4)

DATE=$(date +%F)

#mysqldump -h $DB_HOST -u $DB_USER -p$DB_PASSWORD $DB_NAME > $BACKUP_DIR/db_backup_$DATE.sql
mysqldump --quick --quote-names --allow-keywords --force --no-data -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" > "$BACKUP_DIR/db_backup_structure_$DATE.sql"
mysqldump --quick --quote-names --allow-keywords --force --extended-insert --no-tablespaces --no-create-db --no-create-info --skip-add-drop-table -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" > "$BACKUP_DIR/db_backup_data_$DATE.sql"

# Generate checksum for structure dump
sha256sum $BACKUP_DIR/db_backup_structure_$DATE.sql > $BACKUP_DIR/db_backup_structure_$DATE.sql.sha256

# Generate checksum for data dump
sha256sum $BACKUP_DIR/db_backup_data_$DATE.sql > $BACKUP_DIR/db_backup_data_$DATE.sql.sha256

Uploading the backups to the StorageBox and deleting backups older than 7 days:

#!/bin/bash

# Path to your backup directory on the webserver
BACKUP_DIR="$HOME/backups/live"
SSH_KEY="$HOME/.ssh/id_rsa_backup"  # tilde does not expand inside quotes, so use $HOME

# Date formatting for removing old backups
DATE=$(date +%F)

# Cleanup local backups older than 7 days
find "$BACKUP_DIR" -name '*.sql' -mtime +7 -exec rm {} \;
find "$BACKUP_DIR" -name '*.sql.sha256' -mtime +7 -exec rm {} \;
find "$BACKUP_DIR" -name '*.tar.gz' -mtime +7 -exec rm {} \;

# Start SFTP session and upload files
sftp -i "$SSH_KEY" ftp13270622-backup@remote_storage_box << EOF
  put "$BACKUP_DIR/db_backup_structure_$DATE.sql"
  put "$BACKUP_DIR/db_backup_data_$DATE.sql"
  put "$BACKUP_DIR/file_backup_$DATE.tar.gz"
  put "$BACKUP_DIR/db_backup_structure_$DATE.sql.sha256"
  put "$BACKUP_DIR/db_backup_data_$DATE.sql.sha256"
EOF

# Function to delete old backups on remote server via SFTP
delete_remote_old_backups() {
  # Calculate the cutoff date (7 days ago)
  CUTOFF_DATE=$(date -d '7 days ago' '+%F')

  # Get the list of backup files from the remote server, one name per line
  # (plain "ls" prints several names per line; "ls -1" keeps parsing simple)
  REMOTE_FILES=$(echo "ls -1" | sftp -q -i "$SSH_KEY" ftp13270622-backup@remote_storage_box 2>/dev/null | grep -v '^sftp>')

  # Initialize an empty string to hold rm commands
  SFTP_COMMANDS=""

  # Iterate over each file and determine if it's older than 7 days
  while read -r FILE; do
    # Extract the date from the filename using regex
    if [[ "$FILE" =~ db_backup_structure_([0-9]{4}-[0-9]{2}-[0-9]{2})\.sql ]]; then
      FILE_DATE="${BASH_REMATCH[1]}"
    elif [[ "$FILE" =~ db_backup_data_([0-9]{4}-[0-9]{2}-[0-9]{2})\.sql ]]; then
      FILE_DATE="${BASH_REMATCH[1]}"
    elif [[ "$FILE" =~ file_backup_([0-9]{4}-[0-9]{2}-[0-9]{2})\.tar\.gz ]]; then
      FILE_DATE="${BASH_REMATCH[1]}"
    else
      # Skip files that don't match the backup patterns
      continue
    fi

    # Compare the file date with the cutoff date
    if [[ "$FILE_DATE" < "$CUTOFF_DATE" ]]; then
      # Append the rm command for this file
      # The uploads and the "ls" both use the session's working directory,
      # so the rm must target the same directory
      SFTP_COMMANDS+="rm $FILE\n"
    fi
  done <<< "$REMOTE_FILES"

  # If there are files to delete, execute the rm commands via SFTP
  if [[ -n "$SFTP_COMMANDS" ]]; then
    echo -e "$SFTP_COMMANDS" | sftp -i "$SSH_KEY" ftp13270622-backup@remote_storage_box
  fi
}

# Call the function to delete old backups
delete_remote_old_backups
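
The local cleanup relies on find -mtime +7, which matches files whose modification time lies more than 7 full days in the past. A quick, self-contained way to sanity-check that predicate (directory and file names here are examples only):

```shell
#!/bin/bash
set -eu

# Sanity check for "-mtime +7": only files modified more than 7 full
# days ago should match. Throwaway directory; filenames are examples.
DIR="demo_retention"
mkdir -p "$DIR"

touch "$DIR/db_backup_data_new.sql"                  # modified now
touch -d "10 days ago" "$DIR/db_backup_data_old.sql" # modified 10 days ago

# Same predicate as the cleanup in sftp_backup.sh
find "$DIR" -name '*.sql' -mtime +7 -print
```

Only the stale file should be printed; the fresh one stays untouched.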

The main script calls these three scripts:

#!/bin/bash

# Main backup script

# Generate the date string
DATE=$(date +%F)

# Call DB backup
~/backups/db_backup.sh $DATE

# Call File backup
~/backups/file_backup.sh $DATE

# Call SFTP and Cleanup
~/backups/sftp_backup.sh $DATE

# Weekly check on Sunday (when `date +%u` returns 7)
if [ "$(date +%u)" = "7" ]; then
    ~/backups/weekly_checksum_check.sh
fi

The backup and the weekly checksum check are triggered by cron jobs, which are managed via the web interface of the HostEurope KIS.
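
For reference, the equivalent entries in classic crontab syntax would look roughly like this; the actual schedule lives in KIS, and the times below are illustrative assumptions only:

```
# Illustrative only -- the real schedule is managed in the HostEurope KIS UI,
# and the times here are assumptions, not the configured values.
0 3 * * * $HOME/backups/file_backup.sh               # daily backup (main script per the text above)
30 4 * * 0 $HOME/backups/weekly_checksum_check.sh    # Sunday checksum check
```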