Limit number of rar archives on Ubuntu Server 14.04.3/Samba

Discussion in 'Programming/Scripts' started by danhansen@denmark, Oct 26, 2015.

  1. danhansen@denmark

    danhansen@denmark Member HowtoForge Supporter

    Hello friends ;)


    Ubuntu Server 14.04.3 LTS (GNU/Linux 3.13.0-32-generic x86_64)
    Samba version 4.1.6-Ubuntu
    Rar version RAR 4.20


    I'm using rar to add archives to a backup disk (/backupdisk1) and this works perfectly. But I need to be able to limit the number of archives.
    Rar is set to create self-extracting archives in 4 GB volumes, with a timestamp at the end of the filenames. In this case it runs only once a day. The format is:

    Sample of backup sets:
    filename-251015-215123.sfx
    filename-251015-215123.rar1
    filename-251015-215123.rar2
    filename-251015-215123.rar3

    filename-261015-215035.sfx
    filename-261015-215035.rar1
    filename-261015-215035.rar2
    filename-261015-215035.rar3

    Can you help me with a command to set a limit on these archives? I can build the shell script myself, but I need help with the command/ideas to do it.

    Maybe a "grep" command? Or "awk"? I would love to be able to set a maximum number of "archive sets", e.g. 30 versions, after which the oldest would get trashed ;)

    I've got a script which warns me when the disks are getting almost full, but I would like a bit more control. Can you help me?
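
    Something like this is roughly what I have in mind (just an untested sketch: the directory, the "filename-" prefix and the "keep" count are placeholders, and it assumes the archive names contain no spaces):
    Code:
    #!/usr/bin/env bash
    # Keep only the newest $keep archive sets on the backup disk; remove older sets.
    # NOTE: the path and the name pattern are examples, adjust to the real backup layout.
    keep=30
    dir=/media/backupdisk1

    # Use the .sfx file as the representative of each set, newest first by mtime
    # (the DDMMYY timestamp in the name does not sort chronologically, mtime does).
    ls -1t "$dir"/filename-*.sfx 2>/dev/null | tail -n +"$((keep + 1))" | while read -r sfx; do
        prefix="${sfx%.sfx}"    # e.g. /media/backupdisk1/filename-251015-215123
        rm -f "$prefix".*       # removes the .sfx and all its .rar volumes
    done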

    Kind Regards,
    DanHansen@Denmark
     
  2. mariaczi

    mariaczi New Member

    Hi.
    After creating the archive, use the find command with the right switches (see man find: find files older than X days).
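
    For example (an untested one-liner; adjust the path, the name pattern and the number of days to your retention):
    Code:
    find /media/backupdisk1/ -type f -name 'filename-*' -mtime +30 -delete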
     
  3. danhansen@denmark

    danhansen@denmark Member HowtoForge Supporter

    Hi Mariaczi,

    Thanks for your post. It's been a while since I asked this question, but I still have issues with the script where rar is used... I solved the find-and-remove part, which I'll show further down, but I still have a problem. When RAR'ing to a mapped drive (in this case two drives, actually) the files don't appear. When running the script manually there's no problem, the files appear and everything is hunky-dory. But when cron runs the script the files don't appear... Maybe you have a suggestion for this problem?!
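
    To narrow it down I could log the environment the cron job actually sees and compare it with my interactive shell, something like this (the log path is a placeholder):
    Code:
    # temporary crontab entry for debugging: record who the job runs as,
    # what is mounted and which PATH cron provides
    * * * * * { date; id; echo "PATH=$PATH"; mount; } >> /home/username/logs/cron_env_debug.log 2>&1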

    Here's the init log showing that the script runs once daily:
    CRON RAR 05.00 - Lancaster Daily Rar Initiated: Thu Jan 7 05:00:01 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Successful: Thu Jan 7 05:03:08 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Initiated: Fri Jan 8 05:00:01 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Successful: Fri Jan 8 05:03:00 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Initiated: Sat Jan 9 05:00:01 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Successful: Sat Jan 9 05:03:08 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Initiated: Sun Jan 10 05:00:01 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Successful: Sun Jan 10 05:03:10 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Initiated: Mon Jan 11 05:00:01 CET 2016
    CRON RAR 05.00 - Lancaster Daily Rar Successful: Mon Jan 11 05:03:11 CET 2016

    Here's the script that locates, sorts and removes the right files. It works perfectly, but there are no files to be found because of the problem described above ;) :
    #!/bin/bash
    # CRON WILL RUN THIS SCRIPT DAILY AT 07.00 FIND&REMOVE OLD RAR FROM 2 BACKUP DISKS
    echo "CRON RAR 07.00 - Lancaster Daily Find&Remove Initiated: $(date)" >> /home/username/logs/testdustbuster_dailyinit.log
    # Log the matching files (largest first) before removing them
    /usr/bin/find /media/backupdisk1/ -type f -mmin +1 -name 'rar_daily*' \( -name '*.sfx' -o -name '*.rar' \) -exec ls -s {} \; | sort -n -r >> /home/username/logs/testdustbuster_daily.log
    # Remove them; -print0 | xargs -0 passes only the filenames (ls -s | xargs rm would also feed the size column to rm)
    /usr/bin/find /media/backupdisk1/ -type f -mmin +1 -name 'rar_daily*' \( -name '*.sfx' -o -name '*.rar' \) -print0 | xargs -0 rm -f
    echo "CRON RAR 07.00 - Lancaster Daily Find&Remove Successful: $(date)" >> /home/username/logs/testdustbuster_dailyinit.log

    Looking forward to hearing your suggestions ;)

    Kind Regards,
    Dan
     
  4. mariaczi

    mariaczi New Member

    1st - check the permissions on the mount point (the folder where you mount/map the shared resource) for the account your script runs as.
    2nd - in a script run from cron you must use absolute paths to the binaries, because cron doesn't know your PATH :)
    By the way, I can't find the command that actually creates the archive anywhere in the source you showed ;)
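
    For example, a minimal crontab sketch (the script and log paths are placeholders):
    Code:
    # user crontab (crontab -e): give cron a PATH, or call every binary by its absolute path
    PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
    0 5 * * * /home/username/scripts/daily_rar.sh >> /home/username/logs/daily_rar_cron.log 2>&1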
     
  5. sjau

    sjau Local Meanie Moderator

    Why not simply use:
    Code:
    #!/usr/bin/env bash
    days=10 # Remove backups older than 10 days
    find "/path/to/backupdir/" -type f -mtime +"${days}" -exec rm "{}" \;
    
    Didn't test it, but that looks a lot saner. I assume all the backup files are in the same folder.
    Otherwise, if you create a folder for each backup first, you could run:
    Code:
    #!/usr/bin/env bash
    days=10 # Remove backups older than 10 days
    find "/path/to/backupdir/" -maxdepth 1 -type d -mtime +"${days}" -exec rm -Rf "{}" \;
    
    Also: xargs is evil according to #bash on freenode (plain xargs mangles filenames with spaces or newlines; if you do use it, pair find -print0 with xargs -0).
     
