Re: [Hampshire] Tar file max size

Author: James Dutton via Hampshire
Date:  
To: Hampshire LUG Discussion List
CC: James Dutton
Subject: Re: [Hampshire] Tar file max size
On Wed, 26 Mar 2025 at 12:06, rmluglist2--- via Hampshire
<hampshire@???> wrote:
>
> Hi all
> Coming back to a topic I raised a few months ago…
> I’ve been trying to get a write-once-read-many backup onto tape, but it keeps failing. It’s a fair old amount of data to back up (20+ TB), and I think one of the issues is that I’m running:
> tar czvf /dev/nst0 /path/to/data
> Googling tells me tar’s maximum file size is 8 GB, and my files to be backed up (which are already highly compressed) are all around 8 GB, so even two of them will push the tar file over 8 GB. I’m happy to remove the z and drop compression, but is the answer really (it seems awfully clunky) to write a script that passes each file in turn as the second argument to the command above? My tapes would then end up with 500 .tar files if I have 500 files to archive. Doing it as one .tar file, or even one .tar per directory, would produce .tar files of over 200 GB, which is possibly the reason for the errors.
> If anyone’s wondering what errors I’m getting, it’s usually “file too big” (I’m paraphrasing).
> Lastly, I’ve not found any decent free GUI-based software to do this. Grsync didn’t seem to like my tape drive, and I’m not sure why.
>
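
On the 8GB limit specifically: that figure comes from the size field in
the old v7/ustar tar header, which tops out at 8 GiB per archive member.
GNU tar's POSIX (pax) format has no such limit, so you should not need
one .tar per file or a wrapper script. Roughly like this - the device
name and blocking factor are only examples, so check what your drive
actually expects:

  # Confirm the drive is ready and note its current block size settings.
  mt -f /dev/nst0 status

  # One uncompressed archive in POSIX (pax) format, which has no 8 GiB
  # per-member limit; -b sets the tape blocking factor (2048 x 512-byte
  # records = 1 MiB writes - adjust to suit the drive).
  tar --format=posix -b 2048 -cvf /dev/nst0 /path/to/data

  # Rewind and read the archive back to check it lists cleanly.
  mt -f /dev/nst0 rewind
  tar -b 2048 -tvf /dev/nst0

Dropping the z is probably sensible anyway: your files are already
highly compressed, and a single gzip stream across the whole tape means
one bad block can make everything after it unreadable.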


More generally, I am sure this problem has been solved before; are you
reinventing the wheel in some way?
Open-source backup software such as Bacula has already solved these
sorts of problems, tape handling included.
Is there a specific reason you don't want to use tried-and-tested
backup software?
https://www.bacula.org/
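
Bacula keeps a catalogue of which files went to which tape volume,
spans a backup across multiple tapes when needed, and can verify what
it wrote, so the per-file tar question goes away entirely. To give a
flavour: once the daemons are configured, day-to-day use is a few
console commands. Everything below (the storage, pool, volume and job
names) is a placeholder from a hypothetical setup rather than something
you can run as-is:

  # Label a fresh tape and kick off a backup job from the Bacula console.
  bconsole <<'EOF'
  label storage=Tape pool=Default volume=Monthly-0001
  run job=BackupData yes
  messages
  EOF

There is some upfront configuration (Director, Storage Daemon, File
Daemon), but after that the volume handling and verification are taken
care of for you.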

Kind Regards

James

--
Please post to: Hampshire@???
Manage subscription: https://mailman.lug.org.uk/mailman/listinfo/hampshire
LUG website: http://www.hantslug.org.uk
--------------------------------------------------------------