Hi all,
I've never been a bash script expert (to say the least!) but I need a clever
script (I assume that's the easiest way, but please say if not) that can turn
120 files (amounting to 2 TB of data) into 40 files of <=50 GB each for an
archiving project I'm about to embark on. Can anyone help / give me some
pointers?
I can guarantee none of the files are over 50 GB in size.
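If it helps, that guarantee is easy to double-check up front with GNU find,
which understands the G size suffix (the path below is a placeholder for
wherever the files live):

  find /path/to/files -type f -size +50G

This should print nothing if the guarantee holds.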
So it's a question of something like:
Create an empty archive (call it n.tar.gz)
Repeat
    Find next (uncompressed) file
    Compress it to tmp.tar.gz
    If size of tmp.tar.gz < (50 GB - current size of n.tar.gz) then
        Append tmp.tar.gz to n.tar.gz
    Else
        Close n.tar.gz
        Alert me so I can write n.tar.gz to MD
        Create next empty archive ((n+1).tar.gz)
        Add tmp.tar.gz to (n+1).tar.gz
    End If
Until all files have been compressed
The end product should be 40 files called n.tar.gz (where 1 <= n <= 40).
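To show the sort of thing I'm imagining, here's a rough (untested!) bash
sketch of the loop above, cobbled together from man pages, so quite possibly
wrong. It assumes GNU tar and stat, and a file called filelist holding one
input filename per line (e.g. from 'find /data -maxdepth 1 -type f >
filelist'; those names are mine, not gospel). Appending works because gzip
allows several compressed streams in one file, but extracting the result
then needs GNU tar's --ignore-zeros (-i) option:

  #!/bin/bash
  # Rough sketch: pack files into n.tar.gz archives of at most 50 GB each.
  # Assumes GNU tar/stat and a "filelist" of one filename per line.
  limit=$((50 * 1024 * 1024 * 1024))   # 50 GB budget per archive, in bytes
  n=1
  cur=0                                # bytes already in the current archive
  > "$n.tar.gz"                        # create the first, empty archive

  while IFS= read -r f; do
      tar -czf tmp.tar.gz -- "$f"      # compress one file on its own
      size=$(stat -c %s tmp.tar.gz)    # its compressed size in bytes
      if (( cur + size > limit )); then
          echo "$n.tar.gz is full ($cur bytes) - write it out now" >&2
          n=$((n + 1))                 # close n.tar.gz, start (n+1).tar.gz
          cur=0
          > "$n.tar.gz"
      fi
      cat tmp.tar.gz >> "$n.tar.gz"    # append the member (multi-stream gzip)
      cur=$((cur + size))
  done < filelist

  rm -f tmp.tar.gz
  echo "done: $n archives" >&2

Unpacking one of these later would then be 'tar -xzf n.tar.gz
--ignore-zeros'. The size test mirrors the pseudocode: a new archive is
started as soon as the next member would push the current one past 50 GB,
and since no single file is anywhere near the limit once compressed, every
member fits somewhere.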
Any (constructive!) ideas very welcome.
Cheers
--
Please post to: Hampshire@???
Web Interface:
https://mailman.lug.org.uk/mailman/listinfo/hampshire
LUG URL:
http://www.hantslug.org.uk
--------------------------------------------------------------