I need to reset my WD MyCloud and want all my data off it first. The two shares I need to copy over total about 500GB between them, so I want to zip them up. Please bear in mind I know much less about what I'm doing than you might give me credit for, so any explanations written so my mother could understand them would be appreciated. Here is what I've been doing:
SSH in and then go to the first share as follows:
cd shares/John/
In there is the folder I want to zip, titled "finder zip". This folder has a number of subfolders, with more folders within those. Once in there I then do the following:
tar -zcvf finderzip.tar.gz /shares/John/finder\ zip/
I did this on a different folder and it started giving me feedback on its progress and what it was doing. When I do it on this folder, though, I get "Removing leading `/' from member names" and then it appears to do nothing. When I look in Finder it does create an archive file, but it seems to stop at about 80MB max.
Where am I going wrong? Please remember how easily confused I get with this stuff, so layman's terms greatly appreciated!
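For what it's worth, the "Removing leading `/'" message is harmless: tar is just storing relative paths instead of absolute ones. A hedged sketch of doing the same job with the `-C` switch, which avoids absolute paths (and the warning) entirely; the `/tmp/demo` paths here are throwaway stand-ins for the real share:

```shell
# Build a throwaway folder that mimics "finder zip" with a subfolder.
mkdir -p "/tmp/demo/finder zip/sub"
echo "hello" > "/tmp/demo/finder zip/sub/file.txt"

# -C tells tar to change into the parent folder first, so it stores
# relative paths like "finder zip/sub/file.txt" and never needs to
# strip a leading "/".
tar -zcvf /tmp/demo/finderzip.tar.gz -C /tmp/demo "finder zip"

# List the archive's contents to check everything really went in:
tar -ztvf /tmp/demo/finderzip.tar.gz
```

Quoting "finder zip" (or escaping the space as `finder\ zip`) matters, otherwise tar thinks you gave it two folder names.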
Thanks! Seems to be working. Out of interest, if I shut my MacBook Pro, does this continue to run in the background? Have I set the command off on the MyCloud drive itself? Or is there a way I need to exit Terminal without messing things up, and is there a way to log back in and see progress?
Hang on, when I copied and pasted into Notes it added the '&'. I've removed it now and got the following: nohup: ignoring input and appending output to `nohup.out'
The ampersand forces the process to run in the background.
And yes, the output you see is correct. The nohup command ensures the process won't get terminated by the hangup signal when you disconnect. Read nohup.out to see the process's output.
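To make the "log out and check later" workflow concrete, a small sketch, using a throwaway /tmp folder and a tiny file in place of the real 500GB job:

```shell
# Throwaway stand-in for the real share.
mkdir -p /tmp/nohup-demo && cd /tmp/nohup-demo
echo "some data" > file.txt

# nohup keeps the job alive after you log out; & puts it in the background.
# (Redirecting to nohup.out explicitly here; nohup does this for you
# automatically when you run it from a terminal.)
nohup tar -zcvf archive.tar.gz file.txt > nohup.out 2>&1 &
wait    # only so this demo finishes; in real use you'd just log out

# To check progress later, SSH back in and look at the log and the file size:
tail nohup.out
ls -lh archive.tar.gz
```

Because the command runs on the MyCloud itself, closing the MacBook's lid doesn't stop it; the growing size of the .tar.gz file is the easiest progress indicator.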
My main question would be: why zip everything? You'll have to copy the archive off the device at some point before you clear all the data. Why not just copy the files? Or do you expect zipping to give you significant compression?
Hint: if it’s mostly compressed media, zip won’t help.
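A quick way to test the "already compressed" point for yourself: gzip a copy of a file and compare sizes. The folder and file names below are throwaway examples:

```shell
mkdir -p /tmp/ratio-demo && cd /tmp/ratio-demo

# Repetitive text shrinks dramatically...
yes "the same line over and over" | head -n 10000 > text.txt
gzip -c text.txt > text.txt.gz

# ...while random bytes (a stand-in for already-compressed photos/video)
# barely shrink at all, or even grow slightly.
head -c 100000 /dev/urandom > media.bin
gzip -c media.bin > media.bin.gz

ls -l text.txt text.txt.gz media.bin media.bin.gz
```

If your shares are mostly photos, music, and video, expect the second result: the tarball will be roughly the same size as the originals.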
I think that depends on the OP's contents and strategy; copying 500GB of smaller files is a lot slower than copying two big 250GB chunks. E.g. I do a daily rootfs backup, and my fastest strategy is dumping the whole 2GB partition before sending it off elsewhere, instead of copying each of the files.
Finder just seems to struggle copying it over uncompressed; it takes ages and loses the connection, etc. I thought zipping large folders was best practice for a speedier transfer?
Depending on your needs, you can just tar and skip the unnecessary gzip compression attempt. Just remove the 'z' switch and let the filename be *.tar. I should add that I don't know your file contents. If you have symbolic links and want to tar the files they point to instead of the links themselves, add the 'h' switch. For more details, see tar --help.
cd /shares/folder
tar cvaf ../buckup.tar.lzma *
-c = Create
-v = Verbose (to see what is happening)
-a = Auto-detect the compression method from the archive extension
-f = Write the archive to the given file
.lzma = Compression method like 7zip. Also available: .gz, .xz and .bz2
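A tiny demonstration of the auto-detect switch, with throwaway folder and file names (sticking to .gz here since lzma support depends on what's installed on the box):

```shell
mkdir -p /tmp/auto-demo && cd /tmp/auto-demo
echo "contents" > note.txt

# The 'a' switch picks the compressor purely from the name you give -f:
tar cvaf out.tar.gz note.txt   # .gz suffix  -> gzip-compressed
tar cvaf out.tar    note.txt   # .tar suffix -> plain tar, no compression

# 'file' reports what each archive actually is:
file out.tar.gz out.tar
```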
It's taken a while, but I got round to doing this and it seems to have worked. I did: nohup tar -zcvf finderzip.tar.gz -C ./finder\ zip/ .
I'm now left with a .out file, though. I've no idea what this is or how to open/convert it. I want to get it onto my Mac, so do I copy it over and then do something to open it up?
Edit - I think I'm getting confused. I think it was the above - tar cvaf ../buckup.tar.lzma * - that I ran with a nohup prefix that left me with the .out file.
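To demystify that file: nohup.out is nothing exotic, just a plain text log of everything the command printed (the file names tar listed as it worked). A sketch with a made-up log line standing in for a real run's output:

```shell
mkdir -p /tmp/log-demo && cd /tmp/log-demo

# Stand-in for the log a real nohup run appends to nohup.out:
echo "finder zip/sub/holiday-photo.jpg" >> nohup.out

file nohup.out    # reports plain text - any text editor can open it
tail nohup.out    # the last few file names tar processed
# Once the archive is finished, it's safe to simply delete: rm nohup.out
```

There's nothing to convert and no need to copy it to the Mac; it was only ever a progress log, not part of your data.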