File Station takes the user's PC completely out of the equation, and I have found it quite effective. I have used it mostly for USB transfers to/from the NAS, but it also works well between a WD NAS and a Synology NAS.
I believe this has its own flaws. Within the WD ecosystem, I have done this using SSH to execute the necessary commands directly from Linux.
Yes, using WD's local or remote backup (or Safepoint for OS3 users) has its own flaws. One major one is the inability to use a single-bay second-gen OS5 firmware My Cloud as a remote backup target: WD set it up so the multi-bay models can be backup targets for OS5 remote backups, but not the single-bay. Stupid. The other flaw is that OS3 Safepoint sometimes just fails with no indication of why, leaving the user unable to track down and fix the cause. Frustrating.
The flaky Safepoint was the main reason I moved (at the time) to using an rsync-with-email-notification script to back up a first-gen single-bay My Cloud to a USB hard drive. I even made a post about using rsync to back up that first-gen single-bay My Cloud to a Synology NAS. One can also combine rsync with the nohup command so it keeps running after shutting down or disconnecting the computer from the My Cloud.
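As a rough illustration, a minimal rsync-plus-nohup run might look like the sketch below. The share and USB mount paths are hypothetical placeholders; adjust them to your own setup.

```shell
#!/bin/sh
# Hypothetical paths -- change SRC and DEST to match your My Cloud.
SRC=/shares/Public/           # share to back up (trailing slash copies contents)
DEST=/mnt/USB/usb1-1share1/   # USB drive as mounted on the My Cloud
# -a preserves times/permissions, -v lists files as they copy,
# --delete mirrors deletions on the destination.
# nohup plus the trailing & lets the copy survive logging out.
nohup rsync -av --delete "$SRC" "$DEST" >rsync.log 2>&1 &
```

Check rsync.log afterwards to confirm it finished without errors; the email-notification part would be a small wrapper around this that mails the log when rsync exits.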
Problem is, using SSH/rsync typically requires some extra computer knowledge and skill to get working properly. Not everyone wants to go geek like that; many want something much simpler and more familiar, like Windows File Explorer or Mac Finder, hence File Station in the Synology DSM NAS operating system.
Having had the same challenge, I found this thread and tried the rsync approach. However, transferring 1.4 TB of data with rsync ran for days and was still far from finishing. I think that's because I use my WD drive for Time Machine backups, which consist of thousands of smaller files.
I tried other approaches, and eventually came back here to report what worked fastest and best for me: using tar and scp. This is how you do it:
Activate ssh on both devices
Log in to the UI, and under Settings > Network, switch on SSH.
ssh into the old device
ssh sshd@device-name
cd to the data folder, for example
cd /shares/TimeMachineBackup
Create a tar archive of the files and directories that you want to transfer. Use nohup so you can log out and let tar keep running.
nohup tar -cvf archive.tar file1 file2 directory1 >tar.log 2>&1 &
Note that I don’t use compression, because most files do not compress well, and tar turned out to be significantly faster without compression.
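For comparison, the gzip-compressed variant (which was slower in my case) is the same command with a z flag and a .gz extension; file1, file2, and directory1 are placeholders as above:

```shell
# Same archive step, but piped through gzip (-z); typically slower on a
# weak NAS CPU, though it produces a smaller archive.
nohup tar -czvf archive.tar.gz file1 file2 directory1 >tar.log 2>&1 &
```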
Wait till tar has finished. This can take hours. You can check if it is still running using
ps aux | grep tar
ssh into the new device
Copy the archive over to the new device, again using nohup to keep it running.
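One way this copy step might look, assuming the old device is reachable as old-device-name and both devices use the same share layout (all names and paths here are placeholders to adjust):

```shell
# Pull the archive from the old device onto the new one.
# scp will prompt for the sshd password; if the backgrounded job cannot
# prompt, run it in the foreground first (drop the trailing &) or set up
# key-based authentication beforehand.
nohup scp sshd@old-device-name:/shares/TimeMachineBackup/archive.tar /shares/TimeMachineBackup/ >scp.log 2>&1 &

# Once the copy has finished, unpack the archive on the new device:
nohup tar -xvf /shares/TimeMachineBackup/archive.tar -C /shares/TimeMachineBackup >untar.log 2>&1 &
```

As with the archiving step, check scp.log and untar.log afterwards to confirm both stages completed cleanly.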
It would be quicker to have tar write its output directly to the new device. That would save the step of transferring the tar file after the tar command finished.
Nice idea. Didn’t think of that option, tbh. Actually, didn’t know that tar was capable of doing so. You might want to post instructions how that could be done.
tar generally can't write straight to a host:path archive (GNU tar would try the legacy rsh/rmt route, which the My Cloud doesn't have), so pipe the archive through ssh instead:
nohup sh -c 'tar -cvf - file1 file2 directory1 | ssh sshd@new-device-name "cat > /shares/TimeMachineBackup/archive.tar"' >tar.log 2>&1 &
(ssh will prompt for a password, so run it in the foreground first, i.e. without the trailing &, or set up key authentication.)
Or you could mount the new device on the old device:
mkdir /transfer
mount -t cifs -o username=<your NAS user> //192.168.1.209/TimeMachineBackup /transfer
nohup tar -cvf /transfer/archive.tar file1 file2 directory1 >tar.log &
Change the IP address to your new device's IP address, and the share name and credentials to match your setup; mount will prompt for the password.