File Station takes the user's PC completely out of the equation, and I have found it quite effective. I have used it mostly for USB transfers to/from the NAS, but it also works well between a WD NAS and a Synology NAS.
I believe this has its own flaws. Within the WD ecosystem, I have done this using SSH to execute the necessary commands directly from Linux.
Yes, using WD's local or remote backup (or Safepoint for OS3 users) has its own flaws. One major one is the inability to use a single-bay second-gen OS5 firmware My Cloud as a remote backup target. WD set it up so the multi-bay models can be backup targets for OS5 remote backups, but not the single-bay. Stupid. The other flaw is that OS3 Safepoint sometimes just fails with no indication why, so a user can't track down and fix the cause. Frustrating.
The flaky Safepoint was the main reason I moved (at the time) to using an rsync script with email notification to back up a first-gen single-bay My Cloud to a USB hard drive. I even made a post about using rsync to back up that first-gen single-bay My Cloud to a Synology NAS. rsync can also be combined with the nohup command so the transfer keeps running after you shut down or disconnect the computer from the My Cloud.
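As a rough sketch of that approach (run over SSH on the My Cloud; the share and USB paths are just examples, not the ones from my actual script):

nohup rsync -av /shares/Public/ /mnt/USB/USB1_c1/backup/ >rsync.log 2>&1 &

The same form works toward a remote target, for example rsync -av /shares/Public/ user@synology-ip:/volume1/backup/ for a Synology NAS (the volume path is an assumption for your setup); nohup plus the trailing & lets it keep running after you disconnect.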
Problem is, using SSH/rsync typically requires some extra computer knowledge and skill to get working properly. Not everyone wants to go geek like that; many want something much simpler and more familiar, like Windows File Explorer or Mac Finder, hence File Station in the Synology DSM NAS operating system.
Having had the same challenge, I found this thread and tried the rsync approach. However, transferring 1.4 TB of data with rsync ran for days and was still far from finishing. I think that's because I use my WD drive for Time Machine backups, which consist of thousands of small files.
I tried other approaches, and eventually came back here to report what worked fastest and best for me: using tar and scp. This is how you do it:
Activate ssh on both devices
Log in to the UI, and under Settings > Network, switch on SSH.
ssh into the old device
ssh sshd@device-name
cd to the data folder, for example
cd /shares/TimeMachineBackup
Create a tar archive of the files and directories that you want to transfer. Use nohup so you can log out and let tar keep running:
nohup tar -cvf archive.tar file1 file2 directory1 >tar.log &
Note that I don’t use compression, because most files do not compress well, and tar turned out to be significantly faster without compression.
Wait till tar has finished. This can take hours. You can check if it is still running using
ps aux | grep tar
ssh into the new device
Copy the archive over, again using nohup to keep it running
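For example, logged into the new device and pulling the archive from the old one (old-device-name and the share path are placeholders, adjust them to your setup):

scp sshd@old-device-name:/shares/TimeMachineBackup/archive.tar /shares/TimeMachineBackup/
tar -xvf /shares/TimeMachineBackup/archive.tar -C /shares/TimeMachineBackup

scp will ask for the old device's SSH password, so either run it in the foreground or set up key-based login first if you want to wrap it in nohup like the tar step. The second command unpacks the archive once the copy has finished.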
It would be quicker if you output the tar file to the new device. This would save the step of having to transfer the tar file after the tar command was done.
Nice idea. Didn't think of that option, tbh. Actually, I didn't know tar was capable of doing that. You might want to post instructions on how that could be done.
nohup tar -cvf new-device-name:/shares/TimeMachineBackup/archive.tar file1 file2 directory1 >tar.log &
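If the host:path form is not supported by the tar build on the device (it typically relies on rmt over rsh), a variant that only needs SSH is to pipe the tar stream across. This assumes key-based SSH login between the devices, since a password prompt will not work under nohup:

nohup sh -c 'tar -cvf - file1 file2 directory1 | ssh sshd@new-device-name "cat > /shares/TimeMachineBackup/archive.tar"' >tar.log 2>&1 &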
Or you could mount the new device on the old device:
mkdir /transfer
mount -t cifs //192.168.1.209/shares/TimeMachineBackup /transfer
nohup tar -cvf /transfer/archive.tar file1 file2 directory1 >tar.log &
Change the IP address to your new device's IP address. Depending on the share's permissions, you may also need to pass credentials to mount, for example -o username=youruser,password=yourpass.
I suggest using a Linux terminal as the most universal approach.
If you don't have Linux on your notebook, you can use PuTTY or VirtualBox (you will need to install Linux in VirtualBox).
Here is a brief description of the steps for a manual backup (it is also possible to create a cron script that runs these steps on a schedule; a rough sketch of that follows after these steps).
From the web interface of the admin panel, go to Settings → Network Settings and enable SSH. Create a password for SSH and remember it!!!
Connect via SSH. The default login user is sshd, but you will become root when the session starts:
$ ssh sshd@your_NAS_ip
your_NAS_ip is visible from settings.
Enter the password you created when switching on SSH (it is not visible while you are typing).
Find in the terminal which hard drive your data is located on:
$ cd /shares
$ ls -la
$ cd /mnt
$ ls -la
(/shares is a directory of links to the real data.) The real directory may look like "/mnt/HD/HD_a2/Public" and so on; explore from /mnt. Use:
3.1. "cd .." – go up one directory.
3.2. "cd <directory>" – enter a directory.
3.3. "ls -la" – list all files in long format, including hidden files.
3.4. "pwd" – show the current directory.
Plug a USB drive into your NAS (it will be mounted automatically), then go to its directory, for example:
$ cd /mnt/USB/<your_usb_dev> (I don't remember the exact path)
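If you are not sure of the exact path, list the mounted filesystems to see where the USB drive ended up:

$ df -h
$ ls /mnt/USB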
Copy the data with the cp command in verbose (-v) and recursive (-r) mode, from the NAS to your backup USB drive (SSD/HDD/whatever you plug in):
$ cp -rv /mnt/HD/* /mnt/USB/<your_usb_dev>
My source directory is /mnt/HD/HD_a2 and the flash drive is /mnt/USB/USB1_c1, so for me the command looks like this: cp -rv /mnt/HD/HD_a2/* /mnt/USB/USB1_c1
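If you also want to preserve timestamps and permissions, the archive flag is an option (same paths as above; note that a FAT32 or NTFS drive cannot store Linux ownership and permissions anyway):

$ cp -av /mnt/HD/HD_a2/* /mnt/USB/USB1_c1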
If you are curious which directory your data is in, you can use "du -sh ." from inside a directory to check its size.
Well, everything is done!!! If the backup USB drive is going to be used on Windows (filesystem type FAT32 or NTFS), a warning may pop up; agree to fix the problems, it should not harm your data.
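For the cron variant mentioned at the start, a rough sketch (assuming crond and the crontab command are available on your firmware; the paths are the ones from my example, rsync is used so repeated runs only copy changes, and on some firmware versions crontab edits may not survive a reboot):

0 3 * * 0 rsync -a /mnt/HD/HD_a2/ /mnt/USB/USB1_c1/ >>/mnt/HD/HD_a2/backup.log 2>&1

Add that line with "crontab -e"; it runs the backup every Sunday at 03:00.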
I’m also a “happy” user of “My Cloud EX2 Ultra” OS3…
PS:
I decided to SWITCH from WD to SYNOLOGY, due to their better software and NAS system.
How did I come to that decision? I could not back up data via the GUI from the admin panel, because the USB backup option was removed from it, and I could not create a task the way the (2016) instructions describe. Also, Synology provides administration via Linux and its own apps, which is more convenient. I was very disappointed that I could not back up data via the usual interface. It is very sad that WD does not care about their older NAS models; it's a pity, and I hope no one will use WD's NASes.