My Cloud replication over the Internet?

I have a My Cloud NAS sitting on my network at home. I’d like to buy another, have it physically located at my brother’s house, and each night replicate the NAS in my house to the one at my brother’s.

Firstly, would this be possible? And secondly, any suggestions or a solution would be very welcome.

Cheers,

Paul

The My Cloud doesn’t have this capability.

Several of the higher-end models do, though.

The trick, though, is (if I recall correctly) they require a static IP address for the “target” network.

If your brother’s ISP, for example, changes his IP address due to DHCP assignment, it doesn’t work well.

You might want to look at the manuals for the various models to see which ones support it and how it works.

Do you really want to expose your personal information, as well as your brother’s, to the internet?

Perhaps you might consider a My Cloud Mirror to ensure data integrity.

It’s not officially supported, but there are several ways to mirror the clouds together over the net, and all of them require some Linux/networking knowledge unless you have a third-party Win/Mac client doing the mirroring job on SMB shares (slower). One option is to use the preinstalled rsync and schedule a crontab entry every night to mirror some paths between the clouds. The cloud acting as the “server” needs to have port 22 forwarded on its router; the cloud acting as the client, on the other hand, will handle the rsync cron job.
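As a rough sketch of that idea (the hostname, paths and username below are placeholders for your own setup, and it assumes cron and key-based SSH login are already working so the job doesn’t hang at a password prompt), the nightly entry in root’s crontab on the “client” cloud could look something like this:

    # added with "crontab -e" as root on the "client" My Cloud
    # hostname, user and share paths are placeholders -- adjust to your own setup
    # every night at 02:00, mirror the Backup share to the remote cloud over SSH
    # (--delete makes it a true mirror: files removed locally are removed remotely too)
    0 2 * * * rsync -av --delete -e ssh /shares/Backup/ backupuser@brothers-house.example.com:/shares/Backup/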

Some tips:

* SSH into your cloud and use “rsync --help”.

* If the cloud “server” has a dynamic IP, you can reach it using a third-party dynamic DNS service (you’ll need to set one up yourself, or use your router’s support for it) or the hostname provided by wd2go.com, i.e. wdmycloud.device(DeviceID).wd2go.com.

* Use compression and choose a lighter cipher, e.g. arcfour. Maybe also disallow pseudo-tty allocation. It helps to speed things up.

* Use a less common external port instead of TCP port 22, and change to a strong root password! (See the sketch after this list for how these options fit together.)
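Putting those tips together, the rsync call could end up looking roughly like this (port number, hostname, user and paths are placeholders, and arcfour is only available on older OpenSSH builds such as the one shipped with the v4 firmware):

    # placeholders: port 4000, the DeviceID in the wd2go hostname, the user and the paths
    # -avz          archive mode, verbose, compress the data stream
    # ssh -p 4000   the less common external port forwarded to port 22 on the remote router
    # -c arcfour    lighter RC4 cipher: less CPU load, weaker crypto
    # -T            do not allocate a pseudo-tty
    rsync -avz --delete -e "ssh -p 4000 -c arcfour -T" \
        /shares/Backup/ backupuser@wdmycloud.deviceXXXXXX.wd2go.com:/shares/Backup/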

All of the above is without installing anything extra on the WDMyCloud, so it’s easy to revert. Security-wise, other than points #3 and #4, it’s your responsibility to protect the “server” from brute-force attacks. I won’t cover more of that here.

Hope it helps.

Is it possible to whitelist some external SSH users? I am scared that people can brute-force the SSH password, because the username is a very common one (root / sshd).

SSH is already locked down by default; you can add users to it.

Best practices are:

  • Change the root password from the default; use a strong password
  • Create a new user for SSH with a strong password
  • Test and make sure you can SSH in with the new user before moving to the next step
  • Do not allow root to log in remotely (a rough sketch of steps 1-3 follows below)
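Roughly, the first three steps come down to something like this (backupuser and the IP address are just examples; on the My Cloud you would normally create the user from the dashboard, and you may also need to give it a valid login shell before SSH will accept it):

    # on the My Cloud, logged in over SSH as root
    passwd                           # step 1: set a new, strong root password
    # step 2: create the user from the dashboard (or: adduser backupuser)
    # step 3: from another machine, confirm the new user can actually log in
    ssh backupuser@192.168.1.100     # example LAN address of the My Cloud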

Thanks. I am not a Linux / SSH expert, so not allowing root to log in remotely sounds a little bit ‘tricky’ to me. Can you tell me what it exactly means and how to do this?

Just search for it; there are plenty of results and how-tos out there. Research, research, and read, read before you try any. This is one example:

Remember: create a user first, normally using the dashboard, then permit that user to SSH in and test, test, test. Otherwise, if you remove root login before that, you will be locked out of the box.
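For reference, on the stock (Debian-based) firmware the “do not allow root to log in remotely” part usually comes down to a couple of lines in /etc/ssh/sshd_config. backupuser is just a placeholder name, and keep your current root session open until you have verified that the new login works:

    # /etc/ssh/sshd_config
    # refuse direct root logins over SSH
    PermitRootLogin no
    # optional: only this account may log in over SSH
    AllowUsers backupuser

    # apply the change (verify the new user in a second session before closing this one)
    /etc/init.d/ssh restart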

Thanks, I will try this.

Take a look at my security write-ups at the end of this 1st post. This is specifically for MyCloud. Simple steps to follow.

It also includes a way to prevent brute-force attacks while preventing yourself from being locked out. I can’t think of a better way to keep these pests off my property :stuck_out_tongue:

You mean this?

Yup :slightly_smiling: BTW JFYI, that site is running on WDMyCloud 4TB 1st Gen firmware v4.

A double check to see if I understand correctly what I am doing:

  1. Add a user
  2. Allow this user to log in from another device (test, test, etc.)
  3. Disable login from another device for the root user
  4. The user from step 1 can still do root things by using sudo
    This user can also undo step 3.

And this is more secure, because brute-force hackers don’t know your username?

Will these settings stay as they are after a reboot or firmware upgrade, etc.?

If you’re a target, any username will be subject to dictionary attacks, especially since usernames are case-sensitive and normally restricted to the format a-z (1st character), then a-z, 0-9, ‘-’ or ‘_’, unless you use --force-badname, which also allows uppercase and periods. Your best chances are a good strong password, or disabling password logins altogether by using RSA/DSA key exchange.
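If you go the key-exchange route, a rough sketch looks like this (the hostname and username are only placeholders; run the first two commands on the machine that will be connecting in):

    # on the connecting machine: generate a key pair and copy the public key to the My Cloud
    ssh-keygen -t rsa -b 4096
    ssh-copy-id backupuser@wdmycloud.example.com

    # then on the My Cloud, in /etc/ssh/sshd_config, turn off password logins entirely
    PasswordAuthentication no
    # ...and restart sshd: /etc/init.d/ssh restart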

For the MyCloud Gen1, yes, the settings stick across a reboot, but after a firmware upgrade you’ll have to reapply these modifications.

If I only forward the SSH port to my WDMC in my router, can’t I just use
/etc/hosts.allow (with the local range and one external IP)
and
/etc/hosts.deny (blocking all)
to be 100% safe?
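Something like this is what I have in mind (the external address is made up, and I’m assuming the My Cloud’s sshd honours the TCP wrappers files):

    # /etc/hosts.allow -- my LAN plus my brother's (fairly static) external address
    # 203.0.113.45 is a made-up example address
    sshd: 192.168.1.0/255.255.255.0 203.0.113.45

    # /etc/hosts.deny -- everything else
    sshd: ALL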

I wouldn’t change those at all. But you can if you really want to. However, that would mean no remote access in case you are using the remote WD apps.

For your firewall, just make sure that whatever port you use goes to MBL/22 and nothing else. In the firewall you should be able to allow a specific source IP if you want to as well.
Firewall/4000 (WAN) → MBL/22 (LAN)

FYI, there is no such thing as 100% safe. All we can do is make it harder for them to get in, but if someone REALLY wants to get into your system, a good hacker will find a way.

Thanks again. I know there’s no such thing as 100% safe, but I want to make it safe enough to scare most hackers away, without changing too many ‘risky’ settings (as I am not a Linux expert).

I am not using the remote WD apps. I will use this My Cloud as a satellite backup device for my My Cloud Mirror. The only thing it has to do is be an rsync daemon.
That’s why I thought that if I forward only one WAN port to MC:22 and only allow one external IP and all internal IPs to access it…

What do you advise in this situation?

I was reading this link posted yesterday: Securing WDMyCloud SSH & FTP Remote Access | TeaNazaR.com

So if I change my root password, whitelist one IP and one user for rsync, and make use of /etc/hosts.deny and hosts.allow for SSH access, do I really need the whole script as written on this website? And will I be ‘safe enough’?

:bulb: Another idea that just came up: can’t I just open an external (WAN) port for rsync, without disabling SSH completely?
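Roughly what I’m picturing is forwarding some WAN port to the My Cloud’s rsync daemon port (873 by default) and an /etc/rsyncd.conf along these lines (module name, path, user and addresses are made up):

    # /etc/rsyncd.conf on the satellite My Cloud
    [mirrorbackup]
        path = /shares/MirrorBackup
        read only = no
        # credentials the Mirror would use, stored in the secrets file
        auth users = backupuser
        secrets file = /etc/rsyncd.secrets
        # only my LAN and the Mirror's external address may connect
        hosts allow = 192.168.1.0/24 203.0.113.45
        hosts deny = *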

I don’t really recommend manually changing the hosts files, especially if you’re new to all this. What if your IP lease expires and the address changes, unless of course you have a static IP? Or, worse, you could get locked out permanently if the wrong changes were made.

The whole script I wrote that you see might look lengthy, but it uses very few resources when active in the background. I have it running on all my net-facing systems that don’t support dynamic iptables or firewalls. You can test it out: try an SSH, FTP or FTPS login to my MyCloud, i.e. ssh teanazar.com. Three failed logins and you’re banned :stuck_out_tongue:

For your idea, are you aware that the rsync daemon itself has no encryption, and the authentication, if applied, is done using the old MD4 hash? It relies on SSH for encrypting the data exchange. To speed up the process, you can trade off with a weaker cipher such as RC4 (arcfour).

Whatever it is, yes there’s no such thing as 100% safe. You don’t scare hackers away but you deter them.

My IP only changes about once a year. That’s why I also want to add a local IP range.

I wasn’t aware of the weak security of rsync, thanks.

I’ll try and see if I can get your script working (next weekend).