Remote backup with nonstandard ports (not 873 and 22)


#1

I have set up my 2nd WD My Cloud EX2 Ultra and configured it to use ports 30022 and 30873 for SSH / Remote backup. I cannot simply use 22/873 because of a conflict with other services listening on my one and only static IP.

I went to my first WD My Cloud EX2 and wanted to set up Remote backups. But there is no way to specify the ports. Why then does WD allow me to enable port forwarding to arbitrary ports? What is the workaround?

Thanks,
Tomas


#2

Isn’t “Settings --> Network --> Port Forwarding” what you are looking for?

It’s been a while since I played with this... but I seem to recall having to make adjustments to the router’s port forwarding table as well as on the NAS. (In my case, I was getting two separate NAS boxes onto the internet, and they naturally wanted to default to the same ports for communication.)


#3

Hi, yeah, but this is not what I mean. I have this set up already; the issue is how to specify these different ports on the OTHER device that is supposed to access them. It looks like WD just assumes the ports are standard when setting up a remote backup to another WD.


#5

Yeah... I think somewhere in this mess you have to do some port forwarding on the router. I know that for file access, the two NAS drives I have use the same internal ports. I know the NAS drives can assign different external ports to the internal ports, but I also know that the router can do the same thing.


#6

I usually do port forwarding on the router; however, you may want to make use of UPnP.
My Cloud OS ships with a rather obscure tool, upnp_igdctrl:

Alpha Networks UPnP ControlPoint  IGD-ctrl 1.0

Usage:
    Add port mapping:
        upnp_igdctrl [-S service] -A -t {TCP | UDP} -e <port number> -p <port number> -d <string> [-n <loop number>]
    Delete port mapping:
        upnp_igdctrl [-S service] -D -t {TCP | UDP} -e <port number> [-n <loop number>]
    Get specific port mapping entry:
        upnp_igdctrl [-S service] -P -t {TCP | UDP} -e <port number>
    Get external IP address:
        upnp_igdctrl [-S service] -W
	
    options:
        -S    service: IP(WANIPConnection) or PPP(WANPPPConnection) [default=IP]
        -t    protocol: TCP or UDP
        -e    external port number
        -p    internal port number
        -d    service description
        -n    number of ports to be added or deleted

I think you can map external port 40022 on the router to internal port 30022 on the NAS with this command:

upnp_igdctrl -A -t TCP -e 40022 -p 30022 -d my_ssh
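
Going by the usage text above, the -P and -W switches should let you verify the result; a rough sketch, just mirroring the example ports (whether your router answers these queries is another matter):

upnp_igdctrl -P -t TCP -e 40022   # show the mapping entry for external port 40022
upnp_igdctrl -W                   # print the router's external IP address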

#7

Hi, and thanks for the reply, but this is not what I am asking. I know how to forward ports on the router, and I know how to forward them on the WD Cloud as well. What I need is a way to set up a Remote Backup TO a server that has its ports forwarded to nonstandard 22 and 873, as there is no way to specify this in the settings. All the replies are about how to forward ports, not about how to use the forwarded ports on the second WD that needs to access the primary WD over these different ports.


#8

Isn’t the port forwarding bi-directional?


#9

The problem is really not HOW TO FORWARD PORTS, but HOW TO USE FORWARDED PORTS on the remote backup setup screen, i.e. on the second WD Cloud, not the one where the ports are forwarded.


#10

This is the same question I have too. My destination router won’t open port 22 (I have port forwarding configured on the destination router to send port 873 to the destination My Cloud, but port 22 won’t configure on my destination router due to a “conflict” glitch). So if I change the port forwarding config on the destination My Cloud to 30022 and configure the destination router to forward port 30022 to the destination My Cloud, how do I then configure the backup job on the source My Cloud to use port 30022 to get through the destination router?

This sounds like what you are asking too.


#11

I’ve done some digging… here are the results.

Server

First you’d want to run the rsync daemon on the server side on a non-default port.
You may run it directly:

rsync --daemon --port=10873 --config=/etc/rsyncd.conf

Or use the WD wrappers to save a custom port in the config and then start the daemon

rsyncom -e 1 -p somepassword -o 10873 -s
rsyncom -x
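
You can then sanity-check that the daemon is really listening on the custom port; a quick check, assuming the BusyBox netstat on the firmware supports these flags:

netstat -ltn | grep 10873   # should show a LISTEN entry on port 10873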

The password gets converted to base64 and is stored in /etc/rsyncd.secrets.
The WD wrapper generates a configuration file with rsync modules for all shares.
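
If you want to double-check what ends up in the secrets file, you can generate the base64 form yourself; a one-liner, assuming a base64 utility is present on the box (openssl base64 works as a fallback):

echo -n 'somepassword' | base64   # prints c29tZXBhc3N3b3Jk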

I wouldn’t open up the rsync port to the public internet. I think the password can be brute-forced too easily.
Use SSH instead.
You should change the SSH port in /etc/ssh/sshd_config (e.g. to 10022) and then:

kill -HUP $(pidof sshd)
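
A minimal sketch of that edit, assuming the stock sshd_config still has its Port directive commented out (otherwise edit the existing line instead of appending); after the HUP above, verify the new port from a second session before closing the one you are in:

echo 'Port 10022' >> /etc/ssh/sshd_config   # add the new port
ssh -p 10022 root@nas_ip_address            # nas_ip_address is a placeholder; confirm you can still log in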

Client

On the client side, you may connect directly to the rsync daemon to list the shares. It will prompt for a password; fill in the base64-encoded version! If you chose ‘somepassword’ before, the password is ‘c29tZXBhc3N3b3Jk’.

rsync -rtv --port 10873 root@server_ip_address::

e.g. to list the contents of the Public share

rsync -rtv --port 10873 root@server_ip_address::Public

e.g. to archive pictures from the local PC to the Pictures share on the server (only transferring those that aren’t there yet)

rsync -a --progress ~/cat_pics --port 10873 root@server_ip_address::Pictures/cats
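
If you want to avoid the interactive prompt (e.g. for scripting), the daemon connection can also read the password from a file via --password-file (or from the RSYNC_PASSWORD environment variable); a sketch, reusing the example password, with a file path that is only an example:

echo 'c29tZXBhc3N3b3Jk' > /root/.rsyncpw   # the base64 form, same as in rsyncd.secrets
chmod 600 /root/.rsyncpw                   # rsync refuses password files readable by others
rsync -rtv --port 10873 --password-file=/root/.rsyncpw root@server_ip_address::Public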

Note that there’s no encryption of the data over the rsync port.
For remote connections, you should connect over SSH. Note the single colon.

rsync -rtv --port 10022 root@server_ip_address:

Sync Jobs

There is a WD tool to manage rsync jobs, but I haven’t analysed it yet.
It creates cron jobs with rsync commands. It is very unlikely that it supports custom ports, so better not to get your hopes up.

root # rsyncmd -h

NAME
	rsyncmd - Run Rsync Backup	[v1.00_20130904]

SYNOPSIS
	rsyncmd [-h] [-n seq] [-r job_name] [-t inc_remote_path] [-l inc_path_list]

DESCRIPTION
	-h	help
	(local) -b	force to run rsync backup (ignore state)
	(local) -n	input backup seq, and run rsync backup recover to remote
	(local) -r	input job name, and run rsync backup according to xml setting
	(local) -k	input job name, and kill rsync backup job
	(local) -l -r 	output inc path to xml file
	(remote) -t	rotate remote inc folder
	(remote) -l -t	output xml format inc path

#12

Hello and thank you so much for helpful reply.

Listing of all shares on remote server works.

root@michnoCloud root # rsync -rtv --port 30873 root@XXXXXXXXXX::
Public         
SmartWare      
TimeMachineBackup

However, I have an issue when trying to list a specific share.

root@michnoCloud root # rsync -rtv --port 30873 root@XXXXXXXXXXX::Public
Password: 
@ERROR: auth failed on module Public
rsync error: error starting client-server protocol (code 5) at main.c(1530) [Receiver=3.0.7]

I have confirmed, by base64-decoding /etc/rsyncd.secrets, that the password I use is correct.

Here is my rsyncd.conf on server:

root@MyCloudEX2Ultra root # cat /etc/rsyncd.conf 
hosts allow = *
hosts deny = *
use chroot = no
uid = root
gid = root
secrets file = /etc/rsyncd.secrets
pid file = /var/run/rsync.pid
timeout = 300

[ Public ]
path = /mnt/HD/HD_a2/Public
read only = false
list = yes
auth users = root

This all seems correct; any ideas what might be wrong?

P.S. rsync over SSH just hangs. I can SSH from my PC just fine, but it does not work from one WD to the other. I think --port is not how you specify the SSH port with rsync; I will try to figure out what is wrong:

root@michnoCloud root # rsync -rtv --port 30022 sshd@XXXXXXXXX:
ssh: connect to host XXXXXXXXX port 22: Connection timed out
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: unexplained error (code 255) at io.c(601) [Receiver=3.0.7]

Thanks again!


#13

Ok, getting somewhere!

rsync -e 'ssh -p30022' -rtv --port 30873 sshd@XXXXXXXXXX:/mnt/HD/HD_a2/Public/

This connects to SSH on port 30022 and to the rsync daemon on port 30873! I will now play with the commands and let you know if I figure out how to back up properly.


#14

Alright, I am backing UUUUUPP!!! All hail to @Tfl who helped me with basic syntax.

root@michnoCloud root # rsync -e 'ssh -p30022' -a --progress /mnt/HD/HD_a2/ --port 30873 sshd@XXXXXXXXX:/mnt/HD/HD_a2/Backups/

What is left:

  1. figure out how to CRON this (see the sketch after this list)
  2. figure out how to hardcode the SSH password into the command
  3. or, even better, figure out how to use SSH keys for authentication
  4. optimize the command to exclude some dirs (this should be easy; the sketch below shows one way)
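
For items 1 and 4, a rough sketch of a crontab entry reusing the command above. It assumes key-based SSH auth (item 3) is already in place so nothing prompts for a password; the schedule and the excluded share are only examples, and note the point about persistence across reboots in the next reply:

# run nightly at 02:00; skip the TimeMachineBackup share
0 2 * * *  rsync -a -e 'ssh -p30022' --exclude 'TimeMachineBackup/' /mnt/HD/HD_a2/ sshd@XXXXXXXXX:/mnt/HD/HD_a2/Backups/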

I will update you once I have it all figured out. Or I will cry for help. One or the other.

THANKS!


#15

That auth error means the password was not accepted. Just use the base64-encoded password exactly as it appears in the secrets file. You could also try with my example password from above.

I strongly recommend creating an SSH configuration file at /home/root/.ssh/config or ~/.ssh/config or /etc/ssh/ssh_config:

Host otherbox
    Hostname XXXXX
    User root
    Port 30022

Then set up your keys and copy them to otherbox:

ssh-keygen
ssh-copy-id otherbox
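
If ssh-copy-id is not present on the firmware, the manual equivalent is to append the public key yourself; a sketch, assuming the default id_rsa key name:

cat ~/.ssh/id_rsa.pub | ssh otherbox 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'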

Then simply

rsync -a /shares/ImportantData otherbox:/shares/Backups

Finally, get persistence across reboots, e.g. with this snippet, which is part of my Entware package.