This thread (special thanks to Ewald for mentioning the Linux e2tools), along with the older thread that's around (mentioning R-Linux), helped me recover my data from my WD NAS drive. Thought I'd contribute the missing piece in my scenario.
If your situation is similar to mine, this might help. You'll need some Linux proficiency, though.
First a little background…
My WD NAS wouldn't boot after a power outage (steady yellow light), and although I didn't have any critical data on there, I had some large files that were not yet uploaded to my offsite backup. I wanted to recover them if possible, and I ended up here in the forums. I found out about R-Linux in an older thread, and it had worked for most people. It runs on Windows and Linux, but I'm currently running Mac OS (I don't have a Windows machine on hand). Since VMware Fusion allows a 30-day trial, I downloaded it, created a Linux VM, and installed R-Linux inside it. It ran, and I could see my WD drive with all the files.

Doing a recovery using R-Linux looked promising… but there was a problem. A permissions problem. No big issue in itself, since I could manually change the permissions afterwards (I used an app called BatChmod - http://www.lagentesoft.com/batchmod/). The real problem was that the folders themselves were unreadable when I recovered them: R-Linux would not let me recover any folder that contained sub-folders, because the folders had the wrong permissions. As a last resort, I started recovering my files folder by folder (this drive had a complex directory structure). After a while, I realized this was not working: I had too many subdirectories and I had to go one by one. Since R-Linux only presents a GUI, and I couldn't mount the drive via the terminal, this seemed like the only way to go.

Enter e2tools. I found this thread, installed e2tools, and realized that I could now copy files using the terminal. I tried using these tools to copy a directory via the command line, but after many failed attempts I gave up. These tools could only copy individual files, not full directories (at least I couldn't find a way). So I decided to write a script. After going back to R-Linux for one more attempt, I noticed that it allowed me to export the whole directory structure into a text file. Great — at least I could use that and wouldn't have to build the listing in the script.
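For reference, basic e2tools usage looks roughly like this (the device name and paths below are just examples — adjust them to your partition; e2tools addresses the unmounted ext filesystem with a `device:path` syntax):

```shell
# list a directory on the unmounted ext3 partition (device:path syntax)
sudo e2ls /dev/sdb4:/shares/Public

# copy a single file out to a local directory
sudo e2cp -v /dev/sdb4:/shares/Public/somefile.bin /mnt/hgfs/NAScopy/
```

As mentioned, e2cp works file by file — there's no recursive directory copy — which is what the script below works around.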
Instead of using Perl or Python (like a normal programmer would), I decided to create a Bash shell script (note: I'm not a shell script programmer). I only needed to create a few directories and copy a few files — how hard could that be? Turns out that my directories had complex names, some containing spaces, and string manipulation is a bit weird in shell script. It took longer than it should have, but in the end I was able to write a script that goes through my list of files (the one I exported from R-Linux) and copies each file, creating the folder structure as it goes.
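In case it saves someone else the trial and error, these are the two bits of Bash string handling the script leans on (a minimal sketch with a made-up filename, not part of the script itself):

```shell
#!/bin/bash
line="My Music/Some Album/01 - Track.mp3"   # hypothetical entry with spaces

# last character of the string; the space before -1 is required,
# otherwise Bash treats it as the ${var:-default} expansion
lastchar="${line: -1}"
echo "$lastchar"          # a trailing "/" would mean the entry is a directory

# parent directory; the double quotes protect the spaces
dir=$(dirname "$line")
echo "$dir"
```

That `${line: -1}` substring expansion and consistent double-quoting are what make entries with spaces survive the trip.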
The script is a bit rough around the edges (I was learning on the go), but it got the job done. Like I mentioned, I used R-Linux to export the file and directory listing into a text file; without this text file, the script won't work. If someone bricks a WD NAS, finds my answer, and is a programmer, maybe they can add that portion to the script, so there's no dependency on R-Linux and anyone who runs into this issue can simply run one script after installing the e2tools.
You have to define a few variables: the path to the WD NAS, the path where you want to copy the files to, and the path to the file containing all the entries exported from R-Linux. You should create a small test file with only a few entries first, to make sure it works. Remember this copies each file one by one, so it can take a while if you are moving several TB, like I was.
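I no longer have the exact R-Linux export on hand, but the script expects one entry per line, with directory entries ending in a trailing slash (that's how it tells folders from files). Something roughly like this, with hypothetical paths:

```
Photos/
Photos/2014 Vacation/
Photos/2014 Vacation/IMG_0001.jpg
Music/Some Album/01 - Track.mp3
```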
Hope this helps at least one person out there!
It's a Bash shell script: save it in a text file, add a .sh extension, and execute it. Here it goes:
sourcepath="/insert/path/of/WD/NAS/"                        # i.e. "/dev/sdb4:/"
destinationpath="/insert/path/of/destination/HD/"           # i.e. "/mnt/hgfs/NAScopy/"
FILE=/path/to/file/with/filenames/exported/from/RLinux.txt  # i.e. /mnt/hgfs/NAScopy/tree.txt

k=0
while read -r line; do
    k=$((k+1))
    echo "Line # $k"
    lastchar="${line: -1}"                # last character of the entry
    if [ "$lastchar" == "/" ]; then       # create folder if entry is a directory
        echo "this is a FOLDER"
        mkdir -p "$destinationpath$line"
    else
        sourcedir="$sourcepath$line"      # need double quotes around full pathname
        destdir="$destinationpath$(dirname "$line")"    # extract destination directory
        sudo e2cp -v "$sourcedir" -d "$destdir"
        #echo sudo e2cp -v "$sourcedir" -d "$destdir"   # dry run: comment the line above and uncomment this one
    fi
done < "$FILE"
echo "Copied all $k entries"