I have a My Book Live that no longer works on the network and had to pull the drive. Need to recover files

I have a My Book Live 2TB drive. I tried every step to get it back on the network, and even tried reading it when directly connected to a couple of different computers, with no luck. Finally, a technician said the only choice is to remove the drive from the housing and use a USB drive reader to recover the files.

When I try to access the drive, all I get is a message that the drive needs to be formatted. I did not format it, but finding the files on this drive seems more difficult than I initially thought.

Any ideas how to get the files off this drive? Is there any hope of recovering them at all, and if so, can they be recovered in their original folders?

I used this as a NAS drive for 4 years so there are a lot of files on it.

It sounds like you already removed the drive from the housing.
If the drive is installed in a Linux box, install “e2tools” to recover the files from your WD drive, as it was formatted using a non-standard 64K block size which you cannot mount on a Linux x86/x64 box. The advantage of these tools is that you don’t need to mount the drive, and that you can also use them to recover missing files from a WD image (extract rootfs.img from data.tar within e.g. apnc-024310-048-20150507.deb).
Example commands:
$ e2ls -l /dev/sdb1:/etc (list directory on WD root drive volume)
$ e2cp /home/rootfs.img:/etc/inittab /tmp/inittab (recover inittab file from WD image)
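If you also want that rootfs.img, you can pull it out of the firmware package first; a rough sketch (the inner archive may be data.tar, data.tar.gz or similar depending on the firmware version, and the location of rootfs.img inside it varies, hence the find):
$ ar x apnc-024310-048-20150507.deb    # unpack the .deb into debian-binary, control.tar, data.tar
$ tar -xf data.tar                     # extract the payload archive
$ find . -name rootfs.img              # locate the image in the extracted tree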

If you only have a Windows system, install Linux Mint on VirtualBox.
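One thing to keep in mind with that route: the Linux VM has to be able to see the drive. With the drive in a USB reader, passing the USB device through to the VM (Devices > USB in the VirtualBox window) is usually enough. If the drive is attached directly to the host instead, a raw-disk VMDK can be created and added to the VM; a sketch, assuming the disk shows up as PhysicalDrive1 in Windows (check Disk Management for the real number, and run from an administrator prompt):
VBoxManage internalcommands createrawvmdk -filename "C:\VMs\wd-raw.vmdk" -rawdisk \\.\PhysicalDrive1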
Regards,
Ewald

Looks like this will cause me to lose my files though?

We do embroidery and have a lot of .pes and .dst files that were saved on this drive that really need to be recovered.

@Ewald - This sounds promising. Do you have any guides on setting up Linux in VirtualBox? I am currently using Windows 7 Professional.

I once used a program called Baby Linux (same as mini Linux?) to recover files off a bootable Windows system drive that had quit booting the PC. Using the same PC, I booted Linux from a CD, found the data folders, and copied them off the bad drive onto a good drive. So, give it a try.

This thread (special thanks to Ewald for mentioning the Linux e2tools), along with the old thread that’s around mentioning R-Linux, helped me recover my data from my WD NAS drive. Thought I’d contribute the missing piece in my scenario.

If your situation is similar to mine, this might help, though you’ll need some Linux proficiency.

First a little background…
My WD NAS wouldn’t boot after a power outage (steady yellow light), and although I didn’t have any critical data on it, I had some large files that were not yet uploaded to my offsite backup. I wanted to recover them if possible, and I ended up here in the forums. I found out about R-Linux in an older thread, and it had worked for most people. It runs on Windows and Linux, but I’m currently running Mac OS (I don’t have a Windows machine on hand). Since VMware Fusion allows a 30-day trial, I downloaded it, created a Linux VM, and installed R-Linux inside the VM. It ran, and I could see my WD drive with all the files.

Doing a recovery using R-Linux looked promising… but there was a problem: a permissions problem. Wrong file permissions were no big issue, since I could change those manually afterwards (I used an app called BatChmod). The real problem was that the folders themselves were unreadable when I recovered them. I could not recover any folder that contained sub-folders; R-Linux would not allow it because the folders had the wrong permissions. As a last resort, I started recovering my files folder by folder (this drive had a complex directory structure). After a while, I realized this was not working: I had too many subdirectories and had to go one by one. Since R-Linux only presents a GUI, and I couldn’t mount the drive via the terminal, this was the only way to go.

Enter e2tools. I found this thread, installed e2tools, and realized that now I could copy files using the terminal. I tried using these tools to copy a directory via the command line, but after many failed attempts I gave up. These tools can only copy individual files, not full directories (at least I couldn’t find a way). So I decided to write a script. After going back to R-Linux for one more attempt, I noticed that it allowed me to export the whole directory structure into a text file. Great: at least I could use that and wouldn’t have to do that part in the script.
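For what it’s worth, on Linux the same permissions cleanup BatChmod did for me can be done from the terminal; a minimal sketch, with the path being a placeholder for wherever the recovered copies ended up:
$ sudo chown -R $(whoami) /path/to/recovered/files   # take ownership of the recovered copies
$ chmod -R u+rwX /path/to/recovered/files            # make files readable/writable, directories traversable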

Instead of using Perl or Python (like a normal programmer would), I decided to create a Bash shell script (note: I’m not a shell script programmer). I only needed to create a few directories and copy a few files; how hard could that be? It turns out my directories had complex names, some containing spaces, and string manipulation is a bit weird in shell script. It took longer than it should have, but in the end I was able to write a script that goes through my list of files (exported from R-Linux) and copies each file, creating the folder structure at the same time.

The script is a bit rough around the edges (I was learning as I went), but it got the job done. Like I mentioned, I also used R-Linux to export the file and directory listing into a text file. Without this text file, the script won’t work. If someone bricks a WD NAS, finds my answer, and is a programmer, maybe they can add that portion to the script, so there’s no dependency on R-Linux, and anyone else who runs into this issue can simply run one script after installing e2tools.

You have to define a few variables: the path to the WD NAS, the path you want to copy the files to, and the path to the file containing the entries exported from R-Linux. You should create a small test file with only a few entries first to make sure it works. Remember that this copies each file one by one, so it can take a while if you are moving several TB, like I was.
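For reference, the script expects each line of that text file to be a path relative to the volume root, with directories ending in a trailing slash (that is how it tells folders from files). These entries are made up just to show the shape; check your own R-Linux export and adjust if the format differs:
shares/Public/Embroidery/
shares/Public/Embroidery/logo1.pes
shares/Public/Embroidery/logo2.dst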

Hope this helps at least one person out there!

It’s a Bash shell script; you need to save it in a text file, add a .sh extension, and execute it. Here it goes:

#!/bin/bash

sourcepath="/insert/path/of/WD/NAS/"                # i.e. "/dev/sdb4:/"
destinationpath="/insert/path/of/destination/HD/"   # i.e. "/mnt/hgfs/NAScopy/"

# List of files/folders exported from R-Linux, one entry per line,
# directories ending with a trailing "/"
FILE=/path/to/file/with/filenames/exported/from/RLinux.txt    # i.e. /mnt/hgfs/NAScopy/tree.txt

echo "################################"
k=1
while IFS= read -r line; do
    echo "Line # $k"

    lastchar="${line:(-1)}"

    if [ "$lastchar" == "/" ]; then
        # create folder if entry is a directory
        echo "this is a FOLDER"
        mkdir -p "$destinationpath$line"
    else
        # copy file
        sourcedir="$sourcepath$line"       # need double quotes, pathnames may contain spaces
        destdir=$(dirname "$line")         # extract the directory part of the entry
        destdir="$destinationpath$destdir"
        mkdir -p "$destdir"                # make sure the destination folder exists
        sudo e2cp -v "$sourcedir" -d "$destdir"
        #echo sudo e2cp -v "$sourcedir" -d "$destdir"
    fi

    ((k++))
done < "$FILE"
echo "Processed $((k-1)) entries"
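To run it (recover.sh is just whatever name you saved it under; piping through tee keeps a log, which is handy when you are copying thousands of files):
$ chmod +x recover.sh
$ ./recover.sh | tee recovery.log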

This seems a little deeper than I am ready for. We are struggling to recover the missing .pes files from the drive, and it appears I need to be a little more proficient with Linux than I am.

Is there a way to find a file that was open on a computer, maybe like a temp file location that saves a copy?

We use Palette 9.0 to update the .pes files so most of them have been opened at some point on a PC.

Darren, this might be a simpler plan:

http://john-hunt.com/2013/04/25/recovering-data-from-a-wd-mybook-live-2tb-3tbor-similar/