My Book Studio LX hardware issues - possible to still save the data?

Apologies for the lengthy post, but I wanted to include enough detail for the experienced folks to be able to offer help, and for the inexperienced folks who follow to get some useful info.

I’ve got a stack of four Studio LX 2TB drives connected to my Windows XP system. They were bought at different times, but they are the same model and capacity. Recently, one of the drives has been generating “delayed write” errors in Windows. This would occasionally happen across all the drives - which I had chalked up to issues with the FireWire 800 interface - but usually just rebooting the drives and the system would get things working again. This time, the one drive just kept causing errors…I figured that was my warning to start backing up that data and getting off that drive.

During the various sessions to copy data off the drive, though, I started getting CRC errors in Windows. Thinking the problem still had something to do with the FireWire 800 interface, I disconnected that and connected the drive via USB. No change. I also noticed that even though I was getting CRC errors when copying, directories/files that failed with CRC errors on one attempt would sometimes copy fine on another…leading me to believe that the drive itself might not have an issue. I started to suspect the electronics surrounding the drive, and further reading here firmed up that opinion.

So now I just wanted to find a way to get the data off the drive. I decided to crack open the case and access the drive inside. I only found one article on how to open that type of case but I couldn’t really make that work and ended up (gently) mangling the case a bit to get inside. I made sure not to damage any components inside but it’s clear that the case will eventually be heading for the scrap heap.

Like many before me, I saw that the drive inside was a SATA drive, so I pulled it, opened up the Dell Windows machine, and connected it internally to see if it would read. Disk Management could see the drive but called it “uninitialized,” and a Windows wizard popped up asking me to initialize it. Wanting to go slow, I cancelled and did a bit more research…

I saw various posts talking about the encryption, etc., and realized that just putting it in a machine wouldn’t make the data readable. I pulled it out, connected it back to the Studio LX board, and confirmed that it would show up in the machine as before (so no damage), but I knew I still wouldn’t be able to read all the data without errors. I started to think about cannibalizing one of the other Studio LXs to use its hardware board…but I knew that I’d lose the case…and didn’t want to potentially screw up the data on a second drive… I thought about looking for a Studio LX on eBay and cannibalizing that instead…

Then my brilliant mind kept thinking about the encryption. Could it really be that all the data is encrypted on the disk? I brought up the SmartWare software and disabled the security feature…it took about a second. I started thinking to myself that if this thing had full disk encryption, there’s no way it could take only a second to disable it across the whole disk…

First question: Does the Studio LX really use full disk encryption?

I thought about the performance implications of full disk encryption and figured that couldn’t be the case…and started thinking about the NTFS formatting that I always do when putting in a new one of these…figured that if it’s formatted as NTFS, then perhaps by turning off the security feature it might be readable by the Dell desktop…I put it back in. No change; Windows wanted me to initialize it. This time, I did…

So the drive was now initialized. I had done the first “destructive” thing to the drive, hoping that once initialized, Windows would be able to access the NTFS data and I’d be home free. No dice. Windows now called it “unpartitioned”. At this point, I stopped, pulled the drive back out, reconnected it to the Studio LX board, and connected the USB back to the desktop.

Oops. Now the SmartWare software is saying “No writeable WD SmartWare partition found”. Since it was able to read the drive before, I suspected the initialization in Windows screwed something up. I disconnected and reconnected the USB a few times…no change. Restarted the Studio LX electronics a few times…no change. Then I read an article from fzabkar and confirmed in my mind that initializing was a bad thing to do. But seeing the reference to a “relatively” easy repair if the previous poster had stopped at initialization, I am hopeful. So that’s where I am. Full stop.

Second question: I want to get the Studio LX hardware back to the point where it can read/recognize the WD partition and the data on it. How do I “undo” the effects of the Windows initialization process?

I figure I will worry about the next step - how to successfully access the data - after I recover from the Windows-initialized disk problem.

Thanks to all!

Hello,

You can’t undo the initialization process. I’m pretty sure that when you begin it, you get the following message:

“Click OK to the ‘WARNING: Formatting will erase all data on this volume. To format the volume, click OK. To quit, click Cancel.’ message.”

Regarding the partition issue, check the article below:

http://wdc.custhelp.com/app/answers/detail/a_id/3890/~/wd-smartware-error%3A-no-writable-wd-smartware-partition-found-on-mac-osx

No, there was no warning from Windows about losing data…and I didn’t go as far as formatting the disk in any manner. In Windows, the partition showed up, but it was identified as “Unallocated” and I was prompted to initialize (but not format) the disk. It only took about a second (and is now costing me hours to try to recover from).

The only reason I allowed the initialization is that I was under the mistaken impression that all Windows wanted to do was write its signature to the disk/partition. I knew enough not to format in any way…I’d seen the posts that recommended not doing any type of write activity to the disk, and yet I had a feeling that - with the drive plugged directly into my desktop - if only I could get Windows to recognize that partition, I’d be home free…
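
(For anyone curious what “initialize” actually writes: roughly, a fresh MBR. Here’s a little read-only sketch - Python, with a hypothetical image/device path, not anything WD provides - that just inspects sector 0 for the two things Windows puts there: the boot signature at bytes 510-511 and the 4-byte disk signature at offset 440.)

```python
# Read-only sketch: inspect sector 0 for the structures a Windows
# "initialize" writes. The path below is hypothetical -- pointing it at a
# saved raw image of the disk is much safer than a live drive.
import struct

DEVICE = "drive.img"  # hypothetical: a raw image (or \\.\PhysicalDriveN)

with open(DEVICE, "rb") as disk:
    sector0 = disk.read(512)

boot_sig = sector0[510:512]                          # b"\x55\xaa" on an MBR disk
disk_sig = struct.unpack("<I", sector0[440:444])[0]  # Windows disk signature

print("MBR boot signature present:", boot_sig == b"\x55\xaa")
print("Windows disk signature: 0x%08X" % disk_sig)
```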

Warning to those in the future: don’t do it! You can’t simply pull the drive out of a Studio LX enclosure (not sure about other models) and plug it into a desktop to read/save the data!

Now trying to image the drive with R-Studio to see what - if anything - can be saved.

compnerd30 wrote:

No, there was no warning from Windows about losing data… […] The only reason I allowed the initialization is that I was under the mistaken impression that all Windows wanted to do was write its signature to the disk/partition.

That’s exactly what Windows did, but the Studio ships with an APM (Apple Partition Map) signature, so writing another signature over it destroys the volume.
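
If you want to check what’s actually sitting in those first sectors now, here’s a rough sketch (Python, hypothetical image path - not a WD tool) that distinguishes an intact Apple Partition Map from a freshly written MBR. APM puts an “ER” driver descriptor in block 0 and “PM” partition-map entries in the blocks after it.

```python
# Sketch: tell an Apple Partition Map apart from a Windows MBR.
# APM: block 0 starts with "ER", block 1 (a partition-map entry) with "PM".
# MBR: bytes 510-511 of sector 0 are 0x55 0xAA. Path is hypothetical.
IMAGE = "drive.img"  # raw image of the disk, hypothetical name

with open(IMAGE, "rb") as f:
    block0 = f.read(512)
    block1 = f.read(512)

if block0[:2] == b"ER" and block1[:2] == b"PM":
    print("Apple Partition Map still intact")
elif block0[510:512] == b"\x55\xaa":
    print("MBR found -- the APM driver descriptor has been overwritten")
else:
    print("Sector 0 not recognized")
```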

You could try this Linux Live Rescue CD. It states that it handles Mac partitions and data.

http://ubuntu-rescue-remix.org/

I found it here:

http://www.techrepublic.com/blog/10-things/10-linux-rescue-tools-for-recovering-linux-windows-or-mac-machines/

Best of Luck!

So, just an update for folks - I’m hoping to spare some other people pain like this in the future…

It looks like it won’t be a complete data loss for me; I’m in the process of restoring much of the data - 700+GB of digital video - though that process will likely take quite a few more days to complete. The drive itself is going bad - not just the interface card - so the frequent CRC errors are slowing down the process dramatically.

I had contacted DriveSavers…spent quite a bit of time chatting with them and finding out about their service… It seemed like what I needed, but the $700-$2,430 estimate - I’d have to send them the drive for a free evaluation - was just too expensive for me. At this point in life, I’m not sure that any of my digital data is worth even the low-end estimate to me, so I decided to hunt for other solutions.

I saw R-Studio mentioned in a few posts, so I picked up a copy for about $80 USD. I spent a few days - that’s right, days - letting it scan the entire 2TB disk volume, but the remaining-time estimate just kept growing and growing…to 30+ days to finish the scan. I started to realize that either the hardware read errors or simply the sheer size of the data was too much for R-Studio and was bogging the program down. After letting it run for a few days, I interrupted it and looked at the recognized files to see what it was finding. There was no question that it was finding data on the disk; the trouble was that it wasn’t always piecing the data back into anything meaningful. It was “recovering” file types that I’d never seen/worked with before…this didn’t seem like a viable approach…
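
In hindsight, that raw scan was doing something like signature-based carving: looking for known file-type magic bytes with no filesystem to consult, which is why names, types, and sizes come out wrong. Here’s a toy sketch of the idea (Python, hypothetical image path - not how R-Studio actually works internally):

```python
# Toy sketch of signature-based "raw" carving: scan sector boundaries for
# known magic bytes. Without a filesystem, names and exact sizes are lost,
# which matches the odd results described above. Path is hypothetical.
IMAGE = "drive.img"
SIGNATURES = {b"\xff\xd8\xff": "JPEG", b"RIFF": "RIFF (AVI/WAV) container"}

with open(IMAGE, "rb") as f:
    data = f.read(64 * 1024 * 1024)  # just the first 64 MB, for illustration

for offset in range(0, len(data), 512):
    head = data[offset:offset + 8]
    for magic, name in SIGNATURES.items():
        if head.startswith(magic):
            print("possible %s at offset 0x%X" % (name, offset))
```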

Reading through the R-Studio docs, I started to realize that if I hadn’t destroyed the partition in the first place, I would “easily” have been able to have it scan the filesystem and recover what it could. I knew that most of the data I wanted was digital video, and generally speaking, some unreadable “zero” sectors in a multi-gig video file aren’t really noticeable. So I changed my focus and fixated on restoring the partition table. Since I had three other identical Studio LX drives - all partitioned in exactly the same way - I used R-Studio to look at the bytes that make up the partition table on the bad drive and copy/paste the values from a good drive into it. Unfortunately, that wasn’t a one-shot deal; I had to write the changes, then shut down the XP system and power down the drive…then restart everything. I’m not sure all that was necessary, but it seemed like the changed bytes weren’t getting written out to the drive, so I did this to make sure they were. It took about three rounds of byte-by-byte copy/paste, but I got to the point where Windows and the SmartWare software recognized the NTFS partition (yea!).

…but neither Windows nor SmartWare could do anything with that partition, because the MFT (Master File Table) was corrupt/destroyed. So I had an NTFS partition but no NTFS filesystem…
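
For anyone attempting the same repair, the underlying idea of that byte-for-byte fix is just copying the partition-table sectors from an identical, healthy drive onto the damaged one. Here’s a rough sketch of the concept (Python; device names are hypothetical, and this is exactly the kind of write that destroys data if you mix up source and target - work from images if you can):

```python
# DANGEROUS sketch: copy the partition-table region from a known-good,
# identically partitioned drive onto the damaged one. Swapping source and
# target destroys the good drive. Device names below are hypothetical;
# doing this against saved raw images first is far safer.
GOOD = r"\\.\PhysicalDrive2"  # healthy Studio LX twin (hypothetical)
BAD  = r"\\.\PhysicalDrive1"  # damaged drive (hypothetical)
SECTORS = 34                  # enough sectors to cover the partition map

with open(GOOD, "rb") as src:
    table = src.read(512 * SECTORS)

with open(BAD, "r+b") as dst:
    dst.seek(0)
    dst.write(table)          # overwrite the damaged partition region
    dst.flush()
```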

Learning still more about R-Studio, I created a 100GB “region” - a logical section of the drive. Since working with the full 2TB seemed impractical, I decided to make the drive smaller to work with. Originally I was going to scan smaller regions one at a time and recover the data that way…but I would still have been left with files that didn’t have their original names or file types, or perhaps not even the correct reassembly of the bytes that made up the file. I knew that what I really needed was to get my MFT - the filesystem - back.

Reminder: This is where that one decision long ago to “initialize” the partition so Windows could see it caused so many problems! If I hadn’t done that, the filesystem would have been intact and R-Studio could easily have recovered as much data as possible - with the proper files, names, types, etc. Listen to the advice people give here: if you are starting to have drive issues, do not make any writes to that drive until you rescue your data!

At this point I noticed a “file scan” feature in R-Studio - a higher-level scan than what I had been doing before. When I ran that against the 100GB region I had created (using byte 0 as the start), R-Studio was able to find the MFT somewhere on the disk and, like magic, it showed me the same directory structure, filenames, etc. that were on the disk! At this point, I could select specific files to restore, and R-Studio would copy those files onto another disk.
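
Conceptually, a scan like that can find a displaced MFT because every MFT record begins with the ASCII magic “FILE”. A toy sketch of that search (Python, hypothetical image path - R-Studio’s real logic is obviously far more involved):

```python
# Toy sketch: hunt for NTFS MFT records, which start with the magic "FILE".
# A long run of such records at sector boundaries is a strong hint that
# you've found the MFT. Image path is hypothetical.
IMAGE = "region.img"  # raw image of the 100GB region
SECTOR = 512

hits = []
with open(IMAGE, "rb") as f:
    offset = 0
    while True:
        sector = f.read(SECTOR)
        if not sector:
            break
        if sector[:4] == b"FILE":  # candidate MFT record header
            hits.append(offset)
        offset += SECTOR

print("%d candidate MFT records; first at %s"
      % (len(hits), hex(hits[0]) if hits else "n/a"))
```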

And that’s where I am now. About a day and a half left for the most important video to get copied…then I’ll go back and get some of the less important video. I’m guessing that at this rate it will take another week before I’m done, but I’m making progress, and it looks like the data will be saved!

R-Studio turned out to be a savior. I like the fact that it tries to protect you with all sorts of default settings and warnings that keep you from writing to the damaged disk. There are probably other similar programs out there that people find just as capable or better…I felt desperate, so I went with the first thing mentioned.

…and yes, I have now created a monthly reminder asking myself, “Is all the data you care about backed up?” and I am buying 2TB USB external drives to start backing up all the data I have. It will take a few months; I’ve got about 10TB of total data that I want to protect, and I’ll want 2-3 copies…but over time, I will get everything backed up properly!

Good luck to you all!

Serious congratulations are in order! Great Job! It is great to see persistence pay off.

I have gotten far more diligent about my photography data since I went through a month or so of thinking I’d lost 500+ GB of hi-res film scans. In the end I got it all back, after I had totally accepted the loss. I’m glad I don’t have 10 TB to take care of in triplicate! It’s more like 1 TB in my case, but that still eats a lot of space.

Thanks :stuck_out_tongue:

I’d accepted the loss too…but honestly, now that I see the video files being restored…I feel like I dodged a bullet. Actually, this is my third bullet:

The first time I had a similar issue, it was with a network file storage device that you mounted remotely on the network. I never had it backed up…the device started to give errors, then failed. In that case, I popped out the drive and it turned out to simply be a Linux-formatted filesystem…I was able to mount it on my Linux system and presto - 30GB of once-in-a-lifetime photos, all saved!

The second time was with about 50GB of once-in-a-lifetime photos on a USB external drive…the drive started to whine and make noise (no errors though), and luckily I paid attention and copied everything over. When I went to (carefully) pick up the drive while it was running to move it, the heads crashed into the platters and the whole thing came to a dead stop. Bricked!

I’m an IT guy in my day job, and you’d figure that, plus the near-death experiences, would have fixed my lazy backup problem, but truthfully, it’s the sheer volume that makes it so hard to handle. No matter what year it is, the size of all the files I should be backing up becomes intimidating compared to the disk volumes I can afford! I’d kill to go back to “only” 30-50GB of data!

I’ve dodged a couple, and taken a couple when I didn’t know as much about recovery procedures, and permanently lost some stuff. You’d think the lesson would sink in at some point. Keeping large multiple backups organized is a challenge, though.