I know better than to contact WD support directly about this, as they will wash their hands of the problem the moment I say “Linux”.
Nonetheless, that’s what I’m using: Linux Mint 19.3 (Ubuntu-based) on a 64-bit AMD Phenom X2 6-core PC. The problem I’m seeing is with large recursive copies with update to my three USB external backup drives (two are 8TB WD Elements; the other is a smaller 2TB WD drive, I forget the exact type). The copies proceed until an apparently random point in the operation, then the target file system spontaneously becomes read-only, according to the message I get in the terminal window. After a few more copy operations fail because they’re allegedly writing to a read-only file system, the drive itself disappears, just as if I’d unmounted it. There are two ways to cure this: either actually unmount and remount the drive, which fixes it immediately, or wait 20-30 minutes, after which it will (equally spontaneously) reappear and be back in read/write mode.
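For anyone wondering how I spot it: the tell-tale is the mount options flipping from rw to ro. A minimal check along these lines (the mount point is a placeholder, not my actual path) will show it:

```shell
#!/bin/sh
# Returns 0 if a comma-separated mount-option string contains "ro",
# i.e. the file system has been remounted read-only.
opts_readonly() {
    case ",$1," in
        *,ro,*) return 0 ;;   # read-only
        *)      return 1 ;;   # still read-write
    esac
}

# Query the live options for a mount point (path is a placeholder).
# findmnt prints something like "rw,relatime" or "ro,relatime".
check_mount() {
    opts=$(findmnt -no OPTIONS "$1") || return 2   # not mounted at all
    opts_readonly "$opts"
}
```

So e.g. `check_mount /mnt/backup` tells me whether the drive has gone read-only without waiting for the next copy to fail.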
Does anybody have any clue how to fix this? It’s a real nuisance. I set my backups going while I’m asleep, only to wake up and find they’ve failed with this error. It’s taking me three or four attempted copies to complete a backup (and yes, I do the copies via a script and a sudo) - and of course, a subsequent copy attempt sails straight past the place where the previous one failed without a hint of trouble. It’s almost as if there’s a limit on the number of bytes that can be transferred, or on the amount of time the drive can be active, before the problem occurs.
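In case it matters, the script itself is nothing fancy - it boils down to a cp with recurse and update, roughly like this (paths here are placeholders, not my real layout):

```shell
#!/bin/sh
# Recursive copy with update: cp only transfers files that are newer
# on the source side, so a re-run picks up where the last one failed.
# The paths passed in are placeholders for illustration.
backup() {
    src=$1
    dest=$2
    # -r recurse, -u copy only when the source is newer, -v log each
    # file so the point of failure is visible the next morning
    cp -ruv "$src/." "$dest/" 2>&1 | tee -a /tmp/backup.log
}
```

Called as e.g. `backup /home/data /mnt/backup/data` from a sudo’d script overnight.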
Any assistance VERY gratefully received!
Don’t know if you still need help with this issue or not.
- the drives are still in their original NTFS format?
- how large are these large files?
- if they are just back-up copies, could you just update the difference between the last back-up and the current files/folders that need backing up?
Starting with Q2): I have only been able to copy about 30-35 GB successfully in one session without freezing or locking up on a Windows system. The last time it locked up on me under Windows was at around the 60 GB mark.
With regard to Q1): yes, Linux can read NTFS, but as I’ve seen posted elsewhere, you may need to reformat the drives to work better on other OSes (they suggest exFAT for Apple and Windows cross-compatibility).
Whilst it is not about backups, this thread of mine might be of some minor value to you; it’s my current experimentation with moving over to Linux.
Yes, the problem still exists.
Q1) No, I immediately reformat them to ext4 and then run an fsck with a bad-block check.
Q2) Some of them are ISO DVD images, so up to 8GB.
Q3) That’s effectively what I do - I use a recursive copy with update.
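(For the record, the reformat-and-check step is just the standard e2fsprogs commands - sketched here against a small image file rather than a real disk, since on the actual drive it would be the partition device, e.g. some /dev/sdX1:)

```shell
#!/bin/sh
# Demonstrate the reformat + check on a small image file. On the real
# drive the argument would be the USB partition device instead.
IMG=$(mktemp)
truncate -s 64M "$IMG"     # stand-in for the USB partition

mkfs.ext4 -q -F "$IMG"     # reformat to ext4 (-F: operate on a file)
fsck.ext4 -fn "$IMG"       # forced read-only check; adding -c would
                           # also run the (slow) bad-block scan that
                           # I use on the real drives
rm -f "$IMG"
```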
I’m only interested in accessing these drives from a Linux system, so the rest of your post isn’t really relevant to my particular issue. I think the most I’ve managed to write in one session was with my quick-and-dirty test program, which created an enormous fixed-length file that came within 50 GB of filling the disk on its own. I once managed to get that to go through to the second pass before the drive went read-only on me.
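(A scaled-down sketch of that sort of fill test, if anyone wants to reproduce it - the target path and size are placeholders, the real thing wrote a vastly bigger file to the USB drive:)

```shell
#!/bin/sh
# Fill test: write a large fixed-length file, then read it back.
# Any I/O error on either pass means the drive misbehaved mid-session.
fill_test() {
    target=$1      # file on the drive under test (placeholder here)
    size_mb=$2     # file size in MiB
    # Pass 1: write the file
    dd if=/dev/zero of="$target" bs=1M count="$size_mb" status=none || return 1
    # Pass 2: read it all back
    dd if="$target" of=/dev/null bs=1M status=none || return 1
    # Confirm the file is exactly the size requested
    test "$(stat -c %s "$target")" -eq $((size_mb * 1024 * 1024))
}
```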
I looked at your other thread, but I’m not a game player and have no interest in running a dual-boot system. I have one old Windows XP virtual machine because I can’t break my addiction to WordPerfect; other than that I only run Linux (Mint 20 now, BTW).
Does smartmontools show any problems in SMART? Any errors in the drive’s logs?
smartmontools can’t handle USB drives.
I couldn’t find anything that suggested a solution in the drive logs.
I can’t try anything at the moment because I reformatted one of the drives and it spectacularly failed an fsck.ext4 -cf - it produced more errors than fsck could handle! I’m in the process of RMAing the drive for a replacement, so I only have the one 8TB external at the moment.