WD10EARS - Inconsistent setting values from drive to drive

I bought one WD10EARS for testing in a desktop machine. It has mostly worked out OK, so I bought six more - a few to use and a couple for spares. I'm finding that the new drives are not working well at all; in fact, they are failing. I'm running them in a system with an Intel i7-920 and the DX58SO motherboard under Gentoo Linux, and they constantly fail to read or write at full speed. For instance, when I try to untar a large file, it starts running and then at some random point I get a kernel message saying a task has been blocked for more than 120 seconds.

I went back and looked at the first drive, and it's still working well, but hdparm reports a few values on the old drive that differ from what it reports on all of the new drives. All of the drives report the same firmware version.
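For reference, this is roughly how I captured the two reports (the device names are just examples; substitute whatever nodes your drives actually appear as):

# dump the full identify-device info for each drive to a text file
hdparm -I /dev/sdb > Drive_Fails.txt
hdparm -I /dev/sdc > Drive_Passes.txt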

Things that are different:

keeper ~ # diff Drive_Fails.txt Drive_Passes.txt
6c6
<       Serial Number:      WD-WCAV56065985
---
>       Serial Number:      WD-WCAV55464493
29,30c29,30
<       R/W multiple sector transfer: Max = 16  Current = 0
<       Recommended acoustic management value: 128, current value: 254
---
>       R/W multiple sector transfer: Max = 16  Current = 16
>       Recommended acoustic management value: 128, current value: 128
50c50
<               Automatic Acoustic Management feature set
---
>          *    Automatic Acoustic Management feature set
79c79
<       not     frozen
---
>               frozen
82,83c82,83
<       216min for SECURITY ERASE UNIT. 216min for ENHANCED SECURITY ERASE UNIT.
< Logical Unit WWN Device Identifier: 50014ee2ae83ca4d
---
>       200min for SECURITY ERASE UNIT. 200min for ENHANCED SECURITY ERASE UNIT.
> Logical Unit WWN Device Identifier: 50014ee2ae6b5ffe
86c86
<       Unique ID       : 2ae83ca4d
---
>       Unique ID       : 2ae6b5ffe
keeper ~ #

The drive that works has R/W multiple sector transfer enabled (Current = 16), whereas the failing drive has it at 0.

The drive that works has the Automatic Acoustic Management feature set enabled, whereas the failing drive does not.

The drive that works reports its security state as frozen, whereas the failing drive reports not frozen.

I have done nothing at the system config level to change the way the drives are used.

QUESTION: If all of these drives have the same firmware revision, then why do they report different values? And what can I do in Linux to change the failing drives' settings to match the one good drive, to see whether these values make any difference?
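From reading the hdparm man page, I'm guessing something along these lines would bring the two settable values on a failing drive (here assumed to be /dev/sdb) into line with the good one, but I'd appreciate confirmation before I try it:

# set the R/W multiple sector count to 16, matching the good drive
hdparm -m16 /dev/sdb
# set the Automatic Acoustic Management value to 128 (quiet), matching the good drive
hdparm -M 128 /dev/sdb

As far as I understand, the "frozen" security state isn't something you normally set by hand - the BIOS usually issues the freeze at boot - so I've left that one alone.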

Thanks,

Mark

Hi,

Are you aware of the 4KB sector size of these drives? If not, this may shed some light:

http://hothardware.com/Articles/WDs-1TB-Caviar-Green-w-Advanced-Format-Windows-XP-Users-Pay-Attention/

I'm not sure whether your version of Linux handles these 4KB-sector drives natively.
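If you want to check what the kernel sees, something like the following should work on a reasonably recent kernel (sdb is just a placeholder for one of your drives):

# logical/physical sector sizes as reported by the kernel
# (I believe some of these Advanced Format drives still report 512 here,
# so don't rely on this alone)
cat /sys/block/sdb/queue/logical_block_size
cat /sys/block/sdb/queue/physical_block_size

# list partition start sectors; for 4KB alignment each start
# should be a multiple of 8 (8 x 512-byte sectors = 4KB)
fdisk -lu /dev/sdb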

Regards,

David

I have a question: will these drives work properly in a RAID 0 config?