I’m comparing drives for a NAS and was confused by something.
On many WD spec sheets, the bit error rate is specified as being less than 1 in 10^14, 1 in 10^15, etc. On the SE spec sheet (linked here), it is listed as being less than 10 in 10^15.
Is this a typo, or actually the case? It seems weird to change notation for the drive, and weirder that a drive marketed as usable in datacenters would have consumer-grade BER.
<1 in 10^14 (like WD Red) is the same as <10 in 10^15 (WD Se). Dropping the zero from the "10" is the same as lowering the exponent by one, from ^15 to ^14: 10/10^15 = 1/10^14. The desktop and datacenter lines each use their own format for the error rate, but the numbers are essentially the same. It's not a typo, just a different way of writing the same figure.
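If you want to convince yourself, the equivalence is easy to check with exact rational arithmetic (drive names in the comments are just labels for the two spec-sheet figures):

```python
from fractions import Fraction

# "<1 error in 10^14 bits" -- the WD Red style of quoting BER
red_ber = Fraction(1, 10**14)

# "<10 errors in 10^15 bits" -- the WD Se style of quoting BER
se_ber = Fraction(10, 10**15)

# Both reduce to the same rational number, 1/10^14
print(red_ber == se_ber)  # True
```

Using `Fraction` avoids any floating-point rounding concerns, though with these particular values plain float division gives the same answer.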