It does seem very unlikely for both drives to fail at the same time like that, unless there was some external factor causing the failures. A few possibilities come to mind:
Environmental issue - Was the NAS kept in a very hot environment, or subjected to vibration or impacts during the drive swap? Environmental stressors like heat, vibration, or power fluctuations can contribute to multiple drives failing close together.
Faulty SATA connections - If the SATA cables or connections inside the NAS were faulty, it could cause issues when drives are swapped out. I’d inspect the SATA connections and try different cables if possible.
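When a cable or connector is marginal, the kernel usually logs SATA link resets and SError events before anything fails outright, so the kernel log is the first place to look. As a sketch, the snippet below runs a grep filter over a short hypothetical log excerpt (the `ata2` lines are made-up sample output, not from your NAS) so you can see what the symptoms look like:

```shell
# Scan for SATA link trouble (classic cable/connector symptoms).
# In real use you would pipe `dmesg` into the grep; here we filter a
# hypothetical sample excerpt so the pattern is easy to see.
sample_log='ata2.00: exception Emask 0x10 SAct 0x0 SErr 0x4050000 action 0x6 frozen
ata2: SError: { PHYRdyChg CommWake DevExch }
ata2: hard resetting link
usb 1-1: new high-speed USB device number 2'
count=$(printf '%s\n' "$sample_log" | grep -cE '^ata[0-9]+.*(SErr|resetting|exception)')
echo "$count suspicious ATA events"   # prints: 3 suspicious ATA events
```

On a live system, `dmesg | grep -iE 'ata[0-9]+'` and `smartctl -l error /dev/sdX` surface the same class of symptoms. Repeated link resets confined to one port point at the cable, backplane slot, or controller rather than the drive itself.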
Accidental damage during drive swap - Any impacts or static discharge to the drives during the hot swap could potentially damage them. Always handle drives very carefully.
Faulty NAS controller/backplane - In rare cases, a problem with the NAS itself could damage drives or corrupt data when doing a swap. If issues persist with new drives, this may be suspect.
Latent bad sectors surfacing during the rebuild - A rebuild reads every sector of the surviving drive, so unrecoverable read errors that went unnoticed during normal use (when those sectors were rarely touched) can all surface under that sustained load. This is actually a well-documented way to lose a second drive mid-rebuild, especially when both drives came from the same batch and have similar wear.
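That rebuild-stress scenario is easy to check for: smartmontools' `smartctl -A /dev/sdX` reports the attributes that track latent bad sectors. As a self-contained sketch, the snippet below filters a hypothetical slice of that output (the attribute values are made up for illustration) for the three attributes worth watching; any nonzero raw value on 5, 197, or 198 means the drive was already accumulating damage:

```shell
# Flag SMART attributes that indicate latent bad sectors. In practice you
# would feed this `smartctl -A /dev/sdX` output; the sample below is a
# hypothetical excerpt of that attribute table.
sample='  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  Always       -       12
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   Always       -       8
198 Offline_Uncorrectable   0x0010   100   100   000    Old_age   Offline      -       0'
flagged=$(printf '%s\n' "$sample" | awk '$2 ~ /Reallocated_Sector_Ct|Current_Pending_Sector|Offline_Uncorrectable/ && $10 > 0 {print $2, $10}')
printf '%s\n' "$flagged"
# prints:
# Reallocated_Sector_Ct 12
# Current_Pending_Sector 8
```

Running a long self-test (`smartctl -t long /dev/sdX`) on replacement drives before a rebuild, and on the survivors afterwards, catches most of these latent errors before they can sink an array.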
So in summary, I’d first check for environmental issues, then the connections, then the NAS hardware itself. If those all check out, the most likely explanation is latent drive issues that the stress of the swap and rebuild brought to the surface at the same time - genuinely simultaneous independent failures are rare without some shared external factor. Let me know if you have any other details!