A few months ago I noticed that a data folder (an almost 40 GB zip file) was not being automatically backed up on an Ubuntu NAS server that used an external 4 TB WD drive. Then the entire external WD drive suddenly became damaged, totally unusable on both Linux and Windows.
The external WD drive was replaced with an internal 3.5" WD hard drive in the same Ubuntu server, and a full backup was launched, using 7-Zip to create the compressed files. After this initial full backup finished, I copied the data to another location on the same internal WD drive. Every zip file was copied correctly except for one.
The system shows an I/O error when trying to read the very same 40 GB zip file that was causing issues back when the external WD drive was in use. To be clear: a new internal 3.5" WD drive in an Ubuntu server throws an I/O error while copying a 40 GB zip file, "foldername.zip", generated by 7-Zip on another server in the same LAN. This is the same file that also caused issues earlier, when the external WD drive was in use.
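To narrow things down, one way to find exactly where the read fails is to scan the file in chunks and record the offset of the first I/O error. This is a minimal Python sketch; the archive path shown is an assumption, substitute the real location:

```python
def scan_for_bad_offset(path, chunk_size=4 * 1024 * 1024):
    """Read `path` start to finish; return (bytes_read_ok, error).

    If the drive throws an I/O error mid-file, `error` is the OSError
    and `bytes_read_ok` is the offset up to which reading succeeded.
    A clean file returns (file_size, None).
    """
    offset = 0
    with open(path, "rb") as f:
        while True:
            try:
                chunk = f.read(chunk_size)
            except OSError as exc:
                # The kernel surfaced an I/O error at roughly this offset.
                return offset, exc
            if not chunk:
                # End of file reached without errors.
                return offset, None
            offset += len(chunk)

# Hypothetical archive location -- adjust to the real path:
# ok_bytes, err = scan_for_bad_offset("/srv/backup/foldername.zip")
# print(ok_bytes, err)
```

If the failing offset is reproducible, the kernel log (`dmesg`) around that time usually shows which sector and device reported the error.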
I have obviously spent many hours on the internet trying to understand what is happening, and I am asking here because I cannot figure it out. My questions are simple:
1. Is it possible for file-compression software to cause damage or failure on a storage device?
2. Why are the detected issues always related to the zip file created from the very same source data folder?
3. What alternatives would you suggest for backing up an Ubuntu server over the LAN from terminals running multiple operating systems?
Thank you very much in advance for your time and help.