I have something like 100 GB (about 30,000 files) that are very important to me; I can't afford to lose them.
Yes, I already keep two copies, plus M-Disc, which is currently among the most reliable archival media.
Having said that, the backup used as an archive…
I have seen files (especially images/zips) get corrupted silently. There is no alarm or anything to indicate the issue, so I can end up using the Master to overwrite a good backup with the corrupted copy. (It has happened to me.)
For a couple of files I run MD5 hashing and check whether the file has changed before I copy it (and again after it's copied). But I can't realistically do this by hand for 30,000 files, not to mention that I can't find a tool that can do the comparison and validation for me.
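For what it's worth, the per-file hashing workflow can be automated over a whole tree with a short script. This is just a sketch (the function names are mine, not an existing tool): it builds a path-to-hash manifest, and a later run can diff a new manifest against the saved one to flag silently changed files.

```python
import hashlib
import os

def hash_file(path, algo="md5", chunk_size=1 << 20):
    """Hash one file in chunks so large files don't exhaust memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root):
    """Map each file's path (relative to root) to its content hash."""
    manifest = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, root)
            manifest[rel] = hash_file(full)
    return manifest

def compare_manifests(old, new):
    """Report files changed, missing, or added since the old manifest."""
    changed = sorted(p for p in old if p in new and old[p] != new[p])
    missing = sorted(p for p in old if p not in new)
    added = sorted(p for p in new if p not in old)
    return changed, missing, added
```

The idea would be to save the manifest (e.g. as JSON) alongside the archive, rebuild it before each sync, and treat any file in `changed` as suspect instead of blindly copying the Master over the backup. MD5 is fine for detecting bit rot; SHA-256 can be swapped in via the `algo` parameter if preferred.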
I hope my explanation is clear (to those who have read this far).
Is there any solution to the scenario above?
Thanks in advance!