I’ve had this happen several times with my EX4. The NAS runs fine, then suddenly I spot a red light on drive #4. I shut down, pull #4, blow some air into the SATA connector, wipe the drive connector, reinsert, boot, and all is OK. It will run for months before another red light appears and I do the same thing. I keep a spare drive on hand in case one goes down for the 10 count. I shut down before pulling the drive so the EX4 doesn’t start rebuilding the RAID, which could take a day or two even though it’s still the same drive.
Which brings me to your upgrade. Option one: back up everything on the NAS, including the configuration (from the dashboard), to another location; then shut down, replace all the drives, boot up, configure the new array, and restore the data. This forces a reindex of all your media, and you may need to redo some settings (I’m not sure all of them are restored from the config file). Of course, this assumes you have another storage device big enough to hold everything on your PR4100. So you’d back up everything to USB; how long that takes depends on how much data is in each share. Then set up the PR4100 with the new drives and restore from USB. Again, this could take a long time.
The other option is to swap the drives one at a time. With the PR4100 running, simply pull the first drive (or the one showing a red light) and swap in the new, larger drive. This triggers a rebuild; let it run. It could take a day or two depending on how much data you have. Once it completes, swap in drive #2 and wait. Rinse and repeat two more times until everything completes, then go into the dashboard and expand the volume to use the new capacity.
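As an aside, if SSH is enabled you can watch a rebuild’s progress from the shell via /proc/mdstat. The sample below is illustrative text I’ve typed up (not captured from a real PR4100) just to show what mid-rebuild output roughly looks like and how to pull out the percentage:

```shell
# Fake mdstat snapshot for demonstration only (device names, sizes and
# speeds are made up); on the NAS itself you'd just run: cat /proc/mdstat
cat > /tmp/mdstat.sample <<'EOF'
md1 : active raid5 sda2[0] sdb2[1] sdc2[2] sdd2[4]
      11720982528 blocks level 5, 64k chunk, algorithm 2 [4/3] [UUU_]
      [=====>...............]  recovery = 27.4% (1072343040/3906994176) finish=412.3min speed=114583K/sec
EOF

# Pull just the recovery percentage out of the status text.
grep -o 'recovery = [0-9.]*%' /tmp/mdstat.sample
# prints: recovery = 27.4%
```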
So which one? I can’t really say for sure. The one-by-one swap is easier since there’s no external drive involved and no round of backups to run; all you do is swap drives. All your data stays available during the rebuild, just a bit slower to respond. You can also start as soon as your new drives arrive.
The backup-and-swap-all approach leaves you with a backup (which you should have anyway, refreshed on a schedule), but it requires an external USB drive of sufficient capacity, setting up and testing the backup jobs, then running them. You can’t swap the drives out until the backups are complete and verified. Then you swap the drives, initialize the RAID, restore the config, and verify the settings are correct before starting to restore the backups. So it’s a bit more effort on your part. It also creates some down time for file access, since the shares will be empty until the restore completes. On the other hand, you end up with scheduled backups and a backup location, adding another layer of protection.
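On the “verify the backups” step: a simple way to confirm a copy matches its source is a recursive diff, which reports any file that differs and stays silent when everything matches. A minimal sketch with stand-in directories (substitute your actual share and USB paths):

```shell
# Stand-ins for the original share and its backup copy.
mkdir -p /tmp/verify/src /tmp/verify/dst
echo "data" > /tmp/verify/src/file.txt
cp /tmp/verify/src/file.txt /tmp/verify/dst/

# Recursive compare; diff exits non-zero and lists files if anything differs.
diff -r /tmp/verify/src /tmp/verify/dst && echo "backup matches"
```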
FWIW, I have two NASes. My EX4 is my primary, and I have another, older unit from a different vendor. Everything goes to the EX4; the other unit simply backs it up. I also have a large USB drive connected to my router (so really another NAS), and the older NAS backs up to that USB drive.
Bear in mind I’m only providing an overview of the 2 processes, not detailed steps.
Also, make sure you get drives suitable for NAS use. These should be CMR drives, not SMR: think WD Red Plus/Pro or Seagate IronWolf/Pro. Plain WD Red drives and Seagate Barracudas are typically SMR. A web search for “CMR vs SMR” will explain the differences.