Important r/w attributes for video recording

I work for a company that builds video servers, or DVRs if you like. These servers can have anywhere from 1 to 32 hard drives in them. The drives are connected through a SAS controller on the motherboard. All video footage is received over Ethernet through a single NIC.

So let's say I'm looking at my network activity in the resource monitor and I can see all the cameras and their bitrates, along with the total bitrate of the network. At what point do I need to be concerned with the write speeds of my hard drives and their configuration (RAID 0, 1, etc.)?

I would imagine that when the combined bitrate of the cameras recording to a specific hard drive exceeds that drive's write speed, we'd be in trouble, but maybe it's not that simple.
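For what it's worth, here's the kind of back-of-the-envelope check I have in mind (a rough sketch only; the camera bitrates, drive speed, and RAID factor below are made-up assumptions, not our real numbers):

```python
# Rough check: does the aggregate camera bitrate fit within the sustained
# write throughput of the drive (or RAID set) it lands on?
# All figures below are placeholder assumptions, not measured values.

camera_bitrates_mbps = [8, 8, 4, 4, 6, 6, 2, 2]  # per-camera bitrate in megabits/s

# Assumed sustained sequential write speed of a single drive, in megabytes/s.
drive_write_mb_s = 150

# Rough multiplier for the array layout: RAID 0 across N drives scales writes
# up, RAID 1 mirrors (no gain), parity RAID adds write overhead, etc.
raid_write_factor = 1.0  # single drive / RAID 1

total_mbps = sum(camera_bitrates_mbps)
total_mb_s = total_mbps / 8  # megabits -> megabytes

effective_write_mb_s = drive_write_mb_s * raid_write_factor

print(f"Cameras write ~{total_mb_s:.1f} MB/s to this volume")
print(f"Volume can sustain ~{effective_write_mb_s:.1f} MB/s")

# In practice scattered writes and filesystem overhead eat into the sustained
# figure, so leaving generous headroom (say 2x) is safer than comparing the
# raw numbers alone.
if total_mb_s * 2 > effective_write_mb_s:
    print("Little headroom: drive write speed could become a bottleneck")
else:
    print("Plenty of headroom: write speed is unlikely to be the issue")
```

With those placeholder numbers the cameras only need a few MB/s, which is far below what even one spinning drive can sustain, so the question is really whether our real totals ever get anywhere near the drive's limit.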

These questions all lead up to a problem we see very consistently on our servers where cameras will ghost, or put differently: the video stream frames bleed through one another, showing past frames together with live footage, and it's recorded that way. Video Example Here, about a minute and 10 seconds in.

There are a lot of variables I think could cause this, and I'm trying to rule them out one by one.

The first thing that comes to mind is the encoding of the stream, or compression/decompression. As I understand it, compression uses frames before and after a given frame and records only the changes rather than the whole frame, to save space. If you watch the clip, the problem appears to happen only on pixels that change between frames.
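To make that concrete, here's a toy sketch of the delta-frame idea (purely illustrative; this is not how our actual encoder or recorder works internally):

```python
import numpy as np

# Toy illustration of inter-frame (delta) compression: only the pixels that
# change are stored, and the decoder applies that delta to a reference frame.

def encode_delta(reference, current):
    """Store only the positions and values of pixels that changed."""
    changed = reference != current
    return np.argwhere(changed), current[changed]

def decode_delta(reference, positions, values):
    """Rebuild the current frame by patching the reference frame."""
    frame = reference.copy()
    frame[tuple(positions.T)] = values
    return frame

rng = np.random.default_rng(0)
frame1 = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)   # "key" frame
frame2 = frame1.copy()
frame2[1, 2] = 99                                             # one pixel moves

positions, values = encode_delta(frame1, frame2)

# Decoding against the correct reference reproduces frame2 exactly.
assert np.array_equal(decode_delta(frame1, positions, values), frame2)

# Decoding against the WRONG (stale) reference: the moving pixels come from
# the new frame, but everything else is whatever the stale frame contained,
# so old footage shows through behind the motion -- much like the ghosting.
stale_reference = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
ghosted = decode_delta(stale_reference, positions, values)
```

If something like that is going on, it would explain why only the changing pixels look wrong: a delta is being applied against the wrong reference frame, so the static background comes from an older frame while the motion comes from the new one.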

I’d love to hear some thoughts from people much smarter than I.

AFAIK there should not be any correlation between the image quality and the RPM of the drive.

Let's see what Fzabkar (the HDD guru :smiley: ) has to say about this.

Have you tried lowering the video resolution? If the problem keeps happening, that would verify it's not a write-speed problem.