By now, this primer should have helped you understand what RAID is, theoretically. However, you’re probably not at the point where you can confidently choose the right RAID for your project.
Even though this guide doesn’t presume to go into detail on the intricacies of RAID, it wouldn’t be of much use without some practical advice to at least get you started.
Let’s get to it. It’s one thing to throw around theoretical numbers on paper, and another to achieve these numbers in the real world.
In the real world, working in RAID is like having an alligator for a pet. Your undivided attention, focus, patience and tact are mandatory.
If you take it for granted, you’re shooting yourself in the foot.
Sometimes you feel you might need RAID just because everyone else is shouting its name from their rooftops. Not so. You will be surprised to see how far simplicity goes.
The first step is to estimate how much footage you are going to work with. How much space will all your source footage take?
Take a look at the chart above to get an idea. For example, a full project with about 20 hours of ProRes HQ material will need only 2 TB. The data rate for one stream is 220 Mbps, which is 27.5 MB/s. A typical consumer-grade 7,200 rpm hard drive averages over 50 MB/s and tops out at about 100 MB/s.
So, if you have two 2 TB drives, you can use them in RAID 1 to get a theoretical maximum of 200 MB/s read speed – or 7 streams of ProRes HQ – and 100 MB/s write speed, or 3 streams. That’s if you need that kind of data transfer rate. Most people don’t.
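This back-of-the-envelope math can be sketched in a few lines of Python. The figures (ProRes HQ at 220 Mbps, a 100 MB/s drive) are the ones quoted above; plug in your own codec and drive numbers:

```python
# Rough storage and stream-count math, using the figures from the text.
MB_PER_S_PER_STREAM = 220 / 8      # 220 Mbps ProRes HQ = 27.5 MB/s
HOURS_OF_FOOTAGE = 20

# Total storage for 20 hours of single-stream footage.
total_mb = MB_PER_S_PER_STREAM * HOURS_OF_FOOTAGE * 3600
print(f"Storage needed: {total_mb / 1_000_000:.1f} TB")      # ≈ 2.0 TB

# RAID 1 with two drives: reads can be split across both mirrors,
# but every write must go to both, so writes run at single-drive speed.
SINGLE_DRIVE_MB_PER_S = 100        # optimistic 7,200 rpm figure from the text
read_speed = 2 * SINGLE_DRIVE_MB_PER_S    # 200 MB/s theoretical
write_speed = SINGLE_DRIVE_MB_PER_S       # 100 MB/s

print(int(read_speed // MB_PER_S_PER_STREAM), "read streams")    # 7
print(int(write_speed // MB_PER_S_PER_STREAM), "write streams")  # 3
```

Note that these are theoretical ceilings; real-world throughput will be lower once seek times and simultaneous access enter the picture.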
Most professionals shooting on sub-$20,000 cameras record in the internal codec (which corresponds to the broadcast codecs above) or to an external recorder in ProRes or DNxHD at 220 Mbps. This footage is edited natively and finished for television, Blu-ray, DVD or the Internet.
Simplicity of workflow is paramount. Budgets are tight and it is important to get to the end product as fast as possible.
What if this setup instead had two 2 TB drives, not in RAID – one with footage and the other as a backup? For renders or writes, another drive is used, maybe 1 TB or so. For cache/page/temp files, one uses a small 120 GB SSD, or a 128 GB CF card in a laptop slot! Finally, the OS and software go on another 120 GB SSD.
The benefit of dividing read and write drives this way is that you are not clogging any one transport pipe. Remember, a drive can only do one thing at a time – so why not let it do one thing well?
That’s five relatively cheap drives (or more if you need additional backups) on a simple SATA II interface that can be reused for every subsequent project. So, where’s the need for a RAID at all?
But – what if you were compelled to use RAID, simply because you have many streams of data to transfer? You might have a NAS set up in a small facility with a few editors, effects artists and graders, etc.
Calculate how many people (or streams) will be reading from the source footage at any given time. Multiply that by the data rate. Is this value higher than the average transfer rate of your hard drive? Then you might need RAID.
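That calculation is simple enough to write down. A minimal sketch, where the editor count and data rates are illustrative assumptions rather than recommendations:

```python
# Do N simultaneous readers exceed what a single drive can sustain?
# All figures here are illustrative assumptions.
editors = 4                # people reading source footage at once
stream_mb_per_s = 27.5     # ProRes HQ: 220 Mbps = 27.5 MB/s per stream
drive_avg_mb_per_s = 50    # conservative average for a consumer 7,200 rpm drive

demand = editors * stream_mb_per_s   # total simultaneous read demand
if demand > drive_avg_mb_per_s:
    print(f"{demand:.0f} MB/s needed vs. {drive_avg_mb_per_s} MB/s "
          "from one drive: consider RAID")
else:
    print("A single drive is enough")
```

With these numbers, four editors demand 110 MB/s against a 50 MB/s drive, so a striped array (or separate drives per editor) is worth considering.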
You see, even if tomorrow someone invents a super-fast hard disk for $10, one could still use RAID to further increase speed and provide redundancy. Enough is never enough. This technology isn’t going anywhere soon!
Okay, maybe you need speed and maybe you don’t. There is another more important reason why one might need RAID, and that is redundancy.
We have already seen that RAID is used so that one can continue working even if a drive fails. Now’s the time to calculate whether you can tolerate a drive failure and, if so, how fast you can get the data from your backup drive to continue working.
Google reports that 3% of drives fail each year in the first 3 years of life. Then this percentage increases each subsequent year. The older the drive the higher its chances of failure.
In most scenarios, it would be wise to peg this figure at 5%. Less is good luck. More is definitely unusual and cause to re-evaluate the drive, the setup, or both.
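A 5% annual failure rate per drive compounds quickly across an array. Assuming failures are independent (a simplification – drives from the same batch in the same enclosure often aren’t), the chance of at least one drive failing in a year is:

```python
# Probability of at least one drive failure in a year, assuming
# independent failures at a per-drive annual failure rate (AFR).
AFR = 0.05  # pegged at 5%, as suggested above

for drives in (2, 4, 8):
    p_any = 1 - (1 - AFR) ** drives
    print(f"{drives} drives: {p_any:.0%} chance of at least one failure per year")
```

With eight drives at a 5% AFR, you are looking at roughly a one-in-three chance of a failure each year – which is exactly why redundancy, and a rebuild plan, matter.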
Video producers don’t have to worry about long term drive failure. A feature film has a maximum data life span of 2 years, and most productions are under the 1 year mark. We are talking about building redundancy for work, not archival. For true archival, other solutions exist. If floppy disks are any indication, hard drives are unlikely to be one of them.
Today’s consumer hard drives are good enough for most video production work. When it isn’t, that’s a good time to use RAID – not more expensive enterprise drives.
When working in a multi-user environment, it is important to keep everyone busy. There’s no time to hunt for that backup drive and rebuild or relink your data – not when clients are waiting impatiently, hopefully with a check in their hands.
In this scenario a RAID makes perfect sense. There is one other point to consider – having footage on one system, such as a NAS, ensures that everyone is ‘on the same page’ and working with the same material. On the flip side, if you don’t manage the networking and permissions well, someone might accidentally overwrite your valuable data.
The biggest cause of data loss is human error. But everyone likes to blame hardware.
That’s what I meant when I said having a RAID is like having an alligator for a pet. It’s an additional burden, a new task to do everyday over and above everything else.
What you’re doing is estimating your ‘chances’. How much can you get away with? Sometimes the answer is very clear. Sometimes it isn’t. It’s balancing trade-offs between fault tolerance, cost and performance.
- Are you sure your bottlenecks are caused by your hard drive system and not any other component?
- What is the size and speed of your workflow?
- Can single drives performing one and only one function fulfill your speed requirements?
- Are you a one- or two-person facility that can quickly pick up work if a drive fails?
- Can you handle the additional burden of running and troubleshooting a RAID?
By now I hope I have given you enough ammunition to at least find out whether or not you need RAID. In the next chapter we’ll take a look at some practical hardware solutions.