Most people agree ‘the film look’ is the holy grail. In my comparison of cameras based on dynamic range (which many believe is the main reason video looks like video) we have seen how the Arri Alexa is the camera that comes closest to achieving this dream.
But what about all the cameras that are under $20,000? What about DSLRs, the BMCC, the Sonys and the Canons? You’ll find infinite plug-ins and tutorials on how to get the elusive ‘film look’ but do they deliver? Can they deliver?
Defining The Film Look
This is where most people get stuck. I’m no different. To make it easy I cop out and say: As long as a lay person cannot tell the difference, it’s fine. Most of the time though, after I’ve deluded myself sufficiently that my current project looks like film, I show it to a few people and ask: “Was this shot on film or video?”
It’s not a problem that digital video has, inherently. Magazine stills are shot on digital cameras and are printed at 300 dpi. Most of the world didn’t even notice when magazines made the transition from film to digital. A cheap Canon T4i is quite capable of shooting mind-blowing stills in 14-bit RAW – so there’s nothing wrong with the camera or the sensor.
Is the Canon T4i good enough for theatrical distribution?
When most filmmakers start out to buy their first video camera they are limited by budget. The question one is most likely to hear is “Is so-and-so camera good enough for theatrical distribution?”
I asked the same question when I bought my first video camera, and I did everything I could to convince myself it was. But it wasn’t, was it? That was in 2008. Have things changed today?
First, let’s look at the numbers:
| | DSLRs/Prosumer | DCI 2K Flat Distribution |
|---|---|---|
| Color Bit Depth | 8 | 12 |
| Color Space | Rec. 709 | CIE XYZ/DCI-P3 |
| Chroma Sub-sampling | 4:2:0 (4:2:2) | RGB (4:4:4) |
| Data Rate | 24 to 100 Mbps | 250 Mbps |
Resolution-wise, most video cameras are good enough for theatrical distribution. Even the DSLRs with line-skipping achieve about 700 vertical lines of resolution, which is roughly comparable to 16mm film. Since 16mm film can be blown up for theatrical distribution, this is a non-issue.
Codecs in both categories are delivery codecs, but what separates the two is the data rate. You will notice that DCI has a far higher data rate than broadcast television, but is it good enough? What’s the true story here? Let’s do a quick test:
From the article on the costs of working with uncompressed video, we can figure out that an 8-bit 1080p stream (per the first column) is being compressed at about 7:1 (for intraframe codecs) and about 20:1 (for interframe codecs). The DCI codec (intraframe) is compressed at roughly the same rate of 7:1.
DSLRs and prosumer cameras shooting in 8-bit have the same compression ratio as 12-bit DCI! To understand how interframe and intraframe codecs can be compared at all, read my article on Interframe vs Intraframe compression. Data rates don’t tell the whole story, do they?
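The back-of-the-envelope math behind those ratios can be sketched in a few lines. This is a minimal sketch, not a measurement: the frame sizes, 24 fps frame rate, and codec bit rates below are nominal assumptions for typical cameras, not figures from any specific model.

```python
# Rough compression-ratio estimate: uncompressed data rate / codec data rate.
# All codec bit rates here are nominal, assumed values for illustration.

def uncompressed_mbps(width, height, bit_depth, samples_per_pixel, fps=24):
    """Uncompressed data rate in megabits per second."""
    bits_per_frame = width * height * samples_per_pixel * bit_depth
    return bits_per_frame * fps / 1e6

# 8-bit 1080p with 4:2:0 chroma sub-sampling -> 1.5 samples per pixel
dslr_raw = uncompressed_mbps(1920, 1080, 8, 1.5)   # ~597 Mbps uncompressed
# 12-bit DCI 2K, full RGB -> 3 samples per pixel
dci_raw = uncompressed_mbps(2048, 1080, 12, 3)     # ~1911 Mbps uncompressed

print(f"DSLR intraframe (~90 Mbps):  {dslr_raw / 90:.1f}:1")
print(f"DSLR interframe (~28 Mbps):  {dslr_raw / 28:.1f}:1")
print(f"DCI 2K (250 Mbps):           {dci_raw / 250:.1f}:1")
```

Running this gives roughly 7:1 for intraframe DSLR recording, about 21:1 for interframe, and about 7.6:1 for DCI – which is why the 8-bit camera and the 12-bit distribution master end up with comparable compression ratios despite the huge gap in raw data rates.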
What about compression? Surely H.264/AVCHD is the culprit? I don’t have a clear answer, but ask yourself this: If a DSLR sensor can shoot great stills, and the data from this sensor is being converted to 8-bit H.264 in camera – how is this any different from film scans transcoded to 8-bit H.264? Surely, transcoding itself doesn’t introduce the ‘video look’, does it?
You see, there is nothing technically wrong with the specifications of these cameras, so you can rest assured your camera is capable of meeting the specifications of theatrical distribution. Broadcast is another matter entirely, as I’ve written here.
This is why I decided to limit my definition to just the ‘final result’: Can a lay person tell if your movie was shot on video? If yes, you have not achieved the film look.
In Part Two, let’s look at the cameras one by one and judge for ourselves.