Elvin Jasarevic says:
"The DDP booth at Broadcast India 2014 looked impressive, with cameras from Sony, Blackmagic, Panasonic, RED and Arri.
I remember the days of Apple's Blue and White G3, the first version of Final Cut Pro and Sony DSR decks. It must have been some time in 1999, when the DV revolution started. It was a time when some even believed that, with the help of Digieffects CineLook or the Magic Bullet Suite and its "film look", real film would become a thing of the past.
Now, with 4K cameras available widely, I think we are finally there, well... almost. Most promos and even a lot of the movies in cinemas right now are shot on these cameras.
With capture cards available from multiple vendors and support from almost all NLE applications, you could build your own fully functional 4K NLE for only $10,000. I say 'only' because in 1999 an Avid Symphony, based on Avid Meridian hardware, cost around $150,000 and could only edit in Standard Definition!
As with every tech revolution, there are also new challenges. At the recent Broadcast India expo in Mumbai I met many editors and engineers, and many of them are still working at SD resolution!
It is not only about buying a 4K camera or a 4K-capable editing system. There are many things to consider: workflow, the different 4K varieties, codecs, bit rates, frame rates and so on. Don't forget that the higher the resolution, frame rate and bit rate, the faster the storage needs to be.
As 4K offers roughly four times the pixels of 1080 HD, it can deliver incredibly detailed images. HD has approximately 2 million pixels; 4K increases this to around 8 million. Because the picture is so much bigger than an HD frame, we can easily zoom in to part of the image, and stabilising is easy too: when delivering to HD TV sets, a large portion of the 4K frame (around 65%, depending on the 4K variety used) actually goes unused, leaving plenty of room to reframe.
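The pixel arithmetic above is easy to verify. Here is a quick sketch, using consumer UHD (3840 x 2160) as the "4K" frame; DCI 4K at 4096 x 2160 is slightly wider:

```python
# Pixel counts for HD vs. 4K frames.
def pixels(width, height):
    return width * height

hd = pixels(1920, 1080)   # Full HD
uhd = pixels(3840, 2160)  # consumer "4K" (UHD)

print(f"HD: {hd:,} pixels (~2 million)")    # HD: 2,073,600 pixels (~2 million)
print(f"4K: {uhd:,} pixels (~8 million)")   # 4K: 8,294,400 pixels (~8 million)
print(f"4K has {uhd / hd:.0f}x the pixels of HD")  # 4x
```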
Changing the current widely adopted HD standard to 4K will be a huge technical challenge for TV companies and unfortunately I cannot realistically see 4K hitting our home TV screens any time soon.
There are many other advantages to using 4K. In some Sony theatres you can already see 4K movies, and companies like Amazon, Apple, Google, Hulu, Microsoft, Netflix and others can create 4K productions and bypass the theatres and traditional channels by delivering over the internet. This is a far superior route, and it lets companies able to utilise this technology connect with a much wider audience. I believe this will be the real medium for 4K, and if you can afford it, my advice would be to shoot in 4K.
Now, let's talk about video resolutions, codecs and storage requirements.
It has been over 15 years since I started to work professionally with video cameras and editing systems. I have been at almost every major exhibition and have been honoured to work as a demonstrator for companies like Adobe, Apple, Avid, Blackmagic Design and, for the last 8 years, for the award-winning DDP (Dynamic Drive Pool), the shared storage product by Ardis Technologies. During this time I have seen countless changes in post-production workflows, and there are some questions I am regularly asked at shows about shared storage:
I have 15 FCP editing systems and need 20TB of storage. What is the cost?
Or, can this storage be used for 4K capture or grading and so on...
My reply is typically a set of questions: what video resolution and codec are being used? How many editing systems will be connected? And finally, what is the total number of video streams required? Why do I ask these questions? Well, in order to specify a storage option that can deliver the required bandwidth and allow everyone to edit material in real time, those questions are a must.
With the right information, I can calculate the required bandwidth for the storage. The general rule is that more hard drives and more RAID controllers provide more bandwidth, but in order to actually get all this bandwidth out of the storage, you also need the correct cards for your throughput (I/O). Those are the 1GbE, 10GbE or even 40GbE Ethernet or Fibre Channel cards at the back of the storage.
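As a rough sketch, the sizing exercise boils down to multiplying streams by the per-stream rate and comparing the total against the link capacity. The DV rate below is the 3.7MB/s figure quoted later in this piece; the link capacities are theoretical maxima, and real-world throughput will be lower:

```python
# Rough storage-sizing sketch: total bandwidth = clients x streams x stream rate,
# then check it against the network link. Figures are illustrative assumptions.
LINK_MB_S = {"1GbE": 125, "10GbE": 1250, "40GbE": 5000}  # theoretical maxima

def required_bandwidth(clients, streams_per_client, stream_mb_s):
    return clients * streams_per_client * stream_mb_s

# e.g. 15 editing suites, each pulling 2 DV streams at ~3.7MB/s
need = required_bandwidth(clients=15, streams_per_client=2, stream_mb_s=3.7)
print(f"Total required: {need:.0f} MB/s")  # 111 MB/s

for link, cap in LINK_MB_S.items():
    print(f"{link}: {'sufficient' if cap >= need else 'too slow'}")
```

In practice you would also budget headroom for metadata traffic and peaks, so a link running near its theoretical maximum is already undersized.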
However, it is not quite this simple, as different resolutions and bit rates cause different seek-time behaviour. For example, if one RAID system with 16 drives can play 2 streams of 2K files at 350MB/s per stream, which is around 700MB/s in total, it does not mean it can play 190 DV streams (each DV stream is 3.7MB/s, so 700 / 3.7 ≈ 189).
This is because of seek time. A handy rule of thumb is that the smaller the file size and bit rate, the more seek overhead you incur. Also, with many drives in shared storage, a video file will only play once the heads of all 16 drives are in the right position. If you use SSDs, however, there is effectively no seek time, and the calculation above would actually hold.
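A toy model illustrates why small streams scale so badly on spinning disks: assume every read pays a fixed head-positioning cost before any data transfers. The 10ms seek and 700MB/s sequential figures below are illustrative assumptions, not measurements:

```python
# Toy model of why small, low-bit-rate streams don't scale linearly:
# every read pays a fixed head-positioning cost before data transfers.
SEEK_S = 0.010     # assumed average positioning time per read
XFER_MB_S = 700.0  # assumed sequential throughput of the 16-drive array

def effective_mb_s(chunk_mb):
    """Throughput when each chunk_mb read pays one full seek."""
    return chunk_mb / (SEEK_S + chunk_mb / XFER_MB_S)

for label, chunk in [("small DV-sized reads (0.5 MB)", 0.5),
                     ("large 2K-sized reads (16 MB)", 16.0)]:
    print(f"{label}: {effective_mb_s(chunk):.0f} MB/s effective")
```

The same array that streams hundreds of MB/s in large sequential reads delivers only a fraction of that when every read is small, which is exactly why 700MB/s of 2K throughput does not translate into 189 DV streams. With SSDs, SEEK_S drops to effectively zero and the linear arithmetic holds.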
Now let's compare some popular HD and Digital Cinema resolutions and codecs. HD can be 720p or 1080i/p, compressed or uncompressed, 8- or 10-bit, and can use different codecs like ProRes, DNxHD, AVC-Intra etc., with required bandwidth ranging from 4MB/s up to 180MB/s for full HD RGB 10-bit 4:4:4.
The video formats above are each referred to by their vertical (y - axis) resolutions, whereas the digital cinema formats below are referred to by their horizontal resolution.
2, 3, 4, 5 or even 6K can have resolutions from 2048 x 1152 up to 6144 x 3160 and can use codecs such as XAVC, R3D, ARRIRAW etc., with per-stream bandwidth ranging from 30MB/s for 4K Sony XAVC 422 10-bit, or 40MB/s for R3D, up to full 4K DPX uncompressed 10-bit at 24fps at an astonishing 1.27GB/s!
Put simply:
2K = 2048 pixels wide (2 x 1024, where 1024 is 1K), so 2048 x 1152
3K = 3072 wide (3 x 1024), so 3072 x 1728
4K = 4096 wide (4 x 1024), so 4096 x 2304, etc.
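The 1.27GB/s figure quoted above can be sanity-checked with the standard uncompressed-rate formula (width x height x bits per pixel x frame rate). It matches if we assume DCI 4K at 4096 x 2160 with 10-bit RGB channels stored in 16-bit containers (48 bits per pixel) at 24fps; that is one plausible reading of the spec, not the only one:

```python
# Sanity check of the uncompressed-DPX figure quoted above. Assumes
# DCI 4K (4096 x 2160), 10-bit RGB channels stored in 16-bit containers
# (48 bits per pixel), 24 fps -- illustrative assumptions.
def uncompressed_gb_s(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel / 8 * fps / 1e9  # bytes -> GB/s

rate = uncompressed_gb_s(4096, 2160, 48, 24)
print(f"4K DPX, 48 bpp, 24fps: {rate:.2f} GB/s")  # ~1.27 GB/s
```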
How about workflow?
With some cameras, such as the Sony F55, you can capture both HD and 4K images simultaneously, so you can do a rough edit using the HD proxies while the 4K RAW files are archived, colour-corrected etc. Working at a lower resolution makes editing much easier, and the NLE application responds faster. There are a few workflows available, but as far as editorial is concerned there are two approaches:
Native Editing and Transcoded Editing
With native editing, you edit the files just as you captured them. It is more demanding, and most editors try to avoid it, as in general it means you need the fastest PC and the fastest storage.
If a client needs lower-resolution files for dailies, they will usually create ProRes or DNxHD files for offline editing; once the edit is done, they can simply relink the final edit (via an AAF/XML file) to the original camera material for grading, the final step in the process. Blackmagic Design's Resolve would be a good and inexpensive starting point for grading.
Of course, in any modern workflow with thousands of files or projects, a MAM or PAM will be a great help, and Focal Point Servers deserve a look. As you can see, one of the most important components in any company's workflow is shared storage. It needs to be easy to use, come with all the right tools and, most importantly, be very fast.
My recommendation is, of course, DDP: an Ethernet-based SAN system and winner of a 2014 shared storage award from the IABM (International Association of Broadcasting Manufacturers).
Don't take just my word for it (or the word of 22 independent judges); check out DDP at your dealer and see the difference for yourself."