When does production become post?

The increasingly common combination of high shoot ratios and tight deadlines puts pressure on facilities to increase efficiencies across production and post. In an effort to reduce turnaround times and cut down the amount of time spent on non-billable activities, production and post teams are now working together from far earlier in the production schedule, with post houses sending staff to set to ensure footage is logged as soon after the shoot as possible, and DITs performing post-critical functions.

Manufacturers are keen to facilitate this collaboration – AJA seem to have kicked off the scramble to unite the two when they released the first Ki Pro back in 2009, and no one seems to have paused for breath since. But there are dozens of factors to consider when planning your workflow, from whether you’ll be handling HDR footage, to IP integration, to the impact of the incoming 5G connectivity standard. With that in mind, we’re taking a look at a few different points of contact, and how you can make your workflow at each one mutually beneficial for production and post.

Shooting and monitoring

Accurate metadata can speed up post-production immensely, by making it far easier for artists to match the original scene conditions when compositing, compensate for issues with specific cameras or lenses when correcting footage, and more.

Zeiss are currently setting the standard for incredibly detailed metadata with their new eXtended Data lenses, the CP.3 XD range. As well as giving your DoP the precision, quality and other benefits of working with Zeiss glass, XD lenses generate a huge amount of metadata about each shot, covering not just settings like focal length and exposure, but the characteristics of the lens itself. In post, this metadata gives artists a quicker, easier way to compensate for lens shading, or to correct for the different distortions of individual lenses used in production. When compositing, it drastically cuts down the amount of trial and error (and therefore time) needed to match on-set lighting conditions. This ultimately drives down the time and money needed for post, and could even help buy you more time on set.
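
To give a rough sense of how per-lens metadata can drive corrections in post, here’s a minimal Python sketch. The field names and correction models below are illustrative assumptions on our part, not Zeiss’s actual eXtended Data format – real XD records are per-frame and far richer than this.

```python
from dataclasses import dataclass

@dataclass
class LensMetadata:
    # Hypothetical fields standing in for real eXtended Data records
    focal_length_mm: float
    t_stop: float
    shading_falloff: float  # relative illumination at the image corner (0-1)
    distortion_k1: float    # first radial distortion coefficient

def shading_gain(meta: LensMetadata, radius: float) -> float:
    """Gain to apply at a normalised image radius (0 = centre, 1 = corner),
    assuming a simple quadratic falloff model."""
    # Illumination falls from 1.0 at the centre to shading_falloff at the corner
    illumination = 1.0 - (1.0 - meta.shading_falloff) * radius ** 2
    return 1.0 / illumination

def undistort_radius(meta: LensMetadata, r_distorted: float) -> float:
    """One-term radial distortion model: r_d = r_u * (1 + k1 * r_u^2).
    Inverted with a few fixed-point iterations."""
    r = r_distorted
    for _ in range(5):
        r = r_distorted / (1.0 + meta.distortion_k1 * r * r)
    return r

meta = LensMetadata(focal_length_mm=35.0, t_stop=2.1,
                    shading_falloff=0.7, distortion_k1=-0.05)
print(round(shading_gain(meta, 1.0), 4))  # corner gain -> 1.4286
```

With the metadata recorded per lens (and per shot), this kind of correction stops being a visual trial-and-error exercise and becomes a lookup.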

Monitor and recorder manufacturers Atomos have attempted to bring a similar spirit of cooperation to monitoring with their newly announced SUMO 19 HDR production/grading monitor, which can record dailies, proxies or 4Kp60 masters as needed. This means camera crews can see what they’ve captured in HDR, as it will appear to post teams, and be sure they’re happy with the shot as it appears, rather than having to guess based on a Rec.709 image. The recording feature also means that dailies (or low res proxies, if you have limited bandwidth or storage) can be sent to a post facility immediately, so assembly can begin far earlier than usual.

Solutions like this are making it easier for production and post crews to maintain a common vision of the project throughout, and reduce the time taken to create the final product without limiting either party’s options in the way that, say, Sony baking HLG into footage from some of its lower-end cameras does.

Logging and metadata

Loggers and ingest technicians are increasingly venturing out to log footage as close to set as possible. While data and asset management has been an intrinsic part of post for a long time, it’s now widely acknowledged that by focusing more on this on set, crews can increase the overall efficiency of the project, and drastically reduce the time needed to put everything together in post.

Asset management systems like axle Video are excellent – axle is particularly good if you’re new to this, as you can simply point it at your file system and it will index all your media files, then update its database automatically in real time as you add new footage. You can then share low res proxies through a web browser so that people can reject, trim and comment on clips; it’ll even integrate with NLEs so that editors can search new footage without leaving their editing application. It ships with a standard metadata schema, but you can customise this to the requirements of your shoot.
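
To make the “point it at your file system” idea concrete, here’s a toy Python sketch of a media indexer with a customisable metadata schema. This is an illustration of the concept only – it bears no relation to axle’s actual implementation, and the schema fields are our own invented examples.

```python
from pathlib import Path

# Extensions we treat as media; a real system recognises many more formats
MEDIA_EXTS = {".mov", ".mxf", ".mp4", ".wav", ".r3d"}

# Hypothetical default schema; systems like axle let you customise this per shoot
DEFAULT_SCHEMA = {"scene": "", "take": "", "camera": "", "approved": False}

def index_media(root: str) -> dict:
    """Walk a file tree and build an index of media files, each entry
    seeded with its size and an empty copy of the metadata schema."""
    index = {}
    for path in Path(root).rglob("*"):
        if path.suffix.lower() in MEDIA_EXTS:
            index[str(path)] = {
                "size_bytes": path.stat().st_size,
                "metadata": dict(DEFAULT_SCHEMA),  # fresh copy per clip
            }
    return index
```

A production tool would re-run something like this on a timer (or use file system notifications) to pick up new footage as it lands, which is what lets editors search clips moments after they’re offloaded.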

Avid’s MediaCentral | Asset Management option (formerly Avid Interplay MAM) performs a similar function, indexing media in a range of formats and allowing you to add custom metadata in order to make it easier to find. It even allows you to remotely access assets from multiple locations, so if crews at different locations both log footage, all of it will be available for review at the same time. Avid’s MediaCentral system also allows for a high degree of automation when it comes to things like ingest, logging, archiving and sharing footage, meaning you can achieve more in less time, and with a smaller team.

Cloud delivery

Once footage has been logged, it can be sent back to the post facility, or to a staging post if you’re in a remote location. As the available networks have become faster, cloud delivery has gained popularity, whether that’s ENG crews using in-camera FTP capabilities to send footage back to the newsroom, or crews on location leveraging file sharing services to deliver footage to post as quickly as possible. And with 5G set to make 100Mbps over-the-air file sharing a reality over the next few years, this option is only set to get more popular.

If you’re collecting or monitoring footage from drones, car-mounted cams and other inaccessible recorders, Soliton’s on-camera encoders and receivers are a great investment – they use a mixture of H.265 compression and proprietary RASCOW technology to ensure you see an HD live stream of your footage even in areas where 3G and 4G coverage is patchy, with delays as low as 240 ms.

For reliable file transfer, we’d recommend IBM’s Aspera service. While it’s pricier than WeTransfer, it uses end-to-end encryption to keep your footage secure and, unlike consumer services, doesn’t get slower the larger your files are. Another feature we’re particularly keen on is that it calculates the precise time a transfer will take on your current connection before it begins, so if it says a transfer will take seven hours, you can ring ahead and let your colleagues know when to expect the file with a fairly high degree of certainty.
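
The back-of-envelope arithmetic behind transfer estimates is worth being able to do yourself when planning deliveries. The sketch below is our own simple model, not how Aspera calculates its figures, and the efficiency factor is an assumption standing in for protocol overhead.

```python
def transfer_time_seconds(file_size_gb: float, link_mbps: float,
                          efficiency: float = 1.0) -> float:
    """Estimate transfer time for a file of `file_size_gb` (decimal GB)
    over a link of `link_mbps`. `efficiency` models protocol overhead:
    TCP-based transfers often achieve well under 100% of the raw link."""
    size_megabits = file_size_gb * 1000 * 8  # decimal GB -> megabits
    return size_megabits / (link_mbps * efficiency)

# A 63GB camera card over a 20Mbps uplink:
hours = transfer_time_seconds(63, 20) / 3600
print(round(hours, 1))  # -> 7.0
```

Running the same numbers at 50% efficiency doubles the estimate to 14 hours – which is exactly why a service that measures your real throughput before committing to a figure is so useful.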

How does this all fit together?

We can help you develop workflows to maximise efficiency in production and post, and advise on ways to prepare your existing infrastructure for the future, or fold new releases into your existing workflow. As well as providing consultancy, workflow design and specialist hardware, we can provide ongoing support and maintenance for your core kit. To find out more, get in touch with the team on the details below.

If you want to know more, give us a call on 03332 409 306 or email broadcast@Jigsaw24.com. For all the latest news, follow @WeAreJigsaw24 on Twitter, or ‘Like’ us on Facebook.

Soliton announces NewTek integration and improved live streaming ahead of IBC

Ahead of Europe’s leading media, entertainment and technology show, Soliton have announced exciting new updates to their leading H.265 bonded solutions, ZAO and Zao-S.

For those of you who are already fans of Soliton products, the updated features enhancing their Smart telecaster range are listed below. If you’re new to the range and would like to know more, scroll down for more information on the ZAO and Zao-S, and why we’re so excited about the new updates.

Updates to the Soliton Smart telecaster range for IBC 2017 include:

Integration and support for software-driven IP workflows using NewTek NDI.

Addition of the new ‘BOOST’ feature, which allows the system to maintain extra bandwidth where mobile phone signals are in high demand.

Further improvements to their H.265 codec, sharpening small details in the picture to provide even higher video quality.

Live streaming app ML-Cam for iPhone users and, coming soon, Android users.

You can catch Soliton at IBC on stand B.11 in hall 2, where they’ll be demoing all the new features live and answering any questions you might have about the updates.


The ZAO and Zao-S

The ZAO is a broadcast-grade live video encoder, and was the world’s first H.265 (HEVC) hardware encoder. It enables full HD mobile transmission of live events from a remote location to anywhere in the world, via a simple unit that can be attached to any broadcast camera. The unit maintains HD quality in low-bandwidth 3G and 4G environments, with minimal latency while covering live events – typically half a second, and as low as 240ms.

While the ZAO was the world’s first, the Zao-S is the world’s smallest, providing the same high reliability in a much smaller form factor – though not quite the same encoding bit rate. Its small size allows it to be easily carried on a body strap or mounted directly on the camera.


NewTek NDI Integration

Soliton have announced that they will integrate NDI into the ZAO and Zao-S, joining the growing number of companies enabling IP-based customer workflows – and Jigsaw24 is particularly interested to see how the ZAO range will work with NDI Cloud, which would be groundbreaking for live production. NewTek’s NDI technology is royalty free and allows video and audio sources to be shared bidirectionally across an IP network.

“Mobile cameras for outside broadcast are not typically network attached and still rely on traditional baseband technology such as HDMI or HD-SDI,” says Mark Andrews, Head of Broadcast for Soliton Systems Europe. “By adopting NDI from NewTek on our receiving platform, we are allowing outside broadcasters to take advantage of IP workflows without the need to change their mobile camera infrastructure.”

Michael Kornet, Executive Vice President of Business Development for NewTek, also comments: “Software-driven IP workflows are quickly becoming ubiquitous in video production. NDI-enabled devices like the Smart telecaster products for mobile live streaming exponentially increase the video sources available for live production, and thus create efficiencies and opportunities for customers that did not previously exist. NDI is the most widely adopted IP technology on the market, as shown by the millions of customers with access to it today.”


Enhanced live streaming functionality with BOOST and ML-Cam

The new ‘BOOST’ feature allows the system to maintain extra bandwidth where mobile phone signals are in high demand, maintaining connectivity and improving reliability when you can’t afford for your signal to let you down.

“There is no guarantee of quality of service when using 4G,” stated Shinya Hyakutake, Head of the Broadcast Division for Soliton in Japan. “With our RASCOW protocol for combining multiple 3G and 4G networks on one device, and now with our added BOOST functionality, we see the Zao being able to mitigate against the issues of signal degradation of cellular phone operators. It just makes the whole user experience so much more reliable for live streaming video for bonded systems.”
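
To make the idea of bonding concrete: at its simplest, a bonded encoder splits the outgoing stream across several cellular links in proportion to what each can currently carry. The sketch below is a toy model of that idea only – it is not the RASCOW protocol, which also handles retransmission, reordering and continuous link measurement.

```python
def split_packets(num_packets: int, link_rates_mbps: list[float]) -> list[int]:
    """Distribute packets across bonded links in proportion to each
    link's measured throughput (a toy model of cellular bonding)."""
    total = sum(link_rates_mbps)
    shares = [int(num_packets * rate / total) for rate in link_rates_mbps]
    # Hand any rounding remainder to the fastest link
    shares[link_rates_mbps.index(max(link_rates_mbps))] += num_packets - sum(shares)
    return shares

# Three SIMs measuring 6, 3 and 1 Mbps share a 100-packet burst:
print(split_packets(100, [6.0, 3.0, 1.0]))  # -> [60, 30, 10]
```

The point of re-measuring and re-splitting continuously is that when one carrier’s signal degrades, its share simply shrinks rather than the stream dropping – which is the reliability benefit Soliton are describing.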

Bandwidth that won’t cut out in the middle of a live stream not enough for you? Soliton have more. They’ve further improved their H.265 codec, one they already believed to be the most advanced on the mobile market. The new codec seeks to improve small details in the picture, especially moving ones, providing even higher video quality.


Over in Amsterdam at IBC, Soliton will be demonstrating the benefits of NDI with NewTek’s TriCaster, alongside their own Smart telecaster and cloud solutions. IBC attendees can check out Soliton’s products, including live demos of the updated features, at stand B.11 in hall 2.

Want to know more about Soliton’s H.265 mobile products, or want to have a chat with us about IBC? Get in touch by calling 03332 409 306 or emailing broadcast@Jigsaw24.com. For all the latest news and gossip, follow us on Twitter @WeAreJigsaw24 and ‘Like’ us on Facebook.