When does production become post?

The increasingly common combination of high shoot ratios and tight deadlines puts pressure on facilities to increase efficiencies across production and post. In an effort to reduce turnaround times and cut down the amount of time spent on non-billable activities, production and post teams are now working together from far earlier in the production schedule, with post houses sending staff to set to ensure footage is logged as soon after the shoot as possible, and DITs performing post-critical functions.

Manufacturers are keen to facilitate this collaboration – AJA seem to have kicked off the scramble to unite the two when they released the first Ki Pro back in 2009, and no one seems to have paused for breath since. But there are dozens of factors to consider when planning your workflow, from whether you’ll be handling HDR footage, to IP integration, to the impact of the incoming 5G connectivity standard. With that in mind, we’re taking a look at a few different points of contact, and how you can make sure your workflow at each is mutually beneficial for production and post.

Shooting and monitoring

Accurate metadata can speed up post-production immensely, by making it far easier for artists to match the original scene conditions when compositing, compensate for issues with specific cameras or lenses when correcting footage, and more.

Zeiss are currently setting the standard for incredibly detailed metadata with the new eXtended Data lens, the CP.3 XD. As well as giving your DoP precision, quality and all the other benefits of working with Zeiss glass, XD lenses create a huge amount of metadata about each shot, covering not just settings like focal length and exposure, but the characteristics of the lens itself. In post, tweaking this metadata becomes a quicker, easier way to compensate for lens shading, or to correct for the different distortions of individual lenses used in production. When compositing, the metadata drastically cuts down the amount of trial and error (and therefore time) needed for artists to match on-set lighting conditions. This ultimately drives down the time and money needed for post, and so could even help buy you more time on set.
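To make the idea concrete, here’s a minimal sketch of how per-lens distortion metadata can be put to work in post. The metadata field names and the one-parameter radial distortion model are our own simplifications for illustration – real eXtended Data records are far richer than this.

```python
# Sketch: using per-frame lens metadata to undo radial distortion.
# Assumes a simple one-parameter model r_d = r_u * (1 + k1 * r_u**2);
# the metadata keys below are hypothetical, not the XD schema.

def undistort_point(x, y, k1, iterations=5):
    """Approximate inverse of the distortion model by fixed-point iteration."""
    xu, yu = x, y
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1 + k1 * r2
        xu, yu = x / factor, y / factor
    return xu, yu

# Hypothetical per-frame metadata, as a compositing tool might receive it.
frame_meta = {"focal_length_mm": 35.0, "distortion_k1": -0.05}

x_corr, y_corr = undistort_point(0.8, 0.6, frame_meta["distortion_k1"])
```

The point is that the correction becomes a pure function of recorded metadata, rather than something an artist has to eyeball per shot.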

Monitor and recorder manufacturers Atomos have attempted to bring a similar spirit of cooperation to monitoring with their newly announced SUMO 19 HDR production/grading monitor, which can record dailies, proxies or 4Kp60 masters as needed. This means camera crews can see what they’ve captured in HDR, as it will appear to post teams, and be sure they’re happy with the shot as it appears, rather than having to guess based on a Rec.709 image. The recording feature also means that dailies (or a low res proxy, if you have limited bandwidth/storage) can be sent to a post facility immediately, and assembly can begin far earlier than usual.

Solutions like this are making it easier for production and post crews to maintain a common vision of the project throughout, and reduce the time taken to create the final product without limiting either party’s options in the way that, say, Sony baking HLG into footage from some of its lower-end cameras does.

Logging and metadata

Loggers and ingest technicians are increasingly venturing out to log footage as close to set as possible. While data and asset management has been an intrinsic part of post for a long time, it’s now widely acknowledged that by focusing more on this on set, crews can increase the overall efficiency of the project, and drastically reduce the time needed to put everything together in post.

Asset management systems like axle Video are excellent – axle is particularly good if you’re new to this, as you can just point it at your file system and it will automatically index all media files, then update its database in real time as you add new footage. You can then share low res proxies through a web browser so that people can reject, trim and comment on clips; it’ll even integrate with NLEs so that editors can search new footage without leaving their editing application. It ships with a standard metadata schema, but you can customise this to the requirements of your shoot.
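The core of that watch-folder pattern is simple enough to sketch. This isn’t axle’s actual implementation or API – just a minimal illustration of scanning a media tree, indexing what’s there, and picking up new arrivals on the next pass; the extension list and schema fields are assumptions.

```python
# Minimal watch-folder indexing sketch: walk a media directory, record
# every media file, and add anything new on subsequent scans.

import os
import time

MEDIA_EXTS = {".mov", ".mxf", ".mp4", ".wav", ".r3d"}  # illustrative list

def scan_media(root, index):
    """Add any media files under root that aren't already in the index."""
    added = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if os.path.splitext(name)[1].lower() not in MEDIA_EXTS:
                continue
            path = os.path.join(dirpath, name)
            if path not in index:
                stat = os.stat(path)
                index[path] = {
                    "size_bytes": stat.st_size,
                    "modified": time.ctime(stat.st_mtime),
                    "tags": [],  # custom schema fields would hang off here
                }
                added.append(path)
    return added
```

Run on a schedule (or hooked to filesystem events), a loop like this is what keeps the database current as cards are offloaded during the shoot.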

Avid’s MediaCentral | Asset Management option (formerly Avid Interplay MAM) performs a similar function, indexing media in a range of formats and allowing you to add custom metadata in order to make it easier to find. It even allows you to remotely access assets from multiple locations, so if crews at different locations both log footage, all of it will be available for review at the same time. Avid’s MediaCentral system also allows for a high degree of automation when it comes to things like ingest, logging, archiving and sharing footage, meaning you can achieve more in less time, and with a smaller team.

Cloud delivery

Once footage has been logged, it can be sent back to the post facility, or to a staging post if you’re in a remote location. As the available networks have become faster, cloud delivery has gained popularity, whether that’s ENG crews using in-camera FTP capabilities to send footage back to the newsroom, or crews on location leveraging file sharing services to deliver footage to post as quickly as possible. And with 5G set to make 100Mbps over-the-air file sharing a reality over the next few years, this option is only set to get more popular.
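It’s worth doing the arithmetic on those headline speeds before committing to over-the-air delivery. A quick back-of-the-envelope estimate, with the 80% efficiency factor purely our assumption for protocol overhead and contention:

```python
def transfer_time_seconds(size_gb, link_mbps, efficiency=0.8):
    """Rough wall-clock estimate for a sustained file transfer.
    efficiency is an assumed factor for overhead and contention."""
    size_megabits = size_gb * 8 * 1000  # decimal GB -> megabits
    return size_megabits / (link_mbps * efficiency)

# A 50 GB card over a 100 Mbps 5G link at 80% efficiency:
hours = transfer_time_seconds(50, 100) / 3600  # roughly 1.4 hours
```

Workable for dailies and proxies, in other words, but a full card of camera originals is still often faster on a drive in a courier’s bag.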

If you’re collecting or monitoring footage from drones, car-mounted cams and other inaccessible recorders, Soliton’s on-camera encoders and receivers are a great investment – they use a mixture of H.265 compression and proprietary RASCOW technology to ensure you see an HD live stream of your footage even in areas where 3G and 4G coverage is patchy, with delays as low as 240 ms.

For reliable file transfer, we’d recommend IBM’s Aspera service. While it’s pricier than WeTransfer, it uses end-to-end encryption to keep your footage secure and, unlike consumer services, doesn’t get slower the larger your files are. Another feature we’re particularly keen on is that it calculates the precise time a transfer will take on your current connection before it begins, so if it says a transfer will take seven hours, you can ring ahead and let your colleagues know when to expect the file with a fairly high degree of certainty.
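Whichever transfer service you use, verifying the delivered file against a checksum is cheap insurance that nothing was corrupted in transit. A simple sketch, hashing in chunks so footage-sized files never need to fit in memory:

```python
# Compute a SHA-256 digest of a (potentially very large) media file,
# reading 8 MB at a time. Compare sender's and receiver's digests to
# confirm the file arrived intact.

import hashlib

def file_sha256(path, chunk_size=8 * 1024 * 1024):
    """Return the SHA-256 hex digest of the file at path."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Sender and receiver each run this and compare the two hex strings; a mismatch means the transfer needs repeating.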

How does this all fit together?

We can help you develop workflows to maximise efficiency in production and post, and advise on ways to prepare your existing infrastructure for the future, or fold new releases into your existing workflow. As well as providing consultancy, workflow design and specialist hardware, we can provide ongoing support and maintenance for your core kit. To find out more, get in touch with the team on the details below.

If you want to know more, give us a call on 03332 409 306 or email broadcast@Jigsaw24.com. For all the latest news, follow @WeAreJigsaw24 on Twitter, or ‘Like’ us on Facebook.

IBC 2016: Avid demonstrates leadership in emerging IP and UHD workflows

At IBC 2016, Avid showcases IP and UHD workflow innovation on the open, interoperable Avid MediaCentral Platform.

Avid today previewed solutions for several converging technologies that are driving significant change for the media industry. By supporting real-time IP signals natively in key components of the Avid MediaCentral Platform, Avid is accelerating the industry’s transition to IP and delivering a unified environment for file-based and live signal-based media workflows that will ease the migration to emerging image formats, including UHD.

“To address the intensifying changes that our industry is facing, we are continuing to invest heavily in key technologies and innovations that are important for customer success,” said Dana Ruzicka, vice president and chief product officer at Avid. “At IBC 2016, we are demonstrating how the open, integrated Avid MediaCentral Platform will accelerate the transition to IP stream-based workflows, and make it possible for customers to create, manage, and deliver UHD content powerfully and efficiently.”

Historically, media companies have relied on specialized technologies for transporting video and audio signals throughout facilities and across geographies. Legacy technologies like coaxial cabling and baseband SDI signals were necessary because IP data networks lacked adequate bandwidth. But rapid technological advancement has made it feasible to pass professional audio and video signals over standard IP networks. Unlike traditional baseband infrastructure, IP networks are intrinsically format agnostic, paving the way for adoption of new formats like UHD. Converging on IP networks for file-based and signal-based traffic will provide media companies with increased flexibility, agility, and lower costs.

New Avid video IP integrations at IBC

Avid will demonstrate support for a variety of emerging IP standards, including SMPTE 2022-6 and VSF TR-03, illustrating how media companies can easily manage the transition to converged IP infrastructure over time. Technology presentations will showcase IP ingest, editing, playout, graphics insertion, and monitoring workflows spanning several Avid products, including Media Composer, Maestro, 3DPlay, and Playmaker.

“The innovative architecture of Avid’s MediaCentral Platform treats all content equally, regardless of its format or source,” said Alan Hoff, Vice President, Market Solutions at Avid. “We are delivering upon our vision to provide a unified, open platform for converged file and signal-based workflows by expanding support for emerging standards like IP and UHD.”

New Avid UHD integrations at IBC

Avid will showcase innovative UHD broadcast solutions that integrate seamlessly with both standard SDI production infrastructure and IP production workflows. The Avid UHD workflow enables broadcasters to deliver richer, sharper content without over-investing in new solutions, and is centered on Media Composer | Software, Interplay | Production, Media | Director, Pro Tools, Avid DNxHR, and Avid NEXIS, along with graphics and replay servers.

Avid is also participating fully in the AIMS alliance and showcasing its interoperability at IBC 2016 with products from other vendors at the IP Interoperability Zone in Hall 8.

For more on the latest IBC releases, take a look at our roundup post, give us a call on 03332 409 306, email broadcast@Jigsaw24.com or pop your details in the form below. For all the latest news, follow @WeAreJigsaw24 on Twitter or ‘Like’ us on Facebook.

Media Asset Management with axle and Avid

Let’s say you have this storage thing sorted. You’ve got an amazing SAN, you’ve got nearline drives rumbling contentedly close by and the robot in your tape library couldn’t be happier. How are you actually going to keep track of all this stuff? Making sure you can find, manage and monetise assets wherever they are in your storage hierarchy is the job of your Media Asset Management system – and a good one will also help you get through ongoing jobs more efficiently. All the systems we can provide will help your creative and technical teams carry out day to day work more efficiently, so you can save money by automating workflows and ensure that you always deliver jobs on time.

axle

Released in 2012, axle allows you to take one of your facility’s Macs and turn it into a ‘media management and collaboration server’. What this means in practical terms is that it looks through any drives connected to the server, whether that’s the hard drive of each Mac or the contents of your server room, and indexes all the files on them. It then creates an online portal where you can browse every file, regardless of where it’s stored, and your users can preview low-res versions of documents, play back proxy videos, edit metadata and more. Even better, you can create different views of this portal, so people only see the files for work that’s relevant to them, and any clients you give a login to can only see files from their project. You can also save searches for quicker access to common groups of files (say, all the images tagged to a specific location, or everything shot with a certain type of lens).

While this may not sound like it’s that far above and beyond anything else out there, the great thing about axle is that it’s accessible from any device, from your iMac to your iPad to a client’s PC, so you can always access resources. And because everything is web-based, your users don’t have to spend time installing apps on their computers. You can export collections of clips straight to your editor, then view H.264 proxies of the result and flag anything you want to alter in those proxies, without having to have access to the editor yourself (this could work wonders on complex approval processes where you have a lot of non-creative parties to get sign off from).

Once you’re finished with a project, you can even arrange for axle to automatically move it to centralised archive storage – or the cloud, if you prefer to store things online – so everyone can access it if you ever need to reuse it, but it’s not taking up valuable space on your SAN. The project stays in the central index of files, so if you’re ever working on something similar the team will be able to see that some assets already exist, and hopefully save themselves some time and money by repurposing that work, rather than duplicating it.
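The archive step described above boils down to a simple policy: move the media, keep the index entry, update the path. Here’s a minimal sketch of that logic – the index layout, tier names and paths are illustrative, not axle’s internals.

```python
# Sketch: move a finished project's media to archive storage while
# keeping its index entry, so the assets stay searchable after the
# files leave the SAN.

import os
import shutil

def archive_project(index, project, archive_root):
    """Move every indexed file for `project` into archive_root,
    updating the index in place."""
    for asset in index[project]:
        dest = os.path.join(archive_root, project,
                            os.path.basename(asset["path"]))
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.move(asset["path"], dest)
        asset["path"] = dest       # index keeps tracking the asset
        asset["tier"] = "archive"  # flag so users know it's off the SAN
```

Because the index entry survives the move, a later search still surfaces the asset – which is exactly what makes repurposing old work practical.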

Avid Interplay 

Interplay has been at the heart of large broadcast Avid environments for some time now, and has a reassuringly strong heritage. An enterprise-scale system, it comprises all the components you need for news ingest and delivery straight to air, integrated archive and proxy management, and remote logging and editing. It may seem a daunting amount to take on at first, but you can actually start to harness the Interplay environment on a single server, and only scale up as and when your team are ready.

As Interplay has been designed to work seamlessly with your Media Composer workflow, it’s the perfect tool for managing your assets, projects, users and multi-platform delivery options from a single, central point. You can give your editors more time to cut (and spend less time ingesting and creating deliverables) by introducing lightweight Interplay Central clients to your facility – these act as browser-based rough cut and project creation assist tools. You can then use the same interface to enable remote viewing and approvals by your clients, as well as allowing everyone to see existing assets while on set to ensure they have all the shots they need before wrapping for the day.

The remote capabilities of Interplay have been moved to the fore recently with the addition of Sphere. Using Interplay Sphere, a second server will dish up media to your remote Media Composer editors anywhere in the world as if they were at your main office, so you can provide a truly global, collaborative service that’ll cover long shoots abroad, journalists in the field or anyone who’s catching up on an edit while delayed on the train – all the while safe in the knowledge that your assets are being managed on your secure ISIS back at the facility.

If you want to see the rest of the Interplay iceberg, you can always call us on 03332 409 306, email broadcast@Jigsaw24.com or drop by stand F33 during BVE.

 

Want to know more? Give us a call on 03332 409 306, email broadcast@Jigsaw24.com or visit us at stand F33 at BVE. To keep up with all the latest news, follow @Jigsaw24Video on Twitter or ‘Like’ us on Facebook.