POST PRODUCTION

www.content-technology.com/postproduction

Cinema 4D and Redshift Usage-Based Licensing, Cloud Rendering with Conductor Technologies

MAXON, THE DEVELOPER OF professional 3D modelling, animation and rendering solutions, has announced the immediate availability of usage-based rendering for Cinema 4D and Redshift with leading cloud rendering company Conductor Technologies. Having recently expanded its platform to include GPU support, Conductor provides on-demand, metered access to an unlimited number of Cinema 4D and Redshift rendering nodes, with per-minute usage and licensing options and the ability to scale up and down based on project needs.

Conductor is a secure, dynamically scalable, cloud-based platform that enables VFX, VR/AR and animation studios to seamlessly offload rendering and simulation workloads to the public cloud. It integrates into existing workflows, features an open architecture for customisation, provides data insights and can implement controls over usage to ensure budgets and timelines stay on track.

Rendering in the cloud allows artists and studios to expand their rendering capabilities to preview scenes faster, meet deadlines, free up local hardware for other tasks (design, video encoding or post-production, for example) and explore more complex and challenging projects. With Conductor’s multi-cloud platform leveraging both AWS and GCP, studios can scale their compute power cost-effectively with production-proven tools and expertise.

“The ability for artists to improve design visualization and generate content in shorter timeframes is critical in today’s content production environment,” said David McGavran, CEO, Maxon. “The easy integration Conductor offers into existing workflows will allow our Cinema 4D and Redshift customers the flexibility to take full advantage of cloud computing power to accelerate turnaround times and deliver greater photorealistic and high-resolution imagery.”

Conductor CEO Mac Moore added, “We are thrilled to finally bring the Cinema 4D and Redshift offerings to Conductor. Maxon’s combined focus on ease-of-use and high performance perfectly aligns with what we strive to enable with our cloud platform: near-infinite computing power at the push of a button.”

Visit https://beta.conductortech.com/maxon and https://www.maxon.net

Onboarding Package for AWS Thinkbox Cloud Rendering

DIGISTOR HAS RECENTLY DEVELOPED an ‘onboarding package’ enabling existing on-prem Deadline users to scale onto the cloud using AWS Portal. For customers without an existing on-prem environment, a hybrid setup can be configured for easy access. This is a solution for those operating in visual effects, animation and editing who would benefit from spending less time waiting for renders to finish, and who would like to draw on Digistor’s experience in deploying and supporting cloud-based workflows.

AWS Portal simplifies the process of launching infrastructure and rendering in the cloud by bridging an on-prem Deadline environment into the cloud using AWS EC2 spare capacity (EC2 Spot). It also facilitates secure communication between the on-prem and AWS cloud environments, handling asset transfers and software licensing. Digistor’s ‘onboarding package’ enables customers to take advantage of cloud rendering via AWS Portal, taking them directly to the cloud by setting up and configuring the AWS cloud environment and its communication with the existing on-prem Deadline environment. Once the cloud environment is configured, customers can either maintain it themselves or engage Digistor to provide ongoing support via a subscription-based support contract.

Pre-requisites:
• A suitable Portal Server (Digistor can quote and supply if needed) – a low-spec on-prem Windows or Linux server, physical or virtual; this could be the existing Deadline Server.
• An existing AWS account. Alternatively, Digistor can host the cloud environment if required and/or configure a new customer account if necessary.
• Reliable internet connectivity with a minimum 15Mb upload speed. Higher upload speeds will shorten asset upload times and enable render jobs to start sooner in the cloud (see the rough estimate below).
• All required licenses for the host application, renderer and all plugins. Digistor will help determine license requirements as part of the initial consultation process.

The onboarding package includes everything required to get customers started with bursting into the cloud, including:
• An initial consultation meeting to discuss requirements
• Preparation of the AWS Portal Server
• Configuration of cloud infrastructure
• Building a custom image if required
• Establishing communication with the on-prem environment, including firewall considerations
• Basic training and handover documentation

Visit www.digistor.com.au
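As a rough, back-of-the-envelope illustration of why the upload-speed pre-requisite matters, the sketch below estimates transfer time against bandwidth. The 250GB asset size and 80 per cent link-efficiency factor are assumptions for illustration only, not Digistor figures.

def upload_time_hours(asset_size_gb: float, upload_mbps: float,
                      efficiency: float = 0.8) -> float:
    """Approximate hours needed to push assets to the cloud before rendering can start."""
    size_megabits = asset_size_gb * 8 * 1000       # decimal GB -> megabits
    effective_rate = upload_mbps * efficiency      # allow for protocol overhead
    return size_megabits / effective_rate / 3600   # seconds -> hours

if __name__ == "__main__":
    for mbps in (15, 50, 100):
        print(f"{mbps:>3} Mb/s upload: ~{upload_time_hours(250, mbps):.1f} h for 250 GB of plates")

At the minimum 15Mb/s this works out to roughly two days for 250GB of plates, which is why faster links get render jobs started sooner.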

New Quixel Mixer Smart Materials for Unreal Engine

FOR UNREAL ENGINE USERS, the newest Smart Materials pack has been added to Quixel Mixer. These include metals such as aluminum, oxidized iron and gold, fabrics such as denim, linen and camo gear, and a plethora of plastics, rocks, woods, and leathers. Every Smart Material is ready to go right out of the box and completely free to use. Quixel Mixer is a free tool that seamlessly fuses scan data, physically-based rendering (PBR) painting, and procedural authoring. With a vastly streamlined texturing experience, you can design Smart Materials, 3D paint meshes, and create tileable surfaces, all within a single application. With the release of Quixel Mixer 2020, the development team has focused on ways to let users make the Quixel Megascans library their own by enhancing its 3D and surface texturing capabilities. The goal is for artists to be able to make any asset and any surface bespoke to their art style. To that end, the latest update comes packed with 50 powerful new Smart Materials, bringing the total to nearly 150, along with improvements to Mixer’s ease of use. Mixer is available free. Visit https://quixel.com/mixer

AJA T-TAP Support for Mistika Boutique

SGO RECENTLY ANNOUNCED that it has partnered with AJA Video Systems to make its subscription-based full-finishing software solution Mistika Boutique compatible with AJA’s T-TAP, the portable Thunderbolt-powered I/O device, in the latest Mistika 10 release. Mistika Boutique is subscription-based full finishing software for Windows and macOS designed to run on industry-standard, off-the-shelf hardware. It features the complete spectrum of professional finishing tools, from conform to VFX, colour grading, Stereo 3D, VR and more. When combined with AJA T-TAP, an affordable, compact video and audio output device that allows professionals to monitor high-quality 10-bit HD, SD, HDR and 2K video with embedded audio output from any compatible Mac or PC, Mistika Boutique provides users with a new cost-effective option for feeding Mac and PC outputs to preferred displays for an enhanced finishing experience. “Mistika Boutique was created for any type of post-production facility, including smaller studios or even freelance artists who want to take full advantage of the capabilities that our Mistika Technology software provides, with the added flexibility to work with their preferred hardware. T-TAP is an intuitive and cost-efficient device, making it a complement for Mistika Boutique users finishing 2D, 3D and even VR content on a laptop or computer,” said Geoff Mills, Managing Director at SGO. “In recent months, the importance of strong remote post production capabilities has become paramount, and our partnership with SGO aims to make mobile finishing that much simpler, providing an affordable way to get your Mistika Boutique output from your laptop or computer to a range of supported 3G-SDI and HDMI displays,” said Nick Rashby, President, AJA Video Systems. Visit https://www.sgo.es

DejaEdit Version 3.1

DEJASOFT HAS ANNOUNCED the release of version 3.1, which adds support for private cloud environments using the open source MinIO object storage solution. DejaEdit allows multiple remote Avid editors to work together by mirroring their locally stored media and edit assets to the local storage of other connected systems, using secure, background transfers across the internet. It can also be used to exchange media with on-location DITs and VFX/audio houses. DejaEdit uses a centralised server to ensure that all connected platforms have access to the permitted up-to-date media. This central server is based on Amazon Web Services S3 storage and can be either provided by DejaSoft or run under the client’s own AWS account. New for version 3.1, the S3 media server can be created by the client ‘in-house’, using MinIO S3 object storage. MinIO (https://min.io) is an open source object server which can be run in-house on Linux, Mac or Windows servers and is also available for Docker. MinIO provides data storage via an S3-compliant API, which allows applications such as DejaEdit to store objects in private centralised storage rather than in cloud-hosted Amazon S3 storage. The ability to operate in a private cloud environment further adds to the security management features released in DejaEdit version 3, where permission to access specific assets by particular users can be controlled via a management console. The new DejaEdit 3.1 release is compatible with workflows for all Avid Media Composer and Nexus systems. Visit https://dejasoft.com
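For readers unfamiliar with S3-compatible object storage, the short sketch below shows how a media file could be pushed into a self-hosted MinIO bucket using MinIO’s official Python SDK. It is only a rough illustration of the kind of private storage DejaEdit 3.1 can target, not DejaSoft’s own transfer code; the endpoint, credentials, bucket and file paths are placeholders.

from minio import Minio  # MinIO's Python SDK: pip install minio

# Placeholder endpoint and credentials for an in-house MinIO server.
client = Minio(
    "minio.example.internal:9000",
    access_key="EXAMPLE_ACCESS_KEY",
    secret_key="EXAMPLE_SECRET_KEY",
    secure=True,  # use TLS, since transfers may cross the internet
)

bucket = "dailies-mirror"
if not client.bucket_exists(bucket):
    client.make_bucket(bucket)

# Upload one locally stored media file via the S3-compliant API;
# connected systems could later mirror it back down to local storage.
client.fput_object(bucket, "ep101/sc042/A001_C003.mxf", "/media/avid/A001_C003.mxf")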


ETC Publishes Specs for Naming VFX Image Sequences

THE ENTERTAINMENT TECHNOLOGY CENTER (ETC@USC) VFX Working Group has published a specification for best practices for naming image sequences such as plates and comps. File naming is an essential tool for organising the multitude of frames that are inputs and outputs from the VFX process. Prior to the publication of this specification, each organisation had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication. ETC’s specification, which aims to standardise the process for media production, is available online for anyone to use. A key design criterion for this specification is compatibility with existing practices.

The new specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realised that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases. The specification is also compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications.

To ensure all requirements are represented, the Working Group included over two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision for the future of media creation technology.

Chair of the VFX Working Group, Horst Sarubin of Universal Pictures, said, “Our studio is committed to being at the forefront of designing best industry practices to modernise and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

“We wanted to make it as seamless as possible for everyone to adopt this specification,” said ETC@USC’s Erik Weaver, co-chair of the VFX Working Group. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

Download the white paper, VFX Image Sequence Naming, via https://drive.google.com/file/d/173MM9VRVXKZ64dEPixTajnTv04f2o5SN/view and visit https://www.etcentric.org
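The naming rules themselves are defined in the white paper and are not reproduced in this article. Purely as an illustration of the kind of automated handling a consistent scheme enables, the sketch below parses frames that follow a made-up <show>_<shot>_<element>_v<version>.<frame>.<ext> pattern; the pattern and field names are hypothetical, not the ETC specification.

import re

# Hypothetical naming pattern for illustration only; the ETC white paper
# defines the actual rules. Example frame: prj_0410_bgplate_v002.1001.exr
PATTERN = re.compile(
    r"^(?P<show>[a-z0-9]+)_(?P<shot>[a-z0-9]+)_(?P<element>[a-z0-9]+)"
    r"_v(?P<version>\d{3})\.(?P<frame>\d{4,})\.(?P<ext>exr|dpx)$"
)

def parse_frame(filename: str) -> dict:
    """Split a sequence filename into its naming fields, or raise ValueError."""
    match = PATTERN.match(filename)
    if not match:
        raise ValueError(f"{filename!r} does not follow the expected pattern")
    return match.groupdict()

# A pipeline tool could group frames by shot, element and version and verify
# that frame numbers are contiguous before transfer.
print(parse_frame("prj_0410_bgplate_v002.1001.exr"))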

RE:Vision Effects Unveils DEFlicker v2

RE:VISION EFFECTS, INC., the effects plug-in developer, has introduced DEFlicker v2, a major upgrade to its solution for problematic high frame rate and timelapse footage. DEFlicker takes control of all things that flicker, whether you shoot with a very high-speed shutter or at very low video frame rates (timelapses). Version 2 provides additional and enhanced tools for correcting high-speed shutter artifacts and timelapse flicker, and now comes with four plug-ins:

• HighSpeed: High-speed shutter and high frame rate elements tend to exhibit strobing caused by lighting systems. New in DEFlicker v2 are tools for managing the alpha channel for green screen VFX shots, along with improved noise reduction and a simplified method for removing flicker quickly. Users can now enter the frames-per-second of the shot and the system frequency for their region (50 or 60 Hz), and DEFlicker takes care of the rest.

• Timelapse: The solution for handling flicker in image sequences with a lot of motion discontinuities. New features include faster rendering for sequences over 4K, better handling of large colour and lighting shifts, and proper handling of over-range values.

• Auto-Levels: Extended timelapse scenes often suffer from fluctuations in colour and levels when shot using automatic exposure. DEFlicker already lets you display the variation over time graphically and replace missing frames in the sequence; v2 adds new methods for fixing or replacing bad or damaged frames.

• Rolling Bands: New in v2, the Rolling Bands plug-in can be used in conjunction with the HighSpeed plug-in or on its own. It lets you model and attenuate the annoying dark bands primarily caused by lighting and rolling-shutter timing issues. Rolling Bands provides interactive ways to model the band height, the distance between bands and the speed of roll, plus fine band-feathering options to match the source video’s transition from lighter to darker. Temporal processing takes the rolling band speed into account.

All plug-ins gain better handling of linear versus gamma-encoded sources (now automatic in AE, matching project settings), and v2 should load v1-based projects without issues. Visit https://revisionfx.com/

Remote, Cloud-based QC

HAVING SEEN A GREAT INCREASE in the need for those in the media processing industry (post houses, editors, colourists and the like) to be able to QC their content remotely, Venera Technologies has taken the idea of usage-based pricing introduced with its Pulsar Pay-Per-Use (PPU) on-premise QC software and applied it to Quasar, the company’s native-cloud QC service. According to Fereidoon Khosravi, SVP Business Development with Venera, “Up until recently, Quasar had been used by those media companies and organisations that had already moved their content workflow to the cloud, and typically used one of Venera’s tiered subscription plans. However, given the current circumstances, there are a large number of media professionals looking for a remote way to process (including QC) their content, but who have been hesitant because they are unfamiliar with the cloud and what can be done in the cloud, and, equally important, are not interested in committing to a monthly subscription plan due to the uneven and uncertain volume of work.

“Given our experience of working closely with a wide range of Media professionals, we have come up with what we think is a winning practical solution for Remote QC – the cloud-based Quasar Ad-hoc plan.”

Features include:
• File storage;
• An easy-to-use tool to upload your content to the cloud;
• Instruction on how to use Quasar and its pre-made QC templates (you can modify them as you wish);
• A browser-based interface; and
• Pay only for what you QC – simply load credits into your Quasar account using a credit card, and your account is debited as you QC your content.

Visit www.veneratech.com


Avid Reimagines Workflows with Media Composer 2020

AVID HAS ANNOUNCED a major new release of its flagship Avid Media Composer 2020 video editing software. Designed to give storytellers at all levels the most powerful solution for more creative freedom and workflow flexibility, Media Composer 2020 includes a redesigned customizable user interface, a new Universal Media Engine, finishing and delivery tools, and support for Apple ProRes for Windows and Catalina, among many other enhancements.

More customizable user experience – With Media Composer 2020, users can tailor their workspace to exactly how they want to work. Improvements to the panelled UI dramatically increase ease of use and enable faster editing and mastering. A new Timeline Sequence Map increases efficiency by letting creators navigate their entire sequence without taking up the whole screen, while the Blank Panel unclutters the UI and stops panels from resizing.

Finish and deliver with precision – Expanding on the editing and finishing capabilities introduced a year ago, Media Composer 2020 enables users to fine-tune colour with greater precision and make more granular gain value adjustments when working in ACES (Academy Colour Encoding System) spaces. Users can finish high-resolution and HDR projects with total colour precision and interoperability, ensuring pristine picture quality throughout their workflow.

Next Generation Avid Media Engine – Media Composer’s powerful Universal Media Engine enables users to accelerate their workflows by reducing the reliance on QuickTime to deliver better media importing, playback, editing and export performance. The media engine increases processing speed of hi-res HDR media and provides native support for a wider range of formats, including direct media access and OpenEXR for over-the-top services such as Netflix. Media Composer 2020 also enhances a user’s ability to easily create content for mobile video platforms and social media by providing 9×16 and 1:1 Mask Margins and FrameFlex framing pre-sets.

Apple ProRes for Windows and Catalina Support – Like Mac users, Windows users can now create, edit, collaborate, and export ProRes media natively, with encoding supported on Windows machines for media creation and export to .MOV, MXF OP1a, and MXF OP-Atom workflows. Media creators can also use Media Composer on Apple’s latest macOS Catalina, a 64-bit OS that provides superior performance while leveraging the power of the new Mac Pro.

Media Composer | Enterprise – Additionally, Media Composer | Enterprise expands its role-based customization capabilities to enable users to deploy or update site settings across an organization and deploy user settings independently to individuals or groups quickly, without impacting any existing site settings. With more studios managing remote teams, Media Composer | Enterprise gives users more control over their productions.

Visit www.avid.com/media-composer

Bluefish444 Supports Foundry Nuke & Nuke Studio 12

BLUEFISH444, the manufacturer of uncompressed 4K SDI, ASI, Video Over IP & HDMI I/O cards and mini converters, has announced support for Foundry Nuke and Nuke Studio 12 in its latest 2020.14 Windows Installer for KRONOS K8 and Epoch hardware. Bluefish’s K8 video I/O card and the entire Epoch range now support 2K/HD/SD-SDI playback within Nuke and Nuke Studio 12, giving post-production 2D/3D compositing and visual effects professionals access to the high quality associated with Bluefish video cards. With the K8 and Epoch video cards, Nuke and Nuke Studio 12 now have access to the highest quality SDI playback with proprietary 12-bit processing, supporting both RGB and YUV colour spaces; 2K, HD and SD video modes are currently supported. Bluefish will continue to work with Foundry to update support for the Nuke and Nuke Studio products and will integrate 4K/UHD support in a forthcoming installer update. Visit https://www.bluefish444.com

Automated HDR to SDR Conversion

LYNX TECHNIK, the provider of modular signal processing interfaces, has announced that HDR Evie+ is now available for its greenMachine platform. LYNX Technik’s HDR suite of processing solutions (HDR Evie+, HDR Evie, HDR Static) for greenMachine addresses the challenge broadcasters and content creators face when they need to broadcast or archive content in both SDR and HDR. The simple solution is to run two independent production routes for SDR and HDR, but this is costly and falls short of delivering full HDR content to the customer and ultimately the subscriber/viewer. The greenMachine suite of HDR processing tools addresses the issue of simultaneous workflows by combining optimised HDR with up-converted SDR sources into a single HDR production process, eliminating expensive and time-consuming dual SDR and HDR production.

HDR Static, HDR Evie (Enhanced Video Image Engine) and HDR Evie+ processing applications run on the familiar, award-winning LYNX Technik greenMachine hardware platform. These format conversion solutions ensure facilities can, for example, use a single greenMachine titan hardware module to up-convert four independent 3G SDR sources (e.g. SDR-only cameras, graphics, replays, external feeds, archives) to HDR in a variety of formats, and feed them directly into the HDR production workflow. This conversion ensures that HDR content is delivered direct from the optimised HDR cameras without compromise. Once the content is ready for delivery, broadcast or streaming to clients and subscribers/viewers, greenMachine can down-convert one of the HDR program output feeds to SDR, ensuring media facilities can deliver content both to HDR-capable screens and to viewers who are still watching in SDR.

HDR Evie+, the most recent addition to LYNX Technik’s HDR line-up for greenMachine, takes things to a new level. It uses patented, dynamic, segmented frame-by-frame algorithms with sectional dynamic tone mapping, allowing adjustment of each segment (144 segments per frame) of 3G or 4K HDR content in real time. This segmented dynamic conversion is especially suited to demanding and unpredictable content with fast-moving subjects and high-contrast conditions, as typically found in live sports and news broadcasts.

The entire range of LYNX Technik’s greenMachine HDR <-> SDR processing solutions supports a range of open standards for conversion, tone mapping and colour gamut, including HLG, PQ, SDR, SLog3, Rec709, Rec 2020, DCI-P3 and ACES, as well as camera standards from Panasonic, Sony, ARRI, RED and BMD. Visit www.green-machine.com
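LYNX Technik does not publish the details of its patented algorithm, so the sketch below is purely a conceptual illustration of what segment-wise dynamic tone mapping means: the frame is divided into a 12 x 12 grid (144 segments) and each segment gets its own tone curve driven by local statistics. The percentile white level and Reinhard-style curve are assumptions for illustration, and a real implementation would also smooth between segments to avoid visible blocking.

import numpy as np

def tone_map_segmented(hdr_luma: np.ndarray, grid: int = 12) -> np.ndarray:
    """Map linear HDR luminance to a 0..1 SDR range, one tone curve per segment."""
    h, w = hdr_luma.shape
    sdr = np.empty_like(hdr_luma)
    ys = np.linspace(0, h, grid + 1, dtype=int)
    xs = np.linspace(0, w, grid + 1, dtype=int)
    for i in range(grid):
        for j in range(grid):
            seg = hdr_luma[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            peak = max(np.percentile(seg, 99.5), 1e-6)   # local white level
            mapped = seg / peak                          # normalise to local peak
            # Reinhard-style roll-off so the local peak lands near 1.0
            sdr[ys[i]:ys[i + 1], xs[j]:xs[j + 1]] = mapped / (1.0 + mapped) * 2.0
    return np.clip(sdr, 0.0, 1.0)

# Synthetic 1080p frame in linear light with a simulated specular highlight.
frame = np.random.rand(1080, 1920) * 4.0
frame[100:300, 1500:1800] *= 10.0
print(tone_map_segmented(frame).max())  # stays within the 0..1 SDR range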