|Eadweard Muybridge 1878|
Now more than ever people love moving pictures. The history of film and video is a history of steadily advancing technology, because movie-making at its core is taking a series of stills and making them appear to move by passing them in front of light. It's an illusion. During playback, the discrete nature of these temporal snapshots can create judder, motion blur, and strobing effects that our brains pick out, sometimes subconsciously.
Cinematographers work hard to manage those artifacts by adjusting shutter speed, frame rate, camera movement, and lighting. And as good as they are, other factors, outside of their control, cause changes in motion appearance—such as dynamic range, screen size, and display-specific elements such as response time.
Pixelworks thinks they have a solution which they are calling TrueCut Motion Grading—think of it as color grading but for motion.
Pixelworks says its TrueCut Motion tool works with any source frame rate (including 24 fps). It enables filmmakers to precisely manage judder, motion blur, and frame-rate appearance on a scene-by-scene basis in post-production. The company says the look established in the reference grading suite is faithfully reproduced when the content is streamed to TV sets or shown on the latest cinema screens.
To get control over these artifacts, Pixelworks offers three controls. Working alongside a colorist, the cinematographer and/or director can vary the judder—tuning it to the point where it falls out of range of human perception, and adjust the motion blur, all while either retaining a 24 fps (film) frame-rate look or adjusting to alternative frame-rate appearances.
With the arrival of sound, Hollywood locked into 24 fps for cameras, the minimum frame rate for capturing acceptable sound quality. That rate also complemented the projectors in use at the time, which used two- or three-bladed shutters to show each frame two or three times, reducing flicker. Moving celluloid with sprockets was just asking for problems, and yet millions of films were made and hundreds of thousands of projectors ran them. Counting home movie cameras and projectors, the number is ten times that. Engineers and technicians engaged in a continuing effort to identify and reduce the side effects of mechanical motion-picture projection, which traces its roots back to Edison's 1892 Kinetoscope.
But around the 1960s things started changing. Color processes such as Technicolor were becoming established, and higher-resolution, wider film formats were being developed, from 8 mm and 16 mm up to 70 mm: still sprocketed, still flickering, and often blurring.
TV was also becoming predominant, and it had the annoying habit of changing its images 50 or 60 times a second. One of the most famous artifacts of discrete sampling was that wagon wheels in Westerns looked like they were turning backward, a stroboscopic effect that occurs when a wheel's rotation falls out of sync with the camera's frame rate. TV studios figured out how to manage the frame-rate mismatch using pulldown processes in the telecine to convert 24 fps film to 60 fps video. Because 60/24 is 2.5, not a whole number, frames cannot simply be repeated evenly, so the process is complicated and compromised. To add further complication, worldwide broadcast standards require different telecine adjustments.
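The uneven cadence that makes pulldown "compromised" is easy to see in a few lines of Python. This is a simplified sketch of frame-repetition ("2:3") pulldown to 60 fps progressive; a real telecine interleaves video fields rather than whole frames, but the uneven 2,3,2,3 cadence is the same:

```python
# Sketch of 3:2 ("2:3") pulldown: mapping 24 fps film to 60 fps video.
# Each pair of film frames is spread over 5 video frames (2 copies of one,
# 3 of the next), so 24 film frames become exactly 60 video frames.

def pulldown_32(film_frames):
    """Repeat film frames in a 2,3,2,3,... pattern (3:2 pulldown)."""
    video = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3
        video.extend([frame] * repeats)
    return video

film = list(range(24))            # one second of film: frames 0..23
video = pulldown_32(film)
print(len(video))                 # 60 -> one second of 60 fps video
print(video[:5])                  # [0, 0, 1, 1, 1] -> the uneven cadence
```

Because some frames persist for 2/60 s and others for 3/60 s, motion is no longer sampled evenly, which is one source of the judder the article discusses.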
Then in the 2000s, movie studios and a few theaters began experimenting with 48 fps. Peter Jackson's The Hobbit: An Unexpected Journey, released in December 2012 at 48 fps, was the first feature film to do so. Even higher rates had been tried before: Douglas Trumbull made 60 fps films (Night of Dreams and New Magic) as far back as 1978. But only a very few theaters had the equipment to show them, and at the time the movie industry was in a slump, so new investment was not being made.
The roof was blown off the industry in 2016 when Ang Lee had the audacity to introduce his 120 fps work, Billy Lynn's Long Halftime Walk. Like most productions in recent years, it was shot on digital cameras rather than film; the movie got indifferent reviews and was not widely shown at 120 fps. Other speeds have been tried as well: Victor Kossakovsky shot his 2018 film Aquarela at 96 fps, an interesting film with almost no exhibitors due to the frame rate (scaled-down versions have since been made).
And so today we have cameras that can capture at almost any frame rate desired and digital projectors that can playback at higher frame rates, although 24 fps is still the norm for both.
Ang Lee came back in 2019 with his famous Gemini Man, combining 4K resolution, stereo 3D, and a 120 fps high frame rate (HFR), a specification only 14 theaters in the US could exhibit. Again, early critical response killed any chance the movie might have had to transform the movie theater industry, or save 3D.
While all this experimentation with frame rate, resolution, high dynamic range, 3D, and amazing sound systems was going on in an effort to attract consumers back to the theaters, the final payoff for a studio was TV. Movies could be delivered on Blu-ray Disc at HD or 4K resolution, streamed at various data rates depending on a consumer's data plan, or even shown via over-the-air (OTA) delivery systems.
As technology advances, the film industry has been struggling to bridge theatrical presentations and TV broadcasts. The challenge is to enable a reliable, consistent presentation that accurately reflects the director’s and studio’s intentions. Movies are defined by their look and feel and some directors will not allow their films to be shown if not reproduced properly.
Pixelworks has been working on these challenges and related cinema delivery problems for over 23 years. They think they have cracked it with their TrueCut motion grading solution.
To illustrate how the problem compounds, the company generated a chart showing the gap between source and delivery capabilities, and the impact on motion quality as resolution, speed, color, and other parameters naturally increase over time.
|Display technology is outpacing content creation. (Source: Pixelworks)|
There are at least nine delivery methods for movie release today, and more are expected and even desired. In addition to TVs and computers, mobile devices and amazing new projectors are further complicating the presentation landscape. Many of these devices have advanced display processing capabilities (some of it provided by Pixelworks) to adjust the content to match the capabilities of that display. The company pioneered many of the algorithms used in TVs, projectors, and smartphones, and has established a reputation for picture quality in the industry.
TrueCut closes the loop, so to speak, by addressing the issue at the source: the front end. It provides creative controls that not only utilize the capabilities of modern displays but actively exploit them, taking advantage of the advanced delivery format. By doing so, the company can guarantee content owners a consistent presentation for a new generation of titles authored with the TrueCut tools. This is a big deal; nothing this universal has existed before.
A major component of the TrueCut solution is Pixelworks' invention of the Motion Appearance Model (MAM). It builds on the Color Appearance Model (CAM), which seeks to determine how specific colors seen in the studio will appear to the viewer at home or in the theater; the goal is to maintain the perception of the created hue in any environment.
The Motion Appearance Model does the same for movement. When combined with a Color Appearance Model, a new Motion Picture Appearance Model is created. This is the first time such a model has been developed.
Pixelworks put together a panel of expert viewers in developing its MAM. The motion appearance experts looked at a variety of content to document "just noticeable differences" for motion across different displays and viewing distances—you can think of this somewhat like training an AI system by showing it tens of thousands of pictures of cats. The content used had a variety of shots with different camera and subject motion, different contrast and peak-white levels, and different shutter speeds. These are differences a human brain can easily discern, but most people would have a difficult time describing or quantifying them, hence the need for experts. It's a little like a sommelier explaining the nuances and qualities of fine wine—oh, and it tastes good too. In MAM it's—oh, and it looks right too.
This is rocket science. Consider the effects of a film that skips a frame or two. Your brain registers it instantly. You may consciously recognize it and even comment on it, but even if not, your brain knows something was missed. Judder or stutter is also affected by luminosity—how bright the images are. An analogy to that is reading the fine print in low light versus in bright light.
Some judder can be acceptable. For example, a standard dynamic range (SDR) film in 4K will be graded at 48 nits (14 foot-lamberts) and displayed at a contrast ratio of 1500:1 to 2000:1. However, that same content, if shown in EDR (extended dynamic range) in a theater at 108 nits (31 foot-lamberts), or in high dynamic range on a TV with 1000 nits of peak luminance, will exhibit noticeably more judder.
Moving the camera across the frame (often to follow a character moving from one spot to another) is known as panning—what you do when making a video with your smartphone sweeping from left to right to capture the whole vista. Panning in a cinematic movie is very precise and carefully controlled. To do otherwise, directors and cinematographers risk inducing motion sickness, disorientation, and “brain smear” (the viewer struggles to keep up with the content). It’s something taught in filmmaking 101. And yet side effects can still happen.
Cinematographers could mitigate motion artifacts that occur in high-dynamic-range (HDR) filmmaking by slowing down the camera panning to speeds that are much slower than used for standard dynamic range (SDR). Judder can also be reduced by windowing (selecting a specific area) and re-grading with lower contrast only in that area (reading the newspaper in dim light).
Pixelworks’ TrueCut provides tools that allow filmmakers to avoid on-set restrictions on motion, contrast, or detail, and to make motion adjustments in post-production in much the same way that color grading is central to the digital finishing process.
The company generated a chart to show the tradeoffs, and what it also illustrates is the non-linear complexities of the problems and how difficult it is to manage in a predictable and repeatable way.
|Judder equivalence versus luminance for various camera pan speeds. (Source: Pixelworks)|
After you’ve mastered frame rate differences, and judder and luminosity differences, you’re ready to take on motion blur.
|There is a relationship between shutter speed, frame rate, and the 180° rule, but blur is a side effect. The 180° rule can be broken to emulate a specific film era, or to make video purposefully shaky or outright jarring. The wider the shutter angle (from 270° up to 360°), the more motion blur; the narrower the shutter angle (less than 180°), the less motion blur.|
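The shutter-angle relationship follows directly from its definition: the exposure time per frame is the fraction of the frame interval during which the shutter is open. A minimal sketch in Python (the function name is illustrative, not from any Pixelworks tool):

```python
# Exposure time from shutter angle and frame rate:
#   exposure = (shutter_angle / 360) * (1 / frame_rate)
# At 24 fps with a 180-degree shutter (the "180-degree rule"), each frame
# is exposed for 1/48 s. Wider angles expose longer and blur more; narrower
# angles expose less and yield crisper, choppier motion.

def exposure_time(shutter_angle_deg, fps):
    """Seconds of exposure per frame for a given shutter angle and frame rate."""
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(180, 24))   # 1/48 s -> classic film-look motion blur
print(exposure_time(360, 24))   # 1/24 s -> maximum blur at 24 fps
print(exposure_time(90, 24))    # 1/96 s -> much less blur, more strobing
```

This is why the chart that follows can express perceived blur as an "equivalent shutter angle": angle and frame rate together fix how long motion smears across each frame.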
Motion blur is a funny thing, a little like lens flare. It’s an integral part of shooting at 24 fps and something that both cinematographers and viewers have become accustomed to. In an attempt to make CGI look like film, video and game directors had visual effects artists add motion blur and (virtual) lens flare.
Motion blur, an artifact that has, sometimes, become a “feature,” is a uniquely filmic phenomenon with interesting physiological aspects: our brains do a really good job of compensating for it, and at the same time can tell when it is not right—i.e., overdone, faked.
There may be an equation somewhere that describes and quantifies blur, but until it is found, blur has to be graded and evaluated by experts, just like the other elements mentioned. And as you might imagine, Pixelworks has done just that. The result is a collection of motion-blur data that can be used to compare the motion blur of a shot at 24 fps (with a 180-degree shutter angle), graded and played back at different peak-white levels, with the motion blur of the same shot at different shutter angles played back at 48 nits. The next chart illustrates how a change in display brightness alters the perception of motion blur in terms of an equivalent shutter angle.
|Motion/blur model. (Source: Pixelworks)|
Comparing the two charts shows how this multi-dimensional physiological problem can affect us. As the luminance of the display increases, our perception of judder increases but our perception of motion blur decreases.
Therefore, if a director or cinematographer wants to preserve the same perception of judder and motion blur of an SDR, 48 nits grade on displays with higher luminance, then higher shutter angles and slower pans are required. So if the director (or cinematographer) knows the content is being targeted for higher luminance devices (like a 4K HDR TV), TrueCut gives them the tools, the knobs, to match the motion without having to change anything during shooting.
From a feature-benefit POV, Pixelworks TrueCut offers:
|Expanded HDR grading palette||Manages changes in judder and motion detail|
|Broader lighting choices||Removes strobing effects|
|Motion blur control||Finely tunable with the virtual shutter|
|Tunable frame-rate look||Maintains the 24 fps film feel or an alternative look|
|Consistent motion appearance||Creative choices maintained across various displays|
|More creative choices for visual storytelling. (Source: Pixelworks)|
The company offers an infographic that summarizes its end-to-end offering for studios, content creators, directors, cinematographers, and CGI pros.
|TrueCut platform and ecosystem development. (Source: Pixelworks)|
TrueCut Motion is a software package that has been designed to work throughout the capture-to-distribution chain. The platform allows content to be captured at 24 fps or higher with the ability to deliver any motion look creatives desire.
Used on the set, cinematographers can use the tools to preview and create an initial motion look which is captured as a motion decision list. TrueCut Motion can be used to deliver this motion look for dailies as well so everyone sees the desired look.
In post-production, TrueCut Motion is used after color grading. The user interface has been integrated into popular tools such as Blackmagic Design’s DaVinci Resolve, as well as in other forms.
What do we think?
Pixelworks offers TrueCut as a licensed service to its customers (e.g., studios, post facilities), complete with tech support. The company is constantly expanding the suite of tools, including a hardware platform to run it on for clients who want everything on-prem. A customer can take the whole package or license just elements of it. It's available now and at work in several big-name studios and facilities.
Chris Chinnock at Insight Media has put together a white paper on this technology: New White Paper: Motion Grading Comes of Age.