Although originally used to describe a process that started with film scanning and ended with film recording, the term digital intermediate is also used to describe color grading and final mastering even when a digital camera is used as the image source and/or when the final movie is not output to film. This is due to recent advances in digital cinematography and digital projection technologies that strive to match or exceed the quality of film origination and film projection.
In traditional photochemical film finishing, an intermediate is produced by exposing film to the original camera negative. The intermediate is then used to mass-produce the prints that get distributed to theaters. Color grading is done by varying the amount of red, green, and blue light used to expose the intermediate. One of the key technical achievements that makes the DI possible is the look-up table (LUT), which visually predicts how the digital image will look once it is printed onto normal release print stock. DI facilities generally allow comparing the digital image directly to a print on the same screen, ensuring precise calibration of the process.
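The mechanics of a LUT can be sketched in a few lines: each possible input code value is mapped, via a precomputed table, to the output value the display or print stock would produce. The sketch below uses a simple gamma curve as a stand-in for a measured film-print emulation curve; the function names and the 8-bit scale are illustrative assumptions, not part of any particular DI system.

```python
# Sketch of a 1D look-up table (LUT) applied per channel, assuming 8-bit
# pixel values. A real print-emulation LUT is built from measured
# print-stock response curves; here a gamma curve stands in for one.

def build_gamma_lut(gamma, size=256):
    """Hypothetical LUT: maps each input code value through a gamma curve."""
    return [round(((i / (size - 1)) ** gamma) * (size - 1)) for i in range(size)]

def apply_lut(pixels, lut):
    """Replace every (r, g, b) value with its precomputed LUT entry."""
    return [(lut[r], lut[g], lut[b]) for (r, g, b) in pixels]

# Brighten midtones, as a stand-in for previewing how a print would look.
lut = build_gamma_lut(1.0 / 2.2)
graded = apply_lut([(0, 128, 255)], lut)
```

Because the table is computed once and then applied by simple indexing, the same preview transform can run in real time over an entire frame, which is what makes on-screen comparison against a reference print practical.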
The digital intermediate process uses digital tools to color grade, which allows for much finer control of individual colors and areas of the image, and allows for the adjustment of image structure (grain, sharpness, etc.). The intermediate for film reproduction is then produced by means of a film recorder. The physical intermediate film that results from the recording process is sometimes also called a digital intermediate, and is usually made on internegative (IN) stock, which is inherently finer-grain than original camera negative (OCN).
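As one concrete illustration of this finer control, digital grading tools commonly expose per-channel slope/offset/power adjustments in the style of the ASC CDL (Color Decision List). The sketch below is a minimal, assumed implementation of that formula on a 0.0-1.0 scale; the parameter names follow the CDL convention, but the values and helper functions are illustrative.

```python
# Minimal sketch of an ASC CDL-style primary grade (slope/offset/power),
# the kind of per-channel control a DI grading system exposes.
# Formula (per channel): out = clamp(in * slope + offset) ** power.

def cdl(value, slope, offset, power):
    """Apply slope/offset/power to one channel value on a 0.0-1.0 scale."""
    v = max(0.0, min(1.0, value * slope + offset))
    return v ** power

def grade_pixel(rgb, slope, offset, power):
    """Apply per-channel slope/offset/power to one (r, g, b) pixel."""
    return tuple(cdl(c, s, o, p) for c, s, o, p in zip(rgb, slope, offset, power))

# Illustrative grade: warm the image slightly by raising the red slope
# and lowering the blue slope, leaving offset and power neutral.
warm = grade_pixel((0.5, 0.5, 0.5),
                   slope=(1.1, 1.0, 0.9),
                   offset=(0.0, 0.0, 0.0),
                   power=(1.0, 1.0, 1.0))
```

Unlike photochemical timing, where one red/green/blue light setting applies to a whole shot, adjustments like this can be combined with masks and keys to grade individual colors or regions of the frame independently.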
History
Telecine tools to electronically capture film images are nearly as old as broadcast television, but the resulting images were widely considered unsuitable for exposing back onto film for theatrical distribution. Film scanners and recorders with quality sufficient to produce images that could be intercut with regular film began appearing in the 1970s, with significant improvements in the late 1980s and early 1990s. During this time, digitally processing an entire feature-length film was impractical because the scanners and recorders were extremely slow and the image files were very large relative to the computing power of the time. Instead, individual shots or short sequences were processed for special visual effects. The first Hollywood film to use a digital intermediate process from beginning to end was O Brother, Where Art Thou? in 2000; in Europe, the first was Chicken Run, released that same year. The process rapidly caught on, and it is anticipated that more than 90% of Hollywood films will go through a digital intermediate in 2006. This is due not only to the extra creative options the process affords filmmakers but also to the need for high-quality scanning and color adjustment to produce movies for Digital Cinema.