How do you choose the right video color space for your project? I want to take you through a few basic color spaces and their applications.

A video color space defines a set of RGB chromaticities that together determine a color gamut, a color component transfer function (often referred to by the confusing but common term “gamma”), and the chromaticity of a white point. These values define how color information is encoded for a particular video standard. All video is intended to be delivered and watched on some kind of display device: a television, PC or laptop display, tablet, phone, cinema projector, or HDR television. How your video content will be consumed determines the video color space you need to create in and deliver.

Your monitoring pipeline should be calibrated for the color space of the video standard that you will deliver in.

While HDR is sexy and 4K HDR TVs are in many homes around the world, reference-grade HDR monitoring is out of reach for most of us. So I’m going to introduce you to the most common video color spaces you’ll hear about, and the one you’ll use for web, mobile and broadcast television deliverables.

Common Video Color Spaces

If you’re not sure exactly what a video color space is, I explain this in my article What is A Video Color Space? For the most part you still only need to be concerned with one color space, and that’s the standard HDTV Rec. 709, or ITU-R BT. 709.


sRGB

sRGB is a display-referred color space originally created for CRT computer monitors, which has since become the standard for graphics and print. It is almost identical to the Rec. 709 video color space: it is based on the same primaries and has the same gamut as Rec. 709, but specifies a different transfer function.

sRGB is still the standard for computer imaging, most consumer to mid-level photo cameras, and home printers. For professional printing and pre-press purposes, Adobe RGB is often used, which has an extended gamut that can be reproduced with professional CMYK printing. sRGB doesn’t have anything to do with video in the context of broadcast standards, except that for the most part Rec. 709 video will look fine on an sRGB computer display. There might be a slight difference in brightness due to the difference in transfer function.
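That transfer function difference is easy to see numerically. Here’s a quick Python sketch (function names are my own) comparing the sRGB encoding curve from IEC 61966-2-1 with the Rec. 709 OETF from ITU-R BT.709: the same scene-linear value lands on slightly different code values, which is the brightness difference mentioned above.

```python
def srgb_encode(lin):
    # sRGB encoding (inverse EOTF), IEC 61966-2-1
    if lin <= 0.0031308:
        return 12.92 * lin
    return 1.055 * lin ** (1 / 2.4) - 0.055

def rec709_oetf(lin):
    # Rec. 709 camera OETF, ITU-R BT.709
    if lin < 0.018:
        return 4.5 * lin
    return 1.099 * lin ** 0.45 - 0.099

# 18% grey encodes noticeably lower under Rec. 709 than under sRGB
for lin in (0.05, 0.18, 0.50, 1.00):
    print(f"linear {lin:.2f} -> sRGB {srgb_encode(lin):.3f}, Rec.709 {rec709_oetf(lin):.3f}")
```

For 18% grey this gives roughly 0.461 (sRGB) vs 0.409 (Rec. 709), so Rec. 709 material shown on an sRGB display without compensation appears slightly different in the mid-tones.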

Rec. 709 (ITU-R BT. 709)


Rec. 709 is the standard camera encoding color space for HDTV with a gamut identical to sRGB. As previously stated, sRGB and Rec. 709 primaries share the same chromaticity values (and therefore the same gamut). However, Rec. 709 differs slightly in its encoding transfer function, and doesn’t actually specify an inverse transfer function for display.

For broadcast encoding, it is defined at 8-bit color depth (code values 0–255), where black is level 16 and white is level 235. These are often referred to as “video levels”.
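As a quick illustration, here’s how a full-range code value maps onto those broadcast video levels (a simplified Python sketch of the luma mapping only; function names are my own, and real conversions operate on Y’CbCr, with chroma using its own 16–240 / 64–960 ranges):

```python
def full_to_video_8bit(code):
    # Map a full-range 8-bit value (0-255) onto 8-bit broadcast video
    # levels, where 16 is black and 235 is white.
    return round(16 + (code / 255) * 219)

def full_to_video_10bit(code):
    # The same mapping at 10-bit: full range 0-1023, video levels 64-940.
    return round(64 + (code / 1023) * 876)

print(full_to_video_8bit(0), full_to_video_8bit(255))     # 16 235
print(full_to_video_10bit(0), full_to_video_10bit(1023))  # 64 940
```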

At 10-bit color depth, which is common in post production, full-range code values run from 0 to 1023 (broadcast video levels sit at 64–940), and the final output is mapped to the broadcast-standard 8-bit 16–235 range when creating common deliverables.
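One reason 10-bit matters in post: quantizing a smooth gradient at a lower bit depth collapses it onto fewer distinct levels, which is where banding comes from. A small Python sketch (function name is my own):

```python
def quantize(value, bits):
    # Snap a normalized 0-1 value to the nearest code at this bit depth.
    levels = 2 ** bits - 1
    return round(value * levels) / levels

ramp = [i / 4095 for i in range(4096)]  # a smooth 12-bit gradient
for bits in (8, 10):
    distinct = len({quantize(v, bits) for v in ramp})
    print(f"{bits}-bit: {distinct} distinct levels")  # 256 vs 1024
```

Four times the levels means each step is a quarter of the size, so grading operations that stretch tonal ranges have far more room before the steps become visible.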

Rec. 709 is by far the most common working and delivery color space for most video projects. If you’re creating video for broadcast delivery, or that will be consumed online, then Rec. 709 is most likely what you need to work and monitor in. The Rec. 709 gamut is supported by all common display technologies across many devices. Most computer video players know how to deal with Rec. 709 encoded video, and can display it correctly on an sRGB computer display.


DCI-P3

DCI-P3 is a wide-gamut video color space defined by DCI and standardized by SMPTE for digital cinema projection. It is designed to closely match the full gamut of color motion picture film.

It is generally not a consumer standard and is mostly used for content destined for digital theatrical projection. Notably, however, Apple has adopted P3 color across many of its device displays, and its devices have been able to capture photos and video in the P3 color space since iOS 10.

Most professional reference monitors are able to display the full DCI P3 gamut.

You will often see a white point specified along with the color space, such as P3 D55, P3 D61 or P3 D65. The D number indicates the target white color temperature in degrees Kelvin: D55 is 5500K, D61 is 6100K, D65 is 6500K, and the DCI standard white point is approximately 6300K.
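Those temperatures can be sanity-checked from the published CIE 1931 xy chromaticities of each white point using McCamy’s well-known approximation (a rough formula, accurate to within a few kelvin for these whites). A Python sketch:

```python
def mccamy_cct(x, y):
    # McCamy's approximation: correlated color temperature from CIE 1931 xy.
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

whites = {
    "D65": (0.3127, 0.3290),        # Rec. 709 / sRGB white
    "D55": (0.3324, 0.3474),
    "DCI white": (0.3140, 0.3510),  # the theatrical projection white
}
for name, (x, y) in whites.items():
    print(f"{name}: ~{mccamy_cct(x, y):.0f}K")
```

This lands at roughly 6504K, 5503K and 6306K respectively, matching the figures above.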

Rec. 2020

Rec. 2020 defines the color specifications for UHD television, and its primaries are also the basis of HDR standards. In terms of color gamut, it covers a large percentage of the visible colors in the CIE chromaticity diagram. The standard defines 10-bit or 12-bit color depth. A few display technologies are fully Rec. 2020 compliant, but it is not yet a common video color space for the average video creator to be working in.
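To put a rough number on the gamut difference, the shoelace formula gives the area of each gamut triangle on the CIE 1931 xy diagram (the primaries below are the published BT.709 and BT.2020 chromaticities; area on this diagram isn’t perceptually uniform, so treat the ratio as indicative only):

```python
def triangle_area(primaries):
    # Shoelace formula for the area of a gamut triangle in xy coordinates.
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(REC2020) / triangle_area(REC709)
print(f"Rec. 2020 triangle covers ~{ratio:.1f}x the area of Rec. 709")  # ~1.9x
```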

HDR finishing is becoming more common for commercial high-end delivery, but not something the home freelance DP/colorist or video enthusiast will be equipped to undertake for some time to come. The average consumer HDR television does not meet the requirements as a reference display for post production. Some premium OLED HDR televisions such as the LG C9 and LG CX can be calibrated for excellent Rec. 709 SDR reference monitoring, but should not be used for HDR reference.


Choosing The Right Video Color Space

The source camera files from any digital cinema camera provide images encoded at high bit depth, with a native color gamut that far exceeds the requirements of the DCI specification and in most cases meets or exceeds Rec. 2020.

Ideally, you should be working in a sufficiently wide gamut color space to encompass all the expected output standards you need to deliver and have your monitoring calibrated to match.

If theatrical digital cinema delivery is one of the requirements, you should work in DCI-P3 using a calibrated DCI-spec projector, or monitor that covers the DCI-P3 gamut.

If HD broadcast delivery is the widest gamut color space expected, or computer desktop / mobile / web at any resolution, you should work in Rec. 709.
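Moving material between these working and delivery spaces ultimately comes down to a 3x3 matrix applied to linear (not gamma-encoded) RGB. The standard construction derives each space’s RGB-to-XYZ matrix from its primaries and white point, then chains them; a pure-Python sketch (helper names are my own):

```python
def mat_inv3(m):
    # Inverse of a 3x3 matrix via the adjugate.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)
    adj = [
        [e*i - f*h, c*h - b*i, b*f - c*e],
        [f*g - d*i, a*i - c*g, c*d - a*f],
        [d*h - e*g, b*g - a*h, a*e - b*d],
    ]
    return [[v / det for v in row] for row in adj]

def mat_mul3(m, n):
    return [[sum(m[r][k] * n[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

def rgb_to_xyz(primaries, white):
    # Build the RGB->XYZ matrix for a set of xy primaries and a white
    # point, scaled so that RGB (1,1,1) lands exactly on the white point.
    cols = [[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]
    P = [[cols[c][r] for c in range(3)] for r in range(3)]
    wx, wy = white
    W = [wx / wy, 1.0, (1 - wx - wy) / wy]
    S = [sum(row[k] * W[k] for k in range(3)) for row in mat_inv3(P)]
    return [[P[r][c] * S[c] for c in range(3)] for r in range(3)]

D65 = (0.3127, 0.3290)
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

M709 = rgb_to_xyz(REC709, D65)
M2020 = rgb_to_xyz(REC2020, D65)
# Linear Rec. 709 RGB -> linear Rec. 2020 RGB
M_709_to_2020 = mat_mul3(mat_inv3(M2020), M709)
print([round(v, 4) for v in M_709_to_2020[0]])
```

Since both spaces share D65, neutral greys stay neutral: each row of the conversion matrix sums to 1. Going the other way, from a wide space down to a narrow one, is where out-of-gamut values have to be mapped back in, which is why that direction needs care.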

Color Management

Whatever color space you are working in, implementing professional color management at each step in your post monitoring pipeline is important. This means setting up your color grading working environment properly and calibrating all your displays.

That said, many of us have to make do as best we can with a consumer monitor or laptop screen. To be perfectly honest, for web delivery this is usually fine when targeting Rec. 709, as long as you’re using a high quality display. However, it’s best to avoid grading on a computer desktop GUI display, because there are so many color management variables, from the OS to the software, that are outside of your control.

I’ve compiled a guide to building your own color managed and calibrated monitoring pipeline based around the excellent LG OLED TVs. They can be perfectly calibrated for Rec. 709 color correction and grading work.

You can add precision calibrated monitoring to your Resolve system for only $2200 – $3000 depending on whether you need to monitor in HD or UHD 4K. Note that in both cases I’m talking about Rec. 709 SDR use only.

I hope this has provided a basic understanding about video color spaces, what they are, the common standards, and how you should be using them.

Further Reading

For a more in-depth look at professional color management, the following paper is well worth reading: Cinematic Color, From Your Monitor to the Big Screen

Stay in Touch

If you’d like to be notified of new articles and tutorials you can subscribe to my very occasional email updates.

Please don’t hesitate to comment with your questions either here or on YouTube, or hit me up on Twitter; I will always reply.


  1. Hey Richard,
    I’ve got an SDR video I graded in Resolve using Rec2020. Is there a way to get this on YouTube while retaining the colour space (so you see better colour on phones and displays that support HDR)?

    I assume I have to pretend it’s HDR, mastered to my wide-gamut monitor’s nits (~350), with a custom LUT that (more or less) is just a bypass for the YT-generated ‘SDR’ version. Have you (or anyone) ever tried this?

    • Richard Lackey

      Hi! That’s very interesting. I have not tried anything like that myself and have not read anything from anyone about it. What exactly do you have in mind to achieve? You can’t “stretch” anything from SDR to HDR (without some kind of clever computational process attempting to extrapolate values beyond the source gamut and luminance), and if all you want to do is put SDR into an HDR container, there’s no advantage that I can see.

      • No, it’s just about the wide gamut. It’s Rec2020 SDR, but when I upload to YT they convert it to Rec709, so even on HDR capable displays (like phones) that could show the wider gamut, it’s limited.

        So as YT doesn’t support Rec2020 SDR delivery, I expect the workaround is pretending it’s an HDR file, but with suitable metadata to show the SDR content correctly. That has the added benefit of making the screen bright automatically on eg. my phone when it enters HDR mode, to make everything pop.

        I’ll try this over the weekend and post a link here if it works.

  2. Hi Richard, awesome article, thanks a lot.

    I’m working on a Rec2020 / PQ (wide gamut) project in FCPX using an Apple XDR display. When I export to Vimeo, the video looks desaturated and too dark, and lacks contrast compared to the source file.

    The available HDR video settings for the display are P3 / D65, and there’s a direct feed option in FCPX, so I’m assuming the colors are being translated and displayed accurately.

    When I exported I added the corresponding meta tags as follows:
    Color primaries : BT.2020
    Transfer characteristics : PQ
    Matrix coefficients : BT.2020 non-constant
    Mastering display color primaries : Display P3
    Mastering display luminance : min: 0.0000 cd/m2, max: 1000 cd/m2
    Maximum Content Light Level : 1000 cd/m2
    Maximum Frame-Average Light Level : 0 cd/m2

    However, in the Youtube doc I read:
    “In cinema, it’s common to master HDR videos in the DCI P3 color space, with either the DCI (~D50) or D65 white points. Doing so is not a supported format for delivery to consumer electronics. When mastering, choose Rec. 2020 color primaries (the Rec. 2100 standard implies Rec. 2020 color in many apps).”

    Could this be the issue, meaning that I cannot grade HDR on the Apple XDR display for YouTube export? Or do I need to change the meta tags? And do you happen to know whether MaxCLL is the screen’s peak brightness (e.g. 1000 nits) or the brightest pixel in the video (e.g. 700 nits)?

    Thanks so much!

    More details here:

    • Richard Lackey

      Hi Michael, unfortunately I’m not familiar with this configuration, or with setting it up in FCP. From what I’ve read (and I can’t say this from actual experience), I have doubts that the Apple XDR display is suitable as an HDR reference, despite the marketing around it. I also would not rely on desktop GUI monitoring that isn’t driven by a dedicated video output device through a properly calibrated monitor and monitoring signal path. I know this isn’t of any use to you in problem solving your particular setup; unfortunately, I don’t know what might be introducing the inconsistency you are seeing.

  3. Hi,
    I just got an ASUS ProArt Display PA279CV, which has multiple presets: Rec. 709, sRGB, etc. It also has HDR. If I want to grade Rec. 709 footage and keep my display set to HDR, will that result in a bad or risky color grade across most platforms? Should I only use an HDR preset with HDR footage?

    • Richard Lackey

      Hi Tim, how are you using this monitor? Are you using it as a GUI desktop display, or is it fed from a dedicated video output interface like a Blackmagic Decklink card or a Blackmagic Ultrastudio external interface? I would avoid relying on any monitor you are using as a desktop display for color accuracy. You need to feed a separate external monitor with a true Rec.709 video signal from a dedicated video output card or interface, and calibrate the monitor and signal path.

  4. Costadinos Chatzis

    So, in other words, there is no industry standard for HDR yet. Most consumer products are merely “HDR” compatible. Most OLED displays struggle to achieve 1000 or more nits of luminance, good QLED displays are in the 1000-1700 nit range, and none of them can fully utilize the high imaging standards of BT.2020. There are currently no commercially affordable options able to meet BT.2020’s color specifications, and therefore none of those products should be used for HDR reference.

  5. Antonio Carlos

    Hello Richard,

    I just want to thank you for all the info that you have posted on your website. You are doing a great job, and it is really interesting and helpful at the same time.

  6. Hi Richard, great article thanks for sharing.

    Can one argue that 8-bit and BT.2020 should never be mixed? BT.2020 covers nearly 76% of the CIE color space, while Rec.709 only covers about 36%.

    Since you get more than double the hues in each color primary, you need at least 9-bit color to prevent significant banding; colors that appear within Rec.709 will be lost and essentially skipped over when shooting BT.2020.

    Or is it really true that a file can carry a wider gamut of colours (BT.2020) but still be compressed into the same codec without losing anything? My understanding is that if you push a wider gamut into a smaller channel depth, you’re going to lose data one way or another, so using BT.2020 (wider) in an 8-bit codec (already slim-ish) is actually not a good choice; rather, a smaller colour gamut will give you more accurate colours in the same bit depth. Am I correct in saying this?

    To conclude, I’d expect that if you shot content with a low or slim dynamic range, you’re actually far better off using Rec.709 as your colour space, or else BT.2020 will map more extreme colours, effectively making the “steps” between those colours more prominent. Am I wrong?

    I have these questions because there is a LUT I want to use that expects me to record BT.2020 footage, which it will then correct to the Rec.709 color space. However, arguably, shooting in Rec.709 would not only make the workflow easier but also avoid the conversion. What are your thoughts? Is the above true, and should I be recording in Rec.709?

    • Richard Lackey

      Hi Danny, you’re 100% right: the absolute minimum bit depth at which you can encode the wider luminance range and gamut of BT.2020 is 10-bit. An 8-bit per channel encoding can be used for Rec.709 but not BT.2020. It is possible to squeeze a bit more into an 8-bit encoding using a log transfer function, but that comes with problems, especially when combined with aggressive compression. What camera are you shooting with? What are your options?

  7. Dear Richard,
    thanks for your thorough article. Would you mind offering some quick advice?
    I’m looking for a monitor, mostly for color grading and 3D modeling (and occasional photo editing). Given my budget, I’m deciding between a BenQ SW240 and a PD2700U. The SW240 covers 99% Adobe RGB and 95% DCI-P3, but it’s only Full HD, while the PD2700U is 4K and HDR10, but only 70% Adobe RGB/P3.
    I’m leaning towards the 4K for future-proofing my purchase, but I’m not sure if it makes a difference at this monitor size, considering I’ll be using it with my MacBook Pro 15″ (which I believe already covers full DCI-P3).
    Cheers from Spain!

    • Richard Lackey

      Hi Miguel, thanks for the questions! It sounds like you’ve got two different use cases here. For 3D modeling and photo editing, you’ll rely on the output of your desktop display graphics. For video color grading, you’ll want to use a separate dedicated video output interface, such as an UltraStudio from Blackmagic Design, which will give you an actual BT.709 video output, and ideally a display that you can calibrate for video. Calibration requirements are different for computer desktop output vs video output. This video may prove helpful; it’s well worth watching and covers some displays worth considering:

      • Richard Lackey

        Just to add, HDR is another beast entirely, and in my opinion (and that of most colorists) there are still no affordable consumer displays that are suitable for HDR video reference work.

  8. Thank you very much! I finally got some proper answers!
    I just want to ask you which is the right workflow with the X-Rite ColorChecker.
    For example, if I shot a video with a BMPCC 4K in RAW and a Canon EOS R in C-Log, and I have an X-Rite ColorChecker: when I’m in DaVinci Resolve and go to the color match tab, what are the correct settings under source gamma, target gamma and target color space? Considering that I work on a Retina display and would like to export a video that is playable everywhere on the web.

    thanks for your help!

    • Richard Lackey

      Hi Mattia, when using color match, the source gamma should match the encoding of your source video for the clip. You’ll see the Blackmagic options in the dropdown (there is one for Blackmagic Design 4K Film), and there are also the Canon Log options. The target for your purposes should be Rec.709.

  9. Hi Richard,
    I am wondering if you can assist. In Adobe CC, when exporting a sequence using Wraptor DCP, does the export convert sRGB to the XYZ color space? There is a comment above that hints this may be the case. I have been able to find all the other technical settings required for the cinema file specs for the venue, but have not been able to get my head around this one.

  10. Hi Richard,
    I thought you could help me figure out a Grading setup:
    I’m about to grade old 16mm film scanned to 5K. Here are the scan specs from MediaInfo:
    Format : MPEG-4
    Format profile : QuickTime
    Codec ID : qt 2005.03 (qt )
    Overall bit rate mode : Variable
    Overall bit rate : 737 Mb/s
    Writing library : Apple QuickTime

    ID : 1
    Format : ProRes
    Format version : Version 1
    Format profile : 4444
    Codec ID : ap4h
    Duration : 8 min 58 s
    Bit rate mode : Variable
    Bit rate : 735 Mb/s
    Width : 3 840 pixels
    Height : 2 880 pixels
    Display aspect ratio : 4:3
    Frame rate mode : Constant
    Frame rate : 24.000 FPS
    Color space : YUVA
    Chroma subsampling : 4:4:4
    Scan type : Progressive
    Bits/(Pixel*Frame) : 2.769
    Stream size : 46.1 GiB (100%)
    Writing library : Apple
    Color primaries : BT.709
    Transfer characteristics : BT.709
    Matrix coefficients : BT.709

    Unfortunately, being a novice, I didn’t have the material scanned in Rec. 2020.
    However, I’d like to grade for UHD television, and a possible movie theatre screening, too.

    Should I grade in Rec. 2020 (limited to 709) in ACES? Or something else?

    • Richard Lackey

      Hi Glenn, since the file is encoded to Rec.709, it contains no color information beyond the Rec.709 gamut. However, this doesn’t stop you from editing and exporting UHD, it will just be Rec.709 standard dynamic range. In any case, unless you’ve got a full color managed video output pipeline to a calibrated HDR monitor, you can’t grade HDR anyway. If I were you, I’d just stick to Rec.709.

  11. Dear,
    Thank you for this post, it was helpful. I have one question and I hope to get an answer to it. With the new ACES color space introduced by the Academy, I was wondering what the benefit is of grading in a larger color space if in the end we are exporting to 709, which is a smaller spectrum than ACES.

    • Richard Lackey

      Hi Aida, good question, and it all depends on what you need to deliver, how you want to work, and also whether you’re using camera sources that give you sufficient image information and that have an ACES input transform available in the first place.

      The ACES system exists to provide a standardized, ultra wide gamut, scene referred working color space that is not dependent on a particular camera, post production toolset, or a particular output display device or technology. It’s also intended to enable a universal way to archive images with the full extent of their color information, so they can later be repurposed more easily for new display technologies.

      So you can think of this as a standardized container for digital image data that is large enough to accommodate the widest possible image sources (including future camera systems with even better capabilities). The ultimate limit that this container needs to be able to accommodate in terms of color information is actually the limit of human color perception itself. So ACES currently specifies a number of color spaces, for different purposes within the ACES system, that can contain this large amount of possible color information in a scene referred, and standardized way.

      As long as a camera source has an ACES input transform, then image information encoded by that camera can be transformed to an ACES color space. This doesn’t mean you suddenly have extra dynamic range or color information that was never captured. It just means you’ve standardized that image data from a camera manufacturer specific encoding and container into a universal “master” scene referred encoding. This data can then be further manipulated throughout the post production process with a very high degree of precision, combined with other camera image sources that have also been transformed using their respective input transform, and VFX and CG elements that have been rendered for an ACES workflow. It unifies all of these otherwise independent, and differently encoded sources, into a common, scene referred master color encoding.

      This master is used for post, for archiving images, and for creating downstream deliverables for specific display standards. When it comes to preparing images for a particular display output, this requires an ACES output transform, such as BT. 709, or DCI-P3 for example. These are both color spaces limited by the capabilities of the display technology they were created for.

      So, if you work in Rec.709 for example from the very beginning, and throughout post, and create a Rec. 709 master file, the color encoding, and information contained in that master file is limited to what can be displayed on a display made to accept a Rec. 709 input, basically the limits of CRT display technology that we don’t even use anymore. It can be argued that’s not a very good master file, especially if your original camera source image contains far more information than what can be encoded to Rec. 709. However, it also depends on your source. If you’re working with a limited camera source that only ever gives you color information limited to Rec. 709 in the first place, then you don’t have anything to gain by transforming that information into an ACES color space and working with, and storing much larger files. If you’re working with source files from a digital cinema camera however, or any camera that gives you more color information, then there is definitely an advantage to working with, and storing the full color information available.

      You could of course just keep all your source camera media, and go back to that if you need to repurpose, but then, if all your post has been done in Rec. 709, all the color correction work and post processing has to be done all over again for a wider gamut deliverable. The advantage of working in an ACES color space, and grading in HDR to Rec. 2020 is that creating deliverables from that wide gamut master to smaller gamut color spaces, such as Rec. 709 or DCI P3 requires a less intensive “trim pass” to make sure color information and levels are mapped correctly to the deliverable color space while keeping the artistic intent of the grade intact.

      Not everybody is set up correctly to work with ACES though. It requires an HDR reference display, and proper calibration and color management. At the moment, true HDR reference displays still cost $30,000+, so they are out of reach for 99% of the people reading this. Meanwhile, a good OLED TV like the popular LG C9, and now the newer CX, can be calibrated perfectly for Rec.709. You may be able to dabble with HDR grading using a consumer OLED HDR TV, but it’s not a mastering display.

      I hope at least some of that makes some sense, and has maybe helped answer your question.

  12. Marcos Alexandre

    Hello Richard,
    First of all, what a wealth of helpful content!
    Which do you think is best to use for UHD 4K Blu-rays when playing back on an OLED TV, BT.2020 or DCI-P3? I understand 4K BDs are mastered in the ‘Rec. 2020’ container, but it really looks a bit washed out; on the other hand, P3 looks better, but the red colour seems a bit oversaturated. Thanks for any help. You’re such a legend!
    Cheers, Marcos

    • Richard Lackey

      Hi Marcos, which model TV are you using? It should detect that the input is HDR and adjust accordingly. You’re right that UHD Blu-Ray is Rec. 2020, and so Rec. 2020 would be correct, but the TV should automatically handle this.

  13. Vini de Moura

    Hi Richard,

    Thanks for a very informative post. Would you choose better colour accuracy (97.5% DCI-P3) over HDR for photography and daily office tasks?

    Many thanks

    • Richard Lackey

      Hi Vini, thanks for the question. Honestly my experience is with video post production rather than photo printing. For desktop computer work, sRGB is still the most common standard, and work created in sRGB will look how you intend it to others on the web, or on other devices.

  14. Hey Richard, thanks for your fast and precise reply! You are right, it’s a desktop GUI display, so I’ll go with Rec709. The display is a DynaScan LCD with 8-bit color depth; I assume that when the LCD is working in 8-bit, Rec709 would be my choice anyway?

  15. Hey Richard, thank you for the knowledge you share on this website! I am wondering which color space I should use in Davinci Resolve when my video will be shown on a wide gamut (98% NTSC) LCD screen, calibrated in D65 (6500 K). Can you help?

    • Richard Lackey

      Hi Matthias,

      Thanks for the question. The NTSC color gamut is actually a pretty much obsolete frame of reference when it comes to comparing the gamuts of different color spaces, and manufacturers shouldn’t really quote it as any kind of meaningful reference. That said, so is the CIE 1931 chromaticity diagram that I reference, which I use because it’s so common. I should really update my article 🙂 A more meaningful point of reference is the CIE 1976 Uniform Chromaticity Diagram. Anyway, that’s not what you were asking.

      Since the gamut you mention, 98% NTSC, isn’t any kind of standard, the most important question is how your video projects will be delivered and watched. I imagine the answer is likely to fall into the common Rec.709 video color space, so you would calibrate your monitor to Rec.709, with a white point of 6500K as you mention.

      I can help more specifically if you can let me know how your finished videos will be watched? On which kind of display will your audience see your videos?

      Also, how is your monitoring set up? Are you taking a video output from a video interface, like a Blackmagic Design Decklink card or Ultrastudio interface? Does your display have the ability to load a calibration LUT, or are you using a separate LUT box between the output of the video interface and the display? How are you calibrating the display?

      If this display is a desktop GUI display, then unfortunately you’re pretty limited in how accurately you can monitor in any case, and I would just use Rec.709.

      I look forward to hearing back from you, hopefully I can help once I have a bit more information.

  16. Mickael Delgado

    What color space should I use when playing video games in HDR? My TV covers 99% DCI-P3 and 73% Rec. 2020.

    • Richard Lackey

      Thanks for the question. Unfortunately I won’t be much help. I’m not sure when it comes to gaming.

  17. krishna avril

    Hi Richard, I usually work in sRGB; what is the professional choice of color space to work in? I usually do photographic image work. What color space do typical mobile phones or PC monitors have, and will the image change after I finish editing the photo in my sRGB or Rec. 709 color space? I’m not able to find a reasonable answer anywhere. Thanks.

  18. Thank you so much for the effort, I will read it all

  19. Hello Richard, hope you can help me. I am trying to figure out the best way to use color space, so that what I see in my preview window in Resolve is the same color and gamut as on YouTube and Vimeo, when viewed on a computer, laptop, tablet or phone. I’m working with BMPCC4K footage in Resolve on an iMac 2017 (P3-DCI display).

    The closest I have gotten to my goal is with these settings in resolve:

    Input colorspace: Blackmagic design pocket cinema camera 4k film gen4
    output colorspace: SRGB
    limit output gamut to: output color space
    timeline to output luminance mapping: 1000 nits
    timeline to output tone mapping: saturation mapping.

    It is not spot-on yet; it has some luminance shifts and a bit of color shift when uploaded. It is very close to when I play the file in QuickTime on my iMac, but still not perfect.

    Hope you can help me and save me a lot of frustration.

  20. Dear Richard, my question is similar to Peter’s. At home I use a calibrated Eizo monitor (through SDI), and I usually master in Rec. 709. If I have to deliver a DCP (with DaVinci Resolve 15’s Kakadu encoder), I include in the name that it is meant to be screened in the Rec. 709 color space. But if I understand your article correctly, I should actually do it in DCI-P3 (my monitor in theory is capable of covering it). In this case, should I master the shorts/documentaries I work on in two different ways (Rec. 709 for web/television and DCI-P3 for theatrical screening), or is there a more automated method to switch between the two color spaces? What is the usual protocol in these cases? Thanks ahead!

    • Richard Lackey

      Good question. Grading in DCI-P3 usually involves a high-end cinema projector, or a monitor you know 100% meets the standards. To be honest, I wouldn’t make it any more complicated than it needs to be. I would recommend you grade in Rec. 709 for both deliverables and let the Kakadu encoder handle the color space conversion to XYZ for the DCP; it will do that without any input from you. Hope that helps.

  21. Richard Lackey

    Hi Helin, your monitor will show whatever signal is input, so it’s your input to the monitor that matters in this case. If you’re feeding it Rec. 709, that’s what it will display. Calibration of your monitor is another separate matter. What type of monitor? and what is your monitoring path? I’m assuming you are using a video output card for your monitoring?

  22. kishore yadav

    Hi Richard Lackey, there are different types of color spaces like P3, P3 D55, P3 D61 and P3 D65. If I want to grade film, what should the color space of my display/monitor be?

    • Richard Lackey

      P3 is the name of the color space for DCI digital cinema, and the D number is the white point, given as a color temperature in degrees Kelvin. D55 means the white point is 5500K, D61 is 6100K, and D65 is 6500K. All fit within the DCI-P3 gamut. 6500K (D65) is used in Rec. 709 and most other color spaces, while 6300K is the DCI white point used specifically with the DCI-P3 target color gamut.

  23. Radhakrishnan Chakyat

    We have graded our footage in Adobe Premiere Pro CC using a Rec. 709-calibrated 10-bit monitor, but the rendered output looks almost 20% desaturated. We have tried many parameter combinations while rendering, and it’s still not good enough. Any advice?

    • Richard Lackey

      This could be a video levels vs data levels issue. It’s been a while since I’ve looked at the Premiere render settings, since I use Resolve for everything, so unfortunately I might not be much help. Are you on Mac or PC? What codec format are you rendering to? Does the rendered file look correct when played out to the monitor but incorrect when played on the desktop screen, or does it look incorrect on both?


  25. Thanks Richard for this.

    I have a question: if I’m doing a broadcast project in Rec. 709 for a client and they would like to show this film at festivals (projected), do I need to make another copy set to DCI-P3 and compensate the grading? I’ve had issues at festivals with films looking bleached, with lifted blacks, and everything looked just terrible, but the same copy looked just as nice as it should at other screenings.

    • Richard Lackey

      Hi Peter, good question. It depends on the technical requirements for the festival and how they are projecting. Are they asking for a full DCP from you?
