How do you choose the right video color space for your project? I want to take you through a few basic color spaces and their applications.

A video color space defines a set of RGB primary chromaticities that together determine a color gamut, a color component transfer function (often referred to by the confusing but common term “gamma”), and the chromaticity of a white point. These values define how color information is encoded for a particular video standard. All video is intended to be delivered and watched on some kind of display device: a television, PC or laptop display, tablet, phone, cinema projector, or HDR television. How your video content will be consumed determines the video color space you need to create in and deliver.

Your monitoring pipeline should be calibrated for the color space of the video standard that you will deliver in.

While HDR is sexy and 4K HDR TVs are in many homes around the world, reference-grade HDR monitoring is out of reach for most of us. So I’m going to introduce you to the most common video color spaces you’ll hear about, and the one you’ll use for web, mobile and broadcast television deliverables.

Common Video Color Spaces

If you’re not sure exactly what a video color space is, I explain this in my article What is A Video Color Space? For the most part you still only need to be concerned with one color space, and that’s the standard HDTV color space, Rec. 709 (ITU-R BT.709).

sRGB

sRGB is a display-referred color space originally created for CRT computer monitors, and it has since become the standard for computer graphics and print. It is almost identical to the Rec. 709 video color space: it is based on the same primaries and has the same gamut as Rec. 709, but specifies a different transfer function.

sRGB is still the standard for computer imaging, most consumer and mid-level photo cameras, and home printers. For professional printing and pre-press purposes, Adobe RGB is often used instead; it has an extended gamut that can be reproduced with professional CMYK printing. sRGB doesn’t have anything to do with video in the context of broadcast standards, except that for the most part Rec. 709 video will look fine on an sRGB computer display. There might be a slight difference in brightness due to the difference in transfer function.
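The difference is easy to see in code. Here is a minimal Python sketch of my own (not part of any standard toolchain) that encodes a linear scene value with the Rec. 709 OETF and then decodes it with the sRGB display curve; the mismatch between the two curves is the source of the slight brightness and contrast difference mentioned above.

```python
def rec709_oetf(linear):
    """Rec. 709 opto-electronic transfer function (scene linear -> code value)."""
    if linear < 0.018:
        return 4.5 * linear
    return 1.099 * linear ** 0.45 - 0.099

def srgb_eotf(code):
    """sRGB electro-optical transfer function (code value -> display linear)."""
    if code <= 0.04045:
        return code / 12.92
    return ((code + 0.055) / 1.055) ** 2.4

# Encode a few scene values with Rec. 709, then "display" them through sRGB.
for scene in (0.05, 0.18, 0.5, 0.9):
    encoded = rec709_oetf(scene)
    displayed = srgb_eotf(encoded)
    print(f"scene {scene:.2f} -> encoded {encoded:.3f} -> displayed {displayed:.3f}")
```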

Rec. 709 (ITU-R BT. 709)

[Figure: CIE 1931 xy chromaticity diagram showing the Rec. 709 gamut]

Rec. 709 is the standard camera encoding color space for HDTV with a gamut identical to sRGB. As previously stated, sRGB and Rec. 709 primaries share the same chromaticity values (and therefore the same gamut). However, Rec. 709 differs slightly in its encoding transfer function, and doesn’t actually specify an inverse transfer function for display.

For broadcast encoding, it is defined at 8-bit color depth (code values between 0 and 255), where black is level 16 and white is level 235. These are often referred to as “video levels”.

In the case of 10-bit color depth, which is common in post production, full range code values run from 0 to 1023, but the final output is mapped to the broadcast standard 8-bit 16-235 range when creating common deliverables.
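As a simple illustration of how full range code values map to video (legal) levels, here is a hedged Python sketch of my own that assumes plain linear scaling; real conversions are handled by your editing or grading software.

```python
def full_to_video(code, bits=8):
    """Map a full-range code value to video (legal) range.

    Assumes simple linear scaling: the 8-bit video range is 16-235,
    and the 10-bit video range is 64-940 (the 10-bit equivalents of 16 and 235).
    """
    peak = (1 << bits) - 1     # 255 for 8-bit, 1023 for 10-bit
    black = 16 << (bits - 8)   # 16 or 64
    white = 235 << (bits - 8)  # 235 or 940
    return round(black + (code / peak) * (white - black))

print(full_to_video(0))        # 16  (black)
print(full_to_video(255))      # 235 (white)
print(full_to_video(512, 10))  # ~502, mid grey mapped into the 10-bit video range
```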

Rec. 709 is by far the most common working and delivery color space for most video projects. If you’re creating video for broadcast delivery, or that will be consumed online, then Rec. 709 is most likely what you need to work and monitor in. The Rec. 709 gamut is supported by all common display technologies across many devices. Most computer video players know how to deal with Rec. 709 encoded video, and can display it correctly on an sRGB computer display.

DCI P3

DCI-P3 is a wide gamut video color space defined for digital cinema projection and standardized by SMPTE. It is designed to closely match the full gamut of color motion picture film.

It is generally not a consumer delivery standard and is mostly used for content destined for digital theatrical projection. Notably, however, Apple has adopted P3 color across many of its device displays, and has supported capturing photos and video in the P3 color space since iOS 10.

Most professional reference monitors are able to display the full DCI P3 gamut.

You will often see a white point specified along with the color space, such as P3 D55, P3 D61 or P3 D65. The D number indicates the target white point as a daylight color temperature in kelvin: D55 is 5500K, D61 is 6100K, D65 is 6500K, and the DCI standard white point is approximately 6300K.
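If you’re curious where these daylight white points sit on the chromaticity diagram, the CIE daylight locus gives an approximate xy chromaticity for a given correlated color temperature. The small Python sketch below is my own illustration (note that the standardized D illuminants are actually defined at slightly shifted CCTs, e.g. D65 ≈ 6504K).

```python
def daylight_xy(cct):
    """Approximate CIE 1931 xy chromaticity of a daylight (D-series) white point
    from its correlated color temperature, using the CIE daylight locus."""
    if 4000 <= cct <= 7000:
        x = (-4.6070e9 / cct**3 + 2.9678e6 / cct**2
             + 0.09911e3 / cct + 0.244063)
    elif 7000 < cct <= 25000:
        x = (-2.0064e9 / cct**3 + 1.9018e6 / cct**2
             + 0.24748e3 / cct + 0.237040)
    else:
        raise ValueError("CCT outside the daylight locus range")
    y = -3.000 * x**2 + 2.870 * x - 0.275
    return x, y

for name, cct in [("D55", 5500), ("D61", 6100), ("D65", 6500)]:
    x, y = daylight_xy(cct)
    print(f"{name}: x={x:.4f}, y={y:.4f}")

# The DCI reference white (~6300K) is not on the daylight locus;
# its specified chromaticity is x=0.314, y=0.351.
```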

Rec. 2020

Rec. 2020 defines the color specifications for UHD television, and its wide gamut is also used for HDR delivery. In terms of color gamut, it covers a much larger portion of the CIE chromaticity diagram than Rec. 709 or DCI-P3. The standard defines 10-bit or 12-bit color depth. Few display technologies are fully Rec. 2020 compliant, and as yet it is not a common video color space for the average video creator to be working in.
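To get a rough sense of the relative gamut sizes, here is a small Python sketch of my own that compares the areas of the gamut triangles on the CIE 1931 xy diagram using the published primary chromaticities. Area on the xy diagram is only an approximate comparison, since the diagram is not perceptually uniform.

```python
# Published CIE 1931 xy chromaticities of the R, G and B primaries.
PRIMARIES = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    """Area of a gamut triangle on the xy diagram (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ref = triangle_area(PRIMARIES["Rec. 709"])
for name, prims in PRIMARIES.items():
    area = triangle_area(prims)
    print(f"{name}: xy area {area:.4f} ({area / ref:.2f}x Rec. 709)")
```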

HDR finishing is becoming more common for commercial high-end delivery, but not something the home freelance DP/colorist or video enthusiast will be equipped to undertake for some time to come. The average consumer HDR television does not meet the requirements as a reference display for post production. Some premium OLED HDR televisions such as the LG C9 and LG CX can be calibrated for excellent Rec. 709 SDR reference monitoring, but should not be used for HDR reference.

[Figure: CIE 1931 xy chromaticity diagram showing the Rec. 2020 gamut]

Choosing The Right Video Color Space

The source camera files from any digital cinema camera provide images encoded at high color bit depth with a native color gamut that far exceeds the requirements of the DCI specification, and in most cases meets or exceeds Rec. 2020.

Ideally, you should be working in a sufficiently wide gamut color space to encompass all the expected output standards you need to deliver and have your monitoring calibrated to match.

If theatrical digital cinema delivery is one of the requirements, you should work in DCI-P3 using a calibrated DCI-spec projector, or a monitor that covers the DCI-P3 gamut.

If HD broadcast delivery is the widest gamut color space expected, or computer desktop / mobile / web at any resolution, you should work in Rec. 709.
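If you do grade in a wider gamut and later need a Rec. 709 deliverable, the conversion between two RGB color spaces can be derived from their primaries and white point. The numpy sketch below is my own illustration of that derivation, not a production pipeline; in practice your grading software’s color management (or a proper trim pass) should handle this.

```python
import numpy as np

# CIE 1931 xy chromaticities of the primaries and the D65 white point.
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
D65     = (0.3127, 0.3290)

def rgb_to_xyz_matrix(primaries, white):
    """Build the 3x3 RGB->XYZ matrix from primary and white point chromaticities."""
    # Each primary as an XYZ column (xyY -> XYZ with Y temporarily set to 1).
    m = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    wx, wy = white
    white_xyz = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])
    # Scale each primary column so that R = G = B = 1 maps to the white point.
    scale = np.linalg.solve(m, white_xyz)
    return m * scale

m2020 = rgb_to_xyz_matrix(REC2020, D65)
m709  = rgb_to_xyz_matrix(REC709, D65)

# Linear-light Rec. 2020 RGB -> linear-light Rec. 709 RGB.
rec2020_to_rec709 = np.linalg.inv(m709) @ m2020
np.set_printoptions(precision=4, suppress=True)
print(rec2020_to_rec709)
# Note: out-of-gamut colors produce negative or >1 values and still need gamut
# mapping, and the matrix applies to linear light, not gamma-encoded code values.
```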

Color Management

Whatever color space you are working in, implementing professional color management at each step in your post monitoring pipeline is important. This means setting up your color grading working environment properly and calibrating all your displays.

That said, many of us have to make do as best we can with a consumer monitor or laptop screen. To be perfectly honest, for web delivery this is usually fine when targeting Rec. 709, as long as you’re using a high quality display. However, it’s best to avoid grading on a computer desktop GUI display, because there are so many color management variables, from the OS to the software, that are outside of your control.

I’ve compiled a guide to building your own color managed and calibrated monitoring pipeline based around the excellent LG OLED TVs. They can be perfectly calibrated for Rec. 709 color correction and grading work.

You can add precision calibrated monitoring to your Resolve system for only $2200 – $3000 depending on whether you need to monitor in HD or UHD 4K. Note that in both cases I’m talking about Rec. 709 SDR use only.

I hope this has provided a basic understanding of video color spaces: what they are, the common standards, and how you should be using them.

Further Reading

For a more in-depth look at professional color management, the following paper from www.cinematiccolor.com is well worth reading: Cinematic Color, From Your Monitor to the Big Screen.

Stay in Touch

If you’d like to be notified of new articles and tutorials you can subscribe to my very occasional email updates.

Please don’t hesitate to comment with your questions either here, on YouTube, or hit me up on Twitter. I will always reply.

Comments

  1. Thank you very much! I finally got some proper answers!
    I just want to ask you which is the right workflow with the X-Rite ColorChecker.
    For example, if I shot a video with a BMPCC 4K in RAW and a Canon EOS R in C-Log, and I have an X-Rite ColorChecker, when I’m in DaVinci Resolve and I go to the color match tab, what are the correct settings under source gamma, target gamma and target color space? Considering that I work on a Retina display and I would like to export a video that is playable everywhere on the web.

    thanks for your help!

    • Richard Lackey

      Hi Mattia, when using color match, the source gamma should match the encoding of your source video for the clip, so you’ll see the Blackmagic options in the dropdown (there is one for Blackmagic Design 4K Film), and there are also the Canon log options. The target for your purposes should be Rec. 709.

  2. Hi Richard,
    I am wondering if you can assist. In Adobe CC, when exporting a sequence using Wraptor DCP, does the export convert sRGB to the XYZ color space? There is a comment above that hints this may be the case. I have been able to find all the other technical settings required for the cinema file specs for the venue, but I haven’t been able to get my head around this one.

  3. Hi Richard,
    I thought you could help me figure out a Grading setup:
    I’m about to grade old 16mm film scanned to 5K. Here are the scan specs from MediaInfo:
    Format : MPEG-4
    Format profile : QuickTime
    Codec ID : qt 2005.03 (qt )
    Overall bit rate mode : Variable
    Overall bit rate : 737 Mb/s
    Writing library : Apple QuickTime

    VIDEO
    ID : 1
    Format : ProRes
    Format version : Version 1
    Format profile : 4444
    Codec ID : ap4h
    Duration : 8 min 58 s
    Bit rate mode : Variable
    Bit rate : 735 Mb/s
    Width : 3 840 pixels
    Height : 2 880 pixels
    Display aspect ratio : 4:3
    Frame rate mode : Constant
    Frame rate : 24.000 FPS
    Color space : YUVA
    Chroma subsampling : 4:4:4
    Scan type : Progressive
    Bits/(Pixel*Frame) : 2.769
    Stream size : 46.1 GiB (100%)
    Writing library : Apple
    Color primaries : BT.709
    Transfer characteristics : BT.709
    Matrix coefficients : BT.709

    Unfortunately, being a novice, I didn’t have the material scanned in Rec. 2020.
    However, I’d like to grade for UHD television, and possibly a movie theatre screening, too.

    Should I grade in Rec. 2020 (limited to 709) ACES? Or something else?

    • Richard Lackey

      Hi Glenn, since the file is encoded to Rec.709, it contains no color information beyond the Rec.709 gamut. However, this doesn’t stop you from editing and exporting UHD, it will just be Rec.709 standard dynamic range. In any case, unless you’ve got a full color managed video output pipeline to a calibrated HDR monitor, you can’t grade HDR anyway. If I were you, I’d just stick to Rec.709.

  4. Dear,
    Thank you for this post, it was helpful. I have one question and I hope to get an answer to it. With the new ACES color space introduced by the Academy, I was wondering what the benefits are of grading in a larger color space if in the end we are exporting to Rec. 709, which is a smaller gamut than ACES.

    • Richard Lackey

      Hi Aida, good question, and it all depends on what you need to deliver, how you want to work, and also whether you’re using camera sources that give you sufficient image information and that have an ACES input transform available in the first place.

      The ACES system exists to provide a standardized, ultra wide gamut, scene referred working color space that is not dependent on a particular camera, post production toolset, or a particular output display device or technology. It’s also intended to enable a universal way to archive images with the full extent of their color information, so they can later be repurposed more easily for new display technologies.

      So you can think of this as a standardized container for digital image data that is large enough to accommodate the widest possible image sources (including future camera systems with even better capabilities). The ultimate limit that this container needs to be able to accommodate in terms of color information is actually the limit of human color perception itself. So ACES currently specifies a number of color spaces, for different purposes within the ACES system, that can contain this large amount of possible color information in a scene referred, and standardized way.

      As long as a camera source has an ACES input transform, then image information encoded by that camera can be transformed to an ACES color space. This doesn’t mean you suddenly have extra dynamic range or color information that was never captured. It just means you’ve standardized that image data from a camera manufacturer specific encoding and container into a universal “master” scene referred encoding. This data can then be further manipulated throughout the post production process with a very high degree of precision, combined with other camera image sources that have also been transformed using their respective input transform, and VFX and CG elements that have been rendered for an ACES workflow. It unifies all of these otherwise independent, and differently encoded sources, into a common, scene referred master color encoding.

      This master is used for post, for archiving images, and for creating downstream deliverables for specific display standards. When it comes to preparing images for a particular display output, this requires an ACES output transform, such as BT. 709, or DCI-P3 for example. These are both color spaces limited by the capabilities of the display technology they were created for.

      So, if you work in Rec.709 for example from the very beginning, and throughout post, and create a Rec. 709 master file, the color encoding, and information contained in that master file is limited to what can be displayed on a display made to accept a Rec. 709 input, basically the limits of CRT display technology that we don’t even use anymore. It can be argued that’s not a very good master file, especially if your original camera source image contains far more information than what can be encoded to Rec. 709. However, it also depends on your source. If you’re working with a limited camera source that only ever gives you color information limited to Rec. 709 in the first place, then you don’t have anything to gain by transforming that information into an ACES color space and working with, and storing much larger files. If you’re working with source files from a digital cinema camera however, or any camera that gives you more color information, then there is definitely an advantage to working with, and storing the full color information available.

      You could of course just keep all your source camera media, and go back to that if you need to repurpose, but then, if all your post has been done in Rec. 709, all the color correction work and post processing has to be done all over again for a wider gamut deliverable. The advantage of working in an ACES color space, and grading in HDR to Rec. 2020 is that creating deliverables from that wide gamut master to smaller gamut color spaces, such as Rec. 709 or DCI P3 requires a less intensive “trim pass” to make sure color information and levels are mapped correctly to the deliverable color space while keeping the artistic intent of the grade intact.

      Not everybody is set up correctly to work with ACES though. It requires an HDR reference display, and proper calibration and color management. At the moment, true HDR reference displays still cost $30,000+, so they’re out of reach for 99% of the people reading this. On the other hand, a good OLED TV like the popular LG C9, and now the newer CX, can be calibrated perfectly for Rec. 709. You may be able to dabble with HDR grading using a consumer OLED HDR TV, but it’s not a mastering display.

      I hope at least some of that makes some sense, and has maybe helped answer your question.

  5. Marcos Alexandre

    Hello Richard,
    First of all, what a wealth of helpful content!
    Which do you think is best to use for UHD 4K Blu-rays when playing back on an OLED TV, BT.2020 or DCI-P3? I understand 4K BDs are mastered using the Rec. 2020 container, but it really looks a bit washed out; on the other hand, P3 looks better, BUT the red colour seems a bit over-saturated. Thanks for any help. You’re such a legend!
    Cheers, Marcos

    • Richard Lackey

      Hi Marcos, which model TV are you using? It should detect that the input is HDR and adjust accordingly. You’re right that UHD Blu-Ray is Rec. 2020, and so Rec. 2020 would be correct, but the TV should automatically handle this.

  6. Vini de Moura

    Hi Richard,

    Thanks for a very informative post. Would you choose better colour accuracy (97.5% DCI-P3) over HDR for photography and daily office tasks?

    Many thanks

    • Richard Lackey

      Hi Vini, thanks for the question. Honestly, my experience is with video post production rather than photo printing. For desktop computer work, sRGB is still the most common standard, and work created in sRGB will look the way you intend to others on the web, or on other devices.

  7. Hey Richard, thanks for your fast and precise reply! You are right, it’s a desktop GUI display so I’ll go with Rec. 709. The display is a Dynascan LCD with 8-bit color depth. I assume that when the LCD is working in 8-bit, Rec. 709 would be my choice anyway?

  8. Hey Richard, thank you for the knowledge you share on this website! I am wondering which color space I should use in DaVinci Resolve when my video will be shown on a wide gamut (98% NTSC) LCD screen, calibrated to D65 (6500K). Can you help?

    • Richard Lackey

      Hi Matthias,

      Thanks for the question. The NTSC color gamut is actually a pretty much obsolete frame of reference when it comes to comparing the gamuts of different color spaces, and manufacturers shouldn’t really quote it as any kind of meaningful reference. Having said that, so is the CIE 1931 chromaticity diagram that I refer to, which I use because it’s so common; a more meaningful point of reference is the CIE 1976 uniform chromaticity diagram. I should really update my article 🙂 Anyway, that’s not what you were asking.

      Since the gamut you mention, 98% NTSC, isn’t any kind of delivery standard, the most important question is how will your video projects be delivered and watched? I imagine the answer is likely to fall into the common Rec. 709 video color space, so you would calibrate your monitor to Rec. 709, with a white point of 6500K as you mention.

      I can help more specifically if you can let me know how your finished videos will be watched? On which kind of display will your audience see your videos?

      Also, how is your monitoring set up? Are you taking a video output from a video interface, like a Blackmagic Design Decklink card or Ultrastudio interface? Does your display have the ability to load a calibration LUT, or are you using a separate LUT box between the output of the video interface and the display? How are you calibrating the display?

      If this display is a desktop GUI display, then unfortunately you’re pretty limited in how accurately you can monitor in any case, and I would just use Rec.709.

      I look forward to hearing back from you, hopefully I can help once I have a bit more information.

  9. Mickael Delgado

    What color space should I use when playing video games in HDR? My TV covers 99% DCI-P3 and 73% Rec. 2020.

    • Richard Lackey

      Thanks for the question. Unfortunately I won’t be much help. I’m not sure when it comes to gaming.

  10. krishna avril

    Hi Richard, I usually work in sRGB. What is the professional color space to work in? I usually do photographic image work, and I’d like to know what color space typical mobile phones or PC monitors have. Will the image change after I finish editing a photo in the sRGB or Rec. 709 color space? I’m not able to find a reasonable answer anywhere… thanks.

  11. Thank you so much for the effort, I will read it all

  12. Hello Richard, I hope you can help me. I am trying to figure out the best way to handle color space so that what I see in my preview window in Resolve is also the same color and gamut as on YouTube and Vimeo when viewed on a computer, laptop, tablet or phone. I’m working with BMPCC 4K footage in Resolve on a 2017 iMac (P3-DCI display).

    The closest I have gotten to my goal is with these settings in resolve:

    Input color space: Blackmagic Design Pocket Cinema Camera 4K Film Gen 4
    Output color space: sRGB
    Limit output gamut to: output color space
    Timeline to output luminance mapping: 1000 nits
    Timeline to output tone mapping: saturation mapping

    It is not spot on yet; it has some luminance shifts and a bit of a color shift when uploaded. It is very close to how the file looks when I play it in QuickTime on my iMac, but still not perfect.

    Hope you can help me and save me a lot of frustration.

  13. Dear Richard, my question is similar to Peter’s. At home I use a calibrated Eizo monitor (through SDI), and I’m usually mastering in Rec. 709. If I have to deliver a DCP (with DaVinci Resolve 15’s Kakadu encoder), I include in the name that it is meant to be screened in the Rec. 709 color space. But if I understand your article correctly, I should actually do it in DCI-P3 (in theory my monitor is capable of covering it). In this case, should I master the shorts/documentaries that I work on in two different ways (Rec. 709 for web/television and DCI-P3 for theatrical screening), or is there a more automated method to switch between the two color spaces? What is the usual protocol in these cases? Thanks ahead!

    • Richard Lackey

      Good question. Grading in DCI-P3 usually involves a high end cinema projector, or a monitor you know 100% meets the standard. To be honest, I wouldn’t make it any more complicated than it needs to be. I would recommend you grade in Rec. 709 for both deliverables and let the Kakadu encoder handle the color space conversion to XYZ for the DCP. It will do that without any input from you. Hope that helps.

  14. Richard Lackey

    Hi Helin, your monitor will show whatever signal is input, so it’s your input to the monitor that matters in this case. If you’re feeding it Rec. 709, that’s what it will display. Calibration of your monitor is another separate matter. What type of monitor? and what is your monitoring path? I’m assuming you are using a video output card for your monitoring?

  15. kishore yadav

    Hi Richard Lackey, there are different types of color spaces like P3, P3 D55, P3 D61 and P3 D65. If I want to grade a film, what color space should my display/monitor be?

    • Richard Lackey

      P3 is the name of the color space for DCI digital cinema, and the D number is the white point given as a color temperature in kelvin. D55 means the white point is 5500K, D61 is 6100K and D65 is 6500K. All fit within the DCI-P3 gamut. D65 (6500K) is used in Rec. 709 and most other color spaces, while 6300K is the DCI white used specifically with the DCI-P3 target color gamut.

  16. Radhakrishnan Chakyat

    We have graded our footage in Adobe Premiere Pro CC using a Rec. 709 calibrated 10-bit monitor, but the rendered output looks almost 20% de-saturated. We have tried many parameter combinations while rendering and it’s still not good enough. Any advice?

    • Richard Lackey

      This could be a video levels vs data levels issue. It’s been a while since I’ve looked at the Premiere render settings since I use Resolve for everything, so unfortunately I might not be much help. Are you on Mac or PC? What codec format are you rendering to? Does the rendered file look correct when played out to the monitor but incorrect when played on the desktop screen, or does it look incorrect on both?

  18. Thanks Richard for this.

    I have a question: if I’m doing a broadcast project in Rec. 709 for a client and they would like to show the film in festivals (projected), do I need to make another copy set to DCI-P3 and compensate the grading? I’ve had issues with films at festivals looking bleached, with lifted blacks, where everything looked just terrible, but the same copy looked as nice as it should at other screenings.

    • Richard Lackey

      Hi Peter, good question. It depends on the technical requirements for the festival and how they are projecting. Are they asking for a full DCP from you?
