Choosing a GPU for DaVinci Resolve can be overwhelming. I’ve listed all the best NVIDIA and AMD GPU options in order of price and performance.
Article Last Updated: June 2020
The GPU is the single most important component in any DaVinci Resolve workstation or laptop. Choosing one is a careful balance between performance needs (and expectations) and budget. I’ve put together a list of the best NVIDIA and AMD GPUs, arranged in order of performance. This hierarchy is pretty much universal, not only for DaVinci Resolve but for any application that relies heavily on GPU processing; a list of top gaming GPUs would probably come out in more or less the same order.
I’ve included both NVIDIA and AMD GPUs in the list, in their respective positions relative to each other, along with the specifications that matter most for DaVinci Resolve.
I’ve added Amazon product links where stock is available; these are affiliate links. I’ve skipped buy links for GPUs that are currently priced higher than a better, newer option.
DaVinci Resolve NVIDIA and AMD GPU Ranking
Included are the latest 20-series and 16-series Turing-architecture NVIDIA GPUs, along with a few relevant 10-series cards for reference. That’s not to say the 10-series aren’t still great GPUs for Resolve, only that if you’re buying or upgrading, you should be looking at the 16-series at least rather than an older 10-series card. By all means grab a used 1080 Ti if you find one at a good price, but consider that an RTX 2060 Super at around $440 is tough to beat. I’d say the RTX 2060 Super is the best performance for the money at the moment.
This list is based on a ton of research, trawling through user forums and various published benchmarks; it is not based on my own testing. I would love to build a test rig and compile data for all these GPUs, but sadly that’s out of reach for me. Take it as a rough guideline with some room for error. Real-world testing might swap a few positions, but I doubt it would be drastically different.
| GPU | Architecture | H.264/H.265 Enc/Dec | Clock | Memory | Cores | Memory Bandwidth | Power | Average Price |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Nvidia Titan RTX | Turing | Y (NVENC) | 1770MHz | 24GB GDDR6 | CUDA Cores: 4608 | 672GB/s | 280W | $2,500 |
| Nvidia GeForce RTX 2080 Ti | Turing | Y (NVENC) | 1635MHz | 11GB GDDR6 | CUDA Cores: 4352 | 616GB/s | 250W | $1,200 |
| Nvidia GeForce RTX 2080 Super | Turing | Y (NVENC) | 1815MHz | 8GB GDDR6 | CUDA Cores: 3072 | 496GB/s | 250W | $1,000 |
| Nvidia GeForce RTX 2080 | Turing | Y (NVENC) | 1710MHz | 8GB GDDR6 | CUDA Cores: 2944 | 448GB/s | 215W | n/a |
| Nvidia GeForce RTX 2070 Super | Turing | Y (NVENC) | 1770MHz | 8GB GDDR6 | CUDA Cores: 2560 | 448GB/s | 215W | $550 |
| AMD Radeon VII | GCN 5th Gen | Y (UVD, VCE) | 1905MHz | 16GB HBM2 | Stream Processors: 3840 | 1000GB/s | 300W | n/a |
| Nvidia Titan X | Pascal | Y (NVENC) | 1480MHz | 12GB GDDR5X | CUDA Cores: 3584 | 480GB/s | 250W | n/a |
| Nvidia GeForce GTX 1080 Ti | Pascal | Y (NVENC) | 1582MHz | 11GB GDDR5X | CUDA Cores: 3584 | 484GB/s | 250W | n/a |
| AMD Radeon RX 5700 XT | RDNA | Y (VCN) | 1905MHz | 8GB GDDR6 | Stream Processors: 2560 | 448GB/s | 225W | $380 |
| Nvidia GeForce RTX 2070 | Turing | Y (NVENC) | 1620MHz | 8GB GDDR6 | CUDA Cores: 2304 | 448GB/s | 215W | $440 |
| Nvidia GeForce RTX 2060 Super | Turing | Y (NVENC) | 1650MHz | 8GB GDDR6 | CUDA Cores: 2176 | 448GB/s | 175W | $440 |
| Nvidia GeForce GTX 1080 | Pascal | Y (NVENC) | 1733MHz | 8GB GDDR5X | CUDA Cores: 2560 | 320GB/s | 180W | n/a |
| AMD Radeon RX 5700 | RDNA | Y (VCN) | 1725MHz | 8GB GDDR6 | Stream Processors: 2304 | 448GB/s | 180W | $330 |
| AMD Radeon RX Vega 64 | GCN 5th Gen | Y (UVD, VCE) | 1546MHz | 8GB HBM2 | Stream Processors: 4096 | 484GB/s | 295W | n/a |
| Nvidia GeForce RTX 2060 | Turing | Y (NVENC) | 1680MHz | 6GB GDDR6 | CUDA Cores: 1920 | 336GB/s | 175W | $340 |
| AMD Radeon RX Vega 56 | GCN 5th Gen | Y (UVD, VCE) | 1471MHz | 8GB HBM2 | Stream Processors: 3584 | 410GB/s | 210W | $340 |
| Nvidia GeForce GTX 1070 Ti | Pascal | Y (NVENC) | 1683MHz | 8GB GDDR5 | CUDA Cores: 2432 | 256GB/s | 180W | n/a |
| Nvidia GeForce GTX 1660 Ti | Turing | Y (NVENC) | 1770MHz | 6GB GDDR6 | CUDA Cores: 1536 | 288GB/s | 120W | $280 |
| Nvidia GeForce GTX 1660 Super | Turing | Y (NVENC) | 1785MHz | 6GB GDDR6 | CUDA Cores: 1408 | 336GB/s | 125W | $250 |
| Nvidia GeForce GTX 1660 | Turing | Y (NVENC) | 1785MHz | 6GB GDDR5 | CUDA Cores: 1408 | 192GB/s | 125W | $205 |
| Nvidia GeForce GTX 1070 | Pascal | Y (NVENC) | 1683MHz | 8GB GDDR5 | CUDA Cores: 1920 | 256GB/s | 150W | n/a |
| AMD Radeon RX 590 | GCN 4th Gen | Y (UVD, VCE) | 1545MHz | 8GB GDDR5 | Stream Processors: 2304 | 256GB/s | 225W | $180 |
| AMD Radeon RX 580 8GB | GCN 4th Gen | Y (UVD, VCE) | 1340MHz | 8GB GDDR5 | Stream Processors: 2304 | 256GB/s | 185W | $170 |
| AMD Radeon RX 570 4GB | GCN 4th Gen | Y (UVD, VCE) | 1244MHz | 4GB GDDR5 | Stream Processors: 2048 | 224GB/s | 150W | $150 |
| Nvidia GeForce GTX 1650 | Turing | Y (NVENC) | 1665MHz | 4GB GDDR5 | CUDA Cores: 896 | 128GB/s | 75W | n/a |
Hardware AVC / H.264 / H.265 Encoding / Decoding in DaVinci Resolve
All GPUs in the list are capable of hardware accelerated H.264 / H.265 encoding and decoding. However, it’s worth noting that on Windows and Linux this hardware acceleration is only available in the paid DaVinci Resolve Studio, while on Mac it’s available even in the free version of Resolve. I state this in all my DaVinci Resolve specification and performance related posts because a $299 DaVinci Resolve Studio license is the best $299 you can spend if you’re working with AVC / H.264 / H.265 source media, as many of us are.
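If you want to confirm that Resolve is actually using your card’s hardware decoder during playback, here’s a minimal sketch, assuming an NVIDIA GPU with current drivers and the `nvidia-smi` tool they ship with (on an AMD card or on Windows you’d use vendor tools or Task Manager’s "Video Decode" graph instead):

```shell
# Hedged sketch: assumes the NVIDIA driver's nvidia-smi tool is on PATH.
# Run this while Resolve is playing back H.264/H.265 media; the "enc" and
# "dec" columns of dmon show NVENC/NVDEC utilization as percentages.
if command -v nvidia-smi >/dev/null 2>&1; then
  # Sample overall utilization (-s u) five times (-c 5).
  nvidia-smi dmon -s u -c 5
else
  echo "nvidia-smi not found - NVIDIA driver tools not installed"
fi
```

Non-zero values in the `dec` column during playback mean the hardware decoder is doing the work; if it stays at 0% while your CPU spikes, Resolve is decoding in software.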
- Learn more about DaVinci Resolve Minimum System Requirements
- Read more about The Best Budget Laptops for DaVinci Resolve In 2020 with Thunderbolt 3
- Read more about using an eGPU in my article The Best DaVinci Resolve eGPU Options
- Find out more about Resolve monitoring in my article The Best Low Budget Resolve Monitoring and Video Color Management
- Read more about storage in my article The Best Storage for Video Editing | Post Workflow Strategy & Backups
Stay in Touch
If you’d like to be notified of new articles and tutorials you can subscribe to my very occasional email updates.