AI Benchmark RTX Graphics Card – RTX2060 VS RTX3090

After searching for a comprehensive benchmarking tool for Nvidia RTX graphics cards, I came across AI Benchmark – click here for the link.

AI Benchmark is also available for benchmarking smartphones. I’ve tested it on the two RTX GPUs I have access to: an RTX 2060 in a Lenovo Legion 5 laptop and an RTX 3090 in an AMD Ryzen desktop machine. I use both GPUs mainly for AI/machine learning projects, as well as ray tracing and graphics rendering. To run this test on Ubuntu 20.04 (the Linux distro I used), you only need Python and pip installed, along with the Nvidia graphics driver and CUDA.
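Running the benchmark takes only a few lines of Python. The sketch below assumes the ai-benchmark package from PyPI (installed with `pip install ai-benchmark`) and a working TensorFlow + CUDA setup; the result attribute names are as documented by the package, but treat this as a minimal sketch rather than a definitive script:

```python
# Minimal sketch of running AI Benchmark on a CUDA-capable GPU.
# Assumes: pip install ai-benchmark (plus TensorFlow with GPU support).
from ai_benchmark import AIBenchmark

benchmark = AIBenchmark()
# run() executes both the inference and training test suites
# and prints per-test results as it goes.
results = benchmark.run()
print(results.ai_score)  # the combined Device AI Score
```

If you only want one half of the suite, the package also exposes `run_inference()` and `run_training()` for running those tests separately.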

Nvidia GeForce RTX 2060 Specs

TU106 graphics processor, 1920 CUDA cores, 120 TMUs, 48 ROPs, 6 GB GDDR6 VRAM, 192-bit bus width

Nvidia GeForce RTX 3090 Specs

GA102 graphics processor, 10496 CUDA cores, 328 TMUs, 112 ROPs, 24 GB GDDR6X VRAM, 384-bit bus width

Results: [tests run on Ubuntu 20.04]

RTX 2060

Device Inference Score: 942
Device Training Score: 952
Device AI Score: 1894

RTX 3090

Device Inference Score: 19624
Device Training Score: 24502
Device AI Score: 44126

Based on the three device scores above, the RTX 3090 is roughly 21, 26, and 23 times faster than the RTX 2060 on inference, training, and overall AI score respectively.
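The speedup figures are just the ratio of the two cards' scores in each category; a quick check in Python:

```python
# Sanity-check the speedups using the scores reported above.
rtx2060 = {"inference": 942, "training": 952, "ai": 1894}
rtx3090 = {"inference": 19624, "training": 24502, "ai": 44126}

for key in rtx2060:
    speedup = rtx3090[key] / rtx2060[key]
    print(f"{key}: {speedup:.1f}x")
# inference: 20.8x, training: 25.7x, ai: 23.3x
```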


Screenshots:

RTX 3090

Phone benchmark test result: [Google Pixel 4 XL]

This entry was posted in Benchmark, Computer, GameEngine, General, Machine Learning, Nvidia. Bookmark the permalink.