Today, Nvidia showed us the true promise and vision of ray tracing: making virtual environments look as real as possible by simulating how light is cast and reflected off surfaces and objects. It’s not a new concept. Animation studios like Pixar have been working with ray-traced effects for years, but it’s always been easier to use those effects in an animated film because nothing needs to be rendered in real time. Nvidia changed all that when it released its RTX 20-series graphics cards, making real-time ray tracing in video games possible for the first time. But it kind of sucked. Today Nvidia showed off what might be the ray tracing holy grail we’ve been waiting for.
When Nvidia released its first ray tracing graphics cards two years ago, not only could you count the compatible games on one hand, but the lighting effects in those games were mostly limited to simple reflections or shadows. It was underwhelming. Nvidia had spent so much time hyping up its newest architecture at the time and touting the amazing realness of ray tracing, but there wasn’t much concrete evidence, especially in the hands of consumers, to justify that hype.
As time went on, games like Metro Exodus and Control started to incorporate more complex lighting in their scenes, but the performance hit was steep. Even Nvidia’s best cards struggled. A high-end card like the RTX 2080 Ti could handle ray tracing-enabled games at 1080p, but raising the resolution to 4K often tanked the frame rate to below 60 fps, sometimes even with ray tracing off.
But judging by all the improvements Nvidia showed off this morning during its big RTX 30-series announcement, this is the ray tracing we’ve been waiting for over the last two years. RIP AMD ‘Big Navi’. RIP Xbox Series X. RIP PlayStation 5. Nvidia has once again pulled out all the stops to keep its graphics crown, and with surprisingly low prices on the RTX 3080 and RTX 3070, you might be better off asking for a new graphics card this holiday season than a new console.
Bold statement, I know, but Nvidia’s enhanced version of its Marbles video, which it first showed off at its GTC event a few months ago, has completely won me over. The original version had textures and lighting that made you feel the warmth of the room like you were there. I could almost smell the acrylic paint, the mix of fresh and worn wood, and feel the marble’s smoothness just from listening to the gentle clacking noise it made as it traversed down the path. Everything looked as good as, perhaps even better than, fully rendered cut scenes in the best video games. That’s because the original Marbles video was created with fully path-traced, photorealistic, real-time ray tracing graphics. It was running on Nvidia’s highest-end graphics card, a Turing architecture-based Quadro RTX 8000.
Today’s video is different. It’s more like someone shot a marble running a gauntlet of paint cans and brushes with a RED camera than a cut scene in a video game. It’s the same video but run on Nvidia’s Ampere architecture, which powers today’s new RTX 30-series graphics cards. It has more complex lighting effects, all rendered in real time. I can still smell the paint and wood, but now there’s a cascading mixture of light and shadows as the sun sets outside, the hanging lights twinkling and buzzing to life over the apparatus snugly nestled by the window, the marbles’ soft glow as they spiral down the slide…
It’s pure poetry.
Here are a couple of screenshots from the original Marbles video, rendered at 720p and 25 fps, and from the new Marbles video, rendered at 1440p and 30 fps, just to make a quick comparison.
According to Nvidia, the enhanced Marbles version is running at over four times the performance of the original, which is easy to believe when you compare the daytime lighting and shadows of the first version to the reflections, shadows, glow lighting, and other complex lighting effects of the new one. There are hundreds of area lights, including spherical area lights. Nothing is pre-baked (rendered beforehand); everything is computed live. There’s also depth of field in the new version, where the previous one had none.
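As a rough sanity check on that four-times figure, you can compare the raw pixel throughput of the two renders using the resolutions and frame rates above. This only counts pixels per second and ignores the far heavier lighting work in the new version, so it understates the real gain:

```python
# Back-of-the-envelope comparison of the two Marbles renders by pixel throughput.
# Pixel throughput is a crude proxy: it ignores the extra area lights, depth of
# field, and other effects the new version adds on top.

def pixel_throughput(width, height, fps):
    """Pixels rendered per second at a given resolution and frame rate."""
    return width * height * fps

original = pixel_throughput(1280, 720, 25)   # original Marbles: 720p at 25 fps
enhanced = pixel_throughput(2560, 1440, 30)  # enhanced Marbles: 1440p at 30 fps

print(f"{enhanced / original:.1f}x")  # prints "4.8x"
```

Four times the pixels at a slightly higher frame rate works out to 4.8x the raw throughput, before you even account for the heavier per-pixel work.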
It’s all as realistic as what you could shoot in your own home if you had the cinematography skills of Wes Anderson.
And if that’s not impressive enough, take a glance at what Cyberpunk 2077 is going to look like running on Ampere graphics.
Is gorgeous even a word that does this justice? Like the original Marbles, all Cyberpunk 2077 trailers and gameplay up until this point have looked fantastic—but Ampere takes it to a whole other level. The RTX 3090, RTX 3080, and RTX 3070 will be out before the game launches, too.
This is partly because Nvidia has seriously juiced the specs in moving from the Turing architecture to the new Ampere architecture. Nvidia’s flagship Turing card delivered 11 shader teraflops (TFLOPS), 34 RT core TFLOPS, and 89 tensor core TFLOPS, and all of these working in tandem are what make real-time ray tracing possible. With Ampere, there are now 30 shader TFLOPS, 58 RT core TFLOPS, and 238 tensor core TFLOPS, which translates into a massive computing boost for delivering more realistic graphics at higher resolutions and higher frame rates. I’ll say it again: this is the ray tracing we wanted back in 2018.
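To put those quoted numbers in perspective, here is the generation-over-generation uplift they imply. Keep in mind these are Nvidia’s peak-throughput figures, not benchmarks, so treat the ratios as a rough ceiling rather than expected in-game gains:

```python
# Generation-over-generation uplift in Nvidia's quoted peak TFLOPS figures.
# Peak TFLOPS is a rough proxy for performance, not a measured benchmark.

turing = {"shader": 11, "rt": 34, "tensor": 89}   # Turing flagship, as quoted
ampere = {"shader": 30, "rt": 58, "tensor": 238}  # Ampere (RTX 30-series), as quoted

for core in turing:
    uplift = ampere[core] / turing[core]
    print(f"{core}: {uplift:.1f}x")
# prints:
# shader: 2.7x
# rt: 1.7x
# tensor: 2.7x
```

The tensor core jump is the standout: it is what powers features like DLSS, which is a big part of how these cards hit high frame rates at high resolutions with ray tracing on.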
Nvidia took a huge gamble releasing its first RTX cards when it did, and if you bought a new RTX 20-series card two years ago or just bought one last week, I’m sorry to say you may have wasted your money. I hope to get my hands on the RTX 3080 soon and see just how much it blows the RTX 2080 Ti away, especially since the RTX 3080 starts at a much lower price than the RTX 2080 Ti did at launch: $700 compared to $1,000. More power and better graphics for $300 less? $700 is not spare change, but it’s a much more compelling asking price than the previous generation’s.
I have no idea how AMD, or even Intel, is going to top the RTX 30 cards. But quite frankly, given that we haven’t heard much about AMD PC graphics cards in a while and Intel only just confirmed that its discrete GPUs will have ray tracing, I don’t see how either will be able to compete, even if their cards sell for peanuts in comparison. Nvidia has just made the GPU space a lot more interesting.