RTX vs GTX: Navigating Nvidia’s Graphics Card Lineup

If you are not up to date with the latest hardware news, or have only recently become interested in building your own PC and are thinking about getting an Nvidia graphics card, then you have no doubt noticed that the company offers two different types of GPUs: RTX and GTX.

So, what does this all mean, what is the difference between the GTX and RTX models, and which one should you choose?

RTX vs GTX: Which Nvidia GeForce Graphics Card Is Better?

We will answer all these questions, so we recommend that you read the article to the end!

Nvidia GeForce Graphics Cards

All of Nvidia's gaming GPUs belong to its GeForce brand, which was born in 1999 with the release of the GeForce 256. Since then, the company has released hundreds of different graphics cards, culminating in three recent lineups: the GeForce 20 series, released in 2018, the GeForce 16 series, released in 2019, and the GeForce 30 series, released in 2020.

As of today, the GeForce 20 and GeForce 30 series consist exclusively of RTX GPUs, while the GeForce 16 series consists of GTX graphics cards. So what do all these letters mean? In fact, neither GTX nor RTX is an acronym, and neither has a specific meaning as such. They exist purely for marketing purposes.

Nvidia has used several similar two- and three-letter designations to give users a general idea of what kind of performance each GPU has to offer.

For example, Nvidia has used designations such as GT, GTS, GTX, and many others over the years, but only GTX and the newer RTX have survived to this day.

Difference Between Nvidia GeForce GTX and RTX

The Nvidia RTX line of graphics cards has two main hardware differences from the GTX series:

  • RT cores, which provide hardware acceleration for ray tracing calculations;
  • Tensor cores, which accelerate AI and deep learning workloads.

RT cores and tensor cores are separate units inside the GPU die, and they are present only on the newer RTX models. GTX-series graphics cards have no such units.

In fact, the presence of these cores is a genuine breakthrough in the graphics accelerator market, since neither Nvidia nor its competitor AMD had anything like them before. In the future, such units will most likely become standard in graphics cards from both manufacturers. Indirectly, this is confirmed by the fact that support for hardware-accelerated ray tracing has already been announced for the next-generation game consoles.

The presence of RT and tensor cores opens up a number of new possibilities. RT cores can significantly speed up the calculations involved in ray tracing, which lets game developers create more realistic, dynamic lighting that changes quickly in response to game events. Ray tracing performed by the RT cores can also be used for effects such as reflections, refractions, shadows, and depth of field.

It should be noted that ray tracing can also be used on graphics cards without RT cores. Nvidia even released a driver update that unlocks this feature on GTX 10-series graphics cards. But without RT cores, tracing takes too much performance away from the main GPU units, which causes FPS to drop sharply. Playing in this mode is hardly practical; it is more a demonstration of the technology itself.
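
As a rough illustration of how a game discovers this support, the sketch below queries DirectX 12's DXR feature check. This is a minimal example using the public D3D12 API with error handling pared down; note that GTX cards with the fallback driver can still report Tier 1.0 even though they lack RT cores.

```cpp
// Minimal sketch: query DirectX Raytracing (DXR) support via D3D12.
// Cards without RT cores may still report Tier 1.0 via the driver
// fallback, but performance will be far lower, as described above.
// Build on Windows and link with d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter (error handling simplified).
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::puts("Ray tracing supported (DXR Tier 1.0 or higher)");
    else
        std::puts("Ray tracing not supported on this GPU/driver");
    return 0;
}
```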

Tensor cores, in turn, are designed to speed up calculations related to artificial intelligence and deep learning, and not only in applied or scientific tasks. They can also be used in computer games: Nvidia's frame-upscaling technology, DLSS, relies on them. With DLSS you can significantly improve image quality in games without consuming the processing power of the GPU's main shader units.
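
To make the tensor core's job concrete: at the hardware level, each core executes a small fused matrix multiply-accumulate, D = A×B + C, on 4×4 tiles (FP16 inputs with FP32 accumulation on Turing). Below is a plain C++ sketch of the equivalent arithmetic, purely to illustrate the operation a single tensor core completes in one hardware step.

```cpp
// Illustrative only: the 4x4 fused multiply-accumulate (D = A*B + C)
// that a single tensor core performs as one hardware operation.
// On Turing the inputs are FP16 with FP32 accumulation; plain float
// is used here for readability.
#include <array>

using Mat4 = std::array<std::array<float, 4>, 4>;

Mat4 tensor_core_fma(const Mat4& A, const Mat4& B, const Mat4& C) {
    Mat4 D{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            float acc = C[i][j];            // start from the addend
            for (int k = 0; k < 4; ++k)
                acc += A[i][k] * B[k][j];   // multiply-accumulate
            D[i][j] = acc;
        }
    return D;
}
```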

In addition, RTX cards include a number of smaller changes that also distinguish them from previous-generation graphics cards.

For example, RTX graphics cards received:

  • Separate integer (INT) cores for simultaneous execution of integer and floating-point operations;
  • Support for GDDR6 video memory;
  • DisplayPort 1.4a support;
  • Support for the VirtualLink standard, which drives VR headsets through a USB Type-C port;
  • Support for the NVLink bridge, which replaces the SLI interface;
  • Improved support for the DirectX 12 and Vulkan APIs;
  • Improved overclocking and voltage-management functions.

What Are Nvidia RT Cores?

As mentioned above, RT cores are GPU cores dedicated exclusively to real-time ray tracing.

So what does ray tracing do for video game graphics? The technology allows for more realistic lighting and reflections. This is achieved by tracing the path of light rays backward from the camera into the scene, which lets the GPU produce much more realistic simulations of how light interacts with the environment. Ray tracing is still possible on GPUs without RT cores, but the performance is simply terrible, even on former flagship models like the GTX 1080 Ti.
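
To make "tracing the ray's path backward" concrete, here is a minimal, self-contained sketch: one camera ray is tested against a single sphere, and the hit point is shaded with a simple diffuse term. A real renderer fires millions of such rays per frame, and that intersection workload is exactly what RT cores accelerate in hardware.

```cpp
// Minimal backward ray-tracing sketch: one camera ray, one sphere,
// simple diffuse (Lambertian) shading at the hit point.
#include <cmath>
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) { float l = std::sqrt(dot(v, v)); return v * (1.0f / l); }

int main() {
    Vec3 origin{0, 0, 0};                 // camera position
    Vec3 dir = normalize({0, 0, -1});     // ray fired backward from the eye
    Vec3 center{0, 0, -5};                // sphere in front of the camera
    float radius = 1.0f;

    // Ray/sphere intersection: solve t^2 + 2bt + c = 0 along the ray.
    Vec3 oc = origin - center;
    float b = dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - c;
    if (disc < 0) { std::puts("miss: no geometry hit"); return 0; }

    float t = -b - std::sqrt(disc);       // nearest hit distance
    Vec3 hit = origin + dir * t;
    Vec3 normal = normalize(hit - center);
    Vec3 lightDir = normalize({1, 1, 1}); // direction toward the light
    float shade = std::fmax(0.0f, dot(normal, lightDir));
    std::printf("hit at t=%.2f, diffuse intensity=%.2f\n", t, shade);
    return 0;
}
```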

Speaking of performance, real-time ray tracing has a big impact on frame rates even on RTX GPUs, which inevitably raises the question: is this technology even worth using at all?

Real-time ray tracing is a major advancement in gaming that will dramatically improve video game graphics in the coming years. Right now, however, the hardware is not powerful enough, and developers are not yet fully exploiting the feature's potential.

What Are Nvidia Tensor Cores?

While ray tracing is the headline feature of the RTX 20 and 30 series GPUs, the Turing architecture also introduced another major feature to the mainstream GeForce lineup: advanced deep learning capabilities, made possible by dedicated tensor cores.

These cores were first introduced in 2017 with Nvidia's Volta GPUs; however, no gaming graphics cards were based on that architecture, so the tensor cores in the Turing models are actually second-generation tensor cores.

In terms of games, deep learning has one main application: Deep Learning Super Sampling, or DLSS for short, a new anti-aliasing and upscaling technique. So how exactly does DLSS work, and is it better than conventional anti-aliasing methods?

DLSS uses deep learning models to reconstruct detail and upscale an image to a higher resolution, thereby making it sharper and reducing aliasing artifacts. These models are trained on Nvidia's supercomputers and then run on the graphics card's tensor cores.
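
The per-frame flow looks roughly like the sketch below. The NeuralUpscaler type and its evaluate() method are hypothetical stand-ins, not the real DLSS SDK; the point is the data flow: a low-resolution frame plus depth and motion vectors go in, and a higher-resolution frame comes out, with the network inference running on the tensor cores rather than the shader units.

```cpp
// Conceptual sketch of a DLSS-style per-frame data flow.
// NeuralUpscaler, Frame, and evaluate() are hypothetical stand-ins
// for the real DLSS SDK; only the data flow is illustrated here.
#include <cstdint>
#include <vector>

struct Frame {
    int width, height;
    std::vector<uint32_t> color;     // RGBA pixels
};

struct FrameInputs {
    Frame lowResColor;               // e.g. rendered internally at 1920x1080
    std::vector<float> depth;        // per-pixel depth buffer
    std::vector<float> motionVecs;   // per-pixel motion vectors (x, y pairs)
};

class NeuralUpscaler {
public:
    // The trained network (built offline on Nvidia supercomputers)
    // would run on the tensor cores, leaving the shader units free.
    Frame evaluate(const FrameInputs& in, int outW, int outH) {
        Frame out{outW, outH, std::vector<uint32_t>(size_t(outW) * outH)};
        // ... tensor-core inference would reconstruct detail here ...
        return out;
    }
};

int main() {
    NeuralUpscaler dlss;
    FrameInputs inputs{{1920, 1080, {}}, {}, {}};
    // The game renders at 1080p; the upscaler outputs a 4K frame.
    Frame output = dlss.evaluate(inputs, 3840, 2160);
    return output.width == 3840 ? 0 : 1;
}
```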

DLSS provides a sharper image while demanding less of the hardware than most other anti-aliasing methods. What's more, the technology can noticeably improve performance when ray tracing is enabled, which is welcome given how heavy that feature's performance cost is.

Which Is Better: GeForce GTX or RTX?

When choosing a graphics card, the question naturally arises: which is better, GTX or RTX? The answer is fairly obvious: RTX graphics cards are better than GTX in almost every way and offer unique features that GTX cards lack. So if your budget allows, RTX is the better buy.

But keep in mind that the GeForce RTX 2060, the most affordable card in the entire RTX line, handles ray tracing rather poorly. With the feature enabled, the frame rate drops sharply enough to make play uncomfortable, even at Full HD resolution.

So for a comfortable ray-traced gaming experience, you need at least a GeForce RTX 2070, or an even more powerful graphics card.

Conclusion

Well, it's time to sum things up: Nvidia introduced the RTX designation mainly for marketing purposes, to make the Turing 20-series GPUs look like a bigger upgrade than they really are.

Of course, the RTX models are equipped with impressive new hardware that will only reach its full potential in the coming years, and in terms of pure performance, the latest Ampere-based graphics cards are well ahead of the older Pascal-based GTX GPUs that sold at roughly the same price.

All things considered, we wouldn't say that RTX GPUs are worth buying just for ray tracing and DLSS, as raw performance should always come first, especially if you want the most bang for your buck. On the other hand, these technologies will keep developing in the near future, and in a couple of years GTX graphics cards will be frankly obsolete.
