Too expensive!

The RTX 5090 was announced for $1999. If the 4090 is anything to go by, realistically you will pay around $2300-$2500. (If you can get your hands on one in the first place.) By any measure $2500 is a lot of money: that is a year of rent in some places in the world, or a perfectly usable car. But what it is not is too expensive.

This is a blog post I feel like writing every other year. When Apple releases a new phone or a new tablet. When Nvidia releases a new GPU. Or when basically any other highly priced luxury good you most likely do not need in your life comes out. While this post focuses on the RTX 5090, most of it applies the exact same way to other tech.

If you are one of the self-proclaimed professional gamers who are always held back by their team (and who would've otherwise made it to pro already), you might want to skip the next four paragraphs (or the entire post). I am not on your level where I can see 2GB of memory in my refresh rate; most of my arguments are based on science, experience, measurable data and a human level of engagement with the medium.

Performance improvement

To get the major marketing gag out of the way first: the 5070 will not perform on equal footing with the 4090. Not even close. But it might perform well enough for you to accept the tradeoffs, and you will see something like 4090 performance for your specific use case. Let us unpack this a bit more.

Nvidia themselves know that the 4080 and 4090 are fast enough for games. There is a reason they mostly show a “1 to 2x performance improvement” over the previous generation. Yes, you might get double the frames. But going from 200 to 400 in a game in which you mostly look at the beautiful scenery will not make a big difference. But how do we get to these impressive numbers?

By faking them. Upscaling, ray reconstruction and DLSS frame generation all do a very good job of pretending you get a ton of performance; especially when you consider that only one in four frames is rendered and the rest is made up by your GPU!
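
To put rough numbers on that, here is a back-of-the-envelope sketch, not Nvidia’s actual pipeline; the one-rendered-to-three-generated ratio is simply the illustration used here:

```python
# Rough sketch of what frame generation does to the number on the fps counter.
# The ratio of generated to rendered frames is an assumption for illustration.

def displayed_fps(rendered_fps: float, generated_per_rendered: int = 3) -> float:
    """Displayed frame rate when each rendered frame is followed by N generated ones."""
    return rendered_fps * (1 + generated_per_rendered)

# 50 natively rendered frames per second show up as ~200 fps on the counter,
# but your inputs are still sampled against the 50 real frames underneath.
print(displayed_fps(50))   # 200.0
print(displayed_fps(100))  # 400.0
```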

Thus, categorising the performance of the 5090 is tricky. If you have a slow paced game and you want to admire the beautiful landscape - crank all of it. It will look gorgeous and you will likely be able to play at 4K ultra. For a fast paced competitive shooter? Turn off literally everything. The erratic behaviour of other people controlling the pixels you try to shoot at does not lend itself well to “make up what the next frame could look like”. Also, good luck seeing anything in current competitive games when everything is cranked to ultra. (What’s up with the amount of ability effects these days?)

Nvidia’s DLSS numbers

Nvidia claimed 80% of gamers with a DLSS-capable card run their games with DLSS on. Which I believe. It is the default setting. And many do not know better. It is that simple. Going into the settings menu is not as common as people believe; I heard many even play with motion blur on. Disgusting.

This number is actually important because it tries to legitimise the claimed performance improvements. If everyone is using it, it means the numbers are based on real world examples, right?

We can make an argument for this. The same way we could argue “all frames are fake” or that an LLM should be called AI. But what we should not forget is that the experience with DLSS is hit or miss and depends on the use case. Which means many GPU comparisons we will see in the future are tricky to interpret.

Should you compare raw performance without any of the enhancements active? That would give you a picture of how two cards compare to each other. But if you are primarily playing the next Civilization in 8K with ray tracing cranked to the max? You really care more about how many frames the RTX 5090 can imagine to get you across the 60-120fps mark when scrolling the map.

Game optimisation

Most games these days are playable cross platform. Which means they have a performance target of a current generation console like the PlayStation 5 Pro. And the game has to look good at this performance target.

This should equal about medium settings on a PC. More often than not games are optimised to run at 60 frames per second. You really do not need a ton of performance to run medium settings. And you certainly do not need lots of the extra features as they are only slowly rolling out.

Remember ray tracing, introduced seven-ish years ago? Indiana Jones and the Great Circle is the first game to actually require your GPU to support this feature.

Try a visual comparison of medium to high to ultra settings. You will surely see the difference. Now start playing the game, move around, do things. It will most likely become far less obvious.

This is one of the problems with comparing visuals. You can always find specific scenes where a certain technology makes a huge difference. The question is how often you encounter a scene like this and how often you have time to actually take the scene in. If enemies start shooting at you from 15 different angles, you most likely do not stand around and think about the street lights reflecting in the rain puddle.

And some of these optimisations come at a noticeable price. Text looking blurry with upscaling when you move quickly is, in my opinion, far more noticeable than half of the dials being set to ultra instead of high. Cyberpunk 2077 is a good example of this.

eSports

If being good at video games is what you do for a living you are likely looking for every single competitive edge you can find. So you certainly run with all the tech off to make sure your frames are pixel perfect and latency is at a minimum.

But there is also a reason most competitive stages are equipped with 25 or 27” 1080p monitors, sometimes 1440p. You usually don’t play competitive at 4K ultra.

Professional eSports players aim for the highest possible frame rate and the least amount of input latency. They are willing to sacrifice image quality and resolution for that. But even then some of the best gamers will not always be able to tell the exact difference after a certain point.

But this is in a professional setting. Casual gamers and people who don’t have multiple screens with different refresh rates side by side rarely sit down in front of a monitor and go “oh, that’s just 120Hz, not 144Hz. Unplayable.” And if they do, it is more often than not a lucky guess. There are various points of diminishing returns. Going from 30Hz to 60Hz to 120Hz is far more noticeable than going from 140Hz to 165Hz (assuming your system can output enough frames).
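
Some quick frame time arithmetic shows why; a minimal sketch, nothing more than 1000 divided by the refresh rate:

```python
# Frame time in milliseconds per refresh rate: the absolute gain shrinks with every step.

def frame_time_ms(hz: float) -> float:
    """How long each frame stays on screen at a given refresh rate."""
    return 1000.0 / hz

steps = [30, 60, 120, 140, 165]
for low, high in zip(steps, steps[1:]):
    gain = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low}Hz -> {high}Hz: {gain:.1f} ms shaved off each frame")

# 30Hz -> 60Hz: 16.7 ms shaved off each frame
# 60Hz -> 120Hz: 8.3 ms shaved off each frame
# 120Hz -> 140Hz: 1.2 ms shaved off each frame
# 140Hz -> 165Hz: 1.1 ms shaved off each frame
```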

Where does this leave us?

The RTX 5090 is a powerhouse and it has a lot going for it. I am not trying to deny this or downplay the raw performance of the card. DisplayPort 2.1b is finally here! But what if you want to build a regular gaming system today? Do not buy one.

For gaming you are most likely fine with a 5070 or 5080. Half the price, maybe less. And you would most likely also be fine with a previous generation card. And not just for now but for a relatively long time. Take a look at the Steam Hardware Survey for example. There is still a stupidly branded 1060 in the top 12 of video cards and a budget 3060 leads the list.

You can build an okay gaming PC for $500 - thank you Intel. You can get competent handhelds for $600-$800. There are so many options out there, you really do not need the RTX 50 series. And if you opt to get one, you likely do not care that much about the price. Especially because this card will not go in a $500 system and be attached to $100 peripherals. Otherwise I really don’t know what you are doing.

Also, just to get this out there: the xx90 series was never a good choice for gaming. It was a choice. It was not a bad one. But you rarely got your money’s worth. I only got this stupid room heater of a 4090 for ML and LLM work. I have a “3.5K?” screen - 3440x1440 at 165Hz. At this resolution it does make a difference compared to the 3080 I had before. But let me tell you - it’s not worth it at all.

No need to be outraged

The outrage I have seen over the price of current generation technology always seems to assume you need this technology. And it is honest outrage, not just some grumbling.

Keynotes usually make it sound like you are missing out if you do not buy the current generation of anything. Look at all the made up frames the RTX 5090 can produce! The camera in the new iPhone! And Apple Intelligence which will surely become intelligent and useful in a few years, but you can use it today to generate images!

Realistically you likely do not need it. And if you do there is a good chance it is a business expense making the price far less of a problem.

I remember the 90s and early 2000s, when you basically had to upgrade every other year or so because hardware was developing so fast. We got 3D accelerators! Then proper 3D GPUs! At some point we could put two CPUs in our PC for two cores! Those times are long gone. Hardware is not moving that fast anymore; big strides have been replaced with incremental improvements. The actual impact is not what keynotes and paid-for reviews make it sound like.

We are living in a good time to just buy previous generation hardware, maybe even two generations old. (Some caveats apply, obviously. Maybe do not buy into a dead or dying platform that leaves no room to upgrade in the future.) But the price of current generation hardware mostly matters to enthusiasts and people who work with the hardware in a professional capacity.

I totally understand wanting to spend money on a hobby. And considering the lifetime you can get out of hardware these days, most purchases are multi-year investments; you will not replace them frequently. Maybe instead of a new GPU, invest in new peripherals. A nice OLED screen is always a good investment and makes even games at mid to high settings look really good.

I felt like writing this post for a simple reason: context. People are emotionally invested in their hobbies. And this is amazing! People also want to spend money on their hobbies. I am in the same boat. I also remember hardware I wanted being too expensive for me in the past. Most reviews focus on what you get from buying new hardware. Many conveniently forget the “do you need it” aspect. Some are simply paid for by a brand or manufacturer, meaning they are ads selling you something.

Having been a tech enthusiast with various financial means for over 30 years (I started early, I am not that old! :p), I have learned a few things and hope to share my perspective. Maybe this post will help some people with their buying decisions, calm some emotions and give some hints on what to look for when buying new hardware.

For most of us this is a hobby. A hobby should not make you angry. Especially not things related to it that do not have a big impact on your experience. Happy sight-seeing and happy fragging everyone.

posted on Jan. 19, 2025, 6:50 p.m. in gaming, hardware, news