DLSS5

When the RTX 5090 was released I was already very much not in favor of the whole DLSS marketing that went into it and how it painted a very unrealistic picture of the performance compared to the previous generation. Companies obviously make numbers look as good as possible during release announcements. Thankfully most reviewers simply do their own benchmarking, and we get numbers without imaginary frames. But like power connectors going up in flames, Nvidia found a way to double down on stupid.

First, let us get something out of the way. We can make a technical argument that even the images we see on screen with DLSS and other technologies turned off are not "real" and are "interpreted based on instructions". This is technically correct. But we are arguing about semantics. The chance that the characters and scenes look the way the artists intended is extremely high compared to a probabilistic model imagining frames and making new pixels up at random. So yes, my dear keyboard warriors, you are correct on a technical level. It is just that no one cares, and people understand what is meant in this context.

Slop, I choose you!

Two days ago Nvidia announced DLSS 5. God, is it a dumpster fire of "get this thing away from me". Just take a look at their official announcement. They took an amazing game (Resident Evil Requiem) and turned on DLSS 5, making it look like the cheapest AI slop with the most basic LoRA on top of Stable Diffusion that you could generate on a two-generation-old GPU in a few minutes. Just in real time. While ruining a visually stunning and extremely fun triple-A game.

Usually I try to be more nuanced and less opinionated - I'm really good at that, right? RIGHT? - but I very much make an exception here, because this is just insulting to gamers, game developers, and artists.

I think Jensen's quote in this article is fairly accurate: "DLSS 5 is the GPT moment for graphics." It makes random stuff up, and people laugh at you when you pretend it is amazing? (This is obviously not meant as a generalization of all LLM usage.)

We all know IGN. The magazine that tries to please every publisher and company in existence. Where else can you find a review ending on "best game ever, easily game of the year. 8 out of 10!" next to "game crashed during launch and caused my hard drive to explode, but I see potential. 8 out of 10"? So when even IGN starts trolling you, you should realize it is bad.

But Nvidia does not care. They did not care when their 12VHPWR adapters melted and caused fires in homes. They do not care that literally no one asked for what DLSS 5 is doing, nor wants it - at least I have not seen a single person claiming this is a good idea.

This brings me back to the original point I made when the 5090 was announced: Nvidia will not significantly improve hardware in the near future. Nor is it necessary.

They obviously do not care much about the consumer market. The money is in data centers and milking the AI bubble. Obviously memory is scarce, margins in enterprise segments are high, and they have already repurposed production lines from the consumer to the enterprise market.

We are good, thanks.

But what the whole frame gen and DLSS marketing shows me is that there also seems to be a cap on what we can get out of current-generation silicon. If you are not able to ship significantly faster hardware, the easiest way to make numbers go up and make it look like people should buy your new cards is by finding a neat trick or feature. And DLSS is exactly what they need for that. If we see a 60xx series this year, I would not expect a realistic performance improvement, but more generative features.

What this means for gamers is pretty simple. You still do not need to worry too much about upgrades.

Every single point I made back in the old post still stands. Existing hardware is fast enough for gaming. Most games will still be optimized for consoles at 4k / 60fps, so if your hardware is faster than a PlayStation 5 you are good. And with current memory prices and the rumors making their rounds, it seems like there will not be a new generation of consoles before the end of 2027 or 2028. Even with a new generation, games will not immediately be optimized for it. Remember how long ray tracing took before it became a requirement? Switch 2 games mostly run fine on the Switch 1. The shift will be slow.

There might be a publisher or two who will give this a try. But it looks like the gaming community got really good at voting with their wallet, so they will hopefully get appropriate feedback really quickly. Gamers actually got so good at it that some devs started complaining about why people do not just give them money, but actually look into a game and do some research before buying it. Inconsiderate gamers. How dare you check if a multiplayer game has players to form lobbies. Feedback? They are the professionals here! Games should just listen to game journalists who defend delusional devs and predicted Concord, Assassin's Creed Shadows, Avowed, High Guard,... would be amazing and a success.

If there is no way to turn off the AI slop-ification of a game, there is a very easy way to deal with it: do not buy Nvidia. If Miele tried to sell me a fridge that randomly decided to melt my butter, I would simply buy a Siemens or Bosch. Sure, Miele deserves to be considered the premium standard as a brand, but why would I accept my fridge melting butter?

Competition

The competition does not look too good if you go by benchmarks alone, which allows Nvidia to coast by and drop slop features. But that does not mean there is no competition when we look at where most gamers will likely land when buying hardware - read: not the most premium segment.

AMD dropped the ball and does not really play in the high-end market. But their GPUs are excellent if you plan to game on Linux and do not care too much about ray tracing. It is a shame, because the 9070xt(x) was very promising. A higher-end SKU would have been appreciated, but that would be to satisfy enthusiasts, not necessarily a need for gaming. The 9070xt is available for about 650 Euro, which is a great price point, and the card compares very well to an RTX 5060 Ti with 16 GB of memory.

Intel is trying to play in the GPU market and is getting better with every release. Their cards are really good at the MSRP Intel suggests, though sometimes the drivers need a bit more time to cook before they are ready for primetime. But overall they are a good choice on a tight budget.

Time to prepare some tea

It is far too early to see where this goes. I am certainly annoyed just looking at what the GPU can do to an amazing game. And I am sure I do not want this. But at the end of the day, when I play a competitive shooter I simply turn off frame generation, so as long as DLSS 5 can be turned off, we are good.

If it is forced on me, my next card will not be an Nvidia, at least if AMD still makes GPUs by then. I try not to predict the future, but upgrading a 4090 will likely not be worth it for some time.

What will be far more interesting is how some of the less gamer-oriented studios will react. The ones driven mostly by greed and the desire to maximize revenue might try to get away with shipping low-quality assets and having DLSS 5 fix them. We have seen them rely on frame gen in the past to mask performance issues with their engines. So I do not think this is too far-fetched.

For now? Have a tea and enjoy a few games. Resident Evil Requiem looks amazing and is already (I have not finished it yet) on its way to being a contender for game of the year for me. Slay the Spire 2 is tremendous fun, and Silksong is as challenging as you would expect. And all of them run on hardware that is a few generations old.

posted on March 18, 2026, 7:15 p.m. in AI, gaming, hardware, news