
With almost every major game cross-released on consoles and PCs these days, it can often be tough to know what the differences are between versions. But one is always present: The need for PC users to tweak their graphical settings for the best blend of visuals and performance. Doing this well can be tricky and time-consuming, and involve a lot of trial and error that probably interests you less than playing the actual game you purchased. With its new GeForce Experience application, however, Nvidia is trying to change all that.
NVIDIA rolled out the DLSS 4.5 update at CES last week, adding 2nd Gen Transformer-based Super Resolution technology for all RTX GPUs. The performance scaling varies wildly across the older (RTX 20/RTX 30) and newer (RTX 40/RTX 50) GeForce RTX lineups. We tested NVIDIA’s next-gen upscaling solution across Cyberpunk 2077, Black Myth: Wukong, Oblivion Remastered, and KCD 2.
I've been surprised by this; the difference between 4 and 4.5 is very noticeable. It has almost completely removed that weird dark ghosting you'd get in foggy games like Silent Hill 2... and Cyberpunk mixed with a high-res texture pack is jaw-dropping at Ultra 4K.
Also, if anyone doesn't know about it, I recommend DLSS Swapper; it lets you swap the latest DLSS version into older games.
Quite amazing. But this probably means devs will depend on AI even more for their supposed optimizations lol.
no offense to AMD, but this sort of stuff shows that they are always going to be playing catchup. I guess Nintendo can take advantage of some of these features.
"Better than native."
Native 4K in nearly all games nowadays is actually native resolution with forced temporal anti-aliasing.
TAA smears and blurs frames together to soften jagged edges.
Of course DLSS makes games look "better than native" because native alone without any competent AA methods makes games look horrible.
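The comments above describe TAA's core trade-off: blending frames together hides jaggies but smears detail. A minimal, purely illustrative sketch of that temporal accumulation (real TAA also does motion-vector reprojection, jittered sampling, and history clamping, none of which is shown here):

```python
# Illustrative-only sketch of the exponential history blend at the
# heart of TAA. All values and the alpha choice are assumptions,
# not taken from any actual engine.

def taa_blend(history: float, current: float, alpha: float = 0.1) -> float:
    """Blend the new frame into the accumulated history.

    A small alpha keeps most of the previous frames, which smooths
    jagged edges but also smears detail across frames -- the blur
    (and ghosting) people complain about.
    """
    return alpha * current + (1.0 - alpha) * history

# A bright pixel (1.0) that suddenly goes dark (0.0) fades out over
# several frames instead of switching instantly -- a ghosting trail:
value = 1.0
trail = []
for _ in range(4):
    value = taa_blend(value, 0.0)
    trail.append(round(value, 3))
# trail -> [0.9, 0.81, 0.729, 0.656]
```

With alpha = 0.1, old frame data decays by only 10% per frame, which is exactly why fast-moving or disappearing detail leaves a visible smear.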
FSR 4 was a substantial improvement to AMD's upscaling solution. It reduces ghosting and improves fine-mesh retention and particle effects. In most cases it delivers visual quality similar to DLSS 4's CNN model, but slightly worse than the newer transformer model.
Since FSR is open-source and Nvidia's DLSS isn't, I'd personally always prefer FSR.
Frankly, I think all these differences are nice to know about (and notice) if you're playing at DF level. And I totally respect that very niche need to max out performance.
But given the prices, I don't think any Nvidia GPU advantage justifies paying 1000+ bucks. I don't see any game, PC-exclusive or not, that offers a fundamentally different and innovative gameplay experience.
I don't know about anyone else, but I've never had two screens running side by side to see the difference in performance in a given game. It's like those TV screen comparisons: virtually nobody in the real world does this, lol. Performance seems comparable to me. Besides, Nvidia is no longer interested in gaming products; it's full steam ahead with "AI".
I've been lucky enough to get a new 5090 build in March, glad I went with Nvidia. Cyberpunk looks amazing.

The 9070XT matches or beats Nvidia's much more expensive 5080 in CoD: BO7 benchmarks. A rare win for AMD. The article also takes a closer look at 9600X vs 9800X3D performance.
Why does the article use misleading terms like "crushes" and say the 9070 XT "HANDILY BEATS" the more expensive RTX 5080? It even admits as much at the end of the article, yet keeps the terms lol
Looking at the graph, the difference is only 4–19 fps, depending on the settings.
I would hardly call a 4–19 fps difference "crushes" or "handily beats", and no one is going to buy a 9070 over a 5080 for CoD alone. How does the 9070 fare in other games compared to the 5080?
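To put the 4–19 fps gap in perspective, here's a quick relative-uplift calculation. The 120 fps baseline is a hypothetical assumption for illustration, not a number from the article's benchmarks:

```python
# Hypothetical sanity check: how big is a 4-19 fps lead in relative
# terms? The 120 fps baseline is an assumption, not measured data.

def relative_uplift(baseline_fps: float, delta_fps: float) -> float:
    """Return the lead as a percentage of the baseline frame rate."""
    return 100.0 * delta_fps / baseline_fps

low = relative_uplift(120.0, 4.0)    # ~3.3% at this assumed baseline
high = relative_uplift(120.0, 19.0)  # ~15.8% at this assumed baseline
```

At a lower baseline the percentages would be larger, but even the best case here is a solid win rather than anything resembling "crushing".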
That's great, although many current games have pared their settings menus down so far that they couldn't get much simpler. For example, AC3 had about five options.
But there are still developers who like to offer more extensive settings, so this is a great application for people without much experience, or for people who just don't like messing with settings.
If this application truly finds better settings than manual tweaking would, we should all use it.
Nvidia is making things a lot easier for casual/core gamers
Wow, thanks, I needed something like this
Good news.