
Guru3D writes: "A somewhat heated topic amongst graphics card manufacturers is how to get as much performance out of a graphics card with as little image quality loss as possible. In the past, both ATI and NVIDIA have been found guilty of cheating in applications to gain better performance. Both companies promised from then on to stay away from cheats.
The topic of discussion for many weeks now is that AMD has been applying a series of optimizations that could easily be seen and explained as a cheat."

AMD has mentioned that the next-gen Xbox is on track for release in 2027, which means we might be in the final year of the Series X|S.
FSR 4 was a substantial improvement to AMD’s upscaling solution: it reduces ghosting and improves the handling of fine meshes and particle effects. In most cases, it delivers visual quality similar to DLSS 4’s CNN model, though slightly behind the newer transformer model.
Since FSR is open source and Nvidia's DLSS isn't, I'd personally always prefer FSR.
Frankly, I think all these differences are nice to know about (and notice) if you're pixel-peeping at DF level. And I totally respect that niche desire to max out performance.
But given the prices, I don't think any Nvidia GPU advantage justifies paying 1000+ bucks. I don't see any game, PC-exclusive or not, that offers a fundamentally different and innovative gameplay experience.
I don't know about anyone else, but I've never had 2 screens playing at the same time to know the difference in performance of a given game. It's like those TV screen comparisons; virtually nobody in the real world does this, lol. Performance seems comparable to me. Besides, Nvidia is no longer interested in gaming products; it's full steam ahead with "AI".
I was lucky enough to get a new 5090 build in March, and I'm glad I went with Nvidia. Cyberpunk looks amazing.

The 9070 XT matches or beats Nvidia's much more expensive 5080 in CoD: BO7 benchmarks. A rare win for AMD. The article also takes a closer look at 9600X vs 9800X3D performance.
Why does the article use misleading terms like "crushes" and 'The 9070 XT "HANDILY BEATS" the more expensive RTX 5080'? It even admits as much at the end of the article, yet keeps the terms lol.
Looking at the graph, the difference is only 4–19 fps, depending on the settings.
I would hardly call a 4–19 fps difference "crushes" or "handily beats," and no one is going to buy a 9070 XT over a 5080 for CoD alone. How does the 9070 XT fare in other games compared to the 5080?