
Michael Larabel: If you've been eyeing a 4K "Ultra HD" TV purchase this holiday season and will be connecting it to a Linux system, here's what you need to know to get started, along with performance benchmarks to set expectations. This article has a number of AMD Radeon and NVIDIA GeForce benchmarks when running various Linux OpenGL workloads at a resolution of 3840 x 2160.

AMD has mentioned that the next-gen Xbox is on track for release in 2027, which means we might be in the final year of the Series X|S.
NVIDIA rolled out the DLSS 4.5 update at CES last week, adding 2nd Gen Transformer-based Super Resolution technology for all RTX GPUs. The performance scaling varies wildly across the older (RTX 20/RTX 30) and newer (RTX 40/RTX 50) GeForce RTX lineups. We tested NVIDIA’s next-gen upscaling solution across Cyberpunk 2077, Black Myth: Wukong, Oblivion Remastered, and KCD 2.
I've been surprised by this; the difference between 4 and 4.5 is very noticeable. It has almost completely removed that weird dark ghosting that you'd get in foggy games like Silent Hill 2... and Cyberpunk mixed with a high-res texture pack is jaw-dropping at ultra 4K.
Also, if anyone doesn't know about it, I recommend DLSS Swapper; it lets you inject the latest DLSS version into older games.
Quite amazing. But this probably does mean devs will depend on AI even more for their supposed optimizations, lol.
No offense to AMD, but this sort of stuff shows that they are always going to be playing catch-up. I guess Nintendo can take advantage of some of these features.
"Better than native."
Native 4K in nearly all games nowadays is actually native resolution with forced temporal anti-aliasing.
TAA smears and blurs frames together to soften jagged edges.
Of course DLSS makes games look "better than native" because native alone without any competent AA methods makes games look horrible.
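The smearing described above comes from temporal accumulation: each frame is blended into a running history buffer, which averages out jagged edges but also drags old pixels across moving detail. Here is a minimal NumPy sketch of that core blend, under my own assumptions (the `alpha` value and function name are illustrative, and real TAA additionally reprojects the history with motion vectors and clamps it against the current frame's neighborhood, both omitted here):

```python
import numpy as np

def taa_accumulate(history, current, alpha=0.1):
    """Exponential moving average: mix a small fraction of the new
    frame into the history buffer. Low alpha = smooth but smeary,
    high alpha = sharp but aliased."""
    return (1.0 - alpha) * history + alpha * current

# A hard "jaggy" edge converges toward a stable, softened value
# over successive frames.
history = np.zeros(4)
current = np.array([0.0, 1.0, 0.0, 1.0])  # alternating aliased pixels
for _ in range(60):
    history = taa_accumulate(history, current)
```

After enough frames the history settles near the input, which is why a static scene under TAA looks clean; the "ghosting" complaints above are what happens when the history lags a scene that is actually changing.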
FSR 4 was a substantial improvement to AMD's upscaling solution. It reduces ghosting and improves the retention of fine meshes and particle effects. In most cases it delivers visual quality similar to DLSS 4's CNN model, but slightly worse than the newer transformer model.
Since FSR is open-source and NVIDIA's DLSS isn't, I'd personally always prefer FSR.
Frankly, I think all these differences are nice to know about (and notice) if you're playing at DF level. And I totally respect that very small need to max out performance.
But given the prices, I don't think any NVIDIA GPU advantage justifies paying 1000+ bucks. I don't see any games, PC-exclusive or otherwise, that offer a fundamentally different and innovative gameplay experience.
I don't know about anyone else, but I've never had two screens playing side by side to know the difference in performance of a given game. It's like those TV screen comparisons: virtually nobody in the real world does this, lol. Performance seems comparable to me. Besides, NVIDIA is no longer interested in gaming products; it's full steam ahead with "AI".
I've been lucky enough to get a new 5090 build in March, glad I went with Nvidia. Cyberpunk looks amazing.
PC is so far ahead