
Tom's Hardware writes: Nvidia's Tony Tamasi took ECGC 2011 attendees on a trip to the past, to the present, and to the future of both GPU rendering and mobile graphics.
During ECGC 2011, Nvidia senior vice president of content and technology Tony Tamasi made a startling prediction during his keynote presentation, "The Future of Graphics Processing." He claimed that GPU performance will increase 1000 percent by 2015, allowing graphics cards to render real-time ray tracing and procedurally generated smoke at 30 to 60 frames per second.
NVIDIA rolled out the DLSS 4.5 update at CES last week, adding 2nd Gen Transformer-based Super Resolution technology for all RTX GPUs. The performance scaling varies wildly across the older (RTX 20/RTX 30) and newer (RTX 40/RTX 50) GeForce RTX lineups. We tested NVIDIA’s next-gen upscaling solution across Cyberpunk 2077, Black Myth: Wukong, Oblivion Remastered, and KCD 2.
I've been surprised by this; the difference between 4 and 4.5 is very noticeable. It has almost completely removed that weird dark ghosting that you'd get in foggy games like Silent Hill 2... and Cyberpunk mixed with a high-res texture pack is jaw-dropping at ultra 4K.
Also, if anyone doesn't know about it, I recommend DLSS Swapper; it lets you inject the latest DLSS version into older games.
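Under the hood, swapper tools like this essentially replace the game's bundled `nvngx_dlss.dll` with a newer build. A minimal sketch of that swap (dummy temp-dir paths stand in for a real install; this is not DLSS Swapper's actual code):

```python
# Sketch of what a DLL-swap tool automates: back up a game's bundled
# nvngx_dlss.dll and drop a newer copy in its place.
# The temp directories below are a stand-in demo, not a real game install.
import shutil
import tempfile
from pathlib import Path

game_dir = Path(tempfile.mkdtemp())           # stands in for the game's install dir
target = game_dir / "nvngx_dlss.dll"
target.write_text("old 3.x dll")              # dummy contents for the demo

new_dll = Path(tempfile.mkdtemp()) / "nvngx_dlss.dll"
new_dll.write_text("new 4.x dll")             # stands in for the downloaded newer DLL

# The actual swap: keep a backup so it is reversible, then overwrite.
backup = game_dir / "nvngx_dlss.dll.bak"
shutil.copy2(target, backup)
shutil.copy2(new_dll, target)

print(target.read_text())   # new 4.x dll
print(backup.read_text())   # old 3.x dll
```

Keeping the `.bak` copy matters in practice, since some games (or anti-cheat) reject a swapped DLL and you need to roll back.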
Quite amazing. But this probably means devs will depend on AI even more for their supposed optimizations lol.
No offense to AMD, but this sort of stuff shows that they are always going to be playing catch-up. I guess Nintendo can take advantage of some of these features.
"Better than native."
"Native 4K" in nearly all games nowadays is really native resolution with forced temporal anti-aliasing (TAA).
TAA smears and blurs frames together to soften jagged edges.
Of course DLSS makes games look "better than native" because native alone without any competent AA methods makes games look horrible.
FSR 4 was a substantial improvement to AMD's upscaling solution. It reduces ghosting and improves fine-mesh retention and particle effects. In most cases it delivers visual quality similar to DLSS 4's CNN model, but slightly worse than the newer transformer model.
Since FSR is open source and Nvidia's DLSS isn't, I'd personally always prefer FSR.
Frankly, I think all these differences are nice to know about (and notice) if you're playing at the Digital Foundry level. And I totally respect that very niche need to max out performance.
But given the prices, I don't think any Nvidia GPU advantage justifies paying $1,000+. I don't see any game, PC-exclusive or otherwise, that offers a fundamentally different and innovative gameplay experience.
I don't know about anyone else, but I've never had two screens running side by side to notice the performance difference in a given game. It's like those TV screen comparisons; virtually nobody in the real world actually does this, lol. Performance seems comparable to me. Besides, Nvidia is no longer interested in gaming products; it's full steam ahead with "AI".
I've been lucky enough to get a new 5090 build in March, glad I went with Nvidia. Cyberpunk looks amazing.

The 9070XT matches or beats Nvidia's much more expensive 5080 in CoD: BO7 benchmarks. A rare win for AMD. The article also takes a closer look at 9600X vs 9800X3D performance.
Why does the article use misleading terms like "crushes" and claim the 9070 XT "HANDILY BEATS" the more expensive RTX 5080? It even admits as much at the end of the article, yet keeps the terms lol
Looking at the graph, the difference is only 4-19 fps, depending on the settings.
I would hardly call a 4-19 fps difference "crushes" or "handily beats," and no one is going to buy a 9070 over a 5080 for CoD alone. How does the 9070 fare in other games compared to the 5080?
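To put the gap in perspective, an fps lead only means much relative to the base frame rate. A quick back-of-envelope check (the fps figures below are hypothetical, not taken from the article's graph):

```python
# Back-of-envelope: convert an absolute fps gap into a percent advantage.
# The endpoint numbers are made up purely to illustrate the range.
def uplift_percent(slower_fps, faster_fps):
    """Percent advantage of the faster card over the slower one."""
    return 100.0 * (faster_fps - slower_fps) / slower_fps

print(round(uplift_percent(120, 124), 1))  # a 4 fps lead on a 120 fps base -> 3.3
print(round(uplift_percent(100, 119), 1))  # a 19 fps lead on a 100 fps base -> 19.0
```

The same absolute gap can be anywhere from a rounding error to a meaningful win, which is why percentages are a fairer comparison than raw fps deltas.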
This summer I'm saving ALL of my money to get the best AMD processor and the best graphics card for mah rig.
Good read, and I can't wait to see what the future brings us. By 2015, I'll be 21, old enough to afford a really good PC/console. "Next Gen" should be simply amazing.
http://www.youtube.com/watc...
That's the level of graphics in the PS4 and 720.
Proof? Well, I can't really offer any. However: Epic Games, despite their PC history, make most of their money licensing their engine to developers who want to sell their games to as many people as possible. You and me, we want the best graphics; the developers also have deadlines and want to get work done. Why do you think games suddenly went from the Quake engine to Unreal? Check CoD: MW2 if you don't believe me; there's a Quake license in there. But now most games use Unreal.
So it's not only in Epic's interest to produce the best current-gen engine, but to keep it up to date. So, not too long ago, they went to Sony, Microsoft, and Nintendo and said something along the lines of:
'We produce engines for your games, games which have made you a ton of money: Gears, Mass Effect. We're working on Unreal Engine 4. Now, we know you have some basics already down, but this is what we suggest:
2 GB RAM minimum
1 GB VRAM
HD 5770 or better; we'd like the HD 6950 if you want long-term
If you do that, expect this.' Sony has learnt from this generation to work more with third-party engines; Microsoft has learnt its technology needs to be better.
http://n4g.com/news/12934/e...
Proof. And you don't think that'll happen again? The next generation has already started, behind closed doors.
Look at Battlefield 3. Someone at EA said, 'We really need to get this done: an engine that will carry us through the next generation.' DICE picked it up, and the engine is running on the PS4 and 720. When BF3 releases, if you see one GPU outperform the others in FPS-per-dollar compared to other games, then THAT'S the 720 GPU, albeit with modifications.
EA may not have the 720 on desks, but there's an internal document floating around with its GPU details.
or just buy a next gen console
" Is it even possible to push graphics beyond photo-realism? When will the GPU run out of gas? When will performance taper off?"
Graphics are far from looking photorealistic. Every leaf on a tree would need far greater detail than any main character on the PS3.
Graphics will NOT be photorealistic any time soon... I will probably see photorealistic graphics before I die though =)