
Tom's Hardware writes: Nvidia's Tony Tamasi took ECGC 2011 attendees on a trip to the past, to the present, and to the future of both GPU rendering and mobile graphics.
During ECGC 2011, Nvidia senior vice president of content and technology Tony Tamasi made a startling prediction in his keynote presentation, "The Future of Graphics Processing." He claimed that GPU performance would increase 1,000 percent by 2015, allowing graphics cards to render real-time ray tracing and procedurally generated smoke at 30 to 60 frames per second.

Darryl Linington from Notebookcheck.net writes, "The backlash around Nvidia’s AI push and DLSS 5 has opened a broader question in game development. Beyond performance and image quality, veteran artists are now weighing what AI-driven rendering means for authorship and visual control. If a system can add or reinterpret detail after the fact, the issue is no longer just technical. It becomes a question of how much of the final image still belongs to the people who built it."
The latest GeForce driver brings DLSS 4.5 Multi Frame Generation 5x and 6x, alongside Dynamic Multi Frame Generation, to RTX 50-series GPUs. The new modes raise the number of interpolated frames to four and five (inserted between every two rendered frames), further reducing reliance on the CPU.
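To make the multiplier arithmetic concrete, here is a minimal sketch; the function name and example frame rates are illustrative assumptions, not Nvidia's actual driver logic:

```python
def mfg_output(rendered_fps: float, mfg_factor: int) -> dict:
    """Illustrative arithmetic for a Multi Frame Generation factor.

    A factor of N displays N frames for every frame the game actually
    renders: 1 rendered + (N - 1) interpolated. The CPU only has to
    feed the pipeline at the rendered rate, which is why higher
    factors reduce reliance on the CPU.
    """
    return {
        "displayed_fps": rendered_fps * mfg_factor,
        "interpolated_per_rendered": mfg_factor - 1,
    }

# 5x mode: four interpolated frames between every two rendered frames.
print(mfg_output(48, 5))  # {'displayed_fps': 240, 'interpolated_per_rendered': 4}
# 6x mode: five interpolated frames.
print(mfg_output(40, 6))  # {'displayed_fps': 240, 'interpolated_per_rendered': 5}
```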
One big corporation bowing down to another is nothing more than them helping each other out. But try it in actual games and it doesn't work.
I don't mind frame gen, but I only use it if I'm already above 70 fps without it. It's kinda nice, but if I see any visual artifacts I turn it off. When I'm playing games on my 120Hz LG C3 I almost never use it, because frame rates pushed past 120 fps look really bad there. I think spatial supersampling is a far more interesting and beneficial tech than frame gen. Boosting 30 fps to 60 fps with frame gen is just garbage.
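That rule of thumb is easy to state precisely. Here is a minimal sketch of the commenter's heuristic, with the 70 fps floor taken from the comment above; the function itself is hypothetical, not any vendor's logic:

```python
def should_enable_framegen(base_fps: float, mfg_factor: int,
                           display_hz: float, min_base_fps: float = 70.0) -> bool:
    """Commenter's heuristic: require a healthy base frame rate, and
    don't generate more frames than the display can actually show."""
    return base_fps >= min_base_fps and base_fps * mfg_factor <= display_hz

print(should_enable_framegen(80, 2, 240))  # True: 80 -> 160 fps on a 240 Hz panel
print(should_enable_framegen(30, 2, 120))  # False: base too low (the "garbage" case)
print(should_enable_framegen(75, 2, 120))  # False: 150 fps would exceed the 120 Hz cap
```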
TVs were doing this 15 years ago with their soap-opera-effect motion interpolation... I don't know how anyone can play with this on.
There is definitely input lag there and artifacts.
Frame gen just has too much latency and too many visual glitches for me; I don't think I can ever use it for most games. I'd compare with it on and off, and it's a world of difference in feel. I need the lowest possible input lag in my gaming. Companies should rely on actual optimization. As for potato hardware, I suppose it could have its uses.
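The latency complaint has a concrete basis: interpolation needs the next rendered frame before it can generate the in-between ones, so the pipeline holds back at least one frame. A rough back-of-the-envelope estimate under that simplified assumption (real pipelines add queueing on top, and driver-side mitigations claw some back):

```python
def added_interpolation_latency_ms(base_fps: float) -> float:
    """Rough lower bound on frame-interpolation delay: frame N cannot
    be shown until frame N+1 exists, adding about one rendered-frame
    time. Simplified model; ignores driver-side mitigations."""
    return 1000.0 / base_fps

print(added_interpolation_latency_ms(30))   # ~33 ms extra at a 30 fps base
print(added_interpolation_latency_ms(120))  # ~8 ms extra at 120 fps
```

That is also why the feel difference shrinks as the base frame rate rises: the held-back frame simply costs less time.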

WTMG's Jordan Hawes: "With the advent of NVIDIA's DLSS 5 tools, and the whole debacle surrounding AI usage in AAA gaming, is this new push an opportunity for smaller studios to showcase they are the ones vouching for artistic integrity in the gaming industry?"
They already are. Indie studios are the only developers that constantly strive to publish innovative and experimental experiences. There has been little to no art in AAA gaming, with just a few exceptions.
Indie studios have been showcasing their creative superiority and bravery over AAA studios and releases for a while now.
Personally, I have zero interest in AI slop in any of my entertainment, so regardless of what Sony, Ubisoft, MS, EA, etc. believe the future is, I'm just not gonna touch any of that stuff.
One more thing in a long list of things that already give indie games an advantage.
In reality, a dev having a simplistic tech stack does not really impact the end-user experience. Whether the game is good and worth playing is what matters. In other words, some cooks may care whether two or three eggs were used to make a cake, but the person eating it doesn't. And in the case of DLSS 5, the chef is solely responsible for the recipe and how it's mixed together.
This summer I'm saving ALL of my money to get the best AMD processor and the best graphics card for my rig.
Good read, and I can't wait to see what the future brings us. By 2015, I'll be 21, old enough to afford a really good PC/console. "Next Gen" should be simply amazing.
http://www.youtube.com/watc...
That's the level of graphics in the PS4 and 720.
Proof? Well, I can't really offer any. However: Epic Games, despite their PC history, make most of their money licensing their engine to developers, who want to sell their games to as many people as possible. You and I want the best graphics, and the developers also have deadlines and want to get work done. Why do you think games suddenly went from the Quake engine to Unreal? Check COD:MW2 if you don't believe me; there's a Quake license in there. But now most games use Unreal.
So it's not only in Epic's interest to produce the best current-gen engine, but to keep it up to date. So, not too long ago, they went to Sony, Microsoft, and Nintendo and said something along the lines of:
'We produce engines for your games, games which have made you a ton of money: Gears, Mass Effect. We're working on Unreal Engine 4. Now, we know you have some basics already down, but this is what we suggest:
2 GB RAM minimum
1 GB VRAM
HD 5770 or better; we'd like the HD 6950 if you want it long term
If you do that, expect this.' Sony has learnt from this generation to work more with third-party engines; Microsoft has learnt their technology needs to be better.
http://n4g.com/news/12934/e...
Proof. And you don't think that'll happen again? The next generation has already started, behind closed doors.
Look at Battlefield 3: someone at EA said, 'We really need to get this done; an engine that will carry us through the next generation.' DICE picked it up, and the engine is running on the PS4 and 720. When BF3 releases, if you see one GPU outperform the others in FPS per dollar compared to other games, then THAT'S the 720 GPU, albeit with modifications.
EA may not have the 720 on desks, but there's an internal document floating around with its GPU details.
Or just buy a next-gen console.
" Is it even possible to push graphics beyond photo-realism? When will the GPU run out of gas? When will performance taper off?"
Graphics are far from looking photo realistic. Every leaf on a tree would have to have far greater detail than any main character on a PS3.
Graphics will NOT be photorealistic any time soon... I would probably see photorealistic graphics before I die though =)