270°

Nvidia: The Future of Graphics Processing

Tom's Hardware writes: Nvidia's Tony Tamasi took ECGC 2011 attendees on a trip to the past, to the present, and to the future of both GPU rendering and mobile graphics.

During ECGC 2011, Nvidia senior vice president of content and technology Tony Tamasi made a startling prediction during his keynote presentation called "The Future of Graphics Processing." He claimed that GPU performance will increase 1000-percent by 2015, allowing graphics cards to generate real-time ray tracing and procedurally generated smoke at 30 to 60 frames per second.

Read Full Story >>
tomshardware.co.uk
OMGitzThatGuy5407d ago

This summer I'm saving ALL of my money to get the best AMD processor and the best graphics card for mah rig.

kyl2775407d ago

Get an i5-2500K. There's no reason to get AMD now, as Bulldozer isn't coming until the end of the year and the 2500K beats anything AMD has out at the moment.

AKS5407d ago

I'm ready to build a new rig this year, but AMD's news about CPUs due later this year is what's holding me up. I'm thinking about an XFX 6990 and whatever CPU looks best for the price (I'm not getting a $1000 CPU if I'm not going to see much benefit from it) by the end of 2011. If nothing new impresses me, I'll just get the i7-2600K and OC it.

Theonetheonly5407d ago

Excellent article very well written.

Star and Flag. :)

steve30x5407d ago

@AKS: In my opinion, spending $1000 on a CPU just for gaming is a total waste of money. I have a Core i5 760 overclocked @ 3.8GHz with a GTX 580 and 4GB of DDR3 RAM, and I haven't come across any game it can't handle. I'm getting over 60FPS in every single game I play, and that's including the latest games.

AKS5407d ago

@steve30x I tend to agree, which is why I'm leaning towards a ~$300 CPU to pair with a $700+ video card.

Hell, I just have an old Q8200 paired with a GTX 460 1GB at the moment and seem to be able to play most games at pretty respectable settings most of the time (@ 1080p anyway). I'm not convinced those expensive "extreme" CPUs are going to add much to the experience, even though I could afford one if I wanted to.

I'm upgrading with a new build soon largely for BF3 and some other upcoming games that I think could push the envelope. My current setup is a bit dated at this point and won't get the most out of games like BF3, and I want to finally play Metro maxed.

ATiElite5407d ago (Edited 5407d ago )

$224 will get you an awesome Intel Core i5 2500K with an unlocked multiplier, allowing you to overclock it easily to 4.7GHz on air (with a good heatsink, of course).

But if you're an AMD fanboy, then by all means go ahead, as their 6-core CPUs will be under $200 when the new Bulldozer CPUs arrive.

As far as GPUs go, the GTX 560 or HD 6950 are the best bang-for-buck GPUs, and if you get two, you have more power than a GTX 580 at a cheaper price. Even one of these GPUs is great, and you wouldn't have to upgrade until DX12... even then it's optional.

Ray tracing has already been implemented in SOME games. I can't wait for fully ray-traced games, as photorealistic graphics are at our fingertips.

Bear_Grylls5407d ago

What do you think of the rig I'm building below?

GrumpyVeteran5407d ago

+1 for the ATI 6950, really good bang for buck, and if you got one early on you can flash the firmware to unlock it into a 6970.

CPU-wise, I'm really not sure. I have an i5 750 at the moment and it's awesome, but the i5-2500K is really, really awesome, and I don't know how good AMD's CPUs are gonna be.

steve30x5407d ago

@ATiElite: Crossfire or SLI is a waste of money. I had 2x GTX 470s in SLI for six months and it's sometimes a pain to get games to work with it. One game I can remember would only give me 30FPS with SLI enabled, while in single graphics card mode I was getting 70FPS.

Kakkoii5405d ago

@steve30x: That's because generally when a game comes out, you have to wait for a driver update to add a good SLI profile for it. Saying it's not worth it just because some games don't work with it is kind of a pointless argument, since you can always switch to single-GPU mode for those rare games. Not going with SLI at all would merely force you to always be using a single GPU... lol

And SLI in many cases can get into the 90% performance-increase range. Nvidia has done a great job with their drivers.

steve30x5403d ago (Edited 5403d ago )

@Kakkoii: I know how SLI works. You must not have read that I was using SLI for six months, up until two months ago. Also, your point about waiting for an SLI profile in new drivers isn't always true. In WRC 2010 I waited for an SLI profile that would work properly, and there was none by the time I stopped using SLI and moved to a single GTX 580. There were also a few other games over a year old that ran horribly in SLI (TDU2 is another example that never ran right in SLI even with the newest drivers).

You also say that you can switch to single-card mode if a game doesn't play well in SLI. So what's the point in having SLI if you have to switch to single-GPU mode for some games?

starchild5407d ago

And this is one of the big reasons I will continue to game primarily on the PC. It will always be at the cutting edge of graphics rendering.

Bear_Grylls5407d ago (Edited 5407d ago )

I have just ordered this beast of a PC for my brother and am going to put it together myself.

Super Beastly rig for just over $2000 Australian.

AMD all the way baby. Intel are just over priced and under performing if you don't spend heaps.

1x CPU CoolerMaster Hyper TX3 $25.00
1x Thermaltake Element G case $139.00
1x ASUS Crosshair IV Formula Motherboard $299.00
1x Antec Cyclone Blower Slot Mounted Cooler GPU $16.00
1x Logitech Wireless Desktop MK710 $115.00
1x Seagate Momentus XT 500GB SSD $149.00
1x Western Digital RE4 1TB $139.00
1x AMD Phenom II X6 1100T Black Edition $255.00
1x LG CH10LS20 10X Blu-ray DVD Combo Drive $125.00
1x G.Skill Ripjaws X 16GB (4x4GB) DDR3 $245.00
1x ASUS Radeon 6970 DirectCU II 2GB $399.00
1x Microsoft Windows 7 Home Premium 64bit with SP1 OEM $105.00
1x Corsair TX-850 V2 Power Supply $179.00

Sub-Total: $2190.00
PP Standard Shipping: $105.00
GST Included: $208.64
Total: $2295.00
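
For anyone checking the maths on a quote like this: the totals line up if you assume Australia's 10% GST is included in the sticker prices, so the GST component is the GST-inclusive total divided by 11. A quick sketch (prices taken from the list above):

```python
# Quick check of the order totals quoted above. Assumes Australia's 10% GST
# is included in the listed prices, so the GST component is total / 11.
subtotal = 2190.00   # sum of the listed parts
shipping = 105.00    # PP standard shipping
total = subtotal + shipping
gst_included = round(total / 11, 2)  # 10% GST baked into the price

print(f"Total: ${total:.2f}, GST included: ${gst_included:.2f}")
# → Total: $2295.00, GST included: $208.64
```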

Battlefield 3 is going to be oh so sweet, and this rig will be powerful for 5-6+ years; all I'll need to do is throw in a new GPU every 2 years and hey presto.

If I want an Intel rig to beat this it would cost at least $1000 more.

AMD FTW!

We are using a beautiful 105cm full HD TV to run this on, so no need for a new monitor; plus I have like 3 HD monitors lying around as is.

Jamaicangmr5407d ago (Edited 5407d ago )

"AMD all the way baby. Intel are just over priced and under performing if you don't spend heaps."

I have an i7 950, which retails for US$275, while the X6 1100T goes for US$225. I can assure you, as two friends of mine have a 1090T and an 1100T, that I run rings around them at stock. The i5 2500K is the same price as the 1100T and still outperforms it. You can even YouTube 3DMark CPU benchmark results. The AMD X6 1100T is a damn good processor, but not as good as the equally priced QUAD-CORE Intel processor.

So your argument really holds no water. If you are an AMD fanboy, then fine, more power to you. However, don't spread falsehoods just to justify your preference.

Oh yeah, and by Intel rig I assume you mean board and processor, in which case I could add an i7 950 ($275) and a Rampage III Formula ($285), so that's roughly $123 more. Or better yet the i5 2500K, in which case it's just $73. So I'm not sure where you got that $1000 from, but whatever, you are wrong anyway.

steve30x5407d ago

@Bear_Grylls: Why does he need 16GB of RAM? That's way overkill. You won't need any more than 8GB maximum.

delicia5407d ago

Good read, and I can't wait to see what the future brings us. By 2015, I'll be 21, old enough to afford a really good PC/console. "Next Gen" should be simply amazing.

mrv3215407d ago

http://www.youtube.com/watc...

That's the level of graphics in the PS4 and 720.

Proof? Well, I can't really offer any. However, Epic Games, despite their PC history, make most of their money licensing their engine to developers, who want to sell their games to as many people as possible. You and me, we want the best graphics; the developers also have deadlines and want to get work done. Why do you think games suddenly went from the Quake engine to Unreal? Check COD: MW2 if you don't believe me; there's a Quake license in there. But now most games use Unreal.

So it's not only in Epic's interest to produce the best current-gen engine, but to keep it up to date. So, not too long ago, they went to Sony, Microsoft, and Nintendo and said something along the lines of:

'We produce engines for your games, games which have made you a ton of money: Gears, Mass Effect. We're working on Unreal Engine 4. Now, we know you have some basics already down, but this is what we suggest:

2GB RAM minimum
1GB VRAM
HD 5770 or better; we'd like the HD 6950 if you want long term

If you do that, expect this.' Sony has learnt from this generation to work more with third-party engines; Microsoft has learnt their technology needs to be better.

http://n4g.com/news/12934/e...

Proof. And you don't think that'll happen again? The next generation has already started, behind closed doors.

Look at Battlefield 3. Someone at EA said, 'We really need to get this done, an engine that will carry us through the next generation.' DICE picked it up, and the engine is running on the PS4 and 720. When BF3 releases, if you see one GPU outperform the others in FPS-per-$ compared to other games, then THAT'S the 720 GPU, albeit with modifications.

EA may not have the 720 on desks, but there's an internal document floating around with its GPU details.

Bear_Grylls5407d ago

That made hardly any sense, and your first paragraph couldn't be more wrong.

tiphanycufflink5407d ago

lol I've been trying to figure out his point for about 10 minutes now.

ATiElite5407d ago (Edited 5407d ago )

I used to be a HUGE AMD fanboy but got tired of seeing AMD chips about 7 to 10 places below a bunch of Intels in benchmark tests, so I switched. You paid about $55 too much for your 1100T (Newegg has them for $200). You're putting together a nice rig, one to be proud of.

AMD makes excellent chips, trust me, I know from experience, but Intels are worth every penny. They are not overrated. A stock Core i5 2500K or i7 2600K BLOWS every AMD chip out of the water. Overclock them to 4.7GHz easily and it's mind-blowing.

1x Seagate Momentus XT 500GB SSD $149.00: obvious typo, as NO 500GB SSD is $150... if you can, do some price comparisons, because I see some items you could save some cash on.

Anyway, MRV321 must be speculating on what the 720 will have as far as GPU. Most likely one system will have a Tegra (720), another will have a high-end Bulldozer (Wii 2), and the PS4 will maybe reuse the Cell but add a real GPU this time around from AMD. It's all speculation.

But I do feel the 720 specs are out there and manufacturing is ramping up.

Kakkoii5405d ago

@ATiElite: 720 with a Tegra? Lol? Tegra is an ultra-compact SoC meant for mobile platforms. Even Tegras two generations from now will just be reaching the power of the current consoles. They will not be anywhere near powerful enough for a next-gen console.

BeastlyRig5407d ago (Edited 5407d ago )

Well, Epic said they could optimize that demo to run on a single GTX 580!

I have a feeling that this time around PC will start off next gen already more powerful!

I'm not sure the next consoles will beat a GTX 580...

dirthurts5407d ago

Next-gen consoles will not look like that. That's more likely next-gen PC, but no way for consoles.

Dragonlord1125407d ago

or just buy a next gen console

theonlylolking5407d ago

" Is it even possible to push graphics beyond photo-realism? When will the GPU run out of gas? When will performance taper off?"

Graphics are far from looking photorealistic. Every leaf on a tree would have to have far greater detail than any main character on PS3.

Graphics will NOT be photorealistic any time soon... I will probably see photorealistic graphics before I die though =)

ATiElite5407d ago

Crysis has many parts that are photorealistic, and that was back in 2007.

The majority of the time the BF3 trailer is playing, I'm thinking to myself, "damn, that looks real".

Real-life graphics are at our fingertips.

40°

DLSS 3.8 vs 4.0 vs 4.5: Ultra Performance as Good as Native 4K

NVIDIA rolled out the DLSS 4.5 update at CES last week, adding 2nd Gen Transformer-based Super Resolution technology for all RTX GPUs. The performance scaling varies wildly across the older (RTX 20/RTX 30) and newer (RTX 40/RTX 50) GeForce RTX lineups. We tested NVIDIA’s next-gen upscaling solution across Cyberpunk 2077, Black Myth: Wukong, Oblivion Remastered, and KCD 2.

Read Full Story >>
pcoptimizedsettings.com
MrDead44d ago

I've been surprised by this; the difference between 4 and 4.5 is very noticeable. It has almost completely removed that weird dark ghosting you'd get in foggy games like Silent Hill 2... and Cyberpunk mixed with a high-res texture pack is jaw-dropping in ultra 4K.

Also, if anyone doesn't know, I recommend DLSS Swapper; it allows you to inject the latest DLSS version into older games.
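
For the curious, what DLSS Swapper automates is essentially a backup-and-replace of the game's DLSS DLL (nvngx_dlss.dll is the real filename; the directory and file contents below are stand-ins for illustration, not real DLLs):

```python
# Illustrative sketch of what DLSS Swapper automates: back up a game's DLSS
# DLL and drop a newer build in its place. The folder and "DLL" contents
# here are fakes created in a temp dir so the sketch is self-contained.
import shutil
import tempfile
from pathlib import Path

game_dir = Path(tempfile.mkdtemp())       # stand-in for a game's install folder
game_dll = game_dir / "nvngx_dlss.dll"
game_dll.write_text("DLSS 3.8 (old)")     # fake shipped DLL

newer_dll = game_dir / "nvngx_dlss_newer.dll"
newer_dll.write_text("DLSS 4.5 (new)")    # fake newer DLL to inject

backup = game_dir / "nvngx_dlss.dll.bak"
shutil.copy2(game_dll, backup)            # keep the original safe
shutil.copy2(newer_dll, game_dll)         # swap in the newer version

print(game_dll.read_text())  # → DLSS 4.5 (new)
```

The tool adds version management and a GUI on top, but the core swap is this simple, which is also why restoring the backup reverts the game cleanly.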

batiti9343d ago

Totally useless since the NVIDIA App released last year... it does force the latest DLSS globally if you ask the app to do so.

MrDead42d ago

The Nvidia app doesn't let you choose which versions of DLSS, DLSS Frame Gen, and DLSS Ray Reconstruction to use like DLSS Swapper does.

Goodguy0144d ago

Quite amazing. But this probably does mean devs will depend on AI even more for their supposed optimizations lol.

Neonridr44d ago

No offense to AMD, but this sort of stuff shows they are always going to be playing catch-up. I guess Nintendo can take advantage of some of these features.

badz14942d ago

With the Switch 2? Nvidia can easily lock their proprietary tech to their latest GPUs, and the Switch 2 will be stuck on 3.5 for 5 more years at least.

Neonridr42d ago (Edited 42d ago )

4 and 4.5 are available on 20- and 30-series cards right now. The Switch 2 GPU is based on 30-series architecture, meaning it has access to some of those features. Obviously not as many as the higher-end cards, but still some.

TheDreamCorridor42d ago (Edited 42d ago )

"Better than native."

Native 4K in nearly all games nowadays is actually native resolution with forced temporal anti-aliasing.

TAA smears and blurs frames together to soften jagged edges.

Of course DLSS makes games look "better than native" because native alone without any competent AA methods makes games look horrible.

150°

NVIDIA DLSS 4 vs AMD FSR 4 Compared: Ray Reconstruction Makes FSR 4 Look Last-Gen

FSR 4 was a substantial improvement to AMD's upscaling solution. It reduces ghosting and improves fine mesh retention and particle effects. In most cases it delivers visual quality similar to DLSS 4's CNN model, but slightly worse than the newer transformer model.

Read Full Story >>
pcoptimizedsettings.com
dveio84d ago

Since FSR is open-source and Nvidia's DLSS isn't, I'd personally always prefer FSR.

Frankly, I think all these differences are nice to know (and notice) about if you're playing at DF level. And I totally respect that very small need to max out performance.

But given the prices, I don't think any Nvidia GPU advantage justifies paying 1000+ bucks. I don't see any game, PC-exclusive or not, that offers a fundamentally different and innovative gameplay experience.

Notellin84d ago

There's never a good reason to own any products from Nvidia. They are one of the most destructive and anti-consumer companies that's ever existed.

Anyone buying and using Nvidia is only contributing to the downfall and end of gaming as we know it now.

With the rise of Nvidia, all we've seen is price gouging, while their products continue to become less power-efficient and their performance gains are so minuscule you'd need a 100x microscope to notice the AI upscaling. Pathetic really.

Tapani83d ago

Why do you need to pay 1000 bucks for an Nvidia GPU? You can find one that's faster than the PS5 Pro at 400 bucks, the RTX 5060 Ti 16GB, and it has better upscaling, more VRAM, multi-frame generation, and RT.

Gamersunite88084d ago

DLSS will always be better. FSR sucks.

__y2jb83d ago

The examples given look essentially identical.

babadivad83d ago

Exactly. The headline says FSR looks last-gen, implying it's years behind the competition. The article says it's slightly behind.

In the examples shown, the differences are barely discernible.

derek83d ago

I don't know about anyone else, but I've never had 2 screens playing at the same time to know the difference in performance in a given game. It's like those TV screen comparisons; virtually nobody in the real world does this, lol. Performance seems comparable to me. Besides, Nvidia is no longer interested in gaming products; it's full steam ahead with "AI".

Tapani83d ago (Edited 83d ago )

Yeah, but the gaming division is still 8.5% of their global revenue, and they just posted 30% YoY top-line growth for the quarter. An $11.35 billion business is absolutely massive, and it will continue to increase. That means there are 11.35bn reasons why they won't stop the gaming business or lose focus on it. It's also their pivot if things don't go as well in the AI race. By the end of 2026 they will have DOUBLED the gaming division's business in 5 years.

FY 2025: $11.35 billion (+8.6%)
FY 2024: $10.45 billion (+15.2% approx)
FY 2023: $9.07 billion (-7.5% approx)
FY 2022: $9.82 billion approx (+49.6% approx)
FY 2021: $6.5 billion approx (+61.1% approx)
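
A quick sanity check of those growth percentages from the revenue figures themselves (figures in $bn as quoted; several are approximate, so the computed values won't all match the posted ones exactly):

```python
# Sanity check: recompute year-over-year growth from the quoted gaming
# revenue figures (in $bn; treat them as approximate).
revenue_bn = {2021: 6.5, 2022: 9.82, 2023: 9.07, 2024: 10.45, 2025: 11.35}

def yoy_growth(revenue: dict[int, float]) -> dict[int, float]:
    """Percentage growth of each fiscal year over the previous one, 1 d.p."""
    years = sorted(revenue)
    return {year: round((revenue[year] / revenue[prev] - 1) * 100, 1)
            for prev, year in zip(years, years[1:])}

print(yoy_growth(revenue_bn))  # FY2025 comes out at +8.6%, as quoted
```

The FY2025 and FY2024 figures reproduce the quoted 8.6% and 15.2%; the earlier years drift slightly, consistent with the "approx" labels.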

MrDead83d ago

I've been lucky enough to get a new 5090 build in March, glad I went with Nvidia. Cyberpunk looks amazing.

100°

AMD's RX9070 XT crushes Nvidia's RTX 5080 in Call of Duty: Black Ops 7 benchmarks - Story Mode

The 9070 XT matches or beats Nvidia's much more expensive 5080 in CoD: BO7 benchmarks. A rare win for AMD. The article also takes a closer look at 9600X vs 9800X3D performance.

Read Full Story >>
storymode.info
wesnytsfs137d ago

No ray tracing might be why.

Runechaz137d ago

Ray tracing is useless in an FPS

thecodingart137d ago

Came looking for dumb comments - found them

Zenzuu137d ago

Not every game needs to have ray tracing.

Darkseeker137d ago

I'd even say no game needs to have it. It's just a resource hog.

Blad3runner00137d ago (Edited 137d ago )

Why does the article use misleading terms like "crushes" and "the 9070 XT HANDILY BEATS the more expensive RTX 5080"? It even admits it at the end of the article, yet keeps the terms lol.

Looking at the graph, the difference is only 4-19FPS, depending on the settings.

I would hardly call a 4-19FPS difference "crushes" or "handily beats", and no one is going to buy a 9070 over a 5080 for COD alone. How does the 9070 fare in other games compared to the 5080?

OpenGL137d ago

I think they exaggerate because people like it when a product punches above its weight, especially from an underdog, but yeah, it's not a huge difference. There are plenty of games where the 5080 is significantly faster.

wesnytsfs137d ago

That is basically what the 5090 does compared to the 4090. I don't consider it crushing either, and decided to keep my 4090 over getting the 5090 with its small increase in FPS.

OpenGL136d ago

That's a no-brainer; the 5090 is definitely the fastest card on the market, but the 4090 is the second fastest, so it's still extremely powerful.