320°

NVIDIA Issues Official Statement Regarding GTX970's VRAM Usage

DSOGaming writes: "After the whole controversy surrounding GTX970’s VRAM usage, NVIDIA went ahead and made an official statement regarding this issue. According to the green team, GTX970’s memory is split in two sections; the first section that has a higher priority allocates 3.5GB while the second section allocates 500MB. And while a game can allocate all 4GBs, only 3.5GB are being allocated when a game needs less than that."

Read Full Story >>
dsogaming.com
sorceror1714057d ago

Sure, the car holds 20 gallons. The tank holds 15 gallons, and there are five 1-gallon cans in the trunk. Sure, actually *using* those last five gallons is slow, but there's 20 gallons. What are you complaining about?

SilentNegotiator4057d ago

That's really not a good analogy.

UltraNova4057d ago (Edited 4057d ago )

Whuuut?

OK, what you might have wanted to say is: a car's tank holds 20 gallons, of which 5 is the reserve, and the gas meter on the dash is calibrated that way as well, so in the end you only use 15 gallons on a day-to-day basis, unless you forget to fill up one day (not so rare an occasion, admittedly) and reaching the next gas station requires burning fuel from that 5-gallon reserve.

OT: I don't see the problem here for those who got a 970... it's not as if that 500MB of memory is off limits permanently, like the 8th core of an 8-core CPU being disabled to increase yield...

sorceror1714057d ago

Sure, that 512MB is there. It's just *much* slower to access than the rest of the VRAM. As in, three to eight *times* slower. That's more like having 3.5GB of RAM, plus 512MB of L2 cache or something. Check the numbers - drops from 150GB/sec to 46GB/sec, all the way down to *19* GB/sec.

No one's saying that memory is "off limits permanently". But VRAM is supposed to be immediately accessible to the GPU, and this memory... isn't. So it's memory, but lumping it in with the rest of the VRAM is deceptive at best.

Same as with the gas can in the trunk. Sure, it's fuel, and the car *can* use it... it's just dramatically slower and less convenient to access it.
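The slowdown sorceror17 describes can be sketched with a back-of-the-envelope calculation. This is just a weighted-average model using the bandwidth figures quoted in the comment above (illustrative, not official NVIDIA specs):

```python
# Rough model of the GTX 970's segmented VRAM, using the figures
# quoted in the thread (illustrative, not official specs).
FAST_GB = 3.5    # high-priority segment size
SLOW_GB = 0.5    # low-priority segment size
FAST_BW = 150.0  # GB/s for the fast segment
SLOW_BW = 19.0   # GB/s, the worst-case figure quoted for the slow segment

def effective_bandwidth(alloc_gb: float) -> float:
    """Average bandwidth when an allocation spills into the slow segment,
    weighted by how much data lives in each segment."""
    fast = min(alloc_gb, FAST_GB)
    slow = max(alloc_gb - FAST_GB, 0.0)
    return (fast * FAST_BW + slow * SLOW_BW) / alloc_gb

print(effective_bandwidth(3.5))  # stays in the fast segment: 150.0
print(effective_bandwidth(4.0))  # spills into the slow 512MB: ~133.6
```

Even this crude average understates the problem, since a frame stalls on its slowest accesses rather than the average ones.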

UltraNova4056d ago

"Check the numbers - drops from 150GB/sec to 46GB/sec, all the way down to *19* GB/sec. "

Well, if that's ^^ truly the case (I'm not doubting you) then yeah, selling the card as a 4GB VRAM option is misleading.

airshiraz4057d ago

I trust Nvidia,
they won't cheat us.
I have seen 4 gigabytes of VRAM usage in Assassin's Creed Unity. You can see for yourself with EVGA Precision if you set Unity's resolution to 4K.

FlyingFoxy4057d ago

You trust a company that puts out Titan GPUs that cost a fortune and are only a little better than a top card from the competitor that's much cheaper?

I have a 970 because they were well priced for the performance, but there's no way I am kissing Nvidia's ass... they are going to make the same mistake with the new Titan against the 390X.

airshiraz4057d ago

Did they lie about the Titan? I mean, they don't lie, that's why I say I trust them. They have so many awful cards, and customers must open their eyes before buying a GPU.

wannabe gamer4057d ago

lol, once again someone talking out their arse who doesn't understand that the Titan isn't just for gaming. lmao

Maxor4057d ago

I have no idea what the fuss is about. It's not like they gimped the card. The performance is tiered, or else there wouldn't be any reason for the GTX 980 to exist!

FlyingFoxy4057d ago

I think it only really affects 4k which pretty much runs crappy on any single card anyway, so anyone with a 970 or 2 is likely to be playing at 1080p or 1440p max.

fullmetal2974056d ago (Edited 4056d ago )

I still don't understand how partitioning the memory into 3.5GB and 500MB sections helps with traffic management. Someone care to explain?
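One way to picture it, as a hypothetical allocator sketch (this is NOT NVIDIA's actual driver logic, just an illustration of the priority scheme described in NVIDIA's statement): the driver steers allocations to the fast 3.5GB segment first, so the slow segment is only touched once the fast one is full:

```python
# Hypothetical priority-based VRAM placement - an illustration only,
# not NVIDIA's real driver code. Hot data stays in the fast pool
# as long as it fits; only overflow lands in the slow pool.
FAST_CAP_MB = 3584  # 3.5GB high-priority segment
SLOW_CAP_MB = 512   # 512MB low-priority segment

def place_allocations(sizes_mb):
    """Fill the fast segment first; spill overflow to the slow segment."""
    fast_used, slow_used, placement = 0, 0, []
    for size in sizes_mb:
        if fast_used + size <= FAST_CAP_MB:
            fast_used += size
            placement.append("fast")
        elif slow_used + size <= SLOW_CAP_MB:
            slow_used += size
            placement.append("slow")
        else:
            placement.append("evicted")  # would fall back to system RAM
    return placement

# Frequently-touched buffers allocated first land in the fast segment;
# only the overflow pays the slow-segment penalty.
print(place_allocations([2048, 1024, 512, 256, 512]))
```

So "traffic management" here just means keeping the data a game touches most often inside the fast segment, rather than striping everything across both.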

40°

DLSS 3.8 vs 4.0 vs 4.5: Ultra Performance as Good as Native 4K

NVIDIA rolled out the DLSS 4.5 update at CES last week, adding 2nd Gen Transformer-based Super Resolution technology for all RTX GPUs. The performance scaling varies wildly across the older (RTX 20/RTX 30) and newer (RTX 40/RTX 50) GeForce RTX lineups. We tested NVIDIA’s next-gen upscaling solution across Cyberpunk 2077, Black Myth: Wukong, Oblivion Remastered, and KCD 2.

Read Full Story >>
pcoptimizedsettings.com
MrDead52d ago

I've been surprised by this; the difference between 4 and 4.5 is very noticeable. It has almost completely removed that weird dark ghosting you'd get in foggy games like Silent Hill 2... and Cyberpunk mixed with a high-res texture pack is jaw-dropping in ultra 4K.

Also, if anyone doesn't know, I recommend DLSS Swapper; it lets you inject the latest DLSS version into older games.

batiti9350d ago

Totally useless since the NVIDIA app released last year... it forces the latest DLSS in global settings if you ask it to.

MrDead50d ago

The Nvidia app doesn't let you choose which version of DLSS, DLSS Frame Gen, and DLSS Ray Reconstruction to use, like DLSS Swapper does.

Goodguy0151d ago

Quite amazing. But this probably does mean devs will depend on AI even more for their supposed optimizations lol.

Neonridr51d ago

no offense to AMD, but this sort of stuff shows that they are always going to be playing catchup. I guess Nintendo can take advantage of some of these features.

badz14950d ago

With the Switch 2? Nvidia can easily lock their proprietary tech to their latest GPUs, and the Switch 2 will be stuck on 3.5 for 5 more years at least.

Neonridr50d ago (Edited 50d ago )

4 and 4.5 are available on 2 and 3 series cards right now. The Switch GPU is based on 3 series architecture, meaning it has access to some of those features. Obviously not as much as the higher end cards, but still some.

TheDreamCorridor50d ago (Edited 50d ago )

"Better than native."

Native 4K in nearly all games nowadays is actually native resolution with forced temporal anti-aliasing.

TAA smears and blurs frames together to soften jagged edges.

Of course DLSS makes games look "better than native" because native alone without any competent AA methods makes games look horrible.

150°

NVIDIA DLSS 4 vs AMD FSR 4 Compared: Ray Reconstruction Makes FSR 4 Look Last-Gen

FSR 4 was a substantial improvement to AMD’s upscaling solution. It reduces ghosting and improves finer mesh retention and particle effects. In most cases, it delivers visual quality similar to DLSS 4’s CNN model, but slightly worse than the newer transformer model.

Read Full Story >>
pcoptimizedsettings.com
dveio91d ago

Since FSR is open-source and nvidia's DLSS isn't, I'd personally always prefer FSR.

Frankly, I think all these differences are nice to know (and notice) about if you're playing at DF level. And I totally respect that very small need to max out performance.

But given the prices, I don't think any nvidia GPU advantage justifies paying 1000+ bucks. I don't see any game(s) exclusively (or not) available on PC that offer a fundamentally different and innovative gameplay experience.

Notellin91d ago

There's never a good reason to own any products from Nvidia. They are one of the most destructive and anti-consumer companies that's ever existed.

Anyone buying and using Nvidia is only contributing to the downfall and end of gaming as we know it now.

With the rise of Nvidia, all we've seen is price gouging, while their products continue to become less power efficient and their performance gains are so minuscule you'd need a 100x microscope to notice the AI upscaling. Pathetic really.

Tapani91d ago

Why do you need to pay 1000 bucks for an Nvidia GPU? You can find one that is faster than the PS5 Pro at 400 bucks, the RTX 5060 Ti 16GB, and it has better upscaling, more VRAM, multi-frame generation, and RT.

Gamersunite88091d ago

DLSS will always be better. FSR sucks.

__y2jb91d ago

The examples given look essentially identical.

babadivad91d ago

Exactly. The headline says FSR looks last-gen, implying it's years behind the competition. The article says it's slightly behind.

In the examples shown, the differences are barely discernible.

derek91d ago

I don't know about anyone else, but I've never had 2 screens playing at the same time to know the difference in performance of a given game. It's like those TV screen comparisons; virtually nobody in the real world does this, lol. Performance seems comparable to me. Besides, Nvidia is no longer interested in gaming products; it's full steam ahead with "AI".

Tapani91d ago (Edited 91d ago )

Yeah, but the gaming division is still 8.5% of their global revenue, and they just posted 30% YoY topline growth for the quarter. An $11.35 billion business is absolutely massive, and this will continue to increase. That means there are 11.35bn reasons why they won't stop the gaming business, nor lose their focus on it. It's also their pivot if things don't go well in the AI race. By the end of 2026, they will have DOUBLED the gaming division's business in 5 years.

FY 2025: $11.35 billion (+8.6% YoY)
FY 2024: $10.45 billion (+15.2% YoY, approx.)
FY 2023: $9.07 billion (-7.5% YoY, approx.)
FY 2022: $9.82 billion (approx.) (+49.6% YoY, approx.)
FY 2021: $6.5 billion (approx.) (+61.1% YoY, approx.)
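The YoY percentages quoted above can be rechecked directly from the posted revenue figures. Since the comment flags several numbers as approximate, the recomputed growth rates land close to, but not always exactly on, the quoted percentages:

```python
# Recompute year-over-year growth from the revenue figures posted
# in the comment (billions of dollars; several are approximate).
revenue = {2021: 6.5, 2022: 9.82, 2023: 9.07, 2024: 10.45, 2025: 11.35}

for year in range(2022, 2026):
    growth = (revenue[year] / revenue[year - 1] - 1) * 100
    print(f"FY {year}: {growth:+.1f}% YoY")
```

For example, FY 2025 works out to 11.35 / 10.45 - 1 = +8.6%, matching the quoted figure.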

MrDead91d ago

I've been lucky enough to get a new 5090 build in March, glad I went with Nvidia. Cyberpunk looks amazing.

100°

AMD's RX 9070 XT crushes Nvidia's RTX 5080 in Call of Duty: Black Ops 7 benchmarks - Story Mode

The 9070 XT matches or beats Nvidia's much more expensive 5080 in CoD: BO7 benchmarks. A rare win for AMD. The article also takes a closer look at 9600X vs 9800X3D performance.

Read Full Story >>
storymode.info
wesnytsfs145d ago

No ray tracing might be why.

Runechaz145d ago

Ray tracing is useless in a fps

thecodingart145d ago

Came looking for dumb comments - found them

Zenzuu145d ago

Not every game needs to have ray tracing.

Darkseeker145d ago

I'd even say no games need to have it. It's just a resource hog.

Blad3runner00145d ago (Edited 145d ago )

Why does the article use misleading terms like "crushes" and "the 9070 XT HANDILY BEATS the more expensive RTX 5080"? It even admits as much at the end of the article, yet keeps the terms lol.

Looking at the graph, the difference is only 4-19fps, depending on the settings.

I would hardly call a 4-19fps difference "crushes" or "handily beats", and no one is going to buy a 9070 over a 5080 for COD alone. How does the 9070 fare in other games compared to the 5080?

OpenGL145d ago

I think they exaggerate because people like when a product punches above its weight, especially from an underdog, but yeah it's not a huge difference. There are plenty of games where the 5080 is significantly faster.

wesnytsfs144d ago

That is basically what the 5090 does compared to the 4090. I don't consider that crushing either, and decided to keep my 4090 over getting the 5090 with its small increase in FPS.

OpenGL144d ago

That's a no-brainer; the 5090 is definitely the fastest card on the market, but the 4090 is the second fastest, so it's still extremely powerful.