30°

NVIDIA Developing Subpixel Reconstruction Antialiasing to Compete with MLAA

NVIDIA is developing its own antialiasing (AA) technology to rival morphological antialiasing (MLAA). Subpixel reconstruction antialiasing (SRAA), which NVIDIA is currently researching, aims to provide better image quality with a minimal performance penalty. It "combines single-pixel (1x) shading with subpixel visibility to create antialiased images without increasing the shading cost," as NVIDIA puts it in its research abstract. SRAA is suited to rendering engines that can't use multisample antialiasing (MSAA) because they use deferred shading. For such renderers, SRAA works as a post-processing layer, just like MLAA.
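As a toy illustration of the idea in that abstract — shading once per pixel while resolving geometric visibility at several subpixel samples — here is a minimal sketch. The diagonal-edge scene, the 4-sample pattern, and the flat per-surface colors are all invented for illustration, and with flat colors the reconstruction reduces to simple coverage blending; real SRAA is considerably more sophisticated.

```python
# Toy sketch of shade-at-1x / resolve-at-subpixel AA (not NVIDIA's actual
# SRAA algorithm): geometry visibility is tested 4 times per pixel, but the
# shading function conceptually runs only once per surface per pixel.

def inside(x, y):
    """Invented scene: a single diagonal edge; the object covers x + y < 8."""
    return x + y < 8.0

def shade(covered):
    """1x shading stand-in: flat white for the object, black for background."""
    return 1.0 if covered else 0.0

def aa_pixel(px, py):
    # Subpixel visibility: 4 sample positions inside the pixel footprint.
    offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
    samples = [inside(px + dx, py + dy) for dx, dy in offsets]
    # Reconstruction: each sample takes the shaded color of its surface,
    # so edge pixels land between the two surface colors.
    return sum(shade(s) for s in samples) / len(offsets)

image = [[aa_pixel(x, y) for x in range(8)] for y in range(8)]
```

Edge pixels end up with fractional values (0.25, 0.5, ...) instead of hard 0/1 steps, which is where the antialiasing comes from.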

Read Full Story >>
techpowerup.com
Pandamobile5505d ago

Neat. I wonder what kind of quality difference there is between SRAA and MLAA.

WildArmed5505d ago

At this point I don't even understand MLAA...
What does all this mumbo jumbo do? D:

ATiElite5504d ago (Edited 5504d ago )

MSAA and MLAA are two different types of AA (anti-aliasing), the techniques that eliminate "Zaggies"

MLAA is a way to get rid of "Zaggies" just like MSAA but MLAA does it without hurting FPS as much as MSAA does.

I prefer MSAA because it has edge detection and it just does it better and sharper. MLAA is evolving very nicely, but sometimes objects get blurred because, in my opinion, MLAA just blurs the edges to eliminate Zaggies, while MSAA actually uses edge detection to smooth them out. Now don't get me wrong, MLAA is good, but I notice more blur with MLAA than MSAA. The PS3 uses MLAA, whereas the 360 uses up to 4x MSAA.

The best thing to do is use 4x MSAA and MLAA together for a really good picture, or if your GPU has the horsepower, just use 8x MSAA. AMD HD 6000 cards have MLAA and you can turn it on or off or use it with MSAA.
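The "blurs the edges" behavior described above can be sketched in a few lines. This is a crude stand-in, not real MLAA (which classifies edge shapes and computes coverage-based blend weights from the pattern lengths); the threshold and the 3-tap average are arbitrary choices for illustration.

```python
def postprocess_aa(row, threshold=0.5):
    """Crude MLAA-style pass on one row of grayscale pixels: where neighbors
    differ sharply (an 'edge'), blend them; flat areas are left untouched."""
    out = list(row)
    for i in range(1, len(row) - 1):
        edge = (abs(row[i] - row[i - 1]) > threshold
                or abs(row[i] - row[i + 1]) > threshold)
        if edge:
            # Blending softens the hard step -- this is also why post-process
            # AA can look blurrier than MSAA, which samples the real geometry.
            out[i] = (row[i - 1] + row[i] + row[i + 1]) / 3.0
    return out

aliased = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # a hard "jaggy" step
smoothed = postprocess_aa(aliased)
```

The hard 0-to-1 step becomes a gradual ramp: that's the smoothing, and also the blur.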

Hope this helps

WildArmed5504d ago

+bubs 4 helpful-ness

Thanks, that helps a lot.
I never understood the difference between the types of AA, but that makes a lot more sense.
Now I understand how they used MLAA in GoW3.
hmm, my vid card is just an HD 4600 series. Seems like I'll need to upgrade in the near future to see these techs in practice.

Thanks again for the detailed reply!

Pandamobile5504d ago

It's "jaggy", btw.

Jagged lines.

geodood5504d ago

The PS3's use of MLAA is very different to the MLAA usable with ATI cards, though.

40°

DLSS 3.8 vs 4.0 vs 4.5: Ultra Performance as Good as Native 4K

NVIDIA rolled out the DLSS 4.5 update at CES last week, adding 2nd Gen Transformer-based Super Resolution technology for all RTX GPUs. The performance scaling varies wildly across the older (RTX 20/RTX 30) and newer (RTX 40/RTX 50) GeForce RTX lineups. We tested NVIDIA’s next-gen upscaling solution across Cyberpunk 2077, Black Myth: Wukong, Oblivion Remastered, and KCD 2.

Read Full Story >>
pcoptimizedsettings.com
MrDead44d ago

I've been surprised by this; the difference between 4 and 4.5 is very noticeable. It has almost completely removed that weird dark ghosting that you'd get in foggy games like Silent Hill 2... and Cyberpunk mixed with a high-res texture pack is jaw-dropping in ultra 4K.

Also, if anyone doesn't know, I recommend DLSS Swapper; it allows you to inject the latest DLSS version into older games.
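For anyone curious what a tool like DLSS Swapper does under the hood: it essentially boils down to replacing the game's bundled nvngx_dlss.dll with a newer build (plus version management and a UI). A minimal sketch of that swap with a backup for rollback — the directory layout is whatever your game uses, and this is my simplification, not DLSS Swapper's actual code:

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir, new_dll, name="nvngx_dlss.dll"):
    """Back up the game's bundled DLSS library, then drop in a newer build."""
    target = Path(game_dir) / name
    backup = target.with_name(name + ".bak")
    if target.exists() and not backup.exists():
        shutil.copy2(target, backup)   # keep the original for easy rollback
    shutil.copy2(new_dll, target)      # inject the newer DLSS DLL
    return target
```

Launch the game afterwards and it loads the swapped DLL; restoring the .bak undoes it.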

batiti9342d ago

Totally useless since the NVIDIA app released last year... it does force the latest DLSS through global settings if you ask the app to.

MrDead42d ago

The NVIDIA app doesn't let you choose which versions of DLSS, DLSS Frame Gen, and DLSS Ray Reconstruction to use like DLSS Swapper does.

Goodguy0143d ago

Quite amazing. But this probably means devs will depend on AI even more for their supposed optimizations lol.

Neonridr43d ago

No offense to AMD, but this sort of stuff shows that they are always going to be playing catch-up. I guess Nintendo can take advantage of some of these features.

badz14942d ago

With the Switch 2? NVIDIA can easily lock their proprietary tech to their latest GPUs, and the Switch 2 will be stuck on 3.5 for 5 more years at least

Neonridr42d ago (Edited 42d ago )

4 and 4.5 are available on 20 and 30 series cards right now. The Switch GPU is based on 30 series architecture, meaning it has access to some of those features. Obviously not as many as the higher-end cards, but still some.

TheDreamCorridor42d ago (Edited 42d ago )

"Better than native."

Native 4K in nearly all games nowadays is actually native resolution with forced temporal anti-aliasing.

TAA smears and blurs frames together to soften jagged edges.

Of course DLSS makes games look "better than native" because native alone without any competent AA methods makes games look horrible.
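The "smears and blurs frames together" above is, at its core, an exponential blend of each new frame into a history buffer. A stripped-down sketch — real TAA also reprojects the history with motion vectors and clamps it against the current frame's neighborhood, which this toy omits; the alpha value and sample data are illustrative:

```python
def taa_resolve(frames, alpha=0.5):
    """Blend each new frame into a running history buffer.
    Lower alpha = more smoothing, but also more smear when things move."""
    history = list(frames[0])
    for frame in frames[1:]:
        history = [(1 - alpha) * h + alpha * c for h, c in zip(history, frame)]
    return history

# A pixel on a crawling jagged edge flickers between covered (1.0) and
# uncovered (0.0) frame to frame; the temporal blend settles it to an
# in-between gray -- the edge softens, but detail gets averaged away too.
flickering = [[1.0], [0.0], [1.0], [0.0]]
resolved = taa_resolve(flickering)
```

That averaging-away of detail is exactly the softness people blame "native + TAA" for.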

150°

NVIDIA DLSS 4 vs AMD FSR 4 Compared: Ray Reconstruction Makes FSR 4 Look Last-Gen

FSR 4 was a substantial improvement to AMD’s upscaling solution. It reduces ghosting and improves finer mesh retention and particle effects. In most cases, it delivers similar visual quality to DLSS 4’s CNN model, but slightly worse than the newer transformer model.

Read Full Story >>
pcoptimizedsettings.com
dveio83d ago

Since FSR is open source and NVIDIA's DLSS isn't, I'd personally always prefer FSR.

Frankly, I think all these differences are nice to know about (and notice) if you're playing at DF level. And I totally respect that very small need to max out performance.

But given the prices, I don't think any NVIDIA GPU advantage justifies paying 1000+ bucks. I don't see any game, exclusive to PC or not, that offers a fundamentally different and innovative gameplay experience.

Notellin83d ago

There's never a good reason to own any products from Nvidia. They are one of the most destructive and anti-consumer companies that's ever existed.

Anyone buying and using Nvidia is only contributing to the downfall and end of gaming as we know it now.

With the rise of Nvidia, all we've seen is price gouging, while their products continue to become less power efficient and their performance gains are so minuscule you'd need a 100x microscope to notice the AI upscaling. Pathetic really.

Tapani83d ago

Why do you need to pay 1000 bucks for an NVIDIA GPU? You can find one that is faster than the PS5 Pro at 400 bucks, the RTX 5060 Ti 16GB, and it has better upscaling, more VRAM, multi-frame generation and RT.

Gamersunite88083d ago

DLSS will always be better. FSR sucks.

__y2jb83d ago

The examples given look essentially identical.

babadivad83d ago

Exactly. The headline says FSR looks last-gen, implying it's years behind the competition, while the article says it's slightly behind.

In the examples shown, the differences are barely discernible.

derek83d ago

I don't know about anyone else, but I've never had 2 screens playing at the same time to know the difference in performance of a given game. It's like those TV screen comparisons; virtually nobody in the real world does this, lol. Performance seems comparable to me. Besides, Nvidia is no longer interested in gaming products, it's full steam ahead with "AI".

Tapani83d ago (Edited 83d ago )

Yeah, but the gaming division is still 8.5% of their global revenue, and they just made 30% YoY topline growth per quarter. An 11.35 billion dollar business is absolutely massive, and this will continue to increase. That means there are 11.35bn reasons why they won't stop the gaming business, nor lose their focus on it. It's also their pivot if things do not go as well in the AI race. By the end of 2026, they will have DOUBLED the gaming division business in 5 years.

FY 2025: $11.35 billion (+8.6%)
FY 2024: $10.45 billion (+15.2% approx)
FY 2023: $9.07 billion (-7.5% approx)
FY 2022: $9.82 billion approx (+49.6% approx)
FY 2021: $6.5 billion approx (+61.1% approx)

MrDead83d ago

I've been lucky enough to get a new 5090 build in March, glad I went with Nvidia. Cyberpunk looks amazing.

100°

AMD's RX 9070 XT crushes Nvidia's RTX 5080 in Call of Duty: Black Ops 7 benchmarks - Story Mode

The 9070 XT matches or beats Nvidia's much more expensive 5080 in CoD: BO7 benchmarks. A rare win for AMD. The article also takes a closer look at 9600X vs 9800X3D performance.

Read Full Story >>
storymode.info
wesnytsfs137d ago

No ray tracing might be why.

Runechaz137d ago

Ray tracing is useless in an FPS

thecodingart137d ago

Came looking for dumb comments - found them

Zenzuu137d ago

Not every game needs to have ray tracing.

Darkseeker137d ago

I'd even say no games need to have it. It's just a resource hog.

Blad3runner00137d ago (Edited 137d ago )

Why does the article use misleading terms like "crushes" and "the 9070 XT HANDILY BEATS the more expensive RTX 5080"? It even admits it at the end of the article, yet keeps the terms lol

Looking at the graph, the difference is only 4-19 fps, depending on the settings.

I would hardly call a 4-19 fps difference "crushes" or "handily beats", and no one is going to buy a 9070 over a 5080 for COD alone. How does the 9070 fare in other games compared to the 5080?
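One way to make sense of an absolute fps gap is to turn it into a relative lead, since the same 4-19 fps means different things at different frame rates. The fps numbers below are hypothetical, not taken from the article:

```python
def pct_lead(winner_fps, loser_fps):
    """Relative lead of one card over another, in percent."""
    return (winner_fps / loser_fps - 1) * 100

# The same kind of absolute gap reads differently depending on the baseline:
low = round(pct_lead(64, 60), 1)     # 4 fps gap at ~60 fps   -> ~6.7%
high = round(pct_lead(219, 200), 1)  # 19 fps gap at ~200 fps -> ~9.5%
```

Single-digit percentage leads are real, but a long way from "crushing".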

OpenGL137d ago

I think they exaggerate because people like when a product punches above its weight, especially from an underdog, but yeah it's not a huge difference. There are plenty of games where the 5080 is significantly faster.

wesnytsfs136d ago

That is basically what the 5090 does compared to the 4090. I don't consider it crushing either and decided to keep my 4090 over getting the 5090 with its small increase in FPS.

OpenGL136d ago

That's a no-brainer; the 5090 is definitely the fastest card on the market, but the 4090 is the second fastest, so it's still extremely powerful.