30°

Immersion Headset Imaginatively Increases Player Input

Sam Matson is an inventor. He’s my kind of inventor, because he also has an eye for design. His website is easy on the eyes and a good argument for the products he cooks up. And those products look pretty functional as well as functionally pretty. His latest invention is a simple, easy, elegant design for a headset called Immersion. The reasons behind the creation of the device are a bit lacking in imagination, but the technology behind it and the design are both awesome.

Read Full Story >>
thegamingheretic.com
40°

Seven Deadly Sins: Origin – Everything We Know About the Upcoming Open-World RPG

Discover everything about Seven Deadly Sins: Origin, including open-world gameplay, story, characters, combat system, graphics, platforms, and what fans can expect at launch.

Read Full Story >>
softwaretestinglead.com
80°

DLSS 5 Isn’t the Future of Graphics; It’s a Filter Over Them

TNS: DLSS 5 promises photorealism, but its AI-driven enhancements risk overriding artistic intent, especially when it comes to character design.

Read Full Story >>
thenerdstash.com
Neonridr7d ago (Edited 7d ago )

The nice thing about DLSS5 is the devs have full creative control over how they implement it. They can mask off assets if they don't want the tech to adjust anything on them, so main characters could be kept as they are intended. If they just want to adjust foliage or lighting, they can do that. That being said, I can't imagine the artists had some specific aesthetic for random NPCs in games like Starfield where this new DLSS makes the characters look that much more lifelike and real.

I can understand people getting upset over Grace and her changes, even though, funnily enough, if you look at the new image it actually looks a lot closer to Grace's real-life model.

I think this whole demonstration was showing what is possible, not necessarily what always will happen. At the end of the day devs can just choose not to use it, and end users don't have to use it either.
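The per-asset masking described above can be illustrated generically. This is a hypothetical sketch (the actual DLSS 5 masking interface is not public; every name here is invented for illustration) showing the idea of keeping the original render wherever a protect mask is set:

```python
# Toy sketch of per-asset masking of AI enhancement. Pixels flagged by
# the protect mask keep the renderer's original output; everything else
# takes the AI-adjusted output. Names and the flat-list frame format
# are illustrative, not any real DLSS API.

def apply_with_mask(original, enhanced, protect_mask):
    """Return the enhanced frame, except where protect_mask is True."""
    return [o if keep else e
            for o, e, keep in zip(original, enhanced, protect_mask)]

original = [0.2, 0.4, 0.6, 0.8]        # renderer output (pixel intensities)
enhanced = [0.3, 0.5, 0.7, 0.9]        # AI-adjusted output
protect  = [True, False, False, True]  # e.g. pixels belonging to a main character
print(apply_with_mask(original, enhanced, protect))  # [0.2, 0.5, 0.7, 0.8]
```

In a real pipeline the mask would presumably come from the renderer's per-object ID buffer rather than a hand-written list.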

thecodingart7d ago

Hi Jensen! Please read the room.

rippermcrip7d ago

This is what I don't understand. Why the hate for Nvidia when they've only supplied a tool? A tool that, when used appropriately, could be amazing. If devs want to be lazy implementing it, that's on them.

Neonridr7d ago

And the end user can decide not to use it if they don't want to.

I personally don't get the hate; this has the potential to be huge in certain situations. People always want to hate on new stuff for some reason. I remember when people called ray tracing a gimmick that nobody would care about, back when the 20-series cards first showed it off.

blacktiger7d ago

That's where you didn't understand the joke. Nvidia is purposely teaming up with entire triple-A studios to create shitty output and then improve it when an Nvidia card is installed.

This is called scaling instead of native rendering. They don't want us to have supercomputers, which are a threat to the elite. These tools are fine, but instead of focusing on real next-gen with extra RAM in video cards,

Nvidia decreases RAM and gives us a garbage filter!

Notellin6d ago

Keep kneeling down to lick their boots. You're doing a great job and the billionaires really need your support.

DarXyde7d ago

Let's set aside the issues with the plastic-looking AI for a moment.

I don't personally have much issue with an intelligent approach to improving performance through creative means - prior iterations of DLSS, PSSR, and FSR seem fine to me. Where I take issue is with the apparent misuse of the added power in terms of immersion and the overt reliance on AI in everything. It's depressing to see AI shoehorned into everything, and Huang is going way too far in this iteration.

I'm old enough to remember that when you bought new components, you got real frames - not this frame generation nonsense. We're at a point where they're charging a premium for fake frames and, because of the very technology used in the RTX 50 cards, the cost to build or buy any computer is out of control. Without the AI features, improvements are modest at best between RTX 40 and RTX 50 cards, albeit similar performance on RTX 50 is more efficient. But the price increase is basically telling you that you shouldn't upgrade unless you're going to use the AI features.

Really, think about that: you're paying a premium for frames that aren't really there and image quality that isn't really there. I don't think that justifies the ridiculous component shortages, made *even worse* when we consider that the helium supply is affected by the war in Iran - helium being pretty important for things like semiconductors.

Now let's actually talk about the demonstration. I saw some typical AI weirdness. Aside from the AI slop-style image it outputs, it made the lighting worse and the color correction was weird. It seems to me that, for AI image reconstruction, all roads eventually lead to your standard AI slop output. I certainly would not take Jensen Huang's word for it when he says it can look different. Development of this technology involves machine learning. It needs to be trained on tons of images. Tell me how this technology works with novel styles or colors/lighting used in novel settings. I believe this is where the strange color correction is coming from.

And I should also add there's kind of an illusion of choice for devs here. They can decide not to use it... but that just makes people curious about what the game would look like if they did. People with DLSS5-capable cards are going to want to take advantage of it, because the level of AI enhancement is often the only real differentiating factor from preceding cards. We already saw that with PS5 Pro, where people kept asking when this game or that game was getting PSSR enhanced. The availability of the technology creates an expectation, and people are going to feel that it is incumbent upon developers to make their purchase worth it. We do it every generation when we complain about cross-generation games because we want our games to take full advantage of the new hardware. This is no different.

darthv727d ago

....we all knew there would be a ceiling hit at some point when it comes to how much more fidelity and how many more frames and how much more resolution there can be before you just simply can't do anymore. With each new iteration of tech, you are only getting refinement, no more real innovation. And that is to be expected ever since we hit the uncanny valley. So you either live in the past and play the older stuff the way it was, or you embrace the present and deal with these directions that newer generations of players have already come to expect. If you are 40 and up, you are NOT the target demographic for these kinds of changes. Kids today WANT the CGI of the movies in their games. They have no appreciation for the classic animation and pixel art of what old timers grew up with.

Christopher6d ago

***If you are 40 and up, you are NOT the target demographic for these kinds of changes. Kids today WANT the CGI of the movies in their games. They have no appreciation for the classic animation and pixel art of what old timers grew up with. ***

You mean the demographics that are mostly playing Minecraft, Fortnite, Marvel Rivals, and the like?

Extermin8or3_6d ago

"I'm old enough to remember when you bought new hardware you got new frames, not this frame generation." Ok grandpa. I'm sure your Turing machine was mighty impressive back in the day, but could you have said something that made you sound like less of a tit?

Also, as things stand, the component shortages aren't because of Iran. They are still the knock-on effect of things that happened during COVID and the massive expansion of hardware needed for AI data centres.

On your point about frames not really being there, let me put this into some context. It's using probability to predict where stuff will be in order to draw additional frames. This doesn't mean the frames aren't real. In fact, it's not that different from how quantum mechanics works. If you have a particle in a vacuum, you can know its position or its velocity, but not both. The more accurately you know one, the less accurately you know the other. As soon as you measure one, the other becomes less accurate. Yes, this applies to every particle in your body and, in fact, the universe, and interactions between particles are considered the same as a measurement. Now, if I get a ball and throw it, I can still tell you where the whole ball, as a collection of particles, is and what velocity it is travelling at.

Anyway, back to the quantum mechanics side of things: that means I can work out, for a given velocity or position, the probability that the other value will be y. This is known as the probability distribution. If I want to know the precise details about a given set of particles, I have to use this probability distribution to work them out. You can still use this to model a system and work out what particles are doing or will do. You are no more or less real just because the particles that comprise your body are only a probability distribution until a measurement is taken.

Let me give you another example: a computer generated an image of a location. Is that real? Why? Can you go there, physically? No....

DarXyde5d ago

darthv72,

A few points:

"....we all knew there would be a ceiling hit at some point when it comes to how much more fidelity and how many more frames and how much more resolution there can be before you just simply can't do anymore."

Correct, but doesn't that say a lot about consumer culture? If we can't do any more, why give us the same thing, but less reliably? Frame generation and DLSS introduce problems and are imperfect technologies, and I think demanding this price to test and train their models is madness. Really think about it: we're losing jobs, our environment, and affordable PCs for this? Really?

"With each new iteration of tech, you are only getting refinement, no more real innovation."

I don't agree. Technology is still rapidly advancing, and people are using AI in genuinely interesting ways outside of gaming: a hospital in South Korea is currently pilot testing a wearable patch that sends notifications to the wearer's phone when their blood sugar gets low. This is well beyond "refinement" and absolutely passes muster as innovative. Within the gaming space, instead of focusing on outright visual enhancement, there are other approaches worth considering when immersion takes priority: bringing the sense of smell into games? Temperature? Eventually, as we learn more about the human brain, VR where your thoughts control your character? Even if we just think about what's possible today, why not use the technology to go the extra mile and make path tracing a more standard thing?

"So you either live in the past and play the older stuff the way it was, or you embrace the present and deal with these directions that newer generations of players have already come to expect."

I think this is incredibly reductive and myopic. You resign yourself to what is, and don't care about what should be. Ever hear of voting with your wallet? The new generation of players' expectations doesn't make change positive - for example, I *expect* games to be sold incomplete with some egregious monetization mechanism. Doesn't mean we have to accept a damn thing, and you'd do well to remember that.

"If you are 40 and up, you are NOT the target demographic for these kinds of changes. Kids today WANT the CGI of the movies in their games. They have no appreciation for the classic animation and pixel art of what old timers grew up with. "

You don't think you're overgeneralizing with this comment? I know people in their 50s who only care about graphics, and plenty of younger folks love Minecraft. How many 40+ people do you think carried sales for Terraria, Stardew Valley, Undertale, or Shovel Knight?

I think they do have an appreciation. These are new IPs, by the way. It's not like they re-released something the old heads played back in the day that was carried by nostalgia (think Virtual Console titles).

DarXyde4d ago

Extermin8or3_,

"Ok grandpa. I'm sure your Turing machine was mighty impressive back in the day, but could you have said something that made you sound like less of a tit?"

Way to misunderstand the comment. The "I'm old enough" remark is clearly tongue in cheek, because it was, what, just a few years ago? But thanks for the useless remark. It was a gentle reminder that the average bulb isn't terribly bright (since we're throwing shit like children).

"On your point about frames not really being there, let me put this into some context. It's using probability to predict where stuff will be in order to draw additional frames. This doesn't mean the frames aren't real."

So what you're saying is... it's adding frames that weren't there before, by predicting how the next frame appears, then creating them? Sounds like they're making it up with educated guessing, which is what I'm saying. They're not real frames. It's generating frames that weren't there, and we already know from AI upscaling that prediction reduces overhead but is less reliable: that's why we get weird results when using this technology.

"Also as things stand the component shortages aren't because of Iran."

FFS, *read* what I wrote, please. I very clearly stated that the **helium supply** exacerbated an existing issue. Reread my comment: I acknowledge there is a component shortage, *made worse* by the war. To the extent that I do reference component shortages, I correctly point out that AI is causing a lot of the issues here. And yes, the war does have some hand in it: https://www.cnbc.com/2026/0...

"Let me give you another example: a computer generated an image of a location. Is that real? Why? Can you go there, physically? No...."

Weird example and I don't even understand your argument here. If a computer generates an image of a real location, then yes, you can go there. If I generate an image of the Eiffel Tower or Angkor Wat, of course you can go there.

If we're talking about some place fictional like Pandora from Avatar, no, but we clearly agree to suspend our disbelief and treat it as a "location". In reality, it's not a real place.

You're missing the point about the frames. Previously, a card would simply output the frames it could based on actual power. If nothing else, there was certainty that you were paying for real performance. Now, you're paying a premium for the illusion of performance. Do you see my point? It's a marked increase in cost for smoke and mirrors that might botch the output just because of an erroneous prediction.
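The "educated guessing" being debated here can be pictured as predicting an in-between frame from two rendered neighbours. Below is a toy sketch, with a linear blend standing in for the motion-vector-driven neural network real frame generation uses; all names are illustrative, not any actual API:

```python
# Toy sketch of frame generation: a generated frame is predicted from
# two actually rendered frames rather than rendered itself. Real
# implementations use motion vectors and a trained network; this toy
# version just blends pixel values linearly.

def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Predict an in-between frame at time t (0 = prev, 1 = next).

    Frames are flat lists of pixel intensities in [0, 1].
    """
    return [(1 - t) * p + t * n for p, n in zip(prev_frame, next_frame)]

rendered_a = [0.0, 0.25, 0.5]  # frame N (actually rendered)
rendered_b = [0.5, 0.75, 1.0]  # frame N+1 (actually rendered)
generated = interpolate_frame(rendered_a, rendered_b)  # frame in between
print(generated)  # [0.25, 0.5, 0.75]
```

A misprediction in the real, learned version of this step is exactly what produces the artifacts people complain about.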

Popsicle7d ago (Edited 7d ago )

I think in most cases the character models look better. On the other hand, in most cases the lighting looks worse: brighter and cleaner, but less natural.

That said, the broader issue I have with this tech is that it will likely be used, eventually, to replace most artist jobs in the industry. The C-suite thought process being: why pay people to create when AI can artificially generate an image? The gaming industry has been shedding jobs dramatically over the last few years, and this will only accelerate the process. All said, I think the pushback is less about the image that's being produced and more about concern that AI is taking away people's ability to earn a living. DLSS 5 and tech like it will eventually move us to a world that is less human, and therefore less than optimal for humanity itself.

Extermin8or3_6d ago

It's unlikely it will be used to replace artists. Nvidia were very clear that the higher the quality of the art assets going in, the better the output. I think this could be seen best with some of the environmental stuff in games like Assassin's Creed.

TOTSUKO7d ago

I'm pretty positive these devs/publishers were given these tools and even used them for this demo. NVIDIA can't just tamper with the art style, but it's funny how the developers are super silent about it and just letting NVIDIA take the blowback lol

Profchaos6d ago

Funny you say that, as Insider Gaming reports that developers at Capcom actually said they had no idea this was happening to RE9 and didn't approve it.

Notellin6d ago (Edited 6d ago )

What do you know, another N4G bootlicker. If these are the kinds of people left in this community, it has fallen to its lowest depths.

Goodguy017d ago

Good for old hardware... but it shouldn't be the standard for more recent hardware.

Profchaos6d ago

I don't even think it will be standard until Nvidia produces a 60 series; right now it requires two 5090s and clearly has a lot of issues.

neutralgamer19927d ago

DLSS5 made Starfield characters come alive and look much better. I don't understand all this hate for a feature that's optional to use. I guess nowadays people want to complain about everything.

DustMan7d ago

You can't just keep building cards with more and more memory and higher boost clocks before you need to keep the GPU in a refrigerator. Law of diminishing returns. DLSS/FSR was a game changer for owners of mid-range and lower-end GPUs, and frame generation has been a game changer for mid/low-range cards. Each was described as a crutch at one point, and still is by some.

The tech is just getting better and better, and I'm excited to see what will be accomplished.

130°

Nvidia's new DLSS 5 Brings Photo-Realistic Lighting To RTX 50-Series

DF writes: "A massive technological leap for graphics - and we've been hands-on with four games."

Read Full Story >>
digitalfoundry.net
-Foxtrot9d ago

Why does it look so AI-like?

Grace looks like a completely different character

Profchaos8d ago

Because it is. They say it's just AI lighting, but it's clearly more: it's overhauling character models, and not for the better.

I'm going to give this a miss.

ApexLanding238d ago

Because they are using machine learning, and "machine learning is a specific subset of AI that focuses on training algorithms to learn patterns from data and improve over time."

abstractel8d ago

Yeah, I don't like that it changes the way characters look either (why is it adding makeup?), but some things, like the hair, look a lot better. It feels like it was an artistic choice to keep her makeup pretty dialed back; if it were my game, I wouldn't want the GPU effing with it.

(Machine learning and neural networks combined are "AI", just not as catchy.)

RaidenBlack9d ago

Remember to stop the updates at ver. 4.5...

Profchaos8d ago

Oh boy, DF is getting destroyed over this opinion. I understand why, but people need to not threaten or harass these guys.

I don't agree with their opinion, but they don't deserve threats.

glenn19798d ago

Every game that uses it will look the same with that AI filter.

Grilla8d ago

And every hot chick will be the same hot chick. Grace's face looks like every other fake hot chick on Instagram.

Andrew3368d ago

They showed other games... and it did not look the same.
