@Steve30x: That's because generally when a game comes out, you have to wait for a driver update to add a good SLI profile for it. Saying it's not worth it just because some games don't work with it is a pointless argument, since you can always switch to single-GPU mode for those rare games. Not going with SLI at all would just force you to always be running a single GPU... lol
And SLI in many cases can get into the 90% performance increase range. Nvidia has d...
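To put rough numbers on what "90% scaling" actually means (hypothetical figures, just for illustration):

```python
# Hypothetical example of what 90% SLI scaling works out to.
single_gpu_fps = 60          # assumed single-card framerate
scaling_efficiency = 0.90    # near-best-case SLI profile
dual_gpu_fps = single_gpu_fps * (1 + scaling_efficiency)
print(f"{dual_gpu_fps:.0f} fps")  # 114 fps, vs 120 for perfect 100% scaling
```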
He is serious, and it is true. The sub-surface scattering in their demo is bad. Crysis 1 even had sub-surface scattering on faces that looked better.
Most people aren't very familiar with high-end graphics techniques like SSS. But for people like me who work with production-level SSS in CG, I can easily say their implementation looks horrible compared to properly done SSS.
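For anyone curious what the cheap version looks like under the hood, here's a minimal toy sketch (my own example, not anyone's actual shader) of the "wrap lighting" trick games often use to fake SSS, next to plain Lambert shading:

```python
import math

def lambert(n_dot_l):
    # Standard Lambertian diffuse: light cuts off hard at the terminator.
    return max(n_dot_l, 0.0)

def wrap_sss(n_dot_l, wrap=0.5):
    # Cheap "wrap lighting" trick: let light wrap past the terminator,
    # softening it the way light bleeding through skin does. Real
    # production SSS simulates light diffusion properly instead.
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

for deg in range(0, 181, 30):
    n_dot_l = math.cos(math.radians(deg))
    print(f"{deg:3d} deg  lambert={lambert(n_dot_l):.2f}  wrap={wrap_sss(n_dot_l):.2f}")
```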
OnLive hasn't created a feature that people paid for and then taken it away. So I would be surprised.
Also, technically Crysis 2 isn't DX11 yet, sadly =/
I'm perfectly aware of that, but it doesn't change what I said: they couldn't do that even if they wanted to. The 4xxx series is fairly different from the 6xxx; not a completely different architecture like Nvidia's 4xx Fermi arch, but still enough of one that they can't do that. They have to stick with their early choice of the 4xxx series.
Even a year before launch is too late to make major system changes like that. They would have already decided on a GPU at least a year ago and have to stick with it for architecture and development's sake.
I have fond memories of this game. It's one of the best RPGs of its time and still is. I definitely recommend people download a ROM and play it. (You can't buy it anymore unless you find a copy in a pawn shop or on eBay, and you're not supporting the developer either way, so it makes no difference.)
It's almost a game genre in its own right; it's not a typical FF-style RPG.
Most people can easily see beyond 1080p with a mere 22" monitor.
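A quick back-of-envelope calculation (my numbers, assuming a standard 16:9 panel) shows why:

```python
import math
# Pixel density of a 22" 1080p monitor (rough illustration).
w_px, h_px, diag_in = 1920, 1080, 22
ppi = math.hypot(w_px, h_px) / diag_in   # pixels per inch along the diagonal
print(f"{ppi:.0f} PPI")  # ~100 PPI: coarse enough that individual pixels
                         # are still resolvable at normal desk viewing distance
```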
No, Nintendo's innovation these days is all about making stuff that's entertaining to the AVERAGE CONSUMER, not to gamers like you who think anything that isn't "hardcore" gaming isn't fun.
@jony_dols: And the next-gen consoles will likely be using current-gen GPUs, which won't be anywhere near as powerful as the ones coming in the next few years. Transistor density scales roughly with the inverse square of the feature size, so each shrink in fabrication size is a huge jump. We are getting to the point now where each new node brings an immense increase in transistor count. Later this year 28nm GPUs are coming out, then 16nm hopefully by 2013, and 11nm by 2015.
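Rough math on that (my own baseline of 40nm for the then-current node, using the node sizes above):

```python
# Transistor density scales roughly with the inverse square of the
# feature size, so each node shrink is a big jump (rough illustration).
nodes_nm = [40, 28, 16, 11]
base = nodes_nm[0]
for nm in nodes_nm:
    gain = (base / nm) ** 2
    print(f"{nm}nm: ~{gain:.1f}x the transistor density of {base}nm")
```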
@SuperSaiyan4:
"My comments were out of jest considering the many responses by the half witted small minded PC gamer userbase that like to insult with inapropriate manner towards consoles."
Right... Cause the console fanboys on here never resort to insults and immaturity...
It's actually quite the opposite. The PC gamer average age group tends to be a fair bit higher than the console age group. You get more immature kids on her...
Waste what money? The discs stay the same. The 360 just gets updated so it can read the new protection scheme that takes up less space.
And Bill Gates started and runs one of the world's biggest charities. Microsoft donates a lot to it. But it's unreasonable to expect everyone to just give ALL their money to these causes.
Because it's become synonymous with the word save, since it was the original well-known format to save to. Pretty much everyone knows what a floppy is, except kids born in 1996 and later. They just know that that icon means save, and that's it XD
No need to change what ain't broke. It's a nice stylized icon for the word save.
@CaliGamer:
The whole article is satirical; it's not meant to be taken seriously. And you're acting like a hypocrite. Who is the one here that wrote a 3-paragraph rebuttal to a single sentence in a satirical story? You. So if anyone here is a basement nerd with no life, it would be you.
Nobody has been saying it doesn't have lag. Only fanboys of other platforms have been imagining others saying that.
This guy must be blind. The actual graphics aren't that good. And using 3D concept art for the game as one of the bases for that argument is stupid, as it has no connection to the capabilities of the game engine.
The id Tech 5 engine does procedural terrain a lot more nicely. This game is merely unique in its gameplay, not visually amazing.
Yeah, the current textures aren't very high-res. The DX11 patch should bring higher-res textures, along with displacement maps for a higher level of detail in the geometry.
The website's speculation that it could be a GTX 590 seems unlikely, since Tom states in the video that it's the "next generation" GPU, and when Nvidia or ATI say that, it pretty much always means a new architecture or refresh. Plus, he said they have been working on it for over 2 years, which they wouldn't do for just a dual-GPU version of their top card. A typical new GPU architecture is worked on for a few years, with the first 1-2 being design, and then another ...
You're an idiot who obviously hasn't researched anything about Microsoft and just latched onto the belief that it's a big evil corporation. And you clearly know little about Bill Gates too. Success does not equal greed and evil. Although in a capitalist society like the USA, that's often what it takes to become rich.
"You don't actually believe he donated that money just out of pure philanthropy, do you?"
No, we don't believe it, we kn...
@ATiElite: The 720 with a Tegra? Lol? Tegra is an ultra-compact SoC meant for mobile platforms. Even the Tegras two generations from now will just be reaching the power of the current consoles. They won't be anywhere near powerful enough for a next-gen console.