oh boy...
How is this limp story getting so much attention? A Twitter "fight"?
Oh boy. It is very much your loss buddy.
A Hardcore character on Diablo 3 PC, due to lag.
You see consoles as a lot less "hassle", I see them as extremely limited.
There are plenty of games that aren't dumbed down. Shadowrun Dragonfall, Dark Souls and Path of Exile are good examples of complicated and challenging games that reward smart playing and punish sloppiness.
I agree that The Witcher 3 will probably have lower minimum requirements, but that's because CDPR knows how to build a game on PC. They're also smart enough to know their limits and won't allow greed to overcome their artistic vision by trying to downscale and port the game to last-gen consoles. On the other hand, the Shadow of Mordor devs are having to build a game that will run on Xbox 360, PS3, PS4, X1 and PC. Clearly (and sadly) the PC is not going to be the lead platform.
Well the xbox was always going to use DirectX. What has surprised me a bit is that MS has been using Nvidia GPUs instead of AMD GPUs whenever they do a showing be it for Xbox or PC. Even last year's E3, when they were caught using PCs with GTX 780s for the X1 demos. Maybe there's some behind the scenes drama between AMD and MS because Sony's APU turned out much stronger? That's just the UFO hunter in me talking.
@BlackTar
Actually if you look at the tags, it clearly says PC. Also this is a gaming site, so I don't think that me mentioning PC as a capable gaming platform is out of place.
Although I do commend you for your policing of the comment section. I'm sure that whenever diehard fanboys invade exclusive games articles with generally nasty and immature comments, you will be the first in line to put them in their place.
Hmm. Smart kid. Good on him.
Well, the PS4 is OK to own. A mid- to high-end gaming PC is the real dream to own. Imagine one platform where running games at 1080p and 60 FPS is the norm rather than the exception. A platform that allows you to run Reaper of Souls and Titanfall and Civ 5 and Company of Heroes 2 and Persona 4 through PCSX2 at a full 1080p, and Super Metroid just for old times' sake. That is the dream system to own, indeed.
For anyone who has seen the actual game, Star Citizen it is not. Now developers are doing the coy "hopefully it'll run at so and so" when it's clear that it will. This is not exactly a powerhouse of a title. It'd be a lot more interesting if they could render the game at 1440p and then downsample it to 1080p. That alone would blast away aliasing and give the whole game a very crisp-looking aesthetic.
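(For anyone curious why rendering high and scaling down kills aliasing: each output pixel becomes the average of several rendered samples, which smooths hard edges. Here's a minimal, purely illustrative sketch of that idea using an integer scale factor and a simple box filter; real 1440p-to-1080p scaling is a fractional 1.5x ratio and uses fancier filters, and all names here are made up for the example.)

```python
def downsample(image, factor):
    """Average factor x factor blocks of a 2D grayscale 'render'.

    image: list of rows of pixel values; dimensions must be
    divisible by factor. This is a plain box filter, the simplest
    form of supersampling anti-aliasing.
    """
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            # collect every sample in the factor x factor block
            block = [image[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black/white edge that straddles a block boundary gets
# softened into an intermediate value instead of a jagged step:
hires = [[0, 255, 255, 255]] * 4
print(downsample(hires, 2))  # [[127.5, 255.0], [127.5, 255.0]]
```

Same principle behind downsampling modes on PC: the edge information from the extra samples survives as smooth gradients in the final 1080p frame.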
Welcome to PC gaming, where you'll never run out of great games to play.
Oh boy. I can't wait until actual games start looking like this. Now that engines like Unreal Engine 4 and CryEngine are becoming so accessible to independent developers, I believe that we're in for a big boost in gaming quality overall in the coming years. Good stuff indeed.
...the pic made me think for a second that there was a new Fight Night game coming. Color me disappointed blue.
@Majin
Finally? Poor Penello has debunked this nonsense many times before. The onus is on whoever believes this silly claim.
I'm pretty sure that it's a bit more complicated than that. If a studio like Bungie decides to just "not show up for work", you can be sure that they would get hit with a massive breach-of-contract lawsuit, especially in America. But never mind that.
I understand the reason as to why Bungie would've wanted to leave, and I'm 100% glad that they did. What I don't understand is why Microsoft didn't do everything in their power to keep them 1st ...
I'm still baffled as to how Microsoft let Bungie leave. I remember when the rumor was floating around and everyone thought "No way Microsoft can be that stupid". Turns out, they really could. To think that there was a time when MS had studios like BioWare, Bungie and Ensemble as 1st party developers.
And to think that in 2 or 3 years, this beast of a GPU will probably be matched in performance by a mid-tier $200 video card.
My PC is ready. I might even have to turn on the second GPU in it. OOOooohhhhh...