Edit: EA confirms the common architecture has resulted in them being much further ahead at launch http://m.n4g.com/news/14064...
I agree with what's being said about the learning process potentially being much shorter this time around. At the end of the day, it's up to the developers to come up with engines that efficiently max out the hardware.
That said, devs have likely started much furth...
I'm pretty neutral, but every time I hear about the "Cloud" it makes me want to puke. We've been over this before- there's very little you can do in real time over a marginal-at-best internet connection, even if they gave EVERY gamer their own dedicated cloud server (rough math below). For f*ck's sake, just give it a rest with this cloud bullshit- the best thing about the cloud is that it makes for great PR, which has apparently fooled plenty of people. SMH
Ok, rant over.
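To put rough numbers on the latency point- the ping and server-time figures below are assumptions for the sake of illustration, not measurements:

```python
# Rough check of whether per-frame work can be offloaded to a remote server.
# All numbers below are illustrative assumptions, not measurements.

frame_rate = 60                       # target frames per second
frame_budget_ms = 1000 / frame_rate   # ~16.7 ms to produce each frame

round_trip_ms = 40   # optimistic home-broadband ping to a nearby data center
server_work_ms = 5   # time the server spends computing the offloaded result

total_ms = round_trip_ms + server_work_ms
frames_late = total_ms / frame_budget_ms

print(f"Frame budget: {frame_budget_ms:.1f} ms")
print(f"Cloud round trip + compute: {total_ms} ms (~{frames_late:.1f} frames behind)")
# Anything the server sends back shows up roughly 3 frames after it was asked for,
# so it can only help with work that tolerates that delay (background AI, turn-based
# logic), not per-frame rendering.
```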
You know what makes this so funny? The HDMI standard was designed such that any decent transceiver can drive a clean signal through even the cheapest cable on a short run. The cable in the article is only 6.6' long.
If you go to a big-box store to buy a short HDMI cable and they give you some BS about distortion, etc., call them on it, because it's just that- complete bullshit. I have three six-foot cables in my system, and they cost me a total of $18...
DICE, don't kick yourselves. You've invented one of the best lighting engines we've ever seen in gaming. You make games that are FUN above all, and that's what we want. BF4 is pretty much the ultimate balance in fun and realism so far this generation.
With that much power, the NSA will be hitting you up to help them scan text messages if you buy it. Kidding aside, in a couple of years one of these will end up serving movies and music when it gets too old to play the latest games in 4K. That'd be a sad day for whoever spent the cost of 10 good gaming PCs on that one box.
Sorry bro, I agree.
Prologue is fine with me- sooner rather than later. MGS is a big reason I've bought every new console in recent years- it's one of the most epic stories ever created in a video game IMO.
Agree- Bad Company 2 was one of the best Battlefield games I've ever played, and it's on the PS3. I liked it even more than BF3 on the PS3. Crappy resolution, no AA and a poor framerate didn't stop it from being awesome.
@Volkama
I agree- visibility has more to do with the draw distance the game supports (if you have the option to set it), the AA and anisotropic filtering on far-off textures, fog, particle effects, etc. Resolution isn't the huge factor the article makes it out to be. I know people who voluntarily play at a lower resolution because it makes everything look bigger and easier to see, as long as AA is on. Sometimes I even do it because for certain games with fog and super deta...
I *wish* I could say the future involved a brand new engine. But I think I'll be disappointed.
@allformats
Gotta agree. Given that (in the US) most ISPs are also cable providers, I just can't be sad for them. They make you buy a ton of what you don't want just to get what you do want.
Can't speak for the console versions yet, but the PC version is awesome, no problems for me at all. Definitely not 65/100.
Edit: just read the account from the person who is actually going through the process- apparently they just put a hold on your card for the amount. That's fair, and it's not the same as them holding your cash to make a profit. I'd do it.
I can understand both sides of this- Microsoft covering themselves, etc. versus customer service expectations. But here's another example of how this could have been handled:
My DSL modem which I bought from m...
Happens every generation.
@Hellsvacancy
Fair enough- I shouldn't push my own ethics on y'all. But surely you're not going to dispute that it's super wasteful to blow sh*t up for no purpose other than making a spectacle. Just sayin'
@CaulkSlap
Even if Sony had gone with half DDR3 and half GDDR5 (like 4GB of each), or 2GB GDDR5 and 6GB DDR3, they'd still have a GPU that's 40-50% more powerful on raw compute alone. The second config I mentioned would be similar to what gaming desktops are running- and thanks to the efficiency gains you get coding to the metal on a console, I'm betting they'd have still ended up with a 1080p-capable machine, and it would still be more capable than Microsoft's ma...
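Quick sanity check on that percentage using the publicly reported GPU specs (18 CUs at 800MHz vs 12 CUs at 853MHz)- treat the exact figures as assumptions:

```python
# Rough GPU compute comparison from publicly reported specs; treat as assumptions.

def gcn_gflops(compute_units, clock_ghz, lanes=64, ops_per_clock=2):
    # Each GCN compute unit has 64 shader lanes doing a fused multiply-add
    # (counted as 2 ops) per clock.
    return compute_units * lanes * ops_per_clock * clock_ghz

ps4 = gcn_gflops(18, 0.800)   # ~1843 GFLOPS
xb1 = gcn_gflops(12, 0.853)   # ~1310 GFLOPS
print(f"PS4 ~{ps4:.0f} GFLOPS vs XB1 ~{xb1:.0f} GFLOPS "
      f"(~{(ps4 / xb1 - 1) * 100:.0f}% advantage)")
# Note the memory setup doesn't change this gap at all- it's baked into the GPU.
```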
@fr0sty
The ESRAM isn't really worthless per se; it's just inadequate for deferred rendering on a 1080p frame buffer, as has been shown in several articles (although it's fine for 720p), and like you said, it's a stopgap measure. Seems like MS banked on the same thing happening with devs as last gen- that they would be happy to adapt to using the ESRAM because the competition's console would be harder to use. In hindsight that was a really poor gamble- they...
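For anyone who wants the back-of-the-envelope math on why a 1080p deferred setup is a squeeze in 32MB of ESRAM- the G-buffer layout below (4 color targets plus depth, 4 bytes each) is a typical hypothetical, not any specific engine's format:

```python
# Rough G-buffer size check against the Xbox One's 32 MB of ESRAM.
# The layout (4 color render targets + a depth buffer, 4 bytes each) is a
# hypothetical example of a common deferred-rendering setup.

def gbuffer_mb(width, height, render_targets=4, bytes_per_pixel=4, depth_bytes=4):
    per_pixel = render_targets * bytes_per_pixel + depth_bytes
    return width * height * per_pixel / (1024 ** 2)

esram_mb = 32
for w, h in [(1280, 720), (1920, 1080)]:
    size = gbuffer_mb(w, h)
    verdict = "fits" if size <= esram_mb else "does NOT fit"
    print(f"{w}x{h}: ~{size:.1f} MB -> {verdict} in {esram_mb} MB of ESRAM")
# Prints roughly: 1280x720 ~17.6 MB (fits), 1920x1080 ~39.6 MB (does not fit)
```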
Something tells me the ESRAM isn't the only bottleneck here. Even if Microsoft's APU were hooked up to GDDR5 with unified access to that memory, the narrower GPU (fewer compute units and ROPs) would still come up short on shader throughput and fillrate. And new games will only increase the geometry, texture filtering and lighting workloads over time. Maybe it's just me, but I think even when the ESRAM is utilized to its best case (NOT the theoretical max MS is so quick to toss around) developers will s...
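Same kind of rough math for fillrate, using the publicly reported ROP counts and clocks (32 ROPs at 800MHz vs 16 at 853MHz)- again, treat the figures as assumptions:

```python
# Rough pixel fillrate comparison from publicly reported ROP counts and clocks.

def fillrate_gpix_per_s(rops, clock_ghz):
    # Each ROP can write roughly one pixel per clock in the best case.
    return rops * clock_ghz

ps4 = fillrate_gpix_per_s(32, 0.800)   # ~25.6 Gpixels/s
xb1 = fillrate_gpix_per_s(16, 0.853)   # ~13.6 Gpixels/s
print(f"PS4 ~{ps4:.1f} Gpix/s vs XB1 ~{xb1:.1f} Gpix/s")
# A 1080p60 target with overdraw, transparency and post-processing eats into
# that budget quickly, no matter how well the ESRAM is used.
```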
Filipino typhoon victims could make MUCH better use of that $900 he spent on the consoles, unless they were donated to him. You guys should donate too, come to think of it.
You know, I actually approve of MS making these kinds of comments instead of trying to tell us they're "sold out at every retailer".