I disagree that LGA1366 is worth it; LGA1156 processors are much cheaper. Yes, they sacrifice a few features, but it's doubtful you'd use those features. An i5 750 will trump an i7 920 in most games and it's a full $100 cheaper (here in Canada at least). It lacks hyperthreading, so if you have any applications that benefit from that, you might lean towards the i7 920. But gaming seems to be at the top of your list, so I doubt it's worth it. Proof:
@Gradient:
Son.. I am phone
Please list the features you are apparently so proud of..
I own 20 of these games; of the ones I don't own, 4 are the Wii games and the 5th is SF4, just not a big fighter fan. Finished 15 of them, too.
If I could replace the Wii titles I'd throw in Fallout 3, Valkyria Chronicles, PixelJunk Shooter, and Borderlands.
It's like you're psychic! I mean, you predicted something some people would say after they've already said it! Wait.. nope, sorry that's not being psychic, that's just being a moron trying to act clever.
Sorry, but this has nothing to do with the game, and everything to do with playing for extended periods of time..
Right, and you have numbers to back up this claim?
You haven't the slightest clue how many YLODs there have been, and that makes your claim complete BS. Even if the amount claimed was only half the actual amount, or a third, or even a quarter, RROD was *far* more widespread, and the proof of that is Microsoft extending their warranty coverage (which cost them quite a bit of money). You don't do that if your console meets or is below expected failure rates.
I'd blame the software in this case, not the hardware.
Both being Jim Sterling reviews.. surprise surprise! Who knew he was a troll?
Yes
I wouldn't compare scores of XBLA Perfect Dark with GoW3. Sure, it got a higher presentation score, but all that means is that it has a 9.0 presentation for a $12 game.
I definitely think GoW deserved a better presentation score, a *much* better presentation score, but it doesn't make sense to compare it to a remake of a classic game.
Uhh, BioShock 1 and 2 don't use Unreal Engine 3.. they use Unreal Engine 2.5.
http://en.wikipedia.org/wik...
http://en.wikipedia.org/wik...
Making up terms like "processing theory" doesn't make you look smart.
Comparing gameplay and visuals in terms of art direction is fine, but in terms of technical features, CryEngine 2 is the leader. It's a no-brainer really; even today modders are still working at making the game look even better, and the mods can still give even the latest and greatest a run for their money. The Killzone 2 engine cannot do this, obviously. That's not to say that Killzone 2 isn't...
I would avoid it just because Activision would stick their dirty little hands in it.
They're also much, much larger maps, and because of that harder to create.
Haha, nobody cares.
PR is always skewed, and nobody takes it seriously.
Doin the same.
It's also the fact that bandwidth restrictions and latency cause real issues with the service. Latency is probably the biggest hurdle: it's not like connecting to a high-latency game where there's lag between players, it's lag between your controller input and what you see on the screen, and that is not fun to experience. This just isn't the right time for OnLive.
If you actually read the article, you'd realize that the next-generation engine is not for the 360 or PS3; it's in anticipation of the next-gen consoles.
Yeah, it was a stupid decision by them. Not to mention that there are games that don't take advantage of Crossfire, and at that low a price point they should stick with a single card. You can get a 5770 for $150 CDN; I'm sure it would be even cheaper in the US.