Flower says hello.
No.
If Sony forced it, it would take up valuable memory. Giving developers the choice is a good idea; they can allocate memory more efficiently.
It works in a number of games. (The ones that actually need it.)
Something like that. I do remember that the majority of them were dead once MC arrived... or something.
People do know that Master Chief (John, at that time, I believe) was on Reach when it fell, right? He space jumped from a ship into the atmosphere. I mean, I'm no Halo fan, but I remember something of the lore.
I'm glad they put MAG there. I'm still playing the game a ton, and given Zipper's promises I don't foresee that changing.
I'll forgo online play for non-MMOs (read: non-server-based games), then. No big loss to me, as I enjoy single-player experiences the most.
Hmm. This makes me want to make a female character and be an absolute dickweed to everyone, just to add to his project thesis. (I haven't been on Home in a couple months.)
Er... What about the Nazi zombies of the first one?
I think the game should go to the Middle East for the next installment, though I absolutely loved the Tibetan/Nepali setting. There's so much lost architecture in the M.E. Besides, it would be amazing to go through Assyrian, Babylonian, and Sumerian architecture; that world still holds so many secrets. A side trip to Cairo and Egypt's Valley of the Kings would be welcome. ;)
Now the question is: should a money-conscious person get the cheaper Calamity Trigger, or chip in the extra money for this version?
o.O
Thumbnail reported.
Games of 2010.
Exactly...
Haha, epic as KB always is.
They're practically identical except for the increased emphasis on stasis/kinesis and healing/items. It'll just make it easier. You shouldn't have any problems. :)
The SPUs of the Cell are indeed capable of executing algorithms that would otherwise be calculated on the GPU, like physics (and a myriad of other things), using minimal RAM, if any; that frees up memory on the GPU for other graphics-intensive processes. That said, it's still not a replacement for GPUs or RAM. So neither of you is right or wrong.
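For anyone curious what that offloading actually looks like, here's a minimal host-side sketch using IBM's libspe2 (the Cell SDK's SPE runtime library) to hand a physics kernel to one SPU. The SPU binary name ("physics_spu") and the argument struct are made up for illustration; the SPU side would be built separately with spu-gcc and would DMA the particle data into its 256 KB local store.

/* Minimal PPU-side sketch: run a physics kernel on one SPE via libspe2.
 * "physics_spu" and physics_args_t are assumptions for illustration. */
#include <stdio.h>
#include <libspe2.h>

typedef struct {
    unsigned long long particles_ea; /* main-memory address of particle array */
    unsigned int       count;        /* number of particles to integrate */
    float              dt;           /* timestep in seconds */
} physics_args_t __attribute__((aligned(16)));

int main(void)
{
    physics_args_t args = { 0 };  /* filled in by the engine in practice */
    unsigned int entry = SPE_DEFAULT_ENTRY;

    /* Load the separately built SPU program image. */
    spe_program_handle_t *prog = spe_image_open("physics_spu");
    if (!prog) { perror("spe_image_open"); return 1; }

    /* Create an SPE context and bind the program to it. */
    spe_context_ptr_t ctx = spe_context_create(0, NULL);
    if (!ctx || spe_program_load(ctx, prog) != 0) {
        fprintf(stderr, "failed to set up SPE context\n");
        return 1;
    }

    /* Blocks until the SPU kernel stops; real engines dedicate one
     * PPU thread per SPE context so several SPUs crunch in parallel. */
    if (spe_context_run(ctx, &entry, 0, &args, NULL, NULL) < 0)
        perror("spe_context_run");

    spe_context_destroy(ctx);
    spe_image_close(prog);
    return 0;
}

The point of the sketch is just that the work stays entirely on the CPU side of the machine and never touches the GPU's memory, which is where the freed-up video RAM comes from.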