Considering that they showed us some good games at E3, I feel pretty good about MS right now.
Wii U at 84K compared to XB1's 34K, and encroaching on the PS4's 121K? I think not. I see those Nintendo fanboys still run that site.
Last month's NPD showed the PS4 near 200K in just the US, and the Wii U only barely over 50K, according to the GAF leak. There is no way the Wii U sold 84K in just one week, even if it's worldwide. This is pure NDF guesswork.
Women make for a whole extra animation set and mo-cap. We're talking hundreds of animations, hundreds of megs of memory (if the anims aren't compressed, to reduce CPU time), and mo-cap money probably in the $100-250K+ range, depending on the number of animations -- which I'm sure is really high for AC. It probably adds a few months of work to a typical animation team, as well. Now that you've got female players, you're gonna need female VO, high-def female character models (players a...
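To put rough numbers on the memory claim, here's a back-of-envelope sketch. Every figure in it (bone count, clip count, clip length, sample rate) is invented for illustration -- none of it is AC's real data:

```python
# Rough back-of-envelope for uncompressed animation memory.
# All figures are hypothetical illustrations, not real AC numbers.

BONES = 60              # skeleton bone count
FLOATS_PER_BONE = 10    # position (3) + quaternion rotation (4) + scale (3)
BYTES_PER_FLOAT = 4
FPS = 30                # sampled keyframes per second

def anim_bytes(seconds):
    """Uncompressed size of one animation clip, in bytes."""
    return int(seconds * FPS * BONES * FLOATS_PER_BONE * BYTES_PER_FLOAT)

# 500 clips averaging 3 seconds each:
total_mb = 500 * anim_bytes(3.0) / (1024 * 1024)
print(f"~{total_mb:.0f} MB")
```

Even with these modest made-up numbers you land around 100 MB for one character's animation set, which is why the "100s of megs" figure for an uncompressed second set isn't crazy.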
"Give me a story and a path and let me enjoy the experience that has been created for me."
You're describing a movie. And yes, movies are completely linear.
Games are not, by definition. The removal of linear aspects, the aspect of choice -- that's the "game" part. I swear I'm not trying to be rude by stating this obvious fact -- my real concern is that I have to wonder why you're ignoring it in your post.
The only part I dislike about Morpheus is that Sony will likely keep it proprietary to the PS4, rather than making PC drivers for it, when the inevitable DirectVR standard happens.
It's deeply ironic that MS will probably define the future of VR in this manner, and that the best VR headset for the PC, or maybe even the XB1, may very well be Sony's.
He can't hold a candle to Jack.
TBH, he seemed very much Sony's representative there on stage, though. Very... company president, as it were. I don't usually think much of such folks, but maybe he'll be good for Sony's biz... maybe.
Presidents like Jack are one in a million. He's the one who made SCEA great.
I actually thought "wow, this is a total rail shooter" when I saw the "gameplay" trailer. I was hoping for more. I mean, there was one enemy to shoot at in the whole trailer... and it didn't even die.
I can live with it though... just not $60. Maybe $30.
@ eezo, yes they are assumptions, but they're kinda bad ones, given public knowledge of MS' recent strategy.
Phil is very cool with his appreciation of his direct competitors. Very different from the faces of the Xbox division of the past, who practically refused to acknowledge that Sony and Nintendo even existed. I like him. He makes MS easy to support, and I actually expect some cool new games from MS due to Phil's focus on gamers.
Also, challenging Pachter over his guess that MS will mostly show the XB1 as a multimedia box seems right. MS has been all about "games, ...
Wow, just wow.
Go to Amazon, look up the "GPU Pro" series of textbooks, and check out the author's name.
Then look at the name of the guy this article is quoting, and the list of titles he's worked on.
Do you honestly believe he doesn't know his stuff?
Why are people arguing about theoretical vs practical bandwidth? Both the 204 (XB1) and 176 (PS4) numbers are theoretical, and there are a large number of other factors that come into play, like the number of CUs actually able to utilize this bandwidth (PS4 obviously has more), the amount of bandwidth available after the CPU cores consume part of the bus (a bigger problem with GDDR5, which is a hidden PS4 problem), and moving data to and from the ESRAM, if the dev even decides that they want...
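To show how those factors eat into the paper numbers, here's some illustrative arithmetic. The 176 and 204 figures are the public peak specs; the efficiency and CPU-share fractions are made-up assumptions purely for the sake of the example:

```python
# Illustrative only: how a theoretical peak bandwidth shrinks in practice.
# Peak numbers are the public specs; the efficiency and CPU-share
# fractions below are invented assumptions, not measured values.

def effective_bw(peak_gbps, bus_efficiency, cpu_share):
    """GPU-visible bandwidth after real-world bus efficiency and CPU traffic."""
    return peak_gbps * bus_efficiency * (1.0 - cpu_share)

# PS4 GDDR5: shared bus, so the CPU cores take a cut of it.
ps4_gddr5 = effective_bw(176.0, bus_efficiency=0.75, cpu_share=0.15)

# XB1 ESRAM: GPU-only pool, so no CPU share, but you still pay efficiency.
xb1_esram = effective_bw(204.0, bus_efficiency=0.70, cpu_share=0.0)

print(f"PS4 GDDR5 ~{ps4_gddr5:.0f} GB/s, XB1 ESRAM ~{xb1_esram:.0f} GB/s")
```

The point isn't these exact outputs (the fractions are guesses) -- it's that neither console ever sees its theoretical number, so arguing 204 vs 176 in isolation is meaningless.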
There are no improvements in the hardware, folks. This is the Global Foundries version of the PS4. AMD has moved manufacturing of the PS4 APU there. The result is indistinguishable in practice, but a new chip source still requires a new approval.
It may have a 750GB HDD as well, but that'll be it for upgrades.
Normally I'm the one calling out nVidia on their PC gaming elitism, but this is a load of AMD BS.
Any truly nVidia-specific optimization code paths would only execute if an nVidia card was detected. If this code actually harmed AMD performance, you can believe the game devs would bypass it when an AMD architecture was detected at runtime. If AMD offered similar optimization routines, devs could get the same benefits on AMD hardware.
This article is pure AMD PR bullhooey. nVid...
Umm.. yeah. Not really a problem.
2 GPUs will increase fill and pixel power, but will NOT increase triangle processing muscle... which would be the primary interest of someone using a 3d modeling app.
He's better off with a Titan than with SLI.
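A toy model shows why. Assume a split-frame setup where pixel work divides across GPUs but each frame's geometry doesn't; every throughput number here is invented for illustration, not a real hardware spec:

```python
# Toy model of the claim above: extra GPUs scale pixel work, but
# geometry processing per frame does not scale with them. All rates
# are arbitrary illustrative numbers, not real hardware specs.

def frame_time_ms(tris, pixels, tri_rate, fill_rate, gpus=1):
    """Per-frame cost: geometry is single-GPU, pixel work splits across GPUs."""
    geometry = tris / tri_rate               # unaffected by extra GPUs
    shading = pixels / (fill_rate * gpus)    # scales with GPU count
    return geometry + shading

# A geometry-heavy modeling workload: 10M tris, only 2M shaded pixels.
single = frame_time_ms(10e6, 2e6, tri_rate=1e6, fill_rate=1e6)
dual = frame_time_ms(10e6, 2e6, tri_rate=1e6, fill_rate=1e6, gpus=2)
print(single, dual)
```

With these made-up numbers the second GPU shaves roughly 8% off the frame time, because nearly all the cost is triangle processing -- exactly the part SLI doesn't help.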
On XBL they call those "demos" and they are free.
Seriously... Who would pay to rent a game for 24 hours rather than playing the demo and then choosing to buy if they like?
I've marathoned a couple of games in my life, but renting seems worthless to me.
It will come to PS4 next year, as slated.
Just because it's running on devkits doesn't mean it's ready to go, nor does it mean it's ready to go in the legal sense.
Porting takes months. Sequels and contracts can take years.
I guess it's pointless to state that he's right about the frame rate on CPU-bound games (open-world games, mostly).
ND specifically called out the PS4 CPU as the problem with The Last of Us port, and the XB1 CPU is stronger... although I would argue that the Cell still beats it when used by good engineers.
I'm a sucker for these kinds of games. I will buy it even if it is mediocre. I thought Knight's Contract was great, for example -- most people hated it, because yeah, some of the boss fights had some bad insta-kill-wtf-moments, and forced you to restart.
Demon's/Dark Souls have some flaws too. I can live with a few flaws, if the basic gameplay and story are fun.
There is no PS4 shortage.
If you look at the numbers of PS4s sold in the past month, relative to the few months before that, you can see that the only way they could be "supply constrained" would be by actually manufacturing fewer units.
They sold out of the initial production run of PS4s well before the end of the holiday season in 2013, so the "they had a stockpile then" excuse doesn't fly. Likewise it would make NO sense for them t...