Um... they're the same CPU, basically. Comparing clock speeds is completely valid.
1) Old benchmarks (run on 1.6 GHz XB1 devkits) of specialized applications that multi-thread well, and that can potentially use GPGPU (unlike entire game engine loops).
2) The XB1 OS likely reserves 2 cores, whereas the PS4 OS likely reserves 1, which directly affects these numbers. Games, on the other hand, tend to weigh down just a couple of threads for most of the frame, and thus would be more affected by clock rate.
Give us a link about a game on a post-1.75 GHz upgrade machine, rather than posting lin...
It's actually only 2 cores (1 per module) when it clocks up to 2.75 GHz. It's a heat dissipation problem, not some other sort of limit.
But yeah, we've known that the base clock was 1.6 GHz for a long while. It's only the uber fanboys that this upsets.
It's a notebook CPU. Pretty big difference between that and a tablet CPU, my friend.
Dante, both of those benchmarks were done on XB1 devkits when the clock was still 1.6 GHz, and you'll notice the benches both show a 7/6 ratio -- because those benchmarks are easily multithreaded applications (i.e., specialized tasks), and the OS supposedly takes an extra core on the XB1, leaving 6 usable cores versus the PS4's 7.
Since the XB1's CPU has been kicked up a notch after those benchmarks, most games, which have been demonstrated many times in the PC space to not use more than 2-4 logical...
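To put some back-of-the-envelope math behind the core-count vs. clock-rate point (this is my own idealized model, not the actual benchmark methodology or official OS reservation numbers):

```python
# Idealized sketch: an embarrassingly parallel benchmark scales with
# usable_cores * clock, while a game bound on 1-2 heavy threads
# scales mostly with clock alone. Core reservations are assumed
# (XB1 OS takes 2 of 8 cores, PS4 OS takes 1), not confirmed specs.

def parallel_score(usable_cores, clock_ghz):
    """Idealized score for a workload that scales across all cores."""
    return usable_cores * clock_ghz

def game_score(clock_ghz):
    """Idealized score for a workload bound by one heavy thread."""
    return clock_ghz

ps4 = parallel_score(usable_cores=7, clock_ghz=1.6)
xb1_devkit = parallel_score(usable_cores=6, clock_ghz=1.6)  # pre-upgrade
print(ps4 / xb1_devkit)       # 7/6 ~ 1.17, the benchmark ratio

# After the clock bump, a thread-bound game tracks clock instead:
print(game_score(1.75) / game_score(1.6))  # ~1.09 in XB1's favor
```

Under that (admittedly crude) model, the old 7/6 benchmark gap and the post-upgrade clock advantage aren't contradictory at all -- they're just measuring different bottlenecks.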
I highly recommend this game/game-maker. If you are on the fence about getting an XB1 (or Windows 8.1, if you are done playing all your old XP games), this is the reason.
The Kinect capture on the XB1 is pretty amazing, considering it's just a depth camera, and you don't have to wear a body tracking suit or anything crazy like that. I love how you can use and edit your creations on both platforms as well -- I have both, and it's pretty awesome to be able to edit on PC (or ...
I can't believe the number of disagrees you have. New Sony fans hating on the awesomeness of old Sony, and trying to pretend this is "normal". Disgraceful.
It is NOT normal to be letting go of such talent, when the PS4 is doing so well.
You're making the assumption that he stepped down, when the PS blog clearly states that he and Sony couldn't agree to "continue their relationship". That basically means they wanted somebody else running the show, and he was let go.
He didn't jump off, he was pushed off.
PS1-PS3 were Jack's legacy, and they were amazing.
PS4 is nice hardware. We'll see what the "new" SCE has in store for us. They're already charging us to play online, and are making controllers with low-grade rubber covers on the sticks, and tiny batteries, after all.
This engine is OpenGL-based, and was also exclusive to PS3 and PC last Gen.
Despite being available on the PS3 since 2006, there has never been a PS3 game that uses it.
This isn't news.
Oh... give us examples?
50 game devs is a bigger loss to gaming than 1000 retail store employees, my friend. Those 50 devs were creating a product that the retail industry was going to sell.
There are a lot of retail opportunities for those 1000 clerks and managers. I'm not saying that laying those folks off was "ok", but the dev layoff has a MUCH bigger impact on gaming, which is why N4G would be upset.
He made a lot of mistakes though (admittedly he's only a journalist).
The worst part is that he takes the word of an obviously unknowledgeable "professional" as gospel truth. The blind leading the blind, as it were.
Replying to myself, because edit got disabled:
Next:
"I can’t detail the Wii U GPU but remember it’s a GPGPU. So you are lifted from most limits you had on previous consoles. I think that if you have problems making a great looking game on Wii U then it’s not a problem of the hardware."
Lifted from most limits on previous consoles, because it's a GPGPU? Wow. Just wow. GPGPUs are not so magical, my friend. If they were, we wouldn't nee...
The author mentions "Artificial Intelligence" as something the eDRAM on the GPU can speed up. That makes NO sense.
He also quotes another (unknowledgeable) guy stating that bandwidth only matters for a GPU that makes "scattered reads" around memory, which is bad for performance in the first place. Nope. That'd be true IF you had a small number of pipelines doing different, simple operations, sure. And the number of professional game devs w...
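For what it's worth, the reason scattered reads are considered "bad" isn't that bandwidth stops mattering -- it's that they waste it. A toy cache-line model (illustrative numbers of my own choosing, not any console's actual specs) shows the effect:

```python
# Toy cache-line model: a sequential pass uses every byte of each
# fetched cache line, while scattered 4-byte reads still pull whole
# 64-byte lines from memory. Numbers are typical, not hardware specs.

CACHE_LINE = 64   # bytes per cache line (a common size)
ELEM = 4          # bytes actually needed per read

def bytes_fetched(n_reads, pattern):
    """Bytes pulled from memory for n_reads under a given access pattern."""
    if pattern == "sequential":
        # consecutive elements share cache lines, so only useful bytes move
        return n_reads * ELEM
    # "scattered": pessimistically assume each read lands on a distinct line
    return n_reads * CACHE_LINE

n = 1_000_000
seq = bytes_fetched(n, "sequential")
scat = bytes_fetched(n, "scattered")
print(scat / seq)  # 16x more memory traffic for the same useful data
```

So a scattered workload chews through bandwidth *faster*, not slower -- which is exactly the opposite of the quoted claim.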
Sony may believe that they are in better financial shape by letting 3rd parties produce software on their console, and have it be the highest rez console version of those games, etc.
First party studios are expensive, and their primary purpose is to sell the hardware, not rake in a huge profit (although, that's nice). Since the PS4 sells itself, it makes sense that Sony would dispose of some internal projects that probably aren't going to be big money makers or produ...
The Wii U is never going to match the X1 or PS4, and so its mild advantage over the PS3 and X360 is basically immaterial.
All the bandwidth in the world won't cure the lack of ROPs and shaders that are at the heart of the issue.
I think that, to call a game studio "lazy", you have to really understand what making a game is all about in the first place.
I'm gonna go with you not knowing these details, and therefore having no authority to call devs "lazy".
If the game isn't demanding, why is it 720p on the 360/PS3... you know, like CoD:G on those platforms?
GraveLord,
Most of the actual game industry greats you can pass by at places like PAX without even realizing it, because you never see their faces on N4G.
I don't know this guy either, but he's probably more important than most of the knuckleheads who yammer on to the public about whatever they think will net them some good advertising. If he's an engineer (and it seems like he was a senior graphics engineer at Epic, from reading the other commen...