Look at all of that die space WASTED by having the ESRAM; if they had just gone with GDDR5, they could have used all of that space for a better GPU.
If Microsoft were designing the Xbox today, they would go with the same design as PS4: GDDR5 and a better GPU.
"And the PS4 is a cut-down and de-clocked HD7870."
You say that like it's a bad thing?
Cut-down and de-clocked HD7790
vs
Cut-down and de-clocked HD7870
I know which I'd rather have in a games machine.
The GPU is what is significantly better, which means all future games will be similar: they will run fine on Xbone, just at a lower resolution (720P), while the extra GPU power of PS4 will be used for 1080P.
MS had to sacrifice GPU power to make room for the ESRAM, which again is only there because of Microsoft's choice to go with slower DDR3.
As an aside, this confirms that the Xbox GPU is a cut-down Radeon 7790:
http://www.amd.com/uk/produ...
They are there for redundancy: not every chip comes out of fabrication perfect, so they intentionally overdesign them.
If you make a chip with only 12 CUs and 1 is faulty, the chip is no good because you only have 11 working CUs.
If you make a chip with 14 CUs and 2 are faulty, the chip is still usable because you only need 12 working CUs.
In the PC space, the fully enabled GPUs usually cost a lot more than the cut-down ones.
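The redundancy argument above can be sketched as a quick yield calculation. The 5% per-CU defect rate here is a made-up number purely for illustration, not a real fab figure:

```python
from math import comb

# Assumed (hypothetical) chance that any single CU is defect-free.
p_good = 0.95

# Design with exactly 12 CUs: all 12 must work.
yield_12_of_12 = p_good ** 12

# Design with 14 CUs where only 12 need to work:
# P(at least 12 of 14 good) = sum of binomial terms for k = 12..14.
yield_12_of_14 = sum(
    comb(14, k) * p_good**k * (1 - p_good) ** (14 - k)
    for k in range(12, 15)
)

print(f"12 of 12 required: {yield_12_of_12:.1%}")  # ~54%
print(f"12 of 14 required: {yield_12_of_14:.1%}")  # ~97%
```

Even with two spare CUs the usable-chip rate jumps dramatically, which is exactly why consoles ship with units disabled.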
Let's hope it's not due to those last-minute overclocks they did.
Xbox
$110 CPU/GPU/ESRAM
$60 DDR3
Total = $170
PS4
$100 CPU/GPU
$88 GDDR5
Total = $188
Wow, so Sony are only paying $18 more for a far superior GPU/memory subsystem.
I bet Microsoft are kicking themselves for committing to DDR3 and the ESRAM it requires. Imagine if they had gone with GDDR5 like Sony and (without any need for ESRAM) included the same class of GPU as PS4, the...
I like how they call the UK a small country; we are probably Xbox's biggest market after the US.
It seems like Microsoft just don't give a **** about anywhere except the US.
Well if you're happy paying more for a system which is 'balanced' towards 720P... I'd rather buy the cheaper system 'balanced' towards 1080P.
16 ROPs vs 32 ROPs is the main reason why Xbox One is limited to 720P. Now can we please stop with the "better ESRAM use will make Xbone on par with PS4" bullshit? It's just clutching at illogical straws.
"In order to accommodate the eSRAM on die Microsoft not only had to move to a 12 CU GPU configuration, but it’s also only down to 16 ROPs (half of that of the PS4). The ROPs (render outputs/raster operations pipes) are responsible for final pixel output, and a...
It's not Sony's responsibility to police how users decide to use the Twitch 'service', it's like saying all Smart TV's etc should pull Youtube support because a few morons upload dodgy videos.
Well said.
Although 900P with AA may be beyond Xbox One if you want 'next gen' effects. Xbox One has half the pixel rendering output of PS4, and 720P has under half the pixels of 1080P (roughly 44%).
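The pixel counts behind that claim are easy to check:

```python
# Raw pixel counts per frame at each common resolution.
pixels_720p = 1280 * 720     # 921,600
pixels_900p = 1600 * 900     # 1,440,000
pixels_1080p = 1920 * 1080   # 2,073,600

# 720P renders under half the pixels of 1080P.
print(pixels_720p / pixels_1080p)   # ~0.44
# 900P is roughly 70% of 1080P, so it still needs notably more fill rate.
print(pixels_900p / pixels_1080p)   # ~0.69
```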
Anandtech told you all you needed to know in their hardware review:
"In order to accommodate the eSRAM on die Microsoft not only had to move to a 12 CU GPU configuration, but it’s also only down to 16 ROPs (half of that of the PS4). The ROPs (render outputs/raster operations pipes) are responsible for final pixel output, and at the resolutions these consoles are targeting having 16 ROPs definitely puts the Xbox One as the odd man out in comparison to PC GPUs. Typically A...
I would have thought that in a console things like 'turbo' were bad; consoles should be a fixed specification, and dynamically changing frequencies is going to cause problems for developers.
They are treating people like fools...
Even if the cloud was the greatest supercomputer cluster the world has ever seen, Xbox One consoles would not be able to leverage any of it for better graphics.
Your average internet connection = 10 Mbit/s
DDR3 = 68GB/s = 557056 Mbit/s
As you can see, if the DDR3 in Xbox One is 'slow' and has to be helped by ESRAM, then there is really no hope for the 'power of the cloud'.
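The numbers above can be verified in a couple of lines. This assumes the 68 GB/s figure is being converted with GB = 1024 MB, which is how the 557,056 Mbit/s value is derived:

```python
# Typical home internet connection, in Mbit/s.
internet_mbit = 10

# Xbox One DDR3 bandwidth: 68 GB/s -> Mbit/s (1 GB = 1024 MB, 1 byte = 8 bits).
ddr3_mbit = 68 * 1024 * 8

print(ddr3_mbit)                  # 557056
print(ddr3_mbit / internet_mbit)  # local RAM is ~55,000x faster than the pipe
```

With a gap of four to five orders of magnitude, streaming anything latency-sensitive like frame data from 'the cloud' is a non-starter.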
Xbox One is a much cheaper console than PS4, or at least it would be if Microsoft removed Kinect.
http://www.gamesindustry.bi...
Much cheaper RAM and a much cheaper GPU than PS4; it's positioned halfway between Wii U and PS4 in all but price.
It will have been severely rushed to meet the launch deadline, and they probably plan to rob customers by selling the missing portions of the game as DLC.
Review copies will likely include custom code to identify the source of leaks; I doubt they would send out standard retail copies to reviewers weeks ahead of release.
If this is the case, then why are AMD not supporting the majority of non-GCN Radeon cards out there?
What happens if Xbone is as unreliable as 360? Will people be forced to buy multiple Kinect devices?