Cross-platform development with PC is easier on the XB1. I would not be surprised to see devs favor the XB1 as a dev console for that reason, and because XB1 dev kits will be $500 (i.e. retail consoles can serve as dev kits), instead of $2500 like the PS4 kits. I have to wonder why the PS4 kits are so expensive. I know Sony loans some out, but.. come on. $2500 each? Is that *really* supporting indies? Is it?
Sony should be more concerned about this, IMO. But there's really nothing they can do about...
...because no game is CPU bound, and GDDR5 is used on PCs for its uber responsiveness to CPU memory requests.
/s
The above post seems like intentional sarcasm -- I'm not really addressing it, but it pokes fun at a humorous concept.
People love to look at the past, and pretend the future will derive directly from it.
The end of the semiconductor era is upon us, and people believe this industry will continue on as normal. LoL.
In all likelihood, the PS4 will beat the XB1 in year one, thanks to early adopters being the same people who care about minor har...
If this shared memory space is accessible by the CPU cores, as well as the GPU, that's a pretty freaking huge deal.
...and, no one seems to have noticed the last slide -- the one that states there is a custom audio chip. That basically means the XB1 can remove audio processing from the CPU cores, making more CPU available for gaming. Another huge deal.
n4rc is right, with regards to the pre-order ratio.
The PS4 will only outsell the XB1 in the beginning, while the hardcore crowd (the kind who read N4G) jumps onboard. After that, it will likely mirror this generation over the next couple of years, unless something serious changes.
Sony's new pay-to-play-online policy may swing sales toward MS in some regions, since free online play used to be a major selling point. Likewise, if one platform's e...
Throw us some numbers that actually back up the 10x claim.
FLOPS isn't the issue here, if that's what you're referring to. The texel and pixel fill rates are the primary hangups when you're bumping resolution, and the PS4 GPU is nowhere near 10x what the RSX was. It's about 5x, like I stated. Only high-end PC GPUs, or multi-GPU midrange setups, can hit 1080p at 60 fps in most AAA titles.
The PS4 GPU is nice, but you'd have to roll bac...
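For a rough sense of that ~5x figure: plugging in the commonly cited clock/ROP/TMU numbers for the RSX and the PS4 GPU (treat these specs as assumptions, not official figures), the fill-rate ratios land in the 4-6x range:

```python
# Back-of-the-envelope fill-rate comparison, using commonly cited specs
# (these numbers are assumptions for illustration, not official figures).
rsx = {"mhz": 550, "rops": 8,  "tmus": 24}   # PS3 RSX
ps4 = {"mhz": 800, "rops": 32, "tmus": 72}   # PS4 GPU (18 CUs x 4 TMUs)

def pixel_rate(gpu):
    """Theoretical pixel fill rate in Gpixels/s (clock * ROPs)."""
    return gpu["mhz"] * gpu["rops"] / 1000

def texel_rate(gpu):
    """Theoretical texel fill rate in Gtexels/s (clock * TMUs)."""
    return gpu["mhz"] * gpu["tmus"] / 1000

print(pixel_rate(ps4) / pixel_rate(rsx))  # ~5.8x
print(texel_rate(ps4) / texel_rate(rsx))  # ~4.4x
```

Nowhere near 10x on either axis, which is the point of the post above.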
The GPUs in the PS4 and XB1 have roughly 4-5x the pixel/texel throughput of their predecessors.
Since vertex work is not typically a major factor:
1080p @ 30 fps requires about 2.25x as much GPU power as 720p @ 30 fps, with NO other advancements...
720p @ 60 fps requires 2x as much GPU power, and 2x as much CPU power, with NO other enhancements...
**Drum roll**
1080p @ 60 fps requires 4.5x the GPU power, and 2x as much CPU power, wi...
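Those multipliers are just pixels-per-second ratios against a 720p @ 30 fps baseline. A quick sketch in Python (the baseline and the assumption that cost is purely fill-bound are mine):

```python
# GPU-cost scaling with resolution and framerate, assuming cost is
# dominated by pixel/texel fill (vertex work ignored, as in the post above).
def gpu_scale(w1, h1, fps1, w2, h2, fps2):
    """Relative GPU fill cost of mode 2 vs mode 1: ratio of pixels/second."""
    return (w2 * h2 * fps2) / (w1 * h1 * fps1)

base = (1280, 720, 30)  # assumed baseline: 720p @ 30 fps

print(gpu_scale(*base, 1920, 1080, 30))  # 1080p @ 30 -> 2.25
print(gpu_scale(*base, 1280, 720, 60))   # 720p @ 60  -> 2.0
print(gpu_scale(*base, 1920, 1080, 60))  # 1080p @ 60 -> 4.5
```

The 2x CPU figure for 60 fps doesn't fall out of this formula; it's the separate assumption that game logic runs once per frame.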
I would actually be MORE surprised if the PS4 supported hUMA than if the XB1 did. The original Kaveri/Jaguar design supports hUMA with DDR3 on laptops, after all -- it comes for free with the basic design.
Sony would have to pay extra for the GDDR5 unified interface design.. maybe they did, I dunno. But in any case, it's almost certain that the XB1 has it. The PS4, with its GPU-oriented memory setup, is the one that deviates from standard AMD laptop setups, and really seems like ...
How many people actually believe that the same mass-produced CPU supports two different memory access models? Note that the laptop version ONLY supports hUMA. So who really believes that MS would willingly pay extra (that's what it would take) to have the XB1 version not support it, just so they could lower memory performance and raise the cost of the chip? Hahahaha..
Agree means: Voter thinks both consoles support hUMA, because yeah, they are practically the same, and we alread...
All Kaveri/Jaguar architectures support hUMA. Changing the memory access model would require a serious redesign, and would make for a custom chip -- which would cost more money to make for a lesser performance design.
The original source was referring to the unified architecture of the 360 not having it. That's not a Jaguar, or even an AMD design (hUMA is an AMD thing), so that makes sense.
Both architectures have hUMA. Anyone who understands hardware at all will realize that this must be the case. They aren't going to redesign the Kaveri architecture to NOT be hUMA, specifically for the XB1 -- that would cost them MORE money, not save it.
The original comparison was between the PS4 and the XBox 360, anyway -- and yeah, the 360 does not use hUMA, while still having a unified architecture.
Some ignorant journalist decided that meant the XB1...
The characters also need to be from the same playset, and the 3 that come with it are all from different sets.
You can't play co-op without buying another toy, and even then you can only play co-op in the world where you now have 2 toys.
I doubt these are for PS4. Japan has a load of PS3s, and they need to sell a bunch to make a profit.
Some recent polls in Japan showed that people weren't really interested in the PS4 (or XB1) yet... so I doubt Namco would be dumb enough to bank on that console when there are a zillion PS3s in Japan that want to play more Tales. TBH, Tales games are pretty, in that cartoony Anime way, but I doubt the extra GPU muscle of the PS4 will benefit them all that much. Japanese...
Wow, a livestream is something more games should do during development!
I'll definitely be watching this one, if I can find the time.
The money man will not go away. I have to LoL at anyone who seriously thinks that will ever be the case.
The distribution model may change, but, even today only a small % of a project's overall budget can come from things like Kickstarter. They typically use that money to get the ball rolling, and to *pitch* their game to a publisher. Then, assuming the publisher likes it, they have to spend 20x as much to actually make it.
@blackOo,
You sound pretty naive. Do you realize that the actual performance of prototype hardware (8 core Jaguar APUs aren't even on the market) varies drastically until AFTER launch?
Case in point, Sony designed the PSP to run at 333 MHz... and underclocked it to 222 MHz for the first couple of years of its existence, because the hardware wasn't reliable and power-conservative enough to run at 333.
Another example, the Cell processor ...
I applaud Phil Fish, for demonstrating so brilliantly what being a self-important jerk can really be like.
Good work, Phil. Fez.. was not really that great, yet Phil acts like he created platforming, projected puzzling (Echochrome 1&2, just to name a couple), and the use of orthographic projection.
Don't even get me started on Jonathan Blow. Braid *sucks*, IMO. It's not really much _fun_? Kinda the point, I thought. Maybe if you have no other...
I have to agree. The number of one-hit (or zero-hit) wonder indie devs talking trash has been making me ill lately.
For the most part, they got lucky. They're not the brilliant game industry masterminds they seem to believe themselves to be. For every one of these smash-hit wonder games, there are 20 FAR superior indie or small-dev games that never got the chance to make it big.
I didn't pick at your selection of play style, I merely stated that, if you are a dedicated sniper, then yeah, that was probably the sole role that wasn't very fun. And yeah, 13% is weak.. again, sorry, but that's the truth.
CoD is for snipers. Sorry, but it's true. As far as I'm concerned, sniper was the one boring role in MAG. The field was littered with cover, and getting headshots in those huge battlefields was fairly hard when you had 32-128 enemy scop...
You realize that XB1 development is 99% the same as Windows 8 development, right?
Sony's APIs are unique to their platforms, and they're not even the same from platform to platform (although they have been getting closer since the Vita)...
If you don't understand how the XB1 being near-identical to W8 is a huge boon for any cross-platform game, well... I can't help you. Do you have ANY idea what a tidal wave of development apps there are for W...