"We're now left with a GPU difference greater than previously though. To be more specific, 1.84 v. 1.18. Launch games won't even begin to show this, but it will become very noticeable in the near future."
You're making the big, BIG assumption that the GPU is the bottleneck on these 1.6/1.75 GHz mobile CPU systems. It's entirely possible that MS's better CPU/GPU balance will yield better overall perf in some (not all, but a lot of) titles, becaus...
Same process size... what do people expect?
It's not like nVidia did any different.
These 28nm chips are basically nothing more than the output of a more refined 28nm process, one that yields more chips that operate reliably at higher temperatures. There may be some really minor tweaks, but we can't expect nVidia or AMD to produce any truly phenomenal upgrades until they can put their designs affordably on 20nm hardware.
Keep in mind that the European tax system is radically different from the US system.
That $90 comes out of a paycheck that hasn't been taxed the way a US paycheck has, and it includes sales tax as well. US citizens pay a good ~25-30% of that $90 in income tax before it ever hits their bank account, leaving about $66, and then they go and pay about 9% in sales tax at the register... end result:
$60.
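To put rough numbers on that (a back-of-the-envelope sketch; the exact rates are assumptions drawn from the ballpark figures above):

```python
# Back-of-the-envelope: how much gross pay buys a $60 console in the US?
# Both tax rates are illustrative assumptions, not exact figures.
INCOME_TAX = 0.27   # assumed effective US income tax (the ~25-30% above)
SALES_TAX = 0.09    # assumed US sales tax (the ~9% above)

sticker_price = 60.00                            # US pre-tax sticker price
at_register = sticker_price * (1 + SALES_TAX)    # ~$65.40 out the door
gross_needed = at_register / (1 - INCOME_TAX)    # ~$89.59 in gross pay

print(f"Gross pay needed: ${gross_needed:.2f}")  # lands right around $90
```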
It's no different. Don't be ignorant...
Cloud computing actually can improve graphics, simply by freeing the CPU from tasks like multiplayer server computations and AI computations (the output can come from a datastream, and latency/lag isn't really an issue, since AI pathing decisions typically don't need to happen within a single frame), etc.
With more CPU muscle to spare, the CPU can take some fairly big tasks away from the GPU, like mesh skinning, physics, etc. -- and the GPU then has more free tim...
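A minimal sketch of that hand-off pattern (the compute_path function is hypothetical, standing in for cloud-side or spare-core work):

```python
import concurrent.futures

# AI pathing results don't have to land in the frame that requested them,
# so the work can be handed off and consumed whenever it arrives.
pool = concurrent.futures.ThreadPoolExecutor(max_workers=2)

def compute_path(start, goal):
    # Stand-in for pathfinding computed off the main thread / off-box.
    return [start, goal]  # trivial placeholder list of waypoints

pending = pool.submit(compute_path, (0, 0), (10, 5))

for frame in range(120):          # render loop keeps running regardless
    # ...render this frame using the AI's previous path in the meantime...
    if pending.done():
        path = pending.result()   # apply the new path whenever it shows up
        break
```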
@vulcanproject,
I was merely pondering why you brought dual GPUs into a discussion of single GPUs.
Your comment kinda sounds like it came from a hyper-defensive nVidia fanboy. We (PC gamers who build our own rigs, or care about tech specs -- the primary demographic for this kind of article) all know that AMD routinely brings good GPUs to market, so your seemingly biased criticism really isn't justified.
Also... the article you linked has benchmarks...
Why are you bringing up dual GPUs?
You know there are already dual GPU setups which best the Titan and probably this one, right?
Now it all makes sense. You can't have a console be too cheap without somebody suffering. They want to sell 5M PS4s by March, too... that's $60 x 5M == $300M lost by March. Ouch.
I have some faith that early adopters will buy PS+... but then again, they probably already have it, don't they? Throwing launch-title profits at this seems like a bad idea, since those titles will only be significantly profitable for about 3 months...
Is the cost t...
The PS4 is easier to develop for than the PS3, yes. It's not easier than the XB1, however, which shares the same APIs with Windows 8. That API sharing is a huge, huge deal, even with the multithreaded job issues of the SPUs now out of the picture.
Be careful when you read "easier (than the PS3)" and take it to imply "easier (than the XB1)"... because that implication is untrue.
I can think of at least four serious problems with your desire for "non-gimped" PS4 editions of games.
1) Games share a codebase, except for platform-specific stuff like the rendering engine or online service interfaces. They ALSO share assets -- i.e. models, textures, and everything in between. It would be foolhardy for a developer to spend extra money on ONE version when they can make assets that work on all versions much more cheaply.
Any...
Sony does have a mediocre lineup, I agree.
TBH, Killzone is not some great shooter franchise, as I see it. I've played 'em all, and the only one that even stood out was KZ2, due to its impressive rendering tech (at the time). KZ3 kinda stunk, and the Killzone universe is... not really that cool.
I know that hurts the ears of some fanboys. OTOH, I love Infamous... although Second Son does look like more of the same, combined with some new powers. Knack.. ...
Wow. I hope this game can recover the $400M they are probably spending on it. I guess it will, in that way, break some records at least.
No, the PSXB4One version will. You know they will be basically identical, no matter what strengths each hardware platform may have.
Neither EA nor any 3rd party will spend (read: waste) extra money to make one console version "more special" -- they don't want to endanger their business relations with both MS and Sony in the first place, let alone make less profit by spending MORE money on just ONE version, just so they can... what? Hurt the sales of the other ver...
The PS4 CPU is weaker because of its high-latency, high-bandwidth, GPU-centric memory.
The XB1 GPU is weaker, because it is just plain smaller. The bandwidth issue is not nearly as big a deal as most people would have you believe, but it IS smaller.
You'll get games built to use the PS4 CPU, and the XB1 GPU, just like most games were built to use the PS3 GPU and Xbox 360 CPU, this gen. Lowest common denominator.
The plague of the multipl...
The masses love partial truths. All great (meaning powerful) politicians know this.
Anyone reading this who doesn't understand how my comment is related: do yourself a favor and don't believe everything the Man tells you just because the limited truth he doled out was, by itself, true.
Man, I cannot *stand* journalists who spout drivel as if they knew the tech they were writing about.
"Our contacts have told us that memory reads on PS4 are 40-50 per cent quicker than Xbox One"
No, dumb journalist (or maybe dumb "contact"): memory BANDWIDTH is roughly 160% wider on PS4, meaning that when you request data, you can pull about 2.6x as much at any one moment. Latency, meanwhile, is GREATER on PS4 (some people say GDDR5 has 10x the latency of DDR3),...
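For reference, plugging in the commonly cited peak bandwidth figures bears this out (a quick sketch; the ESRAM caveat is my note, not the journalist's):

```python
# Commonly cited peak main-memory bandwidth figures for the two consoles.
PS4_GDDR5_GBPS = 176.0   # GB/s
XB1_DDR3_GBPS = 68.0     # GB/s (DDR3 only, ignoring the 32MB ESRAM)

ratio = PS4_GDDR5_GBPS / XB1_DDR3_GBPS
print(f"PS4 bandwidth ~= {ratio:.1f}x XB1's")   # ~2.6x, i.e. ~160% wider
# Bandwidth is bytes-per-second of throughput, not the time one read
# takes; a "40-50% quicker read" claim conflates throughput with latency.
```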
You, sir, claim to build your own rig, yet ignore the FACT that many modern games are CPU-limited when considering these two architectures? The CPUs of these machines are *easily* the limiting factor -- and the machine with the stronger one will get a lot of use out of that extra muscle.
Use that browser and go check out some gaming-rig framerate comparisons by professionals, on places like Tom's Hardware, chum. Learn something about the real world, as opposed to some drivel ...
So... the claim is that 50% more GPU hardware pipes, combined with (way) worse CPU memory latency and slower CPU and GPU clocks, in practice runs 50% faster?
Someone break out Skyrim on a PC, and pair it with these two machines:
3.1 GHz i5
Radeon HD 7850 (with GDDR5)
8GB of DDR3 RAM with 10-10-10 timings (or worse, though you can't actually buy RAM with latency as bad as GDDR5's... like 70-70-70)
vs
Let me just rephrase the above responses to my post, for clarity and brevity:
Physics is a bottomless treasure chest, and humans are leprechauns who can reach in and get lucky every couple of years.
Let me guess... you guys were BORN during the PC era.
I advise everyone who thinks the PC will continue to evolve over the next 10 years at the same rate it has over the past 10 to pay attention to the tech news.
Moore's Law is dead, folks. PCs will cease progressing forward, for *financial reasons*, by the time 2020 rolls around, according to Intel's former chief architect:
You guys (@above) are misinterpreting Sony's/Yoshida's comments regarding the PS app on iOS and Android. Said app is NOT Sony's version of SmartGlass. It's a tool for social networking, checking your trophies, reading game news, buying from the PS store, etc. The only version capable of interacting with streaming gameplay is the Vita one. The others can only do "share button"-style interaction -- i.e. you can watch other people play.
Thus, this guy's snarky ...