A Cell with 10 PPUs and 70 SPUs, plus 10 RSXs in SLI, would be 10x the power of the PS3, if an army of programmers could even write the engine to drive such a monster. That's really the only system that would truly qualify as "10X". Maybe 10 PS3s linked in a distributed system? Sorta? Not really...
Anyone who spouts a "N times the power" number is just spouting BS.
That said, as far as GPU only considerations, it would take ...
Tiny review sites are where all the whacked reviews come from. I have a total of 0% faith in this.
"Because we have no interest in playing games all day. Some people enjoy having time away from games."
In that case, you wouldn't buy a 3DS either...
/agree with above statements
They ignore MS and Sony shipped #s, and make the foolish assumption that retailers are so dumb as to keep a whole load of spare PS3s in some warehouse somewhere, and keep ordering them.
The truth is that the sold numbers are within less than 1 million of the shipped numbers, at all times, because retailers know what they're doing. The only time they ever stockpile consoles is around the holidays. Even then, if they do oversto...
MS could go a looong way with XBL Gold, by offering free titles from XBLA, and a few full-size titles from time to time.
Why they don't is beyond me. It's not like people will stop paying if they make the service better, and it's not like MS is going to lose a lot of money by giving away old (but great quality) games like Crackdown, Alan Wake, PD Zero, and Gears 1. At the very least, they'll be hurting used sales of those titles, while gaining subscribers, and h...
@vulcanproject
All you will ever get from your overclocked quad-core i7, vs. a mid-range hex-core AMD chip, in any PC game, is extra frames, and extra frames beyond a vsync'ed 30, at that.
"For example Battlefield 3 is DX10 only and its MINIMUM GPU spec was a Geforce 8800 (which is easily faster than PS3/360), and that is over a year old now. "
Exactly my point. A PC *requires* a GPU at least 2x that of its console counterparts, an...
I forgot to add something. This generation, we have something new, with regards to PC gaming.
Nowadays, there is a HUGE gap in available processor power for PCs, that just wasn't present in previous eras. Around 2005, the best gaming processors were Pentium 4s clocked around 3.6 GHz, and the typical PC user had a.. *drum roll* Pentium 4 clocked around 3.0 GHz. (Okay, AMD processors were pretty good at the time, but the overall difference wasn't much better... and Pe...
These AMD CPUs could be Jaguars, or they could be Steamrollers... in any case if BOTH consoles have 8 cores, you can be guaranteed that games made for them will USE all 8 cores, which CANNOT be said for PC games.
PC games, in targeting a low (not "the lowest") common denominator, will usually shoot to have no more than 2 active cores, and will often have *some* ability to utilize more cores during some portions of game loop processing. They do not, however, assume...
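To put a toy sketch behind that point (my own illustration, not any real engine's code): a PC engine can't assume a core count, so it has to size its worker pool at runtime, whereas a console engine can hard-code the number of hardware threads it knows will be there.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def frame_task(n):
    # Stand-in for one chunk of per-frame work (a physics island,
    # a culling batch, etc.) -- purely illustrative busywork.
    return sum(i * i for i in range(n))

# A PC title has to discover the core count at runtime and degrade
# gracefully; a console title can assume a fixed topology instead.
cores = os.cpu_count() or 2  # fall back to a 2-core baseline

with ThreadPoolExecutor(max_workers=cores) as pool:
    results = list(pool.map(frame_task, [10_000] * cores))

print(f"ran {len(results)} tasks across up to {cores} worker threads")
```

The console-side equivalent would just be `max_workers=8` with work hand-balanced across known cores, which is exactly the assumption PC games can't make.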
This is pure BS. Developing for the Vita is FAR easier than developing for the 3DS, and frankly not much different than developing for the 360.
The 3 cores available on the Vita (the fourth core is for the OS) are not as fast as the 3 Xenon cores, but they're still decently fast, and the memory architecture is very easy to use, just like the 360. The 3DS, on the other hand, has two cores, one of which is mostly devoted to the OS, and both of which are individually slowe...
The games will be tailored to the least common denominator, if they are close enough -- just like this gen. Thus, it doesn't really matter which is "better", as long as they are close.
Enemy, you just stuck a red sign on your head, with the word "CPU clueless" on it.
The next gen platforms will probably have barely the performance of the Cell when it comes to some tasks, like movie-quality animation and good physics. It's a real shame, from a CPU and gamer standpoint, that we won't see a quad-core, 16-32 SPU Cell this next gen.
I want the PS4 to cost me $500, but with an additional $200 subsidized by this guy.
More consoles sold means more (and better) games, bro. A game on a $500 console, made by developers with '$$$' behind their eyes because they know they can rake it in when they ship, will beat the socks off ANY game on any $700 console, because the $700 console game won't be worth the investment by the publisher (and hence, the devs).
Great games are not jus...
It matters, because it's a fact, which a certain set of folks now have the option to look at and say to themselves (and to others, with some courage) -- "wow, I was stupid. The PS3 rocked the past 6 years, after all the BS I regurgitated on teh interwebz".
Perhaps the world will be a better place, if a slap-to-the-face manages to cure a few fanboys of their mania. The fewer stupid people, the better, assuming they can crawl out of their hole and move on, without trolling ...
Umm.. projectors have been used for large-scale driving and flight-sims, in research environments, for several decades. Heck, sometimes I like to see a show at this crazy kinda place they call "IMax", and I feel fairly certain they're doing some of the same thing there.
This isn't "new", as much as it is "new to the living room", and no, article author, some random English guy did not just invent image projection in your living room in 201...
@Kingofwiiu
Where was the Wii U Skyrim announcement? I missed it. Certainly Wii U has enough memory...
\s
The PS3's RAM being split in two is a minor pain to work around, but it's not a performance issue; certainly no more than doing tiled rendering inside a 10MB framebuffer is.
If you knew a little more about the Vita, you'd realize that its main issue would be that Skyrim, per se, would be CPU-bound on ANY mobile CPU. Cutting GPU expenses would be a far easier task than cutting physics, AI, etc. without damaging the experience. Memory would be a non-issue.
If there's a 1080p game, it'll be something simple. I'm sure they will exist, just like 1080p PS3 games exist... but 1080p at 60Hz does not come from cheap Nintendo living room hardware.
For that matter, I don't imagine PS4 or 720 devs will choose over 2x the pixel work and 2x the framerate (~4.5x the work at 1080p/60Hz, relative to 720p/30Hz), as well as halving the CPU muscle (60Hz means half the time per frame to get work done), instead of some portion of extra...
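Spelling out the arithmetic in that comment (assuming GPU cost scales roughly linearly with pixel count and frame rate, which is a simplification):

```python
# Pixel counts at each resolution
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     #   921,600

pixel_ratio = pixels_1080p / pixels_720p  # 2.25x the pixels per frame
fps_ratio = 60 / 30                       # 2x the frames per second
total = pixel_ratio * fps_ratio           # combined GPU workload multiplier

print(f"{pixel_ratio:.2f}x pixels * {fps_ratio:.0f}x frames = {total:.2f}x the work")
```

So 1080p/60 is 2.25 × 2 = 4.5x the raw pixel throughput of 720p/30, before even touching the halved CPU frame budget.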
Trine 2 is not exactly the most CPU-intensive app. It's got some physics, sure, but there are only a couple of simple characters, in a relatively simple, small environment, at once.
I would expect such games, and game devs, to do quite well on the Wii U. This isn't a surprise, really.
Conversely, I expect the guys at Crytek to scream bloody murder, about how Nintendo should have put a $300 CPU in their $300 console, for the sake of running Crysis.
The PS3 has smoked every other console, when it comes to revenue invested by users -- great numbers, at a greater price, from day 1. Pretty astounding job by Sony, getting people to buy premium hardware like it was discount stuff.
Now if only they could turn that into profit! I expect next gen will be a good turnaround for them, even if the next PS isn't the mightiest hardware.
I find these limited-understanding CPU and GPU commentaries utterly fascinating.