@Razzer
***"@wasabullshit
That doesn't change the TF, Einstein. It is still has max TF of 10.28. Not 9.2 TF. FACT.
The leak was wrong. FACT.
I'm not doing this crap with you again. You can believe 2+2=5 all you want"**
Ad hominem, again.
"Wasabullshit" "Einstein"
Just because Sony quotes a boosted clock speed in its pre...
@Razzer
**"How is that "right on the money"?"**
Because it's a boost clock, and as Cerny himself said, it's changeable dependent on load.
The 2.23GHz is the maximum boost under perfect TDP conditions.
It's the same chip: 36 compute units with 64x2 shader cores. The only thing that's changed is the speed, which, as Cerny told you himself, will fluctuate dependent on power draw and thermal paramet...
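If anyone wants to sanity check the boost-clock arithmetic, here's a quick sketch in Python (just the standard peak-FLOPS sum using the 36 CU / 64x2 shader figures above; nothing official):

```python
# Peak compute = CUs x shader cores per CU x 2 ops per clock (FMA) x clock speed (GHz)
def peak_tflops(compute_units, shaders_per_cu, clock_ghz):
    # shaders x 2 ops x GHz gives GFLOPS; divide by 1000 for TFLOPS
    return compute_units * shaders_per_cu * 2 * clock_ghz / 1000.0

# PS5: 36 CUs, 64x2 shader cores per CU
print(f"{peak_tflops(36, 64, 2.23):.2f} TF")  # ~10.28 TF at the 2.23GHz maximum boost
print(f"{peak_tflops(36, 64, 2.00):.2f} TF")  # ~9.22 TF at a sustained 2.0GHz
```

Same silicon, same ops per clock; only the clock term moves the TF figure.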
@Razzer
**"Yeah.....I don't think anyone got this right. What is funny is that I read 9 TF, 11 TF, 12 TF, and 13 TF. Not once did I see 10 TF mentioned and yet.....here we are"**
The Komachi Ensaka leak that Digital Foundry covered was 💯 on the money.
The leak put Series X at 12.1TF and PS5 at 9.2TF @ 2GHz.
The same leak also displayed the backwards compatibility frequencies of PS4 and PS4 Pro as detail...
@Razzer
**"A USB-C port would have been preferable."**
It makes no difference. USB-C is simply a style of physical plug and not a specification/standard.
USB-C can run USB 1, 2 or 3 transfer speeds and variants of each, or even Thunderbolt 3. The speed of the drive is dependent on the interface of the drive (NVMe / SATA) and the quality and capability of the connecting cable, not the physical type of plug on each end of the wi...
@rainslacker
**"So, three years is the time that developers start becoming irrelevant?"**
No one has mentioned irrelevance other than you. Please don't attempt to take my words out of context - stay on topic.
My post was in response to Razzer's (edited) comment. As he's amended what he wrote, I don't think it's fair to make it the focus of debate. The only thing I will say is that personally I feel that it...
@Razzer
**"And yet Naughty Dog continues to excel"**
That depends on what standards you are basing your comment on. Maybe by your standards, Razzer; certainly not by the standards of one of the studio's senior animators, who cites a lack of relevant talent, extreme crunch and difficulty recruiting new talent as reasons for excessive delays on TLoU 2, which by his own reckoning would have shipped a year earlier with a more capable team.
...
@SyntheticForm
**"Fact is these guys are the best in the business and they're in high demand. Naughty Dog has a standard - a very, very high one. People employed there or are seeking employment there know this"**
Jonathan Cooper (ex-Naughty Dog Lead Animator, left last year) -
Twitter Link
https://twitter.com/GameAni... ...
@Ziggurcat
**"How do they know there's 128 shader cores? Is that just based on known HW that's already out there"**
It's actually 64 x 2 on each of the traditional CUs. Yes, based on what is already known about the architecture.
**"I recall Sony posting some coy Tweet to the effect that you shouldn't believe what you read right around the time those rumours were circulating (I'll try to find that ...
@Razzer
**"16-512 = 16 GB GDDR6 at 512 GB/s bandwidth!"**
...Or 16 GB of memory with a 512 GB SSD.
@Ziggurcat
**"The only thing that I can't see is where they're getting 11TF and 1743MHz from"**
WCCFTECH have actually made an error in the math; it should be 11.6TF, not the 11.06 mentioned in the article.
To work it out:
Multiply the number of compute units (52) by the number of shader cores per CU (128) by the GPU clock speed (1743MHz)
= ~11,601 GFLOPS, which is 11.6TF.
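And the same sum in Python, if it's easier to follow (these are just the figures from the article plugged into the standard formula):

```python
# 52 CUs x 128 shader ops per CU per clock x 1743MHz clock
compute_units = 52
shader_cores_per_cu = 128   # i.e. 64 shaders x 2 ops per clock
clock_mhz = 1743

# CUs x ops-per-clock x MHz gives MFLOPS; divide by 1000 for GFLOPS
gflops = compute_units * shader_cores_per_cu * clock_mhz / 1000.0
print(f"{gflops:.0f} GFLOPS")       # ~11601 GFLOPS
print(f"{gflops / 1000.0:.2f} TF")  # ~11.60 TF, not 11.06
```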
If you have any interest in this product then I'd highly recommend picking one up; it's worth every penny IMHO.
I have the first run of the NT Mini, and not only is the FPGA 💯 accurate (no emulation), the machines themselves only increase in value. NTs and NT Minis regularly sell for $1,000+ on eBay; after this confirmed final run, I expect the price will continue to go up as they will become very rare in a couple of years.
The Noir looks r...
@throne
**"what's the size of the demo?"**
It's 7.59GB, buddy.
Very reasonable for a decently sized demo.
@rainslacker
**"And where did they find a set of code, made with all the same variables, with the same rendering pipeline to possibly test that?"**
The games ran better on the same system using the newer RDNA GPU architecture. The systems were running the same games and the same in-game benchmark tests. It's not a hugely scientific test; they simply worked out the average framerate using easily repeatable areas of a game or GPU benchmarks for...
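Just to illustrate what that kind of comparison boils down to, a minimal sketch (the framerate numbers here are made up purely for illustration, not DF's actual data):

```python
# Average framerate over repeated runs of the same in-game benchmark,
# then compare the two architectures. All numbers are hypothetical.
gcn_runs  = [52.1, 51.8, 52.4]   # fps, same game/area on a GCN card
rdna_runs = [63.0, 62.6, 63.3]   # fps, same game/area on an RDNA card

gcn_avg  = sum(gcn_runs) / len(gcn_runs)
rdna_avg = sum(rdna_runs) / len(rdna_runs)

print(f"GCN avg: {gcn_avg:.1f} fps, RDNA avg: {rdna_avg:.1f} fps")
print(f"Uplift: {100 * (rdna_avg / gcn_avg - 1):.1f}%")
```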
@rainslacker
**"EG is using theoretical numbers which dont translate into real world application"**
They tested the framerates of games running on GCN architecture (multiple generations) vs games running on RDNA (as other users have pointed out, not even RDNA 2.0, which is confirmed for Series X).
I'm not sure how you think this is "theoretical" and not "real world application"; it's games running on G...
@rainslacker
**"5-10% real world results are more likely"**
https://www.eurogamer.net/a...
@Krib
Reading the Xbox website, it looks like Series X also features hardware-accelerated ray-traced audio.
"You can expect more dynamic and realistic environments powered by hardware-accelerated DirectX Raytracing – a first for console gaming. This means true-to-life lighting, accurate reflections and REALISTIC ACOUSTICS in real time as you explore the game world"
@darth
@shaggy is correct in that RDNA 2.0 is far more efficient than older GCN GPU architecture.
Xbox One X GPU power is 6TF on GCN architecture; Series X is 12TF on RDNA 2.0.
Digital Foundry did tests on PC comparing GCN to RDNA 1.0 and found that in the best-case scenario performance per TF improved by as much as 60%, although on average the increase is around 20-30%. Additionally, Series X is RDNA 2.0, which is even more efficient...
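As a rough back-of-the-envelope only (this just applies DF's GCN-vs-RDNA 1.0 ranges naively to Series X's raw figure; it's an illustration of what "performance per TF" means, not a measurement):

```python
# Scale raw TF by a per-TF efficiency uplift to get a GCN-equivalent figure.
# The uplift values are only the ranges quoted above, applied as-is.
def gcn_equivalent_tf(raw_tf, uplift):
    return raw_tf * (1 + uplift)

series_x_tf = 12.0   # Series X raw figure, RDNA 2.0 (vs One X at 6TF GCN)

for uplift in (0.20, 0.30, 0.60):
    equiv = gcn_equivalent_tf(series_x_tf, uplift)
    print(f"{uplift:.0%} uplift -> ~{equiv:.1f} TF of GCN-class throughput")
```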
@Echo
**"That smart delivery feature is pretty cool"**
And very pro-consumer.
"This technology is available for all developers and publishers, and they can choose to use it"
Hopefully this will put an end to buying the same games via remake / remaster once and for all.
This.
Komachi Ensaka is well known for his credibility and his accuracy in "predicting" the specs of unreleased AMD hardware. He was also named in the article you linked (along with _rogame) and was definitely not anonymous.
Digital Foundry also vetted the source of that GitHub leak (data-mined by Ensaka) and were happy enough to publish the article. All signs pointed to this leak being accurate and originating from AMD's ...
@ReadyPlayer22
You are 💯 correct in what you are saying. Razzer is incorrect.
With a reduced frequency the GPU cannot hit its peak compute of 10.2TF.
On a variable-clock GPU it's performance per MHz that counts, and in this respect its performance per MHz is identical regardless of whether the GPU is running at 10.2TF at 2.23GHz or 9.2TF at 2.0GHz.
The GPU in PS5 is Oberon, exactly the same silicon a...