You're going to need 2x R9 295X2s = ~2.5k
An HEDT processor + mobo = ~1k
Everything else = ~1k
You're looking at around the 5k mark. But diminishing returns apply. Let's face it: how many people actually have a 4K or an 8K screen? Most PC gamers game at 1080p or 1440p, and those resolutions should run just fine on most mid-to-high-end rigs.
Hmm, I kinda get why they're doing this, though. The PC platform has been neglected for so long now. All the major players are invested in consoles. Most, if not all, exclusives arrive on consoles. Honestly, the devs don't really care about PC.
You have exceptions like the Witcher games and Crysis 3, but they are few.
The awkward moment when you realize that you have been fighting for PC rights, and when the time has actually come... you're absolutely broke. :(
What you see here is a software-implemented 'polygon culling' technique used to maintain a high enough frame rate. By the time the game is released, one of these will happen:
1. The culling will happen out of sight.
2. Graphics will be downgraded a bit to allow for more polygons.
3. The engine, which is currently not fully optimized, will be optimized further, allowing more NPCs on screen.
This is the result of Ubisoft over-hyping the game engine's ability to handle tons of NPCs. It can't. This isn't really a bug but a problem of not hiding the polygon culling effectively. It should be fixed by the time the game releases.
It isn't really something to be worried about (if you ignore Ubisoft's 100x-exaggerated claims of being able to support thousands of NPCs). A renderer computes only a limited number of polygons per frame. Any polygons above that budget are automatically cut. Usually this cut happens as far away from the player as possible, so USUALLY there is some static entity blocking your view.
By the time this glitch is "fixed", you won't see the bug, but it will still be happening, somewhere...
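The budget idea above can be sketched as a toy model. This is purely illustrative (the function name and the polygon budget are made up): real engines use frustum and occlusion culling plus LOD, not a simple sort-by-distance, but the principle of dropping the farthest geometry first is the same.

```python
# Toy sketch of distance-based polygon culling (illustrative only).

def cull_by_distance(entities, polygon_budget):
    """Keep the closest entities until the per-frame polygon budget runs out.

    `entities` is a list of (distance_from_player, polygon_count) tuples.
    Entities that don't fit in the budget are cut, farthest-first, so the
    player ideally never sees them disappear.
    """
    visible, used = [], 0
    # Sort nearest-first so the far-away entities are the ones dropped.
    for dist, polys in sorted(entities):
        if used + polys > polygon_budget:
            break  # budget exhausted; everything farther is culled
        visible.append((dist, polys))
        used += polys
    return visible

# Example: a 100k-polygon budget and three 40k-polygon NPCs.
crowd = [(5, 40_000), (20, 40_000), (60, 40_000)]
print(cull_by_distance(crowd, 100_000))  # the NPC at distance 60 is cut
```

When the culling is hidden well, that cut entity is behind a building or beyond the fog distance; when it isn't, you get the visible pop-out seen in the footage.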
Read the TL;DR given at the end of the article. But in case you don't want to, here is the essence:
The PS4 could have handled 1080p/30. The Xbox One couldn't have. (Resolution is entirely GPU dependent and not affected by the CPU at all.)
The reason Ubi locked it at 900p was that it was being politically correct.
"PS4 could handle higher resolution or higher graphical settings than the XB1 and should have better fps when both are at the same settings/res. "
Yeah, this is exactly the 'second opinion' (to Ubisoft PR) this article is trying to provide: that the resolution lock (on the PS4) was due to being politically correct in the next-gen wars rather than due to any technical reason such as being CPU bound.
That TL;DR section summarizes the entire thing pretty well (for those who don't want to read the whole thing). Imo the gaming community is becoming slightly too demanding on the devs. Nothing can make up for the hardware superiority of the PS4, but the "gaming" part of gaming is increasingly becoming irrelevant, and that is pretty sad.
The wise man buys a 980 and then chills the f*** out for at least 3 years. Maximum life-per-dollar point :P
The max is an insane dual Titan-Z (4 GPUs, technically) or 3x GTX 780 Tis.
If you're AMD, then dual Vesuvius, or 3x R9 290X.
This baby will be able to max out ANY game at 4K 60fps. <3 <3
Once the A51s start shipping, you can get the casing directly from OEMs in 4-6 months :)
@2pacalypsenow Nah, with 3 GPUs an i5 would bottleneck the crap out of the triple-GPU setup. GPUs cannot render frames on their own; they need the CPU to feed them commands. So a triple-GPU config would put a 300% burden on an i5. Simply put, your best bet is an eight-core/16-thread processor.
This is also why they are making low-level APIs: to give an alternative solution to this problem.
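The bottleneck argument is easy to see with a back-of-envelope model (all timings below are made-up illustrative numbers, not benchmarks): a frame can't finish faster than the slower of the CPU's command-submission work and the GPUs' rendering work, and only the GPU half scales with more cards.

```python
# Back-of-envelope model of a CPU bottleneck in a multi-GPU setup.
# The millisecond figures are invented for illustration; real scaling
# is messier (SLI/CrossFire overhead, alternate-frame rendering, etc.).

def frame_time_ms(cpu_submit_ms, gpu_render_ms, n_gpus):
    """Frame time is limited by the slower of the two stages:
    the CPU submitting draw commands, or the GPUs rendering
    (which roughly split the work across n_gpus)."""
    return max(cpu_submit_ms, gpu_render_ms / n_gpus)

# One GPU: 20 ms of GPU work dominates the 10 ms of CPU work -> ~50 fps.
print(1000 / frame_time_ms(10, 20, 1))  # 50.0

# Three GPUs: GPU work drops to ~6.7 ms per card, but the CPU still
# needs 10 ms per frame -> stuck at ~100 fps no matter how many GPUs
# you add. That 10 ms term is the i5 bottleneck.
print(1000 / frame_time_ms(10, 20, 3))  # 100.0
```

Low-level APIs like Mantle (and later DX12/Vulkan) attack the CPU-side term: they cut driver overhead per draw call and let submission be spread across more cores, which is the "alternative solution" mentioned above.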
A new FX CPU refresh was announced along with Tonga.
The Tonga GPU is coming.....
Here's why it's not lame --->
A few months ago AMD kicked up a big storm over Watch Dogs. AMD cards were performing very badly, and AMD blamed Nvidia, saying that since Watch Dogs is built on GameWorks, a closed-source library, it deliberately hurts AMD performance.
However, as you can see in the benchmarks, you get a difference of 1-2 frames between driver versions, yet in Watch Dogs you suddenly, out of nowhere, get a massive 12 fps boost. What that means, yo...
It's not about the Lame/Spam reports; it's about the general mob mentality of the pending section. If there is something they don't like, they will fail it even if it's true.
This is the post I was referring to: http://n4g.com/news/1570624...
I am a tech head and I can assure you this does claim the PS4 is...
@iamnsuperman But this is exactly what I am trying to convey. Why does it matter that the source document is old if it contains something no one ever knew before and is quite important?
And rest assured, I have no hard feelings towards you. It is the general mob psyche I am speaking against.
Thank you for commenting,
1) The document was from 2013 but contained something never revealed or known before; I believe the 'Old' tag is meant for outdated content or old news.
2) My news was not Mister X? I have no clue where that guy entered the discussion from. The PDF is official, and everything is given in the article :/ As for substance, you can see the article for yourself. Even bandwidth calculations are given. The piece failed because it got to 10...
Actually, hardware sales DO suggest the same. The trick is differentiating the mainstream PC OEM industry (usually without discrete GPUs) from gaming-oriented hardware.
Nvidia's share price has been enjoying an absolutely positive trend for the past 7 years or so, and that's 60% of the PC gaming industry right there. AMD is in a bit of a pickle with a fluctuating trend, but that is because R&D costs are too high and they can't afford them. And imagine for a second that AM...