I have two 780s, so this is no problem for me, but I'm not seeing anything in that game that would even remotely require something like a 780 JUST TO RUN the game. I mean seriously, there are only 3 cards more capable than the 780 on the market right now: the GTX 690, the HD 7990 (both of those are dual-GPU cards) and then of course the GTX Titan. That means only people with those cards can run the game? Yea...OK
Cool, as I'm still getting a PS4 even though my PC destroys any console. Poor Xbox One...devs are coming out publicly now talking about how powerful a console the PS4 is. That's just kinda demoralizing for the Xbox architects. Oh well, Sony PC/PS4 ftw.
The AI is much worse than in the first footage I saw of this game. Looks really bad.
Best time to build a PC is now, as prices are about to drop on current high-end products. If you're going to build a high-end gaming PC, I suggest you get well ahead of the entry-level game and put a future-proof build at the forefront of your priorities. Just one spec suggestion: you should go for an octa-core CPU to really make your system future-proof. Games will sooner rather than later be optimized with 8 cores in mind because of the next generation of consoles, and that's goi...
Dude I spent 4k+ on this rig, I better be able to max out every game I get for the next 2 years.
I still can't believe this game requires 8 cores for Ultra settings. I have a 4770K w/ HT but I'm not about to spend another $1000 on something like an i7 3930K. They aren't even using quad cores effectively now, and suddenly ONE game is requiring an octa-core processor. Wtf...
Not really.
It's quite easy to spot poorly utilized AA when you've been playing on Ultra settings for a few months. I see jaggies everywhere; e.g. the back end of the primary weapon (the AK-12), the tanks (which are still lacking adequate shaders and shadows), wall edges inside the building, the railings on the stairs, etc., etc.
@Abriael
YouTube compression has very little effect on great AA usage (such as strong SSAA or MSAA), definitely not to the degree that it w...
Even with next-gen Xbox, it's still missing AF and ambient occlusion, coupled with some poorly used AA (looks like MSAA?) and motion blur. Textures are piss poor (like seriously god awful) and the frame rate is choppy, but that one is probably the video, which is once again not direct feed. The PC beta right now, even with everything on low settings, is slightly better than this (I tested all the settings myself). Footage I've seen of the PS4 version is looking noticeably better. It's not...
Really bad decision by Microsoft to go with standard DDR3 system memory instead of fast, graphics-oriented GDDR5. With DDR4 and GDDR6 launching next year, the XB1's RAM is going to become extremely dated extremely quickly, while the PS4, using a unified pool of 8GB of GDDR5 (2GB more than the best single-GPU card on the market), is still going to hang tough throughout the generation.
Going to be soooo interesting.
You played the game at the expo?
Lol $2000 card with no games attached incoming!!
Prepare yourselves!!
2560x1600 @ 60fps on your 680? Playing The Witcher 2? Playing Crysis 3? Riiiiight.
Come back when you're ready to stop lying, bro cookie.
Lol, people have to defend their brand of choice because PC gamers, like most of the people in this thread, spend a massive amount of money on their products. It's simply defending your investment, much the same way that console gamers defend their console.
But yea, you're right. It would be nice to see Nvidia, FOR ONCE, put out a card that's not $700+. The route they're going, it's only going to push overall market prices higher, as now AMD is rumored t...
Are you planning on running dual or triple monitors? Because if not, a 2nd 780 is a waste.
I just bought my second 780 and now they're about to cut the price. Great...
-.-
Ooooh my...
Prepare for a $1200 card, folks.
LMAO!!
LMAO!!!
So a next-gen game compares relatively well to a current gen game?? Wow...that next gen feel, lmao.
Man, it's been a while since we had a fresh batch of 180s from Microsoft. Glad to see it's back to its old self.... LMAO