
The idea is tempting, but simply offloading assets to the cloud won't work, says Filmic Worlds boss and ex-Naughty Dog developer John Hable.

John Hable on how DX12 will impact consoles and PC.
Why would a random ex-Naughty Dog dev who isn't working with DX12 on XB1 or PC, or in gaming at all currently (he does film graphics), be a reliable source to quote? Seems pointless.
Not to mention both consoles are strongly CPU bound, as their core speeds are terrible. Not saying he's wrong or right, but someone who is actually using the software would make much more sense to quote; these comments from him are essentially guesses.
I feel like GamingBolt decided to interview this guy for no reason other than that he worked for Naughty Dog, and wants to start a flame war.
"The short answer is that newer APIs will make the CPU faster, but will probably not have much effect on the GPU," Hable said to GamingBolt. "The improved APIs change how long it takes for you to tell the GPU what you want it to do, but they have no effect on how long it takes the GPU to actually execute those instructions." - ex-Naughty Dog dev
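Hable's claim can be sketched as a toy model (every number below is invented purely for illustration): when CPU submission and GPU execution overlap, the slower side sets the frame time, so shrinking only the CPU side helps exactly when the CPU was the bottleneck.

```python
# Toy frame-time model; all numbers are made up for illustration.
# CPU submission and GPU execution overlap, so the slower of the
# two sets the frame time. A leaner API cuts only the CPU side.

def frame_time_ms(cpu_submit_ms, gpu_execute_ms):
    # The frame can't finish before both sides are done; with full
    # overlap, total frame time is the larger of the two costs.
    return max(cpu_submit_ms, gpu_execute_ms)

gpu_ms = 14.0        # hypothetical GPU execution cost: untouched by the API
cpu_old_ms = 20.0    # hypothetical high-overhead (DX11-style) submission cost
cpu_new_ms = 6.0     # hypothetical low-overhead (DX12-style) submission cost

print(frame_time_ms(cpu_old_ms, gpu_ms))  # CPU-bound frame: 20.0 ms
print(frame_time_ms(cpu_new_ms, gpu_ms))  # now GPU-bound: 14.0 ms
```

If the game was already GPU-bound (GPU cost above 20 ms here), the new API would change nothing, which is exactly Hable's point.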
"They might be able to push more triangles to the GPU but they are not going to be able to shade them, which defeats the purpose." - CD Projekt Red
http://www.cinemablend.com/...
Same thing?
An APU is gonna suck no matter what you do to it.
Sorry.
What will dx12 do for a toaster?
What will DX12/Vulkan do for a GTX 960 and an i3/i5? Or an i7 with an R9 390X with HBM?
Now that is a worthy discussion.
You're getting disagrees for being right...the fanboys are strong with this article.
The reason the CPUs are struggling in the new consoles is not the overhead...it's because the CPUs are weak.
A $229 CPU from 2008, the i7 920, at its stock 2.66GHz, let alone the 4GHz easily achievable on air cooling, runs circles around the consoles.
That is why the CPUs are struggling...it's because they are weak.
8 cores doesn't mean much when each one is pretty weak.
Otherwise people might take one of the new octa-core chips in a phone and think it's powerful.
I love my PS4, but a $329 GPU added to a nearly 7-year-old PC will run circles around it.
I don't see why people are still in denial. We've had these discussions for nearly a year: PCs will get a much bigger boost than any console, because PCs have the overhead wasting the power they contain.
Why people think Naughty Dog, or even a layman, can't know this and need further proof is the height of ignorance.
So when developers who never worked with DX12 say DX12 will do nothing for the GPU, their speculation is magically right? Brad Wardell, on the other hand, has worked with DX12 and he is repeatedly discredited by people here. Love the double standard.
We'll see at E3 who was right when DX12 gets its XB1 debut.
Well, I'm pretty sure both AMD and Microsoft were well aware of the bottleneck with the old DDR3 memory before they built the custom chips for the XB1. I'm pretty sure M$ prepared the XB1 to handle the bottleneck Mr. Hable discussed. The XB1 was never built to run the high-overhead DX11 API, which is pretty obvious from the earlier games. It was built with DX12/Win10 in mind. The system launched some 18 months early. So we'll see what happens after the key is inserted into the system.
LOL, quoting CinemaBlend as a source. Pathetic; that site is garbage and is run by fanboys.
And I'm supposed to believe some random guy that "used" to work for Naughty Dog? Nah, pass! Microsoft isn't going to cheerlead DirectX 12 unless it actually helps the Xbox in a noticeable way. If it didn't, it would come back to haunt them worse than E3 2013, and Phil Spencer isn't that stupid.
You know what's pointless? Microsoft's cheerleaders talking about how DX12 will change the world and up the graphics and framerate... even though DX12 isn't released yet. Neither is Windows 10, nor any games with DX12. If anybody can put in their 2 cents, so can he. Deal with it.
I agree, those people are also stupid. Are you suggesting that fanboys in a comment section somehow validate an article based on an interview with someone who has no experience with the software he's talking about?
If so, that is just as stupid. In fact, you claiming it will do nothing is equally stupid, as you have no clue, and it has been proven that there will be improvements. (Just how much is the question.) The reality is, the software is neither released, completed, nor implemented yet. How about we wait and see instead of pumping more into it? People are going to believe what they believe until shown otherwise, so these articles are pointless unless given by someone using the software.
Funny that you think your assumption is better than everyone else's... to the point of chastising those who have different assumptions and excusing those who don't.
@Tsubasa
That would be true......if there weren't dozens of benchmarks already shown to the world...
...but there are.
And we can already see a massive improvement from DX11 to DX12.
There will be fewer cheerleaders if there are fewer haters.
*logic*
MS fanboys aren't just console gamers, which means they're the ones getting the best out of DX12.
@Nicksetzer
And yet you have no idea what experience he has with DX12 or what he knows about it. You do know that a lot of these guys all know each other, right? And that they all talk? I'm pretty sure that by now everybody who is anybody in the know knows a whole lot about DX12. It's not like a low-level API is some new mystical thing; it's been done on consoles forever now. So let's not fall back on that default fanboy argument of "how would he know if he hasn't used it yet"? How do you know he hasn't?
@Out
"And we can already see a massive improvement from DX11 to DX12"
On the PC! Repeat, PC, not X1. You are making assumptions. We don't know what it's going to do for the X1.
Well, of course they did. I'm just surprised the clickbait went the other way this time; it's normally asking every indie dev why the XB1 sucks lol. He really has no idea what tools are in DX12 for XB1.
lol, both are strongly CPU bound, lol. The weakest part of both is the CPU, an average mobile CPU...
Well... not "average mobile CPU". More like the latest smartphone.
The Galaxy S6 (which was released recently) has both a quad-core 1.5GHz and a quad-core 2.1GHz cluster. The PS4 and Xbox One octa-core CPUs run at 1.6 and 1.75GHz per core, respectively.
Sure does look comparable, though.
He's a high level developer who worked at one of the most difficult game companies to get hired at. He sure as hell knows what he's talking about, whether or not he's currently developing something on it or not.
People seem to be fine taking the word of MS employees who aren't even developers regarding DX12. I'd hold the opinion of a talented, unbiased developer a little bit higher.
He may be a high-level developer, but he still hasn't had anything to do with DX12. And why does it matter that those MS employees aren't devs? At least they've actually had hands-on time with DX12. Yet you're saying his opinion on something he has no first-hand knowledge of is more valid than that of those who have.
Many take Wardell's word for everything, despite him clearly stating that he doesn't know the X1 well enough to say for certain. I can respect Wardell's comments because he does at least have the knowledge to reason out what is likely to be the case, and I also realize that many of the things he says simply get attributed to the X1 despite most of the time he's only talking about PC.
This guy could probably get picked up at any MS studio if he wanted to, and go in without more than a couple days to get up to speed on DX12 specific syntax. People really don't realize how talented game developers have to be to get jobs at studios like Naughty Dog. It's not like he was some sort of intern who worked there for 3 months working on linking the menus to different parts of the game.
I've done both console programming and DX12 programming (for PC), and I can tell you there isn't a major difference in how operations are handled between the two. Syntax is different, implementation is different, but DX12 operates pretty much the same way consoles do.
It's funny though. This is a great breakdown of a major difference in DX12, and it's a great thing, but some people are more concerned with discrediting the statements without the knowledge or the research to refute them with something factual.
@rain C++ and Visual Basic are the primary languages for D3D of any kind; very few changes in that. Weird you claim to be some god-like programmer but don't know that...
Not to mention Crytek, Unreal, and Square have all had tech demos showing there is an impactful change. So should people believe you (the random self-proclaimed pro) or the people who actually presented something with the software?
http://m.windowscentral.com...
http://wccftech.com/king-wu...
So if you want to believe it does nothing, enjoy your misbelief. The only question is the effect it will have on XB1.
I'm with Nick! It would be dumb to take seriously the words of those (even with huge reputations) that have absolutely no experience dealing with DX12 over those that have some.
By learning new syntax I meant the new functions that exist within DX12. For console programming, they're either going to use C, C++, or the assembly API. There are also some game engine scripts they will likely use, and many, many 3rd-party tools which get licensed to make things work. Visual Basic won't be used because it works off a framework which isn't suitable for AAA games, though it can be used for simpler games.
When I say there isn't a major difference, what I mean is that overall, the differences are on levels that aren't actually programmed in individual games. To the average developer, they're just going to use the engine, and then provide special functions through the low level API if necessary. It's EXACTLY the way console programming is done now. Not much will change. Sorry for being unclear.
I never claimed to be some god-like programmer. In fact, that's my point. It doesn't even take a genius programmer to go from one to the other if you know the basics of one of them. No people shouldn't listen to me, but they should at least verify or research what I say to determine for themselves if what I say has merit. I don't often dismiss other people's comments without at least trying to verify if they may have some merit.
Did Crytek, Unreal, and SE show off anything for the X1? Because that's really what this discussion is about. This guy is discredited despite his work at ND, yet you point to all those developers who haven't made any DX12 games for X1? Seems legit.
On PC, I have said many, many times the differences in how it operates are substantial. I can attest to this based on my own work, and I am very impressed at what it can do. I'm a little miffed that PCs have been gimped for so long because this kind of stuff wasn't available years ago without 3rd-party tools.
And that's what I'm saying. DX12 brings to PC exactly what has been on consoles for decades. It's a touch higher level, but it has an extremely efficient low-level API, just like consoles.
Let me know if you want to misread and misrepresent my comment to discredit me some more. I'll be happy to respond.
If you wish to continue on with your eyes closed and fingers in your ears going lalalala, ignoring anyone with a comment contrary to your own, and can't be bothered to provide any kind of response that actually addresses what I say with factual information, then please just put me on ignore. Let my comments be used by those who want to take the effort to learn more, rather than doing what you do: throwing out a few computing terms hoping to seem knowledgeable enough to discuss the topic properly.
I don't care what your perceptions of what it will do are, but I do care when you make it out to be something that it's not. You set an expectation for others which MS can not possibly match. At least do me the common courtesy to respond with actual facts pertinent to my comment, and not with more exaggeration and PR nonsense which only validates what you already want to believe.
What??? MS built DX12, but some guy who worked for Sony is more credible? That's just stupid.
I'll tell you what. I've got some gold to sell you... yeah, I know the Periodic Table says it's lead but they don't know what they're talking about. Trust me.
Come on now. It depends, but for the most part those MS employees actually work at Microsoft and are communicating with, and/or have had experience with, DX12 or those working on DX12, unlike the ex-Naughty Dog dev. I mean, geez, the difference is that simple.
Man, I wonder how you all would react if some ex-Halo dev started talking about Nintendo's new API, or a Mario dev started talking about Vulkan. The tune would change here.
And there is a reason no one else is talking about this besides GamingBolt lol.
Why indeed? Probably because GamingBolt asked him. He's not wrong in what he says, though. It's not like you need to know each specific API inside and out to be able to work with them or understand what they'll be doing. Good game developers don't live in a bubble; they all have access to forums, documentation, or personal contacts where they talk about this stuff, and if they're good they keep up on everything that may be available to them.
I knew nothing about DX12 when I started working with it, there wasn't even documentation available to me until after I started work, but when I had it, I was able to see what it does different from both OpenGL and DX11, and adapt my algorithms to implement what I needed to in DX12. Since that time, MS has provided a lot of that information more publicly to developers if you know where to look. It doesn't make their PR rounds as much though.
Also, consoles are not CPU bound. They haven't been for a while now. The PS3 was designed to be GPU bound, with offloading onto the CPU/SPEs when needed, and the 360 had a similar setup but did have some overhead with its memory controller in the cache, nothing major though. The PS4 is most definitely supposed to be GPU bound, as stated by Mark Cerny himself, and I'd imagine the X1 is supposed to be as well, as that's the best way to get the most performance out of these machines.
Otherwise, I didn't notice this was a GamingBolt article before clicking on it, but it is a pretty good breakdown and simple explanation of a basic difference between current APIs and the new ones coming in DX12 and Vulkan. His bunny example pretty much nails it, but he doesn't really explain what overdraw is, which would help readers understand why it's not optimal.
This quote is pretty much what a lot of people have been saying for a while.
"The improved APIs change how long it takes for you to tell the GPU what you want it to do, but they have no effect on how long it takes the GPU to actually execute those instructions."
In other words, you can't make hardware do more than it's designed to do.
Even Wardell has said something similar.
With all due respect, I'm pretty sure that guys knows a little more about programming than you do.
Let me explain what he was trying to say, and I'm a computer scientist, so please consider that.
Imagine you're trying to send a set of orders to your employee. You have two options: you can do it via a middleman, asking someone to pass the orders to the employee, or you can send them directly to your employee. Naturally the second option is way faster.
Programming is a set of orders, the employee is your hardware, and the middlemen are any middleware, such as DirectX 12. So what DirectX 12 is trying to do is speed up the middlemen, but it does nothing to speed up the employee. Does it make the game render better? Of course! But does it make the hardware better? No!
So to sum up, DirectX 12 WILL make Xbox One games better, but the PS4 is already better hardware, so it's impossible for the Xbone to top the PS4.
And considering game dev (especially for triple-A games) is much closer to the hardware (which minimizes the role of middleware), the effect would be very minimal in console dev.
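The middleman analogy can be put in rough sketch form (the per-order costs below are made up): a thinner middleman charges less per order passed along, but the employee still takes the same time to carry each one out.

```python
# Sketch of per-command ("per-order") overhead; all costs are invented.
# Each draw command pays some CPU-side overhead in the API/driver
# (the middleman). A lower-level API shrinks that per-draw overhead,
# but the GPU time needed to execute the draws is unchanged.

def cpu_cost_ms(num_draws, overhead_us_per_draw):
    # Total CPU time spent just handing commands to the middleman,
    # in milliseconds (overhead given in microseconds per draw).
    return num_draws * overhead_us_per_draw / 1000.0

DRAWS = 5000
print(cpu_cost_ms(DRAWS, 4))  # thick middleman: 20.0 ms of CPU time
print(cpu_cost_ms(DRAWS, 1))  # thin middleman: 5.0 ms of CPU time
# The GPU's cost to shade those 5000 draws is the same in both cases.
```

That freed-up CPU time can be spent issuing more draws or on game logic, which is where the real-world gains come from.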
Hmm, I find that hard to believe, especially considering "computer scientist" is the most outlandish job description I have ever heard. It would be like calling the people at NASA "spaceship guys."
That all said, you contradict yourself regardless. DirectX 12 is a kernel and it does transmit data between the hardware and an application, but bettering that process can actually benefit the quality of a game. Will it make the physical hardware more capable? No. It does, however, have the same result, as it makes the hardware work more efficiently.
A better example would be a helicopter's propellers. If you keep the blades perfectly flat they generate little lift, but if you pitch them slightly they allow for much more. Same wings, same engine, same weight.
Proof of this would be to run a similar piece of hardware on old software and on new (let's say DX9 vs DX12): there would be an absolutely massive difference, despite the hardware being the same.
Point is, we know DX12 is a massive improvement; we just don't know how much it expands on the current toolset for XB1. From what we have heard, the toolset for XB1 is a bit under par. So this could be a two-fold upgrade: a stronger API with a more reliable set of tools. Not to mention, if XB1 uses DX12 properly it could allow devs to make engines that accommodate it more readily. It would be like last gen: the PS3 was better hardware, but the 360 was easier to create for. I think that's the goal MS is shooting for. That said, this gen the PS4 is streamlined enough AND powerful enough that I doubt any gains will overpower it.
It would appear nick, that you want to discredit anyone who thinks differently than you.
While I can't claim ng's credentials, Computer Scientist is indeed an actual thing. It is more on the theoretical side of computing, as opposed to an engineer who works on the hardware or software. A CS has to have enough knowledge of computer engineering, but the reverse isn't always true.
Here's a list of jobs looking for computer scientist positions.
http://www.indeed.com/q-Com...
Everything he said is true, though, whereas your comment is just you throwing out some computing terms again, trying to look smart. So let's look at what you say.
"DirectX 12 is a kernel and it does transmit data between the hardware and an application"
No, DirectX is an API, an application programming interface. It is simply a set of protocols and routines which facilitate the building of software applications. There are thousands of APIs in use every day, and their main purpose is to dictate how software interacts with the OS (emphasis on OS).
A kernel is a program which manages I/O requests from software and translates those requests into instructions for the hardware.
A kernel sits between the hardware and the OS, whereas an API sits between the user (software) and the kernel. The idea of low-level access is to bypass the kernel to a degree, but on a PC that's not ideal, as it can cause serious security issues; that's why you need a low-level API, because opening up a system fully to low-level access can cause all sorts of unwanted side effects. I don't even think consoles allow complete bypassing of the OS kernel, unless it's through the hypervisor. Outside of closed systems, I can't think of any API that directly interacts with the hardware. Almost everything is done through a level of abstraction, and DX is no different. The whole idea behind DX was to remove the hardware from the equation to make games more compatible and easier to program on PC.
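The layering described above can be sketched like this (the class and method names are mine, purely illustrative, not any real API): the application calls the API, the API issues I/O requests to the kernel, and only the kernel layer touches the "hardware."

```python
# Toy layering sketch; every name here is invented for illustration.
# Application -> API -> kernel -> hardware. A lower-level API shortens
# the work done in this chain, but never removes the layers entirely.

class Hardware:
    def execute(self, op):
        # Stand-in for the hardware actually doing the work.
        return f"hw:{op}"

class Kernel:
    # Manages I/O requests and translates them into hardware operations.
    def __init__(self, hw):
        self.hw = hw
    def io_request(self, op):
        return self.hw.execute(op)

class GraphicsAPI:
    # The routines an application calls; it never touches hardware
    # directly, only the kernel.
    def __init__(self, kernel):
        self.kernel = kernel
    def draw(self, mesh):
        return self.kernel.io_request(f"draw({mesh})")

api = GraphicsAPI(Kernel(Hardware()))
print(api.draw("bunny"))  # hw:draw(bunny)
```

The point of the sketch is only the direction of the calls: the application never skips straight to `Hardware`, which matches the comment's distinction between an API and a kernel.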
" but bettering that process can actually benefit the quality of a game"
Yes, and DX12 does that. That's what the article says, that's what I've said, that's what ng said. It can benefit the quality of the game, and it will, but the effect is that the hardware can get more data faster, or more efficiently, or spend less time waiting for data to perform its task.
That isn't the same effect, because better hardware can do more, whereas the current hardware can only do what it's designed to do. The efficiency is that it's not waiting on data, or that the data can be output without waiting, or that the data can be supplied in a more acceptable manner for processing, but the data itself is still processed by the same rendering algorithm. I know it's a nuance, and you aren't exactly wrong, but you are making assumptions about the gains.
"From what we have heard the toolset for XB1 is a bit under par"
What we've also learned is that an implementation of DX12 is already installed in the X1 API. It's not the full DX12, because it's still not finalized, but the core is implemented from what I understand, and the current low-level API probably isn't going to change either way, as it was built for the X1 and never existed in DX11. Same as last gen, the X1 has a custom version of DX.
@nick Wow... are you being serious? A computer scientist is technically anyone with a job title saying that (research-based, I'd assume) and/or anyone holding, or studying for, a degree which here in the UK, anyway, is known as "Computer Science." I know several people currently studying it; I myself study physics, and therefore know many people at uni studying other sciences because... well, you're just more likely to meet them. You seem to like discrediting others' comments without any real evidence, so I'm going to assume you have no background in any form of science, because if you did you'd know: evidence is key, and without some, your credibility is zero. Both the guys above seem to have evidence for their comments and potentially experience in the field. Same goes for the guy being interviewed.
Computer scientist is not a position or a degree title; it is a description of one. Anyone who has a degree or a position would know that. Generally all it means is a major in mathematics or a computer-related field. Really simple to understand. Just like saying you work on Broadway: there are lots of jobs on Broadway, not just one.
http://en.m.wikipedia.org/w...
The Xbox One, "beta tested in the future," was specifically made with DX12 in mind. The Xbox One is a Windows 10 device. DX12 will not only make use of all the cores (currently only one core is being used with DX11) but will also make much better use of its super-fast ESRAM, tiled resources, etc. That's the main reason some 3rd-party games have had difficulty reaching full 1080p. Moving to DX12 is huge in this regard, and it's something many conveniently leave out when discussing the true benefits of games developed using DX12 on the console.
Another huge benefit not talked about much, though expect MS to really do a deep dive into it at E3: the sheer ease for developers using DX12 on PC, where literally with the push of a button your game is scaled and optimized for ALL Windows 10 devices. Devs will save money, time, etc. And when consumers/gamers buy a game on ONE Windows 10 device, they own it for ALL their Windows 10 devices. That to me is the game changer.
Lastly, if a game is built using DX12 and Windows 10, cross-play will be simple and easy. For years I've dreamed of a day when PC gamers could play with console gamers. Well, after July that dream will be a reality.
DX12 and Windows 10 are definitely a game changer for Xbox One. The Xbox One will finally have the software for developers that the console was built for from the beginning. It's amazing to me how, with the current DX11, the Xbox One has been able to keep up with the PS4 and in some cases do better. Fanboys say the Xbox One can't do 1080p. Strange, because I have over 10 Xbox One games that are native 1080p.
DX12 will open up all kinds of doors for the Xbox One. Soon enough, gamers will see.
He probably knows a thing or two about PC programming, numbnuts. What do you think the PS4 and Xbox One basically are?
Oh gamingblart. Playing both sides off against each other again. Classy stuff.
Strange that anyone with even a modicum of API knowledge has been saying what's posted here (and more) for months. The whole while you've been posting BS about massive gains for the Xbone, PC, and even mobile.
Serious question: do you ever feel bad for the hit-whore tactics you employ? All they do is cause division and angst.
What was that old chestnut about the ends justifying the...something?
Edit: Lol at the damage control attempt by nic (first comment).
This isn't a Naughty Dog dev. It's the guy who runs Filmic Worlds (a graphics solutions company aimed mostly at PC). He worked for Naughty Dog at some point. That (the ND reference) was used to generate hits and cause an argument. Seems you fell for it... 'coz... you know, creating fighting fanboys is how this site does business.
Edit 2: nice edit nic. Keep up the good work. *smfh*
Just think, a pack of script kiddies are going to tear this news outlet apart along with GamingBolt.
Honestly, GamingBolt is a farce of a site. People put way too much stock into the differences between DirectX and OpenGL; when a new version comes out, of course it's better, but in the end it all pretty much evens out.
I think the ex-Naughty Dog dev is mixing old technological advances with today's new advances in gaming hardware and software.
Thing is, MS took a completely different approach to designing the X1's hardware. They went above the standard engineering process to make some of the X1's parts custom built. By doing this, they gain a bit more hardware control over the X1's CPU/GPU and how it sends and processes data back and forth.
The X1 was designed with DX12 in mind. That makes a world of difference. The way I look at it, we don't know enough about the X1 hardware to talk about its capabilities.
That's what has me interested. We don't know enough about the hardware, and I feel like whatever details we don't have, we will around the time Windows 10 and DirectX 12 come out. I'm not saying there will be some magical boost, but like you said, the X1 was built with DirectX 12 in mind. They planned it out from the beginning, and everything they've talked about seems to go together like pieces of a puzzle.
It's all very interesting imo.
It's hardly custom built just because it has eSRAM.
They didn't take a completely different approach whatsoever. It's literally just like the 360, just stronger, obviously, with added functionality.
I don't see anywhere that they went above any standard engineering process. It's a console. Of course it has more hardware control access, but it's no different from any other console. It sends data back and forth no differently, aside from the eSRAM, which requires work to get anything out of.
Was it designed with DX12 in mind? No. Wait: DX12 was designed with consoles in mind, not the other way around. DX12 gives PCs the lower-level access that consoles already have. That is all.
Nothing is revolutionary about its design.
Nothing is revolutionary about its design. Oh really?
The X1 GPU is custom built. It's the only GPU in the world that has dual lanes; this tech feature is the first of its kind. There's no other GPU like it. Besides, MS wouldn't spend 2 billion bucks on the X1's custom-built GPU for no reason. Keep in mind, you can't test something that isn't readily available on the market. That said, I can clearly see why MS had to wait until the X1's custom-built GPU was born before writing DX12. I think, had MS used a regular GPU for the X1's design, they could have started writing DX12 well before the X1's hardware was completed.
http://www.reddit.com/r/xbo...
Another thing. Logically speaking, what I find obvious about the X1's custom-built GPU is that, since it has dual lanes, some of the other X1 hardware components also had to change to sustain data communication between the two lanes the X1 GPU has.
I'm guessing that's why MS had to implement the move engines: to take full advantage of those two data lanes. I could be wrong on that note.
Anyway, my point is that there's some hidden mystery about the X1 hardware. So making claims about what it can do is plain crazy thinking, especially if you don't know the X1's full specification.
Jhoward...what you've just written and linked to is complete crap.
The "dual lane" you're waffling on about is for GPGPU compute and refers to ACEs. They're used for parallel computing. Each one runs 8 queues (lanes). The X1 has 2 ACEs, for a total of 16 queues (lanes). Pretty impressive, huh?
For reference, the PS4 (and the R9 280 and up, as well) runs 8 ACEs, for a total of 64 queues (lanes).
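For what it's worth, the queue math in the comment above works out like this (counts taken from the comment itself, not independently verified here):

```python
# Queue counts as stated above: each ACE runs 8 queues ("lanes").
QUEUES_PER_ACE = 8

xbox_one_queues = 2 * QUEUES_PER_ACE   # 2 ACEs claimed for the X1
ps4_queues = 8 * QUEUES_PER_ACE        # 8 ACEs claimed for the PS4 (and R9 280+)

print(xbox_one_queues)  # 16
print(ps4_queues)       # 64
```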
One of the reasons MS would have chosen the older/fewer-ACEs configuration is to accommodate the eSRAM. Said RAM takes up more than half the transistor count on the die. There is no secret sauce in the hardware.
Now that Kinect has been 86'd, they may attempt to repurpose the move engines. Given their ultra-limited bandwidth, with no direct access to the eSRAM, their usefulness will be limited.
All (yes, all) the Xbone's specs can be found online. Beyond3D and other sites have all that for you if you like.
Now did you really want to talk about hardware or were you just grasping at straws? Tell ya what. I'll wait for your reply and we can continue this if you like.
Here's a tip though. Don't link to Mr X style reddits if you want to be taken seriously. ;)
Edit: LOL, all the info in the link you provided proves my point if you just look at what's given. OMG! You couldn't make this stuff up (but someone did).
@jhoward585
For one, you are talking about complete speculation from a rough translation that still hasn't been clarified.
Logically speaking, you're still talking about "new" and "one of a kind" tech that hasn't actually been confirmed yet. So I don't see why you are trying to explain how it works.
Again, still unconfirmed. Don't harp on about secret X1 sauce like everyone else does. You always have to grasp onto something...
And, just as LostDjinn said..
So. Yeah.
Love your console. Don't lie about it or find excuses though. The excuses are what games and features you like. Not what you hope it to be.
@sinspirit
YOU: For one, you are talking about complete speculation from a rough translation that still hasn't been clarified.
Speculation? LOL.
ME: Fact is, the X1 does have a dual-lane custom-built GPU. No one knows how it works but MS.
-----
YOU: Logically speaking, you're still talking about "new" and "one of a kind" tech that hasn't actually been confirmed yet. So I don't see why you are trying to explain how it works.
ME: I brought it up because, in my mind, I know any piece of new technology will eventually improve as it passes through the trial-and-error phase. The X1 GPU is the first of its kind, so there are going to be lots of test runs on the software side of things to refine its performance.
YOU: Again, still unconfirmed. Don't harp on about secret X1 sauce like everyone else does. You always have to grasp onto something...
ME: I was thinking the same. Until I get more info on the X1's hardware specification, I won't take another false rumor as fact.
YOU: Love your console. Don't lie about it or find excuses though. The excuses are what games and features you like, not what you hope it to be.
ME: For the record, I own a PS4, not an X1. Maybe in time I will.
Another thing: I do my best to make sense of everything I read or hear, especially when I gather information on the internet.
A lie is always a lie. But one thing I do know is that most major companies won't take the blame for another company's false claims (misinformation).
With that being said, MS has made some claims in the past that involve AMD, as far as the engineering and hardware design of the X1 goes.
And yes, I'm talking about the secret-sauce stuff that's spreading all over the internet.
Fact is, AMD has their reputation to protect, and MS has theirs. One thing that's for sure: AMD would have ended some of those false claims to protect their image.
So that, my friend, is all I need to draw my conclusion on what is fact and what is not.
@LostDjinn
Ok, the link I provided in my previous post may not be the greatest information as far as what the X1 hardware can actually do.
Honestly though, I don't think anyone can say what the X1's full specifications really are, because there are just too many contradictions regarding the X1 hardware specification on the internet. One site says one thing while another site says another.
Truth is, I was more interested in the past business decisions MS made to fund & create the X1's hardware than in the physical hardware (and specifications) itself.
Fact is, MS spent well over 2 billion dollars on the X1 GPU technology. That alone says a lot to me. To me, it means MS took a chance which could've either worked to their advantage or not.
Truth is, we really don't know if it was a bad or good business call. We don't yet have all the details; until then I will remain completely optimistic.
I think what u meant before about dual lanes is that the X1 GPU is split into two sets of CUs with 2 individual graphics command processors. That's what Brad Wardell meant when he referred to the XB1 having dual lanes and how PS4 and PC don't.
@LostDjinn
If my memory serves me correctly, I think Brad Wardell was the one who stated that the X1's custom-built GPU cost close to 2 billion dollars.
The way he explained it, the deal between MS and AMD to engineer & build the X1 hardware cost MS over 3 billion: two billion dollars went into the development of the custom GPU for the X1, while the remaining 1 billion went into the rest of the X1's hardware components & design.
http://www.gamespot.com/art...
http://www.vg247.com/2013/0...
Genuine- If it's command processors, I think you'll find they pertain to OS task distribution, with the discrete (system) OS and gaming OS requiring high- and low-priority access. It's the only way they'd be efficient. The hypervisor would simply access the discrete-level OS priority solution.
What's the point?
Well, think of something like the snap function. The game would be given high priority while, say, the browser would be given low priority (as missing your render budget on a web page would be preferable to the game doing it).
Edit: Jhow neither of your links say anything of the sort. A deal between MS and AMD is all that's mentioned. Nothing about gpus. Please make sure you provide proof of your claims in future. Otherwise you'll paint yourself in a bad light
You just completely edited your comment to cover the fact you can't provide a link. Jhoward, I now have a very different view of you. It's not about the truth. You were simply clutching this whole time.
Thanks for that.
From that thread, as much as I cared to read since it kind of devolved after a bit, all I can gather is that no one seems to know what the dual data lanes are for.
For those that don't know, a data lane is just a controller for data, and in this case, the X1 has two supplying its CUs in the GPU. I'm not sure the move engines are really important for two data paths, as it may just add more overhead since the data controller has direct access to memory and the CPU. In the case of the X1, the different controllers appear to control a split set of CUs.
Anyhow, I'm not going to speculate for now, because I'd have to do some more research. I understand what's being talked about, but not enough in relation to the X1 to be able to fathom an assumption. I do think the article which reported the leak was a bit presumptuous.
To me, the best thing to take from that thread was a comment from iroboto,
"It's very important to not get stuck into conspiracy style thinking. It's like if MS denies the functionality of the second graphics command processor you take that as the opposite. If they agree with you, you take as truth, and if they say nothing about it that means you also take that as admission that you are correct. In all scenarios you are only agreeing with what aligns to wishful thinking and that severely hampers your abilities to make good sound decisions."
Not saying you're wrong or anything, just that it may be wise to temper your expectations and wait until more information is out before postulating a conclusion on what MS did and didn't do. I'll also readily admit that I should probably follow my own advice sometimes. There are a couple people on that thread which seem really knowledgeable about hardware, and while they're offering possibilities, none of them are saying anything definite.
Otherwise, I wouldn't really call it a revolutionary design. It's a different design, done to achieve some task which is currently unknown. One person speculated that the extra channel is for the media and overlay features of the X1, so the extra data path may simply be a workaround so as not to take away from the actual abilities of the GPU.
I can think of several reasons why a second data controller would be beneficial in gaming, but it's not a feature of DX12 that I've seen. It may be specific to X1 though, in which case I wouldn't have bothered looking too much into it yet.
However, even with two data controllers, unless there is some specific gaming purpose for them, I can't really think of any reason why you would need one on the GPU's given the rather low number of compute units. The data controller that came standard with the GPU should be able to handle it perfectly fine, since graphics are serial in nature. But when it comes to GPU compute, it can actually help tremendously.
Edit @LostDjinn
Appears you went into more detail about it possibly being a multi-tasking thing needed. While that's perfectly reasonable, I still wonder if it's actually necessary to have 2 controllers. Overlay has such low overhead, it seems rather unnecessary to split the CU's and memory controllers. I can't imagine that system features would run off GPU compute.
Rain, that's not where I was going with it. Overlay indeed takes FA overhead. The efficiency increase I was referring to pertains to running 2 controllers with a conventional overlay as opposed to running them with hardware-based prioritization. It has nothing to do with compute-based packet distribution to the GPU; simply the simultaneous rendering of assets from two separate OSes on a hardware level.
Nice to chat with someone who actually just cares about the facts though. If I run outta bubbles just pm me. :)
Ah, yeah, I think I missed a bit of your comment there. Indeed it does make sense to do that, as it would require hardware to be dedicated to the actual secondary rendering to prevent slowdowns with the game render. Since console developers have the ability to control memory controllers, it would be reasonable to isolate a secondary controller that is managed by the OS to perform its functions. This leaves everything still available to the developers and prevents unintended conflicts.
I think one example of why this might be beneficial is looking at the PS3. While a game can render in the back ground when you hit the home button, it can also have stuttery frame rate should it keep running. It's mostly obscured so it can be hard to notice, but it is there in some games. I'd imagine on a multi-tasking view, the stuttering would be extremely noticeable.
It's an interesting approach to handle what could be done with a rather inexpensive secondary graphics chip running synchronously with the main GPU.
The thing these developers are missing is that the GPU bottleneck he's talking about won't be in the XB1. Because the XB1 was designed specifically for the DX12/Win10 API/OS, there are things in the XB1 that will eliminate most of the bottleneck that would normally happen to a next-gen console. I expect a lot of PC games will get ported to the XB1 with few problems. I'm interested to see how The Witcher 3 gets updated for Win10/DX12. Most of these developers who don't work for M$ haven't a clue how the XB1 will handle DX12, because they don't know everything about the hardware.
What these APIs do is lessen the strain on the CPU. So if you have a game that is bottlenecked by it (which basically means the GPU cannot unfold its full potential), you get some gains with a better API. But the GPU stays limited to its specs. There is nothing you can do about it.
Example: BF4 with Mantle
Slow CPU + Fast GPU ... gave you a crappy experience in DX11, since the GPU was bottlenecked by the limited CPU. Mantle helped there and freed up some capacity on the CPU, so the GPU could better unfold.
Fast CPU + slow GPU ... no gains with Mantle.
Fast CPU + fast GPU ... if there is no CPU bottleneck, you get no gains with Mantle.
Slow CPU + slow GPU ... minimal gains or no gains, if the GPU was already at its limit with DX11.
DX12 will basically work the same way. On PC and X1 the GPU will limit its effect. It really only shines on old CPUs combined with mid- to top-range graphics cards.
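The four cases above can be sketched as a toy model. The millisecond numbers and function names below are made up purely for illustration; the point is that a lower-overhead API only shrinks the CPU's submission cost, so frame rate improves only when the CPU is the slower stage:

```python
def frame_ms(cpu_submit_ms, gpu_render_ms):
    """Frame time is gated by the slower of CPU submission and GPU rendering."""
    return max(cpu_submit_ms, gpu_render_ms)

def fps(ms):
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / ms

OVERHEAD_CUT = 0.5  # assume the new API halves CPU submission cost

# (cpu_ms, gpu_ms) pairs for the four cases described above
cases = {
    "slow CPU + fast GPU": (25.0, 10.0),
    "fast CPU + slow GPU": (5.0, 25.0),
    "fast CPU + fast GPU": (5.0, 10.0),
    "slow CPU + slow GPU": (25.0, 25.0),
}

for name, (cpu, gpu) in cases.items():
    before = fps(frame_ms(cpu, gpu))
    after = fps(frame_ms(cpu * OVERHEAD_CUT, gpu))
    print(f"{name}: {before:.0f} fps -> {after:.0f} fps")
```

With these made-up numbers, only the "slow CPU + fast GPU" case improves (40 fps to 80 fps); in every other case the GPU (or the still-too-slow CPU) remains the limit, which matches the BF4/Mantle behavior described above.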
Can current and ex Naughty Dog employees stop commenting on DX12? It's none of their concern. I don't see MS constantly talking about PS4's API.
I know, right? PS4 guys talk more about Xbox One than the system they make games for.
Graphics programmers aren't bound to platforms like you somehow think they're supposed to be.
Being a graphic programmer doesn't make you an expert of every graphics API. A dev that only makes iOS games doesn't necessarily know everything about developing for Android OS.
Naughty Dog devs have never worked with DX12, any Xbox game, or any PC game, so they would know little to nothing about it. Asking current and ex Naughty Dog employees is pointless.
@Pandamobile
While that is true, at the same time it makes no sense to ask a developer from a totally different studio, with totally different programming methods. They still don't know much about it. Even AMD, the company that worked with them on it, doesn't know much about its capabilities. And you suspect someone who hasn't touched DX12 at all knows something? You can't really compare previous versions to DX12 either, as it unlocks low-level access on PCs that has been dormant for years. So it's a completely different case this time.
Every single game developer on the planet has made games for PC. Do you really think that graphics programmers jump right out of university or previous jobs onto a PS4 dev kit without learning DirectX and OpenGL?
Seriously?
APIs are transparent to graphics programmers. Just because they've never used DX12 doesn't mean their opinion on it is completely invalid. They know all the shortcomings and pitfalls of graphics architectures, regardless of whether they're employed by Sony or not.
@Lennoxb63
"Being a graphic programmer doesn't make you an expert of every graphics API. A dev that only makes iOS games doesn't necessarily know everything about developing for Android OS."
True that but in the case of Naughty Dog believe it or not, they have Xbox One and X360 development kits at their office. If you don't believe me, look online.
@Lennox
I'll agree with you if you can agree that Phil Spencer isn't qualified to talk technical specs on DX12. He's not even a game programmer; he's more of a hardware engineer. Yet if he said something about DX12, you would take it as gospel.
Otherwise, graphics programmers who understand how an API works, regardless of which one they primarily use, are far beyond the level of the less astute programmers who simply use pre-made functions to draw a screen. If you think programmers at Naughty Dog don't attend classes and go to things like Build, then you are sadly misinformed. Naughty Dog is part of the ICE Team, which makes the PlayStation APIs and SDKs. I don't know if this guy was part of the ICE Team, but do you truly believe that these guys are clueless about different APIs which are doing exactly what console APIs have been doing for almost 30 years now? Any console graphics programmer probably knows this stuff better than any PC graphics programmer, because it's what they already do.
By your own reasoning, there aren't many DX12 developers out there at all, so no one we've heard from is actually qualified to talk about it in regards to consoles, except for maybe a few privileged devs who got early access to it for the DX12 X1 games. There aren't many of those out there right now, and most of it is isolated to engine makers for the time being. So who exactly should we listen to? MS? That'd be fine, but they aren't exactly unbiased. So that leaves no one to listen to when relating DX12 to the X1.
I'm sure you didn't even bother to read the article.
He's a programmer who worked at an acclaimed studio, he knows his stuff. Gamingbolt asked him questions, he answered.
This article was a step up from their usual 'posting a series of tweets' style of journalism.
All the little sub-processors you're referring to are mostly the equivalent of "blast processing" from the Genesis, which was a DMA controller. So yeah, maybe a small advantage in some instances, but it's hobbled by weaker GPU cores (and fewer of them), and even the 6% overclock can't replace 6 compute units. The slower RAM for the GPU doesn't help either; its lower latency for the CPU makes the XB1 faster at opening standard apps and such, but it costs in gaming. Compare DDR3 GPU performance to GDDR5.
I don't know who your comment is geared towards but I have issues with yours on its own.
Sony uses a 14+4 configuration for their CUs. While devs can use the extra 4 CUs for their games, Sony doesn't recommend it and says it will offer little benefit, since the system was balanced around 14 CUs (not 18) anyway. You don't even know how the PS4 works, let alone how many of the unique elements of the XB1 will function yet, because their purpose remains unknown. Until MS clarifies, any argument is purely speculative anyway. Data sheets can be misleading.
It may be balanced for 14 CUs, but the die contains 18, backed by 8 Jaguar CPU cores clocked at 1.6GHz. The GPU is close in numbers to a 77xx series, and MS reduced the number of CUs from 18 to 12 to make room for other stuff.
I love how when people bring ddr3 vs gddr5 as an argument they fail to mention that xbox one also has eSRAM
Of course having ONLY DDR3 compared to GDDR5 isn't as good for gaming, but throw eSRAM into the mix and it's a different story. It's not easy to use, which is why there have been varying results in resolution.
But MS aims to end the struggle once and for all. They are giving eSRAM its own API with DX12, and this, along with the updated PIX tool, should see an end to eSRAM under-utilization.
Great to see a dev who worked on DX12 give his thoughts... oh wait. SMH, this dev has not worked with DX12, so he is just speculating; unless he has worked with the new API he can't make any credible claims.
http://stream1.gifsoup.com/...
Did you work on DX12 on Xbox One and PC? No.
So shut up.
Notice that is less than the PC benefits. I'm still waiting for someone to state the actual gains.
No doubt the One will get a 50% boost to performance just by adding DX12. Then we have the cloud coming, probably next year, adding another 50% boost. That would put the One at about 1 petaflop of computing power. 4K gaming is going to be sweet on the XboxOne.
But we all know it's true, wink wink ;) right, right. One developer even said it's easier to develop for the XboxOne. Reading between the lines, that would mean squeezing out all the performance from 1 petaflop would in reality give you about 5 petaflops of computing power. It's just like squeezing the juice from a lemon. Just imagine when they have squeezed all of it out of the XboxOne. That will probably blow my mind.
@piff, Such incredible gains! Xbox One will become a force to be reckon with in the future. Stay tuned for the Spencer effect!
What a troll, you clearly don't believe that yourself, you are just stealth trolling to identify yourself as some MS drone.
He is being quite obviously over the top...he isnt trying to fool anyone about it except you it seems.
Your passion for all things console wars is inspiring.
You're basically Mel Gibson in Braveheart, console wars version.
Game of Thrones is actually a show about nicksetzer1 winning the Console Wars Throne.
At the end of The Lion King, nicksetzer1 killed his uncle and took back control of the Console Kingdom.
They deny it because they hear about it, read about it, and know what it can and does do, but it won't benefit them in any way. Every dev praises it, so their only option is to troll, downplay, and deny.
Didn't this guy get interviewed awhile ago on the same topic? Nothing really new learned from this to be honest.
Even based on his comments there will be gains without a doubt. PC has been seeing gains of 300+%; if Xbox One sees just a 50% gain, it's still great news.
I don't think people understand those terms, though. 50% is nowhere near enough of a performance gain. A 100% gain is 30fps to 60fps; a 50% gain would be 30fps to 45fps. In some PC games, it's been said to have 500 to 600% performance gains!
For example, if a game like Forza Horizon 2 is 1080p 30fps and sees a 50% increase, it would get a 15-frame boost to 45fps. If it sees a 100% boost it would get a 30-frame boost to 60fps; at 400% it would be a 120-frame boost, which would be 150fps, etc.
If Xbox One can have at least 300% gains, I would be impressed. :)
PS: I did not down vote you..
Where are you getting these fps numbers from? And when you say 50% increase = 5 fps increase, what's the 50% increase of what?
@kstuffs
Why? This is from the two APIs (DX11 to DX12). Intel ran a game in DX11 at 7fps; in DX12 they had it running at 42fps!! So that's six times the performance:
7 × 6 = 42 (a 6x speedup, i.e. a 500% increase)
The 50% is what lastking95 was referring to for the Xbox One performance increase, but that's not much of an increase. Forza Horizon 2 at 1080p 30fps going from DX11 to DX12 was my example.
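The percentage arithmetic being argued over here is easy to get wrong, so here is a quick sketch. The fps figures are just the examples from the comments above, and the helper names are made up for illustration:

```python
def pct_increase(before_fps, after_fps):
    """Percentage increase implied by a before/after frame rate."""
    return (after_fps - before_fps) / before_fps * 100.0

def apply_gain(before_fps, pct):
    """Frame rate after applying a percentage performance gain."""
    return before_fps * (1 + pct / 100.0)

# Intel's demo figure quoted above: 7 fps under DX11, 42 fps under DX12.
# That is a 6x speedup, i.e. a 500% increase (not 600%).
print(pct_increase(7, 42))   # 500.0

# A 50% gain on a 30 fps game is 45 fps (not 35), and 100% is 60 fps.
print(apply_gain(30, 50))    # 45.0
print(apply_gain(30, 100))   # 60.0
```

The common slip is conflating "N times faster" with "N hundred percent faster": a 6x speedup is a 500% increase, because the original 100% is subtracted.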
Anyone interested, fast forward to 23:00 of this video.
http://youtu.be/47cnFWK0dRM
So because he worked on PS3, his whole entire career of programming is meaningless and he shouldn't have any opinion or comment on a subject that is related to his line of work? Do you realize how silly this sounds?
I have no idea what people are expecting of these APIs. APIs cannot work miracles, they can only do what developers tell them to do.
Armchair experts coming out of the wood works to shoot down this guy's words.
Just because he used to work for Naughty Dog, doesn't mean he was some sort of fanboy trying to talk bad of Xbox/Microsoft. In fact, the fact that this guy worked for Naughty Dog should suggest to you maybe he has a little more knowledge on the technology. It's not like a graphics engineer who used to work with OpenGL is ONLY going to know OpenGL.
is that good or bad? I never got my chart on which days it is good, and which days it's bad.
Also, bubbles for funny.:)
oh yeah, mine as well. best purchase of the century. I'm amazed I can still run modern games at 60 fps with it :)
There are more internal processors that will be lifted with DX12. Remember now, you'll have more processing power from the CPU to strengthen the offloading processors MSFT has yet to even talk about or unlock for developer use.
In none of the released SDK notes does it detail them in the slightest.
I remember the Hotchips and I remember the head scratching...don't sleep on Xbox and don't sleep on DX12.
People have been hating on DX12 for a while now. I bet if it was for PS4 it would be a whole different story, smh. My guess is, if DX12 wasn't going to bring any kind of improvement, then what would be the purpose of even bringing it out? You people make no sense, smh. What's the point in putting a turbo in a Honda if you know it still won't beat a Ferrari?
The argument is that the improvement is a minimal one and not a secret sauce, and that PC stands to gain more, which is true. Also, you are still limited by the hardware, and MS brass made a fatal mistake by putting cheaper tech in the Xbone (DDR3) and have spent this whole gen trying to make up for it, trying to convince people that some secret tech will turn the Xbone into a powerhouse. They said the cloud would enable devs to harness 3x the power of the Xbone and that Titanfall would use the cloud for rendering. Supposedly the cloud would handle all the AI to make Titanfall have the best AI ever seen in a game; funny, because Titanfall both looked bad and had poor AI. Really, MS is taking minor features that will bring marginal improvements and telling people they are major features that will bring mind-blowing (unrealistic) improvements.
Hater-aid on DirectX 12? With all the stories being posted about DirectX 12 on N4G... and you're saying it's getting tons of hate? LMAO.
What is amusing is that Vulkan is also a brand-new, built-from-scratch API, and yet hardly a peep is said about it. And when it does get mentioned, many still claim Sony will not use Vulkan on the PS4. But does that mean no one will be able to use Vulkan on the PS4?
Many gamers have claimed the PS4 would not be able to use Vulkan based on Brad stating "if Sony allows it," but those very same gamers see an ex-ND dev say what he did and go "he has no experience with DirectX 12, he should just shut up"... but, but... Brad knows all about it...
LMAO
all the while GamingBolt is making bank on both sides for stories like this...lol
The fanboyism on the Microsoft side is strong here. All this ex ND dev is doing, is echoing what everyone else has been saying.
The problem is that he was once affiliated with a Sony first party studio, so they're using that to call foul. What's foul is their stupidity.
The guy is an ex developer for a major 1st party game studio. He isn't required to work with DX12 to have some knowledge on it. It's an API. He's worked with them before. He's as qualified as anyone else. It's like saying a Domino's pizza cook isn't qualified to comment on Pizza hut.
I went looking into this guy, and he really knows his stuff. He's no fly by night mobile programmer using some game engine to make games. He knows intimate details of how hardware works, and how software in general works.
People who want to discredit him only do so because they see Naughty Dog, and that somehow instantly nullifies his statements.
I can't imagine any developer in their right mind, regardless of which API they work with, or what level of game they make, wouldn't want his expertise on their team.
Seriously people, go read his blogs on technical stuff. I doubt any of the haters here could understand anything once he starts talking technical. You don't know this stuff without understanding how hardware AND software works. Computers work in specific ways, they aren't some magical all knowing sophisticated AI that can adapt to whatever you may throw at it. Software has to be written for hardware, not the other way around.
Is it not possible that hardware can be made to process data in ways that haven't been conceived before? That a programmer looking at this new hardware, even through experienced eyes, still won't comprehend all of the ways that data can move through that system? Or that DX12 can enable not just some, but all of its intended hardware functions?
Even Einstein couldn't see past his own data that proved the universe was expanding, years before Hubble, because it conflicted with the Universal Constant.
DX12 is so great that it will allow the Xbone to render multiplatform games in 1080p! No more 792p or 900p, but 1080p! Wait... it's coming... "but The Witcher is 1080p." The Witcher is a game that MS paid a lot of money for to be in the Xbox camp; it was at their E3 conference last year, on stage. Also, every dev decides whether to dumb down their game to the Xbone in order to achieve "parity" (lazy) or build it up to PS4 and iterate on the Xbone version, much like they do with cross-gen. Talk is cheap, unless you are paying for positive coverage, then it's expensive, but MS would never do that... would they? *cough* paying YouTubers *cough* non-disclosure *cough*. Only big greedy American companies would use propaganda and illegal tactics, not Macrosoft. Yes, I owned the original Xbox (Fable) and 3 360s, because they kept breaking and wouldn't read the discs anymore. The only reason I went with PS4 over the Xbone is that I couldn't afford to put AA batteries in my Xbox controller anymore (batteries are a fortune in my country).
FFS, this stuff has been reported on a million times now. This is not news anymore. GamingBolt posts the most repetitive stuff.
Why is N4G already drawing such outlandish conclusions about DX12 when the API hasn't even been released, nor have developers gotten their hands on it yet? It is interesting to hear an experienced developer's take on DirectX 12, but everybody uses different techniques and philosophies when developing games.
The only way to really know if Direct X 12 is the next leap in game development or a steaming pile of lies is to wait and see how developers implement this API in future games.
In no way is this news...and it doesn't matter who said it...even IF it was someone that actually knows about it. We all knew from day one/second one when DX12 was announced that it would benefit PCs more than Consoles...this should be a shock to no one.
I've gotta say that you xboxwun guys are going wayyy too overboard thinking DX12 is gonna change stuff on the Xbone. You're probably going to see smoother gameplay in PC versions of games, but that's about it. From what I've seen of Halo 5, it's not looking too hot in the visuals dept. (oddly, Halo 2 Anniversary edition looks like it has better graphics). If that's the first DX12 game on Xbox, then I'm not holding out hope for Gears 4 to look like anything other than an up-rezzed GoW: Judgment or whatever.
If you really want to see improvements and changes and all that new stuff, just get some cheap gaming PC off Craigslist with last year's DX12 stuff in it.
"Ive gotta say that you xboxwun guys are goin wayyy too overboard thinking DX12 is gonna change stuff on xbone."
Huh? Nobody is saying it's going to be doing 4K AAA games all of a sudden. Why do PS fans ALWAYS take everything said to the most extreme of assumptions? It's like you guys need to be reassured by Xbox fans that their console won't be more powerful than yours... weird. Furthermore, the X1 was designed for DX12. It's a known fact that they launched earlier than intended. It's also a known fact that the X1 is currently using a derivative of DX11 and therefore the hardware is not being utilised to anything like its full potential... FACT... but go ahead and read that as 4K gaming. PS fans are uninformed nut jobs who can't stop talking about the X1. You guys can no longer be reasoned with. Why would it even bother you that Xbox fans are enthusiastic about improvements to their console? I don't understand that.
"From what ive seen of Halo 5 its not looking too hot in the visuals dept. (oddly halo2 anniversary edition looks like it has better graphics)"
So you're comparing a completed game's visuals to the graphics of a beta of a game that was a year from launch? A beta that was filled with placeholder textures and assets; a beta that was built on an even earlier build (public beta builds are always older than the day they are released; Halo 5 probably looked better than the beta back in December, but we don't get that build for stability reasons)... You can't really be this stupid, can you?
The rest of your comment was pure nonsense based on your own silly, uninformed rhetoric. It doesn't even warrant responding to. It reads like pure nonsense. Seriously, go get a clue.
"The improved APIs change how long it takes for you to tell the GPU what you want it to do, but they have no effect on how long it takes the GPU to actually execute those instructions."
When are xbox fans going to learn? You are totally deluded to think the system will change. The DX12 hype train really is going for a massive crash.
Like it or not what this person says is totally true.
I'm just about sick of articles talking about what DX12 will and will not potentially do......
Why do they always ask ND staff about their thoughts on DX? Lol, what does ND know about something they never use? I don't recall an ND game using DX since... ever.
I found it, this is what MS are referring to.
http://www.dolphinfitness.c...
Yeah, this is why I stopped getting gaming news from GamingBolt. There are only a few websites I read when it comes to gaming news; GamingBolt is so hung up on resolution and DX11 and DX12 that they'll interview ANYONE about that subject, even people that probably never use it to make games, smh. They've been on my gaming-tabloid list for quite some time, and this article just proved why. Just a clickbait article.
Also, this is the reason I've stopped coming to this website so much, because you'll approve a crap article like this as if it's news; it's not news at all. Why don't you pull some developers from Media Molecule and Guerrilla Games and ask them about DX12 too while you're at it.
Newer APIs will make the CPU faster, but consoles have a shitty AMD 8-core CPU that is beaten by an i3, an Intel dual-core with hyper-threading. No wonder the PC will gain more performance.
It won't make them faster; "faster" indicates an upclock. What it will do is allow optimization for better coding, which will bring performance gains by doing more per cycle, not more cycles.
Enough with the arguing. They say this, they said that about it etc.. How do we know it's going to be a massive improvement for the Xbox one? We will not know until it comes out but for the PC it will help.

"Trailers and feature-length movies simply have a much higher budget per second than what the full game can afford," says Filmic Worlds boss John Hable.
PS4's hardware is 28nm, and the lowest available in the next couple of years is 14nm. By the time foundries are experienced enough with it to reduce initial failure rates to an acceptable level (even just to the same level as current 28nm), expect two times the performance at most, if you want a console with reasonable performance and power envelope in an acceptable form factor for a home console.
And that's assuming everyone is willing to dish out $600 on a home console.
I'm fine with 1080p; just make better games that are fun to play. Visuals are at a nice spot. I don't know why push for higher; details are great right now. If we want higher, get ready to pay more, but I'm good with where it's at.
@SourtreeDing
Exactly. They really don't need to push 4k on consoles, all they need to do is max the graphics and get framerates to a locked 60. Imagine an open world game like fallout with the graphics of The Order. Would anyone really care if it was 4k or not?
It's still early in this gen, so I expect big things in a few years from these consoles, even though I know all games won't be 1080/60 by the end of this gen. The thing is, when next gen comes around we probably will be able to get The Order's graphics at a steady 60fps in all games. That, as far as I'm concerned, would be perfect. But who knows, it may even be better.
Just enough time for 12K to come out, lol. At least prices will go down, and maybe, just maybe, people will be giving away free 1080p TVs because no one wants them anymore XD
These machines (Xbox One even more so) have a hard time hitting a steady 30 frames at 1080p.
The main issue is that consoles need to sell at a reasonable price to be a mainstream product. $400 seems to be the highest most are willing to spend; that means it will take a long time to get to 4K with decent frame rates.
The PC is already there, but consoles take up the bulk of marketing, and therefore, in a sense, console gaming is holding back technical advancement, because most developers use the console market as their main source to sell their games.
@moldybread
Consoles are NOT holding back technical advancement. The reality is that most pc gamers do not have machines that can run games at 4k. If devs were to make 4k the standard for games then they would be catering to a very small audience.
That is the exact reason that pc games are designed with various settings. They are fully aware that most people cannot run games at max settings so they leave it up to users to adjust for their specific hardware. consoles are standard hardware so adjustable settings are not necessary.
The reality is 4k gaming is expensive for EVERYONE, console and pc gamers alike, so devs cater to both console and pc gamers (the majority of gamers) who cannot run games at 4k. Blaming consoles alone is ridiculous.
This. Last gen was good, but it was always significantly behind PC. This gen, games are mostly at native 1080p and have most if not all of the graphics effects featured on the PC versions, albeit at a lower quality setting. Even though PC is still ahead, console graphics are in a nice spot now. They're not as clean as PC, but they're not ugly anymore.
@johnDOE
Well said^^^ PC gamers are blind..
Console specs are all the same, whereas PC is all over the place because everyone has a different rig.
@ninjatogo
We don't care if consoles are behind PC, and that will always be the case..
When these consoles were launched, there was talk about how they would have a shorter lifespan than the previous machines, which were 7 and 8 years old.
They exceeded the usual expected lifespan of 5 years for various reasons not least because of the worldwide recession making expensive new machines somewhat unpalatable. However I wouldn't be surprised if PS4 and Xbox One were replaced not long after they were 5 years old with new hardware.
Not least because they were somewhat more 'budget' machines in the first place compared to the more advanced technology the previous machines launched with and they will age faster relative to PC hardware IMO. You really can already match and then exceed them with pretty budget PC components.
I still wouldn't expect 4k to be the target resolution of any new machines, but it's several years away anyway. Enjoy what we have right now and don't even START thinking about what new hardware can bring for at least another 2 years.
As soon as I saw that the quote was on GamingBolt I knew that wasn't exactly what he said, and behold, I was right.
He said it'll be another 2 console generations before realtime CG graphics hit consoles and add another if you want to render that in 4k. But even then I think it's only going to take 2 gens to hit 4k gaming with CG visual (so PS6 / XB5), the only reason it wouldn't is because one of the two decides to force the other into an early generation, for money's and competition's sake.
QHD - 4k gaming is coming next console generation. 2015's big GPUs are supposed to be the R9 390 and 980 Ti. Those are 8.5 TFLOPS (+300w) and 6.2 TFLOPS (7.44 in comparison, 250w). Those single GPUs should be more than enough to run many of today's games in 4k @ 30fps, considering the standard 980 and R9 290x (weaker cards) can do just that.
In 5 years time those cards should be mid-range, with significantly lower power draws (likely 50% less), 1/3 of the original price ($200 range), superior technology and software drivers, which makes them viable options for a console in their rebranded forms R9 770x and UGTX 460 (2019/2020).
Now that being said if graphics evolve, which they will, then most graphically demanding games will be in the QHD (1440p) / 2k / 3k range, while smaller games and indies will aim for 4k.
Which means the following generation (PS6 / XB5), will be just like this generation and aim for 4k gaming for the majority of games (this one aims for 1080p). Ultimately resolution will be a non-issue next gen (PS5 / XB4), because QHD (1440p) and above pretty much produces a crystal clear image regardless since it's nearly 2x 1080p. And most importantly the gen after (PS6 / XB5) will be the end of graphics and resolution wars for all but the biggest fanboys, because 4k resolutions and higher will be the norm, and graphics will be good for every major game, meaning gameplay, fun factor, and originality will be the main selling and discussion points of games once again.
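For reference, the raw pixel counts behind the resolutions this thread keeps comparing can be checked with a few lines of arithmetic. This is a minimal sketch assuming the common consumer 16:9 modes (not the DCI cinema variants):

```python
# Pixel counts for the display standards discussed in this thread,
# expressed relative to 1080p.
RESOLUTIONS = {
    "1080p (Full HD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (4K UHD)": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px = {pixels / base:.2f}x 1080p")
# 1440p works out to ~1.78x the pixels of 1080p (hence "nearly 2x"),
# while 4K UHD is exactly 4x.
```

So the "nearly 2x 1080p" figure for QHD quoted above holds up, and 4K is a considerably larger jump than QHD.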
@Johndoe11211 "Imagine an open world game like fallout with graphics of the order"
So how do you propose these consoles do that? The Order is a linear shooter and it can't even achieve 1080p/60.
Also in regards to "no one has a capable PC" I'm just gonna assume you're trolling.
http://store.steampowered.c...
46.94% on windows 7 x64
29.47% on windows 8.1 x64
highest x86 is XP at 3.29%
RAM
30.62% on 8gb
21.88% on 4gb
13.36% on 12+gb
GPU
32.8% on 1gb vram
23.05% on 2gb vram
12.98% on 970s
44% on quad core
48% on dual core
@garrettbobbyferguson
It's pretty obvious that you're here just to disagree or poke fire at some personal issue you probably have with me, for you to completely misquote both my statements. Please read my posts over again; if you have a problem with the English language, PM me and I'll try to explain myself better to you privately. I never said I expect The Order's graphics at 1080/60 from this gen, and nowhere did I say "no one has a capable PC". Something is wrong with you.
I am also pretty happy with what the PS4 can achieve at the moment because games like inFamous SS, Driveclub and The Order still manage to amaze me with their visuals. I think we all can agree that these games look amazing!
Sure high end PCs still have the upper hand but 4k is still not a norm today just like how 1080p wasn't in 2006. It will take some more time for it to be mainstream and I am ok with that. I honestly am hoping that next gen, 4k is in the checklist of the PS5 or it will be a disappointment! I think 4-5 years is enough for 4k to mature and for the price of a 4k setup to drop to a more affordable price point.
2 more gen for 4k? That's BS!
As long as it's 1080p and has AA, it's fine for now. Because we watch Blu-ray movies at 1080p and they look phenomenal. Resolution is not the bottleneck, it's the graphics.
@johndoe
"Consoles are NOT holding back technical advancement."
Yes they are. Developers are catering to the larger market. Games like Destiny also support last generation hardware because of the huge install base, which hinders advancement even further. The PC should always be the lead platform, but it isn't always with multiplat games.
have you seen the requirements for oculus rift? they are quite high and that's good. that is why project morpheus will have an uphill battle when it comes to performance on demanding games. instead you will likely see indie style games being supported. if oculus supported the xbox one and ps4 the games would be held back to accommodate the hardware as would oculus rift from being as advanced as it is.
As long as consoles remain popular, the road to 4k gaming will be a long one. 4k TVs are still rather expensive; the more popular they get, the more prices will come down. The same scenario works with gaming: devs will always cater to the most popular userbase. Console gamers don't care that much about frame rates either, which is why so few games go above 30 frames, and we've even seen some PC games capped at 30 frames because they were designed for consoles. The Call of Duty franchise could also host more online players than consoles allow, but that franchise has been dumbed down for consoles because that is where its largest market now is. So tell me again how consoles don't hold back PC gaming.
@Johndoe
While it is true, average pc gamers can't run 4k effectively.
https://www.youtube.com/wat...
http://www.geforce.com/hard...
Thanks to DSR technology, an average pc gamer can run 4k downsampled to 1080p. It's still something, and by messing around with settings you can get a lot more out of a $350 PC than a PS4 or X1.
While the argument is always "most pc gamers". Facts are, "most" pc gamers can play games @ 1080p 60fps. Something consoles struggle to do. I suspect "most" pc gamers will regularly play @ 4k before consoles ever adopt it. The best we can hope for is solid 1080p 60fps next gen, then hopefully 2k(1440p) after that. 4k for consoles is more like 3-4 gens from now. I do expect console cycles to become a 5yr standard from now on, so in 15-20 yrs from now. Just my thoughts though. I would love to be wrong, trust me.
@garretbobbyferguson, The Order is 1080p same as how blu-ray movies are 1080p. Do you think they are also not 1080p? The resolutions are the same on both.
@johndoe11211
PC players very easily have machines that do 60 fps on max settings at 1080p. I can't say that for consoles.
@SourtreeDing
We are not blind, we are playing PC and we can see the downgrades clearly.
Higher details and higher framerate. 30fps is a joke and some games can't even achieve that.
@moldybread
my pc isn't "there".
What you mean is the technology is there. What's holding back advancements on a mainstream scale is price. Not many people are willing to pay over $800 for prettier graphics.. Unless you want the install base to drop dramatically, creating a domino effect resulting in a handful of games releasing each year or possibly consoles dying altogether... Well, I think that is the PC master race's plan for consoles.. so yea. Pony up the dough or gtfo, right?
tfoh
Intel's next gen of Core CPUs will be 10-12 nm; I have no clue what eyeofcore is on about and neither does he TBH.
4k will be maxed on budget GPUs by the time this stagnant gen comes to an end.
Higher and higher resolution can only do so much for gaming right now. Total immersion, that's where it's at. And tech is finally at our fingertips for good VR and AR. I hope to see some major advancements in consumers hands.
4k? Not so much right now. 1080 or 2k VR??? I'll take it, thank you very much.
Johndoe is 100% correct with that notion.
Console gaming is holding no one back. If those teams want to make demanding games ONLY for PC....they are free to do so. No one is stopping them.
@Garrett - "So how do you propose these consoles do that? The Order is a linear shooter and it can't even achieve 1080p/60" Won't, not can't. It uses lesser settings because of what the game is and what they focus on... that is a choice. If you game on PC, you clearly know that hitting 1080p 60fps is nothing more than turning off some effects... I don't see how they "can't" if it's their own game... they very much can if they really want to.
Like I've stated before....if they wanted it SOOOO BAD...they would not make new engines, just use a last gen, dated engine and do 1080p 60fps all day. Clearly...that isn't what all developers want.
1080p and 60fps are NOT ALL the settings to actually judge a game by...I mean...I'm sure we all know that right?
That's like saying those 2 numbers mean more than a new engine. So, does HL1 look better than HL2? What if I told you HL1 is in 1080p 60fps and HL2 is in 720p 30fps? I mean, that setting only means so much. It's not a night and day difference and it doesn't trump a new engine. Not even slightly.
@Ninjia - "Last gen was good, but it was always significantly behind PC."
All gens, generally speaking, will always be behind PC. All gaming PCs will be behind NASA's... so? I mean, lol, it means very little if that hardware is not actually being used as the minimum for development, i.e. do we see R9s being used as minimum specs right now?
PC will always have the edge in making a game look "better" by comparison, but PC at the same time won't be making exclusive games that have minimums beyond that of consoles. It has to do with what many have stated already, not enough own those beast PC's to really solely develop for that crowd.
Many on here need to really ask themselves....if this was what developer wanted....whats stopping them from making a PC exclusive that is minimum a titan card? They can crowd fund if they really wanted that too...
You didn't see it in any other gen, you won't see it now. It has to do with MOST don't own such rigs to even justify such development.
I game on both PC and console and can say its a double edged sword. You "can" have the better graphics, but that option to have "better" also means its not exact like console.....which results in less exclusive HIGH END AAA development.
PC gets the hardware price down, console gets the developers working on higher end hardware, that ultimately gets PC versions being made.
I'm sorry but Witcher 3 and AC Unity are only made on PC because a PS4 and XONE exist to justify the engine and development. Yet we didn't see both titles last gen on PC despite the hardware existing.
@garrettbobbyferguson
I have
windows 8
8gbram
1.8 gb vram
i7 quad core
I can barely run GTA V. My Project CARS is at medium settings with 30fps and 720p.
You need maybe double the power consoles have to get graphics on PC to look as good as on consoles. Optimization >>> brute power. Don't get me wrong, I may be able to play those games at higher settings, but running at 80c temps for long periods will drastically shorten the lifespan of my GPU, same for my CPU. That's why the requirements to play such games are much higher than what consoles have.
It's all relative though. Look at CGI 7 years ago. Or 12 years ago. I think gaming has surpassed that :P. Of course offline rendering of CGI will always have the advantage of time, meaning it can take hours per frame and be acceptable. It doesn't need to worry about refreshing at 30 frames per second or more.
So a bit of a duh comment based on the headline alone (yes, I have not read the article).
1080p is fine for me too, but they should find a way to make next gen 100% backwards compatible.
I don't want to rebuy all my games again for next gen.
I think it will. This gen probably marks the beginning of a standard set of hardware for consoles. It won't be like the PS2 or PS3 era where hardware design was out of the ordinary. The PS5 and Xbox Two will probably be designed almost identically to the PS4 and Xbox One but with higher specs. If they do that, then games will definitely be backwards compatible.
I agree with johndoe11211, now that they have gone with x86 they are unlikely to move on to some other architecture. The only viable one being ARM, and the only reason to do that would be to make it easier to run the same game across mobile and console.
But I find that highly unlikely, they will stay the course with x86, and because of that backwards compatibility will be quite easy. And they may even start releasing new consoles sooner than you'd expect since the development cost is so much lower.
@eyeofcore
I've read your comment about 15 times and I still don't have a bloody clue what you're trying to say.
Next couple years?
AMD's "Zen" CPU comes next year.
And it's 14nm FinFET.
Also AMD & Nvidia GPUs next year will be 16nm with HBM2.
4k monitors are in the $500 range now and in 2 years 1440p/4k will be a nominal thing.
If it's gonna take ps6 to do 4k then ps5 is already holding back pc lol
But if console gamers are already willing to wait from 2005 to the end of 2013 to move from 720p 30fps to 1080p 30fps, then it won't be a problem to play 4k 30fps in 2029 lol
My lord two more generations of downgrades.
I hope pc keeps rising.
Paying for multiplayer plus skins and so many things wrong with consoles.
Consoles are lame as fake.
@DarkOcelet
Because it takes place in a shoe box like ps3 games did.
You won't see an open world game that looks like this on ps4.
http://www.youtube.com/watc...
And before u ask "buh how many pc's can play that??!"
$80M worth.
Priorities, my man. Some people like to spend 400 dollars, others like to spend 2k on a PC. Nothing wrong with that, but others might want to travel. Or hire a really high-end concubine.
Gotta love the ignorant PC Enthusiast talking about stuff they don't understand LOL. This dev is talking about rendering game assets in 4k. The only games that have done this so far are Ryse (2k textures, I believe) and Dragon Age Inquisition (the shiny armor).
The games on PC that say "4k" are only upscaled 1080p native games. All modern games are rendered at 1080p to display in 1080p. Epic Games already said rendering in 4k now adds thousands/millions of dollars to a game's budget. The Crysis devs already went near bankrupt thanks to Ryse's expensive 2k rendering costs.
I really wish 'PC Enthusiasts' would learn about games cuz you really look like idiots when you talk about stuff you're clueless about.
@Ippiki
Maybe you should educate yourself!
https://www.youtube.com/wat...
http://4k.com/gaming/
They do exist!
This is definitely a case of the pot calling the kettle black!
In context, 2k and 4k are referencing resolutions. For digital display, DCI defines them as 2048x1080 and 4096x2160 where the 2k and 4k are references to the first digit in the resolutions. That's the professional world, the most common consumer equivalents for 2k and 4k are 1920x1080(full hd) and 3840x2160(ultra hd). If a game is running at 1920x1080 then it is displaying a 2k render. If it is running at 3840x2160 then it is displaying a 4k render.
When talking assets, we are talking the resolutions of textures. A 2k asset would be a 2048x2048 resolution texture and a 4k asset would be a 4096x4096 resolution texture. Now here's the catch, these two aren't conjoined. You can run any combinations you like at any resolutions you like. You could run a 2k render with 4k assets, or even higher, if you like.
There aren't a ton of pc games that are shipping with 4k assets but there are some and more are showing up all the time and there are plenty of 4k mods if you really want them. Anyway, modern design is less about single massive texture sizes, and more about multiple smaller textures blended together in the shader.
But what do I know, I'm just a clueless, ignorant pc enthusiast that needs to learn about games so I don't look like an idiot.
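To put rough numbers on the 2k/4k asset distinction described above, here's a back-of-envelope sketch of per-texture memory cost, assuming uncompressed RGBA8 (4 bytes per pixel) and no mipmaps. Real engines use block compression and mip chains, so actual costs differ; treat this as an upper-bound illustration only:

```python
def texture_bytes(size_px: int, bytes_per_pixel: int = 4) -> int:
    """Raw size of a square texture, uncompressed RGBA8, no mipmaps."""
    return size_px * size_px * bytes_per_pixel

for size in (1024, 2048, 4096):
    mb = texture_bytes(size) / (1024 ** 2)
    print(f"{size}x{size}: {mb:.0f} MB uncompressed")
# 1024x1024: 4 MB, 2048x2048: 16 MB, 4096x4096: 64 MB.
```

Each doubling of texture resolution quadruples memory, which is one reason 4k assets are expensive regardless of the render resolution you output at.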
Consoles will remain a part of the industry because most console owners don't want to build and maintain a PC.
@Ippiki
I see what you were trying to do by replying to NuggetsOfGold (which is a waste of time since he's a 1 bubble troll, and ignorant), but you yourself went off topic with your explanation of what the dev actually said.
He said it's going to take 2 generations (PS6/XB5) before consoles can produce CG quality graphics in real-time similar to what movies use, and possibly another generation (PS7/XB6) before those CG graphics can be rendered at 4K in real-time.
His main point was talking about rendering CG quality visuals on console, not 4k gaming in general.
@Revolver_x_
What @Ippiki was saying is that most games aren't developed with 4K assets. Many titles still use 1080p-class textures, and the image is simply scaled to 4k, rather than everything being authored in 4k to begin with.
For example
http://www.nexusmods.com/sk...
I know it says 2k texture mod, but that's what he's talking about. Most games still use 1080p-class assets, but are rendered at 4k.
@ippiki: dude... sure, texture resolution and rendering resolution are different things. But they are not dependent on each other. On PC, in almost all cases, you can choose your rendering resolution and your texture resolution with different settings, usually called "resolution" and "texture quality". You can set those to any level you want.
Now, your max resolution is determined by your display. If you have a 1080p monitor/TV, that is the highest _native_ resolution you can output. You may choose to render at 720p and upscale to 1080p to save performance, or render at 1440p and downscale to 1080p to improve visuals.
It's true that most games don't have 4k texture option but that has nothing to do with rendering resolution and not required to output at 4k. There are enough titles that offer 4k textures through mods though.
I personally usually play in 1440p downscaled to 1080p. btw.
"well said^^^ PC gamers are blind..
Console Specs are all the same where as PC is all over the Place because everyone has a different rig"
...console specs are all over the place too, some own xbox 360s, ps3s, ps vitas, Wii-Us, Xbox ones, PS4s...and each of them developers have to support too. And PCs are less complicated, they all read the same programming language and support the same APIs. All of the consoles read different languages and support different APIs.
And once again we have people downplaying CG graphics; just 9 years ago people were buying into the FFVII tech demo. And not to mention people 9 years ago were buying into the 1080p standard, so now all of a sudden 4k's too high? ...I thought 1080p was a bigger leap from 480p SD consoles. 4k is only 2x the leap.
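The size of those generational "leaps" can be checked directly. A quick sketch, assuming 640x480 for the SD era and the standard 16:9 HD/UHD modes (how big the 4k leap feels depends on whether you count linear resolution or total pixels; this counts pixels):

```python
sd = 640 * 480      # 480p (4:3 SD)
hd = 1920 * 1080    # 1080p (Full HD)
uhd = 3840 * 2160   # 4K UHD

print(f"480p  -> 1080p: {hd / sd:.2f}x the pixels")   # 6.75x
print(f"1080p -> 4K:    {uhd / hd:.2f}x the pixels")  # 4.00x
```

So by pixel count the SD-to-HD jump was indeed the bigger one, though 4K is still a 4x jump rather than 2x (the 2x figure holds only for linear width/height).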
"..console specs are all over the place too, some own xbox 360s, ps3s, ps vitas, Wii-Us, Xbox ones, PS4s...and each of them developers have to support too. And PCs are less complicated, they all read the same programming language and support the same APIs. All of the consoles read different languages and support different APIs. "
The absurdity of this comment is mind numbing.
That's because you're seeing from a console gamer's perspective. you're not seeing it from a developer's.
- PS, Xbox, and Nintendo are a lot different from each other. If a developer develops on DX, all PCs can read it; the only difference is the power capabilities. Consoles differ from programming language down to hardware.
And At least PC gamers will be able to link up competing GPUs with the next DX. I'd like to see someone linking an Xbox, PS, and WII-U together. ;)
The Order: 1886 is very close to CGI. If we don't need 4K, we might just wait another cycle. It's close.
No it's not at all. Average washed out greys at best. Not to mention that the game sucked over all and it's now stuck in the bargain bin where it belongs.
If you want more games like The Order to pop up then you are what's wrong with gaming.
GOW3 achieved CGI-level quality, and maybe even exceeded what was available at the time of the original God of War (early 2005).
Ryse
http://gearnuke.com/wp-cont...
I know some scenes are pre-rendered in engine, but hell, that pic above looks better than CGI. Looks real.
Isn't that from a cutscene? As far as I know cutscenes in Ryse are prerendered. Correct me if I'm wrong.
@masterCornholio
Here are Digital Foundry's frame-rate tests; these are realtime scenes.
https://www.youtube.com/wat...
Skip to 2:54 for an in-game shot.
http://www.picisto.com/phot...
All Ryse cutscenes are pre-rendered including that one. I've played it on PC, it's very easy to spot when the game switches from 60 fps to 30 during cutscenes.
A frame counter during a cutscene doesn't mean it's realtime, it just means the framecounter still is running during the cutscene.
It was also a 4 hour long game. I'd much rather have games like Dragon Age: Inquisition and Witcher 3 at 1080p, 30fps than have a 4 hour, linear action game which looks like a CG movie.
You beat it in 4 hours? My first playthrough clocked in at just over 12 hours.
What you dislike about the Order doesn't have anything to do with the fact it almost looks like CGI.
@Transistor
Actually it does. My point is that I doubt you can have long, incredibly intricate games that look like a CG cutscene 100% of the time. The Order focused on graphics over everything else, and in a lot of ways it's barely a complete game. I'd much rather have a deep game than a pretty one.
@Cy
Not all games need to be open world 100 hr RPGs.
It's called choice, and whether some like it or not, I love being able to choose between types and not have to pick between only 100hr open world games with stories so stretched out it becomes a chore to finish them.
Plus, not all of us have 15 hrs per day to spend on games anymore...
Look, don't get me wrong, I love RPGs. You see, they're like steak: I love them so much it's my favorite type of food, but I don't want to eat steak at every single meal because I'd be fed up with it sooner rather than later.
You see I love my steak and my chicken and my pork and my salads and everything in between with equal measure.
So you disliking The Order like that is you saying no to more choice. It's bad for you and for everyone else you persuade.
and quantum break
http://i.ytimg.com/vi/E087G...
http://vignette2.wikia.noco...
Can't wait to see that game at E3. Sam Lake says the team has made remarkable progress since the last time it was shown. Also, one of the guys working on QB also worked with the team that did the visual effects for the movie "Gravity".
I've yet to get an Xbone, Quantum Break may be the game that persuades me.
The Quantum Break reveal trailer is pre-rendered CG, including the first screen shot you posted.
Yeah, those games looked CGI, but this article is referring to CGI-like visuals at 4k in real time. Not even a 40 titan can achieve that.
Lol at all the people who agree that The Order: 1886 and God of War 3 look CGI but disagree that Ryse does, just goes to show that these people don't even know what CGI is and it's just a matter of one-upping Sony and downplaying Microsoft as usual.
Ryse cutscenes are pre-rendered CG. The Order: 1886 cutscenes are realtime. That is the difference. It has nothing to do with your personal attacks.
@ DarkOcelet "The Order 1886 already almost look CGI."
...CGI of 14 years ago still beats it. Spirits Within used fully rendered hair and, of course, 400,000 polygons spent on characters.
http://gamehall.uol.com.br/...
http://static1.gamespot.com...
That being said, Assassin's Creed Unity is the game at the moment that's actually tried to push for rendered hair on characters, and lighting too.
http://i.imgur.com/jyUimVt....
http://i.imgur.com/kX7ZwrF....
At least you said almost.
To most non-hype-driven fanboys The Order was average at best, even graphics wise. To seasoned PC gamers those "awesome" graphics are laughable. Then there's the fact those consoles can't render a game with "decent" graphics like The Order and have the game still be fun to play and not over in mere hours.
The Order was nothing more than a bargain bin game that had never-before-seen levels of hype to sell it to dummies.
One of the worst games I have played in the last 2 years.
To each his own my friend but The Order 1886 was enjoyed by many people.
Also, The Last of Us looked incredible and had amazing gameplay, and so did Gears of War 3.
So I am pretty sure Gears 4 and The Last of Us 2 will reach the graphical fidelity of The Order: 1886 and be awesome to play.
This just reminded me how excited I am to see Quantic Dream's PS4 game. What they achieved on PS3 was pretty crazy, I could see their PS4 game getting pretty close to CGI.
I'm fine with PS4's hardware... let's develop great games before we worry about superficial stuff.
Guy is just butthurt he doesn't work for Naughty Dog anymore.
4K, maybe. But with The Order: 1886, inFamous Second Son, Driveclub, Killzone Shadow Fall and Ryse, I think we're good on that CGI visual part.
Those games you listed do look good; I have the PS ones and later this year will be picking up Ryse & an X1. But to say that these games are on CG level is just inaccurate. In the next 2 gens we will all see what CG gameplay looks like.
"CGI visuals" is too vague and it's a moving target. Consoles have been achieving CGI visuals for a long time now, but cutting edge CGI is another matter. Consoles might never achieve that, unless the image is streamed and not processed locally.
Not achieving 4K is just common sense. There's no reason for these companies to make consoles capable of that when the vast majority of people won't be using them with 4K capable TVs/Monitors.
Brad Wardell believes SW:E1 level CGI only needs around 50-75k draw calls, something that both consoles can achieve. If he's talking about contemporary CGI then he's probably right about 2 generations away.
Interesting thing about going to 4k is that you really don't need a lot of filtering, so it does free up a bit of resources to get to that resolution. If the consoles can hold out like last gen, they may be able to get to 4k by next gen on certain games.
Obviously PS4/One won't achieve CGI quality visuals, the kind of CGI used in films usually takes days or weeks to render on a render farm.
I do think however late PS4/One games will surprise and impress people.
Who thought these machines would do CGI?
They do amazing games, not James Cameron films.
If the Dark Sorcerer tech demo is not CGI quality, I don't understand what he is talking about.
I can remember the Silico Studio tech demo render on console also.
In my humble opinion, he is wrong.
I just want games that work from the start and that don't need 10, 20 or 30 gigs of patches...
When 4k becomes the majority in homes, that's when it should be implemented as a standard in a home console. Give me 1080p locked, 60fps locked, graphics pushed as far as they can go, and a persistent game world where, if you went back even 3 months after you did something, it's still that way. Then and only then will I want the resolution bumped up. In other words, let's get as much out of 1080p 60fps as we can, without all the flaws, and then move up again.
4k needs a ps7 or ps8?
I understand why Naughty Dog kicked this freak guy out.
By the time everyone has a 4k tv, I think the PS6 could handle it at even better graphic effects we have now at 1080p, with all the advancements/improvements.
A GTX 980 can't fit in that form factor, though. I'm saying, at the rate of improvement over generations, 4K would be standard by PS6 at the earliest, not the PS5. Sure, the PS5 could handle it, but you'll have devs trying to get the most out of 1080p first and worrying about the majority of consumer TV resolutions.
Next gen should be doing 4K I'm hoping. I know it'll take awhile to get there but I don't think it'll take 2-3 more generations. I say we're 5 years away from Toy Story type graphics.
I don't think so; you need either SLI GTX 970s or 980s to get reasonable framerates @ 4k, and that is roughly $800 to $1200.
So I really don't think the PS5 or Xbox Two will be rocking games in 4k; aiming for 1440p is a better middle ground for consoles.
Microsoft, or Phil Spencer, said last year that Xbox One can do 4k but PS4 can't. I think we should just wait to see what they have in store for us.
If $300-400 video cards struggle with 4k, how would either console have 4k games? Short of 2k changing its logo.
He is talking about the current gen systems, the Xbox One and PS4, so again he is not talking about future hardware.
Do you mind actually reading his statement that I am replying to, rather than just falsely assuming I am referring directly to the article?
What you are talking about and what he is talking about are two separate things; re-read his statement.
From what I've seen, and how familiar I am with PCs, it may just not happen next gen. Either we get another graphics jump at 1080p or maybe slightly improved graphics at 1440p. It's not that they won't be able to do it, but cost would be through the roof.
That's why mobile gaming will dominate if they move blindly like some people want them to.
Who wants to waste so much money making a game so detailed that it puts the company at risk? No one.
I see the PS4 and X1 as wise decisions and wisely made consoles for video game publishers and makers alike to thrive on.
Does more powerful matter? Well, not for me; I still enjoyed many PS1 and PS2 games more than any PS3 game. It's the game that matters, not how it looks.
If we go the PC route with the mentality of PC gamers, we will end up with publishers like Activision and EA as our only options, and games as bad as Crysis: nothing but good looks. And that's my opinion.
Games like SSB and ICO are, for me, better than any graphics-centered game.
Here, people, read this.
Gameplay is the most important part.
They need to make long and polished campaigns which we can play again and again, just like the PS1 and PS2 days.
That's all.
Ghost Rider, God Hand, Rise to Honor, Shadow of the Colossus, SmackDown Pain, Ben 10, Crash, Downhill Domination, all the sports games, MotoGP and even NFS. I could go on all day, but I can't. They were all damn good because they cared about the most important part, which is gameplay.
Developers need to innovate in gameplay and make it as fun and good as possible. Story and graphics come second.
I agree with you.
But you know, telling a compelling story doesn't need a powerful system; it needs imagination.
Shadow of the Colossus, for example, is still being discussed today because of how deep it was; the game was super fun to play and enjoy.
I honestly didn't like a whole lot of games last gen, and honestly last gen was the worst of all, not because it dragged on too long but because games were bland and gritty. I can't really describe it, but it didn't feel like what I loved from the Genesis/PS1 days, and it was miles behind the PS2.
So much contradiction in this comment, it's funny. You say it's a waste to focus on graphics and performance, but then people constantly downplay Nintendo and mobile. PC gamers have the mentality to support games and invest in them more than console gamers do. How many PC gamers came to console just to become soulless shells of their former selves? You just named one yourself: Crysis was a completely different game on PC. The community around a game also lasts longer on PC; we don't throw a game away just for the next overhyped one.
"Nothing but good looks" is the definition of console games today; most of the best ideas and best games are coming from PC. You say that if it were up to PC gamers we would be left with EA and Activision, but you fail to comprehend that you are currently waiting for a game from a well-known PC dev called The Witcher (console fanboys don't think). The Order ring any bells? I could keep going on about how all y'all care about is graphics, but you can see that in every article on this site. PC gamers care about progress, and it seems like it's slowing to a crawl. I see you cited last-gen games too; how funny. At least this year y'all won't be starving for games.
It's funny that you couldn't understand, and yet you assume that I am like everyone else. Now that's funny.
I am not the kind of guy who leaves a game after getting hyped for the next one, ESPECIALLY in the PS1, PS2, and Sega eras before that, heck, even with some PS3 games.
I hate today's gaming because it's borrowing too much from PCs.
When did I downplay Nintendo? I might not be their biggest fan, but they create some really great games that I play for countless hours.
I didn't buy The Order or Bloodborne. I bought Ratchet, Tearaway, LittleBigPlanet, and many more games for their fun gameplay, or sometimes their very compelling stories, not because of how good they look. Again, to this day I still enjoy old PS1 and PS2 games more than any recent game; I even played some GameCube games and loved them.
PC gamers have exactly the mentality I just described. If not, then why do they build hyper-powerful rigs just to make game X look better and nothing else?
I played Crash, for example. It didn't cost me anything besides the game itself; I didn't pay anything more. Is that bad, in your opinion?
Mobile gaming is gimmicky, and I haven't seen a quality game on it, with very few exceptions, and even those don't compare to the playability of console games.
On the PC gaming front, I just don't like it; it focuses on graphics, and unfortunately consoles last gen borrowed a lot from it, which left the generation lackluster compared to older console generations in terms of memorable games. With a few exceptions, of course; some games were fantastic, like The Last of Us, which focused on what games should be.
Now I am waiting for Tearaway Unfolded. Ever heard of that game? Guess not, because you are looking toward the next multiplat shooter, or maybe the next MMO. LOL
I dislike GamingBolt, mostly for the community of a-holes I seem to see in every comment section of their articles.
One can see it in all their articles posted here (I don't even get why they are approved); they are only in it for the clicks. Clickbait article after clickbait article.
Leave 4K console games for now. Focus on getting 1080p 60fps locked on consoles first.
Next generation we'll get CGI visuals, and the hardware line after that, we'll have cards that can handle 4K. This dev is just flat out wrong. My three-year-old card can almost handle 4K; the only thing limiting it is VRAM.
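To put the VRAM point in some perspective, here's a rough back-of-the-envelope sketch. The buffer count and bytes-per-pixel are my own illustrative assumptions (RGBA8 targets, a simplified deferred-rendering setup), not numbers from the comment, but they show how fast full-screen buffers grow with resolution:

```python
# Rough estimate of render-target memory at different resolutions.
# Assumes 4 bytes per pixel (RGBA8) and a handful of full-screen
# buffers (G-buffer + depth + post-processing) -- an illustrative,
# simplified setup, not any specific engine's layout.

BYTES_PER_PIXEL = 4
NUM_FULLSCREEN_BUFFERS = 6  # assumption for illustration

def render_target_mb(width, height):
    """Memory in MiB for NUM_FULLSCREEN_BUFFERS full-screen targets."""
    return width * height * BYTES_PER_PIXEL * NUM_FULLSCREEN_BUFFERS / (1024 ** 2)

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {render_target_mb(w, h):.0f} MiB of render targets")
```

The targets themselves only quadruple going from 1080p to 4K (4x the pixels); the bigger VRAM hit in practice comes from the higher-resolution textures you need so assets don't look blurry at 4K, but the scaling story is the same.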
Along with 4K gaming and 3D, I'd also like to see Dolby Atmos/DTS:X support, and hopefully more HDMI/USB ports built in as standard features on the next Sony and Microsoft consoles.
I'm fine with 1080p, but if a new console comes out in four years or so, I can see it completely skipping 4K and going to the next best thing, 8K or whatever else.
4K is like HD DVD: something being pushed around that not much takes advantage of, but by the time things start to adapt, they'll have gone one step higher.
It's taking forever to release games at 1080p. Can you imagine 4K? We'd get one game a year!
Some people would say that the latest Battlefront trailer was near movie-CGI quality rendered in real time on consoles... obviously not anywhere close to the latest Avengers quality, though the new lighting/rendering tech they're using makes an AMAZING difference in quality over the previous gen.
A lot of lower budget sci-fi movies would do well with CGI quality as good as the Battlefront trailer.
@Disagree, maybe I should have said, "much" lower budget.
This is why I've been saying 4K is just a buzzword right now; even on PC it's not very viable. High-end cards only average around 30fps, with frequent dips below 20fps. 1080p and 1440p at 120Hz/144Hz are much easier to run, give fluid, fast frame rates, and will be much better for games for a few years yet. It's not surprising to me that it will take two generations before consoles get 4K running.
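A quick pixel-count sketch backs this up. This is simple arithmetic, not a benchmark, and the assumption that shading cost scales roughly linearly with pixels per second is a crude simplification, but it shows the relative load of the resolutions being discussed:

```python
# Pixel-throughput comparison: resolution x refresh rate.
# Shading work scales roughly with pixels drawn per second,
# so this is a crude proxy for relative GPU load.

def pixels_per_second(width, height, fps):
    return width * height * fps

baseline = pixels_per_second(1920, 1080, 60)  # 1080p60 as the reference

targets = {
    "1080p @ 144Hz": pixels_per_second(1920, 1080, 144),
    "1440p @ 120Hz": pixels_per_second(2560, 1440, 120),
    "4K @ 60Hz": pixels_per_second(3840, 2160, 60),
}
for name, pps in targets.items():
    print(f"{name}: {pps / baseline:.1f}x the pixel work of 1080p60")
```

4K at 60Hz pushes four times the pixels of 1080p60, which is why a card that comfortably holds 60fps at 1080p drops toward the 15-30fps range at 4K, matching the dips described above.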
I know for a fact that a 4K-capable console will be announced a week after I invest in a nice new 1080p television; that's just how my luck goes :)
Seriously, this is another consideration as far as a 4K console is concerned. A large percentage of the gaming community will likely have to stomach paying out for a new tv too. Who knows how that might affect growth and sales of any new console.
For my part, I am playing The Witcher 3 on PS4 and think it looks pretty fabulous and plays great too. I am really enjoying my gaming at the moment, regardless of resolution or framerate.
I don't know about gaming, but I expect we're going to see hardware revisions of both consoles this gen that support 4k video output for bluray and video streaming services.
My god. No matter how reasonable someone sounds, PC people strike back. 4K costs way too much; it's that simple. But I am sure the PC master race is all ears for everything.
I don't think it will take two generations for 4K on consoles. If you expect things to remain the status quo, then sure. But in a few weeks we will see a sea change when the R9 390 drops. Several things are different: IIRC it will be on the same die size with a billion more transistors, yet use less power than the previous AMD flagship. AMD is also moving to 20nm, and to 16 or 14nm soon after. That means more transistors with less power and heat, and AMD is really pushing lower power now. Then there is High Bandwidth Memory, which will make GPUs a lot faster, while the new APIs make those GPUs work harder. And lastly, we will be moving on to DDR4 and away from GDDR5. All of those things mean getting a lot more out of our GPUs.
On the CPU side, we are potentially looking at VISC-based chips instead of CISC. If the prototype from last year holds up and they are cheap, that is a ten-times boost on CPU alone over this generation. The prototype was supposedly two to three times faster than a Haswell, and in four or five years that should have increased. Plus it is more power efficient, and AMD is one of the investors in the company making it, so we could see this in an APU soon. Another thing on GPU die size: by 2020, dual GPUs would be a real possibility. And if the other APIs treat multiple GPUs as just graphics/compute assets the way DX12 does, then 4K real-time CGI is a possibility. This year's Computex should shed some light on the next few years.
Umm... I guess it may not be full-blown CGI, but The Order: 1886 looks like one of those CGI cutscenes you'd see in a video game a few years ago, except it's the entire game.
Remember back in the day how a game would show you this intro CGI cutscene and it was so jarring when you saw the actual game graphics? This is no longer the case.
I couldn't care less about 4K graphics right now. It will probably be 10 years before I even get a 4K TV.
Movies have a bigger budget per second because of the actors' high salaries and the stuff they have to build and then destroy in the real world to create the action. In video games, however, you can build the world virtually and destroy it however you want, and you don't have to pay big bucks for top-end actors the way movies do. So I don't know why he compared the budgets of movies vs. games, unless I'm missing something.
You can compare movies and games because game budgets have gotten huge. Look at a game like GTA V; I bet they spent 100 million, and the rumors are they did. Even with high-priced actors, most movies don't spend that. That doesn't compete with the blockbusters, but on average I bet movies and games are pretty close. The typical wide-release theatrical movie probably has a budget in the 30-40 million dollar range, and the average AA or AAA game probably carries a 20-25 million dollar price tag. So they are close and comparable.
I play 4K YouTube footage through my 1080p Sony projector and it looks staggering.
Beyond Two Souls on PS3 achieved CGI visuals LAST generation.
What he surely means is in games that have lots of controllable action all happening at the same time.
4K is already on PC, UHD is going mainstream, and Blu-ray is starting to do 4K. This guy claims it will take two generations because the devs are too lazy to give you 4K.
Two generations is well over 10 years.
Game makers are becoming too lazy: yearly releases that are basically PC mods charged at full price, overpriced DLC, and extra content locked behind pre-orders.
The master race is the only way forward, by the looks of things.
This is what happens when people run their mouths and don't know what they are talking about. Read this:
http://www.gameinformer.com...
Aren't Final Fantasy XV, Quantum Break, The Order: 1886, Uncharted 4, etc. CGI-like already? Final Fantasy XV especially, with its awesome visuals, looks better than the CGI movie Advent Children. I can see 4K coming next gen. 4K as we know it is getting cheaper every year, and 5K and beyond is already happening on monitors and televisions.

Filmic Worlds boss, John Hable also talks about the selection process at Naughty Dog.
They weren't just talented; they also weren't too bored (like many lazy others) to work on the PS3, which was a beast. Kojima, FromSoftware, and Naughty Dog made their own engines specifically for the PS3 (Sony helped them a bit too), which may have been difficult at the start, but they benefited a lot from it later, as they now have their own perfected engines and are putting out some very well-made games. (Kojima started with the MGS4 engine for PS3, which has since evolved into the Fox Engine; Sony gave FromSoftware the PhyreEngine, which evolved into the Bloodborne engine, etc.)
How is the PS3 maxed out with Uncharted 2 or TLoU when there is this?
Crysis 3 on console, real-time:
http://www.gamersyde.com/po...
http://www.gamersyde.com/po...
http://www.gamersyde.com/po...
Uncharted 2 pre-rendered cutscenes:
https://itani15.files.wordp...
http://media1.gameinformer....
yeah I am blind lol...
Third parties never maxed out the PS3; they had to cut a lot from Crysis 3 in terms of features just to make it portable at all.
Crytek uses high-end solutions and does not necessarily optimize them for any particular hardware. Uncharted 2, by contrast, was developed with the CBE in mind and took full advantage of its capabilities, well, in comparison.
I would love to see a next-gen CBE with more bandwidth and a better graphics solution; it's a very underrated chip.
Crysis 3 looked so bad when actually running. The framerate was in the 15-20 range the vast majority of the time, and the jaggies were oppressive.
For all of its effects, Crysis 3 looked worse than Crysis 1 and Crysis 2 on PS3. Crytek tried to do too much, and the horrible performance ruined both the experience and the graphics.
You can max out a system, but you can also keep optimizing your engine on a maxed-out system. You can only get so much raw power out of the hardware; the next step is optimizing your engine to get more performance from it. I think you're confusing the two.
It was obvious that it was maxed out with Uncharted 2, just as it will be maxed out with Uncharted 4 and Star Wars Battlefront.
I messed up; I meant the PS4 will be maxed out with Uncharted 4 and Battlefront, as both games are coming out around the system's third birthday.
A system can be "maxed out" and still show improvement, as he is saying (I'd advise reading beyond the title). Despite what Uncharted 2 did/looked like, God of War III, Heavy Rain, Uncharted 3, and Beyond: Two Souls looked better. Why? Maxing out a machine doesn't mean you can't make improvements in other areas.
The latter portion of the console's life had developers (as Hable put it), "squeezing out" what they could, though he admits, there wasn't really much left. I don't expect Uncharted 4 to be Naughty Dog pushing the PS4 to Uncharted 2's point on the PS3, but their second game? No doubt.
Not really. "Maxed out" means the limit has been reached. If you've reached the outer limits of what it can do, you can't go beyond that. If they were able to tweak something here and optimize something there to get another frame per second, better textures, or faster loading times, it means they didn't hit the max with U2.
What he wanted to say was worded incorrectly.
If the PS3's limit had been 100% reached, you wouldn't have seen better-looking games than Uncharted 2. That is not what "maxed out" means here: the engine's resources couldn't be pushed any further, but its code could still be modified in ways that allowed for optimization (as plenty of PS3 games proved). The Naughty Dog 2.0 engine had reached its limit on the PS3, but there was still enough juice to make games look and run better after Uncharted 2 released.
The engine, at that point, was maxed; the PS3's capabilities were not. That's something I see people get confused about all the time.
You can max out any machine with crappy code. It's easy to do; any idiot can do it.
A computer is only truly maxed out once the software has been fully optimized, and that takes a long time, as someone will keep finding clever solutions to certain problems.
You're not getting it. If you can max something out, that's it. You keep bringing up optimization: if it can still be optimized, it didn't hit max. Max is the end of the road. Like I said above, if they can optimize to get better performance, it was never maxed to begin with. Being able to optimize means it wasn't at max.
... and if graphics were the most important thing to me in a game, this would mean something.
I would say Beyond: Two Souls is, by far, the best looking game on PS3. It didn't have the large explorable areas that Uncharted and The Last of Us had, but wow is it a looker.
Yes, it was a nice-looking game indeed.
It lost me, though. The story was absurd, and the last part felt so rushed and silly that it ruined the game for me.
I prefer Heavy Rain to BTS any day.
Haha, you guys are funny and don't understand. A system can reach its limits with an engine, but then you optimize, and real optimization is when you clean up your engine and start taking resources from what you can't see to spend on what you can see: backgrounds become static, lighting becomes baked, etc.
People in this comments section must be kidding themselves, because Uncharted 2 did not max out the PS3.
Hell, God of War III looked better than Uncharted 2.
I think The Last Guardian would've taken the title for the most maxed-out game on the PS3.
With over a decade of development time it had better be....
The last games that wowed me were probably God of War III's opening gameplay, climbing Gaia and then fighting Poseidon, and Uncharted 3's plane-to-desert sequence, plus the big cruise ship capsizing. I don't think I have seen any game with that polish and scale in a while.
Go play an Uncharted 2 map in Uncharted 3's MP, then go back and play it in Uncharted 2; you will see a huge difference.

I can’t comment on any specific applications
In other words, "I can't really comment on what's going on at MS because I have no idea what they are doing." -___-
And he would know about everything that's going on behind the scenes at MS. I mean, he does work for Naughty... Oh wait. He wouldn't know a d@mn thing, would he? lol
I like how old developers from Sony are commenting on the new things MS is trying to do. lol
Read the article; to sum it all up, it all depends on internet speed and reliability.
Cloud-based graphics and physics improvements are still too early because of the slow, unreliable internet speeds that 80% of gamers have.
MS is onto something, but it's way ahead of its time.