CRank: 5 · Score: 33590

The computer they tested it on was also an AMD APU... and it had no issue.

3827d ago · 4 agree · 0 disagree

You have to make resolution comparisons at the same screen/object size. I have used NioRide's screenshots.

http://i.imgur.com/E4qDwk3....

This lets you see the difference between 1080P and 4K as if both were on the same TV/monitor capable of both 1080P and 4K.

When you view a 4K image on a 1080P screen, the image will appear zoomed in because the pixels are the same size. But you can f...
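To make the pixel math concrete, here is a small sketch (illustrative resolutions only, not tied to any particular screenshot) of why a 4K image viewed 1:1 on a 1080p display looks zoomed in:

```python
# Comparing a 4K (UHD) image viewed pixel-for-pixel on a 1080p display.
image_w, image_h = 3840, 2160    # 4K image
screen_w, screen_h = 1920, 1080  # 1080p display

# At 1:1, each screen pixel shows exactly one image pixel,
# so only a fraction of the image fits on screen at once.
visible_fraction = (screen_w * screen_h) / (image_w * image_h)
print(f"Viewed 1:1, a 1080p screen shows {visible_fraction:.0%} of the 4K image")
```

Only a quarter of the image fits, so it reads as "zoomed in" unless the viewer scales it down.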

4081d ago · 3 agree · 0 disagree

@BlackTar187

Video games render each frame from scratch. You don't upscale. The only reason a console might upscale is to match the display's native resolution.

4081d ago · 4 agree · 0 disagree

1080P would only slow things down if both users' internet could not handle it. It would be nice to have it as an option, or auto-scaling similar to WipeoutHS's dynamic resolution.

Latency too high? It immediately drops to 720P/30FPS. That way users with a capable internet connection can enjoy 1080P/60FPS, and others can have it once their internet speed climbs to that level.
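A minimal sketch of the latency-based scaling described above; the threshold and quality tiers here are hypothetical, not from any real streaming service:

```python
# Hypothetical latency-based stream quality picker.
# The 60 ms threshold and the two tiers are made-up illustration values.
def pick_stream_quality(latency_ms: float) -> tuple[str, int]:
    """Return (resolution, fps) based on measured round-trip latency."""
    if latency_ms <= 60:
        return ("1080p", 60)  # capable connection: full quality
    return ("720p", 30)       # latency too high: drop immediately

print(pick_stream_quality(30))   # ('1080p', 60)
print(pick_stream_quality(120))  # ('720p', 30)
```

A real service would re-measure latency continuously and step back up when the connection improves.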

The thing I do not know, and may be another issue, is the performance of the dedicate...

4190d ago · 0 agree · 1 disagree

There is no technical limitation. This is mainly an issue of convincing the creator/publisher of the game to allow it on PC/X1. The X1 already has dedicated hardware to stream video, and most PC video cards from the last 2-3 years have a dedicated chip for streaming as well.

Remember, share play only works for 60 minutes, then disconnects and you must start another session. This is probably the compromise made with the publishers. It basically lets the player who does n...

4190d ago · 2 agree · 3 disagree

They render both versions of the game at the same time, the old engine at 1328x1080@60FPS and the new engine at 1328x1080@60FPS. So it is actually rendering two 1328x1080 framebuffers:

1328x1080x2=2,868,480
1920x1080=2,073,600

So it is rendering more pixels than full 1080P. The reason it can do that is half of those pixels are a lot easier to handle than normal (regular graphics and assets of Halo 2), while the other half of those pixels are much h...
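The arithmetic above can be checked directly:

```python
# Verifying the pixel counts quoted above.
classic = 1328 * 1080   # old-engine framebuffer
remaster = 1328 * 1080  # new-engine framebuffer
full_hd = 1920 * 1080   # a single native 1080P frame

print(classic + remaster)              # 2,868,480 pixels per frame, both engines
print(full_hd)                         # 2,073,600 pixels for one 1080P frame
print((classic + remaster) / full_hd)  # ~1.38x the pixels of full 1080P
```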

4210d ago · 8 agree · 4 disagree

Nobody in this thread realizes the game has depth of field; the PC shot above simply has it turned off, and the Vita version doesn't have depth of field at all.

As simple as it may seem, DOF affects performance, even if only a little.

http://international.downlo...

4436d ago · 5 agree · 1 disagree

Borderlands is normally blurry in the background (or, if you aim down sights, it will bring into focus whatever you are directly looking at); it is the depth of field effect. Whoever took that PC screenshot simply disabled the depth of field graphics option.

Here you can see it is on:

http://i.imgur.com/VVLAa.jp...

Even the first Borderlands did this:

4436d ago · 4 agree · 2 disagree

I can understand compressing but the share function lowers the resolution? Is there a cap? That is a huge bummer. I was liking the idea of not booting my PC for screenshots/video capture on the PS4 :/

4453d ago · 1 agree · 1 disagree

Dude, I prefer PC gaming by far, but come on, don't act like a tool.

4482d ago · 7 agree · 1 disagree

The PS4 GPU:
18 compute units (1152 cores) clocked at 800MHz

The R9 270: 20 compute units (1280 cores) clocked at 925MHz

We also don't know the spec of the CPU (cores and clock speed) nor the amount of system RAM. I assume it is at least 8GB, not counting the R9 270's 2GB of dedicated video memory (or more if not a reference card, which is common).
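For a rough throughput comparison of those two GPUs, the standard peak-FLOPS formula for this hardware generation (cores x clock x 2 ops per cycle, since a fused multiply-add counts as two FLOPs) gives theoretical peaks only, not real-world performance:

```python
# Theoretical peak single-precision throughput: cores x clock x 2 FLOPs/cycle.
def peak_tflops(cores: int, clock_mhz: int) -> float:
    return cores * clock_mhz * 1e6 * 2 / 1e12

print(f"PS4 GPU: {peak_tflops(1152, 800):.3f} TFLOPS")  # 1.843
print(f"R9 270:  {peak_tflops(1280, 925):.3f} TFLOPS")  # 2.368
```

By this measure the R9 270 sits roughly 28% above the PS4 GPU on paper.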

@Ritsujun

You have no idea about computing hardware if you think OU...

4525d ago · 20 agree · 0 disagree

Well, no. He isn't the only person in the world who understands computer technology...

It is simple: GDDR helps the GPU and hurts the CPU in certain areas of performance, while DDR helps the CPU and hurts the GPU in certain areas of performance.

4661d ago · 3 agree · 2 disagree

There is only X1 footage so far that I know of. And from what I have seen, it switches from anything in about a second at most once you finish a voice command. That is pretty quick considering that is two things in one second (the Xbox analyzes what you said and, once it does, executes whatever you told it to do, which also takes time).

I have no idea why coolmast3r says the X1 is slow. Hell, the X1 is dedicating more of the hardware to the OS than the PS4 (bad for games, good for OS an...

4680d ago · 13 agree · 4 disagree

It has been almost 7-8 years. I would hope we got more than 4x the RAM, especially with how cheap it is these days.

4680d ago · 14 agree · 0 disagree

Detrania

Look, the PS4 does have a better GPU, by a good bit. It is the same GPU architecture as the X1 but with 50% more cores, so you are looking at around 40-50% more graphics horsepower. But the CPUs are nearly identical, both have Blu-ray, and both have 8GB of RAM (though the PS4 has GDDR5, so we see more bandwidth but higher latency, which hurts CPU tasks but helps GPU tasks; the X1 is vice-versa). The X1 also has an HD video capture card (for TV or other inputs). That raises cost even ...
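A back-of-the-envelope check of that 40-50% figure, assuming the commonly reported X1 GPU specs (768 cores at 853MHz, which are not stated in the comment itself) against the PS4's 1152 cores at 800MHz:

```python
# Peak throughput: cores x clock x 2 FLOPs/cycle (fused multiply-add).
# X1 figures (768 cores @ 853 MHz) are assumed from public reports.
x1 = 768 * 853e6 * 2 / 1e12    # ~1.31 TFLOPS peak
ps4 = 1152 * 800e6 * 2 / 1e12  # ~1.84 TFLOPS peak
print(f"PS4 advantage: {ps4 / x1 - 1:.0%}")  # roughly 40% on paper
```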

4680d ago · 1 agree · 0 disagree

Pachter? Since when is he reliable? Flipping a coin is about as accurate as he is.

The X1 comes with Kinect 2 and video in (a video capture card). Whether or not you want that hardware, it costs money. We can debate about things being optional or not, but the point is there is a legitimate reason the X1 costs more even if the GPU in the PS4 is around 40-50% faster. The X1 still has a 1080P camera, an infrared camera, a multi-mic array, and a video capture card. Those aren't che...

4680d ago · 0 agree · 0 disagree

That is a good point. The X1 basically has an HD video capture card inside of it (the HDMI in). Those go for around 200 bucks (some cheaper, some more depending on quality) for a PC.

That plus the Kinect 2 easily explains the higher cost.

I think both consoles are priced pretty fairly, maybe each a tiny bit too much considering the actual hardware inside. What it really comes down to is whether you want Kinect and the HD-in function on the X1 or not. The PS4 video...

4680d ago · 1 agree · 0 disagree

If Killzone SF used 5-6GB, say by increasing texture resolution massively or having MANY varied character models, that doesn't mean the PS4/GPU could handle it. If you cram in a shit ton of textures and the RAM is there to hold them, that doesn't mean the texture fill rate is going to be able to keep up. There are many bottlenecks.

Hell, Minecraft on PC can use GBs of RAM if you keep exploring and have the game set up to load lots of the world into RAM and not cycle out blocks...

4694d ago · 0 agree · 1 disagree

Everyone needs to remember that when we talk about a console game's memory requirements, we are talking about both system and video RAM. When we talk about PC memory requirements, we are talking about just system memory, and just to get the lowest graphical setting the game offers.

Take a look at this article on Metro: Last Light's graphics.

...

The Cell would beat the current PS4 CPU in certain graphics and compression/decompression scenarios. But that's because the Cell was there to help the lame GPU the PS3 had. The PS4 is doing the smart thing, as PCs and most other consoles do, which is have a beefy CPU focus on CPU work (AI, scripting, etc.) and a better GPU focus on the graphics and such. And the GPU will crush the Cell in graphics performance (which isn't that hard to do, really; the Cell is a hybrid, master of none). ...

4735d ago · 1 agree · 0 disagree