
The Battlefield 3 Beta
It was an exciting day at PC Perspective yesterday, with much of our time dedicated to finding, installing and playing the new Battlefield 3 public beta. Released on the morning of the 26th to those of you who had pre-ordered BF3 on Origin or purchased Medal of Honor prior to July 26th, the beta gives early-access players a couple of days' head start, which should give those of you with more FPS talent than me a leg up once the open beta starts on Thursday the 29th.
My purpose in playing Battlefield 3 yesterday was purely scientific, of course. We wanted to test a handful of cards from both AMD and NVIDIA to see how the beta performed. With all of the talk about needing to upgrade your system and the relatively high recommended system requirements, there is a lot of worry that just about anyone without a current generation GPU is going to need to shell out some cash.

AMD has mentioned that the next-gen Xbox is on track for release in 2027, which means we might be in the final year of the Series X|S.
NVIDIA rolled out the DLSS 4.5 update at CES last week, adding 2nd Gen Transformer-based Super Resolution technology for all RTX GPUs. The performance scaling varies wildly across the older (RTX 20/RTX 30) and newer (RTX 40/RTX 50) GeForce RTX lineups. We tested NVIDIA’s next-gen upscaling solution across Cyberpunk 2077, Black Myth: Wukong, Oblivion Remastered, and KCD 2.
I've been surprised by this; the difference between 4 and 4.5 is very noticeable. It has almost completely removed that weird dark ghosting that you'd get in foggy games like Silent Hill 2... and Cyberpunk mixed with a high-res texture pack is jaw-dropping at 4K ultra.
Also, in case anyone doesn't know about it, I recommend DLSS Swapper; it lets you inject the latest DLSS version into older games.
Quite amazing. But this probably means devs will depend on AI even more for their supposed optimizations lol.
No offense to AMD, but this sort of stuff shows that they are always going to be playing catch-up. I guess Nintendo can take advantage of some of these features.
"Better than native."
Native 4K in nearly all games nowadays is actually native resolution with forced temporal anti-aliasing.
TAA smears and blurs frames together to soften jagged edges.
Of course DLSS makes games look "better than native" because native alone without any competent AA methods makes games look horrible.
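To make the TAA point above concrete: at its core, temporal anti-aliasing is an exponential moving average of the current frame into a history buffer. The sketch below is purely illustrative (not any engine's actual implementation, and real TAA also adds motion-vector reprojection and history clamping); the function name and alpha value are my own assumptions.

```python
import numpy as np

def taa_accumulate(history: np.ndarray, current: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    """Blend the current frame into the history buffer (exponential moving average).

    A small alpha keeps more history: aliasing averages out across frames,
    but anything in motion leaves ghosting/smear -- the blur people complain
    about with forced TAA.
    """
    return (1.0 - alpha) * history + alpha * current

# A flickering (aliased) pixel alternating between 0.0 and 1.0 settles
# near its true average instead of strobing:
history = np.array([0.5])
for value in [0.0, 1.0] * 10:
    history = taa_accumulate(history, np.array([value]), alpha=0.1)
```

This also shows why upscalers can beat this baseline: they replace the naive blend with learned heuristics for when to trust history versus the current frame.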
FSR 4 was a substantial improvement to AMD’s upscaling solution. It reduces ghosting and improves detail retention in fine meshes and particle effects. In most cases, it delivers visual quality similar to DLSS 4’s CNN model, but slightly worse than the newer transformer model.
Since FSR is open source and NVIDIA's DLSS isn't, I'd personally always prefer FSR.
Frankly, I think all these differences are nice to know about (and notice) if you're playing at the Digital Foundry level, and I totally respect that very small need to max out performance.
But given the prices, I don't think any NVIDIA GPU advantage justifies paying $1,000+. I don't see any game, PC-exclusive or otherwise, that offers a fundamentally different and innovative gameplay experience.
I don't know about anyone else, but I've never had two screens running side by side to know the difference in performance of a given game. It's like those TV screen comparisons: virtually nobody in the real world actually does this, lol. Performance seems comparable to me. Besides, NVIDIA is no longer interested in gaming products; it's full steam ahead with "AI".
I've been lucky enough to get a new 5090 build in March, glad I went with Nvidia. Cyberpunk looks amazing.
Just tested the PS3 version and the PC version. Was it worth investing in a GTX 590? ...No.
The graphics aren't as mind-blowing as some want them to be.
Looks like I should be all right with my Sapphire 6870 then, especially when it's overclocked to 950/1150 and runs BF3 at 1080p rather than 1200p. Still waiting for the beta.
I'm playing the game with an HD 6970 at 2560x1440, and I get 20-35 FPS outside and 40-50 FPS inside.