
BF4 Demo Ran on a Radeon 7990 at 60fps/3K Resolution, Confirming the 7990's Existence

A More Normal Bird

Unconfirmed Member
Please post a build then. Include a wifi card, Windows 8, a power supply powerful enough to power the machine and a monitor that supports that resolution.

You may want to look up downsampling.
 

Easy_D

never left the stone age
To get rid of jaggies and pixel crawling.

That open landscape with the faraway buildings and construction sites would look like shit at 1080p with no real AA, because all the small geometry just turns into a pixelated mess.

See those cranes, chimneys, antennas, gaps in the unfinished buildings etc?

When you play this bit for yourself at 1080p (or 720p *shudder*) it's not going to be pretty...
I'ma get me one of these just to downsample from 3K to 1080p. It'll probably look sexy
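For anyone wondering what downsampling actually buys you: it's just supersampling. You render at a higher resolution than your display and average blocks of pixels down to 1080p, which is what smooths out sub-pixel detail like those cranes and antennas. A rough sketch of the averaging step in Python/numpy, using a clean 2x factor purely for illustration (a real 3K-to-1080p scale isn't an integer ratio, so drivers use a proper resampling filter):

```python
import numpy as np

def downsample_2x(frame):
    """Box-filter a rendered frame down by 2x on each axis.

    frame: (H, W, 3) float array rendered at the higher resolution.
    Each output pixel is the average of a 2x2 block of input pixels,
    which is what kills the jaggies and pixel crawl on thin geometry.
    """
    h, w, c = frame.shape
    h, w = h - h % 2, w - w % 2                      # trim odd edges
    blocks = frame[:h, :w].reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# e.g. a 3840x2160 render averaged down to a 1920x1080 output
hi_res = np.random.rand(2160, 3840, 3)   # stand-in for a rendered frame
print(downsample_2x(hi_res).shape)       # (1080, 1920, 3)
```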
 

MooseKing

Banned
He was saying that the 7990 is 2x 7970 performance and a single 7970 is twice what the PS4 has. PS4 is in between a 7850 and 7870.

3k is also twice the resolution of 1080p when you multiply it out.

Anything in Crossfire is not 2x the performance. More like 1.5x.

The more GPUs you add, the less of a difference each additional one makes.
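Quick pixel math on the "twice the resolution" bit, since "3K" isn't a fixed standard. The 2880x1620 figure below is only an assumption for illustration; whatever the demo actually ran at, the point is that the pixel count, not the horizontal number, is what roughly doubles:

```python
import math

base = 1920 * 1080            # 2,073,600 pixels at 1080p
three_k = 2880 * 1620         # 4,665,600 pixels (one common "3K" target)
print(three_k / base)         # 2.25x the pixels of 1080p

# A 16:9 framebuffer with exactly double the 1080p pixel count would be
# closer to 2715x1527 (each axis scaled by sqrt(2)):
scale = math.sqrt(2)
print(round(1920 * scale), round(1080 * scale))   # 2715 1527
```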
 

DarkFlow

Banned
Please post a build then. Include a wifi card, Windows 8, a power supply powerful enough to power the machine and a monitor that supports that resolution.

Forget it, saw your edit
A wifi card? Who the hell puts a wifi card in a gaming rig?
 

TEH-CJ

Banned
You can't compare PS4's GPU to the 7990 spec-for-spec. PS4 doesn't have nearly as much overhead as PC. It's tough coming up with a good analogy, but as far as performance to cost, it's something like Nissan GT-R (PS4) vs Porsche 911 GT3 (AMD 7990). Huge price difference, similar performance.

I would take a GTR any day over any Porsche. Just saying.
 
A More Normal Bird

Unconfirmed Member
A wifi card? Who the hell puts a wifi card in a gaming rig?

I think the point is to make it an equivalent comparison with a console that includes such features, but it ignores the different contexts of the two systems. A bit like adding the price of a monitor to a PC but not including the price of a TV when costing consoles. The argument is that most people already have TVs, but most people already have computer monitors too - even widescreen HD ones, as they're beneficial for productivity.
 
There's also the fact that a monitor costing upwards of $1,000 is... nuts... whereas TVs often do. Yeah, you can get cheaper, but the quality per dollar is vastly different.

Also, for the console guys, just hook it up to your TV. Have it boot in Big Picture Mode! Mmmm easy. Though not my preference.

Looking at this, I'll get whatever the "high end" 8000 series card is, unless the game needs only as much as BF3 does. Doubtful, but if I can get it to 1080p and 60FPS without it looking like a glass of mud, I'm happy. An 8950 or so should deal with this game rather nicely.
 

Picobrain

Banned
Why do people continue to make that inane argument? Does everyone include the price of their TV in console prices? Because let me tell you, that wouldn't work out well.

If I don't have a PC and I want to buy one, of course I need to buy a monitor too, right? And for a TV, everybody's got one, so it's not the same comparison.
 

Alexios

Cores, shaders and BIOS oh my!
If I don't have a PC and I want to buy one, of course I need to buy a monitor too, right? And for a TV, everybody's got one, so it's not the same comparison.
But is that a 3K TV, and what stops you from hooking the PC up to it, as people already said?
 

Hip Hop

Member
In my experience, it's smooth until the framerate falls under the refresh rate. Then all bets are off. But Crossfire in the 7xxx series isn't as bad as the 5xxx series.

I know it's a little off-topic, but this is the problem I have with a single GTX 670 card; I just got into PC gaming.

If it stays at the refresh rate, it is all good. Anything below that causes microstutter in every game.

Could it be a problem with my card, or something I'm doing wrong?
 

CatPee

Member
I know it's a little off-topic, but this is the problem I have with a single GTX 670 card; I just got into PC gaming.

If it stays at the refresh rate, it is all good. Anything below that causes microstutter in every game.

Could it be a problem with my card, or something I'm doing wrong?

Nah, it's just the reality of refresh rates. Happens in console games too. If the output framerate isn't an even divisor of the monitor's refresh rate, stutter occurs.
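Roughly what "even divisor" means in practice: on a 60 Hz display each refresh is ~16.7 ms, and a frame that isn't ready gets held for an extra refresh, so at rates like 50fps some frames show for one refresh and some for two. A toy model of that pacing (not how any particular driver schedules frames, and it assumes perfectly steady frame times):

```python
from fractions import Fraction
import math

REFRESH = Fraction(1000, 60)   # ms per refresh on an assumed 60 Hz display

def refreshes_per_frame(fps, frames=12):
    """How many refreshes each frame stays on screen when a finished
    frame is held until the next refresh (plain vsync)."""
    frame_time = Fraction(1000, fps)
    shown, last_flip = [], Fraction(0)
    for i in range(1, frames + 1):
        ready = i * frame_time                        # when frame i is done
        flip = math.ceil(ready / REFRESH) * REFRESH   # next refresh after that
        shown.append(int((flip - last_flip) / REFRESH))
        last_flip = flip
    return shown

print(refreshes_per_frame(60))  # [1, 1, 1, ...]       even pacing
print(refreshes_per_frame(50))  # mix of 1s and 2s ->  visible judder
print(refreshes_per_frame(30))  # [2, 2, 2, ...]       even again: 30 divides 60
```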
 
A More Normal Bird

Unconfirmed Member
Without the monitor the thing will still cost at least $1,500. That's if you cheap out on parts

What's your point? That running unreleased top of the line games at 3K/60fps requires expensive hardware? That a brand new console sold at a loss is cheaper than a (vastly) more powerful PC? I really don't understand what argument you're trying to make here or why you brought it up in this thread.
 
I know it's a little off-topic, but this is the problem I have with a single GTX 670 card; I just got into PC gaming.

If it stays at the refresh rate, it is all good. Anything below that causes microstutter in every game.

Could it be a problem with my card, or something I'm doing wrong?

Do you have Vsync turned on? If you're not using triple-buffering or some kind of dynamic Vsync, that'll cut your framerate in half every time it drops below your monitor's refresh rate.
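To put numbers on that: with plain double-buffered Vsync, a frame that misses the 16.7 ms deadline has to wait for the next refresh, so the output rate snaps to whole divisors of the refresh rate (60, 30, 20, 15...). Triple buffering or adaptive Vsync avoids the wait. A back-of-envelope sketch, assuming a 60 Hz display:

```python
import math

def double_buffered_vsync_fps(render_fps, refresh_hz=60):
    """Effective output rate with plain double-buffered vsync: each frame
    occupies a whole number of refreshes, so anything below 60fps on a
    60 Hz display drops straight to 30, then 20, 15, and so on."""
    frame_ms = 1000.0 / render_fps
    refresh_ms = 1000.0 / refresh_hz
    refreshes_per_frame = math.ceil(frame_ms / refresh_ms)
    return refresh_hz / refreshes_per_frame

for fps in (60, 59, 45, 31, 30, 29):
    print(fps, "->", double_buffered_vsync_fps(fps))
# 60 -> 60.0, 59 -> 30.0, 45 -> 30.0, 31 -> 30.0, 30 -> 30.0, 29 -> 20.0
```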
 

SapientWolf

Trucker Sexologist
I know it's a little off-topic, but this is the problem I have with a single GTX 670 card; I just got into PC gaming.

If it stays at the refresh rate, it is all good. Anything below that causes microstutter in every game.

Could it be a problem with my card, or something I'm doing wrong?
That's probably just the expected visual difference between a vsynced 60 and everything else. The effect I'm talking about is so bad that 59fps looks closer to 15fps. And yet 60 is butter smooth. It's probably the visual equivalent of playing the same song on two stereos 1 second out of sync.
 

Hazaro

relies on auto-aim
Hope that's a true solid 60FPS and not lower.
http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Test-4

Doesn't happen to all games, but I can't recommend any dual AMD GPU setup to anyone. It's pointless until AMD fixes it. Buy a Titan if you want power. SLi 670/680 if you can deal with dual GPU issues.

 
You can't compare PS4's GPU to the 7990 spec-for-spec. PS4 doesn't have nearly as much overhead as PC. It's tough coming up with a good analogy, but as far as performance to cost, it's something like Nissan GT-R (PS4) vs Porsche 911 GT3 (AMD 7990). Huge price difference, similar performance.
Similar? It's about as close as a GTX 650 Ti and a GTX Titan. The only similarity is that both are Nvidia.


Without the monitor the thing will still cost at least $1,500. That's if you cheap out on parts
Not really, as my PC cost $1.2k, but I could have gotten a non-SLI motherboard, a non-overclockable CPU and a lower-wattage power supply. Plus the 670 is now about $50 cheaper.
 

SapientWolf

Trucker Sexologist
Hope that's a true solid 60FPS and not lower.
http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Test-4

Doesn't happen to all games, but I can't recommend any dual AMD GPU setup to anyone. It's pointless until AMD fixes it. Buy a Titan if you want power. SLi 670/680 if you can deal with dual GPU issues.

They said that Crossfire is useless in BF3 but I can verify that the claim is 100% false. I'd be curious to see what's really going on though.
 

Valnen

Member
Why do people continue to make that inane argument? Does everyone include the price of their TV in console prices? Because let me tell you, that wouldn't work out well.

If you don't already have a PC, it's only natural to include the price of a good monitor in the price of the build, as well as peripherals. Believe it or not, there are still people out there without PCs, or good monitors/peripherals.
 
Aliasing is visually distinct from macroblocking.

DICE: What you saw yesterday is pre-alpha - it's not the final game or anything. We are lacking a lot of optimization. Without going into detail about what it could look like on a lower spec machine or a higher spec machine, we will have the scalability to bring the most out of any piece of hardware.
It is still 8 months away from final release.
 

Hazaro

relies on auto-aim
They said that Crossfire is useless in BF3 but I can verify that the claim is 100% false. I'd be curious to see what's really going on though.
Everyone is still getting their testing methodology down. Have you tried the frame time viewer with FRAPS? Looking for more info.

I've personally experienced this (not in BF3) when I had 2x6950's.
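If anyone wants to poke at their own logs: the FRAPS benchmark can dump per-frame timestamps, and microstutter shows up as large swings between consecutive frame times even when the average fps looks fine. A rough sketch of that check, assuming a cumulative-millisecond timestamp column in the log (the filename is just a placeholder):

```python
import csv
import statistics

def analyse_frametimes(path):
    """Crude microstutter check on a FRAPS frametimes log.

    Assumes a cumulative millisecond timestamp in the second column;
    adjust the parsing if your log is laid out differently.
    """
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    stamps = [float(r[1]) for r in rows[1:] if len(r) >= 2]    # skip header
    frame_times = [b - a for a, b in zip(stamps, stamps[1:])]  # ms per frame

    avg = statistics.mean(frame_times)
    p99 = sorted(frame_times)[int(0.99 * (len(frame_times) - 1))]
    # Frame-to-frame swing: big values here = microstutter, even at high avg fps.
    swing = statistics.mean(abs(b - a) for a, b in zip(frame_times, frame_times[1:]))

    print(f"avg {1000 / avg:.1f} fps, 99th percentile frame time {p99:.1f} ms, "
          f"avg frame-to-frame swing {swing:.1f} ms")

analyse_frametimes("frametimes.csv")   # placeholder path to a FRAPS log
```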
 

Dash Kappei

Not actually that important
PS4 costs about 500 dollars less, comes with a BR and HD and everything else you need to play games... so obviously it's not going to compete head to head with a finished PC decked out with a 7990.

All in all, it compares well based on economics. And if people stopped caring about 1080p on consoles (which is worthless at most comfy couch playing distances), it would probably do quite well.

Give me 60fps at 720p (or even high 600s) and massive scale, and from 12ft away I'll be happy in MP games.


LMAO
Next you'll tell me the human eye can't possibly see beyond 60fps. Are people seriously still generalizing with such broad, stupid, misinformed statements on a gaming forum?
 
A More Normal Bird

Unconfirmed Member
It is still 8 months away from final release.

Optimisation doesn't have much to do with someone perceiving a large amount of aliasing for 3K being downsampled to 1080p. At that res, even without MSAA or post-AA, aliasing should be pretty low. Not that I necessarily agree with NBToaster; I haven't seen the footage so I can't comment.
 

acm2000

Member
Someone explain the point of the Titan GPU? Less powerful and more expensive than a 690, but you save some energy; is that it?
 
Optimisation doesn't have much to do with someone perceiving a large amount of aliasing for 3K being downsampled to 1080p. At that res, even without MSAA or post-AA, aliasing should be pretty low. Not that I necessarily agree with NBToaster; I haven't seen the footage so I can't comment.

The 3K version was for the theatre footage; it's maybe not the same one as the YouTube stream.
 

SapientWolf

Trucker Sexologist
Everyone is still getting their testing methodology down, have you tried the frame time viewer with FRAPS? Looking for more info.

I've personally experienced this (not in BF3) when I had 2x6950's.
They're not using the frametime viewer.

Maybe there's something wrong with their setup but Crossfire is smooth as silk for me in BF3 with two 5850s. I turned it off to test their theory and it was horrible. Now I remember why I bought the second card in the first place.
 
You can't compare PS4's GPU to the 7990 spec-for-spec. PS4 doesn't have nearly as much overhead as PC. It's tough coming up with a good analogy, but as far as performance to cost, it's something like Nissan GT-R (PS4) vs Porsche 911 GT3 (AMD 7990). Huge price difference, similar performance.

So you expect the PS4 to have similar performance to a 7990... Oh dear, this fall is going to be absolutely hilarious.
 
Would be nice if they stopped with those misleading demos.

DICE said:
The demo is the visual target of what we want the game to look like, and when I say visual target I don't want people to confuse that with rendering pictures that you could never create in the actual game - when we create our visual targets a big part of that is to make it realistic, as in you will be able to run it on a machine that you can buy.

yeah how misleading
 
A More Normal Bird

Unconfirmed Member
The 3K version was for the theatre footage; it's maybe not the same one as the YouTube stream.

Maybe not. No idea why they wouldn't want their demo to look as good as possible though. Also doesn't make your point regarding optimisation any more pertinent. Once again, I haven't seen the footage, so I can't comment. There's a chance the YouTube stream was downsampled from 3K and NBToaster is just being fussy.
 
Someone explain the point of the Titan GPU? Less powerful and more expensive than a 690, but you save some energy; is that it?

You avoid multiple GPU issues. There are a lot of people who intentionally avoid SLI and CrossFire because even though they've apparently improved a lot over the years, you still hear a lot of complaints, particularly near the launch of games. Some people, like myself, just want to avoid adding one more potential complication. Personally, I'm not willing to pay $1000 but I do look to buy the most powerful single GPU if it doesn't shatter my budget.
 

sflufan

Banned
You avoid multiple GPU issues. There are a lot of people who intentionally avoid SLI and CrossFire because even though they've apparently improved a lot over the years, you still hear a lot of complaints, particularly near the launch of games. Some people, like myself, just want to avoid adding one more potential complication. Personally, I'm not willing to pay $1000 but I do look to buy the most powerful single GPU if it doesn't shatter my budget.

I'm one of these people.

I absolutely refuse to use any dual-GPU solution. Just give me a bloody powerful single GPU and I'm golden!
 
Specs got leaked:


The AMD Radeon HD 7990 "Malta" has two Tahiti XT cores, which result in a total of 8.6 billion transistors, 4096 stream processors, 2 primitives/clock, 256 texture mapping units and 64 raster operation units. The core is clocked at 1000 MHz (1 GHz), though it is not known whether GPU Boost is enabled. The card is equipped with a massive 6 GB of GDDR5 memory that operates along a 384-bit x 2 memory interface and is clocked at 6.0 Gbps effective. The memory pumps out an impressive 576.0 GB/s of bandwidth. The Radeon 7990 has a peak compute performance of 8.2 TFLOPS.

http://wccftech.com/amd-radeon-hd-7...ons-disclosed-dual-tahiti-gpus-clocked-1-ghz/

Seems like the Ares II is still the king of dual GPUs.
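For what it's worth, the headline numbers in that leak hang together. A quick sanity check using only the figures quoted above (the 2 FLOPs per stream processor per clock is the standard fused multiply-add assumption for GCN):

```python
# Sanity-checking the leaked Radeon HD 7990 numbers.
stream_processors = 4096        # 2 x 2048 across the two Tahiti XT GPUs
core_clock_ghz = 1.0
tflops = stream_processors * 2 * core_clock_ghz / 1000   # 2 FLOPs (FMA) per SP per clock
print(tflops)                   # 8.192 -> the quoted ~8.2 TFLOPS

bus_width_bits = 384            # per GPU
effective_gbps = 6.0            # GDDR5 effective data rate
per_gpu_bandwidth = bus_width_bits / 8 * effective_gbps   # 288.0 GB/s per GPU
print(per_gpu_bandwidth * 2)    # 576.0 GB/s across both GPUs
```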
 