
FXAA creator comments on Orbis, Durango


Corky

Nine out of ten orphans can't tell the difference.
I don't see any difference between these and the final game.

1080p
720p

I can't see the "blurry mess" you are referring to (yes, it's obviously worse, but far from being a blurry mess); the quality of the pixels matters a lot more than the quantity.

What you did there, I dun there saw it.
 

velociraptor

Junior Member
I don't see any difference between these and the final game.

http://www.youtube.com/watch?v=kLFZVNHPaeM
http://www.youtube.com/watch?v=iaGSSrp49uc



1080p
720p

I can't see the "blurry mess" you are referring to (yes, it's obviously worse, but far from being a blurry mess); the quality of the pixels matters a lot more than the quantity.
Resident Evil is one of the few console games which has excellent IQ.

The difference between 720p and 1080p, however, is quite clear (although I can see you have simply given me two of the same links).

I've got Mass Effect 3 on PC, and honestly, it blows the PS3 demo out of the water. 1080p is HUUUUGE.
 

Drek

Member
Are you assuming Nintendo isn't going to release anything else this gen? They'll assuredly have something significant released around the launch of the consoles.

Nintendo hasn't released a piece of console software that changed the video game landscape from a retail standpoint since before the N64. What are they going to do?

Another Mario? Zelda? Those aren't game changers, those are Nintendo's core and those people didn't propel the N64 past the PS1 or the Gamecube past the PS2 or even the Xbox as far as 3rd parties were concerned.

Nintendo's surge in the living room last generation was powered entirely by hardware differentiation. Different input and a much lower starting price point. This generation's hardware deviation is failing to show the same kind of mass appeal and lacks as much of a pricing edge. See the problem?

Nintendo will continue to make healthy profits but they aren't going to suddenly release a piece of software that changes the retail landscape and therefore aren't going to flip 3rd parties away from much more advanced hardware to their console.
 

Hypron

Member
Really, a game running at a slightly lower resolution is a blurry mess now?

Yes, for me it is.

Also
1. PC screenshots tend to be downsampled.

Even without downsampling, 1080p looks a lot sharper than 720p.

2. They often look like someone has thrown butter all over them, with stupid mods and additional effects that kill the original art. Just look at all the ENB mods for Skyrim; people seem to want blurry garbage. They think it looks better for some reason.

I don't like that either and I don't get why you even bring that up as a counter to my argument. It's got nothing to do with the problem at hand. Also, I don't get why you're upset about that if you enjoy the blurry garbage resulting from upscaling games.

3. miles better for me isn't defined by a shift in resolution.

But for some people it is.

I'd rather have a better-looking game running at a lower resolution than a worse-looking game running at a higher resolution. Half the time I play games in windowed mode so I can alt-tab and do other things at the same time. The flexibility of PC gaming is what brings me back to it, not the graphics or crappy controls.

For the bolded: what? For the rest, that's great, but for me (and lots of other people around here, since you can see quite a few people requesting 1080p as a standard feature for next-gen consoles) getting a game with slightly worse effects is definitely worth it if we get a higher resolution in return. Higher resolution really does bring out all the details in the textures that you just can't see at 720p (again, compare DmC console versus PC, or Dark Souls PC versus console).
 

Jack_AG

Banned
Dat Dark Souls resolution.

Twice the amount of pixels means twice the visible detail. I still can't believe those DS assets are the same on consoles - such a shame to miss all that detail from nothing more than a resolution downgrade.
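
As a quick sanity check on the "twice the pixels" figure, here's the raw arithmetic (a back-of-the-envelope sketch of my own, not something from the thread):

Code:
# Pixel counts for the two standard HD frame sizes
px_720p = 1280 * 720     #   921,600 pixels
px_1080p = 1920 * 1080   # 2,073,600 pixels
print(px_1080p / px_720p)  # 2.25 -- "twice" actually undersells it

So 1080p pushes 2.25x the pixels of 720p; if anything, "twice" understates the raw pixel gap.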
 

fvng

Member
If this GPU is pre-GCN with a serious performance gap to PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games.

This is the most troubling part; I feel like it's current-gen history repeating itself. Sigh.
 

J-Rzez

Member
The E3 build looked noticeably better than the final PC version.

Things happen. Look at Planetside 2. The beta looked significantly better, but they had to lock out and hide features because the code wasn't up to snuff. Or maybe it was just too advanced for most hardware.
 

Jack_AG

Banned
This is the most troubling part; I feel like it's current-gen history repeating itself. Sigh.
Current gen is close enough to warrant asking for parity. Last gen wasn't and if there's a divide like that next-gen I hope developers don't cheap out shooting for parity. I don't care which unit is more powerful - if there's a gap - I expect (as in: demand) developers to show it.
 
This is the most troubling part; I feel like it's current-gen history repeating itself. Sigh.

I feel there's more chance of devs backing Orbis, actually. If MS is going after a different crowd than third parties' usual target customer, then they'll probably lose the goodwill of last gen.

Hell, if Orbis is more powerful to a good degree, we may see the return of unpaid exclusives. Otherwise they'll be competing with Sony's first party and any third parties who take full advantage of the machine.
 

ghst

thanks for the laugh
720p looks like babyfist pixel garbage, and every time I'm affronted in real life by a console game on even a moderately sized TV I can't fathom how any able-visioned person can stand it.
 

Chev

Member
Current gen is close enough to warrant asking for parity. Last gen wasn't and if there's a divide like that next-gen I hope developers don't cheap out shooting for parity. I don't care which unit is more powerful - if there's a gap - I expect (as in: demand) developers to show it.
Won't happen unless you also agree to pay more for games.
 

eso76

Member
720p looks like babyfist pixel garbage, and every time I'm affronted in real life by a console game on even a moderately sized TV I can't fathom how any able-visioned person can stand it.

I would assume those claiming 720p looks like a blurry mess are using a 1080p panel.
720p DOESN'T look like a blurry mess unless upscaled.
 

QaaQer

Member
720p looks like babyfist pixel garbage, and every time I'm affronted in real life by a console game on even a moderately sized TV I can't fathom how any able-visioned person can stand it.

If spec rumors are true, you should be the next Kevin Butler.
 

mrklaw

MrArseFace
I would assume those claiming 720p looks like a blurry mess are using a 1080p panel.
720p DOESN'T look like a blurry mess unless upscaled.

But there are very few 720p panels out there. HDTVs are mostly 1080p or 1366x768, so they will require scaling.


Personally I think 720p looks fine upscaled to 1080p on my TV, and if it meant better framerate, AA or IQ I might be ok with it on next gen.
 

sp3000

Member
Things happen. Look at Planetside 2. The beta looked significantly better, but they had to lock out and hide features because the code wasn't up to snuff. Or maybe it was just too advanced for most hardware.

How did the beta look better? Post some evidence.
 

Perkel

Banned
I don't see any difference between these and the final game.

http://www.youtube.com/watch?v=kLFZVNHPaeM
http://www.youtube.com/watch?v=iaGSSrp49uc



1080p
720p

1080p
720p

I can't see the "blurry mess" you are referring to (yes, it's obviously worse, and I'm not saying we shouldn't go above 720p next gen, but it's far from being a blurry mess); the quality of the pixels matters a lot more than the quantity. There's also the argument that console games are played from a distance, so the effect of a lower resolution isn't as drastic as it would be if you were playing with the screen just a foot away. I never play at a non-native resolution on my PC when it's hooked up to my monitor, because the scaling just doesn't look right; even a resolution just one step below 1080p will look quite blurry. It's a different case when I hook it up to my HDTV, though.

Maybe next time try using a game which was created with high resolution in mind? The shitty textures in RE5 won't get better with resolution. Without a high resolution you won't see the details that high-resolution textures provide.


Seriously, 720p only looks good to people who don't use 1080p every day, especially those who don't have native 1080p screens.
 

benny_a

extra source of jiggaflops
Seriously, 720p only looks good to people who don't use 1080p every day, especially those who don't have native 1080p screens.
Objection.

I think 720p is fine. 1080p is better. But saying 720p only looks good because you don't know the glory of 1080p is absurd.

Resolution is nowhere near the most important graphics factor for everyone.
 
Funny stuff.

Unless you have a 100" TV or are sitting 2 feet away, you aren't going to notice much difference between 720p and 1080p if you sit at the proper viewing distance.

Can't believe people are so elitist over everything, "omg I use 1080p every day and I can SO tell the difference, 1080p r00lz 720p dr00lz"
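
For what it's worth, there is real math behind the viewing-distance point above. A rough sketch (my own numbers plus the textbook ~1 arcminute visual-acuity figure; none of it comes from the thread):

Code:
import math

# Farthest distance at which individual pixels are still resolvable,
# assuming ~1 arcminute of visual acuity (a common rule of thumb)
def max_useful_distance_in(diagonal_in, horiz_px, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horiz_px                           # pixel pitch
    return pixel_in / math.tan(math.radians(1 / 60))         # beyond this, pixels merge

for horiz_px, name in [(1280, "720p"), (1920, "1080p")]:
    feet = max_useful_distance_in(50, horiz_px) / 12
    print(f'{name}: individual pixels blend beyond ~{feet:.1f} ft on a 50" TV')

On those assumptions, past roughly 10 feet the eye can no longer resolve even 720p's pixel pitch on a 50" set, so the jump to 1080p buys nothing there, while inside about 6.5 feet full 1080p detail is visible. Both camps in this argument can be right depending on the living room.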
 

Pooya

Member
I don't think there is a need to go down to 720p next gen.

Here are some real-world benchmarks from Xbitlabs:
[image: Xbitlabs benchmark table]


http://www.amd.com/us/products/desktop/graphics/7000/7770/Pages/radeon-7770.aspx#2
http://www.amd.com/us/products/desktop/graphics/7000/7850/Pages/radeon-7850.aspx#3

The HD 7770 GHz Edition is roughly the same as the rumored Durango GPU: two fewer CUs but a higher clock, the same flops, and similar bandwidth (72GB/s). Maybe Durango has better bandwidth with the custom memory setup it has, or a different number of ROPs, etc., but this should still be roughly representative of the performance.

The HD 7850 is roughly the same as the rumored Orbis GPU: similar flops, fewer CUs but a higher clock again. The same caveats as above apply here.
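
To make the "same flops" comparison concrete, here's the arithmetic (a sketch; 64 lanes per CU and 2 ops per lane per cycle are standard for GCN, but the console CU counts and clocks are just the rumored figures from these threads, not confirmed specs):

Code:
# GCN single-precision compute: CUs * 64 lanes * 2 ops/cycle * clock
# Console CU counts and clocks below are rumored specs, not confirmed.
def gcn_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(gcn_tflops(10, 1000))  # HD 7770 GHz Edition -> 1.28 TFLOPS
print(gcn_tflops(12, 800))   # rumored Durango     -> 1.23 TFLOPS
print(gcn_tflops(16, 860))   # HD 7850             -> 1.76 TFLOPS
print(gcn_tflops(18, 800))   # rumored Orbis       -> 1.84 TFLOPS

Within a few percent in both pairings, which is why the desktop cards work as stand-ins.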

Looking at the Crysis 2 and Battlefield 3 results, which are probably the closest thing to 'next-gen' games we have right now, both consoles should handle 1080p at 30fps, and the results should improve on a console versus these PC numbers. Orbis could probably even run the same game with similar image quality at 60fps in some cases.

Maybe later in the generation they'll decide to go down to 720p, since these games really don't look all that good and are old already, and devs will want to push graphics further.
 

jsnepo

Member
Amid all this 720p and upscaling talk, I'd like to ask whether it's better to use the TV's upscaler or the console's. I was playing Arkham City and found a way to force it to upscaled 1080p by disabling 720p in the PS3 settings.
 

Erasus

Member
720p looks like babyfist pixel garbage, and every time I'm affronted in real life by a console game on even a moderately sized TV I can't fathom how any able-visioned person can stand it.

It looks good on plasma, actually! And people coming from last gen who don't own a PC and don't own any 1080p games just don't know any better.
 
It doesn't have to be 1280x720 or 1920x1080; it can be something in between. 1280x720 is definitely too low and looks bad enough for people to notice, and 1920x1080 may be too high for certain devs who want to push the envelope. I think we'll see some sort of dynamic resolution implementation become more widespread, because it only makes sense. You don't necessarily need a full 1920x1080 framebuffer in scenes with a lot of action, where the framerate would normally go down.
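
For anyone unfamiliar with the technique, the heart of a dynamic resolution scheme is just a feedback loop from measured frame time to render-target size. A minimal sketch (my own illustration with invented thresholds, not any engine's actual code):

Code:
# Minimal dynamic-resolution loop (illustrative; thresholds invented)
TARGET_MS = 33.3           # frame budget for 30fps
MIN_W, MAX_W = 1280, 1920  # clamp between 720p-ish and full 1080p width

def adjust_render_width(width, last_frame_ms):
    """Shrink the render target when over budget, grow it back when under."""
    if last_frame_ms > TARGET_MS * 1.05:    # over budget: drop resolution
        return max(MIN_W, int(width * 0.9))
    if last_frame_ms < TARGET_MS * 0.85:    # comfortable headroom: raise it
        return min(MAX_W, int(width * 1.05))
    return width

# Each frame the scene renders at width x (width * 9 // 16), then the
# hardware scaler stretches the result to the native 1080p output.

Wipeout HD reportedly does exactly this while holding the vertical resolution at 1080 and varying only the horizontal axis, which keeps the HUD and the scaler path simple.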
 

Erasus

Member
Text and big pic, so I won't quote.

Yeah, I have a 7770.

It's a great card. BF3 at medium-high settings with post-AA at 1080p runs at 40fps in 64-player multiplayer.

Glorious.

I saw a DmC bench and it hits 70-90 fps maxed out.

It doesn't have to be 1280x720 or 1920x1080; it can be something in between. 1280x720 is definitely too low and looks bad enough for people to notice, and 1920x1080 may be too high for certain devs who want to push the envelope. I think we'll see some sort of dynamic resolution implementation become more widespread, because it only makes sense. You don't necessarily need a full 1920x1080 framebuffer in scenes with a lot of action, where the framerate would normally go down.

Wipeout HD does this and it's a great compromise. On a 42" plasma I can't tell it's not 1080p, and even on my 24" 1080p monitor, where I sit close (not everyone runs consoles through a TV), it looks great. I'm all for dynamic buffers if devs need them to push more effects.
 

Pooya

Member
It doesn't have to be 1280x720 or 1920x1080; it can be something in between. 1280x720 is definitely too low and looks bad enough for people to notice, and 1920x1080 may be too high for certain devs who want to push the envelope. I think we'll see some sort of dynamic resolution implementation become more widespread, because it only makes sense. You don't necessarily need a full 1920x1080 framebuffer in scenes with a lot of action, where the framerate would normally go down.

Yeah, 1600x900 actually doesn't look bad on a 1080p TV screen for games. I play at that on the TV here and there to get a better frame rate out of my aging PC, and the difference isn't apparent at all; it's definitely worth the extra frame rate for a minor loss in clarity that I can't tell when looking at a TV. 720p, though, is pretty obvious. Dynamic resolution switching between 1600x900 and 1080p should be a good solution; this is probably what Durango games will have to do with these rumored specs to run the same game at the same target FPS as Orbis.
 
I don't think there is a need to go down to 720p next gen.

Here are some real-world benchmarks from Xbitlabs:
[image: Xbitlabs benchmark table]


http://www.amd.com/us/products/desktop/graphics/7000/7770/Pages/radeon-7770.aspx#2
http://www.amd.com/us/products/desktop/graphics/7000/7850/Pages/radeon-7850.aspx#3

The HD 7770 GHz Edition is roughly the same as the rumored Durango GPU: two fewer CUs but a higher clock, the same flops, and similar bandwidth (72GB/s). Maybe Durango has better bandwidth with the custom memory setup it has, or a different number of ROPs, etc., but this should still be roughly representative of the performance.

The HD 7850 is roughly the same as the rumored Orbis GPU: similar flops, fewer CUs but a higher clock again. The same caveats as above apply here.

Looking at the Crysis 2 and Battlefield 3 results, which are probably the closest thing to 'next-gen' games we have right now, both consoles should handle 1080p at 30fps, and the results should improve on a console versus these PC numbers. Orbis could probably even run the same game with similar image quality at 60fps in some cases.

Maybe later in the generation they'll decide to go down to 720p, since these games really don't look all that good and are old already, and devs will want to push graphics further.

Those minimum framerates are appalling if you want AA, though.
 

Krilekk

Banned
And/or 3D. 720p will be the standard for next gen; you need a substantial push forward in fidelity to make people buy a next-gen console, and higher resolution alone is not enough.
 

mrklaw

MrArseFace
How about 1080i/60? You get 60fps and 1080 lines but with the fillrate of 720p/60. It works for TV broadcasts (the TV would show a 1080p image anyway).

Yeah, 1600x900 actually doesn't look bad on a 1080p TV screen for games. I play at that on the TV here and there to get a better frame rate out of my aging PC, and the difference isn't apparent at all; it's definitely worth the extra frame rate for a minor loss in clarity that I can't tell when looking at a TV. 720p, though, is pretty obvious. Dynamic resolution switching between 1600x900 and 1080p should be a good solution; this is probably what Durango games will have to do with these rumored specs to run the same game at the same target FPS as Orbis.



Would it make more sense to keep the vertical resolution at 1080, though, and just scale horizontally? So 1280x1080 for instance, or 1440x1080, which is quite a common broadcast resolution compromise.
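
The per-frame pixel budgets make these trade-offs easy to compare (quick arithmetic of my own; note that a 1080i field is 1920x540, which is where the "fillrate of 720p" claim above comes from):

Code:
# Pixels shaded per displayed frame for the options in this discussion
FULL_1080P = 1920 * 1080
options = {
    "720p frame":  1280 * 720,   #   921,600
    "1080i field": 1920 * 540,   # 1,036,800 -- close to the 720p cost
    "1280x1080":   1280 * 1080,  # 1,382,400
    "1600x900":    1600 * 900,   # 1,440,000
    "1440x1080":   1440 * 1080,  # 1,555,200
    "1080p frame": FULL_1080P,   # 2,073,600
}
for name, px in options.items():
    print(f"{name:12s} {px:>9,} px  ({px / FULL_1080P:.0%} of 1080p)")

On those numbers, 1440x1080 costs about 75% of full 1080p while keeping every scanline native, which is presumably why broadcasters settled on it as a compromise.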
 

Eideka

Banned
And/or 3D. 720p will be the standard for next gen
Every time I read this my blood starts to boil. There is NO reason to opt for 720p next generation given the hardware they will be working on.
1280x1080 is the lowest they should go.

This is 2013; most HDTVs are 1080p native.
 

lumzi23

Member
More hardware lights and texture layers? I don't think so.

Also, Gekko was simply a less advanced GPU compared to the Xbox's. Stuff like bump mapping and all those bells and whistles was supported by the GeForce 3 feature set.

A couple of developers certainly did very interesting things with the GameCube, just like many did with the PS2. But the Xbox was simply a gen ahead in terms of features supported by its GPU. You can just look at a multiplatform game like Chaos Theory to see the native differences between these consoles.

I am not sure, but I think it was the Xbox with four of each while the GC had eight of each. Also, as I understand it, the Xbox had more 'standardized' shaders, while the GC was more versatile with its programmable shaders?

The Xbox could pump out stuff like normal mapping all day, but the GC definitely had its own unique advantages, allowing stuff like the displacement mapping seen in the Factor 5 games.
 
Nintendo hasn't released a piece of console software that changed the video game landscape from a retail standpoint since before the N64. What are they going to do?

Another Mario? Zelda? Those aren't game changers, those are Nintendo's core and those people didn't propel the N64 past the PS1 or the Gamecube past the PS2 or even the Xbox as far as 3rd parties were concerned.

Nintendo's surge in the living room last generation was powered entirely by hardware differentiation. Different input and a much lower starting price point. This generation's hardware deviation is failing to show the same kind of mass appeal and lacks as much of a pricing edge. See the problem?

Nintendo will continue to make healthy profits but they aren't going to suddenly release a piece of software that changes the retail landscape and therefore aren't going to flip 3rd parties away from much more advanced hardware to their console.

I would assume those claiming 720p looks like a blurry mess are using a 1080p panel.
720p DOESN'T look like a blurry mess unless upscaled.

Where exactly does one buy an HDTV with a native 1280x720 resolution that will 1:1 pixel-map a 720p input? "720p" HDTVs are garbage, as they're either 1024x768 or 1366x768 and scale all content you feed them. Only specialist kit like DLP projectors or HMDs is actually native 720p, not your average LCD or plasma.

Personally I'm not quite as averse to lower resolutions as ghst, so long as your AA and AF solution is up to scratch.

Dynamic resolution (that always maintains 1080 lines of vertical resolution) would be my personal choice next generation. If AA and filtering are good (and anything less than 2x MSAA + high-quality FXAA and 8x AF is inexcusable on Orbis), then I'm not going to notice a temporary drop in horizontal resolution all that much. This solution maintains a native 1:1-mapped 1080p HUD throughout and should get rid of the vast majority of tearing and dropped frames as well, which is the right compromise to make IMO.
 

QaaQer

Member
Given how the "50% more" comment was openly mocked on B3D for being inaccurate, yet is still getting thrown around out of context, maybe he just wanted to remove his comment for the logical reason he cited.

Posts like this and the petty B3D mocking are why I guess the post was removed.

Typical dick-waving, point-scoring nerd crap versus good, respectful speculative discussion = pointless headaches. I get it now.
 

Pooya

Member
Those minimum framerates are appalling if you want AA, though.

you'll take FXAA and you'll like it :p

It looks kind of bad right now in 720p console games, but at 1080p, if the highest-quality presets are used and tweaked a bit for the look of the game you're making, it should look just fine. If there is 8x-16x AF, it should make everything look a lot cleaner than what we have these days.
MSAA is probably going to be pretty rare with these GPUs.
 
you'll take FXAA and you'll like it :p

It looks kind of bad right now in 720p console games, but at 1080p, if the highest-quality presets are used and tweaked a bit for the look of the game you're making, it should look just fine. If there is 8x-16x AF, it should make everything look a lot cleaner than what we have these days.
MSAA is probably going to be pretty rare with these GPUs.

There's no reason for MSAA to be rare on Orbis; memory bandwidth is incredibly generous. FXAA isn't a replacement for MSAA, it's a complement to it, and it produces its best results when high-quality versions are combined with MSAA.

Durango is where the problem lies, as fitting multiple high-precision 1080p buffers into a 32MB pool is going to prove a juggling act if you want to enable MSAA as well. Wii U developers are struggling to enable MSAA with a 32MB embedded framebuffer, even though they can get away with a 720p render target and lower overall precision.
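
The 32MB squeeze is easy to see with some quick buffer math (a rough sketch; the buffer formats are typical choices and the pool size is the rumored spec, so treat all of it as illustrative):

Code:
# Rough 1080p render-target sizes vs. a 32MB embedded pool
# (typical formats assumed; rumored pool size; all illustrative)
PX = 1920 * 1080
MB = 1024 * 1024

color = PX * 4 / MB           # ~7.9 MB, one RGBA8 color target
depth = PX * 4 / MB           # ~7.9 MB, D24S8 depth/stencil
gbuffer = 4 * color           # ~31.6 MB, a 4-target deferred G-buffer alone
msaa2x = 2 * (color + depth)  # ~31.6 MB, 2xMSAA doubles color and depth samples

print(f"color + depth:      {color + depth:.1f} MB")
print(f"4-target G-buffer:  {gbuffer:.1f} MB (before depth or MSAA)")
print(f"2xMSAA color+depth: {msaa2x:.1f} MB")

On those numbers, a deferred renderer at 1080p fills the pool before MSAA even enters the picture, so developers would be tiling, dropping precision, or dropping resolution: exactly the juggling act described above.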
 
Posts like this and the petty B3D mocking are why I guess the post was removed.

Typical dick-waving, point-scoring nerd crap versus good, respectful speculative discussion = pointless headaches. I get it now.

Petty mocking by random posters, maybe, but posters like ERP/bkilian/people-in-the-know poking fun because some of his speculation was just plain wrong is completely valid. It's a shame some took his statements out of context and forced him to take it down.
 

QaaQer

Member
Petty mocking by random posters, maybe, but posters like ERP/bkilian/people-in-the-know poking fun because some of his speculation was just plain wrong is completely valid. It's a shame some took his statements out of context and forced him to take it down.

yeah, right.

B3D has its fair share of dick-waving; they just hide it better.

And unless those posters have their real names attached to their words, they are just as anonymous and as random as any other internet know-it-all trying to prove how smart they are by scoring points, twisting things, and taking things out of context.

It would have been so much more interesting to have had zero bullshit; maybe the FXAA guy would have answered some questions here or on B3D.

Oh well, back to the nattering...
 

Jack_AG

Banned
Won't happen unless you also agree to pay more for games.
Do tell how this works. I haven't paid higher prices for my PC titles despite getting an overall better experience. You would do well to know that most studios create assets for technology that can't run them... then scale back to fit the platform. They already create at a fidelity higher than what we see on consoles/PC.

Not to mention some devs this gen have gone on record stating that the PS3 version took twice as long and cost twice as much just to attain parity with the 360 version - yet the price of the game remained the same to the consumer.

Fast forward, and we now have apples-to-apples hardware; ease of development should be a far closer space than current gen, coupled with the fact that studios already create at a higher fidelity than what actually ships in their games...

How would it be more expensive?

If there is a gap I want to see it no matter which system is better. If they wind up being extremely similar then parity is fine, save for minor differences between the two.

But, getting back to what you were saying - no. That's not how it works.
 

mkenyon

Banned
I have no idea whether or not this is true, but I ran across this on Overclock.net. Interesting if true:

Low-level programming also takes far more lines of code, meaning that development costs will go up since games will take longer to release.

Also an interesting fact: the average programmer at work only types 20 lines of useful code a day.

To put this into perspective, here is a machine-level (assembler) example for the IBM z390.

This program adds two preset numbers:

Code:
         PRINT NOGEN
         EQUREGS                  DEFINE R0-R15 REGISTER EQUATES
ADD      SUBENTRY                 STANDARD ENTRY LINKAGE
         L     R2,ONE             LOAD FIRST NUMBER INTO R2
         A     R2,TWO             ADD SECOND NUMBER TO R2
         XPRNT MSG,L'MSG          PRINT HEADER MESSAGE
         XDECO R2,OUT             CONVERT SUM TO PRINTABLE DECIMAL
         XPRNT OUT,L'OUT          PRINT THE RESULT
         SUBEXIT                  STANDARD EXIT LINKAGE
MSG      DC    C'Adding Numbers'
ONE      DC    F'1'               FIRST OPERAND (FULLWORD)
TWO      DC    F'2'               SECOND OPERAND (FULLWORD)
OUT      DS    CL12               12-BYTE OUTPUT FIELD
         END

Highly efficient and fast at run time, but it takes ages to write compared to a high-level program.

Here's another example of the same concept in Python, with parameters instead of preset numbers:

Code:
# The same add-two-numbers program as a one-line Python function
C = lambda n, k: n + k

I'm not saying low-level coding isn't great for getting things to run efficiently and fast; it just isn't practical in the way businesses are run today.

I should also note that debugging machine-level code is a lot harder than debugging its high-level counterpart.
 