Resident Evil is one of the few console games that have excellent IQ.
I don't see any difference between these and the final game.
http://www.youtube.com/watch?v=kLFZVNHPaeM
http://www.youtube.com/watch?v=iaGSSrp49uc
1080p
720p
I can't see the "blurry mess" you are referring to (yes, it's obviously worse, but far from being a blurry mess); pixel quality matters a lot more than quantity.
Are you assuming Nintendo isn't going to release anything else this gen? They'll assuredly have something significant released around the launch of the consoles.
Really, a game running at a slightly lower resolution is a blurry mess now?
Also
1. PC screenshots tend to be downsampled.
2. They often look like someone has thrown butter all over them, with stupid mods and additional effects that kill the original art. Just look at all the ENB mods for Skyrim; people seem to want blurry garbage. They think it looks better for some reason.
3. "Miles better" for me isn't defined by a shift in resolution.
I'd rather have a better-looking game running at a lower resolution than a worse-looking game running at a higher resolution. Half the time I play games in windowed mode so I can alt-tab and do other things at the same time. The flexibility of PC gaming is what brings me back to it, not the graphics or crappy controls.
Obsidian developed KotOR 3 on Frostbite 2. BELIEVE!
If this GPU is pre-GCN with a serious performance gap to PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games.
The E3 build looked noticeably better than the final PC version.
Current gen is close enough to warrant asking for parity. Last gen wasn't and if there's a divide like that next-gen I hope developers don't cheap out shooting for parity. I don't care which unit is more powerful - if there's a gap - I expect (as in: demand) developers to show it.
this is the most troubling, I feel like it's the current gen repeating history. sigh.
720p looks like babyfist pixel garbage, and every time I'm affronted in real life by a console game on even a moderately sized TV I can't fathom how any able-visioned person can stand it.
Current gen is close enough to warrant asking for parity. Last gen wasn't and if there's a divide like that next-gen I hope developers don't cheap out shooting for parity. I don't care which unit is more powerful - if there's a gap - I expect (as in: demand) developers to show it.
Won't happen unless you also agree to pay more for games.
720p looks like babyfist pixel garbage, and every time I'm affronted in real life by a console game on even a moderately sized TV I can't fathom how any able-visioned person can stand it.
I would assume those claiming 720p looks like a blurry mess are using a 1080p panel.
720p DOESN'T look like a blurry mess unless it's upscaled.
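The upscaling point is easy to put numbers on. Here's a quick sketch (just my own illustration in Python, not anyone's engine or scaler code) of the scale factors a 1080p panel has to apply; anything non-integer means the scaler interpolates, which is where the softness comes from:
Code:
# Scale factor when a 1920x1080 panel displays common 16:9 render resolutions.
# Horizontal and vertical factors are equal for 16:9, so one number is enough.
PANEL_W = 1920
for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    sx = PANEL_W / w
    kind = "integer (clean 1:1 mapping)" if sx.is_integer() else "non-integer (interpolated)"
    print(f"{w}x{h}: {sx:.2f}x scale -> {kind}")
So native 720p on a 720p panel maps 1:1, while 720p stretched onto a 1080p panel is a 1.5x resample every frame.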
Things happen. Look at Planetside 2. The beta looked significantly better, but they had to lock out and hide features because the code wasn't up to snuff. Or maybe it was just too advanced for most hardware.
I don't see any difference between these and the final game.
http://www.youtube.com/watch?v=kLFZVNHPaeM
http://www.youtube.com/watch?v=iaGSSrp49uc
1080p
720p
I can't see the "blurry mess" you are referring to (yes, it's obviously worse, and I'm not saying we shouldn't go above 720p next gen, but it's far from being a blurry mess); pixel quality matters a lot more than quantity. There's also the argument that console games are played from a distance, so the effect of a lower resolution isn't as drastic as it would be if you were playing with the screen just a foot away from you. I never play at a non-native resolution on my PC when it's hooked up to my monitor, because the scaling just doesn't look right; even a resolution just a step below 1080p looks quite blurry. It's a different case when I hook it up to my HDTV, though.
Objection. Seriously, 720p only looks good to people who don't use 1080p every day, especially people whose screens aren't natively 1080p.
720p looks like babyfist pixel garbage, and every time I'm affronted in real life by a console game on even a moderately sized TV I can't fathom how any able-visioned person can stand it.
Text and a big pic, so I won't quote.
It doesn't have to be 1280x720 or 1920x1080; it can be something in between. 1280x720 is definitely too low and looks bad enough for people to notice, while 1920x1080 may be too high for certain devs who want to push the envelope. I think we'll see some sort of dynamic resolution implementation become more widespread, because it only makes sense. You don't necessarily need a full 1920x1080 framebuffer in scenes with a lot of action, where the framerate would normally go down.
I don't think there is a need to go down to 720p next gen.
Here are some real-world benchmarks from Xbitlabs:
http://www.amd.com/us/products/desktop/graphics/7000/7770/Pages/radeon-7770.aspx#2
http://www.amd.com/us/products/desktop/graphics/7000/7850/Pages/radeon-7850.aspx#3
The HD 7770 GHz Edition is roughly the same as the rumored Durango GPU: two fewer CUs but a higher clock, the same FLOPS, and similar bandwidth (72 GB/s). Maybe Durango has better bandwidth with its custom memory setup, or a different number of ROPs, etc., but this should still be a roughly good representative of the performance.
The HD 7850 is roughly the same as the rumored Orbis GPU, with similar FLOPS and again fewer CUs but a higher clock; the same caveats as above apply here (see the quick FLOPS check below).
Looking at the Crysis 2 and Battlefield 3 results, which are probably the closest thing to a 'next-gen' game we have right now, both should handle 1080p at 30 fps, and the results should improve on the console versus these PC numbers. Orbis could probably even run the same game with similar image quality at 60 fps in some cases.
Maybe later in the generation they'll decide to go down to 720p, as these games really don't look all that good, are old already, and devs will want to push graphics further.
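Side note on the "roughly the same" comparisons above: the FLOPS figures do line up if you run the usual GCN math. Rough sketch in Python; the Durango/Orbis entries are just the rumored specs this thread is based on, nothing confirmed:
Code:
# GCN throughput: CUs x 64 ALUs x 2 ops (FMA) x clock in GHz = GFLOPS.
# Console entries use the rumored specs discussed in this thread, not official figures.
gpus = [
    ("HD 7770 GHz Edition", 10, 1.000),
    ("Durango (rumored)",   12, 0.800),
    ("HD 7850",             16, 0.860),
    ("Orbis (rumored)",     18, 0.800),
]
for name, cus, ghz in gpus:
    print(f"{name:20s} {cus * 64 * 2 * ghz / 1000:.2f} TFLOPS")
That prints roughly 1.28 vs 1.23 TFLOPS for the 7770/Durango pair and 1.76 vs 1.84 TFLOPS for the 7850/Orbis pair, which is why the desktop cards work as stand-ins.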
Yeah, 1600x900 actually doesn't look bad on a 1080p TV screen for games. I play at that on the TV here and there to get a better frame rate out of my aging PC, and it's not an apparent difference at all; the extra frame rate is definitely worth the minor loss in clarity, which I can't tell when looking at a TV. 720p, though, is pretty obvious. Dynamic resolution switching between 1600x900 and 1080p should be a good solution; that's probably what Durango games will have to do with these rumored specs to run the same game at the same target FPS as Orbis.
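For what it's worth, a dynamic-resolution controller along those lines doesn't need to be fancy. Here's a minimal sketch in Python (the step list, thresholds, and function name are all made up for illustration, not from any shipping engine):
Code:
# Hypothetical dynamic-resolution controller: drop the render resolution a
# notch when the last frame blew the budget, raise it when there's headroom.
STEPS = [(1280, 720), (1600, 900), (1920, 1080)]   # low -> high
TARGET_MS = 33.3                                   # 30 fps frame budget

def next_resolution(index, last_frame_ms):
    if last_frame_ms > TARGET_MS * 1.05 and index > 0:
        return index - 1        # over budget: render fewer pixels next frame
    if last_frame_ms < TARGET_MS * 0.85 and index < len(STEPS) - 1:
        return index + 1        # plenty of headroom: step back up
    return index                # close enough: hold

idx = 2                                  # currently at 1920x1080
idx = next_resolution(idx, 41.0)         # a heavy frame comes in
print(STEPS[idx])                        # -> (1600, 900)
The point is that the target becomes frame time rather than a fixed resolution; the engine trades pixels for milliseconds only when a scene actually gets heavy.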
And/or 3D. 720p will be the standard for next gen.
Every time I read this my blood starts to boil. There are NO reasons to opt for 720p next-generation given the hardware they will work on.
How about 1080i/60? You get 60 fps and 1080 lines, but with the fill rate of 720p/60. It works for TV broadcasts (the TV would show a 1080p image anyway).
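The back-of-the-envelope numbers on that (my own quick Python calc, assuming the usual one 540-line field rendered per refresh):
Code:
# Pixels shaded per second for each mode. 1080i at 60 Hz only renders one
# 1920x540 field per refresh, which is why its cost lands near 720p60.
modes = [("720p60", 1280, 720, 60), ("1080i60", 1920, 540, 60), ("1080p60", 1920, 1080, 60)]
for name, w, h, hz in modes:
    print(f"{name}: {w * h * hz / 1e6:.1f} Mpixels/s")
So 1080i60 shades roughly the same pixels per second as 1080p30 and only about 12% more than 720p60; the catch is the usual interlacing artifacts in motion.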
More hardware lights and texture layers? I don't think so.
Also, the GameCube's GPU (Flipper) was simply less advanced compared to the Xbox's. Stuff like bump mapping and all those bells and whistles were supported by the GeForce 3's feature set.
A couple of developers certainly did very interesting things with the GameCube, just like many did with the PS2, but the Xbox was simply a gen ahead in terms of features supported by its GPU. You can just look at a multiplatform game like Chaos Theory to understand the native differences these consoles had.
http://timothylottes.blogspot.ca/2013/01/pulled-post.html
Pulled the post. :-(
Console warriors strike again? Or maybe bad PR for MS?
720p is just fine if you have good AA.
Nintendo hasn't released a piece of console software that changed the video game landscape from a retail standpoint since before the N64. What are they going to do?
Another Mario? Zelda? Those aren't game changers; those are Nintendo's core, and those people didn't propel the N64 past the PS1 or the GameCube past the PS2, or even the Xbox as far as 3rd parties were concerned.
Nintendo's surge in the living room last generation was powered entirely by hardware differentiation: different input and a much lower starting price point. This generation's hardware deviation isn't showing the same kind of mass appeal and doesn't have the same pricing edge. See the problem?
Nintendo will continue to make healthy profits but they aren't going to suddenly release a piece of software that changes the retail landscape and therefore aren't going to flip 3rd parties away from much more advanced hardware to their console.
Given how the "50% more" comment was openly mocked on B3D for being inaccurate, yet is still getting thrown around out of context, maybe he just wanted to remove his comment for the logical reason he cited.
Those minimum framerates are appalling if you want AA, though.
http://timothylottes.blogspot.ca/2013/01/pulled-post.html
Pulled the post. :-(
Console warriors strike again? Or maybe bad PR for MS?
It's a shame. You have to put things in junior-high-level terms for people to understand.
you'll take FXAA and you'll like it
It looks kind of bad right now in 720p console games, but at 1080p, if the highest-quality presets are used and tweaked a bit for the look of the game you're making, it should look just fine. If there's 8x-16x AF, it should make everything look a lot cleaner than what we have these days.
MSAA is probably going to be pretty rare with these GPUs.
Posts like this and the petty B3D mocking are why I guess the post was removed.
Typical dick waving, point scoring nerd crap vs good respectful speculative discussion = pointless headaches. I get it now.
Petty mocking by random posters maybe, but posters like ERP/bkilian/people-in-the-know poking fun because some of his speculation was just plain wrong is completely valid. It's a shame some are taking his statements out of context, and forced him to take it down.
Won't happen unless you also agree to pay more for games.
Do tell how this works. I haven't paid increased prices for my PC titles for the benefit of an overall better experience. You would do well to know most studios create assets for technology that can't run them... then scale back to fit the platform. They already create with fidelity higher than what we see on consoles/PC.
Low-level programming also takes far more lines of code, which means development costs will go up, since games will take longer to release.
Also an interesting fact: the average programmer at work only types 20 lines of useful code a day.
To put this into perspective, here is an assembly-language example for the IBM z390.
This program adds two preset numbers:
Code:
         PRINT NOGEN
         EQUREGS
ADD      SUBENTRY
         L     R2,ONE
         A     R2,TWO
         XPRNT MSG,L'MSG
         XDECO R2,OUT
         XPRNT OUT,L'OUT
         SUBEXIT
MSG      DC    C'Adding Numbers'
ONE      DC    F'1'
TWO      DC    F'2'
OUT      DS    CL12
         END
Highly efficient and fast at run time, but it takes ages to write compared to a high-level program.
Here's another example of the same concept in Python, but with user input to set the variables.
Code:
C = lambda n, k: n + k
print(C(int(input('n: ')), int(input('k: '))))
I'm not saying low-level coding isn't great for getting things to run efficiently and fast; it just isn't practical in the way businesses are run today.
I should also note that debugging machine-level code is a lot harder than debugging its high-level-language counterpart.
http://timothylottes.blogspot.ca/2013/01/pulled-post.html
Pulled the post. :-(
Console warriors strike again? Or maybe bad PR for MS?