EDIT: Also, it seems that Shadow of Mordor isn't actually using 6GB on ultra settings.
This is a little misleading; it seems to be maxing out VRAM on all 4GB cards at the very least, even if performance isn't bad on them.
Ultra textures are not enabled in that video. These are just high.
That video says "Textures: Ultra".
So I beg to differ.
He says it didn't stutter. It did fill up his 4GB of VRAM, but he didn't notice any stuttering. Not sure if he was playing at 1080p or 1440p; will ask him when he's back if anyone's interested.
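(For anyone wanting to check what their card is actually committing rather than guessing from an overlay, here's a minimal sketch using NVIDIA's nvidia-smi CLI. It assumes an NVIDIA card with nvidia-smi on the PATH; keep in mind that "allocated" isn't the same as "needed", since games will happily grab whatever VRAM is free.)

```python
# Minimal sketch: poll dedicated VRAM usage via nvidia-smi (NVIDIA only).
# Assumes nvidia-smi is installed and on the PATH.
import subprocess
import time

def vram_used_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First line only, in case of multi-GPU systems; values are in MiB.
    used, total = (int(x) for x in out.strip().splitlines()[0].split(", "))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_used_mib()
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(1)
```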
The game only comes with "Textures: High" by default; Ultra has to be downloaded separately.
Read the description in the settings: Ultra will only differ from High IF the HD textures are installed.
Well, the user doesn't state whether or not he has them either, right?
Reading through the PC performance thread and reading about the framerate on PS4 (probably 30fps, maybe 30-40fps) makes this thread stupidly funny in retrospect.
I'm pretty sure it's said that the PS4 version is 60fps.
EDIT: Oh, seeing in the other thread that it's not actually 60fps. Just a case of bad information becoming widespread?
Yeah, that'd be hilarious if somebody bought the PS4 version instead, thinking it would run better. Sounds like even very modest PCs are able to run this at 1080p/60fps with decent settings.
And I thought I would make it till The Witcher 3 before a full upgrade... damn.
Yep....
For the games that are actually lookers, I shudder to think what their VRAM requirements will be when games that look like Shadow of Mordor already require 6. =.=
But we already have better-looking games that require less VRAM (in some cases, considerably less). Hell, Crysis 3 came out early last year and it uses considerably less VRAM while also being visually superior in many ways. Now, I don't expect every dev to live up to Crytek standards, but Monolith is obviously doing something wrong.
What would you think if Ryse looked and ran better on worse hardware?
Visually superior in many ways doesn't matter a lot. These requirements are only for the ultra textures; the rest of the graphical effects don't have a big impact on VRAM. Also, Crysis 3 is almost a corridor shooter compared to this.
Ryse is a corridor simulator that's more linear and closed off than your average CoD game, with a FOV of about 30 and a game world about as static and spatially uninteresting as a fucking board that's painted prettily, and it was designed that way intentionally so the Xbone wouldn't explode running it.
Hence I wouldn't give a shit if it performed better, because false equivalencies and tales from the ass from people who don't really know what they're talking about (see: 99.9% of this thread prior to launch) don't really mean anything at the end of the day, just performance over demand.
Draw distance in and of itself doesn't really determine scene complexity. Objects out in the distance are going to have a lower LOD and less detail. In fact, according to Crytek, rendering the cityscape of Crysis 2/3 proved to be more technologically challenging than the more open jungle of the original Crysis. Things such as the number of light sources, the polygon count, materials, the number and quality of the shaders, etc. all play a much larger role. And since open-world games stream in their assets, the memory requirements aren't necessarily greater either.
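(To make the LOD point concrete: engines typically pick a detail level from camera distance, so distant objects cost far less to render. A toy sketch of the idea; the thresholds and level names are invented purely for illustration.)

```python
# Toy illustration of distance-based LOD selection; thresholds and
# level names are invented purely for the example.
LOD_LEVELS = [
    (25.0, "LOD0 (full-detail mesh)"),
    (100.0, "LOD1 (reduced mesh)"),
    (400.0, "LOD2 (low-poly mesh)"),
]

def pick_lod(distance_m: float) -> str:
    """Return the detail level an engine might render at this distance."""
    for max_dist, lod in LOD_LEVELS:
        if distance_m < max_dist:
            return lod
    return "LOD3 (billboard/impostor)"

print(pick_lod(10.0))    # LOD0 (full-detail mesh)
print(pick_lod(250.0))   # LOD2 (low-poly mesh)
print(pick_lod(2000.0))  # LOD3 (billboard/impostor)
```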
This post is full of stupidity.
I really hope that someone gave up on the PC version to buy the PS4 version because of this thread. That would be so good.
I think it's spot on. People should learn not to overreact and wait for the facts to come in.
I dunno, Crysis 3 had some pretty open environments with a lot of variety and detail. Even if the gameplay was linear, can you really say a level like the dam is anything like a corridor shooter from a rendering standpoint?
Haha, based on what? Two games. Next you'll be telling us 12-core processors are going to be mandatory for 1080p.
Batman: Arkham Knight's VRAM requirements will be very interesting, and telling of the future.
I'm struggling to see the difference between high and ultra.
http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures
"They are on monster PCs making the highest possible quality stuff and then we find ways to optimise it, to fit onto next-gen, to fit onto PCs at high-end specs. Then obviously there's going to be that boundary where our monster development PCs are running it OK - but why not give people the option to crank it up? It makes sense to get it out into the world there - we have it, we built it that way to look as good as possible. You might as well, right?"
The differences seem pretty marginal given the massive 6GB recommendation. There are a lot of textures I can see in any given shot Eurogamer posted that could've used a higher-quality texture, such as this obviously low-res, pixelated-looking pillar: http://screenshotcomparison.com/comparison/93883
Or the floor that isn't in the white outline. The ground texture that IS improved in the one shot still looks rather poor in quality. Even if it's an open-world game, there should be a larger increase in quality for DOUBLE the VRAM.
Though I REALLY wish Eurogamer would STOP saving screenshots as JPGs with chroma subsampling, THEN putting a text overlay on them and saving them AGAIN as JPGs.
FFS, use PNG screenshots, then convert to highest-quality JPGs after the overlay has been added.
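(For what it's worth, that workflow is trivial to script. A minimal sketch with the Pillow library; the filenames and overlay text are placeholders. quality=95 with subsampling=0 keeps 4:4:4 chroma, which is what preserves crisp text edges.)

```python
# Minimal sketch: lossless PNG in, ONE high-quality JPEG out, with the
# overlay added before the only lossy save. Requires Pillow
# (pip install Pillow); filenames and overlay text are placeholders.
from PIL import Image, ImageDraw

img = Image.open("screenshot.png").convert("RGB")   # lossless source

draw = ImageDraw.Draw(img)
draw.text((10, 10), "Ultra textures", fill="white")  # the text overlay

# quality=95 plus subsampling=0 (4:4:4) avoids the smeared text and
# colour edges that re-saving a chroma-subsampled JPEG produces.
img.save("screenshot_labeled.jpg", quality=95, subsampling=0)
```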
Here's another I found
http://screenshotcomparison.com/comparison/93719
There are some clear differences in definition in the textures on the ground.
But the guy used JPG compression to the max.
C'mon, guys. Use some common sense.
Based on the natural progress of technology.
It is inevitable. Come on.
Yes, eventually, but that doesn't mean that kind of increase in demands over these specs will happen this gen, or very soon.
It's only a matter of time before PC modders optimize and enhance things on the PC version in ways that the devs either didn't have time, or didn't care, to do, and then this game will REALLY look good, instead of "pretty nice" as it does now. I mean, a single guy fixed the Watch Dogs shortcomings and helped it look as good as the 2012 reveal, which of course we were originally supposed to believe would be of that caliber on consoles.
Not quite. The Watch Dogs mods were basically just reactivating a few bits that got cut for the final release. But not everything. And those that did weren't always working 100% correctly either (probably why they got cut).
So the game requires 6GB and launches with no SLI support. Was ultra just intended for Titan users?
Well, there are 6GB versions of the 280X I think, and the 295X2 has enough. But that is it.
SLI support doesn't give you more usable VRAM, just so you know; in SLI, each card holds its own copy of the same data.
Hm nope. I have a 6GB 780 in my PC atm - http://www.evga.com/products/product.aspx?pn=06g-p4-3787-kr
It doesn't really make sense to put a lot more detail in the textures if the vast majority can't appreciate it anyway. And you just can't make detailed textures everywhere in such a big game.
What? The game has environments that are 2km long. What are you talking about?
Can't you use a lot of tricks with background scenery, though?
Static? In what way?
From a rendering standpoint it doesn't matter if you have a 2km or 200km game world; they demand the same amount of resources per frame if the fidelity of the visible scene is similar. The only difference is in art asset creation, i.e. the artists' capacity to create environments.
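(That's essentially what asset streaming does: only the grid cells near the camera are resident, so total map size barely affects per-frame cost. A toy sketch; the cell size and streaming radius are invented for illustration.)

```python
# Toy sketch of open-world streaming: only grid cells near the camera are
# resident, so a 200km map costs the same per frame as a 2km one.
# Cell size and streaming radius are invented for the example.
CELL_SIZE_M = 250.0
STREAM_RADIUS = 2  # keep a 5x5 block of cells loaded around the camera

def resident_cells(cam_x: float, cam_y: float) -> set:
    """Grid cells that would be loaded for this camera position."""
    cx, cy = int(cam_x // CELL_SIZE_M), int(cam_y // CELL_SIZE_M)
    return {
        (cx + dx, cy + dy)
        for dx in range(-STREAM_RADIUS, STREAM_RADIUS + 1)
        for dy in range(-STREAM_RADIUS, STREAM_RADIUS + 1)
    }

# 25 cells resident no matter whether the map is 2km or 200km across.
print(len(resident_cells(1830.0, 12500.0)))  # 25
```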
Anybody running this with a 750 Ti?
People who do have the game: is it possible you could please make better/more comparisons than EG?
As close to 1:1 as possible? If using JPGs, please use the highest-quality settings.
Then why even bother at all? And why double the VRAM requirement for such marginal differences?
And while it's true you can't make detailed textures for everything (not to mention they're going to be mipped anyway), if you are going to offer a higher-quality version, there should at least be big detail improvements to as many obviously low-resolution textures as possible.
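(Some back-of-the-envelope numbers on why texture resolution dominates VRAM: an uncompressed RGBA8 texture costs width × height × 4 bytes, and a full mip chain adds roughly a third on top, so each doubling of resolution quadruples the cost. A quick sketch; real games use block compression, which cuts these figures by 4-8x.)

```python
# Back-of-the-envelope VRAM cost of one uncompressed RGBA8 texture with a
# full mip chain (each mip is half the width and height of the last).
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    total = 0
    while True:
        total += width * height * bytes_per_pixel
        if width == 1 and height == 1:
            break
        width, height = max(width // 2, 1), max(height // 2, 1)
    return total / (1024 * 1024)

print(f"1024x1024: {texture_mib(1024, 1024):.1f} MiB")  # ~5.3 MiB
print(f"2048x2048: {texture_mib(2048, 2048):.1f} MiB")  # ~21.3 MiB
print(f"4096x4096: {texture_mib(4096, 4096):.1f} MiB")  # ~85.3 MiB
```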
"They are on monster PCs making the highest possible quality stuff and then we find ways to optimise it, to fit onto next-gen, to fit onto PCs at high-end specs. Then obviously there's going to be that boundary where our monster development PCs are running it OK - but why not give people the option to crank it up? It makes sense to get it out into the world there - we have it, we built it that way to look as good as possible. You might as well, right?"
Quick comparison I made. The difference is minor (and it's only some textures, not all), so anyone with a less powerful system shouldn't worry about not being able to run ultra.
https://farm3.staticflickr.com/2946/15418808602_68946c1b17_o.jpg
https://farm4.staticflickr.com/3928/15232397619_d459e26204_o.jpg
Actually, the difference seems to be quite pronounced in some of the screens I have seen, including these. You can see so much more detail in the ultra textures, although the high textures still look fine. If you keep your expectations in check and know that ultra is just a more detailed version of the high textures, it doesn't really seem like you could be disappointed.
It's not like the textures would suddenly change completely.
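(If anyone wants to put a number on these comparisons instead of eyeballing, here's a minimal sketch that pixel-differences two same-sized screenshots with Pillow. Filenames are placeholders, and it's only really meaningful on lossless captures.)

```python
# Minimal sketch: pixel-difference two same-sized screenshots with Pillow.
# Filenames are placeholders; use lossless (PNG) captures for this to
# mean anything, since JPG artifacts also show up as "differences".
from PIL import Image, ImageChops

high = Image.open("high.png").convert("RGB")
ultra = Image.open("ultra.png").convert("RGB")

diff = ImageChops.difference(high, ultra)
diff.save("diff.png")  # brighter pixels = bigger texture differences

# Mean absolute per-channel difference as a rough "how different" score.
hist = diff.histogram()  # 3 channels x 256 bins for an RGB image
pixels = diff.size[0] * diff.size[1]
mean = sum(v * hist[c * 256 + v] for c in range(3) for v in range(256)) / (3 * pixels)
print(f"mean abs difference: {mean:.2f} / 255")
```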
Looks like I'll need to hang on to my PS4 for multiplats.
Console developers are ridiculous.
I really hope Batman doesn't need some shit like 8GB minimum.
For those interested in the 6GB VRAM rumor for Shadow of Mordor, I can show that it was fake.
I just ran the Shadow of Mordor benchmark, and with maxed settings at 1440p (yes, I downloaded the ultra HD texture pack) I get 94fps average.
I also played through the game a bit without an issue.
YouTube video of my benchmark below.
https://www.youtube.com/watch?v=DFg1FpWExY8&feature=youtu.be