
Digital Foundry: Should you install The Witcher 3's day one patch? (XO version)

mike4001_

Member
So 1080p title screens and pre-rendered videos make a 1080p game now?

 

c0de

Member
I also heard you had to be well above 30/60 fps to have a real locked framerate, in order to accommodate the more intense moments in the game that would barely run 30/60 fps without the lock (i.e. when shit goes crazy).

But then again, that's just something I think I heard in other threads and I'm definitely not an expert, so don't quote me on that.

As I said before: you want to hit the desired framerate in the worst case possible, but that doesn't necessarily mean the game runs way above it in other situations.
 
I also heard you had to be well above 30/60 fps to have a real locked framerate, in order to accommodate the more intense moments in the game that would barely run 30/60 fps without the lock (i.e. when shit goes crazy).

A developer can decide up front to budget time for a frame. Nintendo, for example, does this to get their locked 60fps. You just don't make your engine do more than this. There are no intense moments - at least, none that you don't plan for. Obviously this is more feasible in some games than others. You aren't going to be able to budget Minecraft for the player setting off 10,000 TNT blocks.

So, you could have budgeted for 30fps and brought in some efficiencies to your engine that got you just over. In that case you still ship a locked 30fps, even though you could get 30-35. Make sense?
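A minimal sketch of what that budgeting looks like in practice (Python, with made-up numbers; my own illustration, not any engine's actual loop): the worst-case frame is designed to fit inside the budget, and any leftover time is simply idled away, which is why a game that could hit 30-35fps still presents a flat 30.

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_frame():
    # Simulate + render. The engine is designed so that even the
    # worst-case frame finishes comfortably inside FRAME_BUDGET.
    time.sleep(0.020)  # stand-in for ~20 ms of game work

for _ in range(300):  # ten seconds of play
    start = time.perf_counter()
    run_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # Frame finished early: idle away the leftover budget so output
        # stays locked at 30fps instead of drifting up to 30-35.
        time.sleep(FRAME_BUDGET - elapsed)
```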
 

Artex

Banned
This thread is pretty funny.

I'm waiting to platinum Bloodborne before I even consider picking this up. Methinks we'll see more than one patch in the next 60 days.
 
But no one has asked/answered the most important question yet.
Are the boobie scenes 1080p?

But seriously, this seems like wasted effort on the dev side. Why does the game have this feature if only the title screen "benefits" from it?
 

MMaRsu

Banned
MS's desperation to get that 1080pr by all means is really laughable sometimes

It IS laughable, the fact that they want these games to offer some FORM of 1080P, whether it's a menu or the game itself with worse fps (Diablo 3).

If your machine can't handle it, then maybe you should have made a machine that's capable of displaying the new gen games at 1080p. Otherwise just admit defeat on that front and let developers make it the best looking and performing game they can at 900p.

Too bad for Xbox gamers though, although they made that choice themselves (to buy the weaker next gen console). Some people say they don't care about resolutions, but I think they would see the difference between 900p and 1080p.

Do you have a link to the source where Microsoft demanded this?

Demanded what? 1080PR?
 
But seriously, this seems like wasted effort on the dev side. Why does the game have this feature if only the title screen "benefits" from it?
For marketing and moar $$$

I almost can't believe they did this. It's worse than dealing with used car salesmen. I wonder if they actually advertise 1080p on the back of the box. If so I hope they get sued.

I mean really though what's stopping any game from having any single scene or menu at 1080p with everything else at 900p (or even less) and calling it 1080p? It's very technically true. There is a 1080p part.

Good god this industry is going to kill itself.
 

PhatSaqs

Banned
Dumb shit as always SMH.

MS should just stop trying to fight this war and let their dev partners know not to emphasize it.
 

RedAssedApe

Banned
But no one has asked/answered the most important question yet.
Are the boobie scenes 1080p?

But seriously, this seems like wasted effort on the dev side. Why does the game have this feature if only the title screen "benefits" from it?

No, they are 900p on purpose to make things more blurry for the kids playing.
 

MMaRsu

Banned
For marketing and moar $$$

I almost can't believe they did this. It's worse than dealing with used car salesmen. I wonder if they actually advertise 1080p on the back of the box. If so I hope they get sued.

I mean really though what's stopping any game from having any single scene or menu at 1080p with everything else at 900p (or even less) and calling it 1080p? It's very technically true. There is a 1080p part.

Good god this industry is going to kill itself.

Yup, tomorrow I will check the back of the Xbone box. If it says 1080P then WTF.
 
Ah, see, I was thinking that if it ran at 30 fps 90% of the time, the only way for it to be less than 30 fps on average is if it had semi-major drops. Like, if for 9 seconds it was running at a solid 30 fps, the only way to average 29.55 would be if it dropped to 25.5 for the last second, which would be pretty major.

Since it's averaging slightly less than 30, the only explanation is that there are times when it is running below 30.

?

Bad explanation on my part. I was picturing a list of framerate measurements, each taken over a certain window of time, which you then add up and divide by the number of measurements. Then I realized it was probably done by just taking the total number of frames and dividing by the total number of seconds. Both go to the same answer, but the former tells you how performance really changes over time (framerate is measured in windows, saved, and then the measurement starts completely anew), while the latter just gives overall performance.

...I think I'm still explaining this badly. If I still sound stupid let me know and I'll think of a good way to explain what I'm thinking.
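To put numbers on the two methods described above, here's a minimal sketch in Python with made-up per-second frame counts (not DF's or blim's actual data): with equal one-second windows the two averages come out identical, which is the point about both going to the same answer, but only the windowed series shows where the drop happened.

```python
# Hypothetical per-second frame counts for a 10-second capture (made-up data).
frames_per_second = [30, 30, 30, 30, 30, 30, 30, 30, 30, 25]

# Method 1: average the per-window framerates (each window is 1 second).
windowed_rates = [f / 1.0 for f in frames_per_second]
avg_of_averages = sum(windowed_rates) / len(windowed_rates)

# Method 2: total frames divided by total capture time.
overall_avg = sum(frames_per_second) / len(frames_per_second)

print(avg_of_averages, overall_avg)  # 29.5 29.5 -- identical for equal windows
print(min(windowed_rates))           # 25.0 -- only method 1 keeps the dip visible
```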

Funny.

ughhh...i'm not even in the mood to explain.


1) capped at 30 (or rather, 29.97)

2) 29.55 is 98.5% of that target

3) if it drops to 15 fps, it needs a 45 fps render at some point to even it out. but, refer to point 1: capped at 30 fps.

4) which means, to get to 29.55, you would have to have very infrequent drops, or drops that aren't deep, or both.

5) avg fps should be measured as fps over time, rather than taking total frames and dividing by time (which wouldn't be preferable). it is an avg of an avg, you see, and each window carries mathematical weight. though i don't know what algorithm blim used. and it isn't about minimum/maximum: even if it drops to 15 fps, you don't need to render 45 fps to get to 29.55. you need it to stay at 29.97 for a very long time, consistently, to cancel that one instance out of the overall avg until it becomes insignificant enough to be an outlier.

6) your example, again, is ughhh-inducing. "pretty major"? no, it wouldn't be. as your example suggests, for every drop like that it needs to stay the other 9 seconds at the highest framerate it is programmed to render, which leads back to your earlier "half the time" statement. see now where your "do you know how averages work" logic lies in all this? not to mention, blim's 29.55 fps was definitely recorded over longer than 10 seconds. the probability being that it isn't at 29.97 90% of the time, that could well mean the drops aren't severe or they aren't frequent. i could say the same thing: "what if fps is 29.6 for 9 seconds out of 10 and that last second it is 29.1? that would average out to 29.55 and that's not major." see what i did there?
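For what it's worth, the arithmetic in both of those scenarios can be checked in a couple of lines (Python, using the numbers quoted above): one deep dip and ten shallow ones produce exactly the same average.

```python
# Average fps over ten 1-second windows, using the numbers from the posts above.
one_big_drop   = [30.0] * 9 + [25.5]   # nine seconds at cap, one "semi-major" dip
ten_small_dips = [29.6] * 9 + [29.1]   # mildly under cap the whole time

print(round(sum(one_big_drop) / 10, 2))    # 29.55
print(round(sum(ten_small_dips) / 10, 2))  # 29.55 -- same average, very different behaviour
```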
 

Three

Member
It can, but by their report, in the segment they played, it didn't.

Yea, I'm well aware of that; we'll just have to wait and see how the XB1 fares when "the going gets tough".

Let me try and explain blimblim's test to you and neogaffer1. On the XB1 version, 67 frames were repeated 3 times or more (i.e. the frame persisted longer than the 33ms required for 30fps); on the PS4, 80 frames were repeated 3 times or more. That suggests both go below 30 at some point, but that the PS4 does it 13 more times than the other. In the grand scheme of things this is nothing for either, since we are talking about thousands and thousands of frames.
The other thing in blimblim's test was max dupes.
The worst max dupes (an indicator of the lowest framerate hit, or stutter) was on the XB1: its max duplicate frames was 4, meaning it showed the same image for 4 consecutive frames. The PS4's max duplicate frames was 3. The final thing was the framerate average. The framerate average being higher than 30 does not tell you anything about the lowest framerate or deviation. If one is capped and the other isn't, then you get an average that is higher, but it tells you nothing about consistency or range for either.
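A rough sketch of how a dupe-count analysis like that can work (Python, with made-up frame hashes; this is a guess at the general method, not blimblim's actual tool): each captured frame is hashed, and assuming a 60Hz capture, any run of 3 or more identical frames means a frame overstayed its 33ms budget.

```python
from itertools import groupby

# Hashes of consecutive frames from a 60 Hz capture (made-up data). A 30fps
# game normally shows each image for 2 captured frames; a run of 3 or more
# identical frames means that frame overstayed its 33 ms budget.
frame_hashes = ["a", "a", "b", "b", "c", "c", "c", "d", "d", "e", "e", "e", "e"]

run_lengths = [len(list(run)) for _, run in groupby(frame_hashes)]

stutters  = sum(1 for r in run_lengths if r >= 3)  # frames repeated 3+ times
max_dupes = max(run_lengths)                       # the single worst stutter

print(stutters, max_dupes)  # 2 stutters, worst frame shown 4 consecutive times
```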
 
The PS4 at 1080p has some minor drops, so how could the XB1 do 1080p when its GPU compute is roughly 30-40% weaker than the PS4's? I think MS wanted 1080p in some form, since they are marketing this game, rather than having 900p as the answer.
 

FranXico

Member
This is just as bad as the KZ:SF multiplayer mode interlacing 1080pr antics, if not worse.

What the hell were you thinking, CDPR?
 
This is just as bad as the KZ:SF multiplayer mode interlacing 1080pr antics, if not worse.

What the hell were you thinking, CDPR?

i think the 960x1080 resolution plus doubling of frames in killzone is totally morpheus. i mean, it's so obvious. that's the resolution-per-eye on morpheus, and guerrilla tried to replicate frames. and the fps is like 40-50. ahaha.
 

Vlodril

Member
Spend years trying to build a reputation for fairness and being pro consumer and then destroy it over a marketing deal about a game that will sell anyway.

no idea what the hell cd project red has been thinking the past few months.
 
Who cares what resolution it is? Just play the game and have fun with it. This game is going to be a huge success regardless of what resolution the PR says it runs at.

Since these types of threads seem to bother you so much, perhaps it would be a happier and more productive use of your time if you would just...not post in them, and hit your Back button?

Some people here do care about this stuff. Get used to it, and get over it.
 
Spend years trying to build a reputation for fairness and being pro consumer and then destroy it over a marketing deal about a game that will sell anyway.

no idea what the hell cd project red has been thinking the past few months.

well, they signed that deal in ink back when xbone was parading the bullcrap in 2013, so it goes way back. probably even way before cdpr knew poland was tier 2.

i look at it more as contract obligation, to be honest. they probably received a "request" to improve the resolution much like how diablo 3 was at first.
 

RedAssedApe

Banned
Spend years trying to build a reputation for fairness and being pro consumer and then destroy it over a marketing deal about a game that will sell anyway.

no idea what the hell cd project red has been thinking the past few months.

pro-consumer is a myth. businesses are out to make money
 

Chobel

Member
Either MS pushed 1080p on CDPR, or the word "1080p" actually increases sales (significantly) for Xbox games. I'm leaning toward the latter.
 

kingkaiser

Member
No, it doesn't. Some of you guys have no idea what you're talking about. XD

No need to be arrogant; I just mixed up a capped framerate with a locked one.

In my opinion a capped framerate is not much better than an unlocked one; you still get drops while capped.

It's just really sad the Xbox One isn't even capable of locking 30fps, despite rendering at 900p 99% of the time.
 

SRTtoZ

Member
So it's not really dynamic then? I remember in Wipeout HD it would be 1080p normally, and if it ran into heavy traffic, lots of shit going on, then it would automatically lower itself on a frame-by-frame basis. TW3 on XB1 sounds like they already determined the resolution based on where you are: if you're outside it's 900p, if you are indoors it's somewhere in between, and in cutscenes and the start menu it's 1080p. Not really the dynamic resolution we all thought it would be.
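For contrast, a genuinely dynamic setup like the one described for Wipeout HD might look something like this (a speculative Python sketch of the general technique, not either game's actual code): the render height reacts to the measured GPU frame time every frame, rather than being picked from a fixed per-area table.

```python
TARGET_FRAME_MS = 33.3       # the 30fps budget
MIN_H, MAX_H = 900, 1080     # allowed vertical resolutions
STEP = 36                    # how far to move per adjustment

render_height = MAX_H

def adjust_resolution(last_gpu_frame_ms: float) -> int:
    """Scale the render height frame by frame based on measured GPU load."""
    global render_height
    if last_gpu_frame_ms > TARGET_FRAME_MS:
        # Heavy scene: step down toward 900p before the framerate drops.
        render_height = max(MIN_H, render_height - STEP)
    elif last_gpu_frame_ms < TARGET_FRAME_MS * 0.85:
        # Plenty of headroom: step back up toward full 1080p.
        render_height = min(MAX_H, render_height + STEP)
    return render_height

# e.g. a spike to 36 ms pulls the height down one step: 1080 -> 1044
print(adjust_resolution(36.0))
```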
 

c0de

Member
i meant there are games where xbox isn't okay with being labelled 900p on xbone. destiny, diablo 3, etc. witcher 3 seems to be one of those titles.

Well, if the console is still capable of doing so, why not? Especially Destiny - there is no point in it being only 900p. Diablo 3 took a hit, but nothing major.
It would be totally different if the resolution bump cut the framerate in half.
 
DF does say there's evidence the resolution goes up indoors - not to 1920x1080, but it does increase, so there's your dynamic resolution. And from what I've seen, an indoor area is not a separate scene (a hut, for example), unless you guys are counting that as separate?

For the frame rate tests I'd like to see later battles. Going off W1/W2, you won't just be fighting a couple of drowners or a griffin, and in the video it's often at 31-32 fps with that alone.
 

Wereroku

Member
DF does say there's evidence the resolution goes up indoors - not to 1920x1080, but it does increase, so there's your dynamic resolution. And from what I've seen, an indoor area is not a separate scene (a hut, for example), unless you guys are counting that as separate?

For the frame rate tests I'd like to see later battles. Going off W1/W2, you won't just be fighting a couple of drowners or a griffin, and in the video it's often at 31-32 fps with that alone.

In some indoor areas they believe the resolution increases. This is not true for all indoor locations, just some.
 

KaYotiX

Banned
All the dumb marketing speak just sucks. Just make it 900p, lock it at 30, and be done with it.

I still love my Xbone and I know it's less powerful than the ps4 but it looks even worse when they try all this PR spin.
 
All the dumb marketing speak just sucks. Just make it 900p, lock it at 30, and be done with it.

I still love my Xbone and I know it's less powerful than the ps4 but it looks even worse when they try all this PR spin.

As this news begins to percolate, I must admit this approach would have served them better. But it does give you an insight into how much importance MS attaches to resolution.

Someone needs to do a chalkboard of MS' res PR.
 

Snorlocs

Member
All the dumb marketing speak just sucks. Just make it 900p, lock it at 30, and be done with it.

I still love my Xbone and I know it's less powerful than the ps4 but it looks even worse when they try all this PR spin.
I agree wholeheartedly. 900p won't kill anyone.
 
I don't know why people keep saying Microsoft is doing this "1080p PR spin" with the dynamic resolution for The Witcher 3. Microsoft hasn't said a word about it, only CDPR. In fact, Xbox's YouTube channel has the same video (https://www.youtube.com/watch?v=eHRiKyzIZt8), only they don't mention dynamic resolution or 1080p.

Now, on topic, I hope CDPR releases a patch to address the unlocked frame rate. I have the game preloaded on Xbox One, just don't know if I should play it or not because I really don't care for constant frame rate fluctuations.
 

Kinyou

Member
Oh man, this blows. I guess I'll skip the patch for now. I can live with a little graphics downgrade, but stuttering pre-rendered cutscenes would really affect my enjoyment.
 
Great. If there's one thing I wasn't looking forward to, it's deciding whether or not to download this patch. To me at least, it's not really clear if one should.
 

Noobcraft

Member
I don't know why people keep saying Microsoft is doing this "1080p PR spin" with the dynamic resolution for The Witcher 3. Microsoft hasn't said a word about it, only CDPR. In fact, Xbox's YouTube channel has the same video, only they don't mention dynamic resolution or 1080p.
The narrative that big bad Micro$oft is trying to deceive consumers is so much more fun than entertaining the idea that CDPR added the feature because they wanted to push the hardware more when available.
 