
PS4's AF issue: we need answers!

Javin98

Banned
Lol this is past being a joke...

[screenshot]


Dat PS4 tri-linear filtering tho
To be fair, this game was probably so rushed that they didn't even put in any effort to enable AF on PS4. Like others said, it's probably on by default in the XB1 SDK, but some additional work has to be done to turn it on on PS4.

Huh, this issue still exists? Very strange.
At least the exclusives all seem to have AF enabled.



Man, I would love to know the reason why so many PS4 titles ship without AF despite it not impacting performance when devs do enable it.
My theory is that AF is actually implemented in those games from day one, but the devs forgot to turn it on in the PS4 SDK, where trilinear filtering is the default. So a patch simply turns it on, and as a result there is no difference in performance. Just my two cents anyway.
 

leeh

Member
If one platform has it, then they don't just "forget to turn it on".

I'm going to throw it out there: it's probably the single pool of RAM and the CPU access, which really cuts into what it can give to the GPU. It would cripple the render time due to the lack of throughput to the RAM. AF just needs memory bandwidth.

The Xbox has its small scratchpad, which is ideal for the job.
 

Javin98

Banned
If one platform has it, then they don't just "forget to turn it on".

I'm going to throw it out there: it's probably the single pool of RAM and the CPU access, which really cuts into what it can give to the GPU. It would cripple the render time due to the lack of throughput to the RAM. AF just needs memory bandwidth.

The Xbox has its small scratchpad, which is ideal for the job.
If memory bandwidth were the issue on the PS4, then how do some games have even better AF on PS4 than on XB1? Your argument doesn't make any sense. Furthermore, some games have 8x-16x AF on both consoles. Clearly memory bandwidth isn't the issue here.
 
If one platform has it, then they don't just "forget to turn it on".

I'm going to throw it out there: it's probably the single pool of RAM and the CPU access, which really cuts into what it can give to the GPU. It would cripple the render time due to the lack of throughput to the RAM. AF just needs memory bandwidth.

The Xbox has its small scratchpad, which is ideal for the job.

How do you explain the games that add it after a patch without any effect on framerate then?
 

hawk2025

Member
If one platform has it, then they don't just "forget to turn it on".

I'm going to throw it out there: it's probably the single pool of RAM and the CPU access, which really cuts into what it can give to the GPU. It would cripple the render time due to the lack of throughput to the RAM. AF just needs memory bandwidth.

The Xbox has its small scratchpad, which is ideal for the job.


We've heard no evidence of this being the case from anyone, and there's plenty of evidence against it since it's not an occurrence across the board.

Coupled with several games receiving patches with costless (if limited) implementations post-release, I'd guess you are very likely wrong.
 

Javin98

Banned
Honestly, I don't mean to be harsh, but sometimes I think leeh doesn't know what he's talking about. Just like in that racing face-off thread...
 

Metfanant

Member
Yeah, but the Xbox One version has better AF, no problem. Again, it's to do with the fact that it's not being made clear enough for the developers, no?

No, no... AFAIK, there are no games that have "better AF" on the Xbone vs PS4... it's always just nonexistent on the PS4 versions that have the "problem".

Well not really...it IS a dev thing. The most recent example is this game, and Rainbow Six Siege. It's puzzling to say the least, especially considering Xbox One has higher texture filtering. Clearly the issue hasn't been solved yet...



Because games are still bizarrely running at lower AF on PS4 than Xbox One? Not hard to understand, and we're making a list of games that can hopefully get the attention to be patched. It worked with DmC and Dying Light. And with games still suffering from this, we're back to square one, so, via DF analysis, I'll keep posting in this thread, as that's what it's for: screenshots to highlight low-quality AF, or the lack thereof, on PS4 for no apparent reason.

Again, not lower AF...you just get NO AF on the PS4...which clearly shows the devs not implementing it properly based on the fact that plenty of games have had it patched in with ZERO performance impact...

Just curious, why doesn't the issue exist on XB1 then?
As has been explained a thousand times... engine/API issues.

If one platform has it, then they don't just "forget to turn it on".

I'm going to throw it out there: it's probably the single pool of RAM and the CPU access, which really cuts into what it can give to the GPU. It would cripple the render time due to the lack of throughput to the RAM. AF just needs memory bandwidth.

The Xbox has its small scratchpad, which is ideal for the job.

Any evidence to support this idea is contradicted by the list of games that have had AF patched in without any performance hit on PS4.
 

Kayant

Member
If one platform has it, then they don't just "forget to turn it on".

I'm going to throw it out there: it's probably the single pool of RAM and the CPU access, which really cuts into what it can give to the GPU. It would cripple the render time due to the lack of throughput to the RAM. AF just needs memory bandwidth.

The Xbox has its small scratchpad, which is ideal for the job.

This doesn't make sense, because XB1 also has AF "issues". Plus, there is nothing pointing towards XB1 being better suited; the opposite has been true in cases where PS4 has better texture filtering.
 

Boglin

Member
Then PM him instead of making a post in the thread. What is this good for?

A spotlight on conjecture can help other readers, especially the ones who aren't hardware enthusiasts, realize other explanations may be more accurate and hopefully prevent them from parroting said conjecture as if it came from a place of authority.
 

leeh

Member
Honestly, I don't mean to be harsh, but sometimes I think leeh doesn't know what he's talking about. Just like in that racing face-off thread...
Sorry I got a technicality from the '90s wrong, when I was a kid. It was unrelated to anything gaming related; it was about the broadcast standards of the '90s.

Patched in without frame rate issues = work and optimisation, which is great to see.
XB1 AF issues = Well, it's only 32MB which has to handle the frame buffer; developers may not use the eSRAM like that and may attempt to stream between the GPU and DDR, which wouldn't be great at all. Depends how their engine was designed.
Evidence of my quote = Well, it's still happening in games. People don't just 'forget'.

Why do people take things so defensively? I was just posting an idea as to what I think it could be. The bandwidth will take a large percentage hit when the CPU requires access. AF needs a lot of bandwidth.

It doesn't affect PC because modern GPUs have a full pool similar to the PS4's, which doesn't need to share a bus with the CPU.

It just makes sense to me. I welcome being disproven, since this is a discussion and I like learning.
 

c0de

Member
A spotlight on conjecture can help other readers, especially the ones who aren't hardware enthusiasts, realize other explanations may be more accurate and hopefully prevent them from parroting said conjecture as if it came from a place of authority.

Then the post should've come from a user who is an authority himself, which didn't happen.
 

televator

Member
Are people really spinning this into a PS4 lack of technical capacity? I mean, the examples of patched games with zero negative impact on performance are right in this thread... AF is fucking peanuts on PC and has been for generations, even on weaker hardware than current console tech with a far smaller memory pool. Crank that bitch all the way up... I dare you to tell me that you could count any loss in framerate.
 

Thrakier

Member
Are people really spinning this into a PS4 lack of technical capacity? I mean, the examples of patched games with zero negative impact on performance are right in this thread... AF is fucking peanuts on PC and has been for generations. Crank that bitch all the way up... I dare you to tell me that you could count any loss in framerate.

Is there proof that it did not affect framerate? I'd never trust people's eyesight when it comes to framerate. 90% of people out there have 24fps as their standard.
 

televator

Member
Is there proof that it did not affect framerate? I'd never trust people's eyesight when it comes to framerate. 90% of people out there have 24fps as their standard.

On DMC and Street Fighter? If one frame was lost... Believe that frame counters would go fucking ape shit.
 

Durante

Member
Are people really spinning this into a PS4 lack of technical capacity? I mean, the examples of patched games with zero negative impact on performance are right in this thread... AF is fucking peanuts on PC and has been for generations, even on weaker hardware than current console tech with a far smaller memory pool. Crank that bitch all the way up... I dare you to tell me that you could count any loss in framerate.
Conversely, if AF is categorically "free" on PS4, then why do even first party titles by GAF's favourite developers use it only selectively?

More accurately, AF costs -- on any hardware platform -- depend on the exact scenario. It's not always free on PC either:

[benchmark chart snipped]

Now, the question of AF performance cost, if any, is separate from whether that cost is worth it (I believe it always is), and again separate from why such a relatively large number of PS4 titles still don't feature any AF at all.
 

Boglin

Member
Then the post should've come from a user who is an authority himself, which didn't happen.

That would be ideal, but I still think it's good to have multiple perspectives on the source of an opinion when it's meant to be informative. Although I guess, since Javin98 doesn't have the authority to dissent, I should take it as truth that leeh is "probably" correct until somebody more qualified comes along to say otherwise.

It's probably the single pool of RAM and the CPU access, which really cuts into what it can give to the GPU.
 

c0de

Member
That would be ideal, but I still think it's good to have multiple perspectives on the source of an opinion when it's meant to be informative. Although I guess, since Javin98 doesn't have the authority to dissent, I should take it as truth that leeh is "probably" correct until somebody more qualified comes along to say otherwise.

While concurrent access from the CPU and GPU does indeed decrease the bandwidth on PS4, I guess it's not by so much that it stops AF from being implemented.
 
Didn't Halo 5 make some concessions and drop AF, or at least have very low AF? I don't think Dying Light on PS4 means every PS4 game should have it; if anything, it's an outlier.

Also, 2xMSAA is not very good on some renderers, and it's costly. I'd rather devs look into new ways than be forced to use some crappy 2xMSAA that hardly covers anything.
 

Guymelef

Member
Didn't Halo 5 make some concessions and drop AF, or at least have very low AF? I don't think Dying Light on PS4 means every PS4 game should have it; if anything, it's an outlier.

Also, 2xMSAA is not very good on some renderers, and it's costly. I'd rather devs look into new ways than be forced to use some crappy 2xMSAA that hardly covers anything.

Also Rise of the Tomb Raider, but nobody screams when this happens...
 
This is just theoretical, but I wonder what would happen if Sony forced 12x AF on all games in a firmware update, like forcing it through a GPU control panel.

It would be interesting to see which games were negatively affected by it.
 

c0de

Member
This is just theoretical, but I wonder what would happen if Sony forced 12x AF on all games in a firmware update, like forcing it through a GPU control panel.

It would be interesting to see which games were negatively affected by it.

I don't think it would work "this way".
 

Slaythe

Member
We already have hindsight on this through Ninja Theory.

The lack of PS4 AF did not get caught in QA; they handled the gameplay revisions, while another developer handled porting the game to PS4 and Xbox One.

They realized the problem after shipping the game and went back to fix it. It took about two weeks and did require testing. Then another week for Sony approval.

Then we got a great 8x AF patch with no performance impact.

But it's not as easy as turning it on or off; it still requires testing.

So rather than an on/off switch, it's a matter of getting AF to work with decent results without draining performance, which isn't the hardest thing to do on a finished game with a working version on weaker hardware already made.
 

Marlenus

Member
Conversely, if AF is categorically "free" on PS4, then why do even first party titles by GAF's favourite developers use it only selectively?

More accurately, AF costs -- on any hardware platform -- depend on the exact scenario. It's not always free on PC either:

[benchmark chart snipped]

Now, the question of AF performance cost, if any, is separate from whether that cost is worth it (I believe it always is), and again separate from why such a relatively large number of PS4 titles still don't feature any AF at all.

I was going to comment on it being odd that the newer cards show a greater performance hit than the older cards, but if I remember correctly, didn't the Radeon 5xxx series introduce fully angle-independent AF? That likely has a higher cost than the previous versions.

Still, you would expect performance to improve from the first implementation to the ones used in the consoles three generations later.

As others have said, it is likely just an API difference regarding default settings on the platforms. It should really be caught in QA, though, as the IQ improvement is huge for the performance cost.
 
I don't understand what the big deal here is. PS4 can run circles around AF.

It's probably some stupid default property in the IDE/executable compiler that no one has bothered to fix yet.

You think Sony will listen to Robomodo (THPS5) when they open an internal ticket about a minor wrong compiler setting?!
 

c0de

Member
I don't understand what the big deal here is. PS4 can run circles around AF.

It apparently can't, because otherwise we would see a higher level of AF when AF is applied (the level seems to be chosen by intent, so devs do spend time on AF; it's not just a too-low default setting).

It's probably some stupid default property in the IDE/executable compiler that no one has bothered to fix yet.

This covers the cases where no AF is at work, but that is not what Durante said as a whole. Why do even first-party studios apply it selectively, and why isn't it 16x by default when there is no performance impact?
 

Chobel

Member
Sorry I got a technicality from the '90s wrong, when I was a kid. It was unrelated to anything gaming related; it was about the broadcast standards of the '90s.

Patched in without frame rate issues = work and optimisation, which is great to see.
XB1 AF issues = Well, it's only 32MB which has to handle the frame buffer; developers may not use the eSRAM like that and may attempt to stream between the GPU and DDR, which wouldn't be great at all. Depends how their engine was designed.
Evidence of my quote = Well, it's still happening in games. People don't just 'forget'.

Why do people take things so defensively? I was just posting an idea as to what I think it could be. The bandwidth will take a large percentage hit when the CPU requires access. AF needs a lot of bandwidth.

It doesn't affect PC because modern GPUs have a full pool similar to the PS4's, which doesn't need to share a bus with the CPU.

It just makes sense to me. I welcome being disproven, since this is a discussion and I like learning.

Here's a counter-argument:
a. CPU bandwidth is a very low number in general. For most CPU tasks, latency is more important than bandwidth. That's why you don't generally see much difference in performance on PC between two- and four-channel memory platforms.
b. The maximum CPU bandwidth possible is a known number on PS4.
c. Even if we subtract that number from the whole PS4 GDDR5 bandwidth, we're still left with a figure which is several times higher than that of slower PC GCN cards _and_ of the XBO (see the rough numbers below).
d. AF is cached on modern GPUs, and its external bandwidth requirement is actually very low. That's why it's nearly "free" even on low-end GPUs which don't have a tenth of the PS4's memory bandwidth.

This argument is invalid and can't be the reason why PS4 has no AF in some titles when compared to XBO and PS3.
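To put rough numbers on point (c), here is a minimal back-of-the-envelope sketch in C++. The GB/s figures are the commonly cited launch specs, assumed here rather than measured; the ~20 GB/s CPU ceiling matches the Onion-bus figure mentioned later in this thread.

// Back-of-the-envelope bandwidth check. All GB/s figures are assumed,
// commonly cited launch specs, not measurements from this thread.
#include <cstdio>

int main() {
    const double ps4_gddr5_bw   = 176.0; // PS4 unified GDDR5 pool
    const double ps4_cpu_bw_max = 20.0;  // ceiling of CPU traffic (the "Onion" bus)
    const double xbo_ddr3_bw    = 68.0;  // XBO DDR3 main pool

    // Even with CPU traffic fully saturated, the PS4 GPU keeps ~156 GB/s,
    // still more than double the XBO's DDR3 pool.
    const double ps4_gpu_left = ps4_gddr5_bw - ps4_cpu_bw_max;
    std::printf("PS4 GPU leftover: %.0f GB/s vs XBO DDR3: %.0f GB/s\n",
                ps4_gpu_left, xbo_ddr3_bw);
    return 0;
}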
 

leeh

Member
Here's a counter-argument.
Does memory access allow concurrent read/write between CPU and GPU? I'm wondering whether, if not, you're still taking time away from the GPU to access the memory, for example if the CPU is gearing up for the next frame before the GPU has finished. Do modern engines tend to do this?

Would you be able to explain the AF caching, or link me to something? I don't understand how you can cache a series of textures which need to keep being passed to the filter. Surely you'd need a small subset of scratchpad RAM next to the GPU to achieve that?

It's definitely a weird situation though. The Journey one specifically made me raise an eyebrow.
 
It would be nice to get a real, 100% answer to the AF questions and why AF seems to be lacking in some PS4 games until it's patched in afterwards. It's been said it's likely a dev kit issue, but I really can't believe people are bringing Tony Hawk 5 into the debate. I mean, seriously, AF is the least of that game's problems, and it makes little difference when the game looks that bad anyway.
 

dr_rus

Member
Does memory access allow concurrent read/write between CPU and GPU? I'm wondering whether, if not, you're still taking time away from the GPU to access the memory, for example if the CPU is gearing up for the next frame before the GPU has finished. Do modern engines tend to do this?
There are several channels, so there are several links which can be used simultaneously. A modern engine should be aware of the caching issues which may arise with simultaneous memory access, though - especially an engine which was built for the previous generation of h/w, like UE3. So it should avoid it as much as possible.

Would you be able to explain the AF caching or link me? I don't understand how you can cache a series of textures which need to keep being passed to filter. Surely you'd need a small subset of scratchpad ram next to the GPU to achieve that?
An AF texture fetch is a fetch of several texels around the one you need to show. For 16x AF you're fetching not one but 16 texels. The thing which makes "free" AF possible is that you generally fetch all of those texels anyway, because you need to show the whole texture; and if you have enough texture cache to hold the fetches of at least 32 neighboring texels, then you don't need to fetch anything from RAM to perform AF around such a texel. 32 texels of a DXT-compressed texture is something like 16 bytes. Texture caches in modern GPUs are several megabytes. Thus it's entirely possible to hold all the texture data needed for AF fetches in the texture caches. The trick is to reuse the texels which you're fetching for texturing anyway.
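A quick arithmetic check of the "32 texels is something like 16 bytes" claim above, assuming DXT1/BC1 compression (a 4x4-texel block packed into 8 bytes):

// Footprint check for 32 neighboring texels of a DXT1/BC1 texture.
#include <cstdio>

int main() {
    const int texels_per_block = 4 * 4;  // a DXT1 block covers 16 texels
    const int bytes_per_block  = 8;      // DXT1 block size in bytes
    const int blocks_needed    = 32 / texels_per_block;           // 2 blocks
    const int bytes_for_32     = blocks_needed * bytes_per_block; // 16 bytes
    std::printf("32 DXT1 texels occupy %d bytes\n", bytes_for_32);
    return 0;
}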

Guys, let's not go back to the h/w discussion. There is nothing wrong with the PS4 h/w, and there is nothing which makes AF any harder to perform on PS4 than on Xbox (or PC, for that matter). The only reason why some games are missing AF on PS4 while their XBO versions have it is the lack of proper QA on the developer's part.
 

thelastword

Banned
Sorry I got a technicality from the '90s wrong, when I was a kid. It was unrelated to anything gaming related; it was about the broadcast standards of the '90s.

Patched in without frame rate issues = work and optimisation, which is great to see.
XB1 AF issues = Well, it's only 32MB which has to handle the frame buffer; developers may not use the eSRAM like that and may attempt to stream between the GPU and DDR, which wouldn't be great at all. Depends how their engine was designed.
Evidence of my quote = Well, it's still happening in games. People don't just 'forget'.

Why do people take things so defensively? I was just posting an idea as to what I think it could be. The bandwidth will take a large percentage hit when the CPU requires access. AF needs a lot of bandwidth.

It doesn't affect PC because modern GPUs have a full pool similar to the PS4's, which doesn't need to share a bus with the CPU.

It just makes sense to me. I welcome being disproven, since this is a discussion and I like learning.
This is a tech thread; your views are welcome here, like in any other thread. People who just go out on a limb and say others don't know what they're talking about (because they disagree) are just displaying bad forum etiquette.

I will say this: I don't see it your way. Certain enhancements were made to the PS4's pipeline to improve communication between the GPU, CPU and memory. One of these enhancements is an extra bus on the PS4 GPU that allows reading and writing directly to system memory; as much as 20 GB/s can flow through that bus, which is superior to the PCIe bandwidth on many PCs, maybe apart from PCIe 3.0.

There are further enhancements, but those are more or less related to asynchronous compute. The point is, AF is not as resource-intensive as you're implying; besides, GDDR5 is much faster than DDR3 and better suited for games in the first place. The fact that the PS4 has a better GPU with better memory is enough to give it the logical advantage in all things GPU-related, which includes AF.

The evidence of games not having AF on PS4 highlights a problem with devs not enabling it, whether from a lack of awareness (a polite way of putting it) or from the lack of an automatic implementation or default preset in the SDK. The former, the latter, or a combination of the two may apply, and it has nothing to do with the hardware.

In any case, thanks for your contribution... it's welcome here.

We already have hindsight on this through Ninja Theory.

The lack of PS4 AF did not get caught in QA; they handled the gameplay revisions, while another developer handled porting the game to PS4 and Xbox One.

They realized the problem after shipping the game and went back to fix it. It took about two weeks and did require testing. Then another week for Sony approval.

Then we got a great 8x AF patch with no performance impact.

But it's not as easy as turning it on or off; it still requires testing.

So rather than an on/off switch, it's a matter of getting AF to work with decent results without draining performance, which isn't the hardest thing to do on a finished game with a working version on weaker hardware already made.
I don't think that this proves your point (that it's more complicated to implement). Every patch requires testing by Sony, so the time it takes to get a patch approved does not indicate how long it took the dev team to make the fix. Also, the quick turnaround of AF patches on many PS4 games suggests that it's a very trivial implementation if you care to do it....;)
 

Kezen

Banned
A note on AF, courtesy of Beyond3d :
In D3D11 and corresponding hardware, AF is a per-sampler setting. It does have a significant cost that increases as you go up if you just do it naively, but since it's per-sampler, it can be targeted to where it's needed most. Most textures do not need AF at all. For the ones that do, maybe only the normal map or diffuse map needs it. For most textures it's really hard to notice values above 4x, even on long glancing views.

If you just naively set all your textures to 16x AF, you're going to have terrible performance. If you set a large amount of the scene to 4x or even 2x, you're still often bumping into unacceptable performance deltas. When we talk about games that have "patched in AF", chances are you're talking about a handful of materials changed to have a slightly higher setting. And when you're talking about "no performance penalty", you're talking about something you don't have the tools to measure appropriately in the consumer world.

This is really simple. Why don't games have AF? Because it's expensive to do it naively and only a handful of textures benefit from it, so the smartest thing to do is default it to off and raise it by hand as needed. See a blurry texture? An artist hasn't checked the "use AF" box on it. The end.

https://forum.beyond3d.com/threads/digital-foundry-article-technical-discussion.47227/page-512#post-1874346
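For reference, the "per-sampler setting" described above looks like this in plain D3D11. This is a minimal sketch using the standard API; the function name and the choice of wrap addressing are just illustrative.

// Creates a sampler state with anisotropic filtering (D3D11).
// Because the setting is per-sampler, an engine can hand this sampler
// only to the materials that benefit (e.g. diffuse/normal maps) and
// leave everything else on cheaper trilinear filtering.
#include <d3d11.h>

ID3D11SamplerState* CreateAnisoSampler(ID3D11Device* device, UINT maxAniso)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;   // enables AF for this sampler only
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = maxAniso;                   // 1..16, e.g. 8
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    device->CreateSamplerState(&desc, &sampler);      // check the HRESULT in real code
    return sampler;
}

On this reading, "patching AF in" can be as small a change as swapping which sampler state a handful of materials use, which fits the quick turnarounds described earlier in the thread.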
 

Javin98

Banned
Then PM him instead of making a post in the thread. What is this good for?
I don't like calling others out, but leeh has made some questionable posts in the past that just aren't the truth. It was harsh and perhaps a tad unnecessary, I admit, but I don't think this is worth debating over.

Sorry I got a technicality from the '90s wrong, when I was a kid. It was unrelated to anything gaming related; it was about the broadcast standards of the '90s.

Patched in without frame rate issues = work and optimisation, which is great to see.
XB1 AF issues = Well, it's only 32MB which has to handle the frame buffer; developers may not use the eSRAM like that and may attempt to stream between the GPU and DDR, which wouldn't be great at all. Depends how their engine was designed.
Evidence of my quote = Well, it's still happening in games. People don't just 'forget'.

Why do people take things so defensively? I was just posting an idea as to what I think it could be. The bandwidth will take a large percentage hit when the CPU requires access. AF needs a lot of bandwidth.

It doesn't affect PC because modern GPUs have a full pool similar to the PS4's, which doesn't need to share a bus with the CPU.

It just makes sense to me. I welcome being disproven, since this is a discussion and I like learning.
The thing is, your theory has already been disproven. dr_rus has explained it really well, so I will just give you a good example. Some multiplatform games may be missing AF entirely on PS4, but that's not evidence that the PS4 is lacking in bandwidth compared to XB1. You can't just use a few games to prove your claims when a few other games have better AF on PS4. Also, a few games that lacked AF at launch had it patched in a few weeks later. You may say it's optimization, but to me, the fact that it can be fixed in just a week or so (another week or more to get the patch certified) means that it was meant to be on by default, but something caused it to be turned off and defaulted to trilinear filtering instead.
 

If you just naively set all your textures to 16x AF, you're going to have terrible performance.

If your hardware is not the greatest, that is.
The thing is, your theory has already been disproven. dr_rus has explained it really well, so I will just give you a good example. Some multiplatform games may be missing AF entirely on PS4, but that's not evidence that the PS4 is lacking in bandwidth compared to XB1. You can't just use a few games to prove your claims when a few other games have better AF on PS4. Also, a few games that lacked AF at launch had it patched in a few weeks later. You may say it's optimization, but to me, the fact that it can be fixed in just a week or so (another week or more to get the patch certified) means that it was meant to be on by default, but something caused it to be turned off and defaulted to trilinear filtering instead.

"scratch pad" etc. ESRAM has an advantage stuff does not seem like the reason for it IMO, we have no evidence pointing in the direction that ESRAM is anything but a stop gap for them having to throw DDR3 in the system.
 

Javin98

Banned
If your hardware is not the greatest, that is.


"scratch pad" etc. ESRAM has an advantage stuff does not seem like the reason for it IMO, we have no evidence pointing in the direction that ESRAM is anything but a stop gap for them having to throw DDR3 in the system.
Yeah, pretty much. Also, I don't think they can possibly fit even 4× AF in the 32MB of ESRAM, which is used for the frame buffer anyway.
 

Durante

Member
A note on AF, courtesy of Beyond3d :
Pretty much. Well, I wouldn't quite co-sign the "terrible performance if you force 16xAF on everything" part, since that's pretty much what everyone does in the driver on PC and it seems to go well enough.

I really think at least diffuse maps should default to some decent level of AF in a toolchain. Also, this doesn't really explain platform differences, since you'd hope that if an artist "checks the AF box" for a material that would be a platform-independent setting!

Also, I don't think they can possibly fit even 4× AF in the 32MB of ESRAM, which is used for the frame buffer anyway.
This sentence doesn't really make sense, at least the first part. The fundamental confusion seems to be the idea that distinct levels of AF would require different amounts of memory. This is not the case.

As it relates to ESRAM (which is likely "not at all", since, as you note, it's generally used for render targets, not textures): either you have a texture in ESRAM, at which point you'll sample any level of AF from it, or you haven't got it in ESRAM and you won't.
 

dr_rus

Member
ESRAM bandwidth on XBO is still lower than GDDR5 bandwidth on PS4, even with the PS4 CPU bandwidth limit fully saturated. It may well be that with heavy CPU memory access there is some cache thrashing happening which results in a lower measured b/w for the GPU, but even if that's the case - are we seriously assuming that XBO versions are using the ESRAM pool for AF texture fetches? It's only 32 MB, after all, and you still need to fit all render targets and off-screen buffers in it. This limits your ~3-5 GB working set to ~0.01 GB of "fast data", which would mean that only a couple of textures out of the whole set would be able to get AF.
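As a rough illustration of where a figure like ~0.01 GB can come from, here is some hypothetical 1080p render-target math; the target count and formats are assumptions for a typical setup, not numbers from the post:

// Hypothetical 1080p working-set math for a 32 MB ESRAM pool.
#include <cstdio>

int main() {
    const double mb       = 1024.0 * 1024.0;
    const double esram_mb = 32.0;
    const double rt_mb    = 1920.0 * 1080.0 * 4.0 / mb; // ~7.9 MB per RGBA8 target
    const double color_mb = 2.0 * rt_mb;                // e.g. two color targets (assumed)
    const double depth_mb = rt_mb;                      // 32-bit depth/stencil
    const double left_mb  = esram_mb - color_mb - depth_mb; // ~8 MB, i.e. ~0.008 GB
    std::printf("Left for texture data: %.1f MB\n", left_mb);
    return 0;
}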

Seriously, stop with this bandwidth nonsense. Bandwidth is not the issue here, especially since the PS4 has more bandwidth in pretty much all circumstances. Whatever the reason is, it's in the developers' workflow, not in the hardware.
 

Fafalada

Fafracer forever
Durante said:
Also, this doesn't really explain platform differences, since you'd hope that if an artist "checks the AF box" for a material that would be a platform-independent setting!
Well, the source of the issues obviously isn't in developer toolchains; it's in the SDK. I do believe that was detailed some months ago in this thread? (Unless I'm conflating it with something I read on a different website.) The details of course aren't as simple as "the SDK turns the whole thing off by mistake" or "the hardware can't run more than 1x AF", so I don't expect any amount of further explaining will stop people from "demanding an explanation" :p

But to your specific point, AF boxes may be platform-dependent if that's one of the settings that gets tweaked in "final" optimization passes, where anything goes, really.
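Purely to illustrate how a per-material AF setting could end up platform-dependent in a toolchain, here is an entirely hypothetical asset record; this is not any real engine's format.

// Hypothetical material record in an asset pipeline. If a "final"
// optimization pass writes per-platform overrides, the same material
// can ship with different AF levels on each console.
struct MaterialFilteringDesc {
    int defaultMaxAniso = 1;  // authored by the artist (the "AF box")
    int ps4MaxAniso     = -1; // -1 = inherit default; a late pass may change this
    int xb1MaxAniso     = -1;
};

int ResolveMaxAniso(const MaterialFilteringDesc& m, bool isPS4)
{
    // A platform override wins; otherwise fall back to the authored value.
    const int platformValue = isPS4 ? m.ps4MaxAniso : m.xb1MaxAniso;
    return platformValue >= 0 ? platformValue : m.defaultMaxAniso;
}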
 

dr_rus

Member
Well, the source of the issues obviously isn't in developer toolchains; it's in the SDK. I do believe that was detailed some months ago in this thread? (Unless I'm conflating it with something I read on a different website.) The details of course aren't as simple as "the SDK turns the whole thing off by mistake" or "the hardware can't run more than 1x AF", so I don't expect any amount of further explaining will stop people from "demanding an explanation" :p

But to your specific point, AF boxes may be platform-dependent if that's one of the settings that gets tweaked in "final" optimization passes, where anything goes, really.

What was detailed is that there are no issues in the SDK. Toolchains are the only thing left hanging, especially if you consider that most of the games with the issue are running on UE3.
 

leeh

Member
There are several channels, so there are several links which can be used simultaneously. A modern engine should be aware of the caching issues which may arise with simultaneous memory access, though - especially an engine which was built for the previous generation of h/w, like UE3. So it should avoid it as much as possible.


An AF texture fetch is a fetch of several texels around the one you need to show. For 16x AF you're fetching not one but 16 texels. The thing which makes "free" AF possible is that you generally fetch all of those texels anyway, because you need to show the whole texture; and if you have enough texture cache to hold the fetches of at least 32 neighboring texels, then you don't need to fetch anything from RAM to perform AF around such a texel. 32 texels of a DXT-compressed texture is something like 16 bytes. Texture caches in modern GPUs are several megabytes. Thus it's entirely possible to hold all the texture data needed for AF fetches in the texture caches. The trick is to reuse the texels which you're fetching for texturing anyway.

Guys, let's not go back to the h/w discussion. There is nothing wrong with the PS4 h/w, and there is nothing which makes AF any harder to perform on PS4 than on Xbox (or PC, for that matter). The only reason why some games are missing AF on PS4 while their XBO versions have it is the lack of proper QA on the developer's part.
Late reply, but I just caught up with this, as I was busy after I posted. Thanks for the response; that cleared it up for me.

Really wish I'd got into game development, but I can't complain about where I'm at.
 
Sorry to bump, but this is so we can finally clear up the AF issues with Rainbow Six Siege on PS4 that were posted about during the beta on the previous page. As we saw, the PS4 had a trilinear texture filtering implementation.

However, in today's Face-Off on the retail version, John reported that the PS4 has increased to 8x AF, a vast improvement over its previous lack of AF, and the Xbox One went from 4x AF to 8x AF, with no performance penalty on either console (in fact, the opposite is true, as more optimization has gone into the game's performance).
 

DOWN

Banned
Let me just add BioShock: The Collection to the list of games totally missing AF, with no response from the studio.
 

Creaking

He touched the black heart of a mod
I just don't get how they fail to notice this issue when putting together trailers or press release screens.
 