
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

Look, I offered up my FIRST-HAND experience of a game on my PC and PS4, which shows that the textures you can see clearly are of the same quality and construction as PC on Ultra. This is factual and I know it myself.

This was then met with "don't believe you", "can't be, because reasons", etc. It is not an argument or a fan side I am on. Look at your posts here on this thread; you are wound up.

It is relevant that the PC and PS4 are close visually. The 6GB of RAM is a moot point, clearly not needed to equal PS4 textures, and not what I or others have said. But what is clear is that a 2GB card is going to have less VRAM than both the X1 and PS4 for textures; this is a FACT and not a POV. You can argue and use old launch and cross-gen games to defend this, but I am at a loss as to why. 3-4GB is about what the consoles will use for textures; using a PC card with less VRAM than that will result in system-RAM-to-card stutters and performance hits, or the game won't run at all.
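For a rough sense of why 3-4GB of console-quality textures swamps a 2GB card, here is a back-of-the-envelope sketch. The texture count, sizes, and uncompressed RGBA8 format are illustrative assumptions, not measured game data (real engines use block compression, which shrinks this considerably, but the scaling argument is the same):

```python
def texture_vram_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    """Uncompressed size of one texture in MiB; a full mip chain
    adds roughly one third on top of the base level."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / 2**20

# Hypothetical scene: 150 distinct 2048x2048 RGBA8 textures resident.
budget_mib = 150 * texture_vram_mib(2048, 2048)
print(f"~{budget_mib / 1024:.1f} GiB of textures alone")  # ~3.1 GiB
```

Once that working set exceeds the card's VRAM, the driver starts paging textures over the PCIe bus from system RAM mid-frame, which is exactly where the stutter comes from.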



I think it just shows the amount of confusion and panic from many a new PC gamer at present. A 4GB card will mostly be fine this gen for getting the same quality textures as the consoles (AF improvements aside); a 2GB card will not. Simple. I cannot understand the confusion and aggression here, I really can't!

No, you did not say the same; you said that the textures have different quality settings at different distances (which isn't entirely correct) but that it was unnoticeable. Which makes me wonder how you know that in the first place, if you can't notice it.

I'd be very, very surprised if the texture quality on this ultra setting and on the PS4 is the same.

EDIT: You were talking about Watch Dogs? Well, the same post applies.
 

thuway

Member
Oh, I agree, but I still consider the 970 upper midrange.

Could you have imagined, last year, getting the performance of a 970 for under 350 dollars? Probably not :). Now if only AMD comes out swinging, we can finally start seeing some 4K-capable cards in the 300-400 range.
 

blaidd

Banned
I'm confused now.

Those Ultra textures are supposedly a separate download, which isn't available yet. So even when you set the option to "Ultra" you will only get "High" textures at the moment. When PC Gamer set everything to "Ultra", they still got "High" textures. This is not what they look like (and, having seen the vid, it shouldn't be).

Yeah, but I play at 1440p, and personally I don't think there's much need for SSAA there. Usually 2xMSAA gives me satisfying results, or sometimes even a good post-processing-based solution like SMAA.

There's no MSAA. The only option for AA is SSAA at different scales.

And there's always a need for some blissful AA ;-). I use a tiny 24'' UHD display at work (very high pixel density) and it still shimmers ;-).
 

BONKERS

Member
Yeah, this is just an in-game resolution slider, like in Battlefield 4 for example. Or downsampling (upsampling if you choose a lower-than-native res), or OGSSAA if you prefer. There seems to be some post-FX AA going on as well, so even though OGSSAA is not perfect, it looks rather smooth.

With deferred rendering, implementing SGSSAA doesn't seem worth the extra work; implementing OGSSAA should be extremely simple compared to that.

Also: OGSSAA will do everything - brutal and inefficient, but it will get everything - SGSSAA? Not necessarily. I always hated to see a very nice and smoothed-out picture with just one or two edges still shimmering violently. So there are some advantages to DS/OGSSAA.

I honestly cannot disagree with this more.

OGSSAA generally fares very poorly with temporal and specular aliasing, and with shimmering and moiré, compared to SGSSAA in the vast majority of real-world games/scenarios. In some newer games there are cases where it doesn't mesh well at all.
(Murdered: Soul Suspect, for example, in DX9)
But this is only because it's essentially a hack, and it can only work with what the driver can get from the game.

With a properly implemented SGSSAA, it would be able to get literally everything if wanted, including low-resolution buffers.
(For exactly the same reason why MSAA in Crysis 3 is for the most part only OK: they chose to use a custom sub-sample mask to reduce redundant coverage by comparing sample similarity, which resulted in MSAA missing a lot of things even though it resolves properly in the rendering chain, whereas Battlefield: Bad Company 2's MSAA does not. Similar scenarios exist in games like Hitman: Absolution and BF3.)
There are many games with low-resolution effects buffers that forced SGSSAA also gets, because the game has them in a format that a logic bit within the Nvidia drivers expects the main rendering/back buffers to be in.
(e.g. Dead Rising 2 when properly configured, Mafia 2)

There are also games where SGSSAA gets everything BUT low-resolution effects buffers, which is what usually ends up looking bad.
(e.g. Battlefield: Bad Company 2; Deus Ex: Human Revolution DC, but not the original edition; Far Cry 3's DoF in cinematics, which can thankfully be disabled; Left 4 Dead 2's character glow buffers)

Then you have some modern games, as I mentioned, that don't quite get everything and have tradeoffs.
(Such as Far Cry 3 with certain elements, whereas OGSSAA has a lot of trouble with certain elements that SGSSAA handles without issue. The same also applies to Murdered: Soul Suspect and even an older game like Left 4 Dead 2.)
OGSSAA only ever ends up getting *EVERYTHING* at extremely ridiculous and unplayable ratios like 4x4 = 16x in those same cases, just to get near SGSSAA quality. And in general it is only better than SGSSAA in image sharpness and edge quality.
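The grid-versus-sparse distinction above is easy to see in the sample positions themselves. A minimal sketch with illustrative 4x patterns (the exact offsets vary by driver and implementation; these are assumptions for the demo):

```python
# Illustrative sub-pixel sample offsets for 4x supersampling.
# Ordered grid (OGSSAA / downsampling): a regular 2x2 lattice, so only
# 2 distinct positions per axis -- near-horizontal and near-vertical
# edges get coarse coverage gradients.
ordered_grid_4x = [(0.25, 0.25), (0.75, 0.25),
                   (0.25, 0.75), (0.75, 0.75)]

# Sparse/rotated grid (SGSSAA-style): every sample sits on its own row
# and column, giving 4 distinct positions per axis from the same 4
# samples -- finer gradients on near-axis edges at the same cost.
sparse_grid_4x = [(0.375, 0.125), (0.875, 0.375),
                  (0.125, 0.625), (0.625, 0.875)]

for name, pattern in (("ordered", ordered_grid_4x),
                      ("sparse", sparse_grid_4x)):
    print(name, "distinct x positions:",
          len({x for x, _ in pattern}))  # ordered -> 2, sparse -> 4
```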

Monolith's FSAA in their mid-2000s games is the only SSAA implementation in a modern game from the last 10 years that I've seen that is actually worth a damn at reducing temporal aliasing. Which is no small feat, considering they are from an early era of normal-map use that resulted in harsh, plasticky lighting and tons of aliasing from normal maps.


I always hated to see a very nice and smoothed-out picture with just one or two edges still shimmering violently.
And you don't notice this with OGSSAA at all? I see tons of temporal aliasing with 2x2 OGSSAA, consistently and constantly.
(8/10 UE3 games look worse with 2x2 OGSSAA, even with some post-AA on top, than with SGSSAA. The best solution is OGSSAA+SGSSAA.)

OGSSAA definitely has its place, definitely is superior to straight MSAA in many ways, and when combined with other techniques like MSAA, SGSSAA, and various post-AA it produces fairly good results.

Nvidia has made efforts to help MSAA implementation in deferred renderers with things like this, and has made somewhat of an effort to reduce the performance cost of MSAA with the yet-to-be-validated MFAA.

But this is, like, just my opinion after having spent literally hundreds to thousands of hours testing various AA methods in hundreds of games. I can see just about any and all aliasing, which is a curse.


Don't get me wrong though, OGSSAA is better than nothing. But I'd still rather have a worthwhile, proper form of AA implemented that can get everything, with the best of both worlds.

Shadow of Mordor's supposed Ultra textures, if worthwhile, are a good thing, not only as future-proofing for better quality, but also for offering the best quality as an option for those who want it. And I wholeheartedly believe that something like a properly implemented SGSSAA, or even any SSAA comparable to Monolith's previous efforts, would be the same.
 
Those Ultra textures are supposedly a separate download, which isn't available yet. So even when you set the option to "Ultra" you will only get "High" textures at the moment. When PC Gamer set everything to "Ultra", they still got "High" textures. This is not what they look like (and, having seen the vid, it shouldn't be).
So R_Deckard is not right about the console version's textures being Ultra?
 
Could you have imagined, last year, getting the performance of a 970 for under 350 dollars? Probably not :). Now if only AMD comes out swinging, we can finally start seeing some 4K-capable cards in the 300-400 range.

No, never. I was looking for the next 8800 GT, but I was convinced we'd have to wait a long time for it. I am so happy that I was wrong!
 

Larogue

Member
Moral of the story,

If you've got a GPU with 2GB of VRAM, you should really consider upgrading when the new 8GB cards come out in November (980, 970, and possibly 960).

If you have a 4GB GPU, there's no need to upgrade anytime soon; it should last you 1-2 years, as it's rare that you need more than that to run games at High or Ultra settings (High in the case of this game).
 

SparkTR

Member
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High".

I doubt even the Ultra textures will use 6 gigs at 1080p, however. High uses 2.7 gigs, but with downsampling from 2720x1700 at 1920x1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that, so instead of 1080p I'd get some really weird ones like 1927x1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K res while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be at around 3 gigs. So the recommendation seems plausible.

So with Ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might just not be enough, making the jump to 6 gigs logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to Ultra, including textures, but with supersampling disabled.
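To make the arithmetic behind that concrete: at 200% internal resolution each axis doubles, so the pixel count, and with it every screen-sized buffer, grows 4x. A minimal sketch assuming just one RGBA8 color target plus a 32-bit depth buffer (a real deferred renderer carries several more G-buffer targets, so actual growth is steeper):

```python
def buffer_mib(width, height, bytes_per_pixel=8):  # 4 color + 4 depth
    """Size of a color+depth pair at the given resolution, in MiB."""
    return width * height * bytes_per_pixel / 2**20

native = buffer_mib(1920, 1080)        # ~15.8 MiB
supersampled = buffer_mib(3840, 2160)  # ~63.3 MiB, exactly 4x
print(f"{native:.1f} MiB -> {supersampled:.1f} MiB "
      f"({supersampled / native:.0f}x)")
```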

It also runs very nicely. With everything set to Ultra except textures and downsampling, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably get at least slightly more than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)

So this is a TW2 Ubersampling scenario? Did we have a thread complaining about that back in 2011?
 
So with Ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might just not be enough, making the jump to 6 gigs logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to Ultra, including textures, but with supersampling disabled.

Holy shit, are they stupid if this is actually true.

[image: tFhCHpD.png]


They wrote "at 1080p rendering resolution". If you use supersampling, internally the game isn't using that resolution, but 4K! There is a world of difference. This is why some people were shitting bricks; the whole discussion assumed these requirements were for native 1080p.
 

blaidd

Banned

By "not worth the trouble" I meant from a developer's point of view. ;-) So they go for SSAA, which is not brilliant but okay; from my perspective it beats those millions of DX9 console ports with nothing at all. I actually agree with you for the most part.

I'd nearly always prefer SGSSAA (actually rotated grid, or RGSSAA, while we're at it), but some games with some weird shaders don't work so well.

And I always combine DS with a post-FX AA (preferably SMAA), which works better at a higher resolution than it would at native.
 

coastel

Member
Fine, junior, I'll explain it to you.

You have completely missed the point of alexandros's post, and given your ill-formatted, pointless rant, I doubt you even read it.

He was merely explaining how a 770, and later, a PC performs compared to a PS4. Not telling people not to buy one.

So the part at the end saying it's not worth buying a console at launch due to it being inferior, just like the Watch Dogs port, which could be just as crappily optimised as it was on PC? It gets tiresome seeing the same PC crowd (not all) with the attitude that consoles are inferior just based on specs. It gets boring; it's misinformation as a whole, just like a lot of the bullshit console gamers say about PC. Let's get this right: I will play games on anything minus fucking phones, and my most played game is on PC; it's my pic if people look. I do apologise if it didn't seem directly aimed at what he said, but my point still stands.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I was considering getting this for PS4, because a long, repetitive-ish game like this would be better suited to my couch, but damn, the price difference.

70 bucks for a brand-new console game, and I can get it on PC for like 25 from some people in the BST thread.

Yeah, PC it is.

The game has full gamepad support on PC, so comfy couch gaming shouldn't be a problem.
 

blaidd

Banned
Holy shit, are they stupid if this is actually true.

[image: tFhCHpD.png]


They wrote "at 1080p rendering resolution". If you use supersampling, internally the game isn't using that resolution, but 4K! There is a world of difference. This is why some people were shitting bricks; the whole discussion assumed these requirements were for native 1080p.

Wrong pic. Wait, I'll go through the pain and upload the one with the resolution options somewhere....
[image: attachment.php]
 

Witchfinder General

punched Wheelchair Mike
Moral of the story,

If you've got a GPU with 2GB of VRAM, you should really consider upgrading when the new 8GB cards come out in November (980, 970, and possibly 960).

If you have a 4GB GPU, there's no need to upgrade anytime soon; it should last you 1-2 years, as it's rare that you need more than that to run games at High or Ultra settings (High in the case of this game).

I'm assuming you're pulling this data out of your arse, as there's currently no prescribed date or announcement for 8GB cards.
 
So the part at the end saying it's not worth buying a console at launch due to it being inferior, just like the Watch Dogs port, which could be just as crappily optimised as it was on PC? It gets tiresome seeing the same PC crowd (not all) with the attitude that consoles are inferior just based on specs. It gets boring; it's misinformation as a whole, just like a lot of the bullshit console gamers say about PC. Let's get this right: I will play games on anything minus fucking phones, and my most played game is on PC; it's my pic if people look. I do apologise if it didn't seem directly aimed at what he said, but my point still stands.

I just don't understand how you got PC elitism out of a post that clearly stated the PS4 has a distinct VRAM advantage compared to 2GB cards. And the rest of the post was meant to point out that the old axiom of "buy a console at launch and a PC later" isn't applicable any more. Not that you should not buy a console under any circumstances.
 

Durante

Member
Ha ha! Stop hitting your keyboard so hard ;-)

I have no idea where and how deluded you are, but I can see a rational conversation is wasted on you. Fare thee well and keep fighting the good fight!
That's a pretty weak response to being proven wrong. If you don't want to discuss anything, then say so from the start.
 
The game has full gamepad support on PC, so comfy couch gaming shouldn't be a problem.

Most people don't have their PC connected to a TV as well, or maybe it's in an entirely different room.

I'm assuming you're pulling this data out of your arse, as there's currently no prescribed date or announcement for 8GB cards.

When non-reference versions of the current cards arrive, there will very likely be 8GB cards among them.
 

BONKERS

Member
By "not worth the trouble" I meant from a developer's point of view. ;-) So they go for SSAA, which is not brilliant but okay; from my perspective it beats those millions of DX9 console ports with nothing at all. I actually agree with you for the most part.

I'd nearly always prefer SGSSAA (actually rotated grid, or RGSSAA, while we're at it), but some games with some weird shaders don't work so well.

And I always combine DS with a post-FX AA (preferably SMAA), which works better at a higher resolution than it would at native.
Oh, I assumed you meant that anyway. But I argue for the sake of quality, and for the same future-proofing that I believe is a good thing for PC gaming.

beats those millions of DX9 console ports with nothing at all.
This is true, but those millions of DX9 console ports with nothing at all have the direct advantage of having a myriad of techniques that can be used with Nvidia cards.
I'd nearly always prefer SGSSAA (actually rotated grid, or RGSSAA, while we're at it),
RGSSAA, when done properly, could far beat out OGSSAA in many ways. SC: Blacklist had it IIRC, but it didn't work very well.

And I always combine DS with a post-FX AA (preferably SMAA), which works better at a higher resolution than it would at native.

Which is what you should do! Even FXAA works a million times better at high resolution than at native.

But post-AA like that still only really helps round out edge quality and static IQ, and it should always be used with OGSSAA/downsampling if no other option exists.

(Consider raw 2x2 OGSSAA or 2x2 downsampling: edge quality is still really rough.)
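A minimal sketch of why, assuming a plain box-filter resolve (which is what straight downsampling amounts to): each final pixel averages only four sub-samples, so an edge can take at most five coverage levels, and the staircase stays visible.

```python
import numpy as np

# Hypothetical 2x2-supersampled 8x8 tile with a hard diagonal edge.
hi = np.fromfunction(lambda y, x: (x + y < 8).astype(float), (8, 8))

# Box-filter downsample back to 4x4 -- what raw 2x2 OGSSAA resolves to.
lo = hi.reshape(4, 2, 4, 2).mean(axis=(1, 3))
print(np.unique(lo))  # only a few discrete coverage levels survive,
                      # so the edge still steps visibly
```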


But again, knowing Monolith, I'm really still very much hoping that this 200% resolution option ends up working as well as their old FSAA :p
 

coastel

Member
I just don't understand how you got PC elitism out of a post that clearly stated the PS4 has a distinct VRAM advantage compared to 2GB cards. And the rest of the post was meant to point out that the old axiom of "buy a console at launch and a PC later" isn't applicable any more. Not that you should not buy a console under any circumstances.

That's kinda like saying don't. I understand what you're saying, and the comment comes from seeing a lot of your posts about consoles; it just gets really boring, to the point of not wanting to read PC threads that I enjoy, because of the arrogance of a few. I like PC and intended to buy another this year. Seems putting it off was the right choice; I want a build to last a few years, and next year seems a great time to build.
 

mcz117chief

Member
Well, I still believe this game will look amazing on low/medium.

Pretty sure it won't look like this even when you set everything to the absolute lowest, so all is good :)
 
Most people don't have their PC connected to a TV as well, or maybe it's in an entirely different room.



When non-reference versions of the current cards arrive, there will very likely be 8GB cards among them.

Couch gaming works great on PC. If you want the option, you have it. I know a lot of people here, myself included, have their PCs hooked up to their TVs. I run a long HDMI cable down into the basement and back up behind my TV (something like a 50-foot run).

The Bluetooth adapter I have for my DS4 works without issue from my couch, and I can mirror the display back to my desk for the instances where using the trackpad on the DS4 as a mouse doesn't work. Combine that with Steam Big Picture mode and you've got yourself an amazing setup.

Other options include putting a much less powerful media PC or Steam box next to your TV and using Steam streaming, or buying something like the Alienware X51 to use more like a console and having it hooked up directly. Many other options exist.
 

R_Deckard

Member
That's a pretty weak response to being proven wrong. If you don't want to discuss anything, then say so from the start.
No it's not. I have already proved it with images months ago and posted them again; I am not getting drawn into another debate on this.

Fact is, WD on PS4 has the same textures, when you look at them, as PC on Ultra. Owning both, I know this to be fact.

The amount of people here sticking fingers in their ears going "la la" and stating I am wrong with less "proof" than me is the real issue.

If you are all so sure that the PS4 is the same as high textures on PC, then show proof. Simple.

Why this is turning into an argument is beyond me!
 

dreamfall

Member
Moral of the story,

If you've got a GPU with 2GB of VRAM, you should really consider upgrading when the new 8GB cards come out in November (980, 970, and possibly 960).

If you have a 4GB GPU, there's no need to upgrade anytime soon; it should last you 1-2 years, as it's rare that you need more than that to run games at High or Ultra settings (High in the case of this game).

Wait, do 8GB versions of the 970/980 really come out in November? Source?

That'll be my time to upgrade, if true. Hadn't heard that!
 
So the part at the end saying it's not worth buying a console at launch due to it being inferior, just like the Watch Dogs port, which could be just as crappily optimised as it was on PC? It gets tiresome seeing the same PC crowd (not all) with the attitude that consoles are inferior just based on specs. It gets boring; it's misinformation as a whole, just like a lot of the bullshit console gamers say about PC. Let's get this right: I will play games on anything minus fucking phones, and my most played game is on PC; it's my pic if people look. I do apologise if it didn't seem directly aimed at what he said, but my point still stands.
No, he was saying ditching PC for a console this time around is a bad idea, and that doesn't automatically equate to going PC-only. One can own both, you know?
 

Durante

Member
Fact is, WD on PS4 has the same textures, when you look at them, as PC on Ultra. Owning both, I know this to be fact.
Whether or not you own both has no bearing on the facts at all.

The facts are that W_D on console uses a mixture of medium, high and ultra textures. On PC "ultra", it uses only ultra textures. This has been confirmed by actual analysis of what the game does.

So basically, you are completely wrong, have been so for 5 pages, and are obnoxious about it.

As a cherry on top, you are going directly against developer confirmation of NG consoles being equivalent to "high" settings, not ultra: https://twitter.com/SyrupCommander/statuses/469091994514366464
 

KungFucius

King Snowflake
Couch gaming works great on PC. If you want the option, you have it. I know a lot of people here, myself included, have their PCs hooked up to their TVs. I run a long HDMI cable down into the basement and back up behind my TV (something like a 50-foot run).

The Bluetooth adapter I have for my DS4 works without issue from my couch, and I can mirror the display back to my desk for the instances where using the trackpad on the DS4 as a mouse doesn't work. Combine that with Steam Big Picture mode and you've got yourself an amazing setup.

Other options include putting a much less powerful media PC or Steam box next to your TV and using Steam streaming, or buying something like the Alienware X51 to use more like a console and having it hooked up directly. Many other options exist.


I put my more powerful PC at my TV for gaming/media and have a less powerful desktop. This option works great, as all I use the desktop for now is web browsing and occasional resume editing. I actually just built a decent APU system for this desktop after my old i7 crapped out on me. I tend to leave the desktop PC on 24/7, so it's better to have the low-power rig there. I came to this setup when I built a low-power PC to be used as a media center only. I didn't even think of using the PC like a console until I had some problem with the CPU/mobo on the cheap PC while building it. I loaded up something like The Witcher 2 and never went back.
 

TronLight

Everybody is Mikkelsexual
I might be wrong here, but shouldn't they be able to stream the textures for the scene without having to cache ~6GB in memory? Pretty much like every game has for the last 10 years?
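That is the standard approach, and the core of it fits in a few lines: keep a VRAM residency budget and evict whatever the camera hasn't touched recently. A toy LRU sketch (class and method names are made up for illustration; real engines stream individual mip levels and prefetch ahead of the camera rather than working per whole texture):

```python
from collections import OrderedDict

class TextureStreamer:
    """Toy LRU residency cache: keep only textures the current scene
    touches, evict the least recently used when over budget."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # texture_id -> size in bytes
        self.used = 0

    def request(self, texture_id, size_bytes):
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)  # mark recently used
            return
        # Evict cold textures until the new one fits.
        while self.used + size_bytes > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[texture_id] = size_bytes
        self.used += size_bytes

# A 2 GiB budget can serve a world whose unique textures total far
# more, as long as any single frame's working set fits.
streamer = TextureStreamer(budget_bytes=2 * 2**30)
```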
 

BBboy20

Member
With this and The Evil Within's requirements, I think it's time for a rethink. I may wait for the 8GB versions or nab two 6GB 780s in a sale.
What? The game looks good in areas, but that good?

Wait 2 or 3 months; 8GB models shouldn't be far out. If you buy with EVGA, they have a nice Step-Up program where you can upgrade within 90 days of purchasing the card. I.e., if EVGA puts out an 8GB model in November, you can send them the card and 50 dollars to get the version with more memory.
If they can get that kind of VRAM before The Witcher 3 without costing me a leg, then we're good.

So, does res affect FPS or not? Because I still use 1024x768.
 
No it's not. I have already proved it with images months ago and posted them again; I am not getting drawn into another debate on this.

Fact is, WD on PS4 has the same textures, when you look at them, as PC on Ultra. Owning both, I know this to be fact.

The amount of people here sticking fingers in their ears going "la la" and stating I am wrong with less "proof" than me is the real issue.

If you are all so sure that the PS4 is the same as high textures on PC, then show proof. Simple.

Why this is turning into an argument is beyond me!
Aren't you making a heterodoxical claim without putting forth proof in the first place?
 
No it's not. I have already proved it with images months ago and posted them again; I am not getting drawn into another debate on this.

Fact is, WD on PS4 has the same textures, when you look at them, as PC on Ultra. Owning both, I know this to be fact.

The amount of people here sticking fingers in their ears going "la la" and stating I am wrong with less "proof" than me is the real issue.

If you are all so sure that the PS4 is the same as high textures on PC, then show proof. Simple.

Why this is turning into an argument is beyond me!

Lol, you think the PS4 version of SoM has texture quality equal to Ultra?
 
Me and my mates all have our PCs connected to our TVs now. Comfy couch gaming is awesome.

Couch gaming works great on PC. If you want the option, you have it. I know a lot of people here, myself included, have their PCs hooked up to their TVs. I run a long HDMI cable down into the basement and back up behind my TV (something like a 50-foot run).

The Bluetooth adapter I have for my DS4 works without issue from my couch, and I can mirror the display back to my desk for the instances where using the trackpad on the DS4 as a mouse doesn't work. Combine that with Steam Big Picture mode and you've got yourself an amazing setup.

Other options include putting a much less powerful media PC or Steam box next to your TV and using Steam streaming, or buying something like the Alienware X51 to use more like a console and having it hooked up directly. Many other options exist.

I use my TV with my PC as an option myself, but be aware that this is not what most people have. You'll need a compatible controller if you don't happen to have the right console, possibly a receiver to connect it, maybe a long HDMI cable, preferably something with a trackpad to control the elements you can't control with a mouse, or an extra PC bought to use as a console.

Sure, there are plenty of options, but there isn't one solution that fits all, and you can't expect everyone to go comfy couch gaming with their PC.
 

R_Deckard

Member
Whether or not you own both has no bearing on the facts at all.

The facts are that W_D on console uses a mixture of medium, high and ultra textures. On PC "ultra", it uses only ultra textures. This has been confirmed by actual analysis of what the game does.

So basically, you are completely wrong, have been so for 5 pages, and are obnoxious about it.

As a cherry on top, you are going directly against developer confirmation of NG consoles being equivalent to "high" settings, not ultra: https://twitter.com/SyrupCommander/statuses/469091994514366464

Show me this proof then, other than a tweet?
 

Durante

Member
So, the OT and silly W_D tangent aside, I love the fact that some developers are already going beyond the current console level not just in effects or IQ, but in actual asset quality.

I honestly didn't expect that before late 2015. Nice job, Monolith.
 

R_Deckard

Member
Whether or not you own both has no bearing on the facts at all.

The facts are that W_D on console uses a mixture of medium, high and ultra textures. On PC "ultra", it uses only ultra textures. This has been confirmed by actual analysis of what the game does.

So basically, you are completely wrong, have been so for 5 pages, and are obnoxious about it.

As a cherry on top, you are going directly against developer confirmation of NG consoles being equivalent to "high" settings, not ultra: https://twitter.com/SyrupCommander/statuses/469091994514366464

Show me this proof then, other than a tweet?

Aren't you making a heterodoxical claim without putting forth proof in the first place?

I think you mean heterodoxy, and no, I posted images months ago. On a screen, looking at them with my eyes, Ultra textures = PS4 textures. End of.
Lol, you think the PS4 version of SoM has texture quality equal to Ultra?

No, as I have stated above. This strawman witch hunt is comical and is derailing this thread, so I am done here.
 
Show me this proof then, other than a tweet?



I think you mean heterodoxy, and no, I posted images months ago. On a screen, looking at them with my eyes, Ultra textures = PS4 textures. End of.


No, as I have stated above. This strawman witch hunt is comical and is derailing this thread, so I am done here.

Probably for the best, eh?
 