
AMD dodges question on anti-consumer behavior regarding their bundled games. (Update: article is selective)

Buggy Loop

Member
Yet, their $1000 24GB 7900XTX trades blows in raster with the $1200 16GB 4080.

The only joke here is the idea that this post is intelligent.

Their flagship $1000 reference card with a shit cooler (add $$ for an AIB with a decent cooler; AMD's reference design is nowhere near Nvidia's Founders Edition now) barely manages to drag its way ahead of the heavily cut-down Ada silicon that is the 4080, and at higher power consumption? (Let's ignore RT and frame gen for a minute.)

Wow

citizen kane applause GIF


What an atrocious gen. The 4080 is highway robbery, but because AMD is that uncompetitive, Nvidia can price it at whatever stupid level they want.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Their flagship $1000 reference card with a shit cooler (add $$ for an AIB with a decent cooler; AMD's reference design is nowhere near Nvidia's Founders Edition now) barely manages to drag its way ahead of the heavily cut-down Ada silicon that is the 4080, and at higher power consumption? (Let's ignore RT and frame gen for a minute.)

Wow

citizen kane applause GIF


What an atrocious gen. The 4080 is highway robbery, but because AMD is that uncompetitive, Nvidia can price it at whatever stupid level they want.
Dude, look at these benchmarks. The 7900 XTX is quite competitive with the 4080 and $200 cheaper. Please refrain from embarrassing yourself.

[benchmark chart attachments]


But ray tracing... yes, ray tracing is better on Nvidia's cards. Heck, even Intel has done a better job with ray tracing than AMD. No question, and if that is a huge priority for you, then go with Nvidia, but I'd argue that you should spend the extra $$ and just get the 4090.

The 4080 has the specs of what the 4070 Ti should have been. The 4080 should have had about 11K CUDA cores and a 320-bit memory bus. That leaves room for a 4080 Ti with around 13K, which would go nowhere near encroaching on the 4090's MASSIVE 16K CUDA cores. The 4070 Ti would make a great 4070 if it had a 256-bit bus. The current 4070 would make a good 4060 Ti, and so on down the line.
 

Buggy Loop

Member
Dude, look at these benchmarks. The 7900 XTX is quite competitive with the 4080 and $200 cheaper. Please refrain from embarrassing yourself.

[benchmark chart attachments]

Oh yeah, counting those odd games that favour AMD, like Call of Duty, twice. Big brain move there. You totally fit with the avatar you have.

Here's a comparison with 50 games that actually does good work (best source for VR):


I did the averages for 4K only (I'm lazy), with the 7900 XTX as the baseline (100%). A rough sketch of how the averaging works is below the numbers.

So, games with RT only, all APIs - 20 games
4090: 168.7%
4080: 123.7%

Dirt 5, Far Cry 6 and Forza Horizon 5 are in the 91-93% range for the 4080, while Crysis Remastered goes crazy at 236.3%, which I don't get; I thought they had a software RT solution there?

Rasterization only, all APIs - 21 games
4090: 127.1%
4080: 93.7%

Rasterization only, Vulkan - 5 games
4090: 138.5%
4080: 98.2%

Rasterization only, DX12 2021-22 - 3 games
4090: 126.7%
4080: 92.4%

Rasterization only, DX12 2018-20 - 7 games
4090: 127.2%
4080: 95.5%

Rasterization only, DX11 - 6 games
4090: 135.2%
4080: 101.7%
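
For anyone wondering how I got those percentages: it's just each card's FPS divided by the 7900 XTX's FPS per game, then averaged per category. A rough sketch with made-up FPS numbers (not the review's actual data):

```python
# Rough sketch of the averaging (made-up 4K FPS numbers, not the review's data).
# Each game's FPS is divided by the 7900 XTX result, then averaged per category.

fps_4k = {
    # game: {card: average FPS}
    "Game A": {"7900 XTX": 100.0, "4080": 94.0, "4090": 127.0},
    "Game B": {"7900 XTX": 80.0,  "4080": 75.0, "4090": 102.0},
    "Game C": {"7900 XTX": 60.0,  "4080": 62.0, "4090": 81.0},
}

def relative_avg(results, card, baseline="7900 XTX"):
    """Average of per-game (card FPS / baseline FPS), as a percentage."""
    ratios = [game[card] / game[baseline] for game in results.values()]
    return 100.0 * sum(ratios) / len(ratios)

for card in ("4090", "4080"):
    print(f"{card}: {relative_avg(fps_4k, card):.1f}%")
```

(Reviewers often prefer a geometric mean over a plain average, but for a quick forum tally the difference is small.)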

But ray tracing... yes, ray tracing is better on Nvidia's cards. Heck, even Intel has done a better job with ray tracing than AMD. No question, and if that is a huge priority for you, then go with Nvidia, but I'd argue that you should spend the extra $$ and just get the 4090.

The 4080 has the specs of what the 4070 Ti should have been.

And that makes AMD's flagship supposed to be what, then?

Do you even remember how the 6900 XT vs 3090 matchup went?

The 4080 should have had about 11K CUDA cores and a 320-bit memory bus. That leaves room for a 4080 Ti with around 13K, which would go nowhere near encroaching on the 4090's MASSIVE 16K CUDA cores. The 4070 Ti would make a great 4070 if it had a 256-bit bus. The current 4070 would make a good 4060 Ti, and so on down the line.

Imagine what Nvidia did with that cut-down chip.

We all thought Nvidia priced the 4080 as a push to sell 4090s; it turns out they actually fucking nailed the AMD flagship with a card that should normally cost ~$700!
How does a card with

379 mm² vs 529 mm²

45.9B transistors vs ~58B transistors

4864 vs 6144 Cores

~30% of its transistors reserved for ML & RT vs a die that concentrates on rasterization

end up only 6.3% behind in rasterization? And with the 4080 consuming ~100W less than the 7900 XTX?
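
Quick back-of-envelope with the numbers above (my own rough math using the 50-game raster average and the listed die sizes, not official figures):

```python
# Back-of-envelope using the numbers quoted above (not official figures):
# 4080 at ~93.7% of the 7900 XTX in the 50-game raster average,
# AD103 ~379 mm^2 vs Navi 31 ~529 mm^2 total.

cards = {
    "RTX 4080": {"raster_index": 93.7,  "die_mm2": 379.0},
    "7900 XTX": {"raster_index": 100.0, "die_mm2": 529.0},
}

for name, c in cards.items():
    per_mm2 = c["raster_index"] / c["die_mm2"]
    print(f"{name}: {per_mm2:.3f} raster points per mm^2")

# How much more raster the 4080 extracts per mm^2 of silicon
ratio = (93.7 / 379.0) / (100.0 / 529.0)
print(f"4080: ~{(ratio - 1) * 100:.0f}% more raster per mm^2")  # roughly +31%
```

(Caveat: Navi 31's 529 mm² includes the MCD chiplets on an older node, so it's not a perfect apples-to-apples area comparison, but the gap is big either way.)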
Look, if you don't care about RT (even though the data shows otherwise for the now vast majority of RTX users), fine. Let's entertain for a minute that we don't care about RT at the dawn of the tsunami of hardware-Lumen Unreal Engine 5 games coming, just for fun.
So RT's not important. Cool. That's why AMD has gone with the hybrid RT pipeline since RDNA 2.

[image: diagram from AMD's hybrid ray tracing patent]



Keyword here from AMD's own patent: saves area and complexity. Saves area.

Cool beans. I can respect that engineering choice.

So then..

Why is AMD failing miserably at just trading blows in rasterization with a 379 mm² die that has a big portion of its silicon dedicated to RT cores and ML?

Massive tech failure at AMD. If you butcher ML and RT and still fail to dominate rasterization, it's a failed architecture. Nvidia's engineers schooled AMD's big time.

AMD is fumbling around. Buying team red is basically just teenage-angst rebellion against Nvidia with "mah open source!" at this point.

Save $200 for a cooler that barely keeps up with the power limits, is one of the noisiest cards since Vega, and has terrible coil whine. Pay $100-$200 more for a respectable AIB cooler solution... tolerate ~45% lower RT performance in heavy RT games for roughly the same price... There's a market for all kinds of special people, I guess.

But don’t get me wrong, both are shit buys, both are fleecing customers.
 

SHA

Member
Oh, Nvidia is not a gaming company because it allowed that to happen. It's fascinating when corporations think consumers are stupid and will follow them no matter what. It's one of the reasons I skip newly released games on PC and find poorly performing games on either platform inexcusable.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Oh yeah, counting those odd games that favour AMD, like Call of Duty, twice. Big brain move there. You totally fit with the avatar you have.
OH! I see. These benchmarks don't count because they favor AMD. Nice job there. Your name of Buggy Loop fits you quite a bit. Very Buggy argument.

So you have to resort to bringing up things like die size to prove your point. Ok? Cool. I guess....
 

Buggy Loop

Member
OH! I see. These benchmarks don't count because they favor AMD. Nice job there. Your name of Buggy Loop fits you quite a bit. Very Buggy argument.

So you have to resort to bringing up things like die size to prove your point. Ok? Cool. I guess....

You really want to make a case for counting Call of Duty twice as a valid metric?

Cracking Up Lol GIF by HULU


Hey, let's count Cyberpunk 2077, Cyberpunk 2077 with ray tracing, Cyberpunk 2077 with psycho ray tracing, Cyberpunk 2077 Overdrive, Crysis Remastered with software RT, Crysis Remastered with RTX-optimized RT...

No surprise coming from a guy that has an Aussie AyyMD unboxed beaver teeth fucker as an avatar on a gaming forum. Who the fuck even thinks of having these techtubers as avatars, are you him?
 
AMD is completely uncompetitive in GPUs; that's why their market share has been continuously collapsing over the last few years. Gamers aren't stupid; it's not hard to tell that the RTX 30 series was superior to RDNA2 and the RTX 40 series is superior to RDNA3.

AMD knows it too; that's why they aren't really trying to undercut Nvidia by much. What's the point of trying to undercut when your sole remaining market is a few truly dedicated fanboys anyway? Also, AMD, like everyone, is heavily constrained by TSMC, and they are wisely allocating most of their TSMC output towards their extremely competitive CPUs. Nvidia doesn't even care all that much anymore either; they would rather allocate their TSMC output towards highly profitable AI chips, which is why they aren't fussed that the RTX 40 series isn't selling in amazing numbers. The fewer RTX 40s they sell, the more highly profitable AI chips they can sell instead.
 

M1chl

Currently Gif and Meme Champion
Not only that, but their games, probably by contract, have to run terribly.
 

LiquidMetal14

hide your water-based mammals
It's BS if some of this is true. I like the free promo games, but paying to keep out features like DLSS (the best of its kind) is not a good look, though we kind of knew this.
 

nemiroff

Gold Member
Exclusivity fucks users in the ass yet again.

Of course this is going to be painted as a one-sided "anti-consumer" thing, when in reality everyone's guilty.

Did you look at the games list? In this case only AMD is blocking "competing" technology in sponsored games.
 

marquimvfs

Member
They are partnering with the developers for the game's release, why the F would they want Nvidia-exclusive features in it? Blame Nvidia for not making DLSS open source.
Soooo obvious. Also, makes perfect sense. Nvidia was always shitty; if they're paying, they have the right to call at least some shots.
 
I guess technically it's anti-consumer, the same way Sony not releasing a PC version of their games day and date with consoles is anti-consumer.
 

01011001

Banned
They are partnering with the developers for the game's release, why the F would they want Nvidia-exclusive features in it? Blame Nvidia for not making DLSS open source.

What would making DLSS open source accomplish, exactly? AMD cards wouldn't be able to run it without a massive performance hit.
XeSS has to reduce its quality in order to work on cards other than Intel's Arc cards, and even with this reduced quality the performance is way worse than using DLSS or FSR2 on both AMD and Nvidia cards.

So in the end, you would have an FSR2 alternative on your AMD card that doesn't look much better than FSR2 while running worse... because that's exactly what XeSS currently is on anything that isn't an Intel Arc GPU.
 

Silver Wattle

Gold Member
What would making DLSS open source accomplish, exactly? AMD cards wouldn't be able to run it without a massive performance hit.
XeSS has to reduce its quality in order to work on cards other than Intel's Arc cards, and even with this reduced quality the performance is way worse than using DLSS or FSR2 on both AMD and Nvidia cards.

So in the end, you would have an FSR2 alternative on your AMD card that doesn't look much better than FSR2 while running worse... because that's exactly what XeSS currently is on anything that isn't an Intel Arc GPU.
The topic isn't about how shit DLSS would be on AMD hardware; it's how AMD are somehow the bad guys for not wanting their competitor's exclusive feature in a game they partner on.
DLSS being open source would make the criticism against AMD valid, whereas right now it's just Nvidia stans having another bitch and moan at AMD for not pandering to their hardware choice.

Right now AMD as a business gets zero benefit from having DLSS in titles they partner with; if you want DLSS in more games, then it's on Nvidia to make it more platform-agnostic, like FSR.
 

XesqueVara

Member
Most games with DLSS also have FSR because they started with DLSS and then added FSR, since it's easy and familiar for devs to add FSR on top of it, but the opposite isn't as easy. Devs may also prioritize FSR over DLSS because its implementation can run on consoles too, making development easier if they use it.
 

01011001

Banned
The topic isn't about how shit DLSS would be on AMD hardware; it's how AMD are somehow the bad guys for not wanting their competitor's exclusive feature in a game they partner on.
DLSS being open source would make the criticism against AMD valid, whereas right now it's just Nvidia stans having another bitch and moan at AMD for not pandering to their hardware choice.

Right now AMD as a business gets zero benefit from having DLSS in titles they partner with; if you want DLSS in more games, then it's on Nvidia to make it more platform-agnostic, like FSR.

OK, why are all of those games missing XeSS then? You know, the platform-agnostic reconstruction tech that's also absent in all AMD-sponsored games.
 

01011001

Banned
Most games with DLSS also have FSR because they started with DLSS and then added FSR, since it's easy and familiar for devs to add FSR on top of it, but the opposite isn't as easy. Devs may also prioritize FSR over DLSS because its implementation can run on consoles too, making development easier if they use it.


It's so hard to do in fact, that it takes modders HOURS, if not DAYS to implement
DAYS I TELL YOU!
Hell, DLSS3 mods are starting to appear now too... I bet those also take DAYS for a single person to implement into a game they don't have the source code or mod tools for! It must be an ENORMOUS task for developers to implement, just an unfathomable amount of resources that, naturally, only a hobbyist can have!
 

Chiggs

Gold Member
Don't worry, Leonidas, I'm sure China will invade Taiwan any day now, and if TSMC is blown to smithereens, Pat Gelsinger might have just sucked on enough Federal titty to stand up a subsidized fab business that will attract all of the business that Samsung rejects.

And then you don't have to create these limpdick threads to make yourself feel better, because both you and mighty/noble Intel won't have to worry about news stories like this--


--hurting you.
 

XesqueVara

Member
It's so hard to do in fact, that it takes modders HOURS, if not DAYS to implement
DAYS I TELL YOU!
Hell, DLSS3 mods are starting to appear now too... I bet those also take DAYS for a single person to implement into a game they don't have the source code or mod tools for! It must be an ENORMOUS task for developers to implement, just an unfathomable amount of resources that, naturally, only a hobbyist can have!
In the end it depends on whether the devs are up to implementing it, tbh. Since FSR works on both Nvidia and AMD, you could argue that implementing DLSS is extra work/cost with little return.
I doubt that AMD/Nvidia/Intel have control over what features devs can implement in their game.
 

01011001

Banned
In the end it depends on whether the devs are up to implementing it, tbh. Since FSR works on both Nvidia and AMD, you could argue that implementing DLSS is extra work/cost with little return.
I doubt that AMD/Nvidia have control over what features they implement in their game.

Dude... we are literally talking about outside hobbyist programmers implementing the feature in their free time within days, if not hours, of release, with results that shit all over the official FSR2 implementation.

There is just no way AMD isn't the reason behind the absence of DLSS and XeSS.
 

XesqueVara

Member
Dude... we are literally talking about outside hobbyist programmers implementing the feature in their free time within days, if not hours, of release, with results that shit all over the official FSR2 implementation.

There is just no way AMD isn't the reason behind the absence of DLSS and XeSS.
All of that is speculation, bro. AMD-partnered games like Uncharted, TLOU and Forspoken all have DLSS in them. If they paid to block it, why would they have it alongside FSR?
 

01011001

Banned
All of that is speculation, bro. AMD-partnered games like Uncharted, TLOU and Forspoken all have DLSS in them. If they paid to block it, why would they have it alongside FSR?

AMD refused to give a clear answer when asked about it. And just because not all of them, "only" the vast majority, are missing DLSS and XeSS doesn't mean they don't do it.
Some devs might turn down more lucrative deals, while others take the money and agree not to support DLSS and XeSS.

All we know is:
A: hobby modders can implement it easily within days of release, even up to DLSS3 frame generation now;
B: even this fan-made implementation looks and runs better than the official FSR2 support;
and C: AMD, even though they didn't confess to it, do not clearly refute it either.
 
AMD is wasting their money anyway; their market share will keep declining as long as they remain objectively inferior to Nvidia, and denying features to the 85% in order to hold them down to the level of the 15% will do nothing but increase resentment and dislike of AMD. It's not smart to make people dislike you when you are pretending to be the underdog; that's literally the dumbest strategy imaginable.
 

ToTTenTranz

Banned
OP's weekly attempt at AMD hate fell flat on its face, so he and his 2 comrades turned another thread into the typical aLl hAiL rAyTrAcInG dLsS-tHrEe jEnSeN rUlEzZz tired crap that they repeat everywhere nonstop.


Perhaps if the thread's subject is moot, it can be closed.
 
DLSS is a hardware AI upscaler that uses the tensor cores in the NVIDIA RTX cards, which is one of the things that makes this technology superior to either AMD's FSR1/2 or Intel's XeSS. DLSS being better than FSR isn't even an opinion at this point, it's pretty much fact.

Making DLSS open-source would likely only mean that it would look/run worse on non-tensor-core hardware and be on a par with the other upscaling APIs. Hardware solutions are always better than software as they have less performance overhead. This is what makes NVIDIA GPUs more appealing than AMD ones for me. NVIDIA have already pushed new technologies whereas AMD typically just copy what NVIDIA do... at least that is how it looks to me. I mean, NVIDIA had a 12-18 month head start on hardware ray-tracing over AMD, and that is the reason their RTX cards have a competitive advantage.

Personally, I've never seen any reason to buy an AMD product. They're usually cheaper than the competition but are almost always inferior or flawed in some way, such as their CPUs vs. Intel (Windows 11 and BIOS/motherboard issues) and their GPUs vs. NVIDIA (weaker RT, inferior upscaling tech, weaker performance). That is the reason why AMD are not the market leader in either CPUs or GPUs. I've only ever bought two AMD GPUs since getting into PCs around 1998; that was when NVIDIA released the disastrous GTX 480 and I went for 2x 5870s in CrossFireX, and while they were fine for their time, NVIDIA's better driver performance and feature set were what brought me back. AMD's drivers at that time had terrible OpenGL support and their DX11 driver was notorious for having a high CPU overhead, plus support for new games was lacking compared with NVIDIA, who were (and still are) better at getting new drivers out for Day One.
 

YeulEmeralda

Linux User
AMD's biggest GPU problem isn't performance, it's perception. AMD hasn't caught up to Nvidia, but they price their cards like they have. The 7900 XTX is an absolute beast of a card, trading blows with Nvidia's $1200 card (which you never see at that price; more like $1300) in raster performance for about $1000, even discounted to $900 in some instances. I maintain that most people could not spot an RT screenshot compared to a raster screenshot. Cyberpunk 2077 is an example: there is very little difference unless you use path tracing. I kept running the benchmark with each RT option enabled to determine if the visual gains were worth the performance trade-off. Sure, you can see the differences if you're looking for them. But for most people, you could turn off RT, tell them it was RT, and they'd be none the wiser.

What frustrates me is that AMD should have priced the XTX at $799 and the XT at $699. Nvidia likely would have been forced to compete. AMD is focusing too much on profits and not enough on market share, which would generate more long-term profits.
I agree with you that RT still has too much of an impact on performance. However, RT is the future and Nvidia keeps getting better at it. AMD needs to reach parity with this technology or they're done for.
 

SmokedMeat

Gamer™
Exclusivity fucks users in the ass yet again.



Did you look at the games list? In this case only AMD is blocking "competing" technology in sponsored games.

Sponsored or not, Nvidia blocks AMD as well. Like I said, it’s being portrayed as poor innocent Nvidia, which couldn’t be further from the truth.
 
Muh DLSS!!!111

I can live just fine with FSR2; in many games I can't even tell the difference unless DF tells me which is better. I've also played a few games where DLSS notably degraded the image, like Witcher 3 or RDR2, so I don't really care about having an option just for the sake of it being there.
Good point. Games where vegetation is all over the place need native res.
 

winjer

Gold Member
DLSS is a hardware AI upscaler that uses the tensor cores in the NVIDIA RTX cards, which is one of the things that makes this technology superior to either AMD's FSR1/2 or Intel's XeSS. DLSS being better than FSR isn't even an opinion at this point, it's pretty much fact.

Making DLSS open-source would likely only mean that it would look/run worse on non-tensor-core hardware and be on a par with the other upscaling APIs. Hardware solutions are always better than software as they have less performance overhead. This is what makes NVIDIA GPUs more appealing than AMD ones for me. NVIDIA have already pushed new technologies whereas AMD typically just copy what NVIDIA do... at least that is how it looks to me. I mean, NVIDIA had a 12-18 month head start on hardware ray-tracing over AMD, and that is the reason their RTX cards have a competitive advantage.

Personally, I've never seen any reason to buy an AMD product. They're usually cheaper than the competition but are almost always inferior or flawed in some way, such as their CPUs vs. Intel (Windows 11 and BIOS/motherboard issues) and their GPUs vs. NVIDIA (weaker RT, inferior upscaling tech, weaker performance). That is the reason why AMD are not the market leader in either CPUs or GPUs. I've only ever bought two AMD GPUs since getting into PCs around 1998; that was when NVIDIA released the disastrous GTX 480 and I went for 2x 5870s in CrossFireX, and while they were fine for their time, NVIDIA's better driver performance and feature set were what brought me back. AMD's drivers at that time had terrible OpenGL support and their DX11 driver was notorious for having a high CPU overhead, plus support for new games was lacking compared with NVIDIA, who were (and still are) better at getting new drivers out for Day One.

The performance difference between using DP4A and Tensor cores is not that big.
Just compare XeSS 1.1 running on a 4080 using DP4A to DLSS running on the Tensor cores.
It's a 2-3 fps difference. That is not a lot when we are gaining ~30% performance in Quality mode.
And consider that XeSS 1.1 using DP4A or XMX cores has the same image quality.
The reality is that on Nvidia GPUs, using DLSS does not max out the Tensor cores. On most RTX cards, all those Tensor cores are overkill for DLSS, although they are great for other AI applications.
Also consider that the DP4A path on RDNA3 has instructions to accelerate these calculations, so the loss in performance would probably be negligible.
AMD already has a solid base with FSR 2.2, and all they need is to add the AI layer to polish up the end image, like Nvidia and Intel are doing.
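
For anyone wondering what the DP4A path actually is: it's a shader instruction that takes two 32-bit registers, treats each as four packed int8 values, multiplies them lane by lane and adds the result into a 32-bit accumulator. A toy sketch of the semantics (plain Python just to illustrate, obviously not GPU code):

```python
# Toy illustration of what a DP4A instruction computes: dot(a4, b4) + acc,
# where a4 and b4 are four packed signed 8-bit lanes and acc is a 32-bit int.
# This is only the semantics in plain Python, not how a GPU executes it.

def unpack_int8x4(word: int):
    """Split a 32-bit word into four signed 8-bit lanes."""
    lanes = []
    for i in range(4):
        byte = (word >> (8 * i)) & 0xFF
        lanes.append(byte - 256 if byte >= 128 else byte)
    return lanes

def dp4a(a_word: int, b_word: int, acc: int) -> int:
    """acc + sum(a_i * b_i) over the four int8 lanes (what one DP4A does)."""
    a = unpack_int8x4(a_word)
    b = unpack_int8x4(b_word)
    return acc + sum(x * y for x, y in zip(a, b))

# Example: pack [1, -2, 3, 4] and [5, 6, -7, 8], accumulate from 10.
a = (1 & 0xFF) | ((-2 & 0xFF) << 8) | ((3 & 0xFF) << 16) | ((4 & 0xFF) << 24)
b = (5 & 0xFF) | ((6 & 0xFF) << 8) | ((-7 & 0xFF) << 16) | ((8 & 0xFF) << 24)
print(dp4a(a, b, 10))  # 10 + (5 - 12 - 21 + 32) = 14
```

Tensor and XMX units do many of these multiply-accumulates per clock in dedicated hardware, which is where the extra headroom mentioned above comes from.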

 
Nice that you ignore the updated post halfway through the thread that includes Sackboy, an Nvidia-sponsored title from Sony that has even had DLSS3 support added and upgraded ray tracing, but still for some reason lacks FSR and apparently blocks the FSR mod.

I didn't ignore it, I just didn't see it. But, whoopee-doo, ONE game. Yeah, that sure makes all the difference, right? AMD redeemed?
 

Ev1L AuRoN

Member
AMD titles usually have no DLSS and RT quality poor enough to run without image reconstruction. Capcom games are the worst offenders, IMHO.
 

Buggy Loop

Member
The topic isn't about how shit DLSS would be on AMD hardware; it's how AMD are somehow the bad guys for not wanting their competitor's exclusive feature in a game they partner on.
DLSS being open source would make the criticism against AMD valid, whereas right now it's just Nvidia stans having another bitch and moan at AMD for not pandering to their hardware choice.

They block the devs from enabling it in UE games; it's a toggle, no source code needed.

Right now AMD as a business gets zero benefit from having DLSS in titles they partner with; if you want DLSS in more games, then it's on Nvidia to make it more platform-agnostic, like FSR.

The DLSS dev kit is available for anyone to use. Anyone. Even the Mario 64 RT dude implemented DLSS a few hours after the SDK was out. You don't need source code, do you? The open source argument is stupid. It's widely available to all platforms and any dev, for free.

And I get no benefit from AMD-sponsored games. RDNA 2 and 3 combined sit at ~1.5% of the Steam hardware survey, while RTX cards from the 2000 series to the 4000 series are now the majority of hardware. They want to piss off that userbase? Is pissing them off really how you attract people to switch over?

AMD-sponsored now most likely means no DLSS, shit RT effects, and memory leaks (nearly all their releases). I'm forced to put up with their FSR 2 and it sucks balls.

How's that for their reputation? You think that makes me interested in switching teams?
 

Buggy Loop

Member


How does that explain that engines with literally a toggle to enable DLSS (UE) don't have it in some AMD-sponsored titles?

Unreal and Unity have it implemented natively now; the DLSS SDK is available to anyone for free and is so easy to implement that an indie dude added it hours after the SDK was out.



At this point, devs saying it's hard are trolling. Probably the same devs that don't bother with ultrawide, and then script kiddies change a hex value to enable it. So much hard work 😂 Or literally swapping a .DLL file to switch FSR to DLSS... oof, what a tough job.

Work Working GIF by Vadoo TV


Let's not go into the Boundary dev, who had RT and DLSS working, and as soon as they signed a sponsorship with AMD they removed those features.

They're trolling us at this point. They really want to make you believe it's a monumental task to support basic PC features, while script kiddies typically implement it within hours after the game releases, and they're not particularly «passionate» people spending weeks on the problem.

There's no excuse.

And to think that open source would help in any way :messenger_tears_of_joy: Dude, they can't even implement basic stuff, or even follow the basic SDK guidelines to get it working in a few hours. You think these people would understand what's going on in the DLSS magic box if they opened it?

Instant nose bleed

role playing flirt GIF by Hyper RPG
 

nemiroff

Gold Member
How does that explain that engines with literally a toggle to enable DLSS (UE) don't have it in some AMD-sponsored titles?

Unreal and Unity have it implemented natively now; the DLSS SDK is available to anyone for free and is so easy to implement that an indie dude added it hours after the SDK was out.



At this point, devs saying it's hard are trolling. Probably the same devs that don't bother with ultrawide, and then script kiddies change a hex value to enable it. So much hard work 😂 Or literally swapping a .DLL file to switch FSR to DLSS... oof, what a tough job.

Work Working GIF by Vadoo TV


Let's not go into the Boundary dev, who had RT and DLSS working, and as soon as they signed a sponsorship with AMD they removed those features.

They're trolling us at this point. They really want to make you believe it's a monumental task to support basic PC features, while script kiddies typically implement it within hours after the game releases, and they're not particularly «passionate» people spending weeks on the problem.

There's no excuse.

Yeah, the narrative is so fucking laughable. That "nerdtech" tweet above is so mental it borders on a medical emergency.

I could understand it for DLSS 1, where you had to do your own ML model training, but DLSS 2/3? Nah, don't fucking patronize me.

Even small indie studios have DLSS support these days.
 

ToTTenTranz

Banned
And this also explains why there are so many bad PC ports.
It's a really sad state of affairs for PC gamers.

In reality, it's something we've known for years.
Digital artists for asset creation are cheap (and getting cheaper with AI), but software engineering is incredibly expensive nowadays. While 3D digital artists and animators work almost exclusively on videogames and movies (and maybe some metaverse stores nowadays), for software engineers the dev houses are competing with every other company working on productivity, banking, mobile, e-stores, etc.: pretty much everything everywhere.
Some people might think dev houses can simply hire cheaper remote workers from India, Pakistan or Bangladesh, but they have no idea how much of a barrier communicating across different cultures and timezones is, and how much that drags down productivity, which is why, e.g., Westerners tend to keep working with Westerners, Asians with other Asians, etc.


And when software engineering costs are huge, that's when optimization and QA get hurt. Huge games with a gazillion assets, poor optimization and poor stability are simply the result of software engineering being so expensive.

Also, the people thinking "it's so easy for a modder to do DLSS2 on an FSR2 game" are still missing the point. No ill will goes towards the modder if the mod breaks the game; people will even cheer him on for having tried.
Have an official patch break the game, though, and your incompetence will be yelled across the four corners of the internet, and at the very least the company needs to release JPEG apologies while facing returns and lost sales.

Any single line of code must go through hoops and loops of quality control processes because the risk of breaking the game (or worse, screwing up a save game) is too great.


I didn't ignore it, I just didn't see it. But, whoopee-doo, ONE game. Yeah, that sure makes all the difference, right? AMD redeemed?
The list I made has 3x more DLSS2-only games than FSR2-only ones. I found 8 DLSS2-only games with Nvidia cross-promotion, plus 15 DLSS2-only games without the cross-promotion.
Your idea that AMD is being successful at blocking DLSS2 from being implemented couldn't be more wrong. If anything it's the other way around.
 

hinch7

Member
Dude, look at these benchmarks. The 7900 XTX is quite competitive with the 4080 and $200 cheaper. Please refrain from embarrassing yourself.

[benchmark chart attachments]


But ray tracing... yes, ray tracing is better on Nvidia's cards. Heck, even Intel has done a better job with ray tracing than AMD. No question, and if that is a huge priority for you, then go with Nvidia, but I'd argue that you should spend the extra $$ and just get the 4090.

The 4080 has the specs of what the 4070 Ti should have been. The 4080 should have had about 11K CUDA cores and a 320-bit memory bus. That leaves room for a 4080 Ti with around 13K, which would go nowhere near encroaching on the 4090's MASSIVE 16K CUDA cores. The 4070 Ti would make a great 4070 if it had a 256-bit bus. The current 4070 would make a good 4060 Ti, and so on down the line.
Yeah, Nvidia were clearly cutting down their SKUs after seeing how bad RDNA 3 turned out. They must've crunched their numbers and adjusted the specifications of the stack to perform accordingly. Only their prices were also completely out of whack, as AMD are still so far behind in many ways and don't have the mindshare Nvidia has.

Nvidia really did take the piss this generation. AMD too, with their egregious pricing and crappy performance uplift. Particularly the 7800 XT *cough* 7900 XT. Shit generation all round.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Yeah, Nvidia were clearly cutting down their SKUs after seeing how bad RDNA 3 turned out. They must've crunched their numbers and adjusted the specifications of the stack to perform accordingly. Only their prices were also completely out of whack, as AMD are still so far behind in many ways and don't have the mindshare Nvidia has.

Nvidia really did take the piss this generation. AMD too, with their egregious pricing and crappy performance uplift. Particularly the 7800 XT *cough* 7900 XT. Shit generation all round.
The only Nvidia GPU that I think is worth getting, if I'm being honest, is the 4090. Yes, it is priced sky-high, but it's the only one that truly feels like no compromises were made. Everything below it has major compromises.

The 4080 needed about 2000 more CUDA cores and a 320-bit memory bus, leaving room for a 4080 Ti to get around 4000 more, which would still put it 2000 below the 4090.

How it should be:
1. 4090 - price this as high as you want. It exists for gamers to be price gouged, but $1300 would be somewhat fair.
2. 4080 Ti - 14K CUDA cores, 320-bit memory bus, 20 GB memory - $1199
3. 4080 - 12K CUDA cores, 320-bit memory bus - $899
4. 4070 Ti - same specs as the 4080 - $699
5. 4070 - same specs as the 4070 Ti, but with a 256-bit memory bus - $499
6. 4060 Ti - same specs as the 4070, but with a 192-bit memory bus, 12GB minimum
7. 4060 - doesn't matter.

If Nvidia had priced their GPUs like that, they could have utterly destroyed AMD OR forced AMD to sell the 7900 XTX for around $600.

Imagine THAT?

And yes, AMD prices their GPUs like they are in a much better position than they are. Every analyst in the world agrees that if AMD really wants to put a dent in the GPU market, they are going to have to bite the bullet and price their GPUs mega-competitively. The 7900 XTX should be $799 MSRP.

AMD also makes a HUGE mistake in pricing their GPUs high on day one: they get permanently labelled a bad value, and despite the later price cuts, for many that label remains. AMD has nobody to blame but themselves.

AMD has finally caught up to Intel in the CPU market and now can price their CPUs accordingly. Nothing Intel currently has can beat the 7800X3D for gaming. The 13700K can come close, but at a fairly significant power increase. AMD is a no-brainer for gaming CPUs.
 