
GAF: Should I replace my 2080Ti with a 4070Ti?

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
No, it doesn't. It'll be a bottleneck for any 4K GPU.

[CPU benchmark graph]


Almost 20 FPS difference. "Easily", lmfao. When's the last time you checked CPU benchmarks? That's just one generation's difference. We're at 13th gen now and performance has increased even more.
65 average, 52 1% lows.
In Cyberpunk 2077 version 1.04.

And when we go to 4K again in 1.04.
[Cyberpunk 2077 4K benchmark graph]


The graph is practically flat.
 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
With a 2080 Ti, Elden Ring cannot run at 60 fps at 4K in open landscapes; I need to turn it down to 1440p. Dead Space at 1440p is on medium settings and I still cannot hit a steady 60 fps.

Will a 4070 Ti run both games at a steady 4K/60 fps?

Elden Ring is a horribly optimised game. Don't use it for benchmarks.

Dead Space should be fine with DLSS Quality on, even better if you don't bother with RTX.
 
65 average, 52 1% lows.
In Cyberpunk 2077 version 1.04.

And when we go to 4K again in 1.04.
[Cyberpunk 2077 4K benchmark graph]


The graph is practically flat.

That's at 4K. If you want to test CPU performance, you do it at lower resolutions where there's no GPU bottleneck. That's why techjesus tests at 1080p. Besides, saying there's barely any difference between a 9th-gen and a 13th-gen is quite a false and bold claim. Games and CPU performance have increased a lot lately. Of course it depends on how CPU-hungry the game is. There are a ton of YouTube videos with real CPU benchmarks; OP is free to check that himself if he wants to. I'm just saying he'd be at a big disadvantage using a 4070 Ti with DLSS with that CPU. It's not a BAD CPU, he just won't reach the performance heights one expects.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That's at 4K. If you want to test CPU performance, you do it at lower resolutions where there's no GPU bottleneck. That's why techjesus tests at 1080p. Besides, saying there's barely any difference between a 9th-gen and a 13th-gen is quite a false and bold claim. Games and CPU performance have increased a lot lately. Of course it depends on how CPU-hungry the game is. There are a ton of YouTube videos with real CPU benchmarks; OP is free to check that himself if he wants to. I'm just saying he'd be at a big disadvantage using a 4070 Ti with DLSS with that CPU. It's not a BAD CPU, he just won't reach the performance heights one expects.
But OP has a 9700K.
He's going to be playing at 4K aiming for 60.
And he wants a 4070 Ti (which slots neatly between the 3090 used in this test and the 3090 Ti).

So why would I show OP benchmarks at 1080p?
With a vastly out-of-date version of Cyberpunk?

A 9700K paired with a 4070 Ti will hit 60 in pretty much any game, and OP can upgrade down the line if the CPU actually becomes a bottleneck at 60.
Otherwise, why not keep going?
 

flying_sq

Member
From the way GPU sales are going right now, I think they're going to do a price cut soon. The 4090 is pretty much the only card really selling. Everyone is complaining about the price of the new cards, and either people will have to give in and buy more, or Nvidia will have to do something, and I think a price cut is the fastest way. I just saw a sale on the 4070 Ti the other day. Granted, it was only $20 off, but I think it's the start.

So I would wait. Plus, if anything, bring your CPU up to date; there are major deals on CPUs right now. I do get that that means a new mobo, maybe RAM, and a CPU.
 
But OP has a 9700K.
He's going to be playing at 4K aiming for 60.
And he wants a 4070 Ti (which slots neatly between the 3090 used in this test and the 3090 Ti).

So why would I show OP benchmarks at 1080p?
With a vastly out-of-date version of Cyberpunk?

A 9700K paired with a 4070 Ti will hit 60 in pretty much any game, and OP can upgrade down the line if the CPU actually becomes a bottleneck at 60.
Otherwise, why not keep going?

DLSS lowers the internal resolution, hence why the CPU is important. DLSS 3 relies on the CPU even more. I doubt he's going to play at native 4K.
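For reference, the internal render resolutions DLSS upscales from can be sketched with the commonly cited DLSS 2 scale ratios. These ratios are approximations and individual titles can override them, so treat this as an illustrative sketch rather than NVIDIA's actual API:

```python
# Approximate per-axis render-scale ratios for DLSS 2 quality modes
# (commonly cited values; games may use different ratios).
DLSS_SCALE = {
    "Quality": 2 / 3,            # 4K output -> ~1440p internal
    "Balanced": 0.58,
    "Performance": 0.5,          # 4K output -> 1080p internal
    "Ultra Performance": 1 / 3,
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple:
    """Return the internal resolution DLSS renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(output_w * s), round(output_h * s)

# At 4K output, DLSS Quality renders internally near 1440p --
# so the CPU matters about as much as it would at 1440p native.
print(internal_resolution(3840, 2160, "Quality"))      # -> (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # -> (1920, 1080)
```

This is the point being made above: the lower the internal resolution, the higher the potential framerate, and the more the CPU becomes the limiting factor.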
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
DLSS lowers the internal resolution, hence why the CPU is important. DLSS 3 relies on the CPU even more. I doubt he's going to play at native 4K.
Even if he is playing at ~1440p internal.
The 9700K isn't going to stop him from hitting 60 in 99% of games until Arrow Lake at the earliest.
At Arrow Lake a full system rebuild may be in order.
Or just keep trucking along as long as those framerates/frametimes are still circa 60.
And Frame Generation just to get up to 60 is pretty poopoo.
 
With a 2080 Ti, Elden Ring cannot run at 60 fps at 4K in open landscapes; I need to turn it down to 1440p. Dead Space at 1440p is on medium settings and I still cannot hit a steady 60 fps.

Will a 4070 Ti run both games at a steady 4K/60 fps?

Last week I sold my 2080 Ti and grabbed a friend's 3090 (the 4070 Ti is similar, from what I recall), and it's been a decent upgrade even with the R7 3800X I'm currently using. As an example, I was testing Control last night. I used to only get about 50-60 fps before, but now, with increased settings and all the ray tracing turned on at the same resolution, the 3090 is putting out about 80-90 fps. For me, above 60 is what I was aiming for with the 2080 Ti, as I have it connected to a 120 Hz TV, so there's plenty of headroom if 60 is all that's needed now. Even with a 3800X I'm getting about 30% more frames at higher settings on the 3090 in the games I've played, so a 9700K should be OK for a 4070 Ti at 60 fps.
 

Nickolaidas

Member
Holy shit. The 2080 Super I used to have was 10 TFLOPS.

The 2080 Ti was 14 TFLOPS and I felt the difference.

The 4070 Ti is 40 TFLOPS! (salivates)
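Those TFLOPS figures come from a simple formula: FP32 TFLOPS ≈ shader cores × boost clock (GHz) × 2, since each core can do one fused multiply-add (two floating-point operations) per clock. A quick sketch using the commonly listed reference specs (treat the core counts and clocks as approximations):

```python
# FP32 throughput in TFLOPS = cores x boost clock (GHz) x 2 ops (FMA) / 1000.
def fp32_tflops(cores: int, boost_ghz: float) -> float:
    return cores * boost_ghz * 2 / 1000

# Reference specs (approximate): cores, boost clock in GHz.
print(round(fp32_tflops(3072, 1.815), 1))  # 2080 Super -> 11.2
print(round(fp32_tflops(4352, 1.545), 1))  # 2080 Ti    -> 13.4
print(round(fp32_tflops(7680, 2.610), 1))  # 4070 Ti    -> 40.1
```

Worth keeping in mind that TFLOPS aren't directly comparable across architectures: Ampere and Ada count dual-issue FP32 per SM, so the ~3x jump on paper doesn't translate into ~3x frames in games.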
 

Griffon

Member
I live in Greece. If I buy that from the US, I'll end up paying 250 bucks extra for customs, 100 euros for the courier doing the import for me, plus shipping.
Definitely not worth it. Your computer is pretty good for 2 to 3 more years at least.

I bet you don't struggle to run any game right now.
 

Nickolaidas

Member
Definitely not worth it. Your computer is pretty good for 2 to 3 more years at least.

I bet you don't struggle to run any game right now.
Callisto Protocol runs like shit at 4K, and even at 1440p with medium settings I get regular frame drops.

Same with Dead Space. It ran well at 1440p at first, but once I reached Hydroponics, the rooms with a lot of greenery drop to 40 fps.

Nothing's unplayable, but I've been spoiled recently and I want 4K (native or DLSS)/60 fps constantly at high or ultra settings (RT can be low, don't really mind).

I no longer have that with a 2080 Ti.
 

ZoukGalaxy

Member
The real question is: do you NEED it?

What I mean by that is: do some games run so poorly that it ruins your enjoyment and experience?

If not, then skip it.

The race for performance is a never-ending chimera.
 

ChazGW7

Member
Not until the prices come down.
If I were in your shoes, I'd probably search for a well-priced second-hand 3080 (Ti) or 3090.
 

PeteBull

Member
If you need it, then go for it, bro. But once again, unless you're really tight for cash, I suggest getting a 4080 or even a 4090 if you plan to keep this GPU for longer than 2 years. Short term, the 4070 Ti will be amazing, though. Again, in the end you've got to decide how much you want to spend.

Here's something you might find useful for swaying your opinion one way or the other: brand-new Hogwarts Legacy, in various settings and resolutions, including native 4K (of course it can't even hold a stable 40 fps at 4K native, maxed, on a 4070 Ti).
 

Nickolaidas

Member
If you need it, then go for it, bro. But once again, unless you're really tight for cash, I suggest getting a 4080 or even a 4090 if you plan to keep this GPU for longer than 2 years. Short term, the 4070 Ti will be amazing, though. Again, in the end you've got to decide how much you want to spend.

Here's something you might find useful for swaying your opinion one way or the other: brand-new Hogwarts Legacy, in various settings and resolutions, including native 4K (of course it can't even hold a stable 40 fps at 4K native, maxed, on a 4070 Ti).

80 fps at 4K DLSS Quality. Sounds just fine to me!
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If you need it, then go for it, bro. But once again, unless you're really tight for cash, I suggest getting a 4080 or even a 4090 if you plan to keep this GPU for longer than 2 years. Short term, the 4070 Ti will be amazing, though. Again, in the end you've got to decide how much you want to spend.

Here's something you might find useful for swaying your opinion one way or the other: brand-new Hogwarts Legacy, in various settings and resolutions, including native 4K (of course it can't even hold a stable 40 fps at 4K native, maxed, on a 4070 Ti).

70 fps using DLSS Quality?

P.S.
People need to realize PC games have a settings menu for a reason; most people skip a generation between GPU upgrades.
Once maxed out isn't a stable 60, drop some settings or resolution.
You don't need to run Ultra settings in every game just because you're keeping your GPU for more than a generation.
I don't know that many people IRL who upgrade every generation, because that just isn't cost-effective. Turning down some settings that don't have a huge visual impact is much better than buying a $1,200+ GPU every 2 years.
 

Nickolaidas

Member
The way I see it, the 4070 Ti's biggest shortcomings are: A) fewer RT cores compared to the stronger cards of this generation, and B) only 12 GB of memory.

I can live with little to no RT features for now, but the memory is the only thing that kiiiiiiiiiiiiiiiiiinda worries me.
 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
The way I see it, the 4070 Ti's biggest shortcomings are: A) fewer RT cores compared to the stronger cards of this generation, and B) only 12 GB of memory.

I can live with little to no RT features for now, but the memory is the only thing that kiiiiiiiiiiiiiiiiiinda worries me.

The memory thing depends on how future-proof you want it to be. It doesn't go above 10 GB in Cyberpunk at 4K max settings, ultra RTX, DLSS Quality + frame generation.

I'm getting Hogwarts on Friday, if you tag me over the weekend as a reminder I'll look at what that's doing.
 

PeteBull

Member
The way I see it, the 4070 Ti's biggest shortcomings are: A) fewer RT cores compared to the stronger cards of this generation, and B) only 12 GB of memory.

I can live with little to no RT features for now, but the memory is the only thing that kiiiiiiiiiiiiiiiiiinda worries me.
RT-wise you just have to play the game on console settings. But to add gasoline to the fire, the 4070 Ti has only a 192-bit bus width; combined with only 12 GB of VRAM, that's a guarantee of big performance degradation at 4K max/close-to-max settings in many new games. Those games can very often be "lazy console ports", so you want enough oomph to brute-force them.

Like I mentioned, I bought a 3080 Ti (very close performance at 4K, maybe around 5% worse, and the same VRAM) and I can tell you older/less demanding games run smoothly at 4K max, at a stable 60. But looking at games whose base is the current-gen consoles, requirements (CPU, RAM, VRAM, GPU oomph) are getting bigger every month. I've got a 4K monitor too and want/need/would like to keep it smooth and stable at 60 for a while, but I can already feel it: come the next GPU family, the 5000 series (maybe with competition from AMD if the value is there, plus more VRAM and very likely much-improved RT performance), I will be changing my 3080 Ti for sure. I don't know if in a year or two, but it will have to happen.

I'm not buying a 4090 right now only because I would need a new PSU too (I've got a 750 W gold-rated one) and very likely more cooling/a new case, so the price for a 4090 would actually be close to 2.5k really, so I'm waiting for the next GPU gen to simply save enough hard cash for a whole new build.
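The bus-width concern can be put in numbers: memory bandwidth = (bus width ÷ 8) × effective data rate. A quick sketch using the commonly listed GDDR6X speeds for these cards (approximate figures):

```python
# Memory bandwidth in GB/s = (bus width in bits / 8 bits per byte) x data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# 4070 Ti: 192-bit bus at 21 Gbps; 3080 Ti: 384-bit bus at 19 Gbps.
print(bandwidth_gb_s(192, 21.0))  # -> 504.0
print(bandwidth_gb_s(384, 19.0))  # -> 912.0
```

So despite similar raster performance at 4K, the 4070 Ti has roughly 55% of the 3080 Ti's raw bandwidth on paper; Ada's much larger L2 cache is what compensates, and how well that holds up at 4K in heavy games is exactly the open question here.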
 

Nickolaidas

Member
RT-wise you just have to play the game on console settings. But to add gasoline to the fire, the 4070 Ti has only a 192-bit bus width; combined with only 12 GB of VRAM, that's a guarantee of big performance degradation at 4K max/close-to-max settings in many new games. Those games can very often be "lazy console ports", so you want enough oomph to brute-force them.

Like I mentioned, I bought a 3080 Ti (very close performance at 4K, maybe around 5% worse, and the same VRAM) and I can tell you older/less demanding games run smoothly at 4K max, at a stable 60. But looking at games whose base is the current-gen consoles, requirements (CPU, RAM, VRAM, GPU oomph) are getting bigger every month. I've got a 4K monitor too and want/need/would like to keep it smooth and stable at 60 for a while, but I can already feel it: come the next GPU family, the 5000 series (maybe with competition from AMD if the value is there, plus more VRAM and very likely much-improved RT performance), I will be changing my 3080 Ti for sure. I don't know if in a year or two, but it will have to happen.

I'm not buying a 4090 right now only because I would need a new PSU too (I've got a 750 W gold-rated one) and very likely more cooling/a new case, so the price for a 4090 would actually be close to 2.5k really, so I'm waiting for the next GPU gen to simply save enough hard cash for a whole new build.
My biggest pet peeve is that I don't want to go for those 500-600 watt-hungry graphics cards. The 4070 Ti is twice as strong as my 2080 Ti and asks for the same watts. That's a HUGE temptation for me.
 

PeteBull

Member
My biggest pet peeve is that I don't want to go for those 500-600 watt-hungry graphics cards. The 4070 Ti is twice as strong as my 2080 Ti and asks for the same watts. That's a HUGE temptation for me.
Worst comes to worst, you can always resell it in 2 years once the 5000-series GPUs launch :p
 