
"I Need a New PC!" 2022. The GPU drought continues...


JohnnyFootball

GerAlt-Right. Ciriously.
The PSU only came with two PCIe-to-PSU cables. The only extra one I have is the CPU-to-PSU cable, and Google says that has different pins, so it won't work.

I'm gonna get a new cable tomorrow. Can't believe this $1,000 GPU didn't come with a cable, especially since it needs three whereas other 3080s only need two.
What cable would you expect it to come with? GPU makers never supply PCIe cables, since each PSU brand uses its own connectors on the PSU side.
 

vpance

Member
Not yet. Will have to Google it because I don't think I've ever done it. I usually only mess with MSI Afterburner. If it's something I can do with that then I might give it a shot. I am terrified of messing with things in the BIOS.

It's pretty easy to do in MSI Afterburner. That's what I used when I still had my NV cards. Check this post and the top voted comment.



Worth doing even if you do end up replacing the card for a stable one since you can noticeably drop the power draw and heat.


Interesting comment in that thread

Nice and quick tutorial. I undervolted my 3080 down to 850mV@1860MHz to minimize the watts drawn by the card, since I'm 100 watts under the PSU recommendation. The UV was stable in Control but not in CP 2077. Even stock settings weren't able to stop my random crashes in CP 2077. A friend of mine with a 3080 as well managed to solve his crashes with an underclock, but this didn't help with mine. Ultimately the crashes stopped when the card was at 1V@1960MHz, and after a little optimization, 1V@2060MHz. I gained like 4-5 fps in CP 2077 at the cost of 2-3 °C more.
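For anyone curious how much an undervolt like that actually saves, dynamic power scales roughly with voltage squared times frequency, so the operating points quoted above can be ballparked. A rough sketch; the stock 1.08 V @ 1900 MHz baseline is an assumption for illustration, not a measured value:

```python
def relative_power(volts, mhz, base_volts, base_mhz):
    """Estimate power relative to a baseline operating point,
    assuming dynamic power ~ V^2 * f."""
    return (volts / base_volts) ** 2 * (mhz / base_mhz)

# Assumed stock-ish operating point for a 3080, for illustration only:
STOCK_V, STOCK_MHZ = 1.08, 1900

# Undervolt points quoted in the comment above:
for v, f in [(0.850, 1860), (1.000, 1960), (1.000, 2060)]:
    ratio = relative_power(v, f, STOCK_V, STOCK_MHZ)
    print(f"{int(v * 1000)} mV @ {f} MHz -> ~{ratio:.0%} of stock power")
```

The 850 mV point works out to roughly 60% of stock power, which is why the commenter saw such a large drop in watts before stability forced them back up to 1 V.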
 

SlimySnake

Flashless at the Golden Globes
It's pretty easy to do in MSI Afterburner. That's what I used when I still had my NV cards. Check this post and the top voted comment.



Worth doing even if you do end up replacing the card for a stable one since you can noticeably drop the power draw and heat.


Interesting comment in that thread

Thanks I will give it a shot.
What cable would you expect it to come with? GPU makers never supply PCIe cables, since each PSU brand uses its own connectors on the PSU side.
Yeah, I realized that after I started shopping for cables. Didn't realize cables are not compatible across different power supply brands.
 

SlimySnake

Flashless at the Golden Globes
Well, I am pretty sure the GPU is defective. Got a third PCIe-to-PSU cable today. Hooked it up. Matrix didn't crash immediately, but eventually it did. I noticed my UPS has a 600-watt battery backup and I was going over it, so I removed the PC from there in case that was interfering and hooked it up to a surge protector. No dice. Matrix still crashed. Ran Cyberpunk. It didn't crash, but I noticed the CPU was running really hot. Looked inside and only one of the two radiator fans was running. Unplugged it and plugged it back in. With both fans running, temps are fine. Cyberpunk runs fine. RDR2 ran fine. I was about to call it a night and boom, blue screen of death. Something wrong with the hardware, it says.

Going to hook my RTX 2080 back in there tomorrow and see if these games still crash, but I am about ready to return this fucking card lol

P.S. The i7-11700KF consumes up to 150W while running the Matrix demo. If it wasn't for RDR2 and Control crashing, I would've been sure it was my CPU. RDR2 maxes the CPU out at 90W, which itself is crazy, but man, UE5 is really taxing the CPU. Even Cyberpunk tops out at 130W.
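A quick back-of-the-envelope shows why a 600-watt UPS battery budget gets blown by this build. Only the GPU and CPU peaks come from the post; the other component figures are rough assumptions:

```python
# Back-of-the-envelope total for everything hanging off the UPS.
# GPU and CPU peaks are the figures reported in the post; the other
# numbers are rough assumptions, not measurements.
draws_watts = {
    "RTX 3080 12GB (peak)": 400,  # post reports 400+ W spikes
    "i7-11700KF (peak)": 150,     # post reports up to 150 W
    "motherboard/RAM/fans": 60,   # assumed
    "SSDs and peripherals": 20,   # assumed
    "monitor": 40,                # assumed, if on the same UPS
}

total_watts = sum(draws_watts.values())
print(f"Estimated peak draw: {total_watts} W")
if total_watts > 600:
    print("Over the 600 W UPS battery budget")
```

Even with conservative guesses for everything besides the GPU and CPU, peaks land comfortably past 600 W, so the UPS cutting in under load is expected behavior rather than a fault.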
 

GreatnessRD

Member
Well, I am pretty sure the GPU is defective. Got a third PCIe-to-PSU cable today. Hooked it up. Matrix didn't crash immediately, but eventually it did. I noticed my UPS has a 600-watt battery backup and I was going over it, so I removed the PC from there in case that was interfering and hooked it up to a surge protector. No dice. Matrix still crashed. Ran Cyberpunk. It didn't crash, but I noticed the CPU was running really hot. Looked inside and only one of the two radiator fans was running. Unplugged it and plugged it back in. With both fans running, temps are fine. Cyberpunk runs fine. RDR2 ran fine. I was about to call it a night and boom, blue screen of death. Something wrong with the hardware, it says.

Going to hook my RTX 2080 back in there tomorrow and see if these games still crash, but I am about ready to return this fucking card lol

P.S. The i7-11700KF consumes up to 150W while running the Matrix demo. If it wasn't for RDR2 and Control crashing, I would've been sure it was my CPU. RDR2 maxes the CPU out at 90W, which itself is crazy, but man, UE5 is really taxing the CPU. Even Cyberpunk tops out at 130W.
Damn that sucks pimp. Just return it and use that beefy 2080 until the new heat comes out later this year. Or at least that's what I'd do, lol. How much did you pay for the 3080?

EDIT: Could RMA as well if you really wanna keep it, too.
 

SlimySnake

Flashless at the Golden Globes
Damn that sucks pimp. Just return it and use that beefy 2080 until the new heat comes out later this year. Or at least that's what I'd do, lol. How much did you pay for the 3080?

EDIT: Could RMA as well if you really wanna keep it, too.
I don't really want to keep it, and Microcenter will give me a new one if I wanted anyway. They still have like 20 of those cards in stock.

Paid $999 + tax for the 12 GB 3080. I like the performance and would've kept it if it didn't have this many issues, but I can't justify a 600+ watt gaming system. Hopefully their new cards are on a smaller node and are more power efficient.

Will also go with an AMD CPU. Made a huge mistake sticking with Intel just to save some money on the motherboard, which I ended up replacing anyway. Never again.
 
I don't really want to keep it, and Microcenter will give me a new one if I wanted anyway. They still have like 20 of those cards in stock.

Paid $999 + tax for the 12 GB 3080. I like the performance and would've kept it if it didn't have this many issues, but I can't justify a 600+ watt gaming system. Hopefully their new cards are on a smaller node and are more power efficient.

Will also go with an AMD CPU. Made a huge mistake sticking with Intel just to save some money on the motherboard, which I ended up replacing anyway. Never again.
What cooler do you have on your 11700k?

It's really a pretty good CPU, no need to upgrade just yet... Not a horrible choice, but not the best either. A horrible choice would have been the 11900K, which is no better than the i7 lol.

If you want 3080 performance but at lower watts maybe get the 4060 on release?
 
Looks like I will be getting the Scythe Fuma 2 Rev. B as my new CPU cooler, as it's the best thing that'll fit in my Phanteks P400A case. Was just using the little Wraith Stealth cooler on my 1600 AF so far, never had an issue though.

That said, I wonder if the Scythe is good enough for a 5800X3D or 5900X not to throttle, or to be able to hit max auto OC.

That 5900X, I told myself I wouldn't get one, but I might be able to snag one at $350, and at that price it's pretty hard to pass up. At this point I don't think anything I go with is a bad choice (except the 5800X, which is just a hotter, overpriced 5700X)... I just want to upgrade already!
 

SlimySnake

Flashless at the Golden Globes
What cooler do you have on your 11700k?

It's really a pretty good CPU, no need to upgrade just yet... Not a horrible choice, but not the best either. A horrible choice would have been the 11900K, which is no better than the i7 lol.

If you want 3080 performance but at lower watts maybe get the 4060 on release?
An Arctic Freezer 2 240mm. It's 30 degrees Celsius at idle in a 74-degree-Fahrenheit room. I was able to get it down to 20 degrees during summer by placing the PC in front of the AC vent so it would suck in all the cold air, but that's overkill.

The CPU performance is fine, but the power consumption is nuts and the heat it generates warms my entire system. Probably won't upgrade until next year, but I'm done with these high-TDP parts for now. The PS5 is 200 watts. I get that the 3080 is giving me 2x the performance, but at 3x the power consumption. I'd rather just wait for the PS5 Pro.
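That perf-per-watt complaint can be made concrete with the rough ratios stated in the post (these are the poster's own approximations, not benchmark numbers):

```python
# Rough efficiency comparison using the ratios stated in the post:
# the 3080 rig does ~2x the PS5's work at ~3x the PS5's 200 W draw.
ps5_perf, ps5_watts = 1.0, 200
pc_perf, pc_watts = 2.0, 600  # ~3x the PS5's draw

efficiency_ratio = (pc_perf / pc_watts) / (ps5_perf / ps5_watts)
print(f"3080 rig perf-per-watt vs PS5: {efficiency_ratio:.2f}x")
```

Double the performance at triple the power works out to about two-thirds of the console's efficiency, which is the crux of the complaint.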
 

YeulEmeralda

Linux User
I remember the days when you could buy a whole mid-range PC for $1,000, not just the GPU :(
I remember a time when technology moved so fast you had to buy a new PC every 3 years.

At least today that $1,000 GPU will last you a long time if you're willing to play at 1080p.
 

JohnnyFootball

GerAlt-Right. Ciriously.
If you want 3080 performance but at lower watts maybe get the 4060 on release?
I love how you casually throw this out there like 4060s will be easily available to purchase at launch. It's like you haven't paid even a little bit of attention to the last two years.

Either way, the 4000-series GPUs will be scarce at launch and will be scalped by bots for a while. That's even without having to contend with mining, which will remain a threat until Ethereum makes the proof-of-stake switch.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Well, I am pretty sure the GPU is defective. Got a third PCIe-to-PSU cable today. Hooked it up. Matrix didn't crash immediately, but eventually it did. I noticed my UPS has a 600-watt battery backup and I was going over it, so I removed the PC from there in case that was interfering and hooked it up to a surge protector. No dice. Matrix still crashed. Ran Cyberpunk. It didn't crash, but I noticed the CPU was running really hot. Looked inside and only one of the two radiator fans was running. Unplugged it and plugged it back in. With both fans running, temps are fine. Cyberpunk runs fine. RDR2 ran fine. I was about to call it a night and boom, blue screen of death. Something wrong with the hardware, it says.

Going to hook my RTX 2080 back in there tomorrow and see if these games still crash, but I am about ready to return this fucking card lol

P.S. The i7-11700KF consumes up to 150W while running the Matrix demo. If it wasn't for RDR2 and Control crashing, I would've been sure it was my CPU. RDR2 maxes the CPU out at 90W, which itself is crazy, but man, UE5 is really taxing the CPU. Even Cyberpunk tops out at 130W.
I'm going to ask some basic questions here:
1. Did you update the firmware on the GPU? It's something that a lot of people don't do, since it requires the EVGA Precision software. If you haven't done that, then do it. My GPU, which is the exact same one as yours, required a firmware update when I installed it. Even if you don't plan on keeping the GPU, it'll be good information to have.

2. Is the motherboard BIOS up to date? ASUS boards have been doing funny things with NVIDIA GPUs as of late. I'd watch JayzTwoCents' video on his problem with a 3080 Ti; it's a different GPU, but it's a very educational video on how simple things can make hardware appear defective when it's not.

3. Is Windows up to date? Sometimes a major software update doesn't show up immediately. Are you using Windows 11? If not, you should.

Also, please tell me how you are testing your system. Since you and I have similar GPUs, I want to put mine through the wringer, since I'm now slightly worried that this particular series of 3080s could be a bad batch, given they've been the most heavily discounted. So far I've had no issues, but I haven't really put it through demanding scenarios. I played a little bit of Doom Eternal, Rage 2 and Dishonored, which are not very taxing. I'll try RDR2 and see. I'll also run the Horizon Zero Dawn benchmark and see how that goes.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Really? I just assumed it was some kind of refresh; don't the 3070 and 3080 only have 8GB? Anyhow, it's good.
The 12GB is wasted anyway. Kind of annoys me, since those memory modules could be used on other things that would actually benefit from them.
 
I love how you casually throw this out there like 4060s will be easily available to purchase at launch. It's like you haven't paid even a little bit of attention to the last two years.

Either way, the 4000-series GPUs will be scarce at launch and will be scalped by bots for a while. That's even without having to contend with mining, which will remain a threat until Ethereum makes the proof-of-stake switch.
He's got a 2080, he can wait to get one. Not like he's got a GTX 970.

I have a 3060 through EVGA and hopefully their Elite program will let me snag a 4060+.
Not to be mean, but why get a Z690 motherboard with that CPU? You basically threw away $100 when a B660 is much cheaper and more appropriate for your CPU.
That's the smartest way to do it for a platform that's not done getting new generations of chips. He got the best-value chip and now he's got a banging board for an 8-core 13th gen.

Just like I got a 1600 AF (6 cores, 12 threads) for 85 bucks and an X570 mobo... Now I can buy a Zen 3 at a discount.
 

JohnnyFootball

GerAlt-Right. Ciriously.
He's got a 2080, he can wait to get one. Not like he's got a GTX 970.

I have a 3060 through EVGA and hopefully their Elite program will let me snag a 4060+.

That's the smartest way to do it for a platform that's not done getting new generations of chips. He got the best-value chip and now he's got a banging board for an 8-core 13th gen.

Just like I got a 1600 AF (6 cores, 12 threads) for 85 bucks and an X570 mobo... Now I can buy a Zen 3 at a discount.
It's not the smartest way to do it, since a B660 will accomplish the exact same thing and that $100 can be put toward a far better CPU like the 12600, which should easily take him through a few generations with no need to upgrade. If he went with a 12600K, the gains he would get from something like a 13900K would likely be minimal and not worth the extra cost. As for you buying a 1600X and an X570, that was an even dumber move. You could have easily saved well over $150 by going with B450 and pairing it with the drastically superior 3600. That would easily have carried you through until AM5 came out.

There is a reason that system builders and YouTubers like Hardware Unboxed, Gamers Nexus, JayzTwoCents and Paul's Hardware advise against pairing a weak CPU with a strong motherboard: more often than not that extra cost ends up being wasted.
 

SlimySnake

Flashless at the Golden Globes
I'm going to ask some basic questions here:
1. Did you update the firmware on the GPU? It's something that a lot of people don't do, since it requires the EVGA Precision software. If you haven't done that, then do it. My GPU, which is the exact same one as yours, required a firmware update when I installed it. Even if you don't plan on keeping the GPU, it'll be good information to have.

2. Is the motherboard BIOS up to date? ASUS boards have been doing funny things with NVIDIA GPUs as of late. I'd watch JayzTwoCents' video on his problem with a 3080 Ti; it's a different GPU, but it's a very educational video on how simple things can make hardware appear defective when it's not.

3. Is Windows up to date? Sometimes a major software update doesn't show up immediately. Are you using Windows 11? If not, you should.

Also, please tell me how you are testing your system. Since you and I have similar GPUs, I want to put mine through the wringer, since I'm now slightly worried that this particular series of 3080s could be a bad batch, given they've been the most heavily discounted. So far I've had no issues, but I haven't really put it through demanding scenarios. I played a little bit of Doom Eternal, Rage 2 and Dishonored, which are not very taxing. I'll try RDR2 and see. I'll also run the Horizon Zero Dawn benchmark and see how that goes.
1. I did not.
2. I remember updating it a few months ago. I only bought it in August of last year and definitely updated it back then. I have an MSI Tomahawk.
3. Yeah, updated it the other day after I started having issues.
4. I simply run four games maxed out at unlocked framerates and at 4K. No benchmarks, I just drive around in the world: Cyberpunk, the Matrix demo, RDR2 and Control. Initially, everything crashed within minutes despite me updating my drivers. Matrix would crash in seconds. After doing a clean install of the drivers, I went from crashing every five minutes in Cyberpunk, Control and RDR2 to not crashing in Cyberpunk at all (ran almost 35 minutes just driving around), while crashing in RDR2 and Matrix every 5 minutes. Again, just riding/driving around, getting into shootouts every now and then. Then finally, after I added the third PSU cable, I can now go 1 hour in Cyberpunk and 20 minutes in Matrix and RDR2. Haven't tested Control yet.

I really won't know if it's the GPU for sure until I go back to my RTX 2080 and run the same games again. It could still be my CPU, but the errors I've gotten on blue screens and on Matrix crashes have indicated GPU issues. I don't want to put too much stock into those Matrix errors, since it's just a demo compiled by a random dude on the internet, but it's the one game that taxes my GPU the most. It routinely hits GPU clocks of 2050 and consumes over 400 watts. My CPU also goes up to 150 watts simply trying to run the demo at 45 fps. In Cyberpunk, the CPU maxes out around 120 watts, but only if I really push the framerate to around 100 fps by reducing the resolution. But the same CPU never caused any crashes when I was playing the Matrix demo almost religiously just a couple of weeks ago on my 2080, so maybe you're right and the 3080 hasn't been pushed to its limits in games quite like the Matrix is doing. Like I said, Cyberpunk runs fine now, but it also never hits 400 watts.
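One way to narrow down crashes like these is to keep a telemetry log while reproducing them, so the last logged sample shows clocks and power right before a hang. A minimal sketch: the `nvidia-smi` query fields in the comment are real, but the sample line is made up so the parsing can be shown without a GPU present.

```python
import csv
from io import StringIO

# In practice you would poll something like this once a second and
# append each line to a log file:
#   nvidia-smi --query-gpu=power.draw,clocks.gr,temperature.gpu \
#              --format=csv,noheader,nounits
# The sample line below is hypothetical, for illustration.
sample_line = "412.3, 2050, 74\n"

def parse_sample(line):
    """Parse one 'nounits' CSV sample into a dict of readings."""
    power, clock, temp = next(csv.reader(StringIO(line)))
    return {
        "power_w": float(power),
        "clock_mhz": int(clock.strip()),
        "temp_c": int(temp.strip()),
    }

sample = parse_sample(sample_line)
if sample["power_w"] > 400:
    print(f"High draw: {sample['power_w']} W at {sample['clock_mhz']} MHz")
```

If the log consistently ends on 400+ W / ~2050 MHz samples right before a crash, that points at a power or stability problem at the card's peak operating point rather than at the CPU.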
 
It's not the smartest way to do it, since a B660 will accomplish the exact same thing and that $100 can be put toward a far better CPU like the 12600, which should easily take him through a few generations with no need to upgrade. If he went with a 12600K, the gains he would get from something like a 13900K would likely be minimal and not worth the extra cost. As for you buying a 1600X and an X570, that was an even dumber move. You could have easily saved well over $150 by going with B450 and pairing it with the drastically superior 3600. That would easily have carried you through until AM5 came out.

There is a reason that system builders and YouTubers like Hardware Unboxed, Gamers Nexus, JayzTwoCents and Paul's Hardware advise against pairing a weak CPU with a strong motherboard: more often than not that extra cost ends up being wasted.
85 bucks for a Zen+ 6-core is dumber than a $200 Zen 2? You do know which one of those had the much better perf per dollar, right? Calling the 3600 drastically superior is stupid.

Going from Zen+ to Zen 3 is dumb, but going from a 2080 to a 3080 is genius?

How is it wasted now that I have PCIe gen 4 (which I have just bought an SSD for) and already have the platform to upgrade to the cheapest/best Zen 3?

This is all rhetorical btw :)
 
As for you buying a 1600X
1600 AF btw, not 1600X. $85 for Zen+ cores. And that deal of the century was stupid xD

"As for you buying a 1600X and an X570, that was an even dumber move. You could have easily saved well over $150 by going with B450 and pairing it with the drastically superior 3600."

My X570 plus 1600 AF cost less than $300, so your math is way off. And this way I have a much better mobo.

Your supposed upgrade path to Zen 4 would have cost me more for a new mobo and RAM, instead of just grabbing an 8-core Zen 3 or a 5900X at the lowest cost.
 

JohnnyFootball

GerAlt-Right. Ciriously.
1. I did not.
2. I remember updating it a few months ago. I only bought it in August of last year and definitely updated it back then. I have an MSI Tomahawk.
3. Yeah, updated it the other day after I started having issues.
4. I simply run four games maxed out at unlocked framerates and at 4K. No benchmarks, I just drive around in the world: Cyberpunk, the Matrix demo, RDR2 and Control. Initially, everything crashed within minutes despite me updating my drivers. Matrix would crash in seconds. After doing a clean install of the drivers, I went from crashing every five minutes in Cyberpunk, Control and RDR2 to not crashing in Cyberpunk at all (ran almost 35 minutes just driving around), while crashing in RDR2 and Matrix every 5 minutes. Again, just riding/driving around, getting into shootouts every now and then. Then finally, after I added the third PSU cable, I can now go 1 hour in Cyberpunk and 20 minutes in Matrix and RDR2. Haven't tested Control yet.

I really won't know if it's the GPU for sure until I go back to my RTX 2080 and run the same games again. It could still be my CPU, but the errors I've gotten on blue screens and on Matrix crashes have indicated GPU issues. I don't want to put too much stock into those Matrix errors, since it's just a demo compiled by a random dude on the internet, but it's the one game that taxes my GPU the most. It routinely hits GPU clocks of 2050 and consumes over 400 watts. My CPU also goes up to 150 watts simply trying to run the demo at 45 fps. In Cyberpunk, the CPU maxes out around 120 watts, but only if I really push the framerate to around 100 fps by reducing the resolution. But the same CPU never caused any crashes when I was playing the Matrix demo almost religiously just a couple of weeks ago on my 2080, so maybe you're right and the 3080 hasn't been pushed to its limits in games quite like the Matrix is doing. Like I said, Cyberpunk runs fine now, but it also never hits 400 watts.
Yeah, update the GPU firmware first. Even if the 2080 works without issue, that won't conclusively prove it's the GPU, although it would point strongly in that direction.

Another thing to consider is the memory. Try running it at non-XMP settings. My NVMe drive caused issues similar to yours, though not in gaming. That could also be a culprit.

Update the motherboard BIOS too; if you got that motherboard last August, it could be having issues with your GPU. Do not underestimate the value of clearing your CMOS either. It's an overlooked move that can remove settings you may have accidentally changed. Since you mentioned you have the latest version of Windows 11, it's possible that it's conflicting with an older BIOS.

What I would do:
1. Update the MB BIOS and GPU firmware. You will need to download EVGA Precision for the latter. No ifs, ands or buts, this MUST be done, even if you've already decided to return the GPU.

2. After the updates, try the 2080 again. If there's still an issue, try memory settings and such.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Here is one of the better pre-built PC builds I have seen. $1,600 isn't a great deal, but it's the best deal I have seen in a while. Something to consider.


 

twilo99

Member
I think that new 5800x3d is a great stop gap until things with DDR5 and AM5 settle down sometime in the next 12-16 months
 
Here is one of the better pre-built PC builds I have seen. $1,600 isn't a great deal, but it's the best deal I have seen in a while. Something to consider.


Not bad at all spec-wise, except that I can tell it will have horrible cooling, and you'd have to either get a new case or really muck about trying to force an AIO into that case plus extra case fans.

Here I am wondering if I can cool a 5800X3D or not in my P400A case with 4 120mm fans and a Fuma 2 Rev. B CPU cooler, and then I look at this pre-built with an 11700...

Specs are good though. Probably poor latency on the RAM, but it's 3200 MHz, so good enough to keep. Not sure if the M.2 drive is Gen 3 or just SATA speed.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Not bad at all spec-wise, except that I can tell it will have horrible cooling, and you'd have to either get a new case or really muck about trying to force an AIO into that case plus extra case fans.

Here I am wondering if I can cool a 5800X3D or not in my P400A case with 4 120mm fans and a Fuma 2 Rev. B CPU cooler, and then I look at this pre-built with an 11700...

Specs are good though. Probably poor latency on the RAM, but it's 3200 MHz, so good enough to keep. Not sure if the M.2 drive is Gen 3 or just SATA speed.
In normal times that would be $1200-$1400 build.
 

JohnnyFootball

GerAlt-Right. Ciriously.


Right now, the best deal is the 6900 XT for $1,000, if RTX doesn't matter to you: that card is right at its MSRP and is faster than the 3080 12GB in most non-ray-traced apps. If RTX absolutely does not matter to you, I'd go with that one. The 6800 XT still sucks in price at $800. However, these GPUs are limited by a 256-bit memory bus and slower RAM, so it MAY be worthwhile to see what the upcoming 6850/6950 can bring to the table. But it's very much a wait-with-caution situation.

Unfortunately, it appears correct that NVIDIA has more or less discontinued the vanilla 3070 and 3080. You CAN buy a 10GB 3080, but the cheapest price I have seen is $920 from EVGA. At that price, it makes sense to spend $1,000 for the 12GB version from EVGA. I keep talking about EVGA as they are pretty much the only brand I buy from; sadly, this has more or less made me NVIDIA exclusive. I have heard Sapphire is the AMD equivalent in this regard. The 3070 Ti can be had for $750 in some cases.

Nevertheless, if you're holding out for anything close to a $499 3070 or a $699 3080, you are probably out of luck.
 

Xdrive05

Member
Pricing trends are looking great right now! It's interesting how the higher end is getting closer to MSRP than the lower end. The 3050 will probably be stuck at $300+ until or unless the 6600 drops even more; the low end doesn't seem to drive toward MSRP as quickly as the 3080s and up for some reason. That "$200 hero card" concept (shoutout 2kliksphilip) will continue to be elusive, though an actual $250 3050, should that ever exist, can probably get that nod considering inflation.

I really wish every tier hit the shelves at gen launch, instead of taking a year or more for the mid- and low-end offerings to show up. My 3050 will hold me over until the 4060 is a thing (and hopefully the 3070 Ti-equivalent rumors are true), but that will probably be, what, two years from now? That's a bummer for those of us who can't justify $1,000+ upgrades.

EDIT: forgot to mention, there is an RX 6400 now! AMD's $160 offering is basically an RX 6500 XT cut down by 25%, 12/12 instead of 16/16. The big win for this one is the 51 W power draw, so no PSU power cable needed. But the same shit 4 lanes and lack of encoder support apply. AMD handicapped themselves by using their laptop design in these lowest-end offerings, meaning the 4 lanes can't be remedied unless you use a PCIe 4.0 board, which you almost certainly wouldn't if you're in this tier of the market anyway. Further fucked by taking out the encoding support, which would have been the one saving grace giving these cards special value. It's a shame.

There's still kind of a very narrow market case for low-powered "better than GTX 1650" performance (on 4.0) at a good price, which is not nothing, I guess. So then I wonder if there's room for price drops on the 6500 XT and 6400. $150 and $100 respectively would change things quite a bit there.
 
What's the most popular or best 360 AIO these days?
Maybe see if Hardware Unboxed has a recommendation.

Getting that 5800X3D, mate?

I think the Fuma 2 Rev. B air cooler I ordered will do the trick based on reviews. It beats the Noctua NH-U12A and still fits in my P400A case.

Never did water cooling, as I try to avoid the CPUs that run like hot lava and I'd be worried about leaks lol.
 

Celcius

°Temp. member
Maybe see if Hardware Unboxed has a recommendation.

Getting that 5800X3D, mate?

I think the Fuma 2 Rev. B air cooler I ordered will do the trick based on reviews. It beats the Noctua NH-U12A and still fits in my P400A case.

Never did water cooling, as I try to avoid the CPUs that run like hot lava and I'd be worried about leaks lol.
No, I've got a 10700K overclocked to 4.9 GHz all-core, and running a benchmark that puts all cores and threads at 100% pushes my old Noctua NH-D14 right to the edge of its capabilities. I'm wondering if a 360 AIO would let me push it to 5 GHz while also running a little cooler.
 

GreatnessRD

Member
The 5800X3D is selling better than I thought it would. My Microcenter had 10 and now they're down to 3. Might have to expedite my decision on whether to pick it up or not, lol. If it doesn't get restocked I ain't trippin', the 5700X would be a nice consolation prize, too.
 
The 5800X3D is selling better than I thought it would. My Microcenter had 10 and now they're down to 3. Might have to expedite my decision on whether to pick it up or not, lol. If it doesn't get restocked I ain't trippin', the 5700X would be a nice consolation prize, too.
I feel you on that. I wanted to wait so bad, but I got the FOMO lol. Saw the X3D was 40 to 50% faster than the 5800X in some older games that I actually play and it swayed me, but the 5700X will be a good chip.

I have been thinking that the 5700X will eventually hit $250-260, probably.

The X3D seems scarce, couldn't wait.
 
No, I've got a 10700K overclocked to 4.9 GHz all-core, and running a benchmark that puts all cores and threads at 100% pushes my old Noctua NH-D14 right to the edge of its capabilities. I'm wondering if a 360 AIO would let me push it to 5 GHz while also running a little cooler.
Well, if you plan on reusing the 360 AIO on your next build, I don't see the harm.
 

SlimySnake

Flashless at the Golden Globes


Right now, the best deal is the 6900 XT for $1,000, if RTX doesn't matter to you: that card is right at its MSRP and is faster than the 3080 12GB in most non-ray-traced apps. If RTX absolutely does not matter to you, I'd go with that one. The 6800 XT still sucks in price at $800. However, these GPUs are limited by a 256-bit memory bus and slower RAM, so it MAY be worthwhile to see what the upcoming 6850/6950 can bring to the table. But it's very much a wait-with-caution situation.

Unfortunately, it appears correct that NVIDIA has more or less discontinued the vanilla 3070 and 3080. You CAN buy a 10GB 3080, but the cheapest price I have seen is $920 from EVGA. At that price, it makes sense to spend $1,000 for the 12GB version from EVGA. I keep talking about EVGA as they are pretty much the only brand I buy from; sadly, this has more or less made me NVIDIA exclusive. I have heard Sapphire is the AMD equivalent in this regard. The 3070 Ti can be had for $750 in some cases.

Nevertheless, if you're holding out for anything close to a $499 3070 or a $699 3080, you are probably out of luck.

Yeah, the 6900 XT benchmarks surprised me, especially compared to the $999 RTX 3080 12 GB.

Crucially, the 6900 XT is beating out the 3080 in UE5. Only by a couple of fps, but it's still very impressive. Cyberpunk, Control and Metro are its three worst performers, but with CD Projekt and Remedy both going with UE5, it's safe to say that only one or two studios will be using the traditional ray tracing models that benefit RTX GPUs.

Here are all the studios using UE5. Pretty much all first-party MS studios. We know Square Enix is using UE5 for Kingdom Hearts; they already used UE4 for FF7, so they will likely use UE5 for its sequel. We know Crystal Dynamics will be using UE5 for Tomb Raider. Respawn famously used UE4 instead of Frostbite. Even BioWare is using UE4 for the next Mass Effect.

[Image: list of game developers working on Unreal Engine 5 games]


That leaves Ubisoft studios, who are pushing ray tracing really hard in games like Avatar, but those engines typically favor AMD cards. As well as Sony studios, who likely won't be using RT cores too much since they have to develop for the AMD GPU in the PS5.

Long story short, the 6900 XT is not a bad investment at all, especially since it only consumes 250-260 watts while the 12 GB 3080 can consistently go over 400 watts.
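As a footnote, that wattage gap compounds into running cost too. A quick sketch using the wattage figures from the post; the gaming hours and electricity price are assumptions:

```python
# Yearly electricity cost difference between a ~400 W 3080 12GB and a
# ~260 W 6900 XT, using the wattage figures from the post. Gaming
# hours per day and price per kWh are assumptions, not data.
HOURS_PER_DAY = 3       # assumed
USD_PER_KWH = 0.15      # assumed

def yearly_cost(watts):
    """Cost of running a load at `watts` for a year of daily gaming."""
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * USD_PER_KWH

extra = yearly_cost(400) - yearly_cost(260)
print(f"Extra cost per year: ${extra:.2f}")
```

Under these assumptions the difference is around $23 a year, so the real argument against the 400-watt card is heat and PSU/UPS headroom rather than the electricity bill.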
 

SlimySnake

Flashless at the Golden Globes
Here is one of the better pre-built PC builds I have seen. $1,600 isn't a great deal, but it's the best deal I have seen in a while. Something to consider.


That's an excellent deal, but I have to warn everyone about the i7-11700KF. It is the unlocked version that routinely goes over 100 watts; it spends 140 watts running Cyberpunk and UE5 even at around 40 fps. It also runs very, very hot. It took me A LOT of time and money to get this thing to stop running at 45 degrees while idling and 80+ degrees in every game, and that was while using a Cooler Master AIO. I basically had to buy a better Arctic Freezer 240 mm AIO, then four 140 mm case fans, and remove the front panel altogether and replace it with a mesh cover to bring it down to 30-35 degrees.

These pre-built cases don't have good airflow for power-hungry, hot Intel CPUs. They might be fine for AMD CPUs, but getting this CPU under control was a pain. You might have to spend $100 on an AIO cooler and another $100 on a new case. Plus some new RAM, because these pre-built PCs come with really cheap RAM that actually holds your PC back. Speaking from experience here.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
I think that new 5800X3D is a great stopgap until things with DDR5 and AM5 settle down sometime in the next 12-16 months
The performance of this absolutely blew me away. The price isn't great, but you can pair it with a cheaper B550 board; compared against the power draw and price of Intel's most expensive chips, it wins pretty decisively. The 12700K currently seems to be the best Intel value.
 
Got my NVMe drive installed. Had to wait a bit longer because when it got here I realized it didn't have a heatsink, so I picked up a be quiet! heatsink for like 13 bucks.

The drive is CRAZY fast; I really didn't expect it to be this much more responsive than the SATA drive. And I'm still on the 6-core 1600AF: from pushing the start button, it takes 5 seconds to load Windows. Man, how snappy is this thing going to be with the 5800X3D :messenger_grinning_sweat:

Feels like this PC is going to be all I need, sans GPU upgrades, for the whole gen tbh.
 

Klik

Member
A few days ago I bought a gaming PC after being a console player for years, and honestly I think it's one of the best decisions I've made in my life🤣

Bought an i5-12600K
2x16GB RAM
SSD (6,000 MB/s)
27" 1440p 144Hz monitor
RTX 3050 Ti

I bought the 3050 Ti used because I will sell it in a few months and get a new 4xxx-series card
 
Last edited:

CrustyBritches

Gold Member
I had queued for a 3060 12GB and a 3060 Ti. Ended up selling the 3060 since, by the time I got it, I had already bought a 3060 laptop. The 3060 Ti purchase notification just came up. It was the card I really wanted, and after rebate it would be $489. Already got a Steam Deck for my son, and my 512GB will be coming soon.

I just don't know if I'm really into desktop gaming as much as before. Not sure if I should pull the trigger or not. I have the 3060 laptop, so it's only 6GB VRAM and 85W, which is like a desktop 2060 6GB. The 3060 Ti would be about a 60% jump and would eliminate the VRAM bottleneck, but for $489 it's a hard decision to make.


Passed on it. I think I've moved into a stage where I prefer the convenience and flexibility of laptop and handheld PC gaming.
 
Last edited:

manfestival

Member
Dang it. Amazon sent me the 5700X by mistake instead of the X3D 😠

Now I have to wait until May 13th :/ They said I could use the 5700X until I return it, so maybe I'll test it out in the meantime.
yooooo lol what the. I guess it is gonna go straight to the warehouse used section. You may as well use it! Take some benchmarks and do your own Hardware Unboxed style of benchmarking cause... why not? lol
 
yooooo lol what the. I guess it is gonna go straight to the warehouse used section. You may as well use it! Take some benchmarks and do your own Hardware Unboxed style of benchmarking cause... why not? lol
I think I will do that. Presumably the RAM XMP settings should be the same for both Ryzen 7 chips.

The Scythe Fuma Rev. B cooler comes tomorrow, so I'll open it then. I can test three processors: the 1600AF (aka 2600), the 5700X and the X3D.

I've even picked out the games to test, a mix of old and new to see how the 3D cache reacts.

Witcher 3, Metro Exodus, Kingdom Come: Deliverance (since I know it really is affected by cache) and Crysis Warhead. 🤔 Heck, I'll throw Spyro Reignited in there since I'm currently playing it (runs at 120fps 1080p on the 1600AF; it could probably hit 1440p, but my TV can't do 1440p 120)

Normally I would test at 1080p, but since I "only" have an RTX 3060, I am going to test at 720p to see how the X3D will scale with the new 4xxx series and onward.
 
Last edited:

manfestival

Member
I think I will do that. Presumably the RAM XMP settings should be the same for both Ryzen 7 chips.

The Scythe Fuma Rev. B cooler comes tomorrow, so I'll open it then. I can test three processors: the 1600AF (aka 2600), the 5700X and the X3D.

I've even picked out the games to test, a mix of old and new to see how the 3D cache reacts.

Witcher 3, Metro Exodus, Kingdom Come: Deliverance (since I know it really is affected by cache) and Crysis Warhead. 🤔 Heck, I'll throw Spyro Reignited in there since I'm currently playing it (runs at 120fps 1080p on the 1600AF; it could probably hit 1440p, but my TV can't do 1440p 120)

Normally I would test at 1080p, but since I "only" have an RTX 3060, I am going to test at 720p to see how the X3D will scale with the new 4xxx series and onward.
Yeah, I might be a bit of a turbo nerd, but I do a bunch of benchmarking on any hardware I get. I just love to see what kind of real-world benefit I am getting from these changes. Plus... I kinda enjoy it. If you have any Total War games, I strongly recommend using one of those for testing, especially if you can Game Pass it. Total War: Warhammer III is an absolute beast for benchmarking
 
Yeah, I might be a bit of a turbo nerd, but I do a bunch of benchmarking on any hardware I get. I just love to see what kind of real-world benefit I am getting from these changes. Plus... I kinda enjoy it. If you have any Total War games, I strongly recommend using one of those for testing, especially if you can Game Pass it. Total War: Warhammer III is an absolute beast for benchmarking
Oh I agree, it's fascinating. I don't have Total War, but I'd test it if possible. I know StarCraft 2 is like 50% faster on the X3D vs the 5700X though.

Also curious to see how far this 1600AF can be pushed at 720p, because until recently I have played games at higher res @ 60fps... But 120fps, even at 1080p, is a game changer. I'll definitely target 120fps from now on. With black frame insertion on my X900E Bravia at 120fps, it almost doesn't feel like a game anymore. Played the original Metro 2033 (amazing graphics still) last night at 1080p, 4x MSAA; it ran 120fps like a champ. Not locked, but the 1% lows are above 100 fps. We'll see if that's a CPU bottleneck or not.
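For anyone wondering how that "1% low" number is actually computed from a frametime log (the kind of CSV that PresentMon or MSI Afterburner can capture), here's a rough sketch. One common definition, assumed here, is the average FPS over the slowest 1% of frames; some tools use a percentile cutoff instead, so don't expect exact matches between tools.

```python
# Sketch: average FPS and "1% low" FPS from a list of frametimes
# in milliseconds. "1% low" here = average FPS over the slowest
# 1% of frames (one common definition; tools vary).

def fps_stats(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # Slowest 1% of frames (at least one frame)
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Example: 99 frames at 8 ms (~125 fps) plus one 12 ms hitch.
frames = [8.0] * 99 + [12.0]
avg, low = fps_stats(frames)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```

A single hitch barely moves the average but drags the 1% low way down, which is why the 1% low is the better "does it feel smooth" number.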
 
Last edited:

Hezekiah

Banned
I see 3080s getting close to MSRP levels; well, I see them going for ~£850.

Anyone thinking of buying one? I can't help but feel it's better to wait till the second half of the year for 4000 series, but no idea what supply is going to be like.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I see 3080s getting close to MSRP levels; well, I see them going for ~£850.

Anyone thinking of buying one? I can't help but feel it's better to wait till the second half of the year for 4000 series, but no idea what supply is going to be like.
The 3080 has pretty much ZERO chance of returning to the $699 price point. The only way I see that happening is if there is still sufficient stock at the time of Ada Lovelace and nvidia wants to clear them out. Possible? Sure, but unlikely. 3070s and 3070 Tis can pretty much only be had at that price point.

Supply of the 4000 series is going to be shit, they are going to be scalped like crazy, and who knows what the MSRP of those cards will be. They won't be cheap, as we know people will pay absurd prices. Who knows what will happen with crypto, either.

At $850, I'd say you're better off spending a little more and going with the $999 EVGA 12GB 3080. If you don't have $999 to spend, then your best bet is probably the cheapest 3070 Ti you can find, which at the moment is this one for $739.

If RTX doesn't matter to you, then the 6900 XT at similar prices is easily the best option. It's currently the only GPU being sold at its MSRP.
 
Last edited:
Status
Not open for further replies.
Top Bottom