Help with next GPU upgrade

gillmanjr

Member
Hey guys, I am really struggling right now with my next GPU upgrade. I literally have the money right now and am ready to pull the trigger, but I cannot decide which one to go with. The ones I am considering are the RTX 3080, 3080 Ti, 4080, and Radeon 7900 XT. I currently own an RTX 2080 and a five-year-old Acer Predator 1440p ultrawide with G-Sync, but I intend to upgrade to the new Alienware QD-OLED ultrawide (the DWF model with Freesync Premium). Because I am most likely going to be buying that monitor with Freesync, I am leaning towards the 7900 XT. However, I am stuck because I know the 4080 would end up lasting me longer, but I also know I would get more for my money with the 7900 XT. The 3080 and 3080 Ti are kind of at the bottom of my list now because most 3080 models are slightly more expensive than the 7900 XT and yet are outperformed by it in most cases. Even the 3080 Ti is outperformed by the 7900 XT and is still way more expensive.

What do you guys think? I really want a GPU that is going to last me at least 5 years and well into my next PC build, which will probably be 14th gen Intel. I know the 4080 would be best for future-proofing, but the 7900 XT is the best bang-for-the-buck GPU right now...
 
The 4080's kinda a dud for how much it costs.

Are you struggling on some games with that 2080 right now?
 
I'd be a little hesitant on the 7900 series for this kind of stuff:
^I hadn't seen that, but that is referencing the XTX cards only...and it appears to be ONLY the AMD reference models, not the third-party ones. Shouldn't be a concern with the XT unless you've seen the same for them. I don't plan to buy the XTX; it's way overpriced right now because of scalpers.

The 4080's kinda a dud for how much it costs.

Are you struggling on some games with that 2080 right now?
Yes, particularly in Satisfactory, which I'm currently re-playing. But more important is the upcoming Bethesda game Starfield. I want to be able to run that game on the new Alienware at maxed-out settings without issue. Besides, I have rarely been able to keep my current monitor pinned at 120 Hz (its max refresh rate) with modern games. No way I'll be able to get the Alienware to 165 Hz with my 2080. It's time for an upgrade. I think my 9700K is going to be able to keep up for a couple more years, but if not, I'll build a new system earlier than planned. I think the GPU is the first step though, because a monitor upgrade is a definite for me, and soon.

Also, I don't understand this talk about how the 4080 is a dud or whatever. I've read similar things elsewhere. The benchmarks show it is a 50% or more improvement over the 3080. How is that a dud at $1200? A 3080 is still over $800. Granted, these are all ridiculous prices compared to a few years ago, but it is what it is. It seems like the 4080 is priced about right for the performance it offers, as far as Nvidia cards go. The only card that is an outlier right now is the 7900 XT, which appears to be the best-priced card, but then again the Radeon cards struggle with ray tracing.
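
Just to put rough numbers on it, here's some napkin math using the street prices I mentioned and an assumed ~1.5x uplift for the 4080 based on the reviews I've seen (that multiplier is an assumption, not my own benchmarking):

```python
# Napkin math: dollars per unit of 3080-level performance.
# Prices are the rough street prices mentioned above; the 1.5x uplift
# for the 4080 is an assumption based on reviews, not my own testing.
cards = {
    "RTX 3080": {"price": 800, "relative_perf": 1.0},   # baseline
    "RTX 4080": {"price": 1200, "relative_perf": 1.5},  # ~50% faster (assumed)
}

for name, card in cards.items():
    cost_per_perf = card["price"] / card["relative_perf"]
    print(f"{name}: ${cost_per_perf:.0f} per unit of 3080-level performance")
```

By that measure the 4080 is no worse value than a 3080 at today's prices, which is all I'm saying.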
 
The benchmarks show it is a 50% or more improvement over the 3080. How is that a dud at $1200? A 3080 is still over $800.
That's the problem. Historically, it was never like this. The pricing would've stayed the same, give or take minor fluctuations, and you got the performance jump from each new generation. As an example, you shouldn't be paying 3090 Ti prices for a 4080, and when you compare those two, the performance improvement isn't the 50% that Nvidia claims.

Nvidia's greed is showing more and more and it's not right.
 
That's the problem. Historically, it was never like this. The pricing would've stayed the same, give or take minor fluctuations, and you got the performance jump from each new generation. As an example, you shouldn't be paying 3090 Ti prices for a 4080, and when you compare those two, the performance improvement isn't the 50% that Nvidia claims.

Nvidia's greed is showing more and more and it's not right.
I agree that the prices are outrageous, and I'm not defending Nvidia at all, but again, it is what it is. I actually just watched a Jayz video this morning where he described this exact dilemma with buying a GPU right now, and that's what it is, a dilemma. I saw another video that Steve Burke did a few days ago, and he recommends waiting if you are in the market for a high-end GPU right now. So I think that's what I'm going to do... wait for a little while. Unfortunately, I'm not known for my patience when I want something new for my PC.
 
Maybe a used 6900 XT or 3090 Ti? That seems to be what a lot of people are doing: scooping them up at a discount to tide themselves over until the next generation, which may have more 'favorable' market conditions.
 
So I have been looking around at new and used GPUs all week on Amazon, eBay, Craigslist, Facebook, you name it. I ALMOST pulled the trigger on a used 3080 Ti from a guy on Craigslist who was only asking $400, but it fell through. It also sounded kind of shady, especially considering the price. I also had a new 7900 XT in my cart on Amazon but didn't pull the trigger. Finally, today I was looking at new 4070 Ti cards on Newegg and ended up pulling the trigger on an MSI Gaming version. I really didn't want to buy a 4000-series GeForce card, but the truth is I think the 4070 Ti isn't too expensive considering the performance. I got it for $850, and it is basically equal to a 3090 or 3090 Ti depending on the game. Not bad considering I was looking at 3090s specifically on eBay and could not find a decent USED one for less than a thousand.

Unfortunately this changes my plan for the monitor. I had decided firmly on the Alienware AW3423DWF, but now I'm going to have to go with the DW model with the G-Sync Ultimate module in it. Or wait until another manufacturer releases a G-Sync QD-OLED ultrawide. I hate the idea of having an $800 Nvidia GPU and buying a monitor with Freesync. I couldn't live with that.
 
Didn't nvidia add freesync support for their GPUs? IIRC you can go that way, but a gsync display with the module doesn't support freesync.
 
I did a little digging based on your replies, and it looks like the AW3423DW (G-Sync Ultimate) does, in fact, support Freesync. However, the AW3423DWF (Freesync Premium) does not support G-Sync. I wanted the DWF because it's black and it supports firmware updates (the DW does not), but are you guys also saying that the 4070 Ti will work with a Freesync-only monitor?

Also, I just saw something about CES, and there are a lot of new QD-OLED monitors being released this year; some of them look pretty epic. I might have to wait and see the specs on what's coming before I buy anything.
 
AW3423DWF (Freesync Premium) does not support G-Sync.
If it's not officially on the G-Sync list, you can force enable G-Sync for any Freesync Compatible monitors through Nvidia Control Panel.

The DW doesn't support firmware updates because it has a G-Sync module and that's locked behind nvidia's proprietary firmware.
 
G-Sync modules allow for tear-free variable refresh rate down to 1 Hz. Freesync normally doesn't go down that low.
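
Freesync panels with a wide enough range work around the low end with low framerate compensation (frame multiplication). Rough sketch of the idea below; the 48-165 Hz window is just an example, not any specific monitor's spec:

```python
# Sketch of low framerate compensation (LFC): when fps drops below the
# panel's minimum VRR rate, the driver repeats each frame so the effective
# refresh stays inside the VRR window. Range values are examples only.
def lfc_refresh(fps, vrr_min=48, vrr_max=165):
    """Return (frame_multiplier, effective_refresh_hz)."""
    multiplier = 1
    while fps * multiplier < vrr_min and fps * (multiplier + 1) <= vrr_max:
        multiplier += 1
    return multiplier, fps * multiplier

print(lfc_refresh(30))  # (2, 60)  -> each frame shown twice
print(lfc_refresh(20))  # (3, 60)  -> each frame shown three times
```

The module just handles that whole range in hardware, which is where the 'down to 1 Hz' figure comes from.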
 
I don't get it... what's the point then? Why do they even bother with the G-Sync bullshit?
Back in the dayyyyyyyyyyy they were segregated/separate. Freesync came later since Gsync was nvidia proprietary, but it's an open standard. They probably added it just as a marketing bullet point 'OMG SUPPORTS BOTH VRR MECHANISMS, LOLOLAMD CANT DO THAT'
 
Got my 4070 Ti the other day and got it installed. Holy crap, it's ENORMOUS. Makes the 2080 look like a baby graphics card. I'll take a photo later; I have a Lian Li O11D Mini, so it's absolutely huge in my case. I already did a little gaming and it is unbelievably quiet, so that massive heatsink and triple-fan setup is worth having. It makes so much less noise than the 2080 Ventus I had. However, I was playing Satisfactory last night for the first time and I got the impression that the game was being massively bound by my CPU. I was able to go from low graphics settings to high without much change in average frame rate (which was totally impossible with the 2080), but I still get a lot of frame rate drops, and they were worse at the higher settings. So I'm assuming that's the 9700K really holding it back. You know what I'm thinking now, right?

What do you guys think? How much of a bottleneck do you think the 9700K is for the 4070 Ti? FYI, I have mine clocked to 4.9 GHz. Also, keep in mind that it's a Z390 chipset with PCIe 3.0 (if I remember correctly). Not sure how much of a difference that would also make in holding back the 4000-series RTX cards...
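
If anyone wants to sanity-check my bottleneck theory, my plan is just to log GPU utilization while the frame drops happen; if the 4070 Ti is sitting well below ~95-99% busy during the dips, it's waiting on the CPU. Quick sketch of what I mean (assumes nvidia-smi is on the PATH, which it should be with the driver installed):

```python
# Quick-and-dirty GPU utilization logger to check for a CPU bottleneck.
# Assumes nvidia-smi is installed and on the PATH (it ships with the driver).
# If utilization stays well below ~95% while frames drop, the GPU is
# waiting on the CPU (or something else), not the other way around.
import subprocess
import time

def gpu_utilization():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

while True:
    print(f"GPU utilization: {gpu_utilization()}%")
    time.sleep(1)
```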
 