Hello 👋
This is my first post on the fediverse after ditching Reddit. I'm still trying to figure out how it works, so I may be posting this wrong, in which case my apologies 😅
After the slight disappointment that is the RTX 4070 and the overpriced RTX 4080/4090, I'm looking at potentially getting an RX 7900 XT. The price has dropped from ~800 in March to ~720 this month. What I'm wondering is whether it's worth copping now that it's slightly cheaper. Would it be a future-proof card?
I'm not yet massively sold on ray tracing or DLSS or any of that stuff, and I'm hoping the raw power that card can put out makes up for it.
imo high-end AMD cards are overpriced. They're priced as if crypto mining were still going strong. Nvidia is expensive, but AMD is struggling to compete. Intel actually looks good on a budget, and AMD looks good for the middle ground.
At the price you found, ~720 for a 7900 XT is a pretty good deal. The VRAM on AMD cards is superior, which really helps with future games.
I bought one recently to upgrade my 1080ti, and I’m happy with my purchase.
Caveat is that I'm a gainfully employed adult with plenty of disposable income. I can't in good conscience recommend such a costly upgrade to someone on a budget. The GPU market is still an absolute mess; you'd get better value upgrading just about any other component. Games are leaning hard into SSD tech, for instance, are you ready for that?
But if you’re like me and can afford to upgrade your toys and are just looking for a reasonable choice, then I can say the 7900xt has been a great thing for me. I play on a 1440p 144hz monitor.
It depends what you're upgrading from. There's rarely a good reason to immediately buy new hardware for PCs imo, especially if you're on a budget. If you're coming from a 2080, you're still good for the next few years. For me, I recently upgraded from a 970 to a 6800 when its price dropped to $450 last month. I'd say it's been worth it so far, although AMD is still behind Nvidia, so the experience isn't as nice. Intel is also making massive improvements and works fine for the most part with newer games.
Huh, I went from a 1070 to a 6700 XT, and I'd say the experience is equally nice; I actually like Adrenalin more than GeForce Experience.
My friend upgraded to the 6800 XT and told me the same thing, but the moment I told him to boot up Deep Rock Galactic he immediately started noticing a lot of stuttering. Gunfire Reborn also had a lot of stuttering, along with a few other games I play like Metro Exodus and No Man's Sky. V-Sync somehow fixes all the stuttering issues.
I've had other problems too, like Windows 11 overriding my GPU drivers, which makes Adrenalin unusable until I reinstall the drivers. It's very annoying having to do this every time there's an update, since I either have to reinstall Adrenalin or keep the proper drivers stashed away somewhere so I can reinstall them.
I happen to have DRG; loaded it up and played a bit on solo. A little stuttering at the start of a new mission, but other than the first 5 seconds or so it was pretty smooth. I wasn’t benchmarking but my Steam FPS counter indicated 165 FPS every time I saw it. I made sure to turn off V-sync first.
Windows overriding the display driver is a thing. I think it happened to me once on Windows 10; it hasn't happened since I started using Group Policy to stop Windows Update from updating my device drivers.
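For what it's worth, here's a minimal sketch of the registry value that Group Policy setting flips, in case anyone wants to script it (assuming Windows 10/11 and an elevated prompt; the key and value names are from my memory of the policy, so double-check them before relying on this):

```python
# Sketch: set the registry value behind the "Do not include drivers with
# Windows Updates" policy. Run from an elevated Python on Windows.
# Key/value names are assumptions from memory; verify before trusting.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 1 = exclude driver packages from Windows Update
    winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0,
                      winreg.REG_DWORD, 1)

print("Windows Update should now skip driver packages.")
```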
We had a lot of stutters in the main hub and throughout the missions, with a lot of nasty screen tearing in DRG unfortunately. Not sure what's going on there, since we both have pretty decent rigs. I'm hitting over 244 fps on it consistently, except when the strange stuttering happens with V-Sync off.
Just last week there were reports that Group Policy doesn't prevent Win 11 from overriding your drivers.
https://www.extremetech.com/computing/windows-10-and-11-are-ignoring-update-settings-and-installing-gpu-drivers
It's been a frustrating experience overall in regards to drivers.
I'm perplexed, I really don't have those issues once the game is "fully loaded", so to speak. On a 3700X with a 6700 XT, so it's not top of the line either.
Read the link; it turns out I'd already applied the solution mentioned in the article. Hope that holds.
I dunno what else to say lol, I have a 5800X3D with the 6800.
I think you might've misread some of it, because they mention directly that even if you disable that feature through Group Policy it won't work.
"However, now it seems the company is taking this a bit further and is forcing driver updates for Nvidia and AMD graphics cards, even if you have this feature disabled via Group Policy Editor. "
"A Twitter account named @ghost_motely posted screenshots showing a Windows 11 machine’s Group Policy Editor with driver updates disabled and a fresh Nvidia driver installed recently, seemingly ignoring this setting. "
I do know how to fix these problems myself. It's just a bad experience, since Adrenalin refuses to even boot unless you manually reinstall the drivers.
I'm currently on an RX 580 and it's worked very well for all the time I've had it; however, I'm starting to play more demanding games, so…
I really do want to give Intel a chance, if only they had a beefier card to match up to the 7900 XT and the like.
Honestly, the A770 can play most AAA games at 1080p, which is "good enough" for most people since the majority are still playing at that resolution. If anything, the 3060 or the 6600 XT are in around the same price range for similar performance. If you do need good performance, then I would think about the 6800 or the XT version, especially when it drops down to $450 again.
No AMD card would be future-proof. The future is AI tech, and unfortunately AMD has none of it. Until AMD gets the equivalent of Tensor cores, I wouldn't recommend it. AI tech is moving REAL fast, and there is a lot to be gained in speed, physics and fidelity.
"AI tech is moving REAL fast"
Which is a good reason why you should not settle too quickly for any "silicon-baked" solution. So far, tensors are a buzzword for people with "fear of missing out", and I certainly don't want any technology so demanding on energy that it requires a specific architecture to run. Our houses are becoming ovens; we need to make these graphics cards less demanding, not more demanding and more sophisticated.
That's fair; it's a real shame AMD hasn't caught up with AI yet. I have managed to bash TensorFlow into using my RX 580, but it's not as smooth or stable as on an Nvidia card.
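In case it helps anyone, this is roughly the sanity check I run to confirm TensorFlow actually sees the Radeon card; it assumes a DirectML or ROCm build of TensorFlow (on the stock CPU-only package it just prints an empty list):

```python
# Quick check that TensorFlow can see and use the AMD GPU.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)

if gpus:
    # Tiny matmul pinned to the first GPU to confirm it actually runs there.
    with tf.device("/GPU:0"):
        a = tf.random.normal((1024, 1024))
        b = tf.random.normal((1024, 1024))
        print("matmul OK, result shape:", tf.matmul(a, b).shape)
```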
It should be fine unless you need CUDA. Most games coming out will have AMD FSR baked in, which is close enough to DLSS. AMD cards typically have more VRAM, which is the best thing to have for "future"-proofing.
FSR 3 is just around the corner too, right? 😉