Why I Regret Switching to AMD's RX 9070 XT GPU After Years with Nvidia

RX 9070 XT Rant – A Long-Time Nvidia User's Frustration

I own an RTX 4090, an RTX 3070, an RTX 3070 Ti, and now the brand-new AMD RX 9070 XT, their latest flagship GPU. I've been a loyal Nvidia user since the GTX 1070 Ti back in 2018. My first ever GPU was the AMD HD 7950 back in 2012, and after years of dealing with AMD's notorious driver issues I avoided them entirely... until now.

I decided to give AMD another chance with the RX 9070 XT, using it exclusively for gaming while my Nvidia cards handle ML workloads.

Big mistake. The experience has been rough, to put it mildly.


1. Micro-Stuttering Galore
  • Assassin’s Creed: Shadows, Overwatch, and Cyberpunk 2077 all suffer from frequent micro-stuttering.
  • It's jarring, immersion-breaking, and downright frustrating.

2. FSR is Still Not There
  • Tons of shimmering, pixelated distant textures, and overall lack of clarity.
  • FSR "Native + AA" feels like it’s barely on par with DLSS 4.0 (Quality mode).
  • Visual quality takes a serious hit.

3. FSR 4 Adoption is Barely Happening
  • Most games still don’t support FSR 4, not even brand-new releases like AC: Shadows.
  • Had to mod FSR 4 into the game manually to get a result that’s close to DLSS.
  • This shouldn't be the norm for a modern flagship GPU.

4. Anti-Lag+ is Doing the Opposite
  • Traced the worst stuttering back to AMD’s Anti-Lag+ feature.
  • Turns out it discards frames that take too long to render—great idea in theory, but a disaster in practice.
  • In fast-paced games like Overwatch, this results in horrible stuttering and makes the game borderline unplayable.

5. Misleading FPS with Frame Generation
  • First time launching AC: Shadows, I saw 2x the FPS of my RTX 4090 and thought, "Wow!"
  • Then I realized both AMD Fluid Motion Frames and FSR Frame Generation were enabled by default.
  • While the combination technically quadrupled the frame rate, the actual gameplay experience was awful: tons of stutter and poor frame pacing (rough numbers on that below).
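
To put rough numbers on that (illustrative figures, not measurements from my system): each frame-generation stage roughly doubles the displayed frame rate, but input is still only sampled on the frames the game actually renders, which is why the counter looks great while the pacing feels awful. A quick sketch:

    # Illustrative arithmetic for stacked frame generation; the numbers are
    # made up for the example, not benchmarks.
    base_fps = 40                 # frames the game actually renders each second
    fsr_fg_fps = base_fps * 2     # FSR Frame Generation adds one generated frame per rendered frame
    afmf_fps = fsr_fg_fps * 2     # AFMF doubles the presented rate again

    print(f"Displayed FPS: {afmf_fps}")                  # 160, i.e. the "4x" number
    print(f"Input sampled at: {base_fps} fps")           # responsiveness still tracks the base rate
    print(f"Base frame time: {1000 / base_fps:.1f} ms")  # 25.0 ms, which is where the stutter lives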

Final Thoughts

Sure, Nvidia GPUs cost 20–30% more.
But they just work. You're not spending hours tweaking settings or hunting down fixes just to enjoy your games.

With AMD, even their latest flagship RX 9070 XT feels like an early-access beta experience.
With Nvidia, you’re paying for stability, polish, and peace of mind.

If you're thinking about switching to AMD for gaming—think twice.

Comments

    • +1

      I'm still undecided what to do with my old system. It's a whole system that's still pretty decent - 5600X, 3080, 32GB DDR4. For now I've set it up as a "couch PC", but not sure how much use it'll get. May just sell the GPU by itself and keep the PC to turn into a NAS/server with Unraid in the future.

      • Def. sell the 3080 if you can get around $600 for an almost 5-year-old GPU.

  • For number 1: Disable ULPS.

    Problem solved.
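
    In case it helps anyone: the ULPS switch is the EnableUlps registry value on the display-adapter class subkeys. Below is a rough Python sketch of how that tweak is usually applied; the key path is the standard display-adapter class GUID, so treat the details as an assumption about your driver install, run it as admin, back up the registry first, and reboot afterwards.

        # Rough sketch: set EnableUlps=0 on every display-adapter subkey that defines it.
        # Assumes Windows, admin rights, and a registry backup taken beforehand.
        import winreg

        CLASS_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as class_key:
            i = 0
            while True:
                try:
                    name = winreg.EnumKey(class_key, i)
                except OSError:
                    break  # no more adapter subkeys
                i += 1
                try:
                    with winreg.OpenKey(class_key, name, 0,
                                        winreg.KEY_READ | winreg.KEY_SET_VALUE) as sub:
                        winreg.QueryValueEx(sub, "EnableUlps")  # only touch keys that already have it
                        winreg.SetValueEx(sub, "EnableUlps", 0, winreg.REG_DWORD, 0)
                        print(f"EnableUlps=0 set under {name}")
                except (FileNotFoundError, PermissionError):
                    continue  # value not present or key not writable; skip it
        # Reboot so the driver picks up the change.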

    • +1

      Actually I did that too on my system. It makes a massive difference.

      • I was going to jump ship back to nVidia until I did this one thing.

  • +1

    Yo Jensen, be happy

    The overwhelming majority of the gaming reviewer community disagrees with your sentiment, OP.

    Also, "Misleading FPS with Frame Generation"? Laughs in Nvidia. The 50 series is all fake frames, and that's well documented too.

  • +1

    Do a poll and settle the score once and for all!

  • You'll want to use DDU to clean out Nvidia's drivers and registry entries. Switching GPU manufacturers typically came with the recommendation to "reinstall Windows" for a reason. Thanks to the driver cleaners out there, we no longer need to take that step.

  • I've been wanting to build my first PC in over 15 years. Not sure if I go Intel/AMD for CPU & Nvidia/AMD for GPU.

    I'm thinking screw it, I'll just stick to an abacus.

  • I've got a 7900 XT and never had these issues. I'm guessing this is a driver install issue. I know other people who have made the switch and ended up with two graphics drivers, or with a very outdated driver (one that doesn't support the 9070 at all). It could be a Windows issue too. I wouldn't race to blame anything until you've tried a fresh Windows install with up-to-date drivers.

  • I've used both NVIDIA and AMD GPUs.

    Both have pros and cons, but as long as DDU is used, I've experienced zero issues.

    The 9070xt isn't comparable to a 4090 though.

  • -6

    Why is a horrendously poorly made game like "Assassin's Creed: Shadows" being used as a benchmark? It's pure DEI slop and performs just as badly on every system and console.

    • +1

      It's actually one of the best-looking games out there… the physics look really good.

    • "It's pure DEI slop"

      What exactly is DEI about it? The character everyone is outraged about (a black samurai) is literally a historical figure who was black.

      If you have a problem with a game because of the skin colour of a character, then I think that says a lot about you.

      • Well, 3 out of 4 of the black samurai's romantic interest options are male, and 1 of those is even non-binary.

        So they could have been referring to that sort of thing rather than the historical-figure aspect of the character.

    • "Tell me you're an ignorant Asmongold viewer without telling me…"

    • I still use GTA V as my gaming FPS benchmark, although I've been testing Cyberpunk a lot more.

  • Couldn’t make the switch due to how great RTX HDR is, better than a lot of native implementations

  • +10

    TL;DR: PEBKAC, FUD post.

    Many of the issues you raise are literally born of Nvidia.

    DLSS support at launch (20 series) was non-existent, and ditto for every new generation revision; it's been mere weeks since FSR 4 launched, and adoption takes time. Regular FSR is universal and can upscale any game or application, though it is of course nowhere near as good; it's a nice option to have built into the drivers where you would otherwise need to run at a non-native resolution or use the worse in-game scaling. For now you can use OptiScaler to inject FSR 4 into games that already support FSR 3.1 (and even DLSS 2).

    Frame generation is also an Nvidia ploy to fake FPS so they can skimp on hardware specs at a higher price and mislead you about performance in launch slide charts. If it was enabled by default, that would be due to a global performance profile you chose after first launching AMD Adrenalin, the same as through Nvidia's software; just turn it off globally.

    It's well documented that there have been compatibility issues with Anti-Lag, and some people find it jarring, but cutting frames is literally the point of it; just turn it off if you don't like it. Leave it off by default and only turn it on if you think it will be beneficial.

    You can turn all of this stuff on or off universally in Adrenalin (AMD's driver suite) and then re-enable it for individual games as appropriate.

    I have been building computers and gaming since the Voodoo, through the TNT, GeForce 2/3/4, and 9700 Pro, on to the 10/20/30/40 series and RDNA 1/2/3, often owning both competing models across different use-case PCs. Currently there are pros and cons to both, but they are minor; if you have half a brain cell it's a basic configuration option to change, and it's probably just down to familiarity. An Nvidia card black-screens on a new game release and no one bats an eye; they just wait for a game-ready driver. The same thing happens on a Radeon, fixable in configuration, and everyone loses their minds.

    None of this is common or typical; my Nvidia and AMD PCs have been indistinguishable to use, outside of my own tweaking preferences (which I prefer AMD's drivers for, due to per-game OC rather than global OC through a third-party app, amongst other customisations). On top of that, since the 50 series release the prices have been an absolute joke, so I'm not interested in that.

    I recently purchased a 9070 XT to replace a 6900 XT in my media PC and have been very pleased with its out-of-the-box performance and silent operation for the price point (I have a PowerColor Red Devil). If you could actually find an equivalent-model 5070 Ti available close to the price of the Radeons, it would be worthy of consideration; they overclock well and have plenty of bandwidth. However, they retail at a 50% higher price, and I cannot see the justification for that cost. Pay another $500 so you don't have to go in and disable Anti-Lag after choosing the wrong global profile? Give me a break.

  • +1

    Why would you buy a lesser card than your 4090?

  • +1

    Edyolo, beeze, Godric, burningrage, gromit, etc, thanks for taking the time to put your thoughts down. I read all your comments with interest!

  • -1

    I agree. I did the same thing years ago, moving to an AMD discrete card from my usual Nvidia card. It was such a big mistake. I was plagued with driver problems and had to keep waiting for a new version in the hope that the AMD team would have addressed the issue I was having with one game. I was frustrated. You still pay good money, even if it might be cheaper than Nvidia's card.

    Your money does not have bugs or issues; it is hard cash you work for, and I could not deal with having so many problems that I could not enjoy what my money bought. I ended up selling the system and have been on Nvidia cards ever since. I do not mind paying extra if it means my games work and I do not have to deal with issues.

  • +1

    From my experience with my 5950X and 6900 XT:

    Anti-Lag tanks the FPS. Even with my CPU boosting to 5.1 GHz, it tanks the FPS unnecessarily, with no perceivable improvement in responsiveness at 4K120 on an OLED monitor. Anti-Lag would've contributed to the micro-stutter, along with the AMD shader cache building in its own strange way; once the cache is built, the micro-stuttering isn't as present.

    FSR 3 support is still missing in a surprising number of games, but at least there are plenty of mods. Where mods are missing, I just use Lossless Scaling.

    Definitely avoid AFMF. It inflates the FPS number but feels very stuttery. FSR 3 frame generation runs much better than AFMF, and the same goes for Lossless Scaling.

  • Same here, Nvidia to AMD, but my issues are with the sub-par software. Otherwise I am still rocking an AMD GPU and CPU, and I'm waiting for the 5070 Ti to go on sale before pulling the trigger. If I am paying $1k+ for a GPU, I'd better make sure I am getting the best.

    Buy cheap and you buy twice

  • Massive fanboy rant. Leave the AMDs for the big boys, mate.

  • OP spent all this time writing up some nonsense and has barely commented.

    Reading the comments from users, plus all the reviews I've seen of these cards, leads me to believe he's full of baloney.

    Plus, the fact that he seems to be running multiple cards from competing manufacturers is going to cause driver issues.

  • My first ever GPU was also the AMD HD 7950, and I'm currently running an RX 480. I've never had any driver issues or other problems with either card, and that includes running overclocks (and, at the moment, underclocks as well).

    I'd recommend properly cleaning out the Nvidia drivers; they're more than likely what's causing your issues.

  • "they just work"

    Until they burn down your house 🔥

    • There's a house fire for every 3 Nvidia customers. With the 5090 coming more into stock, it's going to be 1.5 house fires per customer. By 2026, it's going to be 2 house fires per Nvidia customer.

      • Perhaps by 2026, the only people able to afford the top end Nvidia graphics cards will be those able to afford a fire resistant room.

  • If you're switching between manufacturers, you'll need to install a fresh copy of Windows. All the issues you are experiencing are incredibly likely to be caused by Nvidia leftovers in Windows interacting with the card.
