• expired

Sapphire Pulse Radeon RX 7900 XTX Gaming OC 24G Graphics Card $1459 + Delivery ($0 C&C/ in-Store) @ Umart / MSY

830

Been poking around for prices; this seems to be an all-time low for this card. Fairly good compared to the RTX 4080, which is still at $1699, and a good performance boost in games over the 4070 Ti and 7900 XT.

MSY link: https://www.msy.com.au/product/sapphire-pulse-radeon-rx-7900…

Related Stores

Umart
MSY Technology

Comments (closed)

  • +13

    Hodling for the other Sapphire Card to drop

  • +8

    Sapphire is good but it's still expensive IMHO

  • +6

    If the Nitro+ drops I'm gonna bite so fast

    • Drops by what?

      $1? $100? more?

      • $100+ would be nice

  • +2

    HODL, wait until it drops below $1100

    • +5

      Maybe in two years.

      • Nah, I think we'll see that figure next year, soon after the RDNA4 cards are announced

  • Just wondering what is convincing people to buy Radeon cards and thus forgo all the Nvidia tech like DLSS, frame generation, etc?

    • +18

      VRAM. This price is rather close to the RTX 4070 Ti, and you're getting double the VRAM and around 15-20% more performance on average. DLSS is nice, but even it can't prevent the frame stutter at higher resolutions caused by a lack of VRAM.

      • +5

        The 4080 was $1499 recently. This is close to 4080 pricing.

        • Sure, but 16GB will be a limiting factor one day, and that $1499 was with eBay Plus codes. The normal low price for a 4080 is $1699.

          • +8

            @KARMAAA: By the time 16GB is a limiting factor it'll be time to upgrade the card anyway. For comparison, consoles have about 10-12GB of VRAM available, since they have to share the other 4GB with the CPU and the rest of the system.

            • @piston3461: Sure, but even then, PCs generally use more VRAM than consoles; that's been the trend since the Xbox 360 days. I wouldn't be surprised if 12GB is considered a limiting factor by the end of next year. Plus, the PS5 Pro and Xbox Series Next are on the cards too; they might bring even further RAM increases, to 24-32GB.

          • +7

            @KARMAAA: 16GB won't be a limiting amount of VRAM for a long time.

            Nothing wrong with preferring AMD if that's your thing, but this card will be irrelevant before the VRAM becomes an actual advantage over the 4080. I'll take DLSS, AI, RT performance, VR performance, better drivers, etc. rather than chasing a VRAM number you won't need.

            That being said, good price for a good card. For me personally the recent $1499 RTX 4080 was a better buy, but you can't go wrong with this either.

            • @saitaris: I don't prefer AMD; I'm using a 3060 Ti right now as we speak. I agree that the GPU is more than likely the limiting factor before the VRAM, but I can't predict developers and how they release games. The latest trend from developers is to ship the worst PC port you can and fix it later with patches, such as The Last of Us Part 1 and Hogwarts Legacy, and the only way to ensure you don't hit a VRAM issue is to have more VRAM. I think DLSS and Frame Generation are killer features, but they have shortcomings and don't work across all games, so in cases like Jedi Survivor you're out of luck unless you shell out more money for a DLSS mod via a Patreon project.

              • +1

                @KARMAAA: You're right that DLSS etc. doesn't work in every game, but you could say the same about VRAM… you may never actually need 24GB during the "useful life" of the card. If you hold onto your GPUs for 6-7 years it's probably a more worthwhile consideration, but on the flip side, features like DLSS and FG will help you get more life out of a card too.

                Guess the 4090 is the answer ;)

                • +1

                  @saitaris: The 4090 is in a totally different price bracket. But maybe one day we will have better cards and pricing than these trash generations from AMD and NVIDIA.

                  • +1

                    @KARMAAA: The 4090's price-to-performance is actually not bad. But the rest of the lineup's is.

            • @saitaris: So two black marks against the 4080 are physical size and the need for a different power connector.

              For me that was the decider on a 7900XTX…

              It outperformed the 4080 by an edge in many cases, physically fit in my chassis, and didn't need a power supply or cabling change.
              Over time the drivers will improve… although that should not be a reason to buy, it's just what AMD do… their software catches up with the hardware.

              I'd have to be able to pick up a 4080 FE (that fits) for under $1200 to even consider it over the XTX at ~$1450.
              And this is from someone whose last ~6 purchases have been Nvidia cards, going back to the 580.

    • +6

      Raw raster. Some of my sim games in VR don't have DLSS, so raw raster performance and memory are what count.

      • Have you looked at VR benchmarks for the 7900xtx?

        Performance was all over the shop but even the best performance was poor.

        They may have done some work on it since I looked (it was a few weeks after launch) but I distinctly remember seeing the 6900xt basically neck and neck with it in some VR games.

        • Obviously the 7900 XTX will massively outperform the 6900 XT eventually; it's not surprising that performance is hit and miss with immature drivers.

    • +6

      I bought the XT version of this card after over a decade of Nvidia. Nvidia is offering too little for too high a price; I vote with my wallet.

      Really happy with the XT, wouldn't go back to Nvidia. Don't care for RT, and FSR is close enough to DLSS that it's a non-issue. Plus we're talking 20GB of VRAM vs Nvidia still using 8. Wanted 4K@120Hz HDR and these do that great on AAA titles.

    • -4

      Some people inherently hate DLSS, either because they're AMD fanbois, haven't tried DLSS, or have only tried it in games where it's implemented poorly or at very low resolutions. Frame generation is a lil gimmicky, but it will get better over time. DLSS Quality + frame gen can give you 2x the frames with the same latency as native rendering, as per YouTube videos.

      If I had to choose between this and the 4080, I would lean towards the 4080 for a lil bit more money.

      • +1

        I can see the difference between DLSS on and off, enough that it bothers me. I prefer native resolution for now.

        • I use DLSS Quality at 4K on a 42" screen, and in most games I've tried the difference ranges from zero to minimal, with DLSS sometimes even better than native.
          With FSR on the 42" the difference is more pronounced and easily noticeable.

          Like I said, if you use it at lower res you will notice the difference more. There is a long HUB video on this, watch it.

        • What resolution are you running currently?

          At 4k, DLSS quality looks better than native in my opinion. You get all the temporal stability of TAA with none of the smearing. In some cases things can look even sharper than native as well.

          There is still the occasional fault in the image, but they're rarely noticed when I'm actually playing and they're not bad enough to make me want to sacrifice ~30% fps.
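
          For anyone wondering what "DLSS Quality at 4K" actually renders internally: each DLSS preset upscales from a fixed fraction of the output resolution. A quick sketch below, using Nvidia's published per-axis scale factors (figures are approximate and not from this thread):

          ```python
          # Internal render resolution for each DLSS preset at 4K output.
          # Per-axis scale factors are Nvidia's published presets (approximate).
          modes = {
              "Quality": 2 / 3,
              "Balanced": 0.58,
              "Performance": 0.5,
              "Ultra Performance": 1 / 3,
          }
          out_w, out_h = 3840, 2160  # 4K output

          for mode, scale in modes.items():
              print(f"{mode}: {round(out_w * scale)} x {round(out_h * scale)}")
          # Quality at 4K renders at roughly 2560 x 1440 internally, which is
          # where the ~30% fps saving mentioned above comes from.
          ```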

      • -1

        Wait, so people inherently hate DLSS because frame generation is "a lil gimmicky", and that makes them fanbois?

        • -1

          Ya, I gave a combination of reasons, and people find the one that annoys them lol.

    • +2

      Cheaper, plus lots of people don't really care for ray tracing (at least right now, while it ruins your performance), and fake frames don't make up for lower overall performance.

    • From someone who owns a 6600 XT: Radeon owners mainly don't care about driver issues. If they did, they'd buy Nvidia.

      Memory architecture on the high-end AMD cards is superior to Nvidia's (same or more bandwidth, plus Infinity Cache). The 6700 XT and below actually have poor memory architecture, especially the 6600 XT, where 32MB of cache, a 128-bit bus and x8 PCIe is a joke. The RTX 3050 is Nvidia's 128-bit x8 PCIe GPU.
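
      To put rough numbers on the bandwidth point, peak GDDR6 bandwidth is just bus width times effective data rate. A back-of-envelope sketch (the bus widths and data rates below are approximate spec figures from memory, not from this thread):

      ```python
      # Peak memory bandwidth (GB/s) = bus width (bits) / 8 * effective rate (Gbps).
      # Infinity Cache sits on top of this, which is how AMD gets away with
      # narrower buses on some cards. Spec figures below are approximate.
      def bandwidth_gb_s(bus_bits: int, rate_gbps: float) -> float:
          return bus_bits / 8 * rate_gbps

      cards = [
          ("RX 6600 XT", 128, 16.0),   # + 32MB Infinity Cache
          ("RTX 3050", 128, 14.0),
          ("RX 7900 XTX", 384, 20.0),  # + 96MB Infinity Cache
          ("RTX 4080", 256, 22.4),
      ]
      for name, bus, rate in cards:
          print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")  # 256, 224, 960, 717
      ```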

      • +2

        The 4060 is NVIDIA's new 128-bit x8 PCIe GPU.

        • The 4060 is also a joke, and so is the 4060 Ti.

          • +2

            @[Deactivated]: They're bad. But not jokes. Jokes are funny. These deserve to be taken seriously as failures, not disregarded.

            • @Deterrence: Agree with that. The joke is on the people who buy the 4060 Ti at US$399.

      • +2

        Every low-end card is screwed over by these companies to sell as many top-end cards as possible. Heck, they even gimped the 4080 and priced it high enough to sell more 4090s, let alone the low-end cards.

    • Frame gen is kinda borked in most games; there are only two or so I would use it with. DLSS 2 is better than FSR 2, that's for sure though. If people use Blender and/or can really use the tensor/RT cores, they wouldn't even look at Radeon. Those that do go Radeon either use programs like DaVinci that don't care whether it's Nvidia or not, use Linux, or want a cheaper card with more VRAM; it also looks like the CUDA conversion layer HIP (ROCm) will eventually come to select AMD cards. AMD will also get frame gen (at what quality, though?).

      Personally, I would buy another AMD GPU since I mostly play esports games and edit with DaVinci. I don't use RT since it takes a chunk of performance that you then need to claw back with temporal upscalers, and if it's really bad you need frame gen, if it's even supported.

      EDIT: Also, you can record with AV1 and HEVC in the Adrenalin software; with Nvidia you can't choose and you're stuck with H.264 (AVC). Also, in Adrenalin you can set the replay buffer to system memory, which is good because Instant Replay is always recording and can wear out your drive faster; with ShadowPlay you again cannot choose, which IMO is a crime in 2023.

  • If I got this and ended up with the high idle draw issue, would I be able to return it for that? This card would fit in my case, unlike all the currently available 4080s, and is much cheaper, but I don't want it if it chews 100W+ of power doing nothing.

    • +6

      Of course not!!
      It's functioning within the known specs… AMD have never claimed low power draw at idle…

      The high power draw is a known feature with multiple screens that has not been fixed… and may never be… it may well be a design issue…

      Thanks for reminding me why I haven't bought one yet.

      If you're running it 24 hours a day at idle, that is about an extra $186.15 a year in electricity costs, assuming 85W more than the Nvidia card at 25c per kilowatt-hour.
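
      For anyone wanting to sanity-check that figure, the arithmetic works out (the 85W delta and 25c/kWh tariff are the assumptions stated above):

      ```python
      # Back-of-envelope check of the extra idle electricity cost quoted above.
      extra_idle_watts = 85         # assumed extra idle draw vs the Nvidia card
      tariff_per_kwh = 0.25         # 25c per kilowatt-hour
      hours_per_year = 24 * 365

      extra_kwh = extra_idle_watts / 1000 * hours_per_year  # 744.6 kWh
      print(f"${extra_kwh * tariff_per_kwh:.2f} per year")  # $186.15
      ```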

    • +3

      People keep mentioning the supposedly high idle power draw with certain multi-monitor setups, but AMD drivers have gradually reduced this issue. People keep posting reviews from 2022, when the cards launched.

      Please refer to the link below regarding the 7000 series idle power reduction from April 2023.

      https://www.hwcooling.net/en/radeon-power-draw-finally-stays…

      You can see in this review that even a 4080 draws more idle power with two monitors than a 7000 series GPU.

      AMD drivers have been reducing idle power consumption issues since January, as stated in the post below from Videocardz:

      https://videocardz.com/newz/updated-amd-radeon-rx-7900-drive…

      • +2

        My 4080 is drawing 14W at idle right now with two monitors at different refresh rates (144Hz & 60Hz).

        • Something is seriously wrong with their testing, agreed.

          My 4080 draws 14W or less at idle with 4K & 1440p monitors, both at 144Hz.

      • +4

        My Nitro XTX, which I got a couple of weeks ago, is drawing 115W at idle with two monitors. Definitely not fixed.

        • +1

          Damn, that's almost my entire system's at-the-wall power draw at idle, and I run 4K 120Hz plus 1080p 60Hz on a 3080 Ti and 5600.

      • +2

        Every time I have looked into it (which is many times; I want to be able to buy this card), it has been improved in some configurations for some people. Its persistent mention as a known issue in the patch notes is evidence enough that it has not at all been fixed.

        Additionally, the article you linked only tests two monitors with the same resolution and the same refresh rate. That was never a problem to begin with; the problem is when you have mixed refresh rates, such as a primary at 165Hz and a secondary at 60Hz.

      • That data is 100% not to be trusted. I have a 4080 and it idles at 14W or less with 4K & 1440p 144Hz monitors. Plenty of other reports of similar consumption.

        They're either doing something to skew the results or got their data switched around or something.

    • Repost

      The high power draw is monitor-dependent, which is why AMD still hasn't fixed the issue completely.
      Unlike NVIDIA, AMD are too tight-ass to buy the top 200 monitors on the market and test single/dual/triple/quad setups along with differing VRR and refresh rates.
      Also, high power draw on triple-monitor setups and above is a hardware issue and can't be fixed until RDNA 3.5 or 4.0.
      If you use a lot of monitors and care about power usage, get RTX.

  • Anyone fitting this into NZXT H1 V2? Any issue?

    • I have the 7900 XTX ASRock Phantom Gaming and that fits very snugly; you have to remove the dust filter. Spec-wise, this one will easily fit, even with the dust filter on.

    • I have this exact card in an H1 V2 and it fits fine without any modification.

  • -1

    Hodl 4 life

  • good price, going down!

  • +1

    Centre Com and JW are price matching.

  • 3 x 8-pin connectors… blimey!!
