• expired

XFX Swift Radeon RX 9070 XT Triple Fan Gaming Edition 16GB GPU $1199 + Delivery ($0 to Metro/ VIC C&C) @ Scorptec

593

Surcharges: 0% Afterpay & ZipMoney, 1% card & PayPal payments.

Limit 1 per household

XFX Swift Radeon RX 9070 XT Triple Fan Gaming Edition (Boost: 2970MHz), 16GB GDDR6 (20Gbps), 256-bit, PCI-E 5.0 x16, 3x DisplayPort 2.1, 1x HDMI 2.1, FSR 4, Nickel-plated Copper Cold Plate

Related Stores

Scorptec Computers

Comments

  • NICE! More MSRP is coming!

    • +12

      $60 over MSRP, atrocious!! Not in the OzB spirit; we're all chasing those $2 and $5 cashbacks!! $60 is like a mountain of gold to us.

      • XFX coolers are typically worth the little bit of extra money to be fair haha

  • +5

    These guys have always been my go-to for any AMD card, so I'd recommend them.

    • Do you mean XFX or Scorptec?

      I’ve got a Gigabyte 3070 Vision and was going to upgrade to AMD soon, but I'm not familiar with any of the AMD manufacturers. Any insight would be great!

      • +19

        It would be a very strange niche if someone only bought AMD cards specifically from Scorptec.

        • I don't mean to be lol, it just seems like every time I need to get one, they usually have the best deal lol.

      • +1

        Scorptec

        • Scorptec is the only thing keeping me from going full Luddite, smashing the weaving machines and destroying modern society tbh

          also, pitchfork deals when?

      • +2

        They're all good really; I can recommend XFX, PowerColor and Sapphire. Those three are considered the best AMD partners in the community.

  • -1

    Nice! More MSRP is coming soon!

  • -1

    Nice! More MSRP is coming soon SOON!

  • +1

    Is it worth spending $60 more on the 3 fans? Thanks in advance.

    • +3

      No, just spend $0 more to get 3 fans

      • +4

        onlyfans. mmmmmm

    • +2

      Haven't personally seen anyone do a breakdown of temps between these two specific models, but the other factor to consider is size (dimensions in cm):

      2 fans = 2 slots, 29 x 12.3 x 6.5
      3 fans = 3.5 slots, 32.5 x 15 x 6.5

      • +2

        the dual fan xfx swift is 3 slots

        • Huh, just pulled it from here, where it claims 2:

          https://prod.scorptec.com.au/35/1039/117014/354519_specs.jpg

          • +4

            @stimutacs: I can confirm it's 3 slots. I helped a friend install his. XFX's own website doesn't even have a page for this model yet, last time I checked.

            I rarely trust the description, so I check the manufacturer's website. Even then, I wait until people have their hands on the product. There have been so many hidden issues: missing ROPs on 50 series GPUs, an HDMI 2.1 port with 2.0 bandwidth, 3.5GB VRAM, etc.

            • +1

              @Radiskull: I got the dual fan card. I can confirm @Radiskull is correct.
              In case you ask, no, we don't know each other XD

              • +4

                @ktan919: Nice work. I'll hand you the cash in the usual spot.

            • @Radiskull: Good man. GPUs are mostly reliable, but if getting your new toy 3 months late isn't an issue, let other people test the product first.

      • +3

        6.5cm thick = 3 slots for both
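
        For anyone checking the maths, here's a minimal sketch, assuming the standard 20.32mm (0.8") spacing between expansion slots (that pitch is a general PC form-factor convention, not something from Scorptec's listing):

        # Rough slot-count check from the listed 6.5cm thickness
        SLOT_PITCH_MM = 20.32  # standard spacing between expansion slots (assumption)
        thickness_mm = 65.0    # 6.5cm from the spec dimensions above

        print(f"{thickness_mm / SLOT_PITCH_MM:.2f} slots")  # ~3.20, i.e. it occupies 3 slots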

  • +1

    I've never owned an "entry level" model of a GPU before; can anyone comment on how they compare to the higher tier models?
    Is it worth spending an extra $100ish on an XFX Merc model instead (when in stock)?

    • +12

      Personally, I find the factory OC generally isn't worth the price increase. The only thing I do check on lower tier models is cooling; sometimes they put a cooler on that makes the card run too hot and too loud. XFX is generally pretty good, but just check a review.

    • +1

      Factory overclocking almost feels like an artifact of an older time these days. The old idea was that you pour more juice into the chip so it runs stably at higher frequencies than it otherwise could. With current processes, though, chips are so carefully thermally managed, and sit on such a knife edge, that there's no scope left for gains beyond measurement error. The performance edge is instead in undervolting the card to get the same performance for less power, or, if you've won the lottery, getting a card that runs undervolted with identical performance and stability to default draw while hitting the thermal limit more slowly, or not at all, under sustained load.

      • +3

        The higher tier models have a higher power limit set on them; that's where you get the extra performance. But the ratio of extra oomph to extra dough is actually pretty bad.

        • +2

          The 9070 XT seems to be entirely silicon lottery, with plenty of Reapers or Steel Legends running 3300MHz and plenty of Mercs being loud and failing to (and vice versa). 330W is about the most needed, which base models can also reach with a +10% power limit in Adrenalin; from there it's just the hotspot and VRAM hotspot temps being an issue (they all cool the core very well).

          I'd say for a $100 jump, sure, go for one if there's a model you like, but the $1400 ($200-250 jump) Gigabyte, ASUS and Nitro+ cards are just silly.

          • @DuckWearingTopHat: Yep, my Reaper is a silicon dud: no power uplift and only about a -40mV undervolt, that's about its limit. It hits about 3GHz, but that's still a pretty good uplift for me :)

          • +1

            @DuckWearingTopHat: I get trying to optimise your GPU.

            For most of us casual gamers, however, doing a stable overclock and getting 10% more FPS usually isn't a significant uplift, IMHO. I wouldn't worry too much about the silicon lottery.

      • +1

        Some board partners also use it as a margin buffer. As part of the fulfilment contract for a given chipset, AMD/Nvidia will require some percentage of all boards manufactured by ASUS/Gigabyte/MSI/Sapphire/etc. to be sold at MSRP, and the former have been squeezing the latter more and more, such that the difference between the purchase price of the raw GPU silicon sold to a board partner and the sale price of a basic completed graphics card may be close to zero, if not negative (for context: Nvidia monopolising the bulk of the profit in each sale this way is what pushed EVGA out of the market).

        To deal with that, partners set aside the lowest percentage possible of their chipset allocation for cards intended/required to be sold at MSRP. With the rest, they can attach some more RGB or an extra fan, maybe a chunkier cooler (all of which is relatively cheap), and eke out more profit by marketing those as 'overclocked' or 'premium' cards. They likely aren't significantly overclocked in reality; if one GPU were better than another, AMD/Nvidia would be the first to know and would sell it to the partner for more $$ (e.g. as a 9070 XT instead of a regular 9070), and I doubt the board partners have the interest or capacity to do extra testing just to grade binned chips for their different retail SKUs.

    • +1

      Most of the time "entry level" cards are fine; they're all using the same silicon from Nvidia/AMD/Intel. The only differences are the fans, heatsink, design and settings (clock speed, voltage, power draw, etc.). You can adjust the settings yourself, and usually find the best settings for your particular GPU. The factory settings are chosen to be safe across the 10,000s of cards they ship, not tuned for yours, as all silicon varies in quality.

  • is it wise to migrate from a 3070 to this? much performance difference?

    • +5

      at 1440p or greater, it will be quite a jump.

      I'm looking at doing the same change for Topaz Video AI.

      • Thanks mate, I don't do much 1440p to be honest, but good to know!

    • +6

      I went from a 6800 XT and yes, there's definitely a noticeable uplift, especially if you do 1440p ultra or 4K gaming (in your case, the VRAM uplift alone is going to give you a much better experience in games).

      • +3

        Thanks mate, appreciate your comment.

    • +3

      Came from an MSI RTX 3070, gaming at 1440p 21:9/32:9 while playing both AC: Shadows and MH: Wilds.

      Upgraded to a Sapphire Pulse 9070 XT ($1299) and performance has either doubled or more than doubled (AC: Shadows running at highest settings, no FSR 4) - it really is quite the jump.

      Spending $1300+ though is a bit subjective given the initial MSRPs at launch; I just find it getting too close to a $1629 5070 Ti. If you have the budget to spend $1500 on a 9070 XT, you'd be better off going with a 5070 Ti if stock is not an issue (which, who knows till when), unless there is an immediate need.

      Edit: for reference, this is in stock: https://www.scorptec.com.au/product/graphics-cards/nvidia/11…

    • +20

      Overly long ramble incoming but I recently made this switch.

      Ultimately, it really depends on whether or not you're happy with the performance of your 3070, which factors in things like:

      • What resolution do you want to play at?
      • What framerate are you targeting?
      • What settings would you like to use?
      • Are you willing to compromise on any of the above factors?
      • And importantly, what games do you play?

      When I'm considering an upgrade, I also try to factor in things like "what did I pay for my current GPU (inflation adjusted)?", "what is the average performance uplift I can expect?", and "what features would I gain/lose access to?".

      Jumping from a 3070 to a 9070 XT:

      • For gaming, you can typically expect ~70%-85% more performance depending on game and settings (a rough fps sketch follows this list). You also get double the VRAM capacity, which is only really relevant when the settings you want to use would exceed the 8GB available on your 3070 (especially true at higher resolutions like 4K).
        • Relative RT performance also increases but to a lesser degree (more like 50-65% faster).
      • For gaming features, you lose all versions of DLSS support, and gain FSR 4 support.
        • Some games can be modded with tools like OptiScaler to support alternative upscalers (including FSR 4), but it's not universal and not recommended for multiplayer games with anti-cheat.
      • For productivity apps, you lose access to CUDA. While you might get acceptable acceleration results in some apps, support for ROCm or other non-CUDA APIs is generally worse across the board. The raw performance available to a 9070 XT is far greater than that of a 3070 though.
      • For streaming and video recording (encode/decode), you swap NVENC/NVDEC for Video Core Next 5.0.
        • Unfortunately, I've only found very high-level detail as to exactly what VCN 5.0 brings/changes vs VCN 4.0, but compared to a 3070 you gain AV1 encode support, though you likely lose some or all 4:4:4 and 4:2:2 chroma support, even for H.264 and H.265.
        • AMD themselves have claimed that in general, encoder quality on RDNA 4 is a large step up from RDNA 3 and prior and this has been mostly borne out in some community testing (e.g. low bitrate H.264 looks significantly better on RDNA 4 than prior AMD generations, enough to be mostly comparable to NVIDIA/Intel). H.265 was already pretty good on RDNA. I heard AV1 had some odd issues on RDNA 3 and I don't know if they've been ironed out with RDNA 4.
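
      As a rough sketch of what those raster and RT percentages mean in fps terms (the multipliers below are just the ranges quoted above, not benchmark data):

      # Scale your current 3070 fps by the quoted uplift ranges
      def expected_fps(base_fps: float, low: float, high: float) -> tuple[float, float]:
          """Return the (low, high) fps you'd expect after the upgrade."""
          return base_fps * low, base_fps * high

      # e.g. a game running at 60fps on the 3070:
      print(expected_fps(60, 1.70, 1.85))  # raster: ~(102, 111) fps
      print(expected_fps(60, 1.50, 1.65))  # RT-heavy: ~(90, 99) fps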

      As for my personal reasons for making the switch?

      • I recently purchased a 1440p 360Hz QD-OLED monitor, and my 3070 certainly doesn't come close to maximising that refresh rate in more demanding games.
      • The 9070 XT fell into the 70%+ performance improvement I'm generally looking for at a minimum and at roughly iso-price vs what I paid for my 3070 in 2021. It's generally the best price/performance available in this market segment as of today. In some games I play the difference is more like 100% (double the performance).
        • Strictly speaking (and with the benefit of hindsight) a 4080/4080 Super on a deep discount ~6-9 months ago would have been a better overall upgrade almost across the board (similar market segment, similar performance, better RT, same VRAM, stronger support for DLSS/CUDA, similar video en/decode quality but greater feature support), but I wasn't in the market at that time.
      • Personally I'm quite unhappy with NVIDIA's approach to the consumer GPU market, and the unmitigated disaster that has been the 50 series launch has turned me off them even more.
      • I'm 50/50 on switching my primary OS to Linux. Broadly speaking, AMD GPUs are better supported than NVIDIA GPUs are in that space (though RDNA 4 support is still coming up to speed).
      • +2

        10/10 for detail - very interesting! Thanks

      • Nice. Do you actually feel the difference between 360Hz and 144Hz?? I always thought these ultra-high refresh rates were a gimmick, just like high-polling gaming mice XD

        • +1

          It's definitely noticeably smoother than my 165Hz Dell S2721DGF monitors side by side, but how much of that comes from the raw refresh rate (2.78ms frame time vs 6.06ms) and how much comes from being an OLED vs an IPS LCD, I can't really say. I jumped at it because it was a really good price more than anything.
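
          (For anyone wondering where those numbers come from: frame time in milliseconds is just 1000 divided by the refresh rate. A quick sketch:)

          # Frame time (ms) = 1000 / refresh rate (Hz)
          for hz in (165, 360):
              print(f"{hz}Hz -> {1000 / hz:.2f}ms per frame")
          # 165Hz -> 6.06ms, 360Hz -> 2.78ms, matching the figures above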

        • Not a gimmick. Few things to consider:

          • It comes down to individual perception and trained eyes.
          • Attitude plays a role: if you come in with the preconception that it's not going to be smoother, then it won't be, because you'll be looking for reinforcement of your initial perspective.
          • If you come in as a blank slate, it's not the same as mouse polling rate at all. Each frame delivered to your eyes is at some level perceivable. Just because you adapt to the smoothness quickly doesn't mean you aren't attuned to what's going on in a subtle way; we can only focus on so many things before the rest fade into the background.
          • While you might adjust to the smoothness fast, you'll notice the motion resolution for the lifespan of the monitor, particularly if you move between screens. Motion resolution is how cleanly pixels transition, which determines how much of the image stays clear in motion. To many people, motion "resolution" is really temporal awareness of your surroundings: when things stay clear you perceive them as 3D objects in space, whereas if they blur whenever the camera moves, you unconsciously fall into a pattern of moving the camera, then stopping to stare at the image, because while the camera is rotating you can't actually resolve the detail (for example, at 60Hz with 60fps). If you're running at 240Hz or 480Hz, you're able to perceive things the same way you do in VR: as objects in space.
          • +1

            @Matthew xxl: Thanks for that response.
            I grew up with 60Hz monitors, and once I tasted 144Hz I could never go back, it was so fluid! I've seen decent budget 360Hz displays demoed in JB Hi-Fi before but couldn't tell the difference. I guess the panel quality totally makes or breaks a 360Hz panel; if it has noticeable pixel delay/ghosting, it will smear the image, making the super high refresh rate pointless?

            • @selphie: Refresh rate, resolution and even pixel density are often things you notice in little chunks, but that doesn't mean the difference is slight or insignificant. I just got a 240Hz 4K 27-inch QD-OLED, upgrading from my 175Hz ultrawide OLED, and I was initially let down: it didn't feel like a revelatory upgrade, and neither did the pixel density. But over time there have been lots of scenes in games where suddenly I'm like "yeah, this is a hell of a lot clearer".

              So looking at something in a store might not be indicative of your medium to long term experience.

              That being said, IPS screens tend to vary a lot more than OLEDs. With OLEDs you're guaranteed a straightforward motion-clarity upgrade over another OLED with lower specs, whereas with IPS you're relying much more on things like overdrive and adaptive sync tuning. Colour calibration also matters more on an IPS panel.

              I've had some truly terrible IPS panels before I switched to OLEDs years ago.

      • Wow, you nailed it mate!
        Awesome detail, thanks heaps!

  • +3

    Holddddd

  • +5

    I like how Scorptec is selling GPUs at MSRP.

    Looking back at my order history over the past few years, I didn't deliberately choose Scorptec, but most of my orders were from them, with a few scattered across PLE, Mwave, etc.

    • +10

      I liked it, too. My $4039 ASUS TUF 5090 order was cancelled by them, and then they called to ask if I wanted to buy a Palit 5090 for $6999 instead. Lol

      • XD

      • How? I paid $3,189 on 07/11/2022 for a Zotac 4090 Trinity OC, even during the crypto boom?

  • Never mind, wrong thread.

  • +3

    Limited to 1 per household

    Hah none for you, roomie!

  • Time to buy?

    • +10

      Around 2 minutes to complete the purchase

      • LMAO. tru dat. Two minute load time for Cyberpunk?

  • For people who are upgrading, where are your old GPUs going? I don't see a surge of GPUs on the second-hand market (FB Marketplace, Gumtree). Where should I look?

    • +2

      Selling within my inner circle of friends at a discount.

      I think my 3080 could get about $800, but I'm letting it go for $500. Mates rates.

      • +7

        $500 is the right price. No one will pay $800 for it.

        • Completed listings on eBay suggest otherwise.

          Don't know about Gumtree and Facebook Marketplace, but prices there were also high.

          I think someone was asking $650 for a 3070 Ti.

      • Same, I'll be selling my 2070 Super rig to a friend for $500.

    • I'll sell my 7800XT on marketplace or ebay

      • How much are you planning to sell it for? I have got one too, and am contemplating getting the 9070 XT.

        • the going rate on ebay seems to be 700ish

          • +6

            @bAps: crazy to pay that much when you can get it for ~$730 new from vendors with new card warranty.

        • +2

          Put it up for "five fiddy" and I bet you get a few messages within an hour. I think $600 is a fair price for both seller and buyer, as a new 7800 XT goes for $729-799.

          • @Kevin Bacon: It looks like I'll be sticking with my 7800XT. When I came back, Scorptec had increased the price by another $50.

    • what do you do with a 1080ti at this point?

  • from RX 5700 XT to this? right move or should I wait more ?

    • +5

      This will be a huge uplift from that card. Always the same response though: if you're happy with the performance and the games you play, then you can wait for a better deal. This is only a deal because it's in stock and not $1800; otherwise it's not a true "bargain" lol

    • +2

      I've got an RX 6700 XT, which is ~30% faster than the 5700 XT, and just bought a 9070 XT, which is ~80% faster again. So you're looking at more than double the fps.
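
      The two uplifts compound, which is why it lands at "more than double"; a quick sketch using the rough percentages above (not benchmark data):

      # Compounding the two rough uplifts: 5700 XT -> 6700 XT -> 9070 XT
      uplift_6700xt = 1.30  # ~30% faster than a 5700 XT (rough figure from above)
      uplift_9070xt = 1.80  # ~80% faster than a 6700 XT (rough figure from above)

      print(f"{uplift_6700xt * uplift_9070xt:.2f}x a 5700 XT")  # ~2.34x, more than double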

  • Just wanted to say thanks for limiting this to one per household (not sure whether that came from AMD or is your own policy).

    • Unless there are two gamer boys in the one family. Or maybe one in the granny haus.

  • -1

    lol is this not a dupe of my post….

    • -2

      Mine got taken down so enjoy this being taken down. Dumb OB

      • +2

        They unpublished mine until I pointed out it was for the 3 fan version. Then they republished it.

  • -2

    buy or hold?

    • +2

      Should've bought on Black Friday, when you could have got a 4070 Ti Super for the same price.

    • +3

      This is MSRP - the 2 fan model will be the cheapest option until there is an official price drop. Hold all you like; it just won't be any cheaper anytime soon.

      • +1

        ^ this is probably true

        Rumours are that AMD is subsidising some launch models to be close to MSRP. These are only rumours, and AMD's official statement:

        “It is inaccurate that $549/$599 MSRP is launch-only pricing. We expect cards to be available from multiple vendors at $549/$599 (excluding region specific tariffs and/or taxes) based on the work we have done with our AIB partners, and more are coming. At the same time, the AIBs have different premium configurations at higher price points and those will also continue.”

        -Frank Azor

        suggests that the rumours of MSRP models running out will NOT happen. Who knows tho

  • -2

    Some non-MSRP models of the 9070 non-XT have just dropped in price. Stop posting and buying these 9070 XTs and we'll all get cheaper cards.

    • +7

      Good. The 9070 non-XT was stupidly priced too close to the XT and barely selling compared to it. It should be 15% cheaper.

    • +3

      That's because the price difference is crazy. If you could get a 9070 non-XT for $900, it would make sense. The 9070 XT is a much stronger card across the board.

      • 12% isn't "much stronger"
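
        Whether the XT is worth it comes down to performance per dollar; here's a minimal sketch using the thread's own numbers (the ~12% gap claimed above and the hypothetical $900 non-XT price, neither of which is official):

        # Perf-per-dollar, with the 9070 non-XT normalised to 1.0
        xt_price, xt_perf = 1199, 1.12        # this deal; ~12% faster per the claim above
        nonxt_price, nonxt_perf = 900, 1.00   # hypothetical non-XT price from the thread

        print(f"9070 XT: {xt_perf / xt_price * 1000:.2f} perf per $1000")       # ~0.93
        print(f"9070:    {nonxt_perf / nonxt_price * 1000:.2f} perf per $1000") # ~1.11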

  • Lol, a 0% BNPL surcharge while surcharging cards is insane.

    • +1

      Depends on the BNPL fees they charge the retailer; they may have waived it because BNPL providers generally charge the retailer more than the credit card companies do in terms of fees…

  • Does anyone know if they will ship out the stock after purchasing or is this a preorder?

    • +3

      It's illegal for it to be a pre-order without advertising it as such; they must have confirmed stock at a minimum.

  • +3

    I ordered an ASRock 9070 XT on 19 March and it still hasn't been dispatched. Scorptec sucks.

    • What status does it show in the order portal?

  • An almost perfect GPU, but too bad it's 3mm too long for my SFF build (Dan A4-H2O). I really appreciate clean and minimal GPUs that rival stock designs, such as Founders Edition cards from Nvidia or AMD's previous-gen reference Radeons.

  • -1

    AMD pricing is so good, sucks their encoder is still nowhere close to NVIDIA's.

    • "With RDNA 4, AMD claims encoding quality is significantly improved, and the examples they showcased were certainly attention-grabbing. AMD is specifically highlighting 1080p H. 264 and HEVC at 6 megabits per second – one of the most commonly used setups – demonstrating a substantial increase in visual quality." - TechSpot

      • -5

        Yeah, I got a 9070 XT and returned it the same day after testing AMD's claim, and paid more for a 5070 Ti. It's all marketing, sadly. NVENC is still way ahead.

        The same thing happened with the 6900 XT: I bought it on release, the encoder was still lacklustre, so I returned it the same day and paid more for a 3090.

        • So you went from a 6900XT -> 3090 -> 4070 Ti -> 9070 XT -> 5070 Ti?

          Quit bullshitting… It's been independently tested, and the quality is pretty much indistinguishable outside of metrics.
