ASRock Challenger OC Arc A770 16GB GDDR6 Graphics Card $429 Delivered ($0 SYD/ADL/VIC C&C) + Surcharge @ Centre Com

Near all-time low for the best budget ML/AI card.
Whilst Intel have been struggling on the CPU side, their driver team is on fire and released full PyTorch support across Windows and Linux last week (quick PyTorch sketch below the specs).
For those after a white card, two white Sparkle A770s from Amazon UK via AU are available for the same price.

A770 CL 16GO

Boost: 2150MHz, 16GB GDDR6 (17500MHz), PCI-E 4.0 x16, 1x HDMI 2.1, 3x DisplayPort 2.0 w/ UHBR 10, Metal Backplate, Dual Striped Axial Fans, 2.4 Slot, 271mm
225W TDP, 2x 8-Pin, 650W PSU recommended
3 Year/s Warranty

Surcharges: 1.2% Card & PayPal, 2% AmEx

This is part of Black Friday / Cyber Monday deals for 2024
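
A minimal sketch of what that PyTorch support looks like in practice, assuming a PyTorch build with native Intel GPU (XPU) support (roughly 2.5+, installed per Intel's instructions; the exact version and install steps aren't part of this post):

```python
# Minimal smoke test for PyTorch on an Arc card via the "xpu" device.
# Assumes a PyTorch build with Intel GPU (XPU) support, e.g. PyTorch 2.5+.
import torch

device = "xpu" if torch.xpu.is_available() else "cpu"
print(f"Running on: {device}")

# Small matmul to confirm the card is actually doing the work
x = torch.randn(4096, 4096, device=device)
w = torch.randn(4096, 4096, device=device)
y = (x @ w).mean()
print(y.item())
```

If this prints "xpu" and a number, the 16GB of VRAM is usable from standard PyTorch code without vendor-specific extensions.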

Related Stores

Centre Com

Comments

  • Thanks OP. Does anyone have any idea if there will be a next generation of Arc discrete cards? I wouldn’t mind going Arc for my Unraid server for transcoding videos.

    • I thought the next gen was supposed to be released in 2024, so maybe that's why they're discounting these.

      • +6

        It has been delayed, according to rumours, and won't be out until next year. This is the same price as 4 months ago too; more likely they're clearing stock before nVidia and AMD launch, because these will be worthless then.

        We're already seeing the 4060, which is the better gaming GPU, dropping to $400 as nVidia gears up for release.

        • When is nvidia releasing their next gpus?

          • +1

            @bboybra: Jan-Mar is the rumour so far (5080/90 in Jan, 5070/ti in Feb, 5060/ti in March). AMD has been rumoured to have pushed back their next gen release to 2025 as well because they have excess stock of the 7900 they want to clear first rather than beat nVidia to market.

            All in all it might be good for gamers for once. Excess stock of older cards, and all three releasing new cards heavily focused on the midrange market at the same time; I'm looking forward to some reasonably priced cards not long after release.

            That said, the 4060 is already below $400. I expect the 5060 will cost a lot more. I hope AMD really pulls a rabbit out of the hat this generation, otherwise nVidia will get to dictate pricing.

            • @freefall101: dang can't wait!

              I agree, wish there was more competition.

            • -4

              @freefall101: NVIDIA hasn't been stylised as nVidia for a very long time

              • +10

                @ldd-mn: True, but I'm old and refuse to keep up with the times.

                Now get off my lawn.

              • +2

                @ldd-mn: Who cares.

                • -1

                  @Boodek: I don't know why I was downvoted, I was just saying in case they didn't know? I don't care what they use.

        • There are times when the extra VRAM in the A770 will give better performance than the 4060. Check out the GN review charts.

    • Next gen is coming, but it's looking like it will be the last generation of Arc cards, as Intel is struggling to sell any and to stay afloat after their recent shortcomings.

      There is an interview with an Intel engineer on Moore's Law Is Dead's YT channel that goes over the issues with Arc and its failure to really penetrate the market. Second gen is being scaled down and will only be entry-level to mid-range; it's going to be more expensive because it's on a more advanced node, and it will be competing with the 4060 etc.

      Intel has liquidated its internal GPU dev team and either sacked staff or reassigned them to other business units.

      Next gen AMD/Nvidia cards will be significantly faster, so Arc Battlemage is unlikely to sell well and should be the last generation.

  • -1

    Goddammit. I just got a 4060 Ti 16GB to give me extra VRAM for AI LLM generation. It works beside my 4070 Ti.

    • +4

      Why 'goddammit'? That's way better than this card. You can use exl2 with them.

      • Can you not with the Intel? Well, in that case, good. I find my EXL2 models much better than the others.
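
Re the exl2 suggestion above: a rough sketch of ExLlamaV2 loading an EXL2 quant across whatever NVIDIA GPUs are present, loosely based on exllamav2's bundled examples; the model directory is hypothetical and class/argument names can shift between versions:

```python
# Rough exl2 (ExLlamaV2) sketch based on the project's example scripts.
# Model directory is hypothetical; adjust for your own EXL2 quant.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "/models/My-Model-exl2-6.0bpw"   # hypothetical path

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)     # lazy cache so autosplit can place layers
model.load_autosplit(cache)                  # spreads the model across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

print(generator.generate_simple("Hello, my name is", settings, 64))
```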

  • Hmm pretty sure this card is slower than a 4060.

    • +1

      Also with dodgier drivers

      • +11

        They can't be dodgier than Couriers Please drivers?

        • +15

          I see your Couriers Please and raise you Aramex.

  • +1

    Perfect card for video editing but nothing else

    • +1

      So it checks out for OnlyFans?

    • +1

      This is harsh; HUB did some extensive testing and found it much improved for games.

    • Even then, better just get the A310 if you don’t intend on gaming.

    • +1

      This - QuickSync, 10-bit encoding/decoding for 4:2:2. I'm still waiting for Battlemage to upgrade my cheap A380.
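
On the QuickSync point above: a small sketch of a hardware transcode through ffmpeg's QSV decoder/encoder, wrapped in Python; the file names are placeholders, and whether 4:2:2 10-bit sources are handled in hardware depends on your ffmpeg build and driver:

```python
# Sketch: hardware HEVC transcode through Intel QuickSync (QSV) via ffmpeg.
# Paths are placeholders; 4:2:2 10-bit handling depends on ffmpeg/driver versions.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",            # decode on the Arc's media engine
    "-i", "input_422_10bit.mov",  # placeholder source clip
    "-c:v", "hevc_qsv",           # encode with the QSV HEVC encoder
    "-global_quality", "23",
    "output.mp4",
]
subprocess.run(cmd, check=True)
```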

  • +1

    But the 4060 is even cheaper and better. What is the point of buying this card??

    • Agree… I've found it will be more stable when doing AI video upscaling. I ran the A750 for the last few years and it would randomly throw an error after hours of processing.

  • -2

    Do not trust intel on graphic card. I used to buy intel 740 on agp until they are obsolete. The company doesn't do well and may give up graphic card. It is better to buy amd or Nvidia graphic card.

    • +2

      Nah, they'll never give up. The future of chips is GPUs, but they should be more generic and fully integrated into the CPU (think Apple SoC). I'm a generic engineer, but I can say with good certainty that, outside of some revolutionary chip (e.g. a quantum chip), this is the direction to go.

    • -3

      "I used to buy intel 740 on agp until they are obsolete." Gotta love broken English.

    • +3

      While I'm all for judging a company based on their past actions…. The Intel 740 was released in 1998… And AGP has not existed for decades…

      • +3

        I disagree. I'm not buying a router ever again. Look what happened to 28.8Kbps modems!

  • Tempting! I used to use the A750 for Topaz AI but kept getting random errors with video processing after hours of crunching. None since switching to an RTX 3060 Ti. So YMMV.

  • Wow I can't believe Sparkle are still around, I remember buying them like 20+ years ago.

  • Does it support older games yet? I remember it had issues at launch

  • +1

    Found a use for these: they are actually not bad for AI stuff if you're running it locally, e.g. using software like Facefusion etc. It usually flatlines almost any CPU or GPU when processing stuff :D and the more RAM you throw at it the better.
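
On running AI locally: Facefusion and similar tools generally run on ONNX Runtime, so one quick sanity check is listing which execution providers your install exposes; which ones appear depends on the onnxruntime package variant you have, and this is only a sketch:

```python
# Quick check of which ONNX Runtime execution providers are available locally.
# Which providers appear depends on the onnxruntime package variant installed
# (e.g. onnxruntime-openvino or onnxruntime-directml for Intel GPUs).
import onnxruntime as ort

print("Available providers:", ort.get_available_providers())
print("Default device:", ort.get_device())
```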
