• out of stock

[Pre Order] Intel Arc B580 LE 12GB GDDR6 Graphics Card $439 Delivered ($0 MEL C&C) @ PC Case Gear

2710

Well well well
Even though he was pushed out, shout out to Pat Gelsinger for doing what cousins Huang and Su refused to do - provide real competition in the GPU market
US price is $249 USD so $439 AUD including GST is a straight conversion with PCCG including free delivery + no surcharge

Battlemage B580 is an ideal 1080p high/1440p mid card, 10% faster than the 4060 with 4GB extra VRAM, XeSS 2 upscaling and VVC/H.266 decoding
AMD's low-end RDNA 3 7000 series cards - lacking matrix/AI cores, proper upscaling/RT, ROCm support etc. - can now be seen as the e-waste they truly are
Reviews here

31P06HB0BA

Boost: 2670MHz, 12GB GDDR6 (19000MHz), PCI-E 4.0 x8, 1x HDMI 2.1, 3x DisplayPort 2.1, Dual Fans w/ Passthrough, 2 Slot, 272mm
190W TDP, 1x 8-Pin, 600W PSU minimum
3 Year/s Warranty

Related Stores

PC Case Gear

Comments

    • +32

      The testing I've seen shows it somewhere between the 4060 and 4060 Ti in performance… although that was a single source and I haven't looked further.

      • +3

        Gamers Nexus?

        Those guys are awesome.

      • +16

        I've watched about 5+ reviews… Aside from the all-over-the-place LTT one (which hyped this card up way beyond what it is), HUB, GN, J2C and David Owen have it in that ballpark depending on the game. All had some driver issues during review, which Intel fixed straight away (except Indiana Jones for David Owen). Digital Foundry also reviewed it; I'm not super sure of their review methodology as it's based on automated benchmarks, but it seemed pretty fair and they also had similar results.

      • +8

        Which is where the 3060 Ti also performs? I'm not sure why JTayzer got downvoted for this.

        • +20

          Username does not check out.

          • +1

            @shreav: Agreed. Vote revoked for that reason.

      • Thanks for the info

    • +11

      yes you get 4GB VRAM but you're still bandwidth limited

      Out of all the competing cards this has by far the highest bandwidth, so why do you need so much? The real issue is that this card was released before AMD and Nvidia release their competing cards; they're very likely going to have better options in a month or two if you can wait.

      • +14

        Except the rumours are that the 5060 is still going to have 8GB

        • -4

          They're also going to be dabbling in advanced texture compression methods, so if that's relatively easy to adopt - along with a bunch of other tech that devs have been twiddling their thumbs on - then it might not matter. VRAM bandwidth will also at least match this despite a 128-bit bus.

      • -3

        The 3060 Ti had nigh-identical bandwidth. I'm just saying, if you listen to the reviewers, they all say Intel is late to the party and that ideally the card should have been 10% lower in price. We shouldn't lower our standards just because of what NVIDIA and AMD did last gen. The card is good in this market, but why can't we expect better (i.e. an improvement in raster after 4 years, or a lower price for the same raster)?

        • +22

          Personally, I'll entertain anyone who helps break things up. Competition is key to everything

      • +5

        Nvidia definitely won't be launching a 60-series card in the next 1-2 months. Recently it has been as long as 9-12 months between the top-end 90 series and the mid-range offerings. Some rumours suggest this gen is going to accelerate that (probably to try to get ahead of tariffs), but the most aggressive rumour I've seen claimed the 60 series would land in March. And that's just a rumour.

      • +2

        Nvidia is going to be releasing the 5090 in a few months, not the 5060. Going to be a long time before anything comes that could possibly compete with this at the sub $500 price point

      • +2

        The 5080 will be released in two months or so, but the 5060 is probably about 6 months away.

        If you want to build an entry-level rig using a new GPU before May/June, the B580 is the best option bar any big price cuts by the other two.

    • +28

      I see where you’re coming from - it’s frustrating having an older card that’s just not worth upgrading, 4 years later.

      I’d like to upgrade my rtx2070 super but even a 4060 is only a 5-10% increase

      Both amd and nvidia are taking us for a ride, and intel seems to be the only one putting real pressure on price to performance

    • +14

      The 3060 Ti is directly between those two cards (4060 and 4060 Ti). Interesting that I'm getting significantly downvoted for stating the actual fact that the card has the same performance as the 3060 Ti. Yes, it's newer and will see more support over its lifespan, but you can get 3060 Tis for well under 400. 249 USD converts to 390 AUD so there's some level of import inflation above the usual GST. Didn't say the card is bad by any means. The fact however remains that this isn't Intel's first or second attempt at dGPUs, so calling it a "good start" is by no means true. The start of Alchemist was miserable due to poor drivers, so really Battlemage is Intel learning the lesson. The hope is they can compete in the high end eventually. The issue is they're competing with NVIDIA and AMD's entry-to-midrange offerings from FOUR years ago. This is what Alchemist SHOULD have been. Idk why I'm getting jumped.

      • +6

        3060 Tis for well under 400

        Maybe down to like $320 second hand on ebay if you are lucky. Obviously no warranty or support for the new stuff that Intel has.

        Intel have released the best price-to-performance card on the market. Mostly because Nvidia and AMD are ripping people off. If Intel left the market, people would probably end up paying $1000 for an entry level GPU in a few years.

        • -6

          Intel HAS to compete (or their GPU R&D would be for naught). We shouldn't hype it up because they are doing what they need to do to break into the market and survive.

          • +8

            @JTayzer: I don't think people are hyping it up for the love of Intel, people are hyping it up because it is value in a drought of value.

            • -2

              @Aureus: It is a return of the value we once treasured, and people are getting nostalgic. Intel is still a corporation, and this is the move they needed to make to join the market without completely killing off their GPU division. Props to them for being able to catch up - albeit closer to previous-gen offerings. Once AMD and NVIDIA's next gen releases, the future of Intel dGPUs is a bit up in the air. I'm hoping this will incentivise AMD and NVIDIA to lower their prices, but I have no clue how they'll react.

          • +8

            @JTayzer:

            We shouldn't hype it up because they are doing what they need to do to break into the market and survive.

            People are hyping it up because it's the first competitive budget GPU we've seen in almost a decade.

            Competition significantly benefits all consumers, so it only makes sense to encourage disruption of a duopoly.

          • +2

            @JTayzer: Intel doesn't have to do anything. Just because you spend money and time on something doesn't mean you need it to work. Also, they're not even a GPU maker - it's not their core product at all.

            They could do a Google and kill it. Or just do basic work and realize it's not an affordable market for them to enter. You'll waste a lot more money trying to develop and release an unsuccessful product than doing initial work and killing it before it hits market.

            So yes, Intel deserves some praise for taking the risk and trying to enter the market.

            • -2

              @DingoBilly: The company is in dire straits, it's not really comparable to Google given the financial situation of both companies. Intel has seen year on year losses (no profits whatsoever) so they absolutely need every single division to succeed to recoup as much money they've spent on R&D as they can.

              If they really did nothing after investing that much, as you said, they may as well declare bankruptcy and call it quits. AMD has outcompeted Intel in the CPU market (desktop + data center, mixed bag with mobile), so that simply cannot save them. They chose to enter the GPU market because they want a piece of the pie; it was a calculated risk they took. They are not doing it for the benefit of anyone except Intel. Yes, they could have pulled out, but if you've seen their net income, that was genuinely not an option. Why spend that much on the GPU division only to shutter it completely, making not a dime off it?

              • +1

                @JTayzer:

                Intel has seen year on year losses (no profits whatsoever)

                Hmm… their profit has declined, but nowhere near "losses". Fact check: in 2023 they made US$1,689M, a lot less than competitors, but no loss yet.

                2024 might be a hard hit and the first time in the past 14 years they actually make a loss, mostly due to the roughly $12B investment in graphics cards, much of it purposely stated that way for tax reasons.

                Since day dot Intel's graphics cards have been the bare minimum. After years of being a sore loser they've finally picked up the pace, and they did well this time, so they deserve some love here. At the end of the day they give you a choice, just like the old Cyrix CPUs: the obvious underdog, but people bought them.

                • +1

                  @dlovep: That was a profit in the December 2023 quarter, which roughly offsets the losses in the earlier quarters. The September 2024 quarter was a $16 billion loss; there's no real way to argue the company is doing great at this moment given the trajectory. They rested on their laurels for far too long.

                  Just because they've picked up the pace doesn't mean we should give them overwhelming support - it's only a good job relative to the terrible job competitors AND Intel themselves did last time. Overall, it's a decent card at a decent price in the current entry-to-midrange lineup, but is that enough for Intel to really catch up in market share? It's not an overwhelmingly amazing card at a bargain price. This post has the upvotes, but is anyone buying it? Only time will tell when we see future Steam hardware surveys. If they had done even better (better pricing) then I'm sure they could actually take some market share from AMD + NVIDIA.

        • +1

          I saw a used 4060 Ti for sale for $350 on FB the other day.

      • +5

        249 USD converts to 390 AUD so there's some level of import inflation above the usual GST

        Huh? 390 + 10% GST = $429, and this card is $439 - that's barely any import inflation, only about 2% of the card price. People probably pay that much in credit card fees often enough on this item; it really doesn't seem worth complaining about.
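        For anyone wanting to check the figures: a quick sketch of the conversion math in Python, assuming an exchange rate of ~1.566 AUD/USD (illustrative only; the real rate moves daily):

```python
# Rough sanity check of the USD -> AUD pricing discussed in this thread.
USD_PRICE = 249
AUD_PER_USD = 1.566    # assumed exchange rate, for illustration only
GST = 0.10             # Australia's 10% Goods and Services Tax

pre_gst = USD_PRICE * AUD_PER_USD      # ~390 AUD before GST
with_gst = pre_gst * (1 + GST)         # ~429 AUD including GST
markup = (439 - with_gst) / with_gst   # PCCG price over a straight conversion

print(round(pre_gst), round(with_gst), f"{markup:.1%}")  # 390 429 2.3%
```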

        • +1

          Bad math on my part. My main point is, as reviewers have said (Steve from HWUB etc.), if the card had launched at ~220 USD it would have been a gamechanger for Intel. But right now it's more that Intel has caught up and is actually competing (instead of flopping). They NEEDED to do this or they simply wouldn't survive in the GPU market. If they had launched another Alchemist it would have been the final nail in the coffin.

      • +3

        Haven't negged you

        It's almost 2025 - brand new 8GB dGPUs are the equivalent of 8GB MacBooks, 8GB PC laptops and 4GB Android devices - unless they are sub-$300, they are e-waste

        Current gen consoles already have 14GB VRAM so 12GB on PC is already borderline

        1GB of GDDR6 costs $12 AUD - if Jacket Man took $50 off his margins to offer a $500 4060 12GB, it would be passable

        Instead, he offers the fake "4070 Super 12GB" - an actual 4060 Ti 12GB in disguise - for $700+

        AMD tried to follow suit but low and mid-range RDNA 3 flopped and pricing was NVIDIA-lite

        AMD are now the ones most concerned, since the RDNA 4 8800 XT now needs to over-deliver on raster, RT, AI, FSR 4 upscaling and price to match up with the B770
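        Sketching out that arithmetic (note the $12/GB GDDR6 figure is the comment's own estimate, not a verified spot price):

```python
# Back-of-envelope check of the "extra 4GB of GDDR6 for ~$50" idea above.
GDDR6_AUD_PER_GB = 12    # assumed price per GB, as quoted in the comment
extra_gb = 12 - 8        # hypothetically taking a 4060 from 8GB to 12GB
extra_cost = extra_gb * GDDR6_AUD_PER_GB
print(extra_cost)        # 48 -> roughly the ~$50 margin cut suggested
```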

        • +2

          I agree that 8GB dGPUs should only be absolute entry level products, and I'm happy there's 12GB in the B580.

          I'm saying Intel is bringing back what we used to have two generations ago - e.g. the uplift from the 5700 XT to the 6700 XT (35%) is roughly in the ballpark of the 41% from the A580 to the B580. It's not a Pascal-level increase (70+% from the 960 to the 1060). Last generation destroyed our expectations, but we need to keep the pressure up, whether it's NVIDIA, AMD, or Intel. If this incentivises AMD and NVIDIA to do better, then I completely agree that it's a good thing. We need to make sure Intel doesn't drop the ball though. Gamers Nexus's review's selling point was basically "this isn't Alchemist" and "it seriously competes with the 4060 and 7600". HWUB would rather the card had launched for ~220 USD. That's way more measured than some of the hype.

          AMD suffered with MCM and have gone back to monolithic next gen, so we'll have to see how that pans out for them. I don't really follow GPU rumoured performance so 8800XT vs B770 is completely up in the air in my opinion.

          • +1

            @JTayzer: All 3 are currently fabbing GPU's at TSMC which is where the real monopoly is

            Maybe gamers should ask Taiwan to make gaming cheap again?

            TSMC are profiteering now to over invest into R&D before those new Intel fabs come online

            So Intel can help with market pricing - but only up to a certain point

            • +3

              @Look Up: Good point on the node being TSMC manufactured, that would certainly limit the pricing.

              I'd say the primary issue is the fact that AMD and NVIDIA are now completely free from needing revenue from the gaming market - they're absolutely swimming in data center profits (AMD CPU + some GPU, NVIDIA ARM CPU + GPU). If Intel could make a competitive data center GPU and outsell AMD and NVIDIA they would - see Intel data center CPU dominance for decades (slowly shifting to AMD EPYC as they failed to compete effectively). They're not exactly creating GPUs for the benefit of gamers, but to break into what has been an untapped market for Intel for so many years. I want them to keep up the pressure.

              In terms of catching up in fabs, I don't think there's any way forward on that path unfortunately. Unless TSMC wants to share their knowledge of building and optimising fabs, there's just no reasonable way for competing fabs to catch up. But the fact that Intel chose to buy from them means they have no qualms with supporting the monopoly if it means their GPUs can compete. Gelsinger made a huge mistake in losing the wafer discount they had with TSMC (funnily enough, after saying they shouldn't put all their eggs in the Taiwan fab basket), so I wonder what they can do now to compete effectively.

        • -1

          You can't make a 12GB card on a 128-bit bus with GDDR6/6X. The top tier consoles use between 10.5 GB and 14 GB to deliver 1440p to 4K graphics, all usually poorly upscaled.

          Improving compression techniques and existing underutilised asset streaming techniques - the latter of which the PS5 uses a form of - would make 8GB relevant at 1080p and 1440p again for ray traced graphics, but devs refused to do the work at this point, in no small part thanks to Microsoft taking forever to get a working version of DirectStorage in place.

          I'm expecting neural compression, DirectStorage/Sampler Feedback, and neural radiance caches to make a big dent in the problem, enough to make at least 1080p work consistently with high AAA RT settings at 8GB VRAM.

      • +1

        Where are you getting 3060Ti for under $400? I can only find it second hand in that price range.

        • +2

          Yes, I was referring to the second-hand market - it's a 4-year-old card at this point, as I said. In terms of new, see recent 4060 deals for well under $400 and historic 6700 XTs for $440 (though the supply of those has dried up).

          I wouldn't buy a 3060 Ti new in 2024, that's the reality. Now if I had exactly $439, would I buy the B580? I'd look at the games I play, because the drivers are still immature - hell, the game-to-game variance is very high. E.g. in Baldur's Gate 3 at 1440p, the B580 underperforms the 3060 Ti, 7600 and A770 (Alchemist beating Battlemage lol) - and wins over the 4060 by a hair (0.3 FPS, with 10 FPS lower 1% and 0.1% lows). See Gamers Nexus and other reviews for detailed numbers.

          If I had had $439 a few months ago I would have bought the 6700 XT, which this is on par with. The B580 does have better upscaling and ray tracing, but again that's really on a per-game basis and depends on how much you value upscaling/RT. Intel has matched AMD + NVIDIA in raster from two gens ago.

          Tldr; the wheel hasn't been reinvented. The second-hand market is always an option, but of course it comes with risks.

      • +3

        They don't need to compete in the high end market.

        The budget market is completely borked at the moment; both Nvidia and AMD have abandoned that sector, so it makes sense for Intel to try and take it

      • -1

        Let's just ignore that a 4060 is $430-530 AUD, a 4060 Ti is $580-$700 (for 8GB), and a new 3060 (non-Ti, 12GB) is $419-$439 (on PC Case Gear). Feels appropriately priced until early next year when the competition starts.

        Comparisons should be apples to apples; a 6900 XT can be had for sub-$400 second hand, so what?

        Only time will tell how well games adopt XeSS 2, since all this fake-frame upscaling is what works for Nvidia.

        • +4

          6900XT can be had for sub $400 second hand

          Where?

          • +1

            @Yumi: SOLD… LOL

          • @Yumi: Ebay according to completed/sold listings.

            Edit: I'm stupid and was on ebay.com not .com.au.

    • +1

      This guy fuc-c-c-c-ckin stutters in modern games.

      • -2

        If you've watched the GN vid, the B580 can stutter much more than the 3060 Ti in modern and legacy titles - it's extremely game-dependent due to the immature drivers. The 8GB of VRAM is limiting the 3060 Ti for sure, but having 12GB in the B580 doesn't automatically make it a better card, as there are other factors to consider. As Steve said, it trades blows with competitors but isn't overwhelmingly superior - it "poses considerable value in some situations".

  • +4

    HUB gives it the green light :)

    • +19

      HUB gave it the green light because it was cheaper and faster than the 4060.
      At $439 it's basically the same price as the 4060, which makes it a harder sell.

      At this price, I'd still buy the 4060 for $449, so I don't need to worry about potential driver issues (though Intel has done really well and come a long way with this), and for the superior RT/DLSS frame-gen tech on the 4060, even though it has lower base average performance and only 8GB of VRAM.

      It's great that Intel has provided some real competition at the low end, but it needs to be $399 or less imo.

      • It comes down to who you hate the least, Intel or Nvidia lol. As much as Nvidia's pricing sucks, they do push the performance standards higher every generation.

      • +15

        They pointed to 12GB of VRAM as a selling point… 8GB is not good enough

        • +11

          This definitely puts the B580 well above the 4060 in my books. VRAM is such a hard limit in some games. If you don't care about ray tracing, it's a little faster and far more future-proof.

          But yeah, Australia seems to have cheaper 4060s than anywhere else, making it a harder pick

      • agreed

        some budget 4060 cards are around $400 on special

        this needed to be $400 delivered to be an instant go-to

      • +1

        This card has 12GB of VRAM, beats the 4060 in almost everything and is on par with the 4060 Ti, yet brother wants it at $399? Yeah nah. You keep paying Nvidia more for your 8GB of VRAM

    • HUB giving the green light is not the be-all and end-all, nor a bible for deciding on a graphics card. I am all for competition, but with a mere ~10% performance uplift over the RX 7600 and RTX 4060, at this current pricing it's not very appealing.

      https://www.ozbargain.com.au/node/880718
      An RTX 4060 at this price would have been a better buy. The RX 7600 can be had for $399 as well.

      And then there's the 8GB vs 12GB argument. This card will struggle in future games at 1440p, which is where the 12GB of VRAM would be useful anyway.

      But good on intel to bring more options to budget gamers.

  • +38

    Any pressure on AMD and Nvidia to stop bleeding consumers dry is a good thing.

  • +9

    This is a great start. If next gen they can take the fight to the x070 level at a reasonable price, it would be wonderful.

  • This or the OC version?

    • +1

      No difference between the Titan and this version. Although they have TDPs of 190W (or 200W in the case of the Titan), actual power draw hovers around 130W, so… pointless.

  • +2

    Good start; for desktop this is OK. But I would love to see Intel bring in something slightly more powerful (4070-4070 Ti range) for laptops, because the laptop market is currently almost all Nvidia.

    • It'll be interesting to see discrete Intel on laptops

      • We already have it with Arc. The popular Intel NUC 15 (sold by BPC as the Kraken laptop) had an Arc GPU about equivalent to the 3060, so I'd expect Battlemage to hit the NUC laptops at some point.

        • The NUC brand has been sold to Asus now and I don't believe they'll be producing laptops in the same vein anymore.

          Although that doesn't rule out Arc Battlemage coming in the form of laptop dGPUs.

      • It already exists; Lenovo is selling the LOQ with an Arc A530M for $1099 right now

  • This or the 6600xt which can be bought for about the same price?

    • +4

      The Arc B580 without a doubt.

      Before going into anything else (the B580 is the stronger card), the 8GB frame buffer of the 6600 XT is barely enough today and won't be enough tomorrow.

    • +8
    • +1

      A 3060 12GB FHR is $360 from a 0-feedback seller…

    • +1

      6600xt should be much cheaper at this point.

  • LTT review gives it a great rap - https://www.youtube.com/watch?v=dboPZUcTAW4

    • +10

      Do yourself a favour and watch Hardware Unboxed and GN for detailed reviews, even if you just skip to the conclusions (assuming LTT is the only review you watched). LTT is the last channel I'd look to for reviews… I even unsubscribed, as it's no more than an entertainment channel peddling merch.

      • +3

        Who likes to be entertained?

        • +3

          I assume most of the people that watch LTT… Just not my thing, and not the channel I'd go to for a detailed review of tech. Once the WAN Show became an infomercial for their stuff I clicked that unsub button, but each to their own. 🙂

          • -4

            @scud70: Yup, I feel the same. LTT is a joke vs GN and HU in terms of reviews. I love watching Linus as he feels like a friend at this point, but after all these years I think he's happy to just be an entertainer and commentator rather than a source for detailed reviews. Also I can't stand J2C - how does that man have so many subs and people that watch him for info? He takes 20 years talking about pointless stuff and barely gives you any info on what you came for.

            • -3

              @Zylam Marex: Hahaha, J2C has started to kinda try, and GN is going to help him set up his testing methodology, but he'll still be hit and miss… I watched him for water-cooling builds and nice mods (where he built most of his following), but he's stopped doing any of that, so it's mostly generic news and updates now. He's still less frustrating than LTT, who are milking the shit out of their 12+ mil followers with their infomercial videos… Lol, good on him though, he's on that gravy train now.

      • +1

        Hardware Unboxed can keep their clickbait titles ("RIP Intel", "RIP AMD") - maybe that appeals to you and the fanboy wars you wage though

        • -1

          You can thank YT for that one, mate… Nothing to do with what I click on. You do you and enjoy your LTT merch.

          • +1

            @scud70: No, I don't want to thank YouTube for that; stop making excuses, there are plenty who don't do it.

            A couple of years ago he made a budget build which included Intel, and he was complaining about why his comment section was so toxic about him not using AMD.

            He fosters that sort of community; maybe you enjoy this sort of reddit-like environment.

            Just because he's Aussie I'm not going to support this reddit-like behaviour - it's weird. Buy what makes sense.

  • +2

    Good to see competition

  • +7

    12gb is the secret sauce here

    • +1

      12 herbs and spices

  • +1

    Battlemage B580 is an ideal 1080p high/1440p mid card, 10% faster than the 4060 with 4GB extra VRAM, XeSS 2 upscaling and VVC/H.266 decoding
    AMD's low-end RDNA 3 7000 series cards - lacking matrix/AI cores, proper upscaling/RT, ROCm support etc. - can now be seen as the e-waste they truly are

    What is the 6900XT that I have lacking as far as these extras go?

    Great to see some competition from Intel. We really need this as graphics card prices have gotten out of hand. I remember when you could get a great mid-range/lower top end card for about $350-500 a few years ago.

    • Mostly just the RT cores, the upscaling/AI tech and abilities, and narrower codec support. Your 6900 XT is still more than enough, and performance-wise the Intel card would be a downgrade.
      Correct me if I'm wrong!

      • Yes, performance-wise the 6900 XT would still easily win between the two, I'd assume, but I was just curious what features it's missing, since I assume the newer cards can do more things, and I wanted to know if it's lacking a lot in regard to those features. I'm not even sure if my card does AV1 decoding/encoding.

        • Yeah, the Intel card will have the better upscaling tech and AI performance, like XeSS 2. I haven't had a look into that, but I'd assume it's pretty good in the games that support it; not sure how it competes with DLSS and FSR. I don't think you're missing out on too much though.
          From some searching, it seems your card has AV1 decoding but not encoding.

          • @bolognesebag: Yep, AV1 encoding only started showing up from the AMD 7000 and NVIDIA 4000 series, iirc.

            • @Grish: Didn't know this. I thought the 6900 XT could do it, but as @bolognesebag pointed out, it only does the decoding.

          • @bolognesebag: Ahh right, thanks. So it's lacking in a few things. Sucks that it doesn't have AV1 encoding. Oh well. It's still OK for games, so no need to upgrade for any of those other features until it's struggling in games, I guess.

  • +2

    Fun fact, Huang and Su are distant cousins…

  • How does this compare to the A770 16gb?

    • The A770 is the flagship from the previous gen so it's going to be a mixed bag.
      I wouldn't upgrade… the faster spec cards are still on their way from Intel.

      • But A770 is cheaper and faster at $399?

        • Sure, that's because it's older and its price has come down.
          You may find the B580 could perform better in raytracing or other areas.

          If the performance is that important, wait for the higher spec cards

        • +4
        • I can't comment on the performance of the new cards. I do own and use an A770 16GB; in the early days the drivers were buggy and weird stuff would occur. Since then the drivers have been frequently updated and have come a long way - it's been rock solid for me. If you can snag one at a competitive price I think they are still great cards to purchase. I am in no hurry to upgrade this generation unless there is a significant uplift in AV1 encoding or flexibility in profiles.
