
Inno3D X3 GeForce RTX 4070 Ti 12GB GDDR6X Graphics Card $1173 + Delivery ($0 C&C) @ Umart

411

Slightly cheaper than the last deal - https://www.ozbargain.com.au/node/765269.

Only appears to be available at their North St Marys and Villawood stores.

Related Stores

Umart

Comments (closed)

  • +23

    Hodl

  • +1

    Is Inno3D a good brand?

    • +50

      I Dunno3D

    • +8

      They've been around for a while. I don't think they have a local base for RMA returns. As always, they're good until they aren't. I'm having a tough time getting a response from them regarding an RMA for a faulty card.

      • Not really an issue when we can deal with the retailer.

    • +1

      I believe they've been peddling Nvidia cards since the 2000s (I recall having bought one).

    • +2

      I've got an Inno3D RTX 3070 iChill X4 and it's been great. No issues, and I've had it overclocked since I purchased it.

    • +1

      I bought a second-hand Inno3D 8800GT back in the late '00s. Lasted me ages.

    • They're as good as any other brand; they've been in the market since the '90s.

  • +37

    I know this may sound crazy, but I don't think 12GB is enough for a card costing this much $$$.

    New games are using a lot of VRAM, even at 1080p / 1440p, and it skyrockets when using ray tracing.

    See the latest HUB (Hardware Unboxed) VRAM usage video: the 16GB 6800 allocates up to 15GB of VRAM in new games to prevent stuttering, and that's not even at 4K btw…

    Yes, I know "allocation" and "usage" aren't the same, but when a GPU can allocate and does allocate, it does so to prevent stuttering and improve 1% lows, which is far more noticeable than, say, a 5-10fps difference in the average fps.

    8GB and 10GB cards are obsolete for new games; 12GB is soon to follow.

    A 3060 12GB (yes, that's right, NON-Ti) beats both a 3070 and a 3080… AT 1080P… purely due to the VRAM buffer (Hogwarts Legacy). Again, HUB is the source; go look at their vids if you don't believe me…

    • Just because the current crop of games treats PC like a second-class citizen doesn't mean this hardware is obsolete.

      • +13

        Not true; 4K assets, even optimised / compressed etc., can eat VRAM for breakfast.

        YouTubers peddle this myth to cover their arses when they get it wrong, after telling you for years that 8GB is enough and to ask no questions, just consooooom and trust their 15-minute benchmark runs.

        • 8GB is approaching not enough. 50% more than that has a long while yet, even at 4K. I don't agree with the price of this product, but 12GB isn't a limiting factor (excluding dogshit-optimised recent titles like TLOU, Forspoken, Hogwarts).

          • +6

            @maybe a bot: HUB just released a video comparing the 3070 8GB and the RX 6800 16GB in 2023, and the verdict seems to be that 8GB is already insufficient. RT is ironically better on AMD now, due to the RTX cards' VRAM choking. 12GB is fine now, but I can see it ageing like milk in a couple of years.
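
            To put rough numbers on the VRAM cost of 4K assets, here's a back-of-envelope sketch (the per-texel sizes are the standard RGBA8 and BC7 rates; the 300-material scene is an illustrative assumption, not a figure from the thread):

            ```python
            # Back-of-envelope VRAM cost of 4K textures (illustrative figures only).

            def texture_mib(size: int, bytes_per_texel: float, with_mips: bool = True) -> float:
                """Approximate VRAM for a square texture; a full mip chain adds ~1/3."""
                base = size * size * bytes_per_texel
                if with_mips:
                    base *= 4 / 3
                return base / 2**20

            uncompressed = texture_mib(4096, 4.0)   # RGBA8: 4 bytes per texel
            bc7 = texture_mib(4096, 1.0)            # BC7: 1 byte per texel

            print(f"4K RGBA8 + mips: {uncompressed:.0f} MiB")   # ~85 MiB
            print(f"4K BC7   + mips: {bc7:.0f} MiB")            # ~21 MiB

            # Even compressed, ~300 unique 4K materials is ~6 GiB of textures
            # alone, before geometry, render targets and RT acceleration
            # structures - which is how 8GB cards run out.
            print(f"300 BC7 textures: {300 * bc7 / 1024:.1f} GiB")
            ```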

        • https://compusemble.com/insights/home/how-sampler-feedback-s…

          https://blog.siggraph.org/2021/04/mesh-shaders-release-the-i…

          These are supposed to combine with DirectStorage to make asset streaming way better, and way more compact on the GPU, while handling enormously detailed assets.

          The problem? Despite DirectStorage being announced back in 2020, then updated to be properly functional with 1.1 last year, somehow all of this tech has been gated behind the current generation of consoles being released. NVIDIA has had the hardware in place since 2018 with Turing, while AMD screwed it up before finally getting it right in RDNA2.

          If the technology were in place now, 8GB of VRAM would genuinely work the way 16GB does today, perhaps better. Sampler feedback alleviates the VRAM demands for textures; mesh shaders reduce them for raster and ray tracing. It should have been ready by now.
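
          For anyone curious how sampler feedback is meant to keep VRAM down, here's a toy model of the residency idea (pure illustration in Python; the real mechanism is the D3D12 sampler feedback API plus DirectStorage, and the 21.3 MiB per-texture figure assumes a 4K BC7 mip chain):

          ```python
          # Toy model of sampler-feedback texture streaming (illustrative only).

          def update_residency(resident: dict[str, int], sampled: dict[str, int]) -> None:
              """sampled maps texture id -> finest mip level the GPU touched last frame.

              Keep only the mips that were actually sampled and evict the rest.
              Lower mip number = finer detail = more memory (each level is 4x the next).
              """
              for tex, finest_mip in sampled.items():
                  resident[tex] = finest_mip      # "stream in" down to this mip
              for tex in list(resident):
                  if tex not in sampled:
                      del resident[tex]           # evict textures nobody sampled

          def resident_cost_mib(resident: dict[str, int], base_mib: float = 21.3) -> float:
              # Each mip level below the finest quarters the cost of a 4K BC7 chain.
              return sum(base_mib / 4**mip for mip in resident.values())

          resident: dict[str, int] = {}
          update_residency(resident, {"rock": 0, "far_cliff": 3, "skybox": 1})
          print(f"{resident_cost_mib(resident):.1f} MiB resident")  # ~27 MiB, not ~64
          ```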

      • +6

        Sorry mate, but did you watch the videos? Nothing about obsolescence; it's about paying good money for a mid/high-end card and getting the performance you paid for, even a few years down the line. Switching down from high to medium or lower on newer titles, when the equivalent card with more VRAM is just fine, is not what a customer who spent $800-1k on a card deserves.

        Devs will always make games for whatever has the bigger user base; consoles are going to be the focus, and 12GB of VRAM is roughly what games will target moving forward. There are games that are not well optimised, but even the games that are show the same signs. You can't say to buy a 3070 Ti over a 6800 because of ray tracing etc. when turning ray tracing on, even with DLSS, tanks the fps.

        I understand your view, but I don't think Nvidia are doing the right thing here; this card should have at least 16GB.

        • +2

          Few here are doing their due diligence before such a large $$$ purchase.

          They want to blindly justify their purchases, on a bargain website mind you, then cry when the next gen comes out at 50%+ the price of the last gen.

          And I didn't even mention the cut-down memory buses yet… some of these people are far more ignorant than I expected.

          I see why these cards are selling well at these absurd prices, and why Nvidia can easily charge this much for them.

          • +2

            @Ahbal: I understand not everyone has tech as their hobby, and it's OK to debate things, but it's not OK to blindly defend Nvidia, or AMD for that matter, especially when they do their best to screw customers over.

            Also, if you haven't, watch Moore's Law Is Dead's videos on the same issue; he also explains why Nvidia is using smaller buses, and it makes sense from a business standpoint. (I'm not a fan of the guy tho lol, find him a little obnoxious.)

            • +5

              @scud70: I'm explaining this whole thing to show people that they ARE screwing customers over.

              Nvidia gets away with it due to customers' ignorance.

              • +1

                @Ahbal: Yep, I get you… We can point things out; it's up to people to take that and look into it… 🙂

                • +1

                  @scud70: That's true. I often forget that you can lead a horse to water… can't force it to drink.

    • +7

      I think there are some factors involved which are exacerbating the VRAM issue (apart from the games being terribly optimised). Namely, reviewers are testing everything on Ultra, and seemingly without leveraging DLSS (which will lower the effective render resolution and likely the VRAM burden). That aside, I have this card and it works great, but I do feel a bit like an idiot paying so much for this level of VRAM when AMD have better options on that front. I was just limited by the fact that I am an SFF nutter and this card fits in my Louqe Ghost S1, and literally nothing else out there is two-slot.

      • +4

        So a 3080 should run:

        Lower settings
        DLSS

        At 1080p, to beat an RTX 3060?

        https://imgur.com/i5TprWC

        lol

        • Yeah, in your cuckoo land a 3060 is beating a 3080 because of bloody VRAM.

          • +5

            @Freestyle: ….the proof is in the screenshot I linked?

            3060 = 43 avg fps
            3080 = 25 avg fps

            • +1

              @Ahbal: Yeah, a screenshot proves everything, totally legit. And using the example of one poorly optimised game, great job.

              https://www.techspot.com/review/2627-hogwarts-legacy-benchma…

              here i can link better stuff

              • +8

                @Freestyle: that's without ray tracing, mate. Irrelevant.

                You want to buy a $1000 RTX card and you CAN'T turn on ray tracing at 1080p?

                Hardware Unboxed are some of the most reliable, next to Gamers Nexus.

              • @Freestyle: Check the HU vid, where the 3080 isn't even on the charts because IT'S TOO DAMN FAST, so they stopped at the 3070 topping the charts haha

                • +8

                  @BTMoustachio: You mean the one where the 3080 has almost half the framerate of the 3060 at 1080p?

                  Or the old video based on games a couple of years old?

                  Do you want to spend $1200 based on games that are two-odd years old, or on games that have come out recently?

                  I'm not here to ruin anyone's day… you all complain about high GPU prices, but mock when someone tries to explain not to blindly spend so much money without knowing for sure your expectations are in check.

            • +1

              @Ahbal: The game is an outlier. The 3070 and 3080 will beat it in basically every other title. I think there is a trend of games becoming more VRAM-heavy, but performance-wise, the bigger brothers of the 3060 will be ahead except in rare cases that are purposely designed to hit the VRAM hard.

              • +3

                @FireRunner: Those other titles are older; newer titles (Hogwarts Legacy, RE4) are trending towards higher VRAM needs.

                Especially when RT gets turned on, VRAM usage goes way higher.

                • +2

                  @Ahbal: I'm running RE4 maxed out at 1440p on a 3080 10GB, so don't pull that bullshit about VRAM.

                  Also, stop advocating for shit developers' shit practices.

                  • +2

                    @Freestyle: Cool, I never said RE4 was unplayable, but OK.

                    I'm sure you've noticed it's using more VRAM than older games.

                    A 3070 would not run it at all at your settings, but yes, the 10GB is the saving grace for the 3080.

                    • -1

                      @Ahbal: So on a post about a 4070 Ti 12GB, you are talking about the 3070 on shit console ports made by lazy-ass devs, and excusing the incompetence?

                      Nice

                      • +4

                        @Freestyle: The point is, when you are paying more money for a card, you should still be able to play lazy-arse ports, not defend the card. Nvidia has been gimping VRAM for several releases now. Some 600-series cards had 2GB while AMD had 3GB; the GTX 970 had 3.5GB of fast and 0.5GB of slow buffer; the 3070/Ti had 8GB (even the 1070 had that much) and the 3080 10GB. Now this 4070 Ti has 12GB. I am not saying this VRAM is not enough, but it is borderline, and it will be outdated by the time this very generation ends.

                        • -1

                          @John Doh: outdated because you are advocating for sh*t developers cutting corners?

                          Nice job

                          • +2

                            @Freestyle: The problem is going to get worse going forward. What you choose is up to you. I'm not advocating for anyone doing a shitty job, be it selling low-VRAM cards or shitty ports. Both should be avoided.

      • +2

        DLSS alleviates performance pressure but apparently not VRAM; it barely frees any, as the VRAM is needed again for the upscaling.
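
        Rough numbers on why that is (illustrative buffer sizes, not engine-measured): the G-buffer scales down with the render resolution, but the output and history targets stay at native resolution, and the multi-gigabyte texture pool doesn't shrink at all:

        ```python
        # Why DLSS Quality (render at ~67% scale) barely dents VRAM - rough sketch.

        def buffer_mib(w: int, h: int, bytes_per_pixel: int) -> float:
            return w * h * bytes_per_pixel / 2**20

        native = (3840, 2160)
        render = (2560, 1440)   # ~DLSS Quality render resolution at 4K output

        gbuffer_bpp = 20        # several render targets' worth; scales with render res
        output_bpp = 8          # RGBA16F output/history; stays at native res

        print(f"G-buffer, 4K native: {buffer_mib(*native, gbuffer_bpp):.0f} MiB")  # ~158
        print(f"G-buffer, with DLSS: {buffer_mib(*render, gbuffer_bpp):.0f} MiB")  # ~70
        print(f"Native-res output/history (unchanged): {buffer_mib(*native, output_bpp):.0f} MiB")
        # The saving is well under 100 MiB, while textures (gigabytes) are untouched.
        ```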

      • Here's a good video explaining this VRAM issue with some real examples. This guy compares an Nvidia RTX 4070 Ti 12GB with an Intel Arc A770 16GB just to illustrate that there's more to overall GPU performance than raw power. Memory bandwidth also plays an important role here: the RTX 4070 Ti has a 192-bit bus and the Arc A770 a 256-bit bus, which also makes a difference.

        • Yeah I also have an A770 16GB LE in another rig and I am quite impressed by it (that and the card just looks gorgeous!)
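
          The bus-width point is easy to sanity-check from the commonly quoted memory specs (treat the figures as approximate):

          ```python
          # Memory bandwidth = (bus width in bits / 8) x effective data rate per pin.

          def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
              return bus_bits / 8 * gbps_per_pin

          cards = {
              "RTX 4070 Ti (192-bit, 21 Gbps GDDR6X)":  (192, 21.0),
              "Arc A770    (256-bit, 17.5 Gbps GDDR6)": (256, 17.5),
              "RTX 3080    (320-bit, 19 Gbps GDDR6X)":  (320, 19.0),
          }
          for name, (bus, rate) in cards.items():
              print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
          # ~504, ~560 and ~760 GB/s respectively - the 4070 Ti's narrower bus is
          # partly offset by its much larger L2 cache, but the raw number is lower.
          ```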

    • -4

      A 3060 12GB beats a 3080? Good one. hahahahahah

      • +5

        https://imgur.com/i5TprWC

        Someone is out of date

        hAhaHaHAhaH

        • +2

          Yep… once the 3080 10GB card runs out of VRAM it hits a roadblock, and either crashes to desktop or the frame rate tanks to 10-20FPS.

        • So should I buy the 4070 Ti? Based on the chart, it looks like more than 3 times the performance of the 3080 10GB. Thanks

          • @Helloworld07: It depends on your expectations. If you want to play new and future RTX titles at ANY resolution (even 1080p) for at least the next 2 years, then no, based on the VRAM trends we are seeing.

            Don't forget that the screenshot I linked only shows 1080p performance; watch the full video to see how the 4070 Ti performs in the games you want to play, at the settings you would use.

            Definitely don't expect 12GB of VRAM to last with ray tracing on.

            • @Ahbal: Thanks mate. So in this case, the 4080 16GB would be a better choice (more future-proofing too)?

              The 4070 Ti definitely meets my requirements, as I only play some light games on PC; I was hesitating between a better-built 3080 10GB and an entry-level 4070 Ti.

      • +2

        You are laughing at your own ignorance.

        • All you 3060 12GB bandwagon fans relying on one chart. Keep your ray tracing and DLSS.

          https://www.youtube.com/watch?v=onVSmXqzCeo

          • +8

            @Lexsus: If you took the time to digest and comprehend, you would understand we are using the 12GB 3060 as a vehicle to explain the importance of VRAM.

            We aren't saying to throw your 3080 out the window and buy a 3060.

            We are saying: if you are buying a new card now and want to play new and future games, be aware of the rapidly increasing VRAM usage of games, especially with RT, even at 1080p.

            And also be aware of Nvidia's dodgy tactics.

            They tried to lock "RTX Voice" to RTX cards only, but it turns out it can run easily on others.

            They locked DLSS 3 to 4000-series cards only.

            They deliberately gimped the memory bus on 4000-series cards.

            They gimped VRAM on 3000-series cards (a 3070 can run with 16GB of VRAM soldered on; it just needs a driver to recognise it. It's more complex than that, but that's it in brief).

            Forgive me for trying to educate.

            You can continue to unquestioningly consooooom, then come back here and cry when the 5080 is $2500 AUD.

            • @Ahbal: It's almost as if big businesses deliberately do planned obsolescence for profit, and then lie to the suckers who buy their crappy products.

          • @Lexsus: We've got one game where, if you turn it to certain settings, it will beat a 3080. Now the 3060 is the superior card.

          • +3

            @Lexsus: NEW games like The Last of Us and Hogwarts Legacy need more than 10GB, more like 12-16GB, to run well. It's sad, but the consoles have 16GB of video RAM, so when games get ported to desktop, 8GB and 10GB cards just run them like trash. This is only going to get worse!

            • @vid_ghost: Don’t consoles share system and video RAM?

              • +3

                @FireRunner: It's… different. Data is streamed directly from the fast SSD, so the PC-side equivalent of that "shared" value is higher (check the comments below from FabMan for a breakdown).

                It's why, on both PS5 and XSX, newer games MUST run off a fast SSD.

                On PC we don't need to stream directly from it, but we pay the price in VRAM.

                They were supposed to get this streaming tech ported to PC, but I've lost track of what's happened there.

                • @Ahbal: Yeah, still waiting for those DirectStorage titles

              • +1

                @FireRunner: Consoles do share the RAM, but it's dynamic, and the games are made to work with one set of hardware without overheads (APIs) etc. When porting these games the programmers are lazy, and the end result is higher system requirements on PC vs the console. Most titles made for the PS5 / Xbox Series X may struggle to run with only 8GB of video memory; another year or two from now, 8GB cards will only get worse FPS. If only Nvidia gave its cards more RAM.

            • +2

              @vid_ghost: Exactly. People misdiagnose this as an optimisation issue, when in reality Nvidia KNEW this was the case years ago, and still gimped the 3000-series VRAM.

    • +4

      I have a 12GB 3060, which has been amazing for modern machine-learning applications, which are exploding in capability; most need 11GB minimum, but in the end even that wasn't enough without constantly cutting corners, and I'm currently waiting on a 3090 24GB from eBay. Brand-new >$1k 12GB cards at this price are definitely disappointing as a long-term consideration.

      • +3

        Agreed, 12GB has a lot more utility beyond just gaming.

        AMD makes the chips for consoles; it should have been obvious that they went with more VRAM on their GPUs for a reason. They are well known for playing the long game.

      • +1

        Just wish the 3080 12GB was easier to find; 3080 Ti used prices are still close to this.

    • -4

      No 3060 is beating a 3080.

      Calm down.

      • +3

        https://imgur.com/i5TprWC

          Another one that is out of date, I see.

        Just lol.

        • -5

          Honestly, do you read what you type?

          Notice how you're the only one who believes this fantasy, based on some random meme?

          Maybe read every single bench test ever made, or try optimised versions of games.

          I repeat: no 3060 is ever beating a 3080. That's just ludicrous.

          • +5

            @imurgod: That's not a meme; it's a screenshot of benchmarks from Hardware Unboxed.

            • @Ahbal: No problem. So there are no firmware or game updates?

              Maybe try it now, because I literally just ran Hogwarts and got VERY different FPS figures.

              • +1

                @imurgod: Same settings as the benchmarks listed?

                If so, pass it on to HUB and they can double-check.

                • -4

                  @Ahbal: I've never even heard of HUB!

                  They clearly have no idea, when every other benchmark shows the opposite.

                  Either way, it makes little difference. Nobody is buying a 3080 to run games at 1080p when they can run them at 4K.

          • @imurgod: The 3080 would trounce the 3060 again once you lower the resolution and turn down VRAM-hungry settings like textures to get under the VRAM limit.

            • +5

              @FabMan: Lower the resolution below 1080p????

            • +8

              @FabMan: The Last of Us at medium settings with ray tracing on runs like a 20FPS slideshow on the 3080 10GB.

              The 6800 16GB with the same settings gets over 50FPS. People need to get over the fact that Nvidia screwed whoever purchased any of their cards with less than 12GB of RAM.

              This is not an AMD vs Nvidia thing; this is Nvidia ripping people off and trying to do it again. AMD is offering 20GB and Nvidia is offering 12GB for the same price.

              • @vid_ghost: The Crash of Us on a 3070… it is pathetic. I always wondered why they decided to have only 8GB of VRAM there…

      • +2

        I watched a YouTube video of the 3070 Ti vs the 6800 (some 2023 revisit); whenever the 6800 used more than 8GB of VRAM in a game, it beat the 3070 by a lot. The 3070 has 8GB of VRAM and the 6800 has 16GB; their performance is pretty similar when the 6800 uses less than 8GB of VRAM.
        https://www.youtube.com/watch?v=3jxh4kfxAmY
        This is a different video to the one I watched, but check the Hogwarts Legacy performance after 3:56: the 6800 gets 80% more fps than the 3070 Ti at 1440p Ultra with RT on, so a 3060 12GB should beat a 3080 10GB when more VRAM is needed.
        Edit: found the one I watched: https://www.youtube.com/watch?v=Rh7kFgHe21k
        Those videos show one thing: when you don't have enough VRAM for a game, it runs a lot slower. The 3080 10GB is faster than the 3060 12GB when the game needs less than 10GB of VRAM. The people above are just saying that, e.g., the 4070 Ti with 12GB of VRAM may run slower than the 6800 with 16GB in some games.

    • +3

      Adding this link to make it easier; apparently people don't know how to do research before dropping $1k on a GPU?

      Maybe that's why they are so expensive…

      https://imgur.com/i5TprWC

      https://www.youtube.com/watch?v=Rh7kFgHe21k

    • -2

      8GB and 10GB cards are obsolete for new games; 12GB is soon to follow.

      A 3060 12GB (yes, that's right, NON-Ti) beats both a 3070 and a 3080… AT 1080P…

      😂😂😂😂😂😂😂😂😂😂😂
      A bit late for an April Fools' joke.

      I'll just drop this here so you can see how wrong that statement is.
      https://youtu.be/tPbIsxIQb8M

      • +6

        I'm referring to the 12GB model

        https://imgur.com/i5TprWC

        • -2

          Do you know how to watch YT vids? It compares the 12GB to the 8GB, and both get smashed by a 3070, ROFL.

          Here, cop another neg.

          • +2

            @BTMoustachio: Good job, in older games that don't use more than 8GB.

            Referring to my screenshots and HUB's newer videos: in newer games, the 12GB 3060 beats both the 3070 and the 3080.

            Do you know how to read graphs?

            Your info is not relevant to newer games, mate.

          • +4

            @BTMoustachio: Your ignorance is outstanding. Newer games, as in brand new and not 4 months old, are chewing into VRAM like never before. If VRAM utilisation goes above the VRAM you have, your system has to swap the contents of VRAM out, which dramatically slows the card's performance.

            Yes, the 3080 10GB is faster, but the lack of VRAM on that card for these newer games is strangling it and destroying its performance.
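
            That cliff is essentially a bandwidth mismatch. A rough model (illustrative numbers: ~760 GB/s for the 3080's GDDR6X vs ~25 GB/s of realistic PCIe 4.0 x16 throughput, ignoring caching and reuse) shows why spilling even 1-2GB is catastrophic:

            ```python
            # Rough model of the VRAM-overflow cliff (illustrative numbers only).

            VRAM_BW_GBS = 760.0   # e.g. RTX 3080 GDDR6X
            PCIE_BW_GBS = 25.0    # realistic PCIe 4.0 x16 throughput

            def effective_bandwidth(working_set_gb: float, vram_gb: float) -> float:
                """Average bandwidth when part of the per-frame working set spills
                over PCIe to system RAM. Simplified: ignores caching and reuse."""
                if working_set_gb <= vram_gb:
                    return VRAM_BW_GBS
                resident = vram_gb / working_set_gb
                spilled = 1.0 - resident
                # Harmonic mix: time per byte is what accumulates, not the rates.
                return 1.0 / (resident / VRAM_BW_GBS + spilled / PCIE_BW_GBS)

            for ws in (8, 10, 11, 12):
                print(f"{ws} GB working set on a 10 GB card: "
                      f"{effective_bandwidth(ws, 10):.0f} GB/s")
            # 760, 760, ~207, ~129 GB/s: spilling 1-2 GB cuts average bandwidth
            # several-fold, so frame rates fall off a cliff instead of degrading
            # gracefully.
            ```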

    • +3

      These cards are running out of performance much quicker than VRAM.

      The blame lies with the current crappy ports, not VRAM. 12GB is fine for anything currently.

      • +4

        Nope.

        Games on consoles have 13.5GB of RAM available to them and very fast data-transfer speeds from storage. If a game is designed around using a large amount of graphical data at a time and also utilises fast storage-transfer speeds, PCs that cannot match the consoles' transfer speeds may find they need large amounts of system RAM and VRAM to compensate. 12GB might be on the short side.

        • Add to that, PC titles have superior textures compared to consoles, which makes them more VRAM-hungry. Then add the lack of optimisation on top.

          • @John Doh: People buying these NVIDIA GPUs will probably want ray tracing on and that will just chew into that VRAM more.

      • +2

        Not true; 4K assets pull a lot of VRAM, regardless of "optimisation".

        Then you have RT on top of that.

    • +4

      12GB might be enough for this console generation.

      The PS5 and Xbox Series X have around 13.5GB of RAM usable by games, and it can't all be graphical content, so newer games should use less than 12GB of graphics data on those consoles. So games designed for the current consoles and ported to PC should work on GPUs with 12GB of VRAM, and possibly 10GB too. 8GB is where you'd get the problem, as there very likely is over 8GB of graphical data in many modern console games.

      In future, I imagine VRAM will be a bigger issue for systems that cannot swap data from storage quickly enough; the PS5 can do massive data-transfer speeds. If a game is designed around that and your PC cannot do those transfer speeds, more RAM will be required, both system RAM and VRAM. Then even 12GB might not be enough, as you point out. Hard to predict.

      • +5

        Agreed, though I think an extra 4GB on top of that console cap, as an "optimisation buffer" for PCs, makes sense.

        20GB for higher-end GPUs, 16GB for mid-range, and 12GB as the minimum.

        I can tell very few in this comment chain have had to use different compression methods for best results on 4K assets in a game lol. Fixed mindsets galore.

        They eat VRAM like crazy. Then add RT on top…
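
        As a quick sketch, the budget arithmetic behind that rule of thumb looks like this (the 2.5GB OS reserve is the commonly cited console figure; the 70% graphics share and 4GB PC buffer are illustrative assumptions):

        ```python
        # Console-vs-PC memory budget sketch (rough, assumption-laden figures).

        console_unified_gb = 16.0
        os_reserved_gb = 2.5                                     # commonly cited reserve
        available_to_game = console_unified_gb - os_reserved_gb  # ~13.5 GB

        # Assume a console game keeps ~70% of its budget in graphics data,
        # streaming the rest from the SSD on demand (illustrative split).
        graphics_share = 0.7
        console_graphics_gb = available_to_game * graphics_share  # ~9.5 GB

        # A PC port without console-grade streaming holds more resident, hence
        # the "console share plus a few GB of buffer" rule of thumb above.
        pc_buffer_gb = 4.0
        suggested_vram_gb = console_graphics_gb + pc_buffer_gb

        print(f"Console graphics budget: ~{console_graphics_gb:.1f} GB")
        print(f"Suggested PC VRAM:       ~{suggested_vram_gb:.1f} GB")   # ~13.5 GB
        # Which is why 12GB reads as borderline and 16GB as comfortable.
        ```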

  • -6

    [eBay Plus] Gigabyte RTX 2070 SUPER WINDFORCE OC 8GB + Bonus 480GB SSD $559.20 Delivered @ Computer Alliance eBay

    MiscOzB on 13/10/2020 - 13:27

  • +3

    I've heard these NVIDIA cards are perfect for running ChatGPT, as that should be the only reason we buy graphics cards these days (not gaming or anything silly like that). Think of the children who need AI, buy now.

    • +2

      (Jensen Huang and the revenge of the AI bros.)

    • You can run the Vicuna finetune of Facebook's LLaMA model on a 3060 12GB with decent performance (the 4-bit, 13-billion-parameter model). For some tasks it's better than ChatGPT, because you can manually edit its output partway through the process to force it to continue from there and give an answer in the format you're looking for.
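
      Rough arithmetic on why the 4-bit 13B model squeezes into 12GB (LLaMA-13B is 40 layers with a 5120 hidden size; the 2048-token context and fp16 KV cache are assumptions, and exact overheads vary by loader):

      ```python
      # Why a 4-bit 13B model fits in 12 GB - rough arithmetic (overheads vary).

      params = 13e9
      weight_gib = params * 0.5 / 2**30          # 4 bits = 0.5 bytes per weight

      # KV cache: keys + values, fp16, per layer, across the context window.
      layers, hidden, ctx = 40, 5120, 2048
      kv_gib = 2 * layers * hidden * ctx * 2 / 2**30

      print(f"4-bit weights: ~{weight_gib:.1f} GiB")   # ~6.1 GiB
      print(f"KV cache:      ~{kv_gib:.1f} GiB")       # ~3.1 GiB
      print(f"Total:         ~{weight_gib + kv_gib:.1f} GiB + activations/overhead")
      # ~9-10 GiB all up, which is why it runs on a 12GB 3060 but not on 8GB cards.
      ```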

    • that should be the only reason we buy graphics cards these days

      This reminds me of the time when boomers read a few headlines, bought into the FOMO, and kept telling their kids to invest in crypto at its peak before the crash.

      I wouldn't go as far as calling people's hobbies silly just because they don't align with mine, but yes, if you are thinking of playing around with multiple experimental AI models early, the extra VRAM does help in some cases (not just for ChatGPT… which, in all honesty, will have competition eventually and be cheaper to access online anyway).

      • I agree. Maybe I should have added /s at the end. I was attempting satire of the other poster in the AMD deal post who kept continually posting about how all we needed to consider was AI models and ChatGPT on our own devices.
