Title says it all. Pretty good price at the moment IMHO.
Galax GeForce RTX 2070 Super (1-Click OC) 8GB GDDR6 Graphics Card $749 + Shipping @ Shopping Express
Last edited 29/06/2020 - 11:50 by 1 other user
Comments (closed)
Must resist!
Give in to temptation.
Waiting for AMD and NVIDIA to release some more details lol…
Here are some NVIDIA details for you:
- ~2.5x density increase; confirmed by the GA100
- ~50% more SMs, gen on gen (3080 Ti the exception at slightly less, though probably just a launch product this time); all but confirmed by the GA100
- each SM will be around 80% larger than Turing's if they were on the same silicon node (they aren't); confirmed by the GA100
- tensor & RT core performance increase close to 4x (ie. most of the SM build-out is devoted to RTX architectural improvements); confirmed by the GA100's die changes and Tensor core performance
- DMA tech similar to SSD benefits seen in next-gen consoles; NVIDIA have been openly working with developers on this for 12 months, similar to raytracing prior to Turing, and of course AMD will have said technology in their next-gen GPUs
Wait, and keep in mind that AMD probably don't have a worthwhile answer for either raytracing or DLSS, let alone both combined.
@jasswolf: Ray Tracing is another NVIDIA GPU-crippling gimmick that will be dead soon, just like HairWorks, PhysX, Tessellation, and GameWorks.
Most gamers can't afford RTX-capable GPUs. AMD don't need Ray Tracing. Polls on Wccftech and Videocardz show most gamers want more affordable GPUs for next gen more than any other feature.
@[Deactivated]: What a short-sighted and boring answer. AMD are already implementing RT; whose hardware do you think will be used in the PS5 and Xbox Series X?
Ray Tracing is another NVIDIA GPU-crippling gimmick that will be dead soon, just like HairWorks, PhysX, Tessellation, and GameWorks.
This is completely untrue. You should look into some of the real applications of RT cores and Tensor cores before calling them a "gimmick". TensorFlow is growing rapidly in machine learning and AI applications and is gaining widespread adoption in fields such as finance and marketing. Ray tracing itself might well be a gimmick, but the underlying technology, i.e. Tensor cores, has huge applications in statistics and machine learning, which is where Nvidia makes most of their money.
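To make that concrete, here's a rough, hypothetical TensorFlow snippet (sizes made up purely for illustration, not anyone's production code) showing the half-precision matrix maths that Turing's Tensor cores are built to accelerate:

```python
# Hypothetical example: a float16 matrix multiply in TensorFlow.
# On an RTX/Turing card, ops like this can be routed to the Tensor cores.
import tensorflow as tf

print("GPUs visible:", tf.config.list_physical_devices("GPU"))

# Made-up sizes; dimensions that are multiples of 8 help the libraries
# use Tensor cores for the half-precision matmul.
a = tf.random.normal([4096, 4096], dtype=tf.float16)
b = tf.random.normal([4096, 4096], dtype=tf.float16)

c = tf.matmul(a, b)  # runs on the GPU automatically if one is visible
print(c.shape, c.dtype)
```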
AMD's graphics division is in some serious trouble. The only reason they're staying afloat right now is that they've had a pretty good lock on the console market. Without the funds coming in from the computational sector that Nvidia has right now, AMD simply does not have the R&D budget to compete. Polls on "what gamers want" are largely irrelevant because gamers are not what drives Nvidia's bottom line.
@p1 ama: Ray tracing isn't a gimmick, it's literally a better solve for the Rendering equation, so don't indulge that poster. Ray tracing will move to path tracing, and then in time to even more detailed physics simulations. Keep in mind this kind of mathematics not only helps with light and shadows, but particle physics and sound simulations. For a practical gaming example, RTX gear should help with better weather and climate predictions.
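For reference, the rendering equation in question (standard form, nothing RTX-specific about it) is:

L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

i.e. light leaving a point = light it emits + an integral over all incoming light scattered towards the viewer. Rasterisation approximates that integral with per-light tricks and screen-space hacks; ray and path tracing estimate it directly by sampling rays, which is why the same maths generalises to shadows, reflections and sound.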
As for AMD, they've literally confirmed that next-gen GPUs and consoles will have hardware ray tracing. Based on patents, that's likely by modifying some sub-units in their existing compute units (CUs), rather than building out something more ASIC-like, as NVIDIA have. So they're likely at a performance disadvantage except for very specific scenarios, and that may in turn be overshadowed by DLSS.
Ray tracing isn't a gimmick, it's literally a better solve for the Rendering equation, so don't indulge that poster. Ray tracing will move to path tracing, and then in time to even more detailed physics simulations. Keep in mind this kind of mathematics not only helps with light and shadows, but particle physics and sound simulations. For a practical gaming example, RTX gear should help with better weather and climate predictions.
That's exactly what I'm trying to say, so I think you misunderstand me. I was trying to say that even if ray tracing in games never takes off, it doesn't invalidate the technology. I use TensorFlow, so I'm personally all too aware of the strides that Nvidia made from Pascal to Turing, even if that isn't reflected directly in gaming.
@p1 ama: Raytracing doesn't use the tensor cores a great deal AFAIK, not even the de-noising, so you're not exactly making a great argument for them not being dead silicon. More accurate lighting actually gives us proper reflections and realism in lighting and shadows; only people who haven't seen a single screenshot or video would still hold that opinion.
That should be 'non-gaming' for that example, my bad there.
@p1 ama: Hmmm, going to disagree with AMD being in trouble, to a degree. Totally get your point about AI and the computational side of things… but AMD has now clawed back 30% of the GPU market… that's double-digit growth in that area…
Nvidia just landed Mercedes as a client and will undoubtedly have more in their future with their success in the AI field… What we need to consider is at what point does this become more lucrative than selling $2000 GPU cards? Will they continue doing R&D in that field over just AI/supercomputers?
AMD has consoles on lock, but if they can pull off the 4K numbers they are advertising then they may have something as good or better for the PC gamer field… Of course we don't know what detail they are talking about: is it ultra settings at 4K and 120 fps? That's stuff even a 2080 Ti can't sustain, and it costs more than double a console… Are there other tricks AMD is pulling off on the back of the console architecture or Zen 2?
Anyway, I think we are at a very interesting point… and that's the main reason I'd be waiting to see what happens in this space…
@scud70: There was a time that if you didn't have a 'quadro' card, you couldn't access many of the GPU features.
Nvidia will take us back down that path again if AMD doesn't come up with decent competition
@jasswolf: Yep, been following the rumors and the leaks… not much happening on the AMD side, but that may be because of the consoles being released at some point… I'm not holding my breath that they will be able to compete on the higher end, but if they have something remotely competitive I'm hoping to at least see some price drops to something even a little more realistic than $1000+ cards for 4K-ish gaming…
Like I said in another post… consoles might not sway a true PC gamer over to them, but sure as sh*t anyone on the fence will when you get 4K gaming at the price of just a single GPU… so NVIDIA and AMD will both have to consider this as well…
The single most interesting thing is DLSS from NVIDIA, and if more and more games incorporate this then it'll be where I'd spend my money (and in NVIDIA's case you need both for raytracing to be viable, so AMD has to do the same or have something else that makes raytracing viable, otherwise it'll be crud lol).
@scud70: AMD will likely fall where they did last time: competing with the 3070 and maybe grasping at the 3080. But once you engage DLSS, I don't think it looks great for AMD.
They may attempt to leapfrog down the line by building out their architecture for ray tracing and bringing in some bigger machine learning speedups than what's likely planned, but right now it seems they're at least a generation behind NVIDIA there, and arguably still a generation behind on architectural efficiency, and no longer holding a silicon node advantage.
@jasswolf: I definitely agree on where AMD will fall… and NVIDIA will definitely keep the generational lead… but we can always hope for more competition, that's something we really need at the moment… :)
@scud70: The competition is OK, though arguably not as good as the competition AMD just gave against the 2060 and 2070 cards.
NVIDIA have always made an effort to maintain competitive offerings through the stack, the 2080 Ti prices just reflected the technically feasible yields for such a large die, and they made it an early release to set pricing expectations for the TU102. Now if the GA102 is a few ratchets down (and maybe the 3080 is using a cut-down of the same die), that might actually come back down a bit, as part of a refresh line-up.
Professional and server purchases of the GA102 will be a factor as well (expect there to be huge orders).
AMD will likely fall where they did last time: competing with the 3070 and maybe grasping at the 3080. But once you engage DLSS, I don't think it looks great for AMD.
AMD have not really been genuinely competitive with Nvidia since the R9 290X came in and blew up the 780, but the problem is that the R9 290X was so riddled with driver issues, ran way too hot and loud, and AMD had so much genuine trouble getting supply production going that they never took advantage of their engineering. This has had flow-on effects and they've never really been able to compete since.
Resistance is futile.
Ahhh should I pull the trigger or wait for 3000 series… upgrading from a Galax GTX 970
Wait. If it helps, I think Galax cards have a history of coil whine - I know my 1070 does.
Yeah might wait, I know I’m going to regret it once it comes out
A fair amount of Nvidia graphics cards seem to suffer from coil whine. It probably isn't just limited to Galax and it probably happens on multiple vendors. Even my Quadro M4000 graphics card seems to suffer from it every now and then. It's based upon the GTX 970 which apparently did suffer from coil whine. So Nvidia and coil whine is nothing new.
Yup, have seen coil whine on all of the following brands (both my own and other people's cards): EVGA, Gigabyte, MSI and Asus. It's not at all brand-limited, and some whine tends to be attributed to super high frame rates, like on my EVGA 1070 in an uncapped menu such as StarCraft 2, but never in game. Or in the case of the ASUS 2070 Strix I was helping troubleshoot for a friend, it would happen in some games and not others regardless of frames.
@AEKaBeer: I had coil whine on my AMD cards too. Mostly as you said, with high frames.
@cnut: Strange that. I just upgraded to a 165hz monitor and the coil whine on my Nvidia M4000 GPU has disappeared. It's running at 165hz. I was running at 60hz previously on the old monitor.
AMD must have the opposite effect with coil whine.
@AEKaBeer: you can set a max frame rate for individual programs under 3D Settings in the NVIDIA Control Panel. Might help the coil whine in the menu.
@elli0t: Yep I've limited it to 120fps since it's not connected to a particularly fast panel (100hz) 👍
Lol my card is still running great 😁
I’m thinking of getting this or the 1660 super. Then selling it when the 3000 comes out. Very difficult to pull the trigger though
You'll be paying $3000 for the 3000.
Depends, if AMD's big Navi cards can actually compete, prices will sink
They should call it the Nvidia $3080Ti for brevity 😜
@jasswolf: “Joke or not”. Well it is clearly a joke and thus not a claim.
@eggboi: Given how 2020 has gone so far, I'm not taking any chances. You'd be surprised how often people spring up to suggest they're just going to keep jumping up prices because AMD can't compete.
AMD are competing, they're just not making huge chips like NVIDIA did last gen (and kind of still will be this gen). The chip for the 2080 Ti (TU102) was 754 sqmm. The die size limit for current processes is about 850 sqmm, and yields tend to fall off a cliff after around 550-600 sqmm under the lithography techniques that dominate the industry. The GA102 (likely the 3080 and 3080 Ti) is rumoured to be 623 sqmm, but this process also uses much more advanced lithography, leading to better fidelity and thus better yields and lower costs for producing the same chip sizes.
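A back-of-envelope sketch (my own illustrative numbers, not anything from NVIDIA or TSMC) of why those die sizes matter so much for yield and cost:

```python
# Rough sketch: candidate dies per 300 mm wafer plus a simple Poisson yield model.
# The defect density is a made-up illustrative figure, not a real process number.
import math

WAFER_DIAMETER_MM = 300
DEFECT_DENSITY_PER_SQCM = 0.1  # illustrative guess only

def dies_per_wafer(die_area_mm2):
    """Classic approximation: wafer area / die area, minus an edge-loss term."""
    radius = WAFER_DIAMETER_MM / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2):
    """Fraction of dies expected to be defect-free, assuming random defects."""
    return math.exp(-DEFECT_DENSITY_PER_SQCM * die_area_mm2 / 100)

for name, area in [("TU102 (2080 Ti), 754 sqmm", 754), ("GA102 (rumoured), 623 sqmm", 623)]:
    candidates = dies_per_wafer(area)
    good = candidates * poisson_yield(area)
    print(f"{name}: ~{candidates:.0f} candidates/wafer, ~{good:.0f} defect-free")
```

Bigger dies mean both fewer candidates per wafer and a higher chance each one catches a defect, which is a big part of why the 2080 Ti was priced the way it was.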
You'll take a very heavy loss with either card.
If there are still RX 570 models floating around for $150-$170, that's your best bet as a temporary GPU.
If you already have something close or better, just wait, or you'll lose at least 50% of the card value. RTX 3000 will be a huge leap.
50%??? I will disagree; I reckon resale for a 2070 S won't be less than $600, as I doubt the 3000 cards will be cheap and people will just hold onto what they have. A few bargains on the used market may be had, but you're dreaming if you think you will be buying an RTX 2070 Super for $400 in 3-6 months time.
That's what they said about the 1080 Ti, and its used price has never been higher.
@TilacVIP: RTX 3000 chips will be barely more expensive than last-gen, if at all, because the comparative models will use a tad less silicon (though the wafer is more expensive).
Memory costs are down, board costs are down (though HDMI 2.1 ports may not let those decrease much), so overall things look like staying roughly the same, but yields will be higher for the 3080 Ti, so that might come in a little bit, or perhaps go back to being sold as a series refresh card (and thus closer to 1080 Ti prices than 2080 Ti).
The 1080 Ti is seen by some as a unicorn, and a lot of people spend about 2 seconds learning which GPU to buy and often read guides from 2018. Feel free to be smarter than them, instead of building out your thought process on their faulty logic.
I'd definitely wait if the CPU isn't a bottleneck.
The 3000 series leaks are very promising in terms of performance. It's not that far from September/October. The RTX 2000 series will be a whole 2 years old by the time the consoles launch later this year.
Hard to believe it has been that long, but if you want a card to last the generation I would wait just to be sure.
Wait, the 3070 will crush this.
Is this one better than https://www.scorptec.com.au/product/Graphics-Cards/NVIDIA/77… ?
They're both at the bottom of the barrel, so… pick your poison. I think Gigabyte has better warranty service?
I've not been keeping up with the better brands in terms of graphics cards. Any recommendations on the better companies making graphics cards? I always knew Galax was not great, but I thought Gigabyte was not bad? I might be way off though!
Gigabyte's Gaming OC line is generally pretty good. The Windforce variants, however, are usually lower binned, with a smaller heatsink and/or fewer fans, so temps and noise are much worse than the Gaming OC. Most of the time the Windforce cards are very cheap, so they could be worth it if you spend the time to undervolt and create a custom fan curve.
Pretty much every well-known brand has its own line-up of bad/cheap and expensive/good cards, like MSI and their crap Ventus line vs the awesome Gaming X Trio line.
Bottom of the barrel for $700-range cards? 🤨
What's your take on the EVGA 2060 KO Ultra / SC Ultra (non-Super)?
That's a 2070, this is a 2070 Super, so obviously this.
Nice catch, I didn't notice the one in the link is non-super.
If you go for the Super, make sure you're not using a krypton gaming mouse
Looks like it got down to $719 a couple of weeks ago during an eBay sale.
This price is hard to resist.
down to $719 a couple of weeks ago
Seller axed the listing shortly after it was posted here, it was relisted a few days later then about half an hour after that it was jacked up by $100. Only a couple were actually sold at $719
Oh I didn't know that. Thanks @FireRunner.
It's helpful to keep track of the pricing if one is not in a rush to buy one.
how many more months for the 3000 series?
No one knows for sure
Good price and all but man the performance-per-dollar you get from GPUs these days is just woeful.
I remember when GTX 1080s were going for this price or even cheaper in fact, around 2017-2018.
Yeah, I got an MSI Gaming X 1080 for $679 from MSY in 2017.
Yep, GPU's are even more painfully expensive now, it especially sucks seeing how much low-mid range cards are currently going for! Yikes.
I'm hopeful AMD can help introduce more competition to the high end and price aggressively but I'm not holding my breath.
I'd say AMD is more tempted to make very good money rather than lowering prices for everyone (ie. their agenda might be more biased towards achieving a public image of parity with NVIDIA so they can charge the same figures).
In other words, if the gaming GPU market gets split to 50-50, AMD could then charge the same amounts as NVIDIA and live happily thereafter.
Given the gaming GPU market is not yet at 50-50, AMD may keep charging "less" (until they get there).
I agree.
I hope AMD get onto Ray Tracing soon.
Given that more and more "AAA games" are making use of it, I think AMD probably have to make a move into that space soon. If they don't, then the high end will belong to Nvidia :(
I'm still running the GTX 670 I bought for $700 like 8 years ago or something. I don't play many games but it still runs Dota 2 on almost max settings, I recently downloaded Valorant and that runs fine.
Paid around this for the Gigabyte OC version before Corona. This is a pretty freaking good buy.
Check their eBay store before you buy, you can probably get the same price with the eBay Plus promotion!
There seem to be 3 variants of the Galax. The first two are both $749:
- 3-fan variant, higher boost clock (1815 MHz): https://www.shoppingexpress.com.au/buy/galax-geforce-rtx-207…
- 2-fan variant posted by OP, lower boost clock (1770 MHz): https://www.shoppingexpress.com.au/buy/galax-geforce-rtx-2070-super-(1-click-oc)-8gb-gddr6-graphics-card/27ISL6MD441C
- $859 ($110 higher), slightly different fans, 1815 MHz boost clock, but can't see why the price increase: https://www.shoppingexpress.com.au/buy/galax-geforce-rtx-207…
With the better cooling, the first one seems like the better deal? Or are there more differences?
Or are there more differences?
length appears different.
That's what she said …
Can someone with more knowledge summarize:
- This card (2070) and how it compares against the 2080Ti
- The difference between this and the upcoming 3K series cards
- How this compares against previous gen 1080Ti..
Could you fit two of these into a single PC relatively easily ?
2080ti is in another league.
3 series card is a new chip and architecture. Expect big gains.
1080ti is close to 2080, which is kinda close to the 2070s.
SLI is pretty much dead circa 2015, never worth it, and you're better off saving for one beast card than going for two. As has been the case forever. Hope this helps.
Thanks. Am looking at a research processing machine build (specifications anyway, I wouldn't actually be building it!) and it's useful to narrow things down for the vendor(s) who quote on this…
We currently have a machine with 4x 1080Ti which works well - but looking at something with 4 x 2080Ti to meet an additional requirement. I was trying to think about whether a 4 x 2070 (coming in at $3K) was worth thinking about.
To give you an idea, a single V100 was quoted in excess of $15K.
So I wasn't looking for SLI… just the literal horsepower for computation, and how many cards you could fit into one machine (with no actual graphics being rendered).
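To give a sense of the workload shape, something like this hypothetical TensorFlow-style job (model and sizes made up, no graphics ever rendered) is what the cards would be doing:

```python
# Hypothetical sketch: data-parallel training spread across every GPU in the box.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print(f"{len(gpus)} GPUs visible:", gpus)  # e.g. 4x 2080 Ti or 4x 2070 Super

# MirroredStrategy splits each batch across all visible GPUs on one machine.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(512, activation="relu", input_shape=(1024,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# Dummy data just so the sketch runs end to end.
x = tf.random.normal([2048, 1024])
y = tf.random.uniform([2048], maxval=10, dtype=tf.int32)
model.fit(x, y, batch_size=256, epochs=1)
```

One thing to weigh up: the 2070 Super has 8GB of VRAM vs 11GB on the 2080 Ti, and for a lot of models the per-card memory matters more than raw throughput.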
I just purchased this! Would this card fit into a NZXT H200i Mini ITX case? PC Part picker doesn't seem to think they are compatible, but from what I read online, the length is actually below the requirement, so I'm wondering if there's another reason it's not working.
Basically - my requirements are that I need TB3, and I've already purchased:
- Ryzen 3900X
- RMx 850W Gold
- Samsung EVO 1TB
- This 2070 Super
Then I'm looking at:
ASRock X570 PHANTOM GAMING-ITX (and therefore some sort of intel cooler, not sure which one will be good for 3900x)
G.Skill Ripjaws V Series 32 GB
And the H200i. I'm open to larger cases and motherboards, but a lot of the X570s with Thunderbolt 3 headers seem to be out of stock, or around $1000.
Thanks for anyone who has knowledge!
Did a build in the H210 over the weekend - not an expert but it might be too thick; based on the pictures on the website it looks like it might be more than 2 slots thick due to the fans.
Also you won't be able to use an AIO CPU cooler with a dual radiator as the card is a bit longer (air cooling in the H200 should be fine though - it's a big m-ITX case).
Thank you for the info Willie! Appreciate it.
Maybe look at the NZXT H510 if you want that aesthetic - should have enough clearance for both GPU thickness and length with an AIO + might be cheaper if you can find an mATX or ATX X570 mobo with the same features.
I love a good mITX but it defs has compatibility issues with larger-sized things.
@Hardstuck Silver 2: I'll look into that one! Also had a quick look at the Fractal Design Define Mini C MicroATX Mid Tower Case which looks about the same size as the H510! That appeals to me because people seem to report it as being quiet.
Thanks again!
OzPrettyGoodPrice
LOL
Still using a GTX 750 Ti here and my face is still smiling when playing all the latest games
It will be interesting to see where the pricing for PC cards ends up after Xbox Series X and PS5 release. Anyone want to make some predictions as to whether it will have much effect or is PC GPU pricing basically unaffected by console releases?
I have ordered a Sapphire Nitro+ RX 5700 XT which will be delivered on July 8. It is currently in transit. Should I order the 2070 Super and sell the RX 5700 XT once it arrives? Any suggestions? The Nitro+ cost me $697.
5700XT is just slightly below the 2070 super in terms of performance. However the 5700XT you bought is one of the better ones whilst this 2070 super is lower end which makes the difference even smaller.
Are you planning to use the cards long term? Or just as a placeholder for 3000 series GPU?
Honestly, if you resell the 5700 XT you're going to lose value on what you paid for it; combined with the $50 price difference already, I do not think it's worth it. Unless you value ray tracing and DLSS, then that may sway you to Nvidia.
Thanks for the input jezza10. Yes, planning to use it long term. I am building a new PC. I have ordered all the parts, and the reason I ordered the RX 5700 XT is because I already own an LG monitor which supports Radeon FreeSync. I currently use the monitor with my laptop. My configuration is 3900X, Asus ROG Crosshair VIII Hero, Sapphire Nitro+ 5700 XT. I think I should stay with the 5700 XT.
Honestly if you can return the 5700XT for around the price you paid for it, then a $50 difference for the long-term performance gain in the 2070 super is not bad. But considering that you have a monitor with freesync then the 5700XT is not a bad choice either. I don't think you can go wrong with either card :)
What resolution would you be using? or plan to use in the future? 2070s will have a better performance at higher resolutions but not by an insane amount.
I actually have had both a 5700 XT Nitro+ Special Edition and this Galax 2070 Super run in my main system last week - surprisingly this Galax ran cooler and quieter than the Nitro+. I was expecting the cooler to suck as it's the cheapest 2070 Super model on the market, but it actually performed better than even my 2060 Super Strix's cooler. Also performance was better in most games, with some games leading by a fair margin. But hands down the Nitro+ is a better looking card - one of the best looking coolers IMO.
They had the Gaming Black Edition yesterday at the same price. Was incredibly tempted then… but I've made my peace. I'll wait till next gen.
Still at that price today, much better cooling and OC'd - a slightly better buy than OP's deal.
https://www.shoppingexpress.com.au/buy/galax-geforce-rtx-207…
Are you sure you don't want it?
Hey guys, I'm only a part-time gamer but I'm building a PC relying on advice from some much younger full-on gamers (30 yr olds), and they said this GPU will suit my build: Sapphire Radeon Pulse RX 5700 XT 8GB GDDR6, it's $665 + $18 del from Amazon. Would this be a step up or sideways, or should I just stick with what they advise?
The Pulse 5700 XT is a great card, it was recently around ~$630 earlier this month (and you can sign up for a free 30-day trial of Amazon Prime to get free delivery, so minus the $18 fee). If you can get it around that price then it would be a pretty good deal, and imo it wouldn't justify the extra $120 on the 2070 Super.
The 2070S will perform better overall, but bang-for-buck wise the 5700 XT is very good. You will have to consider whether you value factors such as DLSS or raytracing that Nvidia cards provide, or if you plan to game at much higher resolutions (where the gap between the 2070S and 5700 XT increases).
Thanks for that advice! The Pulse just dropped nearly $50 overnight so I'll grab it.
*Email from eBay shows this card dropping $76 as well
Gigabyte Radeon RX 5700 XT GAMING OC AMD Graphic Video Card
The monitor you are using is very important for 3D gamers. Higher resolutions require a much more powerful card to get the same frame rate with the same settings. That's why I am not migrating to 4K gaming.
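Rough pixel-count arithmetic behind that, using the standard resolutions:

```python
# Pixels the card has to shade every frame, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels (~{w * h / base:.1f}x the work of 1080p)")
```

So 4K is roughly 4x the pixels of 1080p every single frame, before you change any other setting.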
Bought. I hate you all so much.
Good price on a 2070 Super!