To hodl or not to hodl?
I think this is a much better RX 6600 XT model
Oh god, all these AMD price drops make me cry. The one piece of software I need a powerful GPU for is optimised for Nvidia…
If only the 3060 Ti would get cheaper and close the price gap between the 3060 and the Ti
Don't think the 3060 Ti will ever get super cheap, as it's the same chip as the 3070; the lower-quality chips become 3060 Tis, and as yields improve there are fewer of those lower-quality chips.
Same chip as the 3070 Ti as well, so they'll probably have the best deals going forward. 3060 Ti quantities will be limited. But the 3070 Ti is also a really good crypto miner, which limits price drops somewhat.
The large performance difference between the 3060 Ti and 3060 sorta justifies the cost. Best performance per dollar in my opinion (although that 6800 deal was pretty amazing).
May I ask what is the nvidia-optimised software that you're running?
DaVinci Resolve. It's not even a contest; the DaVinci results in PugetBench are…
https://www.pugetsystems.com/labs/articles/DaVinci-Resolve-S…
On top of the Resolve use case, there's also a lot of machine learning stuff which only work with Nvidia's CUDA toolkit (though I'm unsure if you can get that working on an AMD card or the CPU these days), and cases like Blender 3.0 not having AMD support at release.
For gaming I'd be fine with saving money on AMD, but for non-gaming work it can be worth the extra investment to go with Nvidia; at least, that's often been the case in recent years.
I think the problem with the 3060ti is that now the ETH LHR unlocks have come through, they perform on par with a 3070 mining ETH so I'm doubtful they'll come down in price.
Great price. I was looking earlier at what this card can do with SAM, FSR and RSR; it's really a great value card, far better than the 3060.
this is nice. too bad i already got a 6900xt
I think it's more like "this $10 steak is nice. Too bad I already got one double the size for $40"
It's more like "this $10 steak is nice. Too bad I already got one 2x the size for $30"
@congo: It's more like "this quarter chicken lunch pack for $10 is nice, too bad I already got a half chicken lunch pack for $30"
@Skhtor: Hahah. Isnt that the same ratio?
@congo: Idk dude I just like talking in terms of chicken lunch packs
@Skhtor: bu - gerkkkk
@Skhtor: That's a much better analogy; it just needs more $ to better reflect the insane prices of the current GPU shortage:
"this quarter-chicken lunch pack for $20 is nice, too bad I already got a half-chicken lunch pack for $60"
@ItsMeAgro: This quarter chicken lunch pack for $499 is nice, too bad I already got a half chicken for $1399
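For anyone actually checking the chicken maths from the replies above, the unit-price ratios work out like this (a throwaway sketch; the deals and prices are the hypothetical ones from the thread, not real menu prices):

```python
# Normalise each hypothetical lunch-pack deal to the price of one whole chicken.
def unit_price(price, fraction):
    """Cost of one whole item, given the price of a fractional portion."""
    return price / fraction

quarter = unit_price(10, 0.25)  # quarter chicken for $10 -> $40 per whole chicken
half = unit_price(30, 0.5)      # half chicken for $30 -> $60 per whole chicken

# The half-chicken deal costs 1.5x more per chicken -- the same ratio
# as the "$10 steak vs 2x the size for $30" version of the joke.
print(half / quarter)  # 1.5
```

So yes, it is the same ratio, just with more chicken.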
ferrari testarossa
show off
How does it compare to, say, a train, which I can also afford?
The heat sink will warm your buttocks
Glad I hodled and didn't buy the very tempting MSI Mech. But I'll hodl and ignore this deal too. I've been forced to hodl for 14 months, so another 2-3 months is nothing.
With the fake hodlers having lapped up the Eagle/Ventus/ASRock/Colorful/ASUS Dual models, there's now more stock of the better models, so prices on those are falling. The 6600 XT will also micro-stutter at 1440p/4K even in older games where GPU core performance isn't an issue. Imagine running 1440p over 8x PCIe 3.0 and a nerfed 128-bit memory interface
It's insane that a card even at this price is severely gimped by PCI-e bandwidth. I wouldn't touch it.
It makes a tiny difference running at x8 instead of x16; that's been proven many times.
Even at 1080p, 8x PCIe 3.0 is shown to already affect games like Doom Eternal, and who knows what future games it will bottleneck. At 1440p and higher there is indeed a bottleneck; at 2560x1440 a 6600 XT is no faster than a 3060, and it gets worse the higher you go. Probably more attributable to the 128-bit interface than the nerfed PCIe bus.
The 6700 XT is outperformed by the 3060 Ti at 4K. Despite both being 16x PCIe, the 6700 XT has a narrower memory bus than the 3060 Ti: 192-bit vs 256-bit.
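For anyone weighing the x8 vs x16 claims above, the raw link bandwidth is simple arithmetic (a rough sketch; the 8 GT/s rate and 128b/130b encoding come from the PCIe 3.0 spec, and real-world throughput is a bit lower than this theoretical peak):

```python
# PCIe 3.0 link bandwidth: lanes * 8 GT/s * 128b/130b encoding efficiency,
# then divide by 8 to convert bits per transfer to bytes.
def pcie3_bandwidth_gbps(lanes):
    """Theoretical peak PCIe 3.0 bandwidth in GB/s for a given lane count."""
    return lanes * 8 * (128 / 130) / 8  # ~0.985 GB/s per lane

print(f"x8:  {pcie3_bandwidth_gbps(8):.2f} GB/s")   # ~7.88
print(f"x16: {pcie3_bandwidth_gbps(16):.2f} GB/s")  # ~15.75
```

Whether ~7.9 GB/s is actually a bottleneck depends on the game, which is why people keep pointing at benchmark videos rather than the raw numbers.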
@Jimmy77: yawn
not talking about the 6700xt.
@Lonewolf1983: Lol. At 1440p the 6600 XT's fps drops to equal that of the 3060, and it gets even worse as the resolution goes up; watch the HU video. I've already explained to you before that the PCIe and memory bus nerfs cripple the card at anything above 1080p.
@Jimmy77: and it's still cheaper than a 3060.
Both mine pretty well, which is the best thing
You merely adopted the HODL, I was born in it, moulded by it …
…I'm still playing on an R9 280x. 3GB of VRAM…
still got my 5850 sli on my 3930k
Damn bruh, I'm surprised you didn't send that message out via telegraph
@s1Lence: It still plays the very latest games at 1080p@60, often with High settings.
If you've been holding this long, it would be absolutely stupid to not keep holding until September.
Why thank you!
I was wrong btw. Lovelace isn't going to be September, it's looking like the launch date will be in July. It's only two months away!
Both amd and nv have keynotes at computex.
AMD will launch later though due to redesigning the monolithic die into chiplets, whereas Lovelace is rather straightforward.
Noob here, what's the price to pull the trigger for a 6600XT? I'll keep hodling for as long as it takes. Thx
The price is apparently… next Gen!
About tree fiddy.
I play every game at 1440p max settings and have zero issues with the 6600 XT; all games run at or above 60 fps. Not sure what you're on about with micro stutter; zero reviewers have picked it up either, so again, what are you talking about?
lol at the childish (profanity) who downvoted that article. "Oh noes my card is nerfed and it hurts my feelings better downvote/censor facts"
Imagine trying to censor the fact the gpu is memory nerfed. Pathetic. Go follow a soccer team not a company
@Jimmy77: The Nvidia 4060 is coming out with a 128-bit memory bus…. it will be faster than an RTX 3070 because Nvidia is also moving to adding cache on the die… do you know better than both AMD and Nvidia now?
Nvidia's whole next GPU line will have small memory buses, and they all have lots of cache on the GPU die…
@vid_ghost: With all due respect, I don't think you know what cache and a bus are. Cache is a memory bank; if the core needs information that isn't in that bank, it still has to talk to VRAM, and it uses the nerfed 128-bit bus to do so.
Cache isn't a core, lol; it can't process information. What you're arguing is that more information can be stored in that cache, hence less dependence on the bus. The bus is still a critical factor at high res (especially with the measly amount of Infinity Cache on the 6600 XT).
Additionally, why do the RX 6800 cards and above still need 256-bit? They too have this Infinity Cache you claim nullifies the need for anything wider than a 128-bit memory bus.
And the 4060 will obviously be marketed as a 1080p card, given it's an xx60 specced with a 128-bit memory interface
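The bus widths being thrown around in this back-and-forth translate to peak bandwidth like so (a quick sketch; the data rates are assumed from public spec sheets, so treat the figures as approximate, and note none of this captures what Infinity Cache does for effective bandwidth):

```python
# Peak GDDR bandwidth in GB/s: (bus width in bits / 8) * effective data rate in Gbps.
def peak_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Theoretical peak VRAM bandwidth, ignoring any on-die cache effects."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed spec-sheet figures, not measured values.
cards = {
    "RX 6600 XT (128-bit @ 16 Gbps)":  peak_bandwidth_gbps(128, 16),  # 256 GB/s
    "RX 6700 XT (192-bit @ 16 Gbps)":  peak_bandwidth_gbps(192, 16),  # 384 GB/s
    "RTX 3060 Ti (256-bit @ 14 Gbps)": peak_bandwidth_gbps(256, 14),  # 448 GB/s
    "GTX 1080 Ti (352-bit @ 11 Gbps)": peak_bandwidth_gbps(352, 11),  # 484 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

Raw bandwidth is exactly the gap the Infinity Cache is meant to paper over, which is why the benchmark picture is messier than these numbers alone suggest.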
From your article: 'Architecturally, it's still impressive how much performance AMD was able to wring out of a 128-bit memory interface. That's thanks to the Infinity Cache, which even with 'only' 32MB clearly does a lot for performance and helps avoid massive GPU bottlenecks. That AMD was able to match and even exceed the performance of the RX 5700 XT — at 1080p and 1440p, anyway — with a bit more than half the bandwidth proves how much a larger L3 cache can help GPUs. But the resulting chip isn't much smaller, though the board complexity and power use are also lower.'
I can see the bus specs listed as a downside at the start of the article, but the article's conclusion indicates that they ultimately didn't encounter significant performance issues arising from it. You are now quoting articles that do not support your allegation that the 128-bit bus width causes significant performance issues. This card has always clearly targeted 1080p and 1440p60 (with RX 5700 XT levels of performance)
This card has always clearly targeted 1080p and 1440p60
This is just marketing spin. I'm running 1440p on a 750ti. Not only are different games going to utilize the hardware differently, but non-game workloads too. IMO worse than the gimped interface is this stupid marketing around being "a 1080p card".
The fact it's got a gimped interface does suck, although it makes the lower prices they are currently sitting at (vs nvidia) fair in a sense. Still, the whole market is overpriced by 10-20% IMO given how close to next gen we are. I think we'll see some further reduction, or at the very least normalisation around the current low end of retail prices.
same price as previous best 6600XT, so can refer to the same graph and table I posted for the 6800 today.
Graph - https://files.ozbargain.com.au/upload/47948/95874/20220520.j…
Very close to pulling the trigger, but must HODL
Are you HODLing for cheaper prices, or for RDNA3/40 series?
Whichever comes first
Good price indeed but for the love of goat can we have some deal on 3060ti/3070…
Eth needs to die. 3060ti/3070 are priced to hashrate. So if they drop in price miners will buy them all again
Need Eth to switch to APU mining ;) Give poor ol' Creative some market back for their Sound Blasters!
Miners like the 6600 series too, as they're nice and efficient, and the 3070 mines other algorithms pretty well even when Eth does eventually die, so don't hold your breath…
Good deal..
Also.. Gigabyte Radeon RX 6600 XT Gaming OC PRO 8GB Graphics Card for $539
https://www.amazon.com.au/GIGABYTE-Radeon-Graphics-GV-R66XTG…
Fantastic card from a solid brand
My current GPU is an ASUS 6600, which is 2.5 slots. My motherboard is a Gaming X B550M. There's very little space for the wires at the bottom of the PC, and I'm afraid of the GPU fans touching them. Should I just buy this instead and sell the 6600?
What wires? You do realise the PCIe cables can be routed to come over the top of the GPU; they don't need to come from underneath. But PCIe cables aren't going to get sucked into a fan anyway; only really thin case fan wires might have a problem.
not likely to happen. don't worry about the pcie cables.
It's the smaller spare fan/RGB cables you have to tie down
According to Vita85, LoneWolf1983 & thestripedlion, this Nitro+ is no better than the ASRock Challenger, and possibly even worse according to Vita85.
same performance and warranty so yep
I feel like the copium is strong with those three.
Good price and card from a good brand; sort of a 1080 Ti+, but hey, $500 ain't bad
The 1080 Ti has a whopping 352-bit memory interface and 16x PCIe 3.0 support. Aside from better power consumption, this card is absolutely nerfed rubbish compared to a 1080 Ti. Only a 128-bit interface on the 6600 XT, and only 192-bit on the 6700 XT. AMD nerf their cards a lot. For reference: 2070 Super 256-bit, GTX 1070 256-bit, GTX 970 256-bit, GTX 960 128-bit memory interface.
AMD cut down their $500-800 cards to the same specs Nvidia cut their sub-$300 cards down to
You know the Infinity Cache works to offset the narrow memory bus, and the faster GDDR6 memory speeds help too. In the end it's pumping out the same fps as a 1080 Ti/2070 Super at 50% less power. The new RDNA2 architecture has more IPC; things change and improve with time. If anything, it's the older cards with all that memory bandwidth and power usage that suck: they can't even beat a low-to-midrange current-gen card.
whats the equivalent nvidia?
The 3060. But the 3060 mines crypto ~65% faster.
Good price, I've had Nvidia cards for the last 2 gens. I wonder how good AMD cards are performing now.
Not bad; they generally compete really well with Nvidia, but it's game dependent.
Check out something like Hardware Unboxed for the model you're interested in and compare against the games you play
June is at corner , maybe just hodl a little bit for EOFY Sales
*around the corner
160 W TDP with 1080 Ti performance. My 3080 sucks too much power; I should have gone for something like this.
I have a 6600 8GB. It sips power, runs cool as ice, and plays every game I want at 1440p 60fps. Elden Ring is fun :)
I will upgrade it to the 7600 XT when that comes out. I never see any point going for the flagships when I can save money and have the latest GPU in the mid-to-low-end range
Yeah my story was I sold my mint condition 1080ti rog strix hoping to snag a 3070 and got stuck with just a 1650 super for like 18 months. As soon as 3080s hit $1200 pulled the trigger cause I was blue balling so bad. In retrospect $500 on a 5600xt would have been a good move and that could have served me until next gen. I like the low power usage so I'm basically power limiting my 3080 quite often anyway!
As an OzB dude with no need for this, I pussed out and went Afterpay on Umart. We are not poor, but we play games on what we buy. Afterpay saves a lot of arguments LOL.
This makes me regret buying the ASRock challenger for $499. It came yesterday and it feels like a cheap plastic toy. Was blown away by how cheap it feels. This would have been a much better buy.
Yeah, Sapphire make decent cards. At the end of the day, 160 watts isn't going to need an amazing cooling solution, so you should probably be OK.
Yeah it will definitely be fine, it just feels so cheap lol
If OOS, same model number and $499 + delivery @ https://www.mwave.com.au/product/sapphire-nitro-radeon-rx-66…
Sapphire is an excellent brand but I'm waiting for something better at this price range.