Solid deal, 3 yr warranty, APAYDAY
If anyone is sceptical of APAYDAY or has had issues with them, let me know; I've never used them before and plan on purchasing.
90% of people I know sell their PC after a few years to upgrade. I learnt the hard way why NVIDIA is the go-to if you plan to upgrade in the next 3 years whilst maintaining as much value as possible.
AMD is already out of the question for this, and Intel is even worse than AMD.
But yes, the RTX 50 series will come out in like 6 months, so if you can wait that long, sure.
Intel is even worse than AMD
It's a red-hot race for 2nd.
Intel
Pros: proper AI upscaling, great raytracing performance, better AV1 & H.264 encoding than AMD (though AMD made a recent update, so this may change as comparison data comes in; see the sketch below these lists)
Cons: power consumption a shade high (so far), xx60 Ti competitors at best (so far), driver support mixed (but improved a lot, and still improving)
AMD
Pros: raster performance goes right up against NVIDIA, game support solid
Cons: everything else until at least RDNA4 (which won't compete far above xx70 level)
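For anyone wanting to check the encoding claim as new comparison data comes in, here's a rough sketch of how I'd run one with ffmpeg. The QSV/AMF encoder names are ffmpeg's real hardware encoders, but the clip, bitrate target, and VMAF scoring step are placeholder choices of mine, and each encoder only works on a machine with the matching GPU and drivers.

```python
# Sketch of an encode-quality comparison, assuming an ffmpeg build with
# the Intel QSV and AMD AMF hardware encoders (and libvmaf) compiled in.
import subprocess

SOURCE = "test_clip.mkv"  # hypothetical reference clip
BITRATE = "6M"            # fixed bitrate so quality is comparable

# Encoder names as listed by `ffmpeg -encoders`; which ones actually run
# depends on the GPU and driver present on the machine.
ENCODERS = {
    "intel_av1": "av1_qsv",
    "intel_h264": "h264_qsv",
    "amd_av1": "av1_amf",
    "amd_h264": "h264_amf",
}

for label, codec in ENCODERS.items():
    out = f"{label}.mp4"
    # Encode the reference clip with the hardware encoder under test.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec, "-b:v", BITRATE, out],
        check=True,
    )
    # Score the encode against the source with VMAF (distorted input first,
    # reference second).
    subprocess.run(
        ["ffmpeg", "-i", out, "-i", SOURCE, "-lavfi", "libvmaf", "-f", "null", "-"],
        check=True,
    )
```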
Sorry, when I said worse, I meant resale value.
@JJtoTheRadio: I can't account for the unwashed masses, but sentiment can and will change.
@jasswolf: Wdym? It's like Apple vs Android; it's kinda hard to change the brand value other than by the brand itself.
@JJtoTheRadio: Did I not just point out that Intel are doing all the work AMD have refused to do for years? With the Battlemage release, Intel are effectively the #2 option until RDNA5, which is probably late 2025 at best.
@jasswolf: I get you, AMD is moving towards mid-range and low-end GPUs for their next launch.
@JJtoTheRadio: Yes, because they cooked AI & RT acceleration improvements for RDNA4 and know they can't compete initially with the RTX 40 or 50 series.
@jasswolf: I haven't kept up to date with the 7000 series, but for me the RX 6800 was such value for money.
What happened this generation?
@JJtoTheRadio: They didn't devote more processing power to RT or introduce AI upscaling, and now they're a visual mess if you try to do anything other than the basics. Going to leave it there as this is a huge comment tree now!
How do you not include the VRAM buffer in your pros for AMD?
@Budju: Because it ultimately accounts for little unless you're loading up on poorly optimised texture packs in an age of rampant AI upscaling and compression passes.
Specifically for the 4060 Ti you might have more of a point, but it's a couple of settings tweaks at 1080p, even 1440p. Once you flip on anything above basic RT AO & shadows, AMD drops off the map, while pathtracing performance is horrific.
The tech will age better despite the VRAM count; there's lots of asset streaming and in-pipe workload capability still yet to be utilised.
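To put rough numbers on the compression point, here's a toy calculation. The per-texel costs are the standard ones for these GPU formats (RGBA8 = 4 bytes, BC7 = 1 byte, BC1 = 0.5 bytes, a full mip chain adds ~1/3); the texture count and resolution are made-up figures just to show the scale.

```python
# Toy VRAM maths: how much block compression shrinks a texture budget.

def texture_mib(width, height, bytes_per_texel, mipmaps=True):
    """Footprint of one texture in MiB; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mipmaps else 1) / 2**20

FORMATS = {
    "RGBA8 (uncompressed)": 4.0,
    "BC7 (block compressed)": 1.0,
    "BC1 (block compressed)": 0.5,
}

# Pretend a scene streams 500 textures at 2048x2048 (illustrative only).
for name, bpt in FORMATS.items():
    per_tex = texture_mib(2048, 2048, bpt)
    print(f"{name}: {per_tex:.1f} MiB each, {500 * per_tex / 1024:.2f} GiB for 500")
```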
@jasswolf: On the 4060 and 4070 it's a problem. Don't even get me started on the 3000 series. Despite the mental gymnastics about how things MIGHT get better in the future with AI compression (I assume that's where you were going), presently the larger VRAM buffer on most low and mid-tier AMD products is a perk.
@Budju: 4070 is a 1440p card and is fine for that. You don't buy a mid-range card and expect to get 1440p ultra in every game at 120+ FPS no matter the settings or developer intent.
The AI compression and upscaling passes I'm referring to are done on the assets that are shipped, not on the card, and that kind of asset processing has been used for a long time. What is done on the card is DLSS, mesh shaders, DirectStorage/Sampler Feedback, and now work graphs. DLSS aside, you can count on one hand the games that use these, or any implementation like it that's actually optimised (Unreal Engine 5.4 and above).
All of these technologies reduce VRAM requirements and CPU and GPU bottlenecks.
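Back-of-envelope on the upscaling side too: render-target memory scales with internal resolution, and DLSS-style modes render at a fraction of the output. The per-axis scale factors below are the commonly cited ones; the 48 bytes/pixel G-buffer cost is my assumption for a deferred renderer, not a measured figure.

```python
# Rough sketch: render-target memory at DLSS-style internal resolutions.

OUTPUT = (3840, 2160)          # 4K output
GBUFFER_BYTES_PER_PIXEL = 48   # assumed deferred G-buffer + depth cost

# Commonly cited per-axis scale factors for each mode.
MODES = {"Native": 1.0, "Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

for mode, scale in MODES.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    mib = w * h * GBUFFER_BYTES_PER_PIXEL / 2**20
    print(f"{mode:12s} {w}x{h}: ~{mib:.0f} MiB of G-buffer")
```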
@jasswolf: You're an Nvidia shill. Bye.
@Budju: You've got reddit brain. Until AMD bother to put proper AI and RT acceleration units into their GPUs, they are not worth anyone's time; at this point that looks like RDNA5, so late 2025 at best.
@jasswolf: No, I just watch proper reviews. You not even being able to admit VRAM is an issue with the 3000 and 4000 series is actual reddit shillery. Also, claiming a $1000 GPU like the 4070 is a '1440p card' is an absolute joke.
@Budju: It's only an issue if you set things to ultra at 1440p. Most of the features I have listed were released in 2018, and with the exception of DirectStorage, have been free to use.
It's been 6 years; devs and game companies have screwed up.
@jasswolf: It is what it is, doesn't matter why. 16 GB of VRAM is a perk.
I know it's not a common price, but if I was going to get the 4060 Ti, I'd rather wait until I could pay a bit more and get the 16GB version, e.g. https://www.ozbargain.com.au/node/829356
Yeah 8GB is not enough
Honestly a terrible price. The 3070 is ~5% faster and has 448 GB/s of memory bandwidth; this gimped card has 288 GB/s (quick maths below the links).
https://www.ozbargain.com.au/node/777430
https://www.ozbargain.com.au/node/780177
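The quick maths, in case anyone wants to check: memory bandwidth is just bus width times data rate, and these are the published specs for both cards.

```python
# Bandwidth = bus width (bits) / 8 * memory data rate (Gbps) -> GB/s.
CARDS = {
    "RTX 3070 (256-bit, 14 Gbps GDDR6)": (256, 14),
    "RTX 4060 Ti 8GB (128-bit, 18 Gbps GDDR6)": (128, 18),
}

for name, (bus_bits, gbps) in CARDS.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
```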
Since when should tech (which becomes obsolete fast) appreciate like a McLaren F1?
Would be a great price if not for Intel's Battlemage being reasonably imminent. Probably Q3 at this point, but they can't hold out much longer given RDNA4 and then the RTX 50 series.