Good price considering the current market. Not a clue when exactly they'll be in, but if you can wait, this is by far the cheapest 5080 available to order right now.
[Pre Order] MSI GeForce RTX 5080 16G VENTUS 3X OC White Graphics Card $2,264.40 + Shipping ($0 VIC Pickup) @ CPL Online

Comments
Is the 5080 faster than 4080 super?
Yes, but barely.
You can then OC it for another ~12% performance, which takes it pretty close to a 4090 while consuming around 350 W.
Agree, I'm surprised NVIDIA didn't milk it. Unless a lot of dies can't do it. We don't know, as there aren't even enough cards with customers lol.
@John Doh: There are enough, and there's plenty of reviews. No one seems to contest this.
Why isn't more of that headroom in stock configs? Probably to cut additional testing time at several levels, keeping costs down and targeting a competitive price point for when stock is plentiful. I'd imagine the goal is for the MSRP to become more of a ceiling within 6 months.
Whether NVIDIA starts shipping a system in their drivers that bumps clocks over time once the performance data comes in, who knows. I can imagine that becoming standard for CPUs, GPUs, NPUs, etc., over time.
@jasswolf: Probably, I haven't been following the latest news with this crappy release. Technically even the 9070XT OCs well when undervolted.
@John Doh: I mean they can push towards +10%, but the power consumption surges well past the 5080's, so they're accomplishing this by pushing far further along the voltage-clock curve than any consumer should.
Completely different efficiency gradient compared to what I'm referring to. Remember that Navi 48 XL has 18% more transistors than GB203 despite having around the same die area, and AMD have pushed density into something more favourable for typical laptop architectures and voltages (or perhaps enterprise).
It's potentially a long term boon for AMD that they have the capability to refine their silicon this way, but this was not a good design for desktop sales, hence their massive price cut right before launch to compete.
@jasswolf: The 9070XT consumes more power than a 5080 by default. I just rechecked, and it's surprising that even the 5070Ti is on par with and sometimes greater than a 5080 when it comes to power consumption - https://youtu.be/VQB0i0v2mkg?si=v_z0F9hPeiSncuKA&t=931
@John Doh: I mean this isn't a stunning revelation when you consider the entire board and chip design is basically the same, with the only difference being 14 SMs and an NVDEC subunit being disabled by laser.
On specific graphics workloads the power consumption can be very similar for what would be the same clock optimisation algorithm. Core scaling only takes you so far, as you'll hit walls with latency and voltage/current capabilities.
Given the price you can currently sell a 4080 Super for, it's an OK deal. If you wait until the next gen, the 4080 Super will only have limited resale value.
Much more expensive though 😱 https://www.3dcenter.org/artikel/fullhd-ultrahd-performance-…
Is your understanding tainted by bias and/or paid reviews… have past GPUs and/or CPUs met expectations 🤔… maybe SOME comments here reflect that?
Honestly don't know why anyone would get a 5080 over a 9070XT. $1000 for maybe 10% better performance in SOME games. Unless you BADLY want slightly more ray tracing performance.
Maybe because DLSS is far better than FSR? "10% better in SOME games" and more than 10% in others. I'm still hoping AMD surprise us all and introduce a 9070xtx.
FSR4 is not too far off DLSS; give it a few months and that gap will close even more.
Very unlikely, given nvidia has been at the deep learning game for a few years now and AMD is getting into it only now after wasting time and resources on the non-deep learning FSRs.
@billy77: Tech changes so much in a few months
That being said, most recent comparisons on FSR (9070XT) vs DLSS have shown near-equal performance. At these prices? Pick AMD this gen
@Pusheencat: Equal performance? Yeah AMD tune their output to match in frame time… what's the visual quality difference?
FSR 4 leaps past the FP16 DLSS CNN models, but it's an FP8 hybrid CNN-transformer model that seemingly has about the same frame cost as NVIDIA's transformer model. DLSS transformer models are also FP8, and the visual quality difference is stark.
AMD's model does maybe 1 thing better by blurring out issues in a visually pleasing manner (though it does this specific aspect better than DLSS CNN), then the transformer model thrashes it around the room and looks like native or better than native (particularly with respect to basic TAA implementations).
I don't like this price for a GB203 die, but NVIDIA's software R&D is top notch and reflects a hardware investment in supercomputing training time, engineer and coding effort, and being years in advance. And it's not the only set of neural models that are available that thrash AMD's efforts.
@billy77: DLSS and FSR are based on public research. It comes down to the models they use and how much the model learns. Like all AI learning, progress is fast at the start but eventually slows down to almost nothing. Nvidia has been at this plateau for 2–3 years now, doing a lot of learning for almost no gain. AMD has more or less just finished the fast phase and has now entered the plateau.
@Shiroi Okami: NVIDIA just flipped over to FP8 transformer, which they label as in beta, and already produced a new preset for it at launch, so we're not at any plateau.
One quick look at ray reconstruction will show you that methods for fixing shadow/lighting ghosting still have a long way to go. Overall the image quality is flattening out, but in terms of bringing light simulation to a greater level of real-time performance without motion artefacts, you can see issues and examples that require more training, and greater compute (potentially an FP4 model, in time).
AMD is blurred by comparison, just not a completely fizzled mess anymore. They've barely made the first step in terms of a practical application, which is standard AMD these days. They love open standards because they really only drill down on devising hardware that meets them, not producing software solutions.
And that's just upscaling, frame gen, and denoising… we've now got a bunch of neural models to massively improve simulating light bounces (subsurface scattering of skin & hair, radiance caches that infer most bounces beyond the first 2, etc). NVIDIA are years ahead on this front.
@billy77: So uh, just like DeepSeek came out of nowhere to compete against ChatGPT, which has had years and years more development?
Literally an AI that could keep up with the top dogs using fewer resources. Sounds familiar here.
@krisspy: Deepseek's magic comes from low-level programming to suit their needs… they basically re-wrote CUDA using the tools NVIDIA released years ago for that (PTX, a near-assembly language aka close to 'metal').
Over time, many specific neural libraries will get further optimisation, but expecting AMD to magically do that with their own SDK and then get everything to work nicely in their current graphics pipeline is absurd.
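To illustrate the PTX point, here's a minimal sketch of inline PTX inside a CUDA kernel (the kernel and values are hypothetical, purely for illustration). The asm block emits a raw PTX instruction instead of relying on the compiler's codegen, which is the "close to metal" tuning being described:

```cuda
// Minimal sketch of inline PTX in CUDA (hypothetical kernel, for illustration).
#include <cstdio>

__global__ void add_ptx(const int *a, const int *b, int *c) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    int result;
    // Emit a raw PTX add instruction instead of letting nvcc generate it.
    // "=r" binds the output register, "r" binds the two inputs.
    asm volatile("add.s32 %0, %1, %2;" : "=r"(result) : "r"(a[i]), "r"(b[i]));
    c[i] = result;
}

int main() {
    const int n = 4;
    int ha[n] = {1, 2, 3, 4}, hb[n] = {10, 20, 30, 40}, hc[n];
    int *da, *db, *dc;
    cudaMalloc(&da, n * sizeof(int));
    cudaMalloc(&db, n * sizeof(int));
    cudaMalloc(&dc, n * sizeof(int));
    cudaMemcpy(da, ha, n * sizeof(int), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, n * sizeof(int), cudaMemcpyHostToDevice);
    add_ptx<<<1, n>>>(da, db, dc);  // one block of n threads
    cudaMemcpy(hc, dc, n * sizeof(int), cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i) printf("%d ", hc[i]);  // prints: 11 22 33 44
    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}
```

DeepSeek's optimisations were obviously far more involved than a single instruction, but this is the mechanism: hand-written PTX wherever the compiler's output isn't good enough.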
FSR 4 has nearly caught up and comes pretty darn close; watch HUB's video on it if you haven't already.
Not the same mess FSR 3 and 3.1 were.
FSR 3 was not using AI like DLSS was; FSR4 is using AI like DLSS.
@Shiroi Okami: I'm well aware, which is why the notion of 'FsR tRaSh, DlSs GoOd' isn't as true as it was a few months ago (the point I was trying to make in my original comment).
Now if only I could get my hands on an MSRP 9070 XT…
They need an NVIDIA badge for the upgrade path to the 9090 Ti.
Wait until you see the price of the 9090 Ti!
It's closer to a 15-20% performance difference, especially above 1080p, and 30% with ray tracing, while using 20% less power. DLSS is still far better than FSR as well.
5070Ti is a better comparison though, and well worth the extra.
9070xt is great, but would've been a better deal if they weren't misleading with their dodgy pricing tactics.
Not always about gaming. I work a lot with 3D rendering/animation, and AMD cards are of no use because the software utilises CUDA waaaay better. Basically leaves NVIDIA as the only option for GPUs.
This is becoming more prominent. Rendering with NVIDIA is becoming a monopoly. With some software it's a day and night difference between the optimised NVIDIA path and anything else. Sad that we have put up with this crap. NVIDIA pay top dollar to companies so that only their cards can use certain modes.
Because CUDA destroys OpenCL.
It's not slightly more RT, it's plenty more RT - https://youtu.be/VQB0i0v2mkg?si=Yd325P8jXuD99X0w&t=1184
The multi fake frames, just like DLSS and FSR, will get better over time. Is it worth $1000 more? That depends on how important gaming is and how much money people have.
I have found NVIDIA cards work much better with CAD software.
I know that's a unique case that the average gamer won't care about, but it's a reason why I'm forced to obey daddy NVIDIA no matter how much they charge.
On the plus side though… tax deduction
VFIO support :(
Some crazy prices for GPUs; you're also losing ~$100 in interest per year if you buy this :(
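Rough check on that figure, assuming the $2,264.40 just sat in a savings account at ~4.5% p.a. (the rate is an assumption):

\[ \$2{,}264.40 \times 0.045 \approx \$102\ \text{per year} \]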
But think of the interest CPLonline will be making on a preorder with no ETA!!! Maybe there is no preorder; they'll just happily invest the $$ until the suckers cancel.
So then buy it on a credit card with a 60-day payment period and split it with PayPal.
Credit on credit till you get it.
Not sure why you got the neg, it's a fact. This price is ridiculous, and the drop in value is laughable the moment you walk out the door. You can save $1000 buying a card that's got 99% of the power.
$2,264.40 for 16GB of VRAM
I won't upgrade until there is some proper competition around the 80-tier cards. AMD has to close the gap on ray tracing performance.
Until then I'll just settle for my backup 3060ti 😂
Sold my 3080 preemptively thinking the 50-series would be a nice jump, but it's turned out to be a cash-grab mess, with the same power cable issues, because for who knows what reason GPU manufacturers don't use a few 50c shunt resistors to be able to load balance the cables…
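For context on why those shunt resistors matter, a rough worked example (assuming the 12VHPWR connector's 600 W rating across its six 12 V pins):

\[ I_{\text{total}} = \frac{600\ \text{W}}{12\ \text{V}} = 50\ \text{A}, \qquad I_{\text{per pin}} = \frac{50\ \text{A}}{6} \approx 8.3\ \text{A} \]

That 8.3 A sits safely under the roughly 9.5 A per-pin rating, but only if the current splits evenly; without per-pin shunt sensing the card can't detect one pin carrying double its share, which is reportedly how the melting connectors happen.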
Who pays this much for a single component, and why? I mean, I'm sure it looks amazing, but the price of an entire system for a single component… WOW!
Absolute disappointment the 5080s have been
Been wanting to upgrade my 3080 (first gen I've skipped), and they kneecapped the 5080 so much it's actually a 5070 in everything but name.
I'm in the same boat as you with a 3080. I skipped the 4xxx series because, other than the 4090, the performance uplift wasn't really there, and here we are 2+ years later and it's all… kinda the same?
Other than the 4090 and 5090, for basically everything new that's been released you could have bought similar performance for similar (or less!) money 2 years ago… and it wasn't all that impressive then.
Pre-order now and receive the card when?!!
Should only have to pay 1/4 of that price
Is CPL Online legit? I can see they are taking preorders for other graphics cards, but with no ETA.
Is this a good time to buy, or will it get more expensive when more trade tariffs come into effect? Where are these made/assembled? USA?
Seems like there have only been short windows when it was a good time to buy a GPU in the past 8 years.
2018 was a complete write-off thanks to the Crypto mining boom.
There was a brief period of normality in 2019 when I was able to buy an RTX 2080, but that reprieve was shattered by the COVID lockdowns and stay-at-home orders starting in early 2020, which made GPUs as scarce as hen's teeth again.
For a brief moment in 2022 it was possible to buy GPUs again so I purchased an RTX 3080 Ti - but when the whole Generative AI boom kicked off in earnest in early 2023, we got smashed again.
The release of the 40-series "Super" cards in early 2024 saw a stretch of somewhat normal supply in GPUs, so I took the opportunity to purchase an RTX 4080 Super.
But they ceased production of the RTX 40-series early in order to make way for the launch of the RTX 50-series.
When that subsequently got delayed and delayed into this year, GPU supply dried up drastically again - and here we are now, still, with no supply and ridiculous prices yet again.
Tech Yes City has stated in one of his recent videos that he thinks supply is going to get worse and worse from this point for at least the whole rest of this year.
I really hope he's wrong - but based on historical precedent… I wouldn't be surprised if he turns out to be right.
Which games do you play with these cards?
Some current popular games that need all the GPU power you can throw at them:
- Black Myth Wukong
- Hogwarts Legacy
- Cyberpunk Phantom Liberty
- Alan Wake 2
- Flight Simulator
- Indiana Jones Great Circle
I'm sure there are many more…
For all my "impulse buy before it's OOS, then think about cancelling after" people: https://www.pgrid.com.au/gpus/geforce-rtx-5080
It's been under $2,200 five times since the end of Jan. HODL!!!