Doesn't include integrated graphics, so a discrete graphics card is required.
Intel Core i5-9400F $206.10 + Delivery (Free Delivery with eBay Plus) @ Computer Alliance eBay
Comments (closed)
Is AMD really all over Intel these days, or is it just a price thing?
It's a threads thing.
The only Ryzen you can buy for ~$200 is the last-gen 2600, which the 9400F will trounce for gaming and most IPC-dependent applications (e.g. video work in Premiere Pro).
Trounce as in mop the floor, or just a few negligible fps?
It depends on the game: if GPU-bound, the difference is negligible; if not, it can be significant. For example, there are plenty of CS:GO players with 240 Hz 1080p monitors, and the 9400F will significantly beat out the 2600 there. It depends on the sort of games you play.
If they were similar prices the R5 3600 is for sure a better buy, but this is $109 cheaper than the current retail price of the 3600. They perform similarly in gaming and both are more than adequate for 60fps gaming in the vast majority of current games (both often pushing >100fps in some titles). The 9400F also performs better in many games than the previous-generation (and cheaper) R5 2600.
Basically this is a good deal for someone trying to build an affordable PC. If you can get a 2600 for similar pricing, then that's a viable option (and a clearly better option if you intend to run well-threaded productivity workloads), but if not, this is a good chip.
I think the IPC vs threads balance in gaming is about to shift in the very near future, as games utilise more and more threads.
Even the next-gen Xbox/PS5 consoles use Zen with 8 cores / 16 threads.
I wonder if they'll disable SMT (Hyper-Threading) though, as it has traditionally hurt gaming performance.
@TheContact: It hurts performance if the extra threads aren't utilized well. It helps performance if they are. Disabling SMT would give a (very small) boost in poorly threaded games, and a big penalty in well threaded ones. Disabling it even on PC CPUs is generally inadvisable except in cases where the game or software has severe compatibility issues caused by higher thread counts.
Since consoles are a fixed target spec, it's likely they'll keep SMT active, since that gives more (potential) performance at a very small cost in silicon area.
I'm not sure a 2600 (6c/12t) is going to stand up long term against 8c/16t as the base standard for consoles (albeit at a slightly lower frequency), so I can't imagine that anybody is buying a budget CPU today and expecting it to last 7 years. But for the first few years after their launch (so basically ~3 years starting now, considering a Q3/Q4 2020 release), there won't be many next-gen exclusive games; we'll mainly have cross-generation games that still have to run on the current consoles. That puts a floor on performance requirements - the game still needs to run on the crappy Jaguar CPUs from 2013. Even if these games become better optimised for higher thread counts, that performance floor guarantees that even older CPUs will be able to run them well, even if 6c/12t would run better relative to how it performs on average today.
Trying to plan for the long term future (3-4+ years) is fraught with uncertainties. No telling how the tech market will shift.
I think the IPC vs threads balance in gaming is about to shift in the very near future, as games utilise more and more threads.
You should watch this video: https://www.youtube.com/watch?v=0nTDFLMLX9k&feature=youtu.be. It goes to show the amount of work required to make a game use threads more efficiently. There is also a cost to making games utilise more threads: they will run more poorly on fewer cores. This is a risk for developers.
Either way, you're right, but I don't think the progress is going to come as quickly as you would like. As much as enthusiasts like high end hardware, the majority of users are simply not on the cutting edge. Game developers develop for the majority, the largest market for their games.
The Steam hardware survey shows that 54% of all their users are still using quad-core CPUs. Given that the update cycle for CPUs is quite long (I still know people on a 2600K), it'll be a while before this number is low enough that developers can afford to punish lower core counts.
Even the next-gen Xbox/PS5 consoles use Zen with 8 cores / 16 threads.
Consoles are slowly evolving to just basically being PCs in terms of their underlying architecture. Either way, I think that two of the cores are going to be provisioned for the OS and background services, much like what happened with this generation.
2600 is $200 (not sure if discounts are applicable here)
Apparently the 9400F is a tad faster than the 2600X. So with an OC there'd be a negligible difference; without one, a slightly bigger difference, but likely unimportant.
There's also upgrades, should you want to plan for that. A 2600 could be upgraded to a 3700X if you find performance lacking. A 9400F wouldn't have a reasonable route to the 9700K, as you'd need a Z390 board to get the best performance (and those, to my knowledge, are far more expensive than B450 boards).
If you're going to buy a GPU anyway, is there any reason not to buy an F-series Intel chip?
Other than AMD being the reason? RIP Intel
It's always handy to have an onboard GPU should the dedicated GPU fail and you need a fallback.
But in most cases it wouldn't be an issue. You're not really getting the CPU any cheaper because it's missing the iGPU; you're effectively getting a chip that hasn't passed QA with its iGPU enabled.
So can we use that GPU?
No, the iGPU is disabled at a hardware level, so I don't think you can. I believe the F range are chips that failed QA and could have defects in the iGPU, which is why it's disabled. Intel were struggling to push out stock and resorted to selling these chips.
If you don't need the iGPU then these are perfectly fine. If resellers are struggling to sell them, they may mark them down to get them out the door, resulting in a cheaper price than the regular chips.
@sghetti: That's good news; I never knew about the F series as it wasn't readily available in shops. I have never used a desktop with an iGPU.
Btw, in the good old days there were options to enable more shaders on Nvidia cards; that's why I asked whether we can enable the disabled iGPU too.
Some workloads benefit from an iGPU even with a discrete GPU, e.g. Adobe apps will leverage the iGPU for accelerated H.264/H.265 encoding.
For gaming, unless you stream, not so much.
Can't look at a single Intel deal without the AMD lurkers coming out. To set the record straight, I don't have a preference and would use either.
The funny thing is that Intel is now better for more budget oriented systems and AMD is better for high end ones. I'd go with 3700X and 3900X for higher end systems any day, but the i5 9400F and i3 9100F still beat what AMD is offering at that price point.
Currently running a 9600K in my gaming machine. If I were building now I'd certainly look into the new Ryzen chips as well.
i5-8400 is like 1fps less than this
or maybe 3fps at most
Yes, but the 8400 is more expensive.
Or am I reading this wrong?
Good price; better for gaming than the 2700, and not a total flop in productivity. I'd prefer the 2700 myself, but this is a good entry point for a starter gaming PC.
What's a good mobo to use with this?
Funny how $200 or so for an i5 is considered a bargain. When I bought my 3rd-gen i5 back in 2012, that was the normal price for these. Personally I'd want the 9500 over the 9400 due to the higher 4.4 GHz clock.
It's almost like you haven't considered inflation, among various other economic factors, driving changes in consumer product prices.
Hmm, 2% or less inflation per year over 7 years… pretty negligible.
Because changes in the economic climate, corporate profits to please shareholders, on top of the increasing costs of doing business, have nothing to do with it. How much more are you paying for electricity these days compared to 2012? How much more are you paying for takeaway food or restaurant meals compared to 2012? Not everything follows the standard 2% inflation; as stated in my previous comment, it's inflation among various other economic factors.
2% or less inflation per year over 7 years… pretty negligible
This is a seriously uneducated statement. Something that was $200 seven years ago would be about $230 today. I don't see how that's negligible.
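As a quick sanity check of that figure, here is the compounding worked through in a minimal Python sketch; the flat 2% annual rate and the 7-year span are the assumptions quoted in the comments above, not official CPI data:

```python
# Compound a $200 price at ~2% p.a. inflation over 7 years (figures from the comments above).
price_2012 = 200.0
inflation_rate = 0.02          # assumed flat 2% per year, as stated upthread
years = 7

price_today = price_2012 * (1 + inflation_rate) ** years
print(f"${price_today:.2f}")   # ~$229.74, i.e. roughly the $230 quoted
```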
@p1 ama: Meh, a 15% increase over 7 years isn't much. But $200 was the price I paid for my i5, which was the fastest version at the time. There were also several lower-clocked versions, which sold for a bit less - about $180. This CPU (9400), however, is not the fastest, not even the 2nd fastest; it's the slowest i5 out now! In other words, today's slowest i5, on sale, is the same price as the fastest i5 from 7 years ago! This might be a good deal if it were for the 9500.
In other words, today's slowest i5, on sale, is the same price as the fastest i5 from 7 years ago!
Read my comment below - it's not about inflation. It's the fact that the AUD is worth over 30% less than it was when you purchased your i5. In other words, if you paid $200 for your i5, then today it would actually be $297.
The AUD/USD exchange rate back then (assuming August 2012) was about 1.04 USD per AUD. It's now around 0.70 USD per AUD. In PPP (purchasing power parity) terms, you paid more for your third-gen i5 than this costs now.
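For the exchange-rate argument, a similar sketch using the rates quoted above (1.04 USD per AUD in 2012 vs ~0.70 now); this ignores US-side inflation and retail margins, so it's only illustrative of where the ~$297 figure comes from:

```python
# Re-express a 2012 AUD price at today's exchange rate (rates quoted in the comment above).
price_aud_2012 = 200.0
aud_usd_2012 = 1.04    # ~1 AUD bought 1.04 USD in late 2012
aud_usd_now = 0.70     # ~1 AUD buys ~0.70 USD now

price_usd_2012 = price_aud_2012 * aud_usd_2012   # the 2012 price in USD terms
price_aud_now = price_usd_2012 / aud_usd_now     # that same USD amount in today's AUD
print(f"${price_aud_now:.0f}")                   # ~$297, matching the figure above
```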
Close enough - it was bought October 2, 2012. But the exchange rate at the time of purchase is largely irrelevant. The chips could have been bought by the retailer much earlier at a different exchange rate.
That's beside the point; the exchange rate back then was much higher than it is now over a very long stretch of time. I've previously done consulting work on foreign exchange risk and hedging practices for some major companies, but if you want to just blindly believe that you got a good deal in 2012, then I can't change your mind.
Hard to recommend this over Ryzen…