Samsung UJ590 32" Inch 4K FreeSync Monitor $449 Delivered
Brightness (typical): 270 cd/m²
UHD 60Hz 4ms GT VA panel
DP 1.2, HDMI 2.0 & 1.4
VESA compatible
More specs: https://www.samsung.com/au/monitors/u32j59/
Do you find the screen bright enough, 'cos the spec is 270 cd/m²?
SDR content is usually mastered around 100 nits (cd/m²), so unless you've got the sun (or a very strong light source) beaming straight onto your screen, you're not making use of the panel's full brightness. The exception is if it has HDR functionality and enough dimming zones to drive the specular highlights you want to see in that sort of content.
Downvoting verifiable information again because you can't be bothered learning what the specs actually mean… good job!
@jasswolf: Some people seem to care more about numbers. Brightness is probably one thing people now look at because of the HDR specs. Thing is, HDR400 isn't true HDR, and the so-called 400 cd/m² is peak, not average. The majority of affordable HDR monitors are fake HDR anyway.
Though it probably shows that this monitor has been around for quite some time. Otherwise, Samsung would probably boost the peak brightness up to 400 through some minor hardware adjustment or firmware trick. It's only a peak-brightness requirement, so it's quite hackable.
@netsurfer: I'm only interested in HDR400 because it directs me towards wide colour gamut panels, though it's not a requirement of the spec. HDR600 or HDR400 with WCG are basically going to become standard soon enough, and while that's rubbish for HDR content, I'd be keen to see DCI-P3 content in gaming.
I'm only interested in HDR400 because it directs me towards wide colour gamut panels
HDR400 with WCG are basically going to become standard soon enough
That doesn't make sense. There is no benefit, marketing-wise, to doing HDR400 with WCG. The whole reason for having HDR400 is to give panel makers an easy / cheap way to state the panel supports official VESA DisplayHDR. Most makers cheat in the HDR400 space. You expect them all to become good citizens soon?
HDR600 with WCG might make more sense. When we really look at HDR, it is obvious that it is allowing VA to make a comeback (due to its contrast ratio). If you are a VA fan, then you might be happy with that. OLED, while able to display vivid colours well, also suffers from colour shift.
while that's rubbish for HDR content, I'd be keen to see DCI-P3 content in gaming
Once again, wishful thinking. If HDR400 allows panel makers to cheat, do you expect them to put in awesome DCI-P3 specs and cheap out on the backlight? Do you really have ways to objectively tell / differentiate those colours, or do you just want to look at a percentage number to make yourself happy?
Too many people buy into HDR400 being better than first-gen HDR panels (probably around the 300 cd/m² mark). The thing is that 400 refers to peak brightness, and we have seen firmware trickery that allows essentially the same panel that was quoted at 300-350 cd/m² to now quote 400 cd/m², hence getting the HDR400 tag.
That doesn't make sense. There is no benefit, marketing-wise, to doing HDR400 with WCG. The whole reason for HDR400 is to give panel makers an easy / cheap way to state the panel supports official VESA DisplayHDR. Most makers cheat in the HDR400 space. You expect them all to become good citizens soon?
These are panels using lower bins of backlight and LCD tech. It's not about makers pushing the bottom of the market up, but the top of the market evolving and that lower rung benefiting. That's why you're seeing Kogan now stock a WCG monitor that accepts HDR input.
OLED, while able to display vivid colours well, also suffers from colour shift.
Every technology has various issues with colour and brightness shifts off-axis, and it's OLED that generally wins the day over IPS, though it's subject to burn-in. It's an inherent property of stacking light sources together that you end up dealing with one or both; it's unavoidable without filters that dull the panel and introduce other effects.
It's the same kind of thinking that leads people to cry about super expensive headphones not doing everything perfectly: you have a single device trying to mimic the production of an entire physical medium, captured from a variety of sources and passed through a recording device. There are limitations to every approach that doesn't exactly mimic the physical world and/or how your brain interprets it.
Once again, wishful thinking. If HDR400 allows panel makers to cheat, do you expect them to put in awesome DCI-P3 specs and cheap out on the backlight?
Too many people buy into HDR400 being better than first-gen HDR panels (probably around the 300 cd/m² mark). The thing is that 400 refers to peak brightness, and we have seen firmware trickery that allows essentially the same panel that was quoted at 300-350 cd/m² to now quote 400 cd/m², hence getting the HDR400 tag.
As you can already tell, I don't care about the HDR aspect of these specifications, simply the WCG. I'm not interested in HDR tech that isn't either self-emissive or at least 4:1 pixel-to-backlight (a la Dual LCD). On a 16:9 desktop monitor, you will always notice halo effects until the specular source is fairly accurate, and this shows up most clearly in gaming scenarios.
That's probably not affordably and sustainably coming to the desktop until microLED or QNED, so I'll stick with the option of WCG, good sRGB emulation and fast panels. The Samsung G7 is where I'll be aiming for, or at least someone using that panel variant.
The Samsung G7 is where I'll be aiming for
DCI P3 Rec. 2020 xy: 67.6 %. That's it? That's good enough for you? Sigh… All that WCG talk.
Explains the brightness argument (VA). Honestly, if you picked that, then you care most about refresh rate (and that's a pretty valid reason).
So basically, any decent current-gen 144Hz or above gaming monitor offering DCI-P3 mode support gets the tick from you.
Every technology has various issues with colour shift and brightness shifts off-axis, and it's OLED that generally wins the day over IPS
Not for colour shift. OLED has a much more noticeable colour shift than IPS.
The OzBargain favourite monitor from 2012 is the Dell U2412M. That monitor is 300cd/m², and it's an ancient office monitor.
It really depends what you do with your system. If space for applications is your main priority, then go 4K, because Windows' scaling is terrible; but if your priority is gaming, 1440p will have larger benefits. I can't deal with 100% scaling on 1080p, so I have no idea how you guys do it at 4K.
I mean anything more than a 2080 Super and probably a 3060 is going to put paid to that soon enough, the latter even without DLSS. Entry level 4k 144Hz IPS is already down to $850, and by this time next year, I'd expect $500 offerings.
I wouldn't use the first gen IPS tech myself, but you also have the option of grabbing a 48" OLED and using 100% scaling to treat it like a 4x 1080p monitor array for productivity.
At 60Hz, I guess, but I'd say 144Hz at 1440p is probably going to be more beneficial, even just for eye candy, than 60Hz at 4K. 4K 144Hz isn't going to run too well without relying on DLSS 2.0, which is implemented in very few titles, and it'd cost an arm and a leg.
@Void: All of that is going to start shifting very dramatically in the next few months, particularly with Unreal Engine titles. Keep in mind the LG OLED is 120Hz as well, but that's only cost effective right now if you're going to buy 4 monitors.
It should slip into close to $1000, at which point it's a steal for limited productivity and casual gaming if you can fit it on your desk.
It'll also be interesting to see if QNED or QD-OLED shake things up next year.
Entry level 4k 144Hz IPS is already down to $850
Which one(s)?
@netsurfer: My bad, didn't realise the XV273KP had been completely discontinued.
The LG 27GN950-B is arriving soon, and that will place price pressure on the existing Asus offering, with Innolux-based regular and miniLED/dual LCD panels coming in next year, likely to compete at the same (current) price point and cheaper.
Even the EVE Spectrum 4k 144Hz variant should be available for around $1050-$1100, but that may be subject to tax and it needs a monitor stand.
@jasswolf: EVE Spectrum… hm… I am not sure that's worth the trouble…
Not sure if you saw, but the RTX 3080 averaged about 120 FPS in Doom Eternal at 4K. Might want something with a better refresh rate.
This or the Xiaomi 34" UltraWide?
That's your choice mate, completely different things. One is an ultrawide, and this is a 16:9.
I'm personally an ultrawide kind of guy. You'll need to decide how that fits into your workflow. Keep in mind, 4K pushes roughly two-thirds more pixels than a 3440×1440 ultrawide, so it takes noticeably more GPU resources to run. This screen is also limited to 60Hz, if that is something that affects you.
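The resource-cost comparison above is just pixel arithmetic; here's a quick sketch, assuming the Xiaomi's standard 3440×1440 resolution (GPU load scales roughly, not exactly, with pixel count):

```python
# Compare the pixel counts a GPU has to shade per frame.

def pixels(width: int, height: int) -> int:
    """Total pixels in a frame at the given resolution."""
    return width * height

uhd = pixels(3840, 2160)    # 4K UHD, 16:9
uwqhd = pixels(3440, 1440)  # 34" ultrawide (UWQHD), 21:9

# 4K pushes about 67% more pixels than the ultrawide.
print(uhd, uwqhd, round(uhd / uwqhd, 2))
```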
Neither for gaming. Keep an eye out for the Samsung G5 release in this range:
https://www.samsung.com/us/computing/monitors/gaming/27-g5-o…
Likely due here in late October, maybe some sales in December-January.
For productivity, you might be better off with 4k IPS, which starts at around $350-$400 at the moment via the AOC U2790VQ. There are other models using the same Panda IPS panel, but I'm not sure how many of them have hit the Australian market.
27 ≠ 32
Sure, then you're probably looking at about the same price for basic 4K IPS, but there are benefits to 27" 4K, because that's only just reaching so-called retina spec at a 53 cm viewing distance.
Though that's heavily disputed, and the longer term target is more like 450-500 PPI @ 30 cm, rather than 300 PPI @ 30 cm.
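The 53 cm retina figure can be sanity-checked with the common 1-arcminute-per-pixel rule of thumb (an assumption of this sketch, not a hard standard):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_ppi(viewing_cm: float) -> float:
    """PPI at which one pixel subtends 1 arcminute at this viewing distance."""
    viewing_in = viewing_cm / 2.54
    return 1 / (viewing_in * math.tan(math.radians(1 / 60)))

print(ppi(3840, 2160, 27))  # ~163 PPI for a 27" 4K panel
print(retina_ppi(53))       # ~165 PPI needed at 53 cm
```

So a 27" 4K panel (~163 PPI) sits just under the ~165 PPI threshold at 53 cm, matching the "only just reaching retina" claim.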
I just sold this one on eBay and bought the Xiaomi 34 inch. Things I don't like about this monitor: it does not auto-switch inputs. It's extremely difficult to use with a monitor arm, because you have to take the original stand off, and the only way to do that is to peel off the plastic frame and unscrew it from inside the chassis. Then when you decide to switch back to the original stand, you have to do this again… Basically it requires you to completely disassemble the monitor.
As for the difference, they are two totally different categories, so it depends on your use case. I swapped to the Xiaomi because I wanted to try high refresh rate gaming.
Edit: You don't have to disassemble the monitor to take off the stand if you are OK with leaving it on while using a monitor arm.
No auto input switching!?
Samsung, 1995 called…
If you are thinking of using it with consoles, go with standard 16:9 ratio monitors.
Personally, I prefer having the extra height & resolution of 16:9 4K monitors also.
Thanks OP. Bought one. Was seriously weighing up between this and the Xiaomi, but figured I do more movie-watching, YouTube and work overall than gaming, so got this instead. Cheers mate. Now to find a monitor arm.
I would have thought the xiaomi ultrawide would be better for productivity.
I was thinking this until I realised many websites and apps assume the average monitor (1080p) aspect ratio, so you have to run most apps in half a screen. I can't imagine what Microsoft Word, streaming video or my TV app (SichboPVR) would look like on a full-screen ultrawide. I'm told YouTube has black bars on either side and the video in the centre.
I've got bad news for you. You can use any VESA monitor arm without any issue, but the original stand doesn't come off easily. To take off the stand you will have to peel off the plastic frame, which basically means completely disassembling the monitor. Or you can just leave it on, which doesn't affect the monitor arm but will be very ugly…
I have had this monitor for a couple of years now, and this is definitely not true. The original stand comes off easily enough and does not require disassembly of the plastic frame. I have it wall mounted currently, and easily switched to the original stand and then back when my room was being painted. I am not sure what was up with your unit, but from memory you just have to unscrew at the bottom of the stand and that is it.
Yes the stand base is just a toolless screw, but the arm connecting the base and the monitor cannot be removed easily.
I followed this YouTube video, yes I know the model in the video is not this one, but the thing I was trying to remove is the same. https://youtu.be/C5gOtilznu4
If anyone knows a better way to do it please let me know…
@WilliamDGH: As per Vita below, from memory it required a very firm tug, and I remember thinking it was an annoying stand, but it is very much removable.
@witheredcouch: I just checked the manual, and yeah it seems to be removed without disassembly…
I guess you should still read the manual instead of searching on YouTube…
Yep, the stand is very easy to remove with no tools required
I have this monitor and had the same experience as witheredcouch: you just have to shake it off horizontally. It's all in the manual.
I feel that someone should point out the FreeSync range is 40-60Hz… for me personally that's like having no FreeSync.
I have this, it's very good. Colours bright and saturated.
Windows detects it as a TV, so make sure the integrated graphics driver doesn't apply limited colour range instead of full range (movies apparently don't use the full colour range, so TVs often default to limited).
HDR really only looks good on a monitor if it has FALD
Bought one.
Thanks, OP!
I think I'll hold out for IPS, as I use it in a bright room a lot.
Shame there's no good 32" 4K IPS.
Fantastic for workflow. I work with large Excel spreadsheets and PDFs, and the screen real estate is a godsend.
I bought 2 of these last week for a CAD/modelling workstation that I've just built. I paid $540 each.
They're great IMO. I know little about display specs but for what I use them for, they are excellent screens. 4k video looks fantastic.
At this price, they are awesome value for money.
One thing I don't like about them is the lack of screen adjustment - which is restricted to tilt only. I'm definitely going to invest in some decent monitor stands.
I have one of these.
I’ll vouch for this monitor.
Make that $350 and I'll buy 3, support 2.1 for $400 and I'll buy two.
At its current price not interested
Just in time for the new RTX cards. I will say upgrading to a 4K monitor has been the best upgrade I've done, and I can't go back to any lower resolution because of how much space you have to do things in 4K.