Hello! Person who actively dislikes 4k here. In my experience:
1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS that can mean it renders tiny, or that the whole thing is super ugly and pixelated (WAY worse than on a native 1080p display).
2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3, and it adds way more productivity to my workflow than "My fonts are almost imperceptibly sharper" ever could.
My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density. I also have 20/20 vision as of the last time I was tested.
My argument in favor of 1080p is that I find text to just be... completely readable. At various sizes, in various fonts, whatever syntax highlighting colors you want to use. Can you see the pixels in the font on my 24" 1080p monitor if you put your face 3" from the screen? Absolutely. Do I notice them day to day? Absolutely not.
I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.
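For reference, on the i3/X11 point: the usual minimal HiDPI recipe looks something like the sketch below. This is illustrative only; the specific values (192 DPI, 2x toolkit scale) are assumptions for a 4K panel at 1x physical size, not a complete guide, and this is exactly the part that tends to get time consuming once per-app quirks and mixed-DPI monitors enter the picture.

```shell
# Tell Xft-based apps (most terminals, dmenu, etc.) to render fonts at 2x DPI.
echo "Xft.dpi: 192" >> ~/.Xresources
xrdb -merge ~/.Xresources

# GTK and Qt apps read their own scaling hints at startup.
export GDK_SCALE=2                      # GTK 3 integer UI scaling
export QT_AUTO_SCREEN_SCALE_FACTOR=1    # Qt 5.6+ per-screen scaling

# Restart i3 in place so newly launched clients pick everything up.
i3-msg restart
```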
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode
You lost me right here on line 1.
If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
Things like this are exactly why I left Linux for MacOS. I absolutely get why you might want to stick with Linux, but this is a Linux + HighDPI issue (maybe a Windows + highDPI issue also), not a general case.
> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required.
You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.
I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens, but it doesn't work for all apps (e.g. QGIS). So maybe not all OSes, but two important ones.
> You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.
I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
>I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
Are you sure you have 20/20 vision? I can absolutely resolve individual pixels with zero effort whatsoever on 1080p 27-inch displays.
Back when I had a 27-inch 1080p display at work, my MacBook's 13-inch Retina Display effectively became my main monitor. The 27-inch monitor was relegated to displaying documentation and secondary content, because I found its low resolution totally eye-straining.
Edit: I might have found it so eye-straining because MacOS does not support subpixel rendering. That means a lot of people will need a 4K or Retina monitor to have a comfortable viewing experience on the Mac.
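As a rough sanity check, here is a back-of-the-envelope sketch using the conventional rule of thumb that 20/20 vision resolves about one arcminute; the numbers are estimates, not a rigorous vision-science claim:

```python
import math

def monitor_ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def resolvable_ppi(distance_mm, acuity_arcmin=1.0):
    """Finest pixel density (in PPI) a viewer can distinguish at a given
    distance, treating acuity as a minimum resolvable angle (default:
    the one-arcminute convention for 20/20 vision)."""
    feature_mm = distance_mm * math.radians(acuity_arcmin / 60.0)
    return 25.4 / feature_mm

screen = monitor_ppi(1920, 1080, 27)   # ~81.6 PPI on a 27" 1080p panel
eye_1m = resolvable_ppi(1000)          # ~87.3 PPI resolvable at 1 m
eye_desk = resolvable_ppi(700)         # ~124.7 PPI resolvable at 70 cm
```

By this estimate the eye's limit at one metre is already slightly finer than the panel's pixel pitch, and at a typical desk distance it is considerably finer, which is consistent with individual pixels being visible on a 27" 1080p screen.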
MacOS does support subpixel rendering, and has since at least the early-to-mid 2000s. One or two versions back, though, they turned it off by default, since it isn't necessary on HiDPI "Retina" displays and they only ship HiDPI displays now.
You can still turn it on although it requires the command line.
Subpixel rendering dramatically slows down text rendering. When you have a high-res screen and want everything at 120fps, even text rendering starts to be a bottleneck.
That, combined with the fairly massive software complexity of subpixel rendering, is probably why macOS dropped it.
It's been a while since my eyesight was tested, but I think so! I can see pixels if I focus, but not when reading text at any speed. I have also checked and my display is only 24" (could've sworn it was more!) so maybe that's why. I retract my comment :)
> I think the point the parent is making is that human vision has limited resolution.
If you can't see the difference between 4k and 1080p on a 24" monitor, then you probably need reading glasses. On a 27" monitor it's even worse. It's not so much that you can "see" the pixels; subpixel rendering and anti-aliasing go a long way toward making the actual blocky pixels go away. The difference is crisp letters versus blurry ones.
Yes, I can see the difference, but I (personally) don't notice that difference while reading. I do notice a big difference when using older monitors with lower DPI compared with 1080p on a normal-sized desk monitor, however.
> I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens. Doesn't work on all UIs (e.g. QGIS). So maybe not all OS's, but two important ones.
Haven't seen any scaling issues on Windows in years. Last time was Inkscape but they fixed that.
I see these issues all the time, with enterprise desktop apps. The scaling is only really a problem because it is enabled by default when you plug in certain displays. If the user made a conscious choice (which they would easily remember if they had trouble), it would be fine.
For many, many years monitors topped out at 120 dpi, with almost all at 96, and I imagine a lot of enterprise applications have those two values (maybe 72 as well) hard-coded and don't behave properly with anything else.
I'm currently working from home, accessing my Windows 10 desktop machine in the office via Microsoft's own Remote Desktop over a VPN connection. This works fine on my old 1920x1280 17" laptop, but connecting from my new 4k 15" laptop runs into quite a few edge cases, and plugging in an external non-4k monitor has led to at least two unworkable situations.
I've now reverted to RDP-ing from my old laptop, and using the newer one for video calls, scrum boards, Spotify and other stuff that doesn't require a VPN connection or access to my dev machine. It mostly works OK in that configuration.
I've seen other weird things happen when using other Terminal Services clients, though.
> Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering - with fonts being a blurry mess. You can only really use MacOS with high DPI monitors now for all day working. It’s a huge problem for everyone I know who wants to plug their MacBook into a normal DPI display. Not that the subpixel/hinting was ever that good - Linux has always had much better font rendering in my opinion across a wider range of displays.
> Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering
Nonsense, fonts look fine on non-Retina monitors; they were fine on my old 24" 1920x1200 monitor and are fine on my new 27" 2560x1440 one. Can I see a difference if I drag a window from the external monitor to the built-in Retina display? Yes, but text is not blurry at all on the external monitor.
If it matters, "Use font smoothing when available" is checked in System Preferences (which only appears to have an effect on the Retina display, not the monitor).
That's been my experience, too. I prefer high-DPI monitors, but back when I was going into the office (remember going into the office?) and connecting my MacBook to a 1920x1200 monitor, text was perfectly readable. I suppose if I had two low-DPI Macs, one running Catalina and one running, I don't know, High Sierra, I might be able to tell the difference at smaller font sizes.
As an aside, I wonder whether the article's explanation of how font hinting works -- I confess for all these years I didn't know the point of "hinting" was to make sure that fonts lined up with a rasterized grid! -- explains why I always found fonts to look a little worse on Windows than MacOS. Not less legible -- arguably hinted fonts are less "fuzzy" than non-hinted fonts on lower-resolution screens, which (I presume) is what people who prefer hinted fonts prefer about them -- but just a little off at smaller sizes. The answer is because they literally are off at smaller sizes.
These things are fairly subjective. But it's hard to argue that Catalina has good font rendering on regular DPI screens. I dealt with it when I had to, but it was very poor. There are also tons of bugs around it. Like the chroma issue - Apple doesn't support EDID correctly, so fonts look even more terrible on some screens. A Google search will confirm these problems.
This is an interesting position. I had always thought that fonts and font rendering were an especially pernicious issue on Linux and a relative joy on macOS?
I think that is a historical artifact. Ubuntu had a set of patches for freetype called infinality, developed around mid 2010, which dramatically improved font rendering. Since then, most of those improvements have been adopted and improved in upstream. [1] Any relatively modern Linux desktop should have very good font rendering.
As with most things Apple, it is a joy as long as you restrict yourself to only plugging the device into official Apple peripherals, preferably ones that are available to buy right now. It’s when you start hooking your Mac up to old hardware or random commodity hardware that the problems surface.
I recently started using Linux some on the same 4K monitor I usually have my Mac connected to. I was shocked at how much sharper and easier to read the text was on Linux.
I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps. I'm also surprised when my wine programs scale properly as well.
What does not work perfectly is mixing HiDPI and low-DPI screens. On Wayland with Wayland-compatible apps it works fine, but X11 or XWayland apps like Electron will not rescale when you move a window to the other screen; they scale for one screen and look wrong when moved over. Overall I don't find this to be too much of an issue, and when Chrome gets proper Wayland support the problem will be 99% solved.
I can confirm this anecdata. Single and dual 27" 4k is fine, but mixing with a 27" 1440p is messy (tried with GNOME and KDE on Manjaro during early Corona home office).
> I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps.
It's good to hear things aren't as bad as some have suggested.
I think it was bad: when I look through the issue trackers, a lot of HiDPI bugs were closed less than a year ago. But I have not really noticed much other than what I noted about multiple monitors.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
PyCharm had high CPU consumption issues with a Macbook connected to a 4k display running in a scaled resolution. Native 4k was fine, using the default resolution was fine but "more space" made it use tons of CPU since it had to rescale text and UI elements on the CPU.
I think all the JetBrains tools did. I remember a few years ago there was some big switchover (High Sierra maybe?) and the JetBrains tool fonts were janky for a while - something to do with the bundled JVM and font rendering. I think it's sorted now, but there was an 'issue' there for a while. Maybe it's still an issue in some configurations?
It's why I actively avoid monitors with small pixels. My trusty old Dell U3011 and the two rotated 1600x1200's flanking it suit me just fine.
I've no inclination to change my OS, either, just for the sake of fonts.
Kudos to those commenters still on CRT displays. One complaint I have with LCD's is reset lag time, which can make it tricky to catch early BIOS messages.
Sure. The fastest I've seen lately is that ARMv7-A board submitted the other day which boots in 0.37 seconds, or with networking, 2.2 seconds. That's time to user land, and to achieve it took highly specialized firmware and a stripped down kernel compiled with unusual options. I've yet to see a PC come anywhere close to that, and personally I won't consider the boot problem solved until a cold one completes quicker than I can turn on a lightbulb.
In fact historically, some of the higher-end hardware yielding the best performance during operation (an axis along which I optimize) actually added time to the boot sequence. The storage subsystem on my workstation is backed by a mix of four Intel enterprise-grade SSD's in RAID-0 (raw speed) and 8 big spinning platters in RAID-6 (capacity), plugged into an Areca 1882ix RAID card w/ 4GB dedicated BBU cache. Unfortunately that card adds a non-bypassable 30 seconds to the boot sequence, no matter what system you plug it into. But once there, it screams. It's only just the last couple years PCI-NVMe drives have come out that can match (or finally beat) the performance metrics I've been hitting for ages.
So I actually kind of feel like I've been living in the future, and the rest of the world just caught up ;-).
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).
Same. I haven’t run into any apps that don’t support high dpi mode. Even terminal apps look great on my Retina 4k iMac screen.
Before getting this machine nearly a year ago, I couldn't natively view high dpi graphics for web projects I’d work on, which was a problem since there are billions of high dpi devices out there.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).
The version of pgAdmin not based on electron had pixelated fonts on a 5k iMac. I haven't checked recently.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them
Audacity has extremely low framerates and chugging when interacting with the waveform on retina screens. Even running in low DPI mode doesn't fix it. Only runs nice on non-retina displays.
Agreed. I have a Retina MBP that I use as the third screen, plugged into a 4K monitor over USB-C and a 1920x1200 over DisplayPort. Everything works fine. Windows stay in the right place when plugging/unplugging, etc... My eyes also thank me every time I look at the 4K text on my main monitor. I'm debating buying another one and dumping the 1920 monitor.
The post I quoted was: "No matter what operating system you're on". I quoted that particular bit for a reason.
The ergonomics of better screen resolution don't change just because your OS isn't good at dealing with high resolution.
When you pick an operating system (if you have the choice), there are a lot of factors, it's good to know what's important to you and choose appropriately.
That was a direct reply to the “No matter what operating system you're on” quote. The fact that there are some operating systems that have issues doesn’t make it true that all do.
FWIW, my main monitor is a 43" 4k display, and it works perfectly fine on AwesomeWM - but I don't use any scaling; the 4k is purely for more screen real estate - literally like having 4 perfectly aligned borderless 24" monitors. I can fit 10 full A4 pages of text on screen simultaneously.
I recently upgraded to a 43” 4k monitor and use it the way you describe. I’m not sure I am happy with it. The real estate is nice but it might be too much. UI elements end up very far away. I rarely need all that space.
I either need a bigger (deeper) desk to sit back farther or just a smaller monitor physically with the same resolution.
What helped when I moved to a large 4k screen was to stop trying to eke out every last bit of space and start using my desktop as an actual desktop analogue again. Whereas I used to full-screen everything, and snap to halves or quarters of the screen, now I have a few core apps open that each take up roughly a quarter, usually a browser and email, and other apps I open up and move around as the particular task warrants (generally some terminals that are all fully visible). I often drag the current thing I'm working on to the bottom half of the screen so it's slightly closer and easier to see directly, and leave reference items or stuff I'm planning on revisiting shortly up top.
I also was thinking I didn't particularly like the large desktop and screen at first, but now that I treat it as a combined wallboard and desk space, I can't imagine going back (and using the large but not quite 4k monitor at my desk at the office always felt like a step backwards).
I do set default scaling for Firefox and Thunderbird to be about 125% of normal though, as I don't like squinting at small text. I generally like how small all the other OS widgets are though, so I don't scale the whole desktop.
Thanks for the input. I’m about a week in and that is the realization I am reaching. My window sizes are almost back to where they were before.
It has been an iterative process but I’m getting a handle on it.
I have a sit/stand desk, I’m considering moving my recliner in to the office and elevating the monitor. Working from a reclined position seems ideal but I also thought a 43” monitor was a good idea...
If you're on Windows, consider the fancyzones power toy. Break up your screens into an arbitrary set of maximizable zones. Highly recommend if the normal 4 corners isn't enough for you.
I'm on MacOS. I basically never maximize anything. I just resize the floating windows and put them where they make sense at the time. The dream would be doing this with the keyboard using something like i3. I have been tinkering with that on weekends but M-F I focus on paying work.
Check out Moom. You can save window layouts and replay, save window positions and apply them to any app, arbitrarily move, maximize to a range with a customizable margin, etc etc.
https://manytricks.com/moom/
Depending on how exact of a similarity you're looking for, gTile for GNOME/Cinnamon might be of interest to you. I've also found PaperWM to be very productive.
I use gTile for gnome. I had to play around to find a setup that I like. I eventually settled on ignoring most of the features that were offered out of box. Now I have configured a few simple keyboard shortcuts.
For example, Super + Up Arrow will move a window into the central third section of the display. Pressing it again will expand the window a little bit.
Nice and simple. Makes working with large displays pleasant.
I use Rectangle. I’ve configured it with a few keyboard shortcuts that let me move a window into specific regions on the display. I use it to quickly have multiple non-overlapping windows.
I cannot imagine using a large display without it!
BetterTouchTool is _excellent_ for window placement/resizing; it can also be triggered externally if you want to combine it with Alfred or Karabiner Elements (though you don't have to - you can define the triggers in BTT itself).
The 4K monitor at my office is a fair bit smaller than that, and I haven't used it since the COVID-19 epidemic sent me packing home. Since then I've just been using my laptop's built-in screen.
To be honest, I think I may stick to it. At first, the huge monitor was fun, and initial change to having less screen real estate was definitely a drag. But, now that I'm accustomed to it again, I'm finding that "I can fit less stuff on the screen at once" is just another way of saying, "it's harder to distract myself with extra stuff on the screen." My productivity is possibly up, and certainly no worse.
A major pain point (literally) for me with laptop screens is posture. My neck aches after a day of looking mostly down. I suppose an external keyboard and mouse would help but I would have to get a stand and blah blah.
Also for my particular workload real estate is very handy. I totally agree with there being some virtue to constraints but several times a day I really need the space.
I think this really depends on the work you do, also. Pure development or content creation and I'm good with just a laptop. For research, with team communication, concurrent terminal sessions, debugging, management - I really do want at least 3 screens.
This. After doing some research, as far as I can tell a 32" 4K maximizes the amount of content you can see at one time within a comfortable viewing angle and without needing scaling to make text readable.
At typical desk monitor distances you shouldn't be able to see distinct pixels anyway.
Agreed. My work monitor is a 32 or 34" ultrawide. It works well but I would really like more vertical real estate. I'm definitely shopping for 32-34" 4k displays right now.
Looks nice also, but I already have a 4k 32" Eizo Flexscan that I'm happy with - I'm after the 4:3 aspect ratio, everything just seems to go wider and wider these days.
Vertical space is the reason I haven't upgraded from dual 1920x1200 (though most of the time I use a single one anyway). Although I'm looking at 1440p, mainly for 120Hz+ (1440p at 30"+).
43" seems rather large--how far away do you sit? If it were as close as a more "normal" sized monitor (~2-3 feet), wouldn't you be craning your neck all day trying to see different parts of the screen?
Nah. I have a 49" curved 1440p monitor. Things you look at less often go to the sides. You can fit 4 reasonably sized windows side by side. The code editor holds over 100 columns at a comfortable font size for my 40-year-old eyes. It's the best monitor setup I have ever had. You can spend less and get the exact same real estate with two 27" 1440p monitors. Either way, it is a fine amount of real estate and not at all cumbersome for all-day use in my case.
I am getting the same Dell 4919DW monitor, transitioning from two 25" Dell monitors. I think the built in KVM will be great addition as I have two workstations. Ordered the new Dell 7750 to pair with a WD19DC docking station. I hope the Intel 630 UHD built in graphics will do, as stated in the knowledge base.
The 4919DW only has a 60Hz refresh rate, but I am not concerned about that. A great alternative would be the curved Samsung 49" C49RG9 at 120Hz.
It’s quite a piece of hardware. It lets me plug a USB hub into it. It supports USB-C for its display adapter. So it’s the dream: one USB-C cable to charge the laptop, drive the display, and provide a USB hub for keyboard, mouse, etc.
I'm in the same boat. More real estate is the big win. I made a pandemic purchase of a TCL 43" 4k TV to use as a monitor primarily for programming. I sit a bit further from it: 30" rather than 24ish when working on the laptop. I drive it with a 2019 inexpensive Acer laptop running Ubuntu 20.04 and xfce. Every so often an update kills X, but I can start it in safe mode and get things working.
I do find my head is on a swivel comparatively, but while noticeable, it's not a negative. Overall I like it. A lot. The only thing that is painful is sharing the desktop over Webex/Skype. That does bog the system down and requires manually inflating font sizes so that viewers on lower-resolution systems can cope.
I am somewhere in between. I don't go for hi-DPI but am using 28" 4K on the desktop and 14" 1080p on my laptops. So identical dot pitch and scaling settings. I just have more display area for more windows, exactly as you say like a 2x2 seamless array of screens.
I actually evolved my office setup from dual 24" 1920x1200 and went to dual 28" 4K. But with the COVID lockdown, I have had only one monitor of the same spec at home for several months, and I realize that I barely miss the second monitor. I was probably only using 1.25 monitors in practice, as the real estate is vast.
People who complain that a monitor is too large should stop opening a single window full-screen and discover what it is like to have a windowing system...
In the same boat here. I use a $400 49 inch curved 4k TV as my monitor along with i3wm and while I waste a lot of the space on screen to perpetually open apps I don't touch, having the ability to look at my todo list or every app I need for a project at the same time has its benefits. I just wish I could lower the height and tilt the TV upwards a bit so I'm not breaking my neck looking at the upper windows.
I had a similar setup at a previous job -- one of the early 39" TVs. It could only drive 4k at 30Hz, but for staring at text, nothing could beat it. It takes a good tiling window manager to get the most out of this setup. By the same token, a good tiling WM also makes a tiny little netbook screen feel much bigger. So I guess what I'm really saying is, use a tiling WM!
43" 4k is approximately 100dpi, like 21" 2k. It seems like a reasonable form factor to me (at 1x), but there aren't many of them that do high refresh rate, and they're all very expensive.
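The arithmetic checks out; assuming "2k" here means 1920x1080, pixels per inch is just pixel count along the diagonal divided by diagonal size:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch: diagonal pixel count over diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 43)))   # 43" 4K    -> 102
print(round(ppi(1920, 1080, 21)))   # 21" 1080p -> 105
print(round(ppi(3840, 2160, 27)))   # 27" 4K    -> 163 (needs scaling)
```

Both of the first two land right around the classic ~100 DPI density, which is why such a panel works at 1x with no scaling.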
I've only seriously tested the Dell P4317Q that I have in the office. Others have had good success with small 4k TVs. Can't say I've noticed anything about the latency, but I've never gamed or watched movies on it, so IDK
I use Samsung 4k TV's (55in and 43in) at work and home and the experience is absolutely fantastic. In game mode the latency is reported to be 11ms and there's no difference visible to me compared to 60hz computer monitors.
Do you have the specific model numbers? I'm buying a new monitor soon, and considering using a 4K TV. Would be great to check out the ones you're using!
If you are on macOS, all is good. Never had a problem with any of my 4 monitors (3x4K, 1x5K). I set the scaling to a size I like, and the text is super crisp. I don't see how any programmer can NOT like that.
How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
In my experience MacOS multi-monitor support is effectively non-existent.
Recently I picked up a 49” ultra-ultra wide monitor (basically 2x27” panels). It is one monitor but MacOS can’t drive it. They just don’t detect that resolution. I switched to a 43” 4k monitor (technically more pixels) and MacOS drives it fine.
My experience with MacOS is not “it just works” unless you are doing something Apple already predicted. That’s fine for me, I just wish they still sold a reasonable monitor themselves so I could be assured it would work properly.
> How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
I finally got sick of this and wrote a Hammerspoon script to deal with it.
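A minimal sketch of the idea (hypothetical, not the commenter's actual config): save window frames keyed by the current set of screens, and re-apply them when the display arrangement changes. The functions used are real Hammerspoon APIs, but the structure is illustrative only.

```lua
-- Hypothetical Hammerspoon config: save/restore window layouts per
-- screen arrangement, so windows return after displays wake up.
local layouts = {}

local function screenKey()
  -- Identify the current display arrangement by the set of screen IDs.
  local ids = {}
  for _, s in ipairs(hs.screen.allScreens()) do
    table.insert(ids, s:id())
  end
  return table.concat(ids, ":")
end

local function saveLayout()
  local frames = {}
  for _, w in ipairs(hs.window.allWindows()) do
    frames[w:id()] = w:frame()
  end
  layouts[screenKey()] = frames
end

local function restoreLayout()
  local frames = layouts[screenKey()] or {}
  for _, w in ipairs(hs.window.allWindows()) do
    local f = frames[w:id()]
    if f then w:setFrame(f) end
  end
end

-- Re-apply the saved layout whenever the display configuration changes,
-- and bind a hotkey to snapshot the current layout.
hs.screen.watcher.new(restoreLayout):start()
hs.hotkey.bind({"cmd", "alt"}, "s", saveLayout)
```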
> every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first
Maybe we can get to the bottom of this. What is your use case?
I ask because as long as I plug them into the same ports it remembers how I arranged them previously (2018 macbook pro 15"). I haven't had to arrange them in over a year... even remembered when updating to latest operating system. Occasionally, I even plug in my LCD TV as a third external monitor and it remembers where that one should go in the arrangement too.
MacOS cannot drive one 5120x1440 display using Intel display hardware. It will happily drive two displays at 2560x1440. The monitor had multiple inputs so by putting it in PBP mode I was able to drive one input as USB-C and another as HDMI through a dock converter. This means the wakeup was not in sync. MacOS would see one monitor, arrange everything on that then realize there was a second one and fail to move anything back in this "new" arrangement.
The fact that it was all one physical monitor may have further confused the OS as a sibling comment mentions.
The solution was to sell the monitor to a Windows-using architect friend and buy a different panel with a resolution MacOS supports. She has a macbook too but it's the fancy one with discrete graphics which can drive 5120x1440.
The value proposition of MacOS to me is that I plug things in and they work. Any fiddling beyond that destroys the benefits of using this platform. I'm willing to iterate on hardware until I find something that works.
I do not have a 2020 MacBook so I cannot test but the Pro Display XDR is not 5120x1440, it is 6016x3384. The problem with my current MacBooks ('14 15" RMBP and '17 13" MBP, both with Intel Iris graphics) is that while they can drive 4k displays they cannot drive the 5120x1440 resolution specifically.
This limitation is specific to the MacOS drivers. Windows in Bootcamp is able to drive 5120x1440 on these devices.
Yeah I read through all those. It’s a work laptop so I’m not comfortable doing things like disabling SIP or mucking around in any system settings. That machine is my livelihood so I don’t mind finding devices that just work.
Ah ok, ya maybe it's related to it being the same monitor.
I have two different monitors that wake up at very different speeds and it's no problem here. My 15" 2013 and 2015 macbook pros had no problem with this either, and I've had 4 different monitors in the mix through those years too. I've transitioned to a CalDigit Thunderbolt 3 dock now and still no problem with it remembering.
So there's definitely something unique about that monitor. That is sad news for me too -- I'm hoping they make a 2x4K ultra wide monitor like that someday. Hopefully they've solved this problem by then.
That might work but it breaks my workflow in another way. Physically the display is a single panel. I organize workspaces by task so changing to a new one needs to change "both" panels because I'm actually using them as one.
>How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
At least for apps that are dedicated to one screen + virtual desktop, right click its icon in the dock and assign it to that display and workspace.
Note that the effectiveness of window restoration also depends on the make/model of your monitors – many manufacturers incorrectly share EDID's across all units of the same model and sometimes across multiple models, making it much more difficult for operating systems to uniquely identify them.
That used to happen occasionally to me as well in earlier macOS versions. Didn't have to do any rearranging since Mojave, I think, definitely not on Catalina.
I use a single 34" 4K monitor on an arm mount with my Mac Mini. The power button on this monitor is one of those touch-sensitive ones on the bottom right that I sometimes accidentally brush past. When I switch it back on, every single window gets resized to some incoherent size and moved to the top left. It's really annoying.
I'm thinking of flipping my monitor upside down so I'll never accidentally brush that area while picking up something on the table.
That's likely a firmware bug in the monitor. It probably reports some ridiculously small resolution during its boot process, and macOS queries it at that moment and rearranges the windows accordingly.
macOS could implement workarounds, of course, but it probably just follows whatever process the display ID protocol prescribes...
What kind of MacBook do you have exactly? Year, size, graphics hardware and OS.
Reports I read stated that while you can select it with SwitchResX it was scaled.
I never tried installing it myself because I’m not a fan of modifying the system on a Mac, especially one I don’t own.
From my poking around I think the horizontal resolution is the problem. The system scans possible resolutions to see what works. Apple just never expected a single display that wide.
There are reports that newer MacBooks with discrete graphics on Catalina can indeed run this resolution. It used to not work regardless of hardware; now, apparently, MacBooks with discrete graphics can run it. Maybe they updated the drivers/system for their new super fancy monitors.
Go to the Displays panel, switch to the "default for display" option, then switch back to "scaled" while holding down the option key. Do you see that resolution in the list of options?
I'd say everything is bad about macOS and external monitors... It can't manage text scaling like Windows can, so you either have to run below native resolution and get everything blurry, or keep the ridiculously high native resolution and have everything tiny :(
Is it not visible for you in the display settings? You DO need all the monitors to have the same DPI, or you'd have a window rendered half in one DPI and half in another when dragging across a display boundary.
No, when it’s an external screen I don’t get any scaling options, only the choice of resolution. I have a 24” QHD, so either it’s ridiculously small 2500xSomething or it’s blurry HD :(
I have this problem as well. I actually run my 27-inch 4K screens downscaled on MacOS because the tiny font at native-4K gives me a headache.
The worst thing about it is that scaling seems to use more CPU than running natively, and the OS has some noticeable additional latency when running scaled.
Odd, my main setup is an external 4K monitor, and I only use it with the “large text” scaling and have no complaints; the text is clear, large, and easy to read. Perhaps you're using your laptop screen as well?
At work I have a mac mini and a Windows box, and I use three crapola Asus monitors between them, and my impression has been that macOS does a better job rendering text on said crapola monitors (the Windows box does a better job at compiling C++ in a timely fashion, though, so I mostly work on that one).
It's just a different stylistic choice. A lot of font nerds prefer the OSX choices because they try to stay true to the original font spacing without regard to the pixel grid.
Missing sub-pixel antialiasing is plain technical deficiency, not a stylistic choice. I agree arguments can be had about hinting and aligning the glyphs to the pixel grid, but not much beyond that.
Yeah, they didn't completely remove it, but they did a good job of hiding it by not making it an option you can turn on in the GUI. You have to use a terminal command to enable it:
https://apple.stackexchange.com/a/337871
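From memory, the linked answer comes down to a single `defaults` write; treat the exact key as an assumption and verify it against the answer before running:

```shell
# Re-enable font smoothing on macOS Mojave and later (key name
# recalled from memory; confirm against the linked answer).
# Log out and back in for it to take effect.
defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO
```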
In general, with a HiDPI screen I don't find any need for it. But on a low-res display like the typical 24" 1080p models, it certainly helps.
Completely agree! I went from a mediocre pair of 1440p monitors to a high-quality pair of 4Ks, then back to a pair of equally high-quality 1440p monitors.
I would also add that when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters, dongles, and docking stations all have to match up for everything to work at a proper 60 fps, and it gets worse if you have two external displays.
As for my home setup, I also stayed at 25" 1440p. A nice balance for work, hobbies, and occasional gaming without breaking the bank on a top-tier GPU.
>I would also add that when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters, dongles, and docking stations all have to match up for everything to work at a proper 60 fps, and it gets worse if you have two external displays.
I agree it's a bit of a mess, but USB-C monitors solve all those issues. I just plug my MacBook in with USB-C, and instantly my 4K (60 Hz) display is connected, along with external sound and any USB peripherals. No fussing with a million different cables and adapters. It's the docking workstation setup I've dreamed of for a decade.
It doesn’t solve all of the issues. The USB-C port can support DisplayPort 1.2 or 1.4 bandwidths and you have to make sure it matches up for some high-resolution monitors to work.
Why not both? If I'm on linux, with no interest in changing and perfectly happy with my display, and 4k doesn't work easily on my system, why would I be interested in a 4k screen?
Strange. I'm not seeing any issues with Linux and 4K. I'm running plain Debian 10 with Openbox on 4x 4K monitors (3x 28" in a row, and one 11" 4K under the right-most one). Granted, I normally have just one web browser that follows me around across workspaces, pinned to the right monitor; a mostly maximized Sublime on the middle monitor; and a pile of alacritty/xterm windows on the left-most monitor. The small monitor, whose content also follows me around, holds the clipboard, clocks, Slack, and monitoring.
What is the software that people are using that creates problems?
So far I've never had an issue with KDE Plasma and 4K@60Hz on Linux, once I realized that you can't just use any old HDMI cable: you need DisplayPort or HDMI 2.0.
FWIW, switching between resolutions in my favorite desktop environment, Xfce, is two steps:
# This affects every GTK app.
xfconf-query -c xsettings -p /Xft/DPI -s 144
The second step is going to about:config in Firefox, and setting layout.css.devPixelsPerPx to a higher value than 1.0. I really need to write an extension to do that in one click.
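Until that extension exists, one stopgap is putting the pref in the profile's user.js so it persists across restarts; the profile directory name below is a placeholder, find yours under ~/.mozilla/firefox/:

```shell
# Persist the HiDPI pref in a Firefox profile's user.js
# ("xxxxxxxx.default" is a placeholder profile directory name)
echo 'user_pref("layout.css.devPixelsPerPx", "1.5");' \
  >> ~/.mozilla/firefox/xxxxxxxx.default/user.js
```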
What is really tricky, though, is having two monitors with different DPIs. Windows 10 does an acceptable job with it; no Linux tools I'm aware of handle it reasonably well. Some xrandr incantations can offer partial solutions.
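One such partial xrandr incantation, for the record: render the low-DPI monitor at a virtual 2x size and let the server downscale it, so both displays share one logical DPI. The output names here are assumptions; check yours with `xrandr -q`:

```shell
# HiDPI laptop panel at native resolution; 1080p external rendered
# at 2x2 scale so windows keep roughly the same logical size on both
xrandr --output eDP-1 --auto \
       --output HDMI-1 --auto --scale 2x2 --right-of eDP-1
```

The downscaled output ends up a bit blurry; it trades sharpness for consistency.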
Even Windows 10 struggles when you move windows between different DPI domains. Apps will slide in huge or tiny until you get past the midway point. And when the system goes to sleep, everything can go to hell: you can come back to small message windows blown up to huge sizes, or windows crushed down to a tiny square. You can forget about laying out your icons perfectly on your desktop too; they'll get rearranged all the time. It's even more fun when you remote into a high-DPI display from a low-DPI display. It actually works pretty well, but stuff will get shrunk or blown up randomly when you go back to the high-DPI display.
Having the OS maintain a consistent UI size seems logically necessary, until you actually try going without it.
I'm running a couple of medium-density monitors alongside one of the highest-density ones available. I don't scale the HiDPI monitor at all, which means windows are tiny when I drag them to it. Instead it serves two roles: as a status screen for activity monitors and the like, and as a text/document editing screen. That is, putting Acrobat, Firefox, or Sublime/Emacs/etc. on the high-DPI screen and then zooming in gives all the font-smoothing advantages of high DPI without needing OS support.
So the TL;DR is: turn off DPI scaling and treat the HiDPI screen as a dedicated text editor with the font sizes bumped to a comfortable level. Bonus: the extra effort of clicking through menus will encourage learning keyboard shortcuts.
I don't think the reasons you illustrated support that conclusion. You don't actively dislike the extra pixel density of a 4K display. You seem to only dislike the compatibility issues relevant to your use case.
>No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
FWIW, I can't recall the last time I had a problem with apps not rendering correctly in HiDPI mode on macOS. Unless you've got a very specific legacy app that you rely on regularly, it's a non-issue.
>Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming
Ah, I think I found the real issue ;-) If your linux desktop rendered 4K beautifully, seamlessly, and without any scaling issues right out of the box, I could all but guarantee that your opinion would be different.
You know, I was in complete agreement with the article and was considering a 1440p monitor or something until I saw your comment and reflected on it. The most productive periods/jobs I can remember were on i3wm with a Goodwill 900p/19" monitor, and on a 20" iMac that was 10 years old at the time. But that's because I had access to good tools then, Neovim and Atom respectively. My work now requires an RDP Notepad.exe session, so there's no monitor that will help me there. I guess software tools are way more important.
I have a 49" curved monitor. It is effectively two 27" 1440p monitors stapled together (5120x1440). It is the best monitor I have ever had. 1440p has a very decent [higher than typical] pixel density but is not "retina". Fonts look pretty smooth, but you can still see pixels if you try really hard. Overall, I do think high density screens look amazing, but the software has not quite caught up to them. The benefits are on the softer side, and if I could just have magical mega-high-DPI displays with no side effects, sure why not? As it stands, 49" curved monitor is pretty fine. It fits four windows side by side at reasonable resolutions.
Primary apps go in the middle, such as code editor, etc.. Tertiary windows, such as documentation go on the outer edges. Still quite usable, but a little out of the way for extended reading.
Hey, do you mind sharing more info on how to get that monitor? I'm looking to invest in a curved one, since it's an experience I've never had. Also, are there retina models out there, or is it not worth it in your view?
I couldn’t agree more with this. A 49” 5120x1440 curved monitor is brilliant for productivity. It’s better than two or three separate monitors. I do miss high DPI but I wouldn’t trade this type of monitor for the current batch of smaller high DPI ones.
There are only a few things that would make this better: a high-DPI variant, more vertical space, and a higher refresh rate. Given those three, I think that's the endgame for monitors (in a productivity context).
(I think that’s 8x the bandwidth so it’s a while away!)
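The 8x figure checks out as back-of-the-envelope arithmetic: a 2x-DPI variant doubles each dimension (4x the pixels), and doubling the refresh rate doubles that again.

```shell
# Bandwidth ratio vs today's 5120x1440 @ 60 Hz (ignoring blanking
# intervals and encoding overhead)
current=$((5120 * 1440 * 60))
upgraded=$((10240 * 2880 * 120))
echo $((upgraded / current))   # prints 8
```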
I have a 4K 27" monitor, and I've had to run it at 1440p recently because that's all my Dell XPS can manage. It's usable, but a noticeable downgrade. My 24" 1440p monitor at work looks perfectly fine, though.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could
What issues, exactly? Xmonad-on-Arch user here, and I find the sweet spot for me is 27-32" 4K, and 1440p on a laptop (I guess 4K would be nice there too, but I'm not sure it justifies the increased power draw). After getting used to the increased real estate, I do feel limited on my older 1080p laptop screen; fonts smaller than 8pt (8pt itself is still fine) noticeably impact readability to the point that I can feel my eyes straining faster. It did take a bit of playing around with DPI settings to get it right, though; out of the box it's not great. The Arch wiki has some great material.
The only frustration I do have (which IS super-frustrating, specifically for web browsing) is with multi-monitor setups with different pixel density - your point 2, I guess. Even plugging anything larger than 19" with 1080p into my 1080p Thinkpad is annoying.
I think it should be possible to configure it correctly, but I just gave up and end up zooming in and out whenever I do this and sending windows between screens. I haven't looked into it, but maybe a mature DE like KDE or GNOME (which, if you don't know, you can still use with i3) can take care of this.
Also, this is all on X11, have no idea if and how wayland differs.
My day job involves untangling SQL that was written under fire. I consume more spaghetti than a pre-covid Olive Garden. Every vertical pixel is precious for grokking what some sub query is doing in the context of the full statement.
I used to when I ran dual 24" monitors! We had really sweet old IBM monitor stands that could tilt, raise and rotate. You had to be quick to grab them from the copy room before the electronics pickup. I swiped one from the still warm desk of a colleague on the way to their farewell happy hour.
So I had a '14 13"(? Might have been 15" but probably not) RMBP on the left with email and chat stuff, 24" main monitor in landscape and then another 24" monitor in portrait on the right. I think at some point I put a newer panel on the IBM mount. It was sweet.
Back then we still had desktop PCs at our actual desks so there was a KVM on the main monitor! What a time to be alive!
These days I prefer a single-monitor workflow if possible. It's just cleaner and more convenient.
Being used to MacBooks with retina screens, 4K at 30” is perfect to me as “retina”. Anything larger needs to be 5K or 6K. 1440p is passable on <24”.
27" x 1440p has been my go-to for a while now. Works well without scaling between win/mac/linux, does not dominate the desk completely, high quality monitors are readily available in this resolution etc etc.
I think my dream setup would be a 27" 1440p (what I currently have at home) with a pair of smaller (19", maybe?) 1080p screens on either side, set up in portrait. Basically a similar screen area to 2x27", but without a bezel right in the center of my field of view, and the 1080x1920 screens would be a good size for displaying a full page (e.g. a PDF) at more or less full screen.
I actually like having multiple screens. I know I'm weird in this, but I like running certain apps maximized, and I wouldn't want them maximized across a whole ultrawide.
Plus, if I take a working vacation somewhere it's a lot more practical to schlep around one 24" or 27" than an ultrawide.
Also, just as an ergonomic thing, I could angle in the two outer screens a bit while not having one of those icky curved screens.
>My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density.
I'm working on an old 24"16/10 display (the venerable ProLite B2403WS) and an OK 32" 4K display with a VA panel. Both are properly calibrated.
There is no amount of tinkering that can make fonts on the 24" look good. It looks like dog shit in comparison to the 4K screen. It might not be obvious when all you got in front of your eyes is the 24" display, but it's blatant side to side.
On top of that, the real-life vertical real estate of the 4K display is also quite a bit larger.
I've never been a big 16/9 fan, but frankly, at the sizes monitors come in today and at current market prices, I don't see a reason not to pick up a few of these for development.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
I haven't had this experience (MacOS, 4K monitor for 2.5 years)
> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
shrug - 4K and 1080p seem to work together just fine for me. I've currently got a 27" 4K monitor and a 24" 1080p monitor both running off my 2015 13" MacBook Pro; the 4K is on DisplayPort (60Hz @ 3840x2160) and the 1080p is on HDMI (and it happens to be in portrait mode). I use all three screens (including the laptop's), and while the 1080p is noticeably crappier than the other two, it's still usable, and the combination of all three together works well for me. A couple of extra tools (e.g. BetterTouchTool) really help with throwing things between monitors, resizing them to take up some particular chunk of the screen, etc. - my setup's quite keyboard-heavy with emphasis on making full use of the space inspired by years of running i3 (and before that xmonad, ratpoison and others) on linux and freebsd.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)
> > 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
> That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)
It is also wrong. I am a long time i3 user. Never had a problem with it, never done anything special. Most of the time I'm running Debian stable, so I even use software versions that most people consider 'old'.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS that can mean it renders tiny, or that the whole things is super ugly and pixelated (WAY worse than on a native 1080p display)
Never happened to me in 4 years, see below. That said, I barely use any graphical programs besides kitty, firefox, thunderbird and spotify.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could
This is just not true. I have used the same 32" 4k monitor for 4 years running NixOS with bspwm (a tiling window manager, which does even less than i3) on 3 different laptops - thinkpad x230 (at 30 Hz), x260 and x395 and it all worked completely fine.
It depends on a very simple tool I wrote because I was sick of `xrandr`: https://github.com/rvolosatovs/gorandr , but `xrandr` could easily be used as an alternative.
Recently I switched to Sway on Wayland and it could not be smoother - everything just works with no scripting, including hot-plug.
> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.
Indeed, screen size is way more important than resolution. In fact, even 4K at 27" seemed too small for me when I had to use it in the office: I would either have to deal with super small font sizes and strain my eyes, or sacrifice screen space by zooming in.
> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
I have been running two 1440p displays off a 4K retina MBP, and the experience has been impressively seamless, both with Catalina (the latest) and High Sierra.
The biggest problem is with Apple port splitters; they are crap, and sometimes monitors wake from sleep with a garbled picture.
So I'm a little nuts in that I run 2 x 27" 4K monitors side by side with no scaling. 27" is about the smallest I can tolerate 1:1 pixel sizes.
Since aging has forced me into wearing reading glasses, I wear single vision computer glasses that are optimized for the distance range of my monitors' closest and furthest points.
Because I don't have scaling enabled, I don't get any of the HiDPI issues that I've had on my laptops with Windows.
I have found that I am still wanting for even more screen real estate, and for a time I had a pair of ultrawide 23" monitors underneath my main monitors, but it created more problems than it solved and I recently went back to only two monitors.
That's an interesting idea. I should look into that when I eventually upgrade. A stubborn part of me left the monitors in landscape because I occasionally play games, but I end up never doing that on my desktop.
I prefer to use the same model of monitor when doing a grid, so I don't think I want to add to my existing setup: my monitors are discontinued, and they're DisplayPort-only for 4K60.
I think they have more than a few years of life left in them, but I'll definitely look into a configuration like yours at upgrade time.
> Good luck ever having a decent experience plugging in a 1080p monitor.
A 4k monitor is now $300 (new). Used are even cheaper.
> Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow
I use Gnome 3.36 and many HiDPI issues I was having before are now gone, without any extra configuration.
> My argument in favor of 1080p is that I find text to just be... completely readable.
It is readable but fonts are pixelated, unlike 4k.
My only problem is that macOS has some artificial limitations when it comes to using non-Apple monitors. Like a lower refresh rate. My solution? Use Linux.
$300 is a lot of money where I'm from. And they aren't available at that price here, anyway.
What do you mean, fonts are pixelated at 1080p? Whether you can see the pixels probably depends on pixel size. I certainly can't see them on my 23" LG monitor unless I try really hard.
> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
I am using Emacs and tmux under Linux, in the GNOME desktop. GNOME has HiDPI scaling. For me, that works fine with a 4K 43-inch display. The thing you need to watch out for is getting a graphics card with proper open source (FOSS) driver support. Some cards are crap and don't come with FOSS drivers. You can get them to run, but it's a PITA on every kernel update. Don't do that to yourself; get a decent card.
I miss i3 so much, but I've succumbed to laziness and have been using my various MacBooks. Agreed that it's a huge productivity gain, more so than any font improvements.
Another linux+i3 user here, I've not tried 4k yet but you confirmed my suspicions.
I did a lot of research before buying an XPS 13 and went with the 1080p version for basically all the reasons you just stated, plus poor battery life and video performance.
I have hope for the future, though... what would really make the transition easier is a way to automatically upscale incompatible programs. Even if it means nearest-neighbor scaling, at least it would make them usable on super high-DPI monitors.
I have a 4K monitor on my laptop and also as an external monitor, but I have no problems on Linux (Debian testing with GNOME 3). I can easily combine it with 1080p monitors. Everything works out of the box.
I still switched to 1440p, as 4K is just better-looking 1080p: you cannot fit more information on the screen with scaling, and without scaling everything is too small. I work as a backend developer, so space is more important to me than visual quality.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could
Happy i3 Arch Linux 4K-monitor user here for over 2 years. I only had to set an appropriate Xft.dpi for my monitor size/resolution in ~/.Xresources once, and that was it.
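For anyone wondering, that one-time step is roughly the following; 192 is an example value for a 27" 4K panel at an effective 2x:

```shell
# Tell Xft-aware apps (most GTK/Qt software) to render at 2x DPI
echo "Xft.dpi: 192" >> ~/.Xresources
xrdb -merge ~/.Xresources   # reload now; also applied at X startup
```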
I'm not going to argue your preferences, but then why don't you get one 50 inch 4k display? That's about four of your current displays at similar density on a single cable. And probably at a similar price point, too.
Or, if you are using decent graphics hardware, you could even get two of them and have four times more display space than you have now.
> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
It might depend on the program, too. Some might only work in pixels. Fortunately, that is usually not a problem if you are trying to run a program designed for the Game Boy; the emulator should be able to scale it automatically, subject to the user's settings. I don't know if any X server has a setting to magnify mouse cursor shapes, but it seems like it should be possible to implement in the X server. Also, it seems SDL 1.x has no environment variable to magnify the image.

My own program Free Hero Mesh (which I have not worked on in a while, because I am working on other stuff) allows icons of any square size up to 255x255 pixels; a puzzle set may contain multiple sizes, and it will try to find the best size based on the user setting, using integer scaling to grow them if necessary. Currently, though, it is limited to a single built-in 8x8 font for text. If someone ports it to a library that does allow zooming, that might help too. However, it is not really designed for high-DPI displays, and it probably won't change unless someone with a high-DPI display wants to use it and modifies the program to support a user option for scaling text as well (and possibly scaling icons beyond 255x255), in which case I might merge their changes.
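The size-selection logic described above boils down to a small integer-scaling calculation. A hypothetical sketch (not the actual Free Hero Mesh code; the helper name is made up):

```shell
# pick_scale BASE TARGET: largest integer factor k with BASE*k <= TARGET,
# clamped to at least 1 so icons are never shrunk below native size
pick_scale() {
  local base=$1 target=$2
  local k=$(( target / base ))
  [ "$k" -lt 1 ] && k=1
  echo "$k"
}

pick_scale 8 20   # an 8x8 icon with a 20px budget scales 2x -> prints 2
```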
Still, I don't need 4K. The size I have is fine, but unfortunately too many things use big text; some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.
> some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.
Not that it excuses bad UX, but you might consider keeping your browser window at something below full width. I find this more comfortable anyway.
Total aside: I've noticed Windows and Linux users tend to keep their windows fully maximized, whereas Mac users don't. Doesn't apply to everyone of course, but enough to be noticed. This was true even before Apple changed the behavior of the green Zoom button, and I've always wondered why.
I find Spaces/Expose/Mission Control (or whatever they call it these days) way more comfortable than dealing with Windows. I especially like that if I hide a window, it doesn't pop up when using Mission Control. Opening the Windows equivalent shows me every window, even stuff I minimized/hid. It feels cluttered.
I don't see how alt-tabbing through maximized windows on macOS is different from Windows and Linux like the OP is suggesting. Though I do keep my browser at half-width on my ultrawide monitor because it's somewhat of an exotic/untested aspect ratio for websites.
Also any power user that cares will use a tool like Divvy on macOS for arranging windows with hotkeys.
> I've noticed Windows and Linux users tend to keep their windows fully maximized
Interesting, I've noticed the exact opposite. Mac devs, especially younger ones, tend to have full-screen IDEs and browsers and constantly flick back and forth between apps. My theory was always that Windows and Linux users had gotten comfortable with the desktop metaphor while a large percentage of newer Mac users grew up using iPads which were all full-screen, all the time.
Quick note I perhaps should have clarified, I wasn't thinking about the Mac's "full screen mode". This was something I noticed about other students in my high school a decade ago (why it's coming to mind now, I have no idea), before full screen mode existed on Mac.
It used to be that if you clicked the green button on Mac, most apps (not all apps, for weird aqua-UI reasons, but certainly web browsers) would grow to fill the screen without outright hiding the menu bar and dock, just like the maximize button on Windows.
My experience pre-full screen on macs was that the green button would do just about any random thing except make the window fill the screen. It would certainly change, usually it would fill vertically (but not always) but almost never horizontally.
To this day I still rarely press that button because of years of it doing nothing but unpredictable nonsense.
Technically, that's true, but "Wayland is inherently better at security/DPI scaling/other" is one of those cultural myths that eventually come true because of the people who believe in it. It would be possible to add these improvements to the X server, but no one wants to maintain or improve the X server anymore. All the developer effort is behind Wayland. So to get those benefits, you have to use Wayland.
I'm on GNOME and use fractional scaling. At 2x everything got too big, but 1.6 looks OK. It's actually not at the app layer; it's the screen that is scaled up. Although some low-level programs can have issues with the mouse pointer position if they don't take the scaling into account.
i3 user on a 4K screen here; it has worked fine for me since 2014 (with the exception of the random super old Tcl/Tk app): https://i.imgur.com/b8jVooO.png
Since nobody else has mentioned it, if you like i3 you should give Sway a test drive. Wayland still has some rough edges (screen sharing, for example) but it supports high DPI monitors with fractional scaling almost out-of-the-box.
> I genuinely think 4k provides no real benefit to me
Could you clarify when in your life you've had a 4k screen with good OS support, so that we know what experience you are speaking from when you say that?
One can definitely still see the pixels in a 4K 24'' monitor. That is not the point.
But I do agree with points 1 and 2 (they tend to work better on windows, though).
On the other hand, what about point 3? I would find it ridiculous if it took you more than 5 seconds to enlarge the DPI (without multi-monitor) even on the weirdest of X11 WMs. X11 is designed for this...
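To put a number on that "5 seconds" claim, the whole step on a bare X11 WM can be as little as two commands (144, i.e. 1.5x the default 96, is an example value):

```shell
# Xft-aware clients started after this pick up the new DPI
echo "Xft.dpi: 144" | xrdb -merge -
# Also hint the server-reported DPI for new clients
xrandr --dpi 144
```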
I agree with all your points; however, I've found the Mac is extremely variable-DPI friendly. I think games with custom UIs (Europa Universalis IV comes to mind) are the only things that haven't adapted, and it's hardly a problem if you set the scaling to “large text” or whatever, just a little pixelated like you would see on a 1080p screen.