HiDPI on Linux is pretty much nonexistent. I'm speaking from experience: I have both a decent 4K external monitor and a HiDPI IPS internal display. The only setup that is OK, in the sense of least horrible, is Gnome Mutter on Wayland with experimental per-monitor fractional scaling enabled. And even that is only OK if you can restrict yourself to the few ported, pure-GTK3 Gnome apps. Which is not that much. No Firefox. No Chrome. No Thunderbird.
And the pace of HiDPI fixes on Linux has actually slowed down.
Frankly, and I am a hard-core Linux OSS fan - come back in two years and check again.
Huh? I've been running a 2x/192dpi Linux/X11 laptop for a couple of years, and the experience has been near-perfect. Hands down better than on Windows. Some DEs even detect the panel DPI and configure toolkit scaling automatically. Firefox and Chrome have been two of the most well-behaved apps, especially since they switched to GTK3; but it's not just GTK3 that is well-behaved, Qt5 is as well. Since most apps I use are Qt5 and some are GTK3, they all get crisp HiDPI rendering automatically from the toolkit.
So I'm not sure what you mean by "HiDPI on Linux is pretty much nonexistent." Maybe it gets worse if you need fractional scaling or displays with different scaling ratios, but those seem to be things all platforms are struggling with. Microsoft only made built-in apps like File Explorer per-monitor-DPI aware in Windows 10 1703 (released a year ago), and macOS doesn't attempt fractional scaling (e.g. 1.5x, 2.5x, etc.) at all.
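To be concrete about the toolkit-scaling part: here is a minimal sketch of what a DE sets up for you, assuming a plain 2x setup. GDK_SCALE and QT_AUTO_SCREEN_SCALE_FACTOR are the real GTK3/Qt5 environment variables, but the 2x factor and the little launcher script itself are just an illustration, not any particular DE's actual mechanism.

    # hidpi-launch.py - hypothetical wrapper: run an app with HiDPI toolkit
    # scaling forced through environment variables. Usage: hidpi-launch.py firefox
    import os
    import sys

    env = dict(os.environ)
    env["GDK_SCALE"] = "2"                      # GTK3: scale the whole UI 2x
    env["QT_AUTO_SCREEN_SCALE_FACTOR"] = "1"    # Qt >= 5.6: scale from the screen's DPI
    os.execvpe(sys.argv[1], sys.argv[1:], env)  # replace this process with the app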
So, what exactly is it that you expect as "HiDPI" support which is missing? As far as I can recall, it has been possible to configure the actual physical size/DPI for a monitor for many, many years. Any properly functioning X application should then be able to draw things at the correct size on the screen, e.g. fonts sized in points or images and figures scaled to actual widths. I think this infrastructure may even pre-date the switch from XFree86 to Xorg. Is your complaint that there are still some applications which ignore this monitor DPI metadata or which use other pixel-based techniques?
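To make that concrete, here is a rough sketch (mine, not anything standard) that reads the physical size X reports for each connected output from "xrandr --query" and derives the DPI an application could use to draw things at the right size. The exact xrandr line format can vary by driver, so take the regex as an assumption.

    # Rough sketch: derive per-output DPI from the physical size X/EDID reports.
    import re
    import subprocess

    PATTERN = re.compile(
        r"^(?P<name>\S+) connected(?: primary)? (?P<w>\d+)x(?P<h>\d+)"
        r"\+\d+\+\d+ \([^)]*\) (?P<mm_w>\d+)mm x (?P<mm_h>\d+)mm")

    out = subprocess.run(["xrandr", "--query"], capture_output=True, text=True)
    for line in out.stdout.splitlines():
        m = PATTERN.match(line)
        if m:
            dpi = int(m["w"]) / (int(m["mm_w"]) / 25.4)   # horizontal pixels per inch
            print(f"{m['name']}: {m['w']}x{m['h']}, ~{dpi:.0f} DPI")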
Like the previous poster, I stick to 1920x1080 on my 14-inch class Thinkpad. In my office I have dual 28-inch 4K monitors. These have identical dot pitch to my Thinkpad, so each monitor is like having a 2x2 array of my Thinkpad screens. When I made sure the monitors were set with the correct DPI, everything worked exactly as I would expect. Whatever rendered as one pixel on my laptop would also be one pixel on my workstation, and I just had 8 times more real estate on my dual monitor desktop. But, I sit further from the monitors than I do my laptop screen while also using them for much longer stretches of time. So, I adjusted the workstation to pretend it had higher DPI so that things would render a little bigger.
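(The "identical dot pitch" bit checks out: a 14" 1920x1080 panel and a 28" 3840x2160 panel, both 16:9, come out to the same ~157 DPI.)

    # Both panels work out to ~157 DPI, so pixel-for-pixel rendering matches.
    from math import hypot

    def dpi(h_px, v_px, diagonal_inches):
        return hypot(h_px, v_px) / diagonal_inches

    print(round(dpi(1920, 1080, 14)))  # 157 (the 14" Thinkpad panel)
    print(round(dpi(3840, 2160, 28)))  # 157 (each 28" 4K monitor)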
It's been a few years, but I think I may have had to separately adjust Firefox because it has some of its own weird assumptions about fonts and DPI that I assume come from its renderer straddling several different platforms. I also had to adjust emacs and xterm to change from my decades-old fixed-font preferences to start using scalable fonts.
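If anyone needs the same Firefox tweak today: the layout.css.devPixelsPerPx pref in about:config forces the device-pixel ratio the browser uses. A sketch of setting it via a profile's user.js follows; the profile directory name is only a placeholder, and setting the pref by hand in about:config works just as well.

    # Force Firefox's device-pixel ratio to 2.0 via user.js; the profile
    # directory name below is a placeholder, not a real path.
    from pathlib import Path

    profile = Path.home() / ".mozilla/firefox" / "xxxxxxxx.default"  # placeholder
    with open(profile / "user.js", "a") as f:
        f.write('user_pref("layout.css.devPixelsPerPx", "2.0");\n')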
Mix HiDPI screens and non-HiDPI ones, and lots of very real corner cases that you hit daily make the experience terrible, in ways that can only be "fixed" through terrible hacks and tweaks, if at all.
OK, so I guess the problem comes from trying to fuse everything into one screen? In the old days, we would run X with separate screen numbers to get multiple outputs via multiple graphics cards. You could slide your mouse between screens, but you couldn't drag a window across or have a window spanning the two screens. Each window remained confined to one screen, and only a few apps knew how to open windows on more than one screen from the same app instance. (A few, like emacs, even understand opening windows on multiple displays!)
Out of curiosity, what is a non-broken behavior supposed to be if the screens are combined? Do people expect the low-DPI screen to be a blurry version of whatever would appear on the high-DPI screen, or do they expect it to act like a magnifier, perhaps with pan/zoom controls?
> In the old days, we would run X with separate screen numbers to get multiple outputs via multiple graphics cards
Those days are gone. No matter how hard you try - either you are simply not running on current hardware, you have very modest expectations, or you are closing your eyes to the fact that the Linux desktop in this regard is at least two years behind, minimum!
I am not arguing, I am genuinely wondering what you guys want. I have read multiple assertions that Linux is broken but no clear explanation of what it should be doing differently.
For reference, I have been using Linux continuously on all sorts of hardware since 1994. What I lack is any practical experience using modern Windows or Mac OS, so I have no idea what implicit expectations you may be bringing from those. The last time I ran Windows directly on hardware was before Windows 95 was released, and similarly my only real Mac experience was on monochrome classic Macs before OS X existed.
Over the years, I have used just about every sort of display hardware with Linux, ranging from serial terminals, Hercules monochrome graphics, 800x600 through 1600x1200 CRTs, the first wave of DVI-based LCDs, various HDTVs, and the first DLP projectors to my current dual 4K monitors. I was also involved in the early testing and deployment of 2D and 3D accelerators on Linux, as well as things like clusters driving arrays of projectors. We even had one of those IBM "Big Bertha" displays in our lab at one point, which was one of the first 300 DPI LCD monitors available. Just about the only things I haven't used with Linux are head-mounted displays and stereo glasses. My last involvement with VR was 20 years ago, when SGI Onyx-based CAVE systems were prevalent in academia, combining head-tracking, active shutters, and multiple wall projectors.
But, to be honest, I have no use case to combine different DPI monitors into a single graphical screen or desktop. If I connect a laptop to a projector or display panel for presentations, I tend to just want to duplicate the presentation view on the internal screen. Otherwise, I use the laptop to be mobile and I use workstations with their dedicated displays.
I did. I recently switched to Gnome after more than 15(!) years on KDE, because KDE on Wayland is broken beyond usability. There is a reason that even Neon, which I used back then, doesn't ship with Wayland out of the box.
It's Intel, and it's not the graphics themselves but the ecosystem. Wanted to enter diacritics into Emacs? The quick-search bar popped up instead of an accent aigu in Emacs (compose key). It's that sort of thing, not the display itself being bad. The end result was nevertheless unusable, though.
HiDPI on Linux is only broken if you need fractional scaling, or different per-monitor scaling factors. If you stick to screens that work well at 200% (e.g. 4K on 24"), then current desktops will work just fine in my experience.
That sounds like a draconian restriction if you are coming from Windows 10, but OP is switching from macOS. Macs have traditionally used either 100% or 200% as well (except for the latest-gen MBP).
The Linux desktop is doing fine. But I prefer to avoid GTK when possible; it's years behind Qt. Some things like Firefox are necessary, though, and it's a pity it's stuck with GTK.
> And the pace of HiDPI fixes on Linux has actually slowed down.
> Frankly, and I am a hard-core Linux OSS fan - come back in two years and check again.
The Linux desktop IS dead.