That’s because macOS got rid of subpixel antialiasing sometime after launching Retina screens, which leaves non-hiDPI screens with quite awful font rendering.
I sometimes switch to a bitmap font like Fixedsys Excelsior or GNU Unifont when using macOS with a low-resolution monitor to compensate (with antialiasing off so the bitmap font looks crisp).
Also, JetBrains Mono somehow looks good on low-res screens even though it’s not a bitmap font; it seems to blur less than other fonts when it gets antialiased.
Subpixel antialiasing is going to be problematic anyway on newer displays that don't necessarily feature a plain RGBRGB (or similar) subpixel arrangement. For example, many OLED screens use RGBG/BGRG or even more complex "PenTile" subpixels.
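For the unfamiliar, the core idea is simple enough to show in a toy sketch. Here's a hypothetical Python illustration (not any real renderer's code), assuming the glyph has already been rasterised at 3x horizontal resolution; real implementations also run a low-pass filter across neighbouring subpixels to tame colour fringing:

    import numpy as np

    def subpixel_render(coverage3x, layout="rgb"):
        # coverage3x: (h, 3*w) float array of glyph coverage in [0, 1],
        # rasterised at 3x horizontal resolution.
        h, w3 = coverage3x.shape
        w = w3 // 3
        # Each horizontal triple of samples lands on one pixel's subpixels.
        triples = coverage3x[:, : w * 3].reshape(h, w, 3)
        if layout == "bgr":  # leftmost subpixel is blue on this panel
            triples = triples[:, :, ::-1]
        # PenTile-style layouts (RGBG/BGRG) would need a different
        # sample-to-subpixel mapping entirely, which is exactly why the
        # renderer has to know the panel's layout.
        return 1.0 - triples  # black text on white: coverage dims subpixels

The whole point of the toy is the layout parameter: the same coverage samples get routed to different colour channels depending on the physical order of the subpixels.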
> Subpixel antialiasing is going to be problematic anyway on newer displays that don't necessarily feature a plain RGBRGB (or similar) subpixel arrangement.
This would then mean making the subpixel anti-aliasing algorithm aware of different subpixel layouts. And that ought to be done anyway, because most anti-aliasing is already at least somewhat hardware-aware. In my opinion, regardless of how subpixels are laid out, more resolution is always better.
But this is an issue that applies to VA panels as well (cheaper than IPS, worse viewing angles, but better contrast ratio), and I have a 27" 4K VA screen that works just fine with subpixel rendering turned on in Linux: text is so much clearer with it on than off. Attaching a MacBook to a 27" or 32" 4K IPS screen makes me hate macOS for killing subpixel rendering off.
As for "retina" resolutions, I tried 24" at 4K as soon as it came out (with that Dell monitor that required two DP 1.1 connections for 60Hz, IIRC), and turning subpixel rendering off made text and lines jagged. That was ~190 ppi at a normal viewing distance, with vision corrected to better than 20/20 (which is what I usually have: I can't really work without glasses anyway, and weaker correction leaves me with headaches). For the record, 5K at 27" and 6K at 32" are both roughly ~216 ppi, so not much better than ~190 ppi: subpixel rendering probably achieves a 2x increase in text clarity for those not sensitive to colour fringing (I am not).
So, subpixel rendering is really a win on all of these displays, but Apple will happily tell you where the limit of your vision lies and upsell you on their monitors.
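For anyone who wants to sanity-check the ppi figures in this thread, they all fall out of the diagonal pixel count divided by the diagonal size in inches. A quick Python check (6016x3384 assumed for the 6K panel):

    from math import hypot

    def ppi(width_px, height_px, diagonal_inches):
        # pixels along the diagonal / inches along the diagonal
        return hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(3840, 2160, 24)))  # 4K at 24": ~184 (the "~190" above)
    print(round(ppi(5120, 2880, 27)))  # 5K at 27": ~218
    print(round(ppi(6016, 3384, 32)))  # 6K at 32": ~216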
Fontconfig on Linux has an option to set the subpixel layout, though currently only rgb, bgr, vrgb and vbgr are supported. Maybe this could be extended for OLED monitors.
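For reference, that option is the rgba property. A minimal per-user override (a sketch, assuming a plain horizontal RGB panel) would go in ~/.config/fontconfig/fonts.conf:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- Declare the panel's subpixel order; the accepted constants
           are rgb, bgr, vrgb, vbgr and none. -->
      <match target="font">
        <edit name="rgba" mode="assign">
          <const>rgb</const>
        </edit>
      </match>
    </fontconfig>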
The “sometime” happened in macOS Big Sur. Prior to that, in Mojave and Catalina, you could enable it again by twiddling a hidden preference with the word “legacy” in it. Somehow it was worse than what you got in High Sierra and earlier anyway.
Subpixel antialiasing is kind of overrated anyway. On screens that are low enough DPI to benefit from it, it can cause color fringing (especially for people with astigmatism) that is worse than the blur from grayscale antialiasing.
I disagree: I am not susceptible to colour fringing, and I can still tell whether subpixel rendering is on or off on a 24" 4K screen (~190 ppi) at a regular or even further viewing distance (~70cm/27"). In fact, I specifically got that display hoping its pixel density would let me leave subpixel rendering off.
Haven't tried Apple's big "retina" screens, but considering they are ~215 ppi, I'm pretty confident a 10% increase in PPI wouldn't make the difference that subpixel rendering does. Laptop screens have higher resolution, but I haven't really paid attention to whether the M1 Air 13" or the 4K 14" X1 Carbon works for me without subpixel rendering (I prefer to be docked).
Before anyone jumps in with "you've got incredible vision": I wear either glasses or contacts, and with those my vision corrects to better than 20/20; slightly lower correction induces headaches for me. Without glasses I'd probably be happy with 640x480 at 32", so they are kind of a must. :)
On medium-DPI screens, I find that subpixel antialiasing makes fonts significantly less blurry than grayscale antialiasing, without causing obvious color fringing. On actual low-DPI screens, bitmap fonts are IMO the only really usable option. (YMMV, but I have mild astigmatism and use glasses.)
They haven't sold anything without hiDPI for quite some time now (a decade?). Making their software look good on obsolete screens is understandably not a priority for them. And if you are happy to plug in something that old, you are kind of signaling that you don't really care about what things look like anyway. So why bother making that look good?
> They haven't sold anything without hiDPI for quite some time now (a decade?). Making their software look good on obsolete screens is understandably not a priority for them. And if you are happy to plug in something that old, you are kind of signaling that you don't really care about what things look like anyway.
My apologies for buying 1080p monitors that had no issues with either my Linux or my Windows computers, I guess. I can understand that they might not care about what I care about (supporting the hardware I already have, rather than me going out of my way to buy a new monitor just because a new computer decided not to work well with it), and I'd argue that maybe that's even fine, because it's their device and ecosystem. But jeez, that tone is super uncalled for.
As an aside, I use the M1 MacBook at a scaled resolution of 1440x900 because anything finer is hard for me to see. That's a visible PPI of around ~130 on its 13.3" screen. A 21.5" 1080p monitor has a physical PPI of around ~100, so roughly 80% of that pixel density. That's not to say the MacBook's panel isn't much nicer, but rather that with software anti-aliasing the external monitor could definitely be okay. Somehow I don't want to buy a new monitor just for the weekends when I visit the countryside.
I have a perfectly good normal-DPI 2560x1600 display which is extra crisp on Windows. On macOS I had to install BetterDisplay just to take it from miserably bad to merely plain bad. As far as I can tell, Apple removed the feature out of greed and laziness.
There are plenty of non-hiDPI screens from other vendors on the market, especially “large” screens that are “medium” in price. In an office you’re not always free to order a screen from any vendor you want (due to their framework agreements), unless of course you’re paying for that hardware privately.
I care about how things look, and have spent more time than I want to admit configuring macOS apps to look good on the screens available to me. I just don’t care enough to buy an expensive office screen with my own cash if my employer can’t provide one.
Apple specifically wants you to be unable to use non-Apple displays, artificially worsening the experience for the user while strengthening the illusion that Apple's hardware looks better, even though the only reason it does is that Apple itself made sure other displays look unnecessarily bad.
It's hilarious that there are people who actually think this is totally okay and not just plain anti-competitive, with just enough plausible deniability to get away with it.
In a few more words: not at all, not even slightly.
To explain briefly:
> Apple specifically wants you to be unable to use non-Apple displays
No. Apple does not make, sell, or offer non-hiDPI displays, and has not for over a decade. Apple mainly sells phones and laptops with built-in hiDPI screens. Desktop computers that use external screens are a small part of its range, and it sells its own very high-quality screens for those.
Because subpixel antialiasing is pointless on a hiDPI screen, and it only offers hiDPI screens, it removed subpixel antialiasing from its OSes.
However, the kit does still support old screens and you are free to use them. The subpixel antialiasing feature is gone, but to my (not very strong) eyesight it doesn't matter and stuff looks fine.
> artificially worsening the experience for the user
No. This is paranoia.
> It's hilarious that there are people who actually think this is totally okay
People think it's okay because your interpretation is paranoid.
> not just plain anti-competitive
How is REMOVING features anti-competitive? In what universe does taking something out of your products hurt your competition? That is absurd.
> How is REMOVING features anti-competitive? In what universe does taking something out of your products hurt your competition? That is absurd.
You're unironically arguing that EEE (embrace, extend, extinguish) isn't anti-competitive?
The whole strategy is about removing support or features at exactly the moment when users cannot realistically leave, putting the final nail in the competitor's coffin.
Simply put:
1. Initial product supports both equally
2. People start using your product
3. Competitors' products work less well
4. People use the better-working product, despite the fact that the downgrade in quality is artificial.
Or is it only anti-competitive if Microsoft does it, Apple being the last bastion of healthy competition on the market, with groundbreaking examples like the App Store and the green/blue bubbles in their chat app?