
  2. 17

    The advice here drastically improved how source code looks on my employer-required mac.

    While that alone deserves an upvote, the rest of the content is also worthwhile. I would like to bribe the writer to produce a post with tips for tuning my personal Linux laptop as well!

    1. 9

      For Linux, https://wiki.archlinux.org/index.php/HiDPI is a good place to start.

      With a 4K display I use xrandr with --scale, then use xrdb to set Xft.dpi to my monitor’s true DPI.

      Some desktop environments do funny things with scaling, so that can be a problem.
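
      For illustration, a minimal sketch of that setup; the output name, scale factor and DPI value are assumptions, so substitute your own from xrandr’s output and your panel’s spec:

      ```sh
      # List outputs and native modes first.
      xrandr --query

      # Render a larger virtual desktop and scale it onto the panel
      # (the output name "DP-1" and the 1.5x factor are examples).
      xrandr --output DP-1 --mode 3840x2160 --scale 1.5x1.5

      # Tell Xft-based applications the monitor's true pixel density
      # (~163 is what a 27" 4K panel works out to).
      echo "Xft.dpi: 163" | xrdb -merge
      ```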

    2. 17

      Honestly, I don’t get it. Why does it matter what the text looks like as long as it’s satisfactory?

      1. 29

        Different people have different thresholds for “satisfactory”, I guess?

        1. 6

          I don’t really buy this; it’s not satisfaction but habit. Sure, you notice a difference when you change, but it’s not like your brain has changed. It’s just the inverse of upgrading and admiring what’s better: after a while you get used to it. Just as you aren’t held back by that initial admiration, you won’t be by the initial annoyance.

          In the end it’s not pixels you’re looking at; as art tells us, whatever we’re looking at is in our head. And we’ve long passed the point where this kind of consumer scaremongering is necessary.

          1. 2

            I don’t really buy this, it’s not satisfaction but habit. Sure, you realize there’s a difference when you change, it’s not like your brain has changed.

            What is “habit”, if not your brain changing to optimize itself for a particular use case?

            1. 3

              Fair enough. My point is that this change isn’t permanent, and all it takes for someone to forget what resolution the screen is, is a week or two (except if it actually inhibits your work, of course).

            2. 1

              But what is satisfaction if not informed by habit?

            3. 1

              Something inexplicably obvious about it just doesn’t occur to me, it seems.

              1. 1

                … which is fine! My wife can’t see the difference, either.

            4. 16

              After using Retina and 4K displays for several years, when I’m forced to use a 1080p, 96 dpi monitor I find I no longer consider any text on it “satisfactory”. To me it all looks painfully bad now that I’m accustomed to a sharper, higher-quality experience. The eye strain after 8 hours of staring at fuzzy, low-res fonts takes a real toll.

              But others would be happy with a super low-res VT100, I’m sure. Everybody’s “satisfactory” is different.

              1. 6

                This reads to me like advice to avoid 4K as long as possible. If there’s no significant quantitative difference in efficiency or eye strain between 4K and 1080p, and I’m currently happy with 1080p, then switching to 4K will only make it unpleasant for me to use perfectly serviceable 1080p monitors, pushing me to needlessly replace the monitors I already own with more expensive ones, and increasing consumerism.

                1. 2

                  You’re certainly free to stick with what you’re accustomed to. I have no compunctions about spending a lot of money to get the absolute most comfortable experience possible out of something I’m probably going to spend a year or more of my life, cumulatively, staring at. It’s one of the cheapest possible investments in the pleasantness of my career on a dollars-per-hour-used basis.

                  1. 3

                    Explained that way, I understand where you’re coming from. Even if there’s no objective benefit to upgrading your monitor, and you already feel “perfectly comfortable”, making work slightly more pleasant is desirable.

                    Now, you still need to make the decision as to whether the benefit gained from the monitor upgrade is worth the money you’re spending on it, but that’s much more personal. Thanks for sharing your perspective!

                  2. 2

                    The eyestrain is already there; you are just accustomed to it.

                    1. 1

                      Citation needed.

                  3. 6

                    Doesn’t the VT100 use a bitmap font? That is the actual solution for sharp fonts on a low-res display: just use bitmaps at the correct size.

                    1. 4

                      The original VT100 is quite low-res and fuzzy. Later VT terminals used higher-resolution screens which looked better.

                      1. 4

                        There’s this fascinating story about how DEC used the fuzz to good effect in their bitmap font, as a primitive form of anti-aliasing. ‘Dot stretching’, phosphor response curves… well worth a quick read!

                        1. 2

                          This is wild. Thanks for the link!

                      2. 2

                        Bitmap fonts will avoid loss of sharpness due to antialiasing, but they’re not going to make an extremely low resolution screen any less low res, so I don’t know that I’d call 5 pixels arranged in a vaguely “e”-shape exactly “sharp”.

                        1. 1

                          There are bitmap fonts with considerably more than 5 pixels per “e”. Check out stuff like atarist for alternatives.

                          1. 2

                            We’re talking about the vt100. You can have high resolution bitmap fonts, but you can’t fix a low resolution screen with a high res bitmap font.

                    2. 2

                      I concur. To my eyes text with a 1.5x scaled 4K looks better than text with a 2x scaled 4K. I think the psychovisual system is complex and subjective enough to warrant “if you like it then it’s good”.

                      1. 1

                        Some people fetishize fonts, font rendering, font shapes, dithering, smoothing and other such visual trickery. The author of this piece has published a programming font, so I assume he puts more weight on font-related things than the average font consumer. Other people have other fetishes; my own is to cram as much onto the screen as I possibly can while still being able to distinguish what is written from the fly poop which partially conceals some characters on the screen. This means I always have to guffaw a bit when I see people lamenting the bad state of high-DPI support in Linux, since the first thing I end up doing is turning all that stuff off so I can get 16 terminals on a 15” display. To each his own, I guess…

                      2. 6

                        I agree that a Hi-DPI display is a good investment, even “just for text”. I spend 80% of my (working) time in either an IDE, a Terminal or an SQL client. Even when I’m using a browser it’s mostly to read (or write) text.

                        But I have to disagree with the “integer scaling” point re: macOS, particularly given that the article is about “upgrade your monitor”, which implies buying something. If you take a screenshot of a UI element or some rendered text and scale it up, then yes, you will see that picking non-integer-scaled UI in macOS will give you non-“crisp” lines. That he can’t show an example at the scale it actually appears (the way he does with the font-smoothing on/off screenshot) is telling. It also ignores the reality of the market.

                        Most traditional “low DPI” displays are around 100 to 110 DPI. All of Apple’s “Retina” displays are right around 220 DPI. For a 15” or 16” MBP with a DPI of 220, picking the exact “@2x” resolution (i.e. half the vertical and horizontal res of the physical panel) is not much of a change from the default, and will look very readable (and reduce a little GPU strain as well).
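
                        To put rough numbers on this (a back-of-the-envelope sketch; the panel sizes are just the ones discussed in this thread):

                        ```python
                        import math

                        def dpi(width_px, height_px, diagonal_in):
                            """Pixel density from a panel's native resolution and diagonal size."""
                            return math.hypot(width_px, height_px) / diagonal_in

                        print(round(dpi(2560, 1440, 27)))  # ~109: a traditional "low DPI" 27"
                        print(round(dpi(3840, 2160, 27)))  # ~163: a 27" 4K, the awkward middle
                        print(round(dpi(5120, 2880, 27)))  # ~218: a 27" 5K, Apple's "Retina" ballpark

                        # "@2x" integer scaling draws the UI at half the native resolution,
                        # so a 4K panel presents an image that "looks like" 1920x1080:
                        print(3840 // 2, 2160 // 2)
                        ```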

                        But let’s be honest: very few people work on just a laptop screen by choice. If you write software, there’s a better-than-average chance you are, or could be, more productive with more screen real estate. If you want to dispute this, you may as well stop reading now and start abusing me for making assumptions, because the rest isn’t gonna make you any less angry.

                        Ok, so you want an external display, and you want to use Hi-DPI mode so the fonts are all crisp and nice to look at. Great. If your vision is decent, and you want to follow the author’s “recommendation” about scaling, I’d argue that you have realistically just two choices for displays: the 27” 5K LG UltraFine, or the 32” 6K Apple Pro Display. There’s also some weird brand that has the same size/res as the 5K LG, but my understanding is that most people who buy one end up going through several returns until they just have to settle for the least bad unit.

                        But wait, you say. There are dozens and dozens of 4K displays on the market! Heaps of people have 4K 27” displays, and macOS detects them as Hi-DPI. Yes, it does. But this is where that integer-scaling thing comes in. If you run a 4K display at its “integer scaled @2x” resolution, it presents an image that “looks like” 1920x1080.

                        Have you seen 1920x1080 on a 27” display? I haven’t. I’ve seen it on 24” displays, where it’s just about liveable if you’re really obsessed with integer scaling. On a 27” I’d imagine it looks like you’re reading a “my first numbers” book for toddlers.

                        Sitting at arm’s length (literally: if I reach out I cannot quite touch the displays in front of me), I use two 24” 4K displays. And I use them at the dastardly “non-integer” scaled UI of “looks like 2304x1296”. If I put one back to integer scaling so it “looks like 1920x1080”, I can of course see a difference, but probably not the difference the author is suggesting. The difference I see is that on the one using integer scaling, everything looks weirdly over-sized. I would need to either press my face to the screen, or screenshot and scale it up a heap, to see any small imperfect pixel placements.

                        IMO this is the biggest benefit of a high-enough DPI: you aren’t fixed to the single resolution matched to the hardware, beyond which things look like absolute shit (ever tried running a ~110 DPI display at a non-native resolution, e.g. for someone with poorer eyesight? “I don’t care if it’s blocky, at least I can read it” was the common response to our (support team’s) shock at this phenomenon, circa 2004).

                        Obviously, the larger the display gets without increasing resolution, the lower your DPI, and so the more obvious the negative effects of both modes. As the physical size increases, in “integer scaling mode” you go from “my first numbers” text to standing on the writing on a football field and trying to read the text you’re standing on. In non-integer scaling mode, the “blurriness” of the rendered pixels will increase. I can’t comment on how visually noticeable this is, because while I accept non-integer scaling as my lord and saviour, I still aimed for the highest DPI (aka the lowest physical size compared to resolution) I could find.

                        1. 1

                          That he can’t show an example at the scale it actually appears (the way he does with the font-smoothing on/off screenshot) is telling.

                          Let’s imagine that the reader browses the page on a Mac with non-integer scaling. What’s the screenshot going to look like?

                          1. 2

                            My point is that he felt the need to scale the image to 800% (yes, really) to show a difference.

                            If you’re on a Mac, open up his image in Preview and compare it to the Finder toolbar. You need to scale his down to 12.5% to get it to appear the same as the actual toolbar controls. Now look at the content of his image. Apart from one being slightly smaller (due to his process to “simulate” a non-integer screenshot, no doubt), there is no way you can tell me you would notice the difference between the two.

                            Unless you’re designing UIs where you need to see literally one exact pixel, the argument that no one should ever use non-integer scaling is just blind (no pun intended) to (a) how eyes work, (b) why people use high-DPI displays and (c) the market of available products.

                        2. 6

                          time to shell out ~€1-2k for a new monitor or, since this boils down to “aliasing is bad”, just use a clean bitmap font for long editing sessions :)

                          it’s a shame that bitmap fonts are so rare now.

                          1. 2

                            Reading this article helped me to realize why I enjoy using a bitmap font as much as I do.

                          2. 5

                            I’m a little skeptical about the 120Hz thing. At least validate that your graphics chip and drivers can actually render that fast before spending the extra money. As a Linux user, what made the biggest difference for me (beyond the high-PPI display in the first place) was switching to Wayland. I hadn’t noticed all the tearing and stuttering in X until they were gone.

                            1. 2

                              Weirdly, I get heaps of screen tearing in Firefox under sway, yet it’s quite easy to configure Xorg to have no screen tearing…

                              1. 1

                                but it’s quite easy to configure Xorg to have no screen tearing…

                                that completely depends on the gpu and DDX driver you’re using. for example, on intel graphics I had greater stability using the ‘modesetting’ driver in X, but there was no vsync option for that (at the time; maybe it’s better now?)

                                for video playback tearing in firefox, make sure you have libva and the associated driver installed for your gpu. it could be that firefox is not using hardware acceleration for video playback?
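
                                fwiw, a minimal sketch of a tear-free config, assuming the xf86-video-intel DDX (the Identifier string is arbitrary):

                                ```
                                # /etc/X11/xorg.conf.d/20-intel.conf
                                # assumes the xf86-video-intel driver; TearFree syncs page flips to vblank
                                Section "Device"
                                    Identifier "Intel Graphics"
                                    Driver     "intel"
                                    Option     "TearFree" "true"
                                EndSection
                                ```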

                                1. 3

                                  It’s not even video playback tearing in Firefox on Wayland; even just scrolling pages tears. It’s bizarre.

                                  1. 1

                                    could be that firefox is not using hardware acceleration for video playback?

                                    FWIW, unless they’re using the latest Firefox build under Wayland, ff isn’t using accelerated video decoding. It’s been software-only for a while.

                                2. 1

                                  And also, your eye doesn’t really seem to notice most of the time anyway. I’m a believer in 4K (I swapped last year), but not yet convinced on high refresh rates.

                                3. 5

                                  Does anybody care about the environmental impact? This reads like a giant ad to buy a product instead of dealing with a problem of outline fonts in user environments. Anybody who has worked with fontconfig and TTF at a level more advanced than macOS menus should consider that this devil’s-advocate position is worthwhile.

                                  That position is to return to bitmap fonts. They are crisp and easy to read (though going to a 1-pixel stroke width is a bit far). Some people would then actually have to downgrade. Please research bitmap fonts and find ones that fit you, rather than producing more e-waste. “Out of sight, out of mind” does not apply to this planet. And displays, contrary to the shoes, pans and beds mentioned elsewhere in this thread, do not last.

                                  If somebody wants a HiDPI display that lasts, I would recommend e-ink, which is much better for prolonged reading anyway. Of course responsiveness would suffer, but the will towards ever-higher responsiveness leads towards higher and higher display power usage.

                                  1. 7

                                    The tone of the article also rubbed me the wrong way. It illustrates how quickly wealthy tech geeks normalize new technology: “low-res” to the author means 1920x1080.

                                    The actual most common screen resolution at the moment? 1366x768. I use a 1366x768 screen, and I routinely run into websites that don’t display properly, made by web designers who think as the author does.

                                    1. 3

                                      And displays, contrary to mentioned shoes, pans, beds, do not last.

                                      Why not? My screen has been used for 19,084 hours (according to the OSD). At 5 days a week, 10 hours a day, that’s more than 7 years.

                                      1. 1

                                        Of that I can only be jealous; in my view it is far from the rule. If you want, please state what display it is.

                                        1. 3

                                          It’s a 19” Eizo.

                                    2. 5

                                      I am currently thinking of investing in an ultra-wide monitor for my work-from-home office: a completely different direction from what is described in this article. I basically want more space, at the expense of pixel density.

                                      As I get older, I enjoy a clean layout and a clean desk a lot more. It is most certainly subjective, but minimalism brings me joy.

                                      As such, I am excited by the new LG lineup; their 34” and 38” models are now good compromises for gamers and software developers. I happen to be both!

                                      1. 6

                                        I’ve been using a 34” ultra-wide display with 3440x1440 pixels for… (checks) my goodness! Almost 5 years now. I’ve tried various multi-monitor setups but using this one display seems to be the sweet spot for most of what I do. The 38” 3840x1600 displays seem to have a similar pixel size (110 dpi vs 109?) so would probably be even better, though they weren’t as readily available at the time I bought this one. I believe these days you can even get 5120x1440 monsters?

                                        For testing purposes, I’ve also got an LG 27” UHD 4K display (~163 dpi). I can’t get on with this as a primary display with macOS. At native Retina resolution (“looks like 1920x1080”) everything seems huge, and I’m constantly fighting the lack of screen real estate. And as the article says, at native 1:1 resolution everything is too tiny, and the scaled modes are blurry. So I’m going to dissent from the advice of going for a 27” 4K. The ultra-wide 5120x2160 displays have the same pixel size, so I’d imagine I’d have similar problems with those, though the bit of extra real estate probably would help.

                                        Don’t get me wrong, I like high resolutions. But I think for this to work with raster based UI toolkits such as Apple’s, you basically have to go for around 200 dpi or higher. And there’s very little available on the market in that area right now:

                                        I can find a few 24” 4K displays which come in at around 185dpi. That wouldn’t solve the real estate issue, but perhaps a multi-monitor setup would work. But then you’ve got to deal with the bezel gap etc. again, and each display only showing 1080pt in the narrow dimension still seems like it might be a bit tight even when you can move windows to the other display.

                                        Above 200dpi, there are:

                                        • The LG Ultrafine 5K. Famously beset with problems, plus only works at 5K with Thunderbolt 3 inputs, and can’t switch inputs.
                                        • Dell UltraSharp UP3218K. This is an 8K (!) display at 31.5”. So it actually comes in at around 280dpi, plus of course it costs over 3 grand. I mean I’d be happy to give it a try… (I suspect I’d have to use an eGPU to drive it from my Mac Mini though - what the article’s author fails to realise is that DisplayPort 1.4 support depends primarily on the GPU, not port type, and to date I believe Intel GPUs only go up to DP 1.2.)
                                        • ASUS ProArt PQ22UC. Only 4K, but higher pixel density as the panel is only 21.6”. 4 grand though! I’m guessing this has awesome colour reproduction, but that’s wasted on me, so if I was to experiment with 4K displays, I’d go for the 24” ones which cost an order of magnitude less.
                                        • Apple’s Pro Display XDR. At 6K and 216dpi, I’m sure this one is lovely, but I don’t think it’s worth the price tag to me, particularly as it once again can’t seem to switch between inputs.

                                        That seems to be it? I unfortunately didn’t seize the opportunity a few years ago when Dell, HP, and Iiyama offered 5K 27” displays.

                                        Now, perhaps 27” 4K displays work better in other windowing systems. I’ve briefly tried mine in Windows 10 and wasn’t super impressed. (I really only use that OS for games though) It didn’t look great in KDE a few years ago, but to be fair I didn’t attempt any fine tweaking of toolkit settings. So for now I’m sticking with ~110dpi; I’ve got a 27” 2560x1440 display for gaming, the aforementioned 3440x1440 for work, and the 4K 27” for testing and occasional photo editing.

                                        I’m sure 27” at 4K is also great for people with slightly poorer vision than mine. Offloading my 27” 4K onto my dad when he next upgrades his computer would give me a good excuse to replace it with something that suits me better. Maybe my next side project actually starts making some money and I can give that 8K monitor a try and report back.

                                        Another thing to be careful with: high-refresh displays aren’t necessarily good. At the advertised 144Hz, my Samsung HDR gaming display shows terrible ghosting. At 100Hz it’s OK, though I would still not recommend this specific display.

                                        (Now, don’t get me started on display OSDs; as far as I can tell, they’re all awful. If I were more of a hardware hacker I’d probably try to hack my displays’ firmware to fix their universally horrible OSD UX. Of course, Apple cops out of this by not having important features like input switching in its displays and letting users control only brightness, which can be done from the OS via DDC.)
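
                                        (On Linux, at least, DDC/CI is scriptable; a small sketch using the ddcutil tool, assuming the monitor exposes the standard VCP feature codes:)

                                        ```sh
                                        # Probe which VCP features the monitor supports over DDC/CI.
                                        ddcutil capabilities

                                        # VCP feature 0x10 is brightness; set it to 70%.
                                        ddcutil setvcp 10 70

                                        # Many monitors expose input selection as VCP feature 0x60.
                                        ddcutil getvcp 60
                                        ```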

                                        1. 1

                                          I switched from a single LG 27” 4k monitor to two LG 24” 4k monitors for around $300/each. I’m happy with the change. Looking forward to the introduction of a 4k ultrawide to eliminate the bezel gap in the middle; currently all such ultrawides are 1440p.

                                        2. 1

                                          The 34WK95U-W is a high-DPI ultrawide with good color reproduction. It has temporary burn-in problems but I’ve been using two (stacked) for a year and overall I’m happy with them.

                                          They aren’t high-refresh though (60Hz).

                                        3. 4

                                          This is exactly why I opt to bring in my own 4K display in lieu of my employer-provided ultra-wide. I honestly don’t need more screen real estate; I need less eye strain.

                                          1. 4

                                            4k monitor only makes sense with 2× / 200% scaling

                                            Yep. Can’t stress enough how much of a bad idea non-integer scaling is.

                                            1. 2

                                              This is subjective. I use a 27” 4K display at 1.5x and it’s fine. I can’t see pixels from the distance I sit at, so it looks sharp to me.

                                              1. 1

                                                Similarly, I can’t stress enough how much of a bad idea it is to assume everyone else’s priorities are the same as yours.

                                              2. 3

                                                I’ll upgrade when the backlight or image goes out, as is the natural way. My Yamakasi 30” 2560x1600 IPS went out, so I’m on an NEC 20” 1600x1200 IPS. It’s not great at distinguishing light greys, and it’s kind of low-contrast in a bright room, but otherwise it’s great. Both are almost exactly 100 ppi.

                                                Sitting further away from the monitor and increasing the font size (and using Fira fonts :P), things look good to me. In my younger days I used to be able to see the pixels (I can still picture Verdana 9 in my mind), but now I sit further away and antialiasing is less noticeable. I’ve worked on Retina MacBook Pros, and while good typography looked great (like NYTimes serifs, mm), a lot of type looked “cold” (not sure how else to describe it; like a tree without leaves). The downside is that 20” is barely big enough for a terminal plus an editor at that distance.

                                                I’m eyeballing a BenQ 3200U, 32” 4K, and a long cable to suit, but it’d be well over $500, so I’ll run the NEC a while longer.

                                                1. 3

                                                  Regarding differences in working setup and environment, I think about three factors:

                                                  • You notice
                                                  • You care
                                                  • You are affected positively or negatively (we can ignore neutral effects)

                                                  Taking monitors as example, here are some examples to show why I make this split:

                                                  • You may notice the viewing angles of your monitor aren’t great, but not care, whether or not you’re affected.
                                                  • You may care about monitor refresh rate, but never actually notice. I’ve seen people report that they were loving 120Hz, but realised after months that they’d actually been running at 60.
                                                  • It took me years to notice that I could see flicker from CRT monitors with refresh rates below 72Hz. Eventually I realised this was correlated with headaches. With appropriate refresh rate my headaches were significantly reduced.

                                                  Monitors can get extremely expensive, so please try them out before committing to buying, if you can. Some people may be just fine with a single 30Hz 1280x1024 TN panel, while others may find they are 100% more productive, healthy and happy only with six 32” 5K 120Hz IPS HDR panels.

                                                  1. 3

                                                    As a 120Hz monitor user, I’d gladly throw this $1,500 monitor in the trash for a 27” 5K monitor that worked with Windows over a DisplayPort connection, even if it were only 60Hz. I don’t think any connector would even support 5K@120Hz anyway. I’m holding out hope that Microsoft will eventually make the Surface Studio display available as a standalone monitor, since that’d get you a little more than 4K.

                                                    1. 3

                                                      all these tricks work. Having them is strictly better than not having them. For low-DPI displays all those are a must

                                                      ehh, I prefer non-subpixel (i.e. monochrome) antialiasing even on low-DPI displays. The subpixel color artifacts are way too noticeable and ugly.

                                                      (Of course I’m talking about freetype and good fonts (Cantarell, Fira, etc) with no hinting; Windows is unusable w/o subpixel, sadly)
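
                                                      (As a sketch, this is roughly how that preference looks in fontconfig; the path is an example, rgba “none” disables subpixel rendering and hintnone disables hinting:)

                                                      ```xml
                                                      <?xml version="1.0"?>
                                                      <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
                                                      <!-- e.g. ~/.config/fontconfig/fonts.conf -->
                                                      <fontconfig>
                                                        <match target="font">
                                                          <!-- grayscale antialiasing, no subpixel rendering -->
                                                          <edit name="antialias" mode="assign"><bool>true</bool></edit>
                                                          <edit name="rgba" mode="assign"><const>none</const></edit>
                                                          <!-- and no hinting -->
                                                          <edit name="hinting" mode="assign"><bool>false</bool></edit>
                                                          <edit name="hintstyle" mode="assign"><const>hintnone</const></edit>
                                                        </match>
                                                      </fontconfig>
                                                      ```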

                                                      1. 2

                                                        I’m using Windows after a long time in Linux, and man, I can reproduce a lot of the old environment (especially in modern Windows, with WSL etc.), but the font differences are hard to ignore.

                                                        It’s hard to articulate what bugs me, but I think it’s partly that I’m no longer used to hinted fonts in general, or to Arial and friends specifically. The look of the modern-era UI (Segoe, big margins, etc.) doesn’t bug me nearly as much, and it all bugs me a bit less at 4K than at 1080p (hinting’s effect is weaker?). I do wish I could tweak my way out of the standard look, but I’m not holding my breath.

                                                      2. 3

                                                        I sympathize strongly with the author’s efforts. I’ve used Retina MacBooks (largely for their superior font rendering) for most of the past decade but, contrary to the author, have come to the conclusion that vector fonts only look good at >400 ppi.

                                                        A few years back I found GohuFont and have been in love with bitmap fonts since. If that doesn’t suit your taste you can find a bunch more on GitHub. After using them for a while, hinted text just looks blurry to me.

                                                        Since I have more control over my environment these days, I’ve been switching my desktops over to Linux and using bitmap fonts in more places. I’ve been able to get lower pixel density displays on my newer laptops since I don’t need it for my fonts to look good. Other bonuses include longer battery life and not having to waste time configuring high-dpi for everything.

                                                        Another thing to note: if you rotate your monitor 90°, the subpixel arrangement rotates too. I usually work with a laptop plus an external monitor rotated to portrait. I assume it’s a lack of time/effort on my part, but I was never able to get vector fonts to render well in this configuration. It could just be due to the reduced effective horizontal resolution of the vertical monitor. Either way, bitmap fonts save me from this mess.

                                                        1. 5

                                                          Y’all are spending more than $200 on monitors?

                                                          1. 31

                                                            My grandfather, god rest him, was a frugal man. But he often said the two things you should spend extra on are shoes and mattresses, because “when you ain’t in one you’re in the other!” Maybe monitors are shoes for programmers.

                                                            But the ridiculously high end strikes me as maybe a bit much: my quality of life (legit, less squinting and headaches) improved with a 27” 4K monitor, but that was in the $300s.

                                                            1. 14

                                                              I am a cheapskate. I am loath to spend money.

                                                              My office chair costs $750. I sit in it for a minimum of eight hours a day.

                                                              1. 3

                                                                I feel this way about monitors and keyboards. I’ll pay much more for good input/output devices, because that’s how I interact with the computer.

                                                                Personally, I would want a 60Hz 5K at 27”, or a 60Hz 8K at 32-34”. It annoys me that the screen on my computer (16” MBP) is better than any external monitor I could reasonably hope to use.

                                                                1. 1

                                                                  The Dell UP3218K is 31.5” and 8K, but it’s also $3,300, and only works with computers that support DisplayPort Multi Stream Transport over two DisplayPort ports.

                                                                  1. 1

                                                                    Yeah, it’s going to be a few years.

                                                                2. 1

                                                                  Yeah, I came here to mention you can get 4K for way less than any of the monitors suggested in the post. I got a matte LG for $250 a while back.

                                                                  I have to admit I thought a game running at 30Hz felt “smooth”, so I’m not sure I could tell 60 from 120 without a slow-motion camera. YMMV, of course.

                                                                  1. 2

                                                                    Things with a lot of motion will appear smoother than things sitting completely still. A 30Mhz desktop, with most things not moving (wallpaper, etc) will flicker like crazy since there’s no movement to mask the flicker.

                                                                    1. 2

                                                                      A 30Hz desktop that’d be; we’re still a few centuries away from a 30MHz refresh rate.

                                                                      1. 2

                                                                        pft Your monitor takes longer than Planck time to draw a full frame? n00b.

                                                                        1. 3

                                                                          Nah, mine is so fast the photons end up in a traffic jam trying to work their way out of the screen, talk about red shift. Or maybe the CCFT is going bad, who knows…

                                                                      2. 1

                                                                        Interesting. FWIW, I wasn’t trying to say anyone should go down to 30Hz (or up to 30MHz heh) just that I, personally, probably wouldn’t feel much benefit from 120, given I was able to mix up lower refresh rates.

                                                                      3. 2

                                                                        You will notice running a desktop at 30Hz. When I got my 4k monitor a few years ago, it turned out my USB-C <-> HDMI adapter could only do 4k@30Hz. It was disturbing ;).

                                                                        1. 2

                                                                          Oh, yeah, I wasn’t arguing for actively downgrading to 30hz, just saying I probably wouldn’t feel much benefit from going to 120 given my rough perception of smoothness. I see how it reads different.

                                                                    2. 5

                                                                    I spend >$200 on frying pans, for the same reason others mention shoes and beds. It’s something I use every day, and the slight increase in cost per use is well worth it for a tool I enjoy using.

                                                                      Edit I’d also like to add that I’m in an economic situation that allows me to consider $200 purchases as “not a huge deal”. I do remember a time of my life when this was emphatically not the case.

                                                                      1. 4

                                                                      Funny that, I also care about things like that… which is why I got them for free from abandoned houses and even, once, from a ditch by the roadside. That is where you’ll find old rusting cast-iron skillets in need of just a bit of TLC: a rotary steel brush, a coat of oil and a bake in the oven. The one I found in the ditch was quite fancy, albeit rusty: a large Hackman with a stainless steel handle. How it ended up in that ditch in the Swedish countryside I have no idea; I never saw any mention of an unsolved murder case missing its evidence in the form of the obviously heavy blunt object used to bash in the skull of the unfortunate victim. It was slightly pitted, but the steel brush made it almost like new. I use these on a wood-fired stove, just what they’re made for.

                                                                      Beds I have always made myself (including high, wall-mounted, rope-suspended, sailing-ship-inspired ones with retractable ladders, which you’d be hard-pressed to find elsewhere), shoes occasionally (basic car-tyre sandals). I find it far more satisfying to spend some time making something from raw or basic materials (beds, sandals), or reviving something from abandonment (cookware, computing equipment, electronics, etc.), than to just plunk down more money. Another advantage is that stuff you made yourself can usually be fixed by yourself as well, so it lasts a long time.

                                                                      2. 3

                                                                      I just upgraded my home-office monitor for about $30. Suffice to say it’s not 4K, IPS or any of these things considered ‘essential’ for developers. Fourteen years old, it is, however, significantly better and sharper than the monitors most programmers worked on until the 1990s. And they did better work than I ever did.

                                                                        If you like spending money on monitors, be my guest, but if you write a blog insisting others should do the same, I think we should call this article out for what it is: promotion of conspicuous consumption.

                                                                        1. 2

                                                                        If you write graphical applications or websites, it makes sense to have something reasonably good, at least with a high pixel density, because if you work on a website only with a low-DPI display and try it on a hiDPI display later, you will likely be surprised!

                                                                        It doesn’t have to be top-notch; the idea is just to get reasonably close to what Apple calls “Retina”. I can find 27” 4K IPS displays for around €400 on the web.

                                                                          Also, it’s not exactly the same use case but a lot of entry-level phones and tablets have very nice displays nowadays.

                                                                          1. 1

                                                                          if you work on a website only with a low-DPI display and try it on a hiDPI display later, you will likely be surprised!

                                                                            does this not work both ways?

                                                                            1. 1

                                                                            Not really, in my experience. CSS units (px, em) are density-aware and scale nicely, browsers take care to draw lines thinner than one physical pixel properly, and even downscaling a raster image isn’t a big deal given how fast modern computers are.

                                                                              1. 3

                                                                                i can only speak for myself, but using a 1024x768 screen for the web has been a pretty poor experience in recent years. a lot of the time fonts are really large and there is a lot of empty space, making for extremely low information density and often requiring scrolling to view any of the main content on a page. sometimes 25-50% of the screen is covered by sticky bars which don’t go away when you scroll. it makes me think web developers aren’t testing their websites on screens like mine.

                                                                                1. 1

                                                                                  Some websites really suck, no doubts about it. But web standards are carefully designed to handle different pixel densities and window sizes properly, even if they can’t ensure that websites don’t suck.

                                                                                For example, many bad websites change the default size of the text to something ridiculously big or small. This is a really bad practice. Better websites don’t change the default font size much, or (even better) don’t change it at all, and use em and rem CSS units to make everything relative to the font size, so the whole website scales seamlessly when zooming in and out.
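
                                                                                (A minimal sketch of that pattern; the selectors and values are only illustrative:)

                                                                                ```css
                                                                                /* Leave the root font size alone so it tracks the user's default, */
                                                                                /* and size everything else relative to it. */
                                                                                article {
                                                                                  font-size: 1rem;   /* the user's preferred size */
                                                                                  line-height: 1.5;  /* unitless, so it scales with the font */
                                                                                  max-width: 40em;   /* the measure tracks the font size too */
                                                                                }

                                                                                h1 {
                                                                                  font-size: 2rem;   /* scales with zoom and pixel density; no px anywhere */
                                                                                }
                                                                                ```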

                                                                                  Note that if your browser/operating system is not aware of the pixel density of your display, everything will be too big or too small by default. Basically, zooming in/out is the way to fix it. If you want to test your setup with a reference, well-designed and accessible website you can use some random page on the Mozilla docs.

                                                                                  1. 1

                                                                                    Some websites really suck, no doubts about it. But web standards are carefully designed to handle different pixel densities and window sizes properly, even if they can’t ensure that websites don’t suck.

                                                                                    And you’re saying this means a web designer with a high DPI display can rest assured that his website will look good on a low DPI display, as long as he follows certain practices?

                                                                                    Why doesn’t the same apply in the reverse case, where the designer has a low DPI display and wants their website to be usable on a high DPI display?

                                                                                  I have to say even the MDN site wastes a lot of space, and the content doesn’t begin until halfway down the page. There’s a ton of space wasted around the search bar and the menu items in the top bar, and around the headers and what appear to be <hr>s.

                                                                                    1. 1

                                                                                      And you’re saying this means a web designer with a high DPI display can rest assured that his website will look good on a low DPI display, as long as he follows certain practices?

                                                                                      Yes I think so. In fact Chrome and Safari have low DPI simulators in their dev tools.

                                                                                      Why doesn’t the same apply in the reverse case, where the designer has a low DPI display and wants their website to be usable on a high DPI display?

                                                                                      Well it does to some extent, but typically you have to be careful with pictures. Raster images won’t look sharp on high DPI displays unless you’re using things like srcset. Of course it’s absolutely not a deal breaker but it is something to have in mind if you do care about graphics.
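
                                                                                    (A small sketch of srcset; the file names and the 2x variant are made up:)

                                                                                    ```html
                                                                                    <!-- The browser picks the candidate matching its device-pixel-ratio. -->
                                                                                    <img src="chart-800.png"
                                                                                         srcset="chart-800.png 1x, chart-1600.png 2x"
                                                                                         alt="Example chart" />
                                                                                    ```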

                                                                                    Anyway, I think the vast majority of web designers are using high-DPI displays nowadays.

                                                                                    I have to say even the MDN site wastes a lot of space, and the content doesn’t begin until halfway down the page. There’s a ton of space wasted around the search bar and the menu items in the top bar, and around the headers and what appear to be <hr>s.

                                                                                    Indeed, and the header also wastes a lot of space (though not half the page) on my high-DPI 13” display. It’s a bit funny because I hadn’t noticed it before: when I’m looking for something on this website, my eyes just ignore the large header and I start searching or scrolling immediately.

                                                                                    But this “big header” effect is less present in “desktop mode”, so you could try zooming out if the resulting font size isn’t too small for you. I’ve tested it with the device simulator in Safari at about 1220x780 and it does not look that bad to my eyes.

                                                                                      1. 1

                                                                                        Well it does to some extent, but typically you have to be careful with pictures. Raster images won’t look sharp on high DPI displays unless you’re using things like srcset. Of course it’s absolutely not a deal breaker but it is something to have in mind if you do care about graphics.

                                                                                      Yeah, I guess this is the one area where low-DPI displays could be easier to target without personally testing on one. A large image shrunk will look fine, while a small image enlarged will look like dog shit.

                                                                                      The use of high-DPI displays by most web designers probably explains why modern sites look so shitty on low-DPI displays. But it also means you won’t get fired for making a site that looks shitty on low-DPI displays. And it makes sense from a corporate perspective, as high-DPI displays are more likely to be used by wealthier people, who will be a larger source of revenue, even if low-DPI displays are still in widespread use.

                                                                          2. 1

                                                                          A decent monitor lasts a good 3-5 years, possibly longer, but let’s say 3 and be pessimistic. What is a $1,000 monitor worth as a percentage of your salary over three years? More to the point, what is it as a fraction of the total cost of employing you for three years? According to Glassdoor, the average salary for a software developer in the USA is around $75K. Including all overheads, it likely costs around $150K a year in total to employ a developer. Over three years, that’s $450K. Is a $1,000 monitor going to make a developer 0.2% more productive over three years than a $200 monitor would? If so, it’s worth buying.
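
                                                                          (The same arithmetic as a quick sketch, using only the numbers above:)

                                                                          ```python
                                                                          cost_per_year = 150_000              # fully loaded cost of one developer
                                                                          employment_cost = cost_per_year * 3  # three-year horizon

                                                                          monitor_premium = 1_000 - 200        # extra cost of the nicer monitor

                                                                          breakeven = monitor_premium / employment_cost
                                                                          print(f"{breakeven:.2%}")            # ~0.18%: the productivity gain needed
                                                                          ```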

                                                                          3. 5

                                                                            How silly of me; I must have somehow forgotten that bitmap fonts don’t exist. Glad I have this fellow to set me straight.

                                                                            1. 6

                                                                            Care to explain what you mean by this comment? It reads like sarcasm, but not all of us are as enlightened as you to understand it.

                                                                              1. 4

                                                                              The author seems to hate aliasing in all its forms, but it really only happens with outline fonts. So if you insist on using outline fonts but also don’t want to see aliasing, you have to resort to a monitor with a resolution high enough to fool your eyes.

                                                                                Or you can just use a bitmap font, and never have this problem in the first place, whatever your screen resolution.

                                                                                1. 4

                                                                                  Do bitmap fonts not work on 4k displays or something? I’m not sure what you’re implying

                                                                                  1. 4

                                                                                    The article implies that a 4K display is the only way to avoid aliasing/blurring when rendering fonts; bitmap fonts are an obvious counterexample.

                                                                                    1. 2

                                                                                      Ah gotcha. What are your favourite bitmap fonts? The only ones on my system are monospaced so I don’t see them very often.

                                                                                2. 1

                                                                                  I just purchased a Unicomp keyboard. Let’s see if I can retrain myself to hit the keys less harshly, and end the day with hands that don’t feel stiff…

                                                                                If I had to buy a new monitor, it would probably be 120Hz, again in hopes that it would help reduce latency and make the keyboard feel less spongy. But that probably also has to do with the fact that I have (genetically) poor vision, so I run with an increased font size anyway.

                                                                                  1. 1

                                                                                  I truly don’t understand why high-resolution monitors are popular. What’s the benefit? You upscale everything anyway, so you don’t get any more screen space. You get a few more actual pixels per “pixel”, sure. I guess that’s a small benefit.

                                                                                  The huge drawback, though, is compatibility. If you’re on Windows, you constantly use programs that do not support DPI scaling. So Windows upscales them “artificially”, just as you would zoom a picture, making them blurry and their text less readable. I prefer all text looking good to some text looking spectacular and other text looking terrible.

                                                                                    And if you just use the monitor at its “true” resolution, without any scaling, everything will be so small that it is unreadable. I don’t know if it’s just me, but I cannot use laptops with high-resolution screens without straining my eyes, at least unless they use GUI scaling.

                                                                                    1. 2

                                                                                    This is a software issue, and some operating systems actually do it right: Android, iOS, macOS, … You can use those with high-pixel-density displays without the problems you describe. On a properly configured system, the pixel density doesn’t change the physical size of the text.

                                                                                      1. 1

                                                                                      Indeed it is a software issue, but it’s such a big and common one that I don’t find higher pixel densities worth the effort. Even if I used an operating system that didn’t have any problems with scaling, perhaps some day I’d want to use, or just play with, another operating system that does have problems with it, like Windows. That, combined with the fact that I don’t really notice the supposedly improved quality of the text, makes me hesitant.