1. 24
  2. 10

    I have a lot of trouble interpreting a table with four different emoji faces.

    1. 3

      It’s not optimized for the least surprise, for sure, but I enjoyed it: the “level of support” can be a bit subjective, and smileys convey that subjectivity very well.

    2. 4

      The tl;dr is: use Wayland, because X just wasn’t designed for this.

      1. 3

        If all monitors have the same scale, X can look fine. But it absolutely cannot handle both low-DPI and high-DPI monitors at the same time.

        1. 1

          Although this does work fine with Qt5.

          1. 1

            It can, using a specialized xrandr configuration, and it looks great. The only problem I had (which stopped me from using it) is that there’s a bug in the Intel driver that makes the cursor on the laptop monitor flicker, which is more annoying than you’d think.
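
            For the curious, the usual recipe (and roughly what I mean) is to render everything at the HiDPI scale and let xrandr upscale the low-DPI panel so both screens share one logical resolution. A sketch with placeholder output names and modes (xrandr -q lists yours):

                # 4K external at native resolution; the 1920x1080 laptop panel
                # is upscaled 2x2 so its logical size matches the external's
                xrandr --output DP-1 --mode 3840x2160 --pos 0x0 \
                       --output eDP-1 --mode 1920x1080 --scale 2x2 --pos 0x2160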

        2. 6

          Wow if I bought a monitor with a dead pixel and weird lines on the screen it’d be back in the shop before you could say ‘Consumer Guarantees Act 1993’. Especially on such expensive high end hardware. I was upset enough that my monitors’ colour balance isn’t quite the same.

          EDIT: I also find it absolutely hilarious that DPI scaling works fine in Qt 5, and works fine in actual web browsers, but doesn’t work in Electron, the supposedly ‘modern’ UI framework.

          1. 4

            He didn’t even align the displays with each other … AAAAAAARGHRGHGHRGH.

            1. 3

              DPI scaling works fine for Electron apps based on a Chromium version that supports DPI scaling. This has been the case for quite some time now, and Chromium’s move to GTK3 has improved support even further. I’m not sure which Electron apps the author was using that didn’t support DPI scaling; however, I’ve yet to come across one that doesn’t scale on my 4K laptop screen. Both VS Code and Slack work flawlessly for me.

              I got my XPS 9560 in early 2017 with a 4K screen, so I was initially quite worried about scaling issues. However, the only apps I ever have issues with are older GTK2 apps (Gimp and pgAdmin are the only two that I use).

              1. 2

                DPI scaling works in Electron apps, but I often have to specify it per app (often by using Ctrl +/- for the browser zoom). … It is kinda a step backwards when you think about it.

                1. 1

                  I am using Spotify. I have just checked, and it’s still not scaling correctly without the appropriate command-line option. I’ll add a note that this may depend on the Electron app.
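
                  If anyone wants to try the same workaround: the flag that typically does it for Chromium-based apps, Spotify included, is the device scale factor override; pick whatever factor suits your screen (fractional values like 1.5 work too):

                      spotify --force-device-scale-factor=2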

                  EDIT: maybe Spotify is not an Electron app, but a CEF app. Is there still a difference?

                  1. 1

                    The version of Chromium that CEF/Spotify uses seems to lag pretty far behind contemporary Electron builds, judging by https://www.spotify.com/ro/opensource/

                    1. 1

                      Chromium 65 is recent enough to have the appropriate code. But maybe CEF doesn’t make use of it. I’ll update the table to mention that Electron apps work fine.

                      1. 1

                        Spotify for Linux has been around since before Electron existed, so Spotify not using it isn’t much of a surprise.

                        According to this page, Electron doesn’t make use of CEF, and instead calls Chromium’s APIs directly, which is probably why Electron apps are able to scale correctly while Spotify doesn’t.

                    2. 1

                      I use Spotify every day in a HiDPI environment and have never had an issue. If the text looks too small the first time you load it, use the built-in zoom feature (Ctrl+/Ctrl-) to bring the font to a readable size; it’ll be saved and you won’t have to worry about it anymore.

                  2. 1

                    Wow if I bought a monitor with a dead pixel and weird lines on the screen it’d be back in the shop

                    The policy allowing some handful of dead/stuck pixels has been written into the warranties of most monitors literally since LCD computer monitors have been around. Because most people use their monitors for web browsing, email, document editing, etc., where a couple of extremely tiny black specks are truly insignificant and will literally never be noticed among all of the dust that accumulates on every screen.

                    If you want a monitor that comes with zero dead pixels guarantee, they certainly sell those, but they cost more as well since there’s more QA involved.

                    1. 1

                      The policy allowing some handful of dead/stuck pixels has been written into the warranties of most monitors literally since LCD computer monitors have been around.

                      They can write whatever they like in the agreement that I never signed or agreed to when I bought a monitor from a shop. It’s completely irrelevant. I’m not talking about returning it to the manufacturer under their warranty; I’m talking about returning it to the shop I bought it from under consumer protection law.

                      Because most people use their monitors for web browsing, email, document editing, etc., where a couple of extremely tiny black specks are truly insignificant and will literally never be noticed among all of the dust that accumulates on every screen.

                      My monitor has no dead pixels. If it got a dead pixel, I would notice immediately. They’re incredibly obvious to anyone that isn’t blind.

                      If you want a monitor that comes with zero dead pixels guarantee, they certainly sell those, but they cost more as well since there’s more QA involved.

                      No, monitors that come with a ‘zero dead pixels’ guarantee are all monitors.

                      1. 1

                        They can write whatever they like in the agreement that I never signed or agreed to when I bought a monitor from a shop.

                        Nobody mentioned an agreement. A warranty is not the same as an agreement or contract.

                        I’m not talking about returning it to the manufacturer under their warranty; I’m talking about returning it to the shop I bought it from under consumer protection law.

                        It would have been useful to mention that you’re apparently in New Zealand. If I understand it, the law you’re talking about requires every retailer to accept returns of purchased merchandise. Not all countries have such a law. In the U.S., for instance, almost every store accepts returns whether or not the merchandise is defective, but this is simply good customer service; it’s not a legal requirement.

                        So now the argument hinges on what is considered defective and who gets to decide that. Is it up to the manufacturer? The retailer? The end user? In your country, I honestly don’t know and don’t care enough to research it right now.

                        They’re incredibly obvious to anyone that isn’t blind.

                        No, not really. Dead pixels are only obvious when the entire area around the dead pixel is one solid bright color, and even then, are generally indistinguishable from dust. Most people will never notice a dead pixel in everyday use, especially as the pixels in monitors get smaller and smaller. I have a huge monitor with a ridiculous resolution at home. It has a couple of dead pixels; it’s been months since I last noticed them. But by god it was like $200 on Amazon. I’ll happily save a few hundred dollars to deal with a couple of dead pixels I very rarely notice.

                        No, monitors that come with a ‘zero dead pixels’ guarantee are all monitors.

                        In New Zealand, maybe, but that’s not at all a universal statement. Nor should it be.

                        The realities of the LCD manufacturing process are such that if every LCD panel manufacturer threw out all of their panels with one or more dead pixels, every monitor produced would cost the end user a lot more. Because not only do you need better QA, you’re throwing a significant percentage of your yield into the trash. That has a dual negative impact: not only did you waste precious factory time and expensive resources on the panel, but now it has to be thrown away into a landfill or processed for recycling, if that’s even possible.

                        It’s far more efficient from a manufacturing, environmental, and market standpoint to just sell the slightly imperfect panels at a discount and sell the perfect panels for whatever the market will bear for zero dead pixels. Which is exactly what most manufacturers do. You want zero dead pixels, buy the one with the zero dead pixels policy. Here is Dell’s version of that: https://www.dell.com/support/article/nz/en/nzbsd1/sln130145/dell-lcd-display-pixel-guidelines?lang=en

                        1. 1

                          Nobody mentioned an agreement. A warranty is not the same as an agreement or contract.

                          A warranty is an example of an agreement. You purchase the thing, and they agree to take it back if it’s faulty. But they can set whatever terms they like: they can define what taking it back means, define timelines, define ‘faulty’, etc. It’s completely up to them, really. If you don’t like it, don’t buy it.

                          It would have been useful to mention that you’re apparently in New Zealand. If I understand it, the law you’re talking about requires every retailer to accept returns of purchased merchandise.

                          Only if it’s faulty.

                          So now the argument hinges on what is considered defective and who gets to decide that. Is it up to the manufacturer? The retailer? The end user? In your country, I honestly don’t know and don’t care enough to research it right now.

                          The same way anything is decided legally: it starts off a bit fuzzy around the edges, but in the vast majority of cases, it’s pretty obvious what it means for something to be faulty. And in a few edge cases, it gets decided by the legal system which sets a precedent that sharpens the edges for everyone else in the future.

                          No, not really. Dead pixels are only obvious when the entire area around the dead pixel is one solid bright color, and even then, are generally indistinguishable from dust. Most people will never notice a dead pixel in everyday use, especially as the pixels in monitors get smaller and smaller.

                          I can guarantee I’d notice any dead pixels on my 1920x1200, 24 inch monitor. I can guarantee I’d notice any dead pixels on my phone. I think it’s nonsense to claim that most people would never notice a dead pixel in everyday use. A bright dot in the middle of your monitor is going to be obvious if you’re watching something that’s dark. The moment you watch a movie there’s an unmoving bright green dot in the middle of the screen? Everyone is going to notice that.

                          I have a huge monitor with a ridiculous resolution at home. It has a couple of dead pixels; it’s been months since I last noticed them. But by god it was like $200 on Amazon. I’ll happily save a few hundred dollars to deal with a couple of dead pixels I very rarely notice.

                          It’s fine if the manufacturers and retailers sell them at a discount as seconds. But that’s not what they’re doing. They’re selling them as normal and then just hoping people can’t be bothered complaining about them and returning them.

                          The realities of the LCD manufacturing process are such that if every LCD panel manufacturer threw out all of their panels with one or more dead pixels, every monitor produced would cost the end user a lot more.

                          For a start, nobody is saying that they have to throw them away. As I said, they could sell them at a discount as seconds. Some people would be fine with that, others wouldn’t; that’s friendly to the customer and lets them make a choice with a trade-off.

                          It’s far more efficient from a manufacturing, environmental, and market standpoint to just sell the slightly imperfect panels at a discount and sell the perfect panels for whatever the market will bear for zero dead pixels. Which is exactly what most manufacturers do.

                          That’s absolutely not what they do. They sell them all at a price somewhere between those two prices, and when you buy a monitor you roll the dice. Maybe you’ll be lucky, maybe you won’t. People who want a good monitor that actually works as advertised have to roll the dice, and someone like yourself who doesn’t care has to pay a higher price (for the chance of getting a perfect monitor) than they’d pay if they were able to specifically buy a monitor with a couple of dead pixels at a discount.

                  3. 3

                    I have the exact same screen setup, and last week I installed Arch Linux with i3. All I had to do was set Xft.dpi: 144 and *dpi: 144 in ~/.Xresources and run xrdb -merge ~/.Xresources. Everything I use gets scaled as a result.
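
                    In case it’s useful, the whole configuration is just this (144 = 1.5 × the 96 dpi baseline; adjust to your screen):

                        # ~/.Xresources
                        Xft.dpi: 144
                        *dpi: 144

                        # load it into the X resource database (many display
                        # managers also merge ~/.Xresources at login)
                        xrdb -merge ~/.Xresources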

                    1. 2

                      I’ve had my own struggles using a HiDPI laptop with dual external 1080p displays connected.

                      My solution was to run my laptop at half its native resolution and stick everything at 96 dpi, which plays nicer than trying to make multiple DPIs work.
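
                      Roughly like this, with placeholder output names (xrandr -q lists yours):

                          # drive the HiDPI panel at half its native resolution and
                          # pin 96 dpi so all three displays share the same scale
                          xrandr --dpi 96 \
                                 --output eDP-1 --mode 1920x1080 \
                                 --output HDMI-1 --auto --right-of eDP-1 \
                                 --output HDMI-2 --auto --right-of HDMI-1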

                      Are we going to see that just work on Linux soon? On my MacBook, for example, DPI just works even when displays with different DPIs are connected.

                      1. 1

                        Are we going to see that just work on Linux soon? On my MacBook, for example, DPI just works even when displays with different DPIs are connected.

                        Aside from outdated toolkits, this works on Linux with Wayland now as well. I have used my workstation with HiDPI and LoDPI screens for a while, with the HiDPI screen set to 2x scaling and the LoDPI screen to 1x. This worked nicely, even when dragging windows between screens (with modern Qt and Gtk+3 programs).
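
                        On a wlroots compositor like sway, for example, that per-output scaling is one line per screen (output names here are examples; swaymsg -t get_outputs lists yours):

                            # ~/.config/sway/config
                            output DP-1 scale 2    # HiDPI screen
                            output DP-2 scale 1    # LoDPI screen (1 is the default)

                        GNOME and KDE expose the same per-output scale in their display settings.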

                      2. 3

                        The year of the Linux desktop is nigh!

                        Or not. It’s crap like this that makes Linux a non-starter for most people.

                        1. 10

                          What ‘crap’ is it exactly that makes Linux a non-starter for ‘most people’?

                          Absolutely terrible driver support for hardware because Nvidia are a shitty company? Intel chips have free software drivers integrated into the kernel before the systems are even released, while Nvidia still do essentially nothing to help nouveau. That Nvidia are allowed to distribute their clearly-GPL-violating proprietary kernel modules is baffling to me.

                          Also, the author would probably have been completely fine if he had just bought the previous model of graphics card, for which nouveau works completely fine as far as I am aware. It’s hardly fair to compare Linux to an operating system like Mac, where the vendor has complete control over the hardware. Linux is expected to work perfectly with all new hardware that comes out, even with zero cooperation from the hardware vendors and with all development basically being done by volunteers. Nobody complains that Mac doesn’t work on some random laptop they tried installing it on, or on hardware it has never been tested with or developed for.

                          You know ‘most people’ don’t have a discrete graphics card, right? That most people don’t need 60Hz 4K displays, and certainly not multiple of them. That most people just use a web browser anyway, and so don’t actually care if the GIMP does DPI scaling properly.

                          This post can basically be summed up as ‘Nvidia drivers are shit’. That’s an issue, but… that’s Nvidia for you.

                          1. 6

                            What ‘crap’ is it exactly that makes Linux a non-starter for ‘most people’?

                            The fact that whether or not any given application will do anything usable or sane with a) text, b) the rest of the interface, or c) both on a monitor resolution that’s been common for years is a complete crapshoot based on which hodgepodge of squirrelly UI frameworks its author happened to personally prefer, consistency be damned?

                            The fact that there are even separate answers for text and everything else is a usability disaster, let alone the whole matrix of dependencies a user needs to dig into to discover why their music player does one thing, their browser another, and their text editor yet a third thing.

                            It’s precisely crap like this that keeps me on OS X, which is by no means perfect, but at least applications look and behave a couple of orders of magnitude more consistently. Life’s too short to dig through this dogshit so I don’t have to squint at a screen. This stuff is a solved problem everywhere else.

                            1. -2

                              The fact that whether or not any given application will do anything usable or sane with a) text, b) the rest of the interface, or c) both on a monitor resolution that’s been common for years is a complete crapshoot based on which hodgepodge of squirrelly UI frameworks its author happened to personally prefer, consistency be damned?

                              It’s not the resolution that’s the issue. It’s that people want pixels on their monitor to not correspond to actual pixels. I have no idea why, but they do. I think it’s mostly a marketing gimmick.

                              It’s precisely crap like this that keeps me on OS X, which is by no means perfect, but at least applications look and behave a couple of orders of magnitude more consistently. Life’s too short to dig through this dogshit so I don’t have to squint at a screen. This stuff is a solved problem everywhere else.

                              If you don’t want to squint, don’t buy a monitor with tiny pixels you have to squint at.

                              1. 7

                                It would be nice if you could avoid condescending comments like this. Consider that your opinion is just an opinion, so it’s not necessarily correct, or at the very least not the best option for every single person out there, especially when literally millions of people clearly consider HiDPI screens useful.

                                Also, if you’re hoping to convince anyone of the merits of Linux, this is emphatically not the way to do it.

                                (FWIW, the comment you originally replied to wasn’t constructive either.)

                                1. 4

                                  It’s not the resolution that’s the issue. It’s that people want pixels on their monitor to not correspond to actual pixels. I have no idea why, but they do. I think it’s mostly a marketing gimmick.

                                  “72 DPI ought to be enough for anybody”

                                  1. 2

                                    It’s not the resolution that’s the issue. It’s that people want pixels on their monitor to not correspond to actual pixels. I have no idea why, but they do.

                                    Because reconfiguring/recompiling literally everything to use larger fonts and UI component measurements is not feasible.

                                2. 2

                                  I don’t disagree with you. The previous generation of cards doesn’t offer dual HDMI 2.0/DP 1.2 connectors on the low end (notably the passive ones). If I had known about the problem with the drivers, I would have bought a Radeon card, even with the additional fan.

                                  1. 2

                                    To be fair, the article reads like he made every bad decision possible.

                                    • Accepting a faulty product? Check!
                                    • Buying Nvidia? Check!

                                    This sounds a lot like “I want to learn sailing, so I bought this bike, and now I realized it doesn’t even float!”.

                                  2. 3

                                    Not just newcomers. Crap like this made me move to Mac after >20 years of using mainly Linux.

                                      My gaming box is happily running Arch Linux, though. Steam is very good, and Proton is slowly widening compatibility to the point where I’m missing nothing.

                                    1. 5

                                      Crap like this made me move to Mac after >20 years of using mainly Linux.

                                      What crap specifically is it that made you move to Mac? Because I can name a lot of crap that made me move away from Mac back to Linux after experimenting with it by buying a Macbook Pro a few years ago, like being tied down to a terrible proprietary operating system missing all the useful features I want.

                                      1. 3

                                        I personally don’t like Mac either, but it is not a terrible OS; it just isn’t made for people like you and me. (For me the worst parts are Cmd-Tab, the keyboard layout, and the global menu. Typical Mac users love their Cmd-Tab and global menu, and didn’t complain about the keyboard until recently.)

                                      2. 1

                                        Arch Linux

                                        That might be your problem. If you like to have a working computer and want the latest software, try Fedora. If you like to have a working computer and don’t care about the latest software, use Ubuntu. Very few other distros care about ease of use, and if that’s why you left Linux, it’s likely because you made it hard on yourself.

                                        I spent hours configuring Arch and i3 to be just how I needed it, and it still wouldn’t work well. I installed Fedora and everything (even this one Ethernet-over-Type-C dongle someone told me simply could not work on Linux) just worked. It took about 5 minutes to set up my keybinds in KDE again.

                                        (Also interesting that I moved the other way from you: Mac -> Linux.)

                                        1. 4

                                          I had typed a longer reply here, but I’ll just say instead that I have a hard time believing some hardware worked on Fedora but not on Arch.

                                          1. 1

                                            Why?

                                            1. 2

                                              Hardware compatibility comes primarily from the kernel, and Arch’s kernel tracks vanilla quite closely and quickly.

                                              1. 2

                                                Ah, the problem likely wasn’t support but that I needed some config option somewhere to enable it, and I had no idea which package/service would even handle something like that. A ready-to-use distro already had that configured.

                                                A person who gladly moved to MacOS would probably appreciate something like that not involving configuration.

                                          2. 3

                                            Nah, I like Arch Linux. Its simplicity makes many of the deficiencies of desktop PCs worth it. Almost.

                                            I think it’s mostly that I felt Wayland promised to remove most of these obstacles, but we’re still waiting. Then work forced a Mac on me for a longer stretch, so I was sort of forced to experience it. It took some time, but I warmed up to it enough that I now find myself liking how things just work.

                                            Fedora and Ubuntu and Windows are not there yet, and I suspect they never will be. The problem is the sheer number of hardware and software combinations they have to support.

                                        2. 1

                                          I agree that Linux can be better, but…

                                          Scaling for the types of apps he mentioned (Java and other weird ones) won’t work on Windows either. Only MacOS got this right.
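
                                          (For what it’s worth, newer JDKs at least have a knob for this, though whether a given app honours it is hit and miss. A sketch, where someapp.jar is a stand-in:

                                              # JDK 9+ lets you force a UI scale for Swing/AWT apps
                                              java -Dsun.java2d.uiScale=2 -jar someapp.jar

                                          It works on Linux and, as far as I know, on Windows too.)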