1. 32

  2. 30

    Lesson (I learned over a decade ago): never buy NVIDIA in a laptop if you want to run Linux, have decent battery life, and not have the fan spinning all the time. Definitely never buy one in a hybrid setup.

    Sad to see it is still as true today.

    1. 10

      I explicitly bought an AMD GPU for my new desktop because I love their Linux support these days. Even though an NVIDIA card would have worked fine in a single-GPU desktop, I didn’t want them to get my money.

      1. 5

        With these prices? How’d you manage that? What GPU did you get?

        1. 8

          Ok, you caught me. I haven’t received it yet; I’ve been in a queue since December. But I got into the AMD queue for this reason.

          1. 3

            That’s cool, and I support your choices. That’s why I’m planning my workstation upgrade: something along the lines of a 5900X or similar, but I’m struggling to find a GPU anywhere.

          2. 2

            Depending on what you need, laptops with AMD APUs are available. I bought a ThinkPad T14 AMD last November or December. The amdgpu driver is really excellent.

            1. 1

              Yes, I also have a TUXEDO Pulse with a Ryzen 7 4800H, which is great. But we were talking about desktop GPUs: something I can replace my 980 Ti with. No APU can touch my 7-year-old GPU, but buying a replacement is not that easy.

              1. 2

                I also generally prefer a desktop/workstation for heavy lifting; they are much more powerful and much quieter with the right cooling. Since I need CUDA, I purchased an RTX 2060 Super last year when it was still 499 Euro. I bought a Radeon RX 580 around the same time for 216 Euro. I recently sold it since I didn’t need it anymore, and was offered around 400 Euro. It’s crazy what a 4-year-old GPU costs, even second-hand.

                I hope that the crypto mining craze is soon over and GPUs can be had at normal prices again. But I fear that it will only get worse with Chia also driving up the prices of SSDs.

                1. 1

                  That one looks nice; I still prefer the old XP line with a dedicated DisplayPort. Really disappointed that the USB-C port of the Pulse model doesn’t even support DP, so it doesn’t have any DP connection at all.

                  1. 1

                    Yeah, connectivity is kind of a big downside, especially if you mean it as a full desktop replacement - only one external monitor would be too little, I think. But I also have a full desktop so I only need this when I’m not at home.

          3. 2

            To contrast: I prefer laptops with discrete NVIDIA GPUs if the laptop is running either FreeBSD or HardenedBSD. NVIDIA’s drivers are stellar on FBSD/HBSD.

            1. 5

              They’re proprietary (where’s your usual security worries with that? :D) and they always lag behind Linux in enabling new things. It took them over a year to enable Vulkan IIRC. On Linux they’ve added some KMS integration quite some time ago already. On FreeBSD that’s still ongoing unofficial-but-with-some-official-help work. Generally it’s still all custom modesetting initiated by the Xorg DDX, and in that stack they still have that situation where using Xorg once makes the VT console not display anything at all until you reboot! That’s not stellar, that’s bad.

              On the AMD/Intel side, we haven’t been keeping up with kernel updates (so don’t rush to buy a 6900 XT from scalpers just yet lol) but we don’t have any weird problems like that, everything is basically identical to Linux, you can run git versions of Mesa to always keep up with graphics development or whatever. Something like this news? Yeah I can just build and run that code myself.

              1. 1

                As someone who advocates for the BSD license, I have zero issues with proprietary works. I think you’re operating on some outdated info. The console works fine for me when I switch between xorg and the virtual consoles. My Dell Precision 7540 shipped with the absolute latest NVIDIA Quadro discrete GPU. Worked fine on HardenedBSD out-of-the-box for me. The laptop even drives 2x4k monitors in addition to the laptop monitor itself.

                I don’t game, so Vulkan and OpenGL aren’t really important to me. Supporting multiple monitors is what’s important to me.

                So: FreeBSD’s great NVIDIA GPU support works wonderfully for me. Does everything I myself need and more.

                edit[0]: I should clarify that OpenGL works fine with NVIDIA. I do have ioquake3 installed on my laptop, and it works just fine. I mainly have ioquake3 installed to test if/how the exploit mitigations and security hardening techniques I’m developing impact applications.

                1. 1

                  > console works fine for me when I switch between xorg and the virtual consoles

                  I guess they fixed this in newer drivers, so the issue is limited to older GPUs then. They keep dropping support for those, so their users are stuck with old drivers, with no easy way to fix it because the driver is proprietary. In version 340 the issue is there; we even got a very recent confirmation in an unrelated issue.

                  > zero issues with proprietary works

                  I would expect you to at least consider them inherently less trustworthy because you can’t know what they really are without extensive reverse engineering :) But for me the most frustrating thing is the inability to just modify them.

                  1. 4

                    > I would expect you to at least consider them inherently less trustworthy because you can’t know what they really are without extensive reverse engineering

                    We’re getting off-topic here, but: I trust proprietary works the same amount I trust open source works: not much. The openness of the software does not correlate with the quality of the software. I work on proprietary code for my day job. I don’t deliberately add flaws to my code at work. I treat my code at work the same as my open source hobby code.

                    Remember OpenSSL’s Heartbleed? Remember how we all assumed that since it was open source, it was naturally secure and that there were many eyes on it? Heartbleed alone should teach us in general that software sucks, regardless of its status of open or closed source.

                  2. 1

                    Why do you need the absolute latest Nvidia Quadro for running two displays?

                    Wouldn’t even the most mediocre Intel iGPU do that job?

                    1. 1

                      I didn’t need it. The laptop shipped with it.

              2. 1

                I’ve been using a hybrid setup since 2016 and it works most of the time. Can’t deny the battery stuff, though.

              3. 6

                Other than MacBooks, are there any ultrabook-class (or relatively slim) laptops using a 16:10 aspect ratio? I can work with 16:9 on a 13” display and love my PBP and ThinkPads, but I prefer a bit more vertical space (hell, I prefer my iPad Pro’s 4:3 at that size).

                1. 5

                  The latest (Gen 9) ThinkPad X1 Carbon switches to 16:10.

                  1. 4

                    I use an old IBM ThinkPad with a 4:3 1024x768 display. It’s pretty great. (And I can look at it without squinting!) But it’s twenty years old, so I can’t watch videos on it or use modern web browsers.

                    That said, I’m happy that vendors are finally exploring aspect ratios other than 16:9, which is arguably the worst one, at least for laptops.

                    1. 1

                      > And I can look at it without squinting

                      I thought that Thinkpads from the 4:3 era predated LED backlights; isn’t it extremely dim? I’ve honestly been tempted to pick up an older 1600x1200 but the idea of going back to a CCFL just seems like a step too far.

                      1. 2

                        In a sunny room it’s pretty dim, but workable. I use light themes predominantly. Not sure what kind of backlight it has exactly. Definitely worse than my X1 Carbon 3rd gen.

                        But I’d personally take a dim screen over a high-dpi screen. The X1 sees little to no use because GUI scaling never works well and everything is too small without it.

                        1600x1200 might not be too bad, though, depending on the size.

                        1. 2

                          You can run an HDPI display at a lower resolution, and it generally looks amazing since the pixels are so small you see none of them (whereas pixels are all you see when running a 1024x768 ~13” native display).

                          1. 1

                            Well, you can only run it at half resolution, right? Doesn’t work out too well unless you have really high dpi. 1920x1080/2 is 960x540, which is a very low resolution for 13".

                            But I don’t know what you mean about pixels. I don’t “see” the pixels on any of my laptops, regardless of resolution. The only screen I’ve ever been able to visually see the pixels on was the pre-Retina iPhone.

                            1. 1

                              > Well, you can only run it at half resolution, right? Doesn’t work out too well unless you have really high dpi. 1920x1080/2 is 960x540, which is a very low resolution for 13”.

                              HDPI is not a resolution, it’s pixel density. I don’t think you’re limited to /2 scaling. I’ve certainly done that (e.g. a 4k display at 1080p), but I’ve also run a 4k display at 1440p and a 1080p display at 1280x720.

                              > But I don’t know what you mean about pixels. I don’t “see” the pixels on any of my laptops, regardless of resolution.

                              Strange, I see them on my partner’s 1366x768 IPS thinkpad x230 display. Maybe it’s one of those things that once you see, you can’t unsee it.

                              1. 2

                                > HDPI is not a resolution, it’s pixel density.

                                Yes, I know, that’s why I specified the size of the screen as well as the resolution.

                                > I’ve certainly done that (e.g. 4k display at 1080p), but also have run a 4k display at 1440p or a 1080p display at 1280x720.

                                Hm. A 1920x1080 display should not be able to properly run at 1280x720 unless it is a CRT. Because each logical pixel has an exact physical representation, the image won’t align correctly (and the screen will thus be blurry) unless the resolution is exactly half of the native resolution in each dimension (or a quarter, a sixteenth, etc.).
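                                The alignment constraint can be sketched in a few lines (a toy illustration; the function name and example resolutions are mine, not from any real display stack): a scaled mode only maps each logical pixel onto a whole block of physical pixels when the native resolution is an integer multiple of the target in both dimensions.

                                ```python
                                def scales_cleanly(native, target):
                                    """Return True if every logical pixel of `target` maps onto a
                                    whole number of physical pixels on a `native`-resolution panel."""
                                    native_w, native_h = native
                                    target_w, target_h = target
                                    return native_w % target_w == 0 and native_h % target_h == 0

                                # Half the native resolution in each dimension aligns perfectly:
                                print(scales_cleanly((1920, 1080), (960, 540)))   # True
                                # A 1.5x factor cannot align, so the panel must interpolate (blur):
                                print(scales_cleanly((1920, 1080), (1280, 720)))  # False
                                ```

                                The same check passes for 1080p on a 4k (3840x2160) panel but fails for 1440p on it, which matches the half/quarter rule above.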

                                > Strange, I see them on my partner’s 1366x768 IPS thinkpad x230 display. Maybe it’s one of those things that once you see, you can’t unsee it.

                                Yeah, strange! As I said, I saw them on the iPhone <4, so I sort of know what you’re talking about, but I’ve never seen them elsewhere.

                                Perhaps it really depends on some other factor and has little to do with dpi after all?

                          2. 2

                            My home desktop has a lovely 1600x1200 20” monitor that we pulled off the curb for free. It’s actually such a pleasure to use; too bad so much modern software is designed specifically for widescreen.

                      2. 4

                        The frame.work laptop is 3:2. (And the pricing seems not too bad either: it looks close to double the performance of my ’12 Retina MBP, nicely configured, for ~US$1300; but my MBP is still running fine for what I’m using it for.)

                        https://frame.work/products/laptop-diy-edition

                        1. 2

                          The XPS 13 has a 16:10 display now and even has an OLED option. Developer Edition (i.e. the same laptop with Ubuntu): https://www.dell.com/en-us/work/shop/dell-laptops-and-notebooks/new-xps-13-developer-edition/spd/xps-13-9310-laptop/ctox139w10p2c3000u

                          I’ve been eyeing it up for a while now myself.

                          1. 3

                            Note that, IIRC, OLED laptop displays are kind of weird on Linux because the traditional model of ‘just dim the backlight’ doesn’t work. I don’t know what the current state of the world is, but I definitely remember a friend complaining about it a year-ish ago. I personally wouldn’t go for it unless I could confirm that there was something working well with that exact model.

                            1. 4

                              Thanks for the heads up. I’m not seeing anything that definitively says it’s fixed now, but it does sound like there’s at least a workaround: https://github.com/udifuchs/icc-brightness

                              Hopefully by the time I actually decide to get one there will be proper support.

                              1. 1

                                Huh, I thought the display panels would translate DPCD brightness control into correct dimming. Looks like I might be right: e.g. for the Thinkpad X1 Extreme’s AMOLED panel there is now a quirk forcing DPCD usage.

                            2. 1

                              Pretty much everything in the Microsoft Surface line is 3:2.

                              1. 1

                                All the 3:2 displays I’ve seen have been glossy; do any exist in matte?

                              2. 1

                                X1 nano

                              3. 5

                                > My only hope is that Bill Harding (GitClear), who is working on improving the Linux touchpad experience, will eventually find a magic software tweak or something

                                Has any useful work come out of that crowdfund? Other than backporting gesture support to X11, which I wouldn’t really consider useful because it is against the goals of Wayland world domination :D

                                Generally everything should already be really good (if you use Wayland). One tweak that might seem like magic – depending on hardware – is this GTK patchset for scroll event interpolation. If you have a particularly nasty refresh rate mismatch between the touchpad and the screen (say 90 and 60 Hz) scrolling in GTK would feel jittery and this patch fixes that. Also you might like this libinput accel profile patch.
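                                The idea behind that kind of interpolation can be sketched roughly like this (a simplified illustration only; the real GTK patchset is more involved, and all names here are made up): instead of applying each touchpad event as it arrives, the scroll position is sampled at the display’s frame times by interpolating between the two nearest events.

                                ```python
                                def scroll_at(events, frame_time):
                                    """Interpolate the scroll position at `frame_time` from timestamped
                                    (time_ms, position) touchpad events. With a 90 Hz touchpad and a
                                    60 Hz display, taking the latest raw event per frame gives uneven
                                    steps; sampling on the frame clock smooths them out."""
                                    if frame_time <= events[0][0]:
                                        return events[0][1]
                                    if frame_time >= events[-1][0]:
                                        return events[-1][1]
                                    # Find the two events bracketing the frame timestamp.
                                    for (t0, p0), (t1, p1) in zip(events, events[1:]):
                                        if t0 <= frame_time <= t1:
                                            alpha = (frame_time - t0) / (t1 - t0)
                                            return p0 + alpha * (p1 - p0)

                                # Touchpad events ~11.1 ms apart (90 Hz), sampled at 60 Hz frame times:
                                events = [(0.0, 0.0), (11.1, 1.0), (22.2, 2.0)]
                                for frame in (0.0, 16.7, 33.3):
                                    print(round(scroll_at(events, frame), 3))
                                ```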

                                1. 3

                                  I’ve been running the same machine under Debian with Linux 5.10.19-1 for a year. Generally, I concur with the sentiment of the article. I’m running the proprietary NVIDIA driver, though, because I want to make use of the GPU. The downside is that I’m not even getting 2h of battery time in normal usage scenarios…

                                  Did anyone get proper suspend to work in Linux on this machine? I can either hibernate or do an ACPI S2 (software suspend, where a lot of stuff keeps running and the battery will be empty within hours). I used to have a ThinkPad X1 with the same issue; Lenovo fixed it with a BIOS update within 6 months, though. For this machine, I’ve been waiting for a good year now.

                                  1. 2

                                    I have the ThinkPad P1 Gen 3 with a 4K screen, Intel i9-10885H, and Quadro T2000 Max-Q. It’s basically the same laptop as in this review, but with a Quadro instead of a GeForce GPU. It’s maxed out across the board, and I even added a 2nd SSD. It feels great to use, but battery life is not great, and it requires a special charger with a special port. Doing just about anything with it makes it warm or hot, and the fans spool up quite loudly. This happens in both Linux and Windows.

                                    I also have a MacBook Air with an M1. It doesn’t even have a fan, hardly ever gets warm, and beats the ThinkPad on all but the GPU portion of Geekbench. It feels subjectively faster at almost everything, the battery lasts all day, it charges fast on standard USB-C (it doesn’t have to be a huge-wattage charger), and the laptop speakers sound better.

                                    I prefer the ThinkPad screen slightly, especially since it’s 2” larger. The ThinkPad keyboard is a bit nicer, but the MacBook Air keyboard is much improved over the abomination that Apple used to ship. My hatred for those keyboards was what got me on the ThinkPad train.

                                    I end up using the MacBook Air FAR FAR more, even though maybe I prefer Linux a little over macOS.

                                    When Apple ships a 14” or 16” MacBook Pro with >= 32GB of RAM, it’s going to be really hard to keep me using a ThinkPad for anything other than a bit of tinkering with Linux or OpenBSD (I also have an X1 Carbon Gen 7 for OpenBSD).

                                    1. 1

                                      I’m glad to hear that newer ThinkPads are still reasonably Linux-friendly… aside from the NVIDIA badness :(

                                      I have a T450s, and I’ve thought about upgrading since it doesn’t handle a lot of the modern JS-bloated web as well as I’d like, but I definitely don’t want to use Windows or macOS as my daily driver.

                                      1. 1

                                        I’m actually thinking of downgrading my X230 to an X220 or even X200 in order to get the better old-style keyboard. The modern, JS-bloated web can go cut bait.

                                      2. 1

                                        It IS quite nice now that they have ironed out most of the bugs. Pre-5.8.0 there were many bugs; I’m on 5.12.8 now and it works out of the box.