Or maybe rants get more engagement and people are ranting mostly to feel good.
Or maybe the entire structure of gamified discussion boards based on upvotes/downvotes incentivizes people to rant, the discussions being had are only proximately about their topics but ultimately about servicing the specific incentive structures that mediate the conversations in question, and as a community we face a moral imperative to critique how the incentive structures of communication mediums alter the communications they facilitate?
I’ve been on these systems since about 2005, and every time the communities start out with a bunch of people interested in some topic, the platform facilitates that topic, and then people’s interest shifts from the topic to the platform itself. Over time, every one of these platforms devolves into people playing the algorithm to maximize their “score”, so the things being said are not what the people want to say but what the scoring algorithm wants said. No amount of human moderation changes this evolution, and every democratic discussion board approaches critical insularity past some threshold of user count. In the seventeen or so years I’ve participated in these systems I’ve watched this cycle repeat itself on Digg, Reddit (where the phenomenon is widely recognized to exist), HN (where the phenomenon clearly exists but people are mostly unwilling to acknowledge it), and here (where the phenomenon is still nascent).
How many times are we going to build this discussion board format and see it devolve the same way before we as a community truly come to terms with the idea that maybe democracy as a system doesn’t actually produce meaningful, deep, and nuanced conversations, but instead always tends towards tribalism and conformance? Anyway the laptop looks neat.
What if the scores were simply never shown to anyone (including authors)? I think it would make a difference.
I think it would make a difference too! As a game designer, my immediate reaction to the question is “that sounds like a great thing to playtest”.
I wonder how severe the difference would be, whether it would make the experience more enjoyable or less enjoyable, whether the discussions get better or worse, etc. If it makes the discussions better but makes the population collapse so that they just go to somewhere that does show them the score, that’s probably bad. Would that happen? Honestly it’s too tough to predict without trying it, experiencing it with others, and talking to people when they use it. But definitely something worth considering (or asking if people have tried this before and what sort of results they found).
“Gamification” looks like it will happen without an algorithm as well. Consider “last comment in this topic”, “number of comments”, or comment notifications, which hit the OP tenfold if they post a rant compared to some fun weekend-project writeup.
Algos amplify that feedback loop, for sure, but as long as there is feedback, there will be guidance toward what is worth posting.
Removing this feature leaves us with RSS, which is also fine, but it strips out excellent discussions like the one you started :) I’d definitely feel bad if we couldn’t have that.
How many times are we going to build this discussion board format and see it devolve the same way before we as a community truly come to terms with the idea that maybe democracy as a system doesn’t actually produce meaningful, deep, and nuanced conversations, but instead always tends towards tribalism and conformance?
As long as you can make money with it.
Anyway, it may look neat, I just never had confidence in HP laptops’ reliability.
I’m not under the impression that lobste.rs makes any money or exists to make money; I think people build this structure because people like to create and participate in communities and the upvote/downvote structure is intuitively a great idea that you expect to work. In theory, it’s a great structure! In practice … well, history seems to have a way of repeating itself.
and yeah I don’t mess with HP myself. For a non-Apple laptop I’ve actually found the Razer Blade to exceed my expectations, although it’s not cheap. Haven’t run Linux on one myself.
Which major PC manufacturer do you have confidence in? I certainly have enough experience with Dell to know they’re not to be trusted to be reliable either.
Lenovo would be one. But that’s my experience, personal and close.
I don’t know much about Lenovo’s hardware; it’s possible that it’s universally fairly reliable. However, I’d have a hard time buying anything from the kind of company that would install malicious TLS root certificates onto their machines.
Yep, but they all actually do similar things, especially on consumer devices.
ThinkPads as developer machines are awesome, even delivered clean from the factory. In fact, you can just get them with Linux.
This is an important subject, and I’d love to have a high quality discussion about this, where carefully considered long form positions and critiques are posted over the course of weeks. I don’t think that Lobste.rs is the place for that. If anybody has relevant links for this subject area, please post!
It’s why I joined Something Awful a few years back.
A lot has changed since its heyday – it depends on the subforum of course, but the mods are pretty vigilant, and a quick look down the “Leper’s Colony” shows probations and bans for racism, sexism, etc. (things that are rife on both Hacker News and Reddit). It has consistent threads with regulars, most of the topic-based threads have been going for 6+ years, and while some debates get rehashed, a lot of what people post is reasonably new. Whatever interest you have, there is a thread for it. And of course, there are no upvotes or updoots or karma system – nobody there cares for it anyway, because everyone knows that if you post, that post is automatically bad :). Entry to post is a one-time payment of $10, which isn’t too costly but does fund the site, and it means that getting banned has actually eaten into the person’s wallet, which disincentivizes shitty behaviour. The primary userbase these days is in the 30–60 age range, and the age of the site means that most of the “novel” jokes were already done to death 15 years ago.
The only problem with Something Awful is, if you see me, you’re reading my posts! And whether or not you feel that punishment worth it is only up to you :)
There are threads dedicated to poking gleeful fun at Hacker News, and indeed Reddit, for not only the things you mentioned but also their sheer banality and the complete and utter tripe produced en masse by both. (Not to mention Reddit’s particularly unsavory history, e.g. most of the users of r/teenagers being outed as predators, or Reddit’s long love dance with misogynists – a post like this categorically would not survive on Something Awful – and Hacker News’ long love of VC bullshit and neoliberalism, etc.)
No internet upvotes required: get any group of otherwise unfamiliar software nerds together in person (say, at $JOB) and say something about laptop hardware, and all these takes come out! Although, they’ll be more self-moderated for polite company.
This is one I keep seeing and don’t get. Especially the “I need 16 GB just to run Chrome” people. I’m using 8 GB for development, running four SQL servers, VS, and browsers at the same time – and it works just fine. You really need a specific use case to require more. It’s fine if they have such a use case, but it’s really not a baseline dev requirement.
They probably take a quick look at how much memory Chrome appears to be using in the VIRT column and then decide that whatever their RAM is, they don’t have enough to run even just the browser.
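For anyone curious why the VIRT column overstates things: it counts all mapped address space (shared libraries, reserved-but-untouched allocations), while RES/VmRSS counts pages actually held in RAM. A minimal Linux-only Python sketch (it assumes the standard `/proc/<pid>/status` layout described in proc(5)):

```python
# Compare a process's virtual size (what top's VIRT column shows)
# against its resident set (top's RES column), via /proc. Linux only.

def mem_kb(field, pid="self"):
    """Read a VmSize/VmRSS-style field from /proc/<pid>/status, in kB."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1])  # values are reported in kB
    raise KeyError(field)

if __name__ == "__main__":
    virt = mem_kb("VmSize")  # total mapped address space ("VIRT")
    rss = mem_kb("VmRSS")    # pages actually resident in RAM ("RES")
    print(f"VIRT: {virt} kB, RSS: {rss} kB")
    # VIRT is typically much larger than RSS, because mapped files and
    # untouched allocations cost address space, not physical memory.
```

Point it at a Chrome process ID instead of `"self"` and the gap between the two numbers is usually dramatic.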
You know, you might just be right. Btw, is windows also showing RAM usage in a similar manner? I never bothered to look. (I’m one of the people the author didn’t mention: running Linux because I like it, it does the things I want it to do, and I don’t bother anyone with it if they don’t ask me about it. I will talk a lot about it, as I talk about all my favorite things, but I never judge people for using Windows, and for most people I’d actually suggest Macs over Linux.)
is windows also showing RAM usage in a similar manner?
No, the most visible indicator (the first tab of the Windows 10 Task Manager) shows the physical pages taken by the process.
The old Task Manager and the Details tab are configurable, but I believe they still show the physical RAM taken, including I/O caches.
Assuming that most people looking at this use Resource Monitor (the older Task Manager doesn’t provide as much detail, and there are so many third-party tools that there’s no unified way to discuss them), I think the culprit is more Windows’ memory-use strategy than the information displayed about virtual memory. Windows tends to run with very little “free” memory (currently my Windows system shows ~100 MB of ~12 GB as “free”) and most available memory in “standby” (currently ~6 GB for me).
In theory, standby memory is supposed to be just cached information that’s available instantaneously to processes requiring more “in use” memory. My experience is that the only time I actually benefit from clearing standby memory is when playing games; it seems as though Windows does hesitate to free standby memory for use by currently active applications, though I’ve never cared enough to investigate this in detail. Microsoft’s position is that it makes sense to move as much data into memory as the hardware can support. That’s reasonable, though I’m dubious that the way Windows does it makes the most sense: it certainly seems that both of my Linux laptops are snappier in general, though obviously memory use is only one component of that. There’s a fairly short SuperUser thread on standby memory in Windows addressing this.
I always pick a machine with the maximum amount of memory that is somewhat affordable at the time of buying, because I have found that most of the time the internal memory is the limiting factor for longevity. The more RAM it has, the longer I will be able to use it. Running newer stuff on an older CPU has never been a big problem, but (non-upgradable) RAM is.
Case in point: I still use a laptop from 2015 on a daily basis without any problem. But that is because I bought it with 16 GB of RAM, which sounded like an absurd amount back then.
Linking the debug version of libxul.so (most of Firefox) requires more than 16GB :D Otherwise, yeah, most things I do are not RAM heavy at all. CPU speed is much more useful generally.
I’ve experienced multiple times that Windows gives me a popup about running out of RAM (16 GB) and then starts to force programs to close, usually because of Chrome. I’ve never run into this issue on Linux.
What version of Windows are you running? I can’t recall ever coming across such a popup for memory proper, though they used to be very common for running out of virtual memory on my old laptop from circa 2007. My one Windows laptop is a ~5-year-old Dell running Win 10 with 12 GB of memory (that said, I killed a lot of Windows behaviors I regard as resource hogs when I bought it, so that might have helped), though anything really memory-intensive I shove to my server…
They likely have the swap file disabled. If so, you can easily encounter that error using memory-hungry programs like Chrome, which kind of assume you have a swap file to move cold objects to when memory pressure starts increasing, rather than going lean themselves.
If you want to compile it often then probably not. But I get to do it from time to time and 8GB + some swap is just fine. It completes during a lunch break.
I have 32GB on my laptop and routinely run out. (Though less often than I run out on my 16GB work computer.)
A colleague used to have an 8GB machine. He kept having to reboot it because compiles would cause it to thrash the swap, freezing the system completely and effectively permanently.
I have my system set up with earlyoom to aggressively kill compiler processes when it runs low on RAM. That’s stupid, but less stupid than the alternative.
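For reference, a setup like the one described above can be expressed as an earlyoom config fragment. This is a sketch, not my exact setup: the flag names come from earlyoom’s README (`-m` = minimum available RAM percent, `-s` = minimum swap percent, `--prefer` = regex of processes to kill first), while the Debian-style `/etc/default/earlyoom` path and the particular compiler regex are illustrative assumptions – check your distro and earlyoom version.

```shell
# /etc/default/earlyoom (Debian/Ubuntu-style sketch; path and flags
# may differ on your distro or earlyoom version).
# Start acting when available RAM falls below 5% and swap below 50%,
# and prefer sacrificing compiler/linker processes over everything else.
EARLYOOM_ARGS="-m 5 -s 50 --prefer '(^|/)(cc1plus|rustc|ld|cargo)$'"
```

The `--prefer` regex is what makes the compiler the designated victim instead of, say, your editor or window manager.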
Anyway, I’m happy to see that people found laptops and OS combos they enjoy. The most important part of the article is understanding that choosing different is OK.
Still, what you see every time one of these laptops reaches the news is a gazillion complaints.
I think I understand why this happens. There are a lot of Linux users who want a powerful, Linux-friendly laptop with a build quality that lives in the same galaxy as Apple. Since there isn’t any manufacturer working in this space (though that seems to be changing hopefully!), they judge each new non-Apple laptop as a potential “macbook but with Linux”, not as what the manufacturer is shipping (another craptop). It’s a bunch of would-be consumers with nothing to buy, judging each new product in terms of “could I force this into being the laptop I want?”
I might be projecting myself here, because I’m guilty of this too. :)
I would add on that not only do people do this to non-Apple laptops, they’ve historically done it to Apple laptops.
The fact that macOS is Unix-y enough for a certain type of programmer to feel comfortable with is basically an accident of Apple’s corporate history. The fact that Apple’s laptops had decent specs is historically due to Apple going after a certain segment of the “creative professional” market — people who work with video, audio, and graphics for a living.
But a lot of programmers saw the combination of Unix-y OS and decent specs and interpreted it as “I, a programmer, am the sole target customer for this device”. A lot of anger at Apple was then built on feeling that they had “betrayed” or even “abandoned” this “target market” with changes that make more sense once you stop assuming that programmers were the sole target market of the laptops.
I mostly agree with your points, but a lot of complaints also came from the 2016–2020 period (foreshadowed by the beautiful but crappy trashcan Mac Pro). I don’t think the butterfly keyboard or the terrible overheating Intel CPUs with loud fans were good for any market. I have been a Mac user since 2007 and I even left Macs for a brief while at the end of that period, because the hardware was just terrible and macOS was getting quite buggy.
It seems like the 2017 Pro meeting was a turning point, when Apple realized they needed to change course. Of course, since a product pipeline cannot be changed overnight, improvements trickled in slowly (first by eliminating the terrible butterfly keyboard). But now the Mac is in such a good state that a lot of both creative professionals and developers like it.
Put differently, the different target markets of the Mac also have a lot in common (fast, functional hardware and a stable macOS).
But then for the times you want to work away from the desk you need an extra laptop. Not everyone needs that, of course, but if you want to work remotely away from home, or if you do on-call, then a laptop is a requirement.
Can’t speak for the other poster, but I think power distribution in the US would qualify as risky, and not only in rural areas. Consider that even the Chicago suburbs don’t have buried power lines, and every summer there’s a blackout due to AC surges. I’d naively expect at least 4 or 5 (brief) blackouts per year.
i get that, but it’s also not a very productive framework for discussion. i like my laptop because i work remotely – 16GB is personally enough for me to do anything i want from my living room, local coffee shop, on the road, etc. i do junior full-stack work, so that’s likely why i can get away with it. obviously, DS types and other power hungry development environments are better off with a workhorse workstation. it’s my goal to settle down somewhere and build one eventually, but it’s just not on the cards right now; i’m moving around quite a bit!
my solution? my work laptop is a work laptop – that’s it. my personal laptop is my personal laptop – that’s it. my raspberry pi is for one-off experiments and self-hosted stuff – that’s it. in the past, i’ve used a single laptop for everything, and frequently found it working way too hard. i even tried out mighty for a while to see if that helped (hint: only a little). separation of concerns fixed it for me! obviously, this only works if your company supplies a laptop, but i would go as far as to say that even if they don’t, it’s a good alternative solution, and might end up cheaper.
my personal laptop is a thinkpad i found whilst trash-hopping in the bins of the mathematics building at my uni. my raspberry pi was a christmas gift, and my work laptop was supplied to me. i spend most of my money on software, not really on the hardware.
edit: it’s also hard, since i have to keep things synced up. tmux and chezmoi are the only reasonable way i’ve been able to manage!
Unfortunately I don’t think this is well known to most programmers. Recently a fairly visible blogger posted his workstation setup and the screen was positioned such that he would have to look downward just like with a laptop. It baffled many that someone who is clearly a skilled programmer could be so uninformed on proper working ergonomics and the disastrous effects it can have on one’s posture and long-term health.
Anyone who regularly sits at a desk for an extended period of time should be using an eye-level monitor. The logical consequence of that is that laptop screens should only be used sparingly or in exceptional circumstances. In that case, it’s not really necessary to have a laptop as your daily driver.
After many years of using computers, I don’t see big harm in using a slightly tilted display. If anything, regular breaks and stretches/exercises make a lot more difference, especially in the long term.
If you check out jcs’ setup more carefully, you’ll see that the top line is not that much lower than the “default” eye line, so the ergonomics there work just fine.
(I switched to a tablet PC; the screen is also tilted a bit, but raised closer to eye level. Perhaps the “fairly visible blogger’s” setup was staged for the photo and might be raised higher normally.)
That assumes you’re using the laptop’s built-in keyboard and screen all day long. I have my laptop hooked up to a big external monitor and an ergonomic keyboard. The laptop screen acts as a second monitor and I do all my work on the big monitor which is at a comfortable eye level.
On most days it has the exact same ergonomics as a desktop machine. But then when I occasionally want to carry my work environment somewhere else, I just unplug the laptop and I’m good to go. That ability, plus the fact that the laptop is completely silent unless I’m doing something highly CPU-intensive, is well worth the loss of raw horsepower to me.
I bought a ThinkStation P330 2.5y ago and it is still my best computing purchase. Once my X220 dies, if ever, then I will go for a second ThinkStation.
A few years ago I bought a used ThinkCentre M92, ultra-small form factor. I replaced the hard drive with a cheap SSD and threw in extra RAM and a 4K screen. Great setup. I could work very comfortably and do anything I want to do on a desktop, including development or watching 4K videos. I used that setup for five years and have recently changed to a two-year-old iMac with an Intel processor so I can smoothly run Linux on it.
There is no way I am suffering through laptop usage. I see laptops as something suited for sales people, car repair, construction workers and that sort of thing. For a person sitting a whole day in front of the screen… No way.
I don’t get the need for people to be able to use their computers in a zillion places. Why? What’s so critical about it? How many people actually carried their own portable office, versus just doing their work at their desks, before the advent of the personal computer?
We already carry a small computer in our pocket at all times that covers a lot of personal work needs, such as email, chat, checking webpages, conference calls, etc. Is it really that critical to have a laptop?
I don’t get the need for people to be able to use their computers in a zillion places. Why? What’s so critical about it?
I work at/in:
The office
Home office
Living room
The first two are absolutely essential, the third is because if I want to do some hobbyist computing, it’s not nice if I disappear in the home office. Plus my wife and I sometimes both work at home.
Having three different workstations would be annoying. Not everything is on Dropbox, so I’d have to pass files between machines. I like fast machines, so I’d be upgrading three workstations frequently.
Instead, I just use a single MacBook with an M1 Pro. Performance-wise it’s somewhere between a Ryzen 5900X and 5950X. For some things I care about for work (matrix multiplication), it’s even much faster. We have a Thunderbolt Dock, 4k screen, keyboard and trackpad at each of these desks, so I plug in a single Thunderbolt cable and have my full working environment there. When I need to do heavy GPU training, I SSH into a work machine, but at least I don’t have a terribly noisy NVIDIA card next to me on or under the desk.
The first two are absolutely essential, the third is because if I want to do some hobbyist computing, it’s not nice if I disappear in the home office.
I believe this is the crux of it. It boils down to personal preference. There is no way I am subjecting myself to the horrible experience of using a laptop just because it is not nice to disappear to the office. If anything, it raises the barrier to being in front of a screen.
Your last paragraph is exactly my thoughts. Having a workstation is a great way to reduce lazy habits, IMNSHO. The mobility that comes with a laptop is ultimately a recipe for neck pain, strain in the arms and hands, and poor posture and habits.
I have 3 places in which I use my computer (a laptop). In two of them, I connect it to an external monitor, mouse and keyboard, and I do my best to optimize ergonomics.
But the fact that I can take my computer with me and use it almost anywhere, is a huge bonus.
Hot take here, but I find Linux on the desktop to be insufferable. I want it to be good, I really do, but it just isn’t. I feel like windows has no place on a server, but for my desktop? I’ve been quite happy with Win11, Terminal app, and WSL2. I live in the best parts of Linux, and I can actually use my computer with the best parts of Windows.
Two main uses for my computer are unfortunately filled with gotchas on Linux: MS Teams and Zoom. They’re both terrible.
MacOS is almost the best of both worlds too, but you’re locked into their hardware. Which was a liability until recently (M1).
I won’t dive into a certain yellow website’s discussion about this (even though it’s what prompted the article itself), so I don’t know if people there made the same points as me.
As the “target audience” I roughly understand what I want from the machine, and the specs look totally fine. After all these years I know for sure that if I can’t “fit” into 16 GB of RAM, then even 64 GB certainly won’t be enough, so 16 is just fine for me. Same for the rest of the specs.
My biggest disappointment with this laptop is actually its ergonomics. The half-height function-key row is bad. Arrow keys that are not T-shaped are bad. The whole trend with abysmal giant touchpads is bad too - make it 0.75 or even 0.5 of the current size and the main input device - the keyboard - will be bigger and better.
A touchpad shifted to the left side is bad - the keyboard doesn’t even have a numpad, yet they shift it (also ignoring the existence of left-handed people).
As an even more subjective thing, I would like to see a 4:3 or 3:2 display on a laptop aimed at developers; I think that makes sense.
The whole trend with abysmal giant touchpad is bad … the main input device - keyboard - [should] be bigger and better
I think you hit on one of the primary problems with laptop design here: many of us regard the touchpad as an annoying distraction from main input but what seems like the vast majority of the market look at the touchpad as primary input. It seems that the laptop designers either don’t understand that a small but generally wealthy segment of the market has this attitude or simply don’t care.
Walk into any developer-focused conference, even one focused on Linux or Windows development, and 95% of the laptops will be MacBooks. And the rest will be clunky corporate machines which probably weren’t the owner’s choice. Dell and others have offered Linux as a supported option for years, and they’ve had few takers, so I don’t think this is likely to change anytime soon.
I think HN comment rants are just a long-form “dislike button”. The author here is right that there are valid complaints; when a giant OEM releases this underwhelming device with last year’s tech and marketing terminology aimed at non-devs, it’s no wonder people feel knee-jerky about a device that seems to have had no input from what many devs actually want. The laptop space, unlike desktop, is not so friendly toward DIY, so people feel they have to choose the least bad option because they can’t build exactly what they want. Maybe that’s the crux: a dev-oriented device needs to have many customizable specs up to the highest end (since dev salaries can afford nice things), if not straight-up omit some parts so you can pick your own aftermarket RAM/HDD/etc. Heck, I think the Linux crowd would rather laptops didn’t come with any pre-installed OS at all, instead of paying for something you’ll wipe immediately out of distrust.
It’s just as if people (even if they’re all developers) have wildly differing needs. For example, if you asked me what I like or dislike about my current laptop, the most important factor would be my current job and what I am actually doing, and I guess this has completely changed at least three times in the last 10 years. (Pretty stationary with a large ThinkPad until 2013, then very mobile with an X230 until 2017, then I had a severely underpowered i5 ThinkPad, and now since 2021 it’s a Dell with a worse keyboard, no TrackPoint, and perma-WFH - so I don’t actually mind.)
And I’m saying this as someone who doesn’t usually even look for the “perfect” machine, but if it has one knockout criterion, I feel entitled to rant. Especially as many of us are stuck with the one brand our employer lets us choose, and sometimes we can’t even test it out beforehand. If you HAVE to use it for 3 years (kinda the standard here, sometimes 4, rarely 2), it had better be near perfect.
Lenovo W520 user here; two laptops, one with 16 and the other with 32 GB of RAM.
Both have the same second-generation Intel i7 CPU. I think they were made around 2012; I bought them for about 120 bucks each a few years back and upgraded the memory + SSD (so that was additional $$).
I use laptops because I need to do video/audio conference calls, develop, and move between sitting and standing positions throughout the week. Otherwise, during months where I work a lot of hours, my elbows, shoulders, and wrists start hurting, so periodically relocating where I sit solves that problem for me, which is why I prefer laptops.
What is not ideal about laptops is that you are looking down at the screen, and that reduces the ability to focus/concentrate (you need to be looking slightly up).
Where possible, I utilize XRDP and connect to an actual faster workstation with an RDP client.
But the only system where I can get XRDP to work with audio is FreeBSD; however, FreeBSD does not support Android development, so I cannot host my Android dev environment there (it works well for backend/Ansible work, however).
What made me finally pull the trigger and upgrade laptops is Android + React Native + Expo-AV (the RN audio/video wrapper). Every iteration of the fix-test cycle (especially when a bug shows up only in release mode) now consistently takes 6 minutes on my Lenovo.
It’s not the Java compiler that’s doing it; it’s RN + Expo-AV packaging resources, shrinking the JS code, etc.
This happened over time; in 2014–15, when I started using React Native, it was bearable. But now it is just not workable. As a side note, the Expo folks responsible for the dev/build setup change their minds every 6 months, so upgrading to a new release of Expo-AV is basically a very stressful and painful tooth-pulling kind of experience that requires a slow try-error-retry cycle (taking 6 minutes each time). (If you do not have to use RN Expo, do not use it…)
What is not ideal about laptops is that you are looking down at the screen, and that reduces the ability to focus/concentrate (you need to be looking slightly up).
For proper ergonomics you should not be looking up. A laptop screen likely doesn’t qualify since it is too low, but you definitely should not be looking up.
That’s quite an old system, and yes, you can actually notice the compile times. Personally I have a slightly different problem: an i7-4790k from about 2014-2015. That’s still fast, it crunches stuff normally, no problems. On par with today’s laptops. So basically I should be good. But I know it could go quite a bit faster, and I want that performance. But I can’t justify it :(
Yeah, I think I like the screen/keyboard combo on those laptops so as long as the CPU speed was sufficient, I was ok.
In general, I like the approach to laptop design that frame.work folks are taking.
With their design I am supposed to be able to keep the screen/keyboard and many other laptop components, while they enable replacing the CPU module (including moving from one generation to another).
or maybe the entire structure of gamified discussion boards based on upvotes/downvotes incentivizes people to rant and the discussions being had are actually only proximately about their topics but ultimately about servicing the specific incentive structures that mediates the conversations in question and as a community we face a moral imperative to critique how the incentive structures of communications mediums alter the communications they facilitate?
I’ve been on these systems since about 2005 or so, and every time the communities start out with a bunch of people interested in some topic, the platform facilitates that topic, and people’s interest shifts from the topic to the platform itself. Over time, every one of these platforms devolves into people playing the algorithm to maximize their “score”, and so the things being said are not what the people want to be said, they’re the thing the scoring algorithm wants to be said. No amount of human moderation changes this evolution and every democratic discussion board approaches critical insularity past some threshold of user size. In the past seventeen or so years I’ve participated in these systems I’ve watched this cycle repeat itself on digg, reddit (where the phenomenon is largely recognized to exist), HN (where the phenomenon clearly exists but people are mostly unwilling to acknowledge it), and here (where the phenomenon is nascent but in its early stages).
How many times are we going to build this discussion board format and see it devolve the same way before we as a community truly come to terms with the idea that maybe democracy as a system doesn’t actually produce meaningful, deep, and nuanced conversations, but instead always tends towards tribalism and conformance? Anyway the laptop looks neat.
What if the scores were simply never shown to anyone (including authors)? I think it would make a difference.
I think it would make a difference too! As a game designer, my immediate reaction to the question is “that sounds like a great thing to playtest”.
I wonder how severe the difference would be, whether it would make the experience more enjoyable or less enjoyable, whether the discussions get better or worse, etc. If it makes the discussions better but makes the population collapse so that they just go to somewhere that does show them the score, that’s probably bad. Would that happen? Honestly it’s too tough to predict without trying it, experiencing it with others, and talking to people when they use it. But definitely something worth considering (or asking if people have tried this before and what sort of results they found).
“Gamification” looks like it will happen without an algorithm as well. Consider “last comment on this topic”, “number of comments”, or comment notifications, which hit the OP tenfold if they post a rant compared to some fun weekend project writeup.
Algos amplify that feedback loop, for sure, but as long as there is feedback, there will be guidance towards what is worth posting.
Removing this feature leaves us with RSS, which is also fine, but it strips out excellent discussions like the one you started :) I’d definitely feel bad if we couldn’t have that.
As long as you can make money with it.
Anyway, it may look neat; I’ve just never had confidence in the reliability of HP laptops.
I’m not under the impression that lobste.rs makes any money or exists to make money; I think people build this structure because people like to create and participate in communities and the upvote/downvote structure is intuitively a great idea that you expect to work. In theory, it’s a great structure! In practice … well, history seems to have a way of repeating itself.
and yeah I don’t mess with HP myself. For a non-apple laptop I’ve actually found the Razer Blade to exceed my expectations, although it’s not cheap. Haven’t run Linux on one myself.
Which major PC manufacturer do you have confidence in? I certainly have enough experience with Dell to know they’re not to be trusted to be reliable either.
Lenovo would be one. But that’s just my own close personal experience.
I don’t know much about Lenovo’s hardware, it’s possible that it’s universally fairly reliable. However, I’d have a hard time buying anything from the kind of company which would install malicious TLS root certificates onto their machines.
Yep, but they all actually do similar things, especially on consumer devices. ThinkPads as developer machines are awesome, even delivered clean from the factory. In fact, you can just get them with Linux.
This is an important subject, and I’d love to have a high quality discussion about this, where carefully considered long form positions and critiques are posted over the course of weeks. I don’t think that Lobste.rs is the place for that. If anybody has relevant links for this subject area, please post!
It’s why I joined Something Awful a few years back.
A lot has changed since its heyday – it depends on the subforum of course, but the mods are pretty vigilant, and just taking a look down the “leper’s colony” shows probations and bans for racism, sexism, etc. (things that are rife on both Hacker News and Reddit). It has consistent threads with regulars, most of the topic-based threads have been going for 6+ years, and while debates sometimes get rehashed, a lot of what people post is reasonably new. Whatever interest you have, there is a thread for it. And of course, there are no upvotes or updoots or karma system – nobody there cares for that anyway, because everyone knows if you post then that post is automatically bad :). Entry to post is a one-time payment of $10, which isn’t too costly but it does fund the site, and it means that getting banned has actually eaten into the person’s wallet, which disincentivizes shitty behaviour. The primary userbase these days is in the 30–60 age range, and the age of the site means that most of the “novel” jokes were already done to death 15 years ago.
The only problem with Something Awful is, if you see me, you’re reading my posts! And whether or not you feel that punishment worth it is only up to you :)
There are threads dedicated to poking gleeful fun at Hacker News, and indeed Reddit, for not only the things you mentioned, but also the sheer banality of them, and the complete and utter tripe that is produced en masse by both (Not to mention, Reddit’s particularly unsavory history – e.g. most of the users of r/teenagers being outed as predators, Reddit’s long love dance with misogynists (A post like this categorically would not survive on Something Awful), Hacker News’ long love of VC bullshit and neoliberalism, etc.)
No internet upvotes required: get any group of otherwise unfamiliar software nerds together in person (say, at $JOB) and say something about laptop hardware, and all these takes come out! Although, they’ll be more self-moderated for polite company.
This is one I keep seeing and don’t get. Especially the “I need 16 GB just to run Chrome” people. I’m using 8 GB for development, running 4x SQL servers, VS, and browsers at the same time, and it works just fine. You really need a specific use case to require more. It’s fine if they have such a use case, but it’s really not a baseline dev requirement.
They probably take a quick look at how much memory Chrome is taking in the VIRT column and then decide that whatever their RAM is, it isn’t enough to run even just the browser.
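The gap between the VIRT column and actual resident memory is easy to demonstrate. A minimal sketch (Linux-only, since it reads `/proc/self/statm`; the 1 GiB mapping and 4 KiB page size are just illustrative assumptions) that reserves a big chunk of address space without ever touching it:

```python
import mmap

def mem_pages():
    # First two fields of /proc/self/statm: total virtual size and
    # resident set size, both measured in pages (usually 4 KiB each).
    with open("/proc/self/statm") as f:
        size, resident = f.read().split()[:2]
    return int(size), int(resident)

virt_before, rss_before = mem_pages()
m = mmap.mmap(-1, 1 << 30)  # reserve 1 GiB, anonymous, never written to
virt_after, rss_after = mem_pages()

# The virtual size jumps by roughly 1 GiB worth of pages, but resident
# memory barely moves, because no page of the mapping has been touched.
print("VIRT delta (pages):", virt_after - virt_before)
print("RSS delta (pages):", rss_after - rss_before)
m.close()
```

Any tool that reports the first number makes a browser (which reserves huge address ranges up front) look far hungrier than it really is; the resident set is the column worth watching.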
You know, you might just be right. Btw, does Windows also show RAM usage in a similar manner? I never bothered to look. (I’m one of the people the author didn’t mention, which is: running Linux because I like it, it does the things I want it to do, and I don’t bother anyone with it if they don’t ask me about it. I will talk a lot about it, I talk about all my favorite things, but I never judge people for using Windows, and for most people I’d actually suggest Macs over Linux.)
No, the most visible indicator (first tab of the windows 10 task manager) shows physical pages taken by the process. The old task manager and the details tab are configurable but I believe still show the physical ram taken, including IO caches.
Assuming that most people looking at this use Resource Monitor (the older Task Manager doesn’t provide as much detail, and there are so many 3rd-party tools that there’s no unified way to discuss them), I think the culprit is more Windows’ memory-use strategy than the information displayed about virtual memory. Windows tends to run with very little “free” memory (currently my Windows system shows ~100 MB of ~12 GB memory as “free”) and most available memory in “standby” (currently ~6 GB for me).
In theory, standby memory is supposed to just be cached information that’s available instantaneously for processes requiring more “in use” memory. My experience is that the only time I actually benefit from clearing standby memory is for playing games - it seems as though Windows does hesitate to free standby for use by currently active applications though I’ve never cared enough to investigate this in detail. Microsoft’s position is that it makes sense to move as much data to memory as the hardware can support. This makes sense though I’m dubious that the way Windows does this makes the most sense: it certainly seems that both of my Linux laptops are snappier in general though obviously memory use is only one component of that. There’s a fairly short SuperUser thread on standby memory in Windows addressing this.
I always pick a machine with the maximum amount of memory that is somewhat affordable at the time of buying, because I have found that most of the time the internal memory is the limiting factor for longevity. The more RAM it has, the longer I will be able to use it. Running newer stuff on an older CPU has never been a big problem, but (non-upgradable) RAM is.
Case in point: I still use a laptop from 2015 on a daily basis without any problem. But that is because I bought it with 16 GB of RAM, which back then sounded like an absurd amount.
Linking the debug version of libxul.so (most of Firefox) requires more than 16GB :D Otherwise, yeah, most things I do are not RAM heavy at all. CPU speed is much more useful generally.
I’ve experienced multiple times that Windows gives me a popup about running out of RAM (16GB), then starts to force programs to close, usually because of Chrome. I’ve never run into this issue on Linux.
What version of Windows are you running? I can’t recall ever coming across such a popup for memory proper, though they used to be very common regarding running out of virtual memory on my old laptop from circa 2007. My one Windows laptop is a ~5-year-old Dell running Win 10 with 12 GB of memory (that said, I killed a lot of Windows behaviors I regard as resource hogs when I bought it, so that might have helped), though anything really memory-intensive I shove to my server…
They likely have the swap file disabled. If so, then you can easily encounter that error using memory hungry programs like Chrome. They kind of assume you have a swap file to move cold objects to when memory pressure starts increasing rather than going lean themselves.
Thanks, I hadn’t ever considered disabling my swap file…
Windows 10 LTSC 2019. If it never set up a swap file, then I cry on behalf of the entire enterprise world.
If you want to compile LLVM, 8 GB doesn’t cut it. Otherwise it’s plenty for almost every task you might throw at a machine.
If you want to compile it often then probably not. But I get to do it from time to time and 8GB + some swap is just fine. It completes during a lunch break.
I have 32GB on my laptop and routinely run out. (Though less often than I run out on my 16GB work computer.)
A colleague used to have an 8GB machine. He kept having to reboot it because compiles would cause it to thrash the swap, freezing the system completely and effectively permanently.
I have my system set up with earlyoom to aggressively kill compiler processes when it runs low on RAM. That’s stupid, but less stupid than the alternative.
This requirement is real.
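For anyone curious, an earlyoom setup along these lines looks roughly like the fragment below. The flags are real earlyoom options, but the thresholds and process regexes here are illustrative, not an exact copy of my config:

```shell
# /etc/default/earlyoom  (the Debian/Ubuntu packaging reads EARLYOOM_ARGS)
#   -m 5      : start acting when available RAM drops below 5%
#   --prefer  : pick matching processes first (compilers and linkers)
#   --avoid   : never pick these (the session itself)
EARLYOOM_ARGS="-m 5 --prefer '(^|/)(cc1|cc1plus|ld|lto1|rustc)$' --avoid '(^|/)(Xorg|systemd|sshd)$'"
```

Losing one compile job to a targeted kill beats the whole machine thrashing swap until it needs a hard reboot.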
It’s ironic that the discussion on this article perfectly reproduced all the discussion tendencies that the article critiques.
We’ve got ‘em all:
It is a self-fulfilling prophecy :-)
Anyway, I’m happy to see that people found laptops and OS combos they enjoy. The most important part of the article is understanding that choosing different is OK.
I think I understand why this happens. There are a lot of Linux users who want a powerful, Linux-friendly laptop with a build quality that lives in the same galaxy as Apple. Since there isn’t any manufacturer working in this space (though that seems to be changing hopefully!), they judge each new non-Apple laptop as a potential “macbook but with Linux”, not as what the manufacturer is shipping (another craptop). It’s a bunch of would-be consumers with nothing to buy, judging each new product in terms of “could I force this into being the laptop I want?”
I might be projecting myself here, because I’m guilty of this too. :)
I would add on that not only do people do this to non-Apple laptops, they’ve historically done it to Apple laptops.
The fact that macOS is Unix-y enough for a certain type of programmer to feel comfortable with is basically an accident of Apple’s corporate history. The fact that Apple’s laptops had decent specs is historically due to Apple going after a certain segment of the “creative professional” market — people who work with video, audio, and graphics for a living.
But a lot of programmers saw the combination of Unix-y OS and decent specs and interpreted it as “I, a programmer, am the sole target customer for this device”. A lot of anger at Apple was then built on feeling that they had “betrayed” or even “abandoned” this “target market” with changes that make more sense once you stop assuming that programmers were the sole target market of the laptops.
I agree mostly with your points, but a lot of complaints also came from the 2016–2020 period (foreshadowed by the beautiful but crappy trashcan Mac Pro). I don’t think the butterfly keyboard or the terrible overheating Intel CPUs with loud fans were good for any market. I have been a Mac user since 2007, and I even left Macs for a brief while at the end of that period, because the hardware was just terrible and macOS was getting quite buggy.
It seems like the 2017 Pro meeting was a turning point when Apple realized they need to change course. Of course, since a product pipeline cannot be changed overnight, improvements trickled in slowly (first by eliminating the terrible butterfly keyboard). But now the Mac is in such a good state that both a lot of creative professionals and developers like it.
Put differently, the different target markets of the Mac also have a lot in common (fast, functional hardware and a stable macOS).
Stop using laptops. For the same money you can get a kickassssss workstation.
But then for the time you want to work away from the desk you need an extra laptop. Not everyone needs that of course, but if you want to work remotely away from home or if you do on-call, then laptop’s a requirement.
Laptops also have a built-in UPS! My iMac runs a few servers on the LAN and they all go down when there’s a blackout.
Curious: which country do you live in, that this is a significant enough problem to design for?
Can’t speak for the other poster, but I think power distribution in the US would qualify as risky, and not only in rural areas. Consider that even the Chicago burbs don’t have buried power lines, and every summer there’s a blackout due to AC surges. I’d naively expect at least 4 or 5 (brief) blackouts per year.
i get that, but it’s also not a very productive framework for discussion. i like my laptop because i work remotely – 16GB is personally enough for me to do anything i want from my living room, local coffee shop, on the road, etc. i do junior full-stack work, so that’s likely why i can get away with it. obviously, DS types and other power hungry development environments are better off with a workhorse workstation. it’s my goal to settle down somewhere and build one eventually, but it’s just not on the cards right now; i’m moving around quite a bit!
my solution? my work laptop is a work laptop – that’s it. my personal laptop is my personal laptop – that’s it. my raspberry pi is for one-off experiments and self-hosted stuff – that’s it. in the past, i’ve used a single laptop for everything, and frequently found it working way too hard. i even tried out mighty for a while to see if that helped (hint: only a little). separation of concerns fixed it for me! obviously, this only works if your company supplies a laptop, but i would go as far as to say that even if they don’t, it’s a good alternative solution, and might end up cheaper.
my personal laptop is a thinkpad i found whilst trash-hopping in the bins of the mathematics building at my uni. my raspberry pi was a christmas gift, and my work laptop was supplied to me. i spend most of my money on software, not really on the hardware.
edit: it’s also hard; since i have to keep things synced up. tmux and chezmoi are the only reasonable way i’ve been able to manage!
Agree. The ergonomics of laptops are seriously terrible.
Unfortunately I don’t think this is well known to most programmers. Recently a fairly visible blogger posted his workstation setup and the screen was positioned such that he would have to look downward just like with a laptop. It baffled many that someone who is clearly a skilled programmer could be so uninformed on proper working ergonomics and the disastrous effects it can have on one’s posture and long-term health.
Anyone who regularly sits at a desk for an extended period of time should be using an eye-level monitor. The logical consequence of that is that laptop screens should only be used sparingly or in exceptional circumstances. In that case, it’s not really necessary to have a laptop as your daily driver.
After many years of using computers, I don’t see big harm in using a slightly tilted display. If anything, regular breaks and stretches/exercises make a lot more difference, especially in the long term.
If you check out jcs’ setup more carefully, you’ll see that the top line is not that much lower than the “default” eye line, so the ergonomics there work just fine.
We discuss how to improve laptop ergonomics and more at https://reddit.com/r/ergomobilecomputers .
(I switched to a tablet PC; the screen is also tilted a bit but raised closer to eye level. Perhaps the ‘fairly visible blogger’s’ setup was staged for the photo and the screen might normally be raised higher.)
That assumes you’re using the laptop’s built-in keyboard and screen all day long. I have my laptop hooked up to a big external monitor and an ergonomic keyboard. The laptop screen acts as a second monitor and I do all my work on the big monitor which is at a comfortable eye level.
On most days it has the exact same ergonomics as a desktop machine. But then when I occasionally want to carry my work environment somewhere else, I just unplug the laptop and I’m good to go. That ability, plus the fact that the laptop is completely silent unless I’m doing something highly CPU-intensive, is well worth the loss of raw horsepower to me.
A kickass workstation which can’t be taken into the hammock, yes.
I bought a ThinkStation P330 2.5y ago and it is still my best computing purchase. Once my X220 dies, if ever, then I will go for a second ThinkStation.
A few years ago I bought a used ThinkCentre M92, ultra-small form factor. I replaced the hard drive with a cheap SSD and threw in extra RAM and a 4k screen. Great setup. I could work very comfortably and do anything I want to do on a desktop, including development or watching 4k videos. I used that setup for five years and have recently changed to a 2-year-old iMac with an Intel processor so I can smoothly run Linux on it.
There is no way I am suffering through laptop usage. I see laptops as something suited for sales people, car repair, construction workers and that sort of thing. For a person sitting a whole day in front of the screen… No way.
I don’t get the need for people to be able to use their computers in a zillion places. Why? What’s so critical about it? How many people actually carried their own portable office vs. just doing their work at their desks before the advent of the personal computer? We already carry a small computer in our pocket at all times that fills a lot of personal work needs such as email, chat, checking webpages, conference calls, etc. Is it really that critical to have a laptop?
I work at/in:
The first two are absolutely essential, the third is because if I want to do some hobbyist computing, it’s not nice if I disappear in the home office. Plus my wife and I sometimes both work at home.
Having three different workstations would be annoying. Not everything is on Dropbox, so I’d have to pass files between machines. I like fast machines, so I’d be upgrading three workstations frequently.
Instead, I just use a single MacBook with an M1 Pro. Performance-wise it’s somewhere between a Ryzen 5900X and 5950X. For some things I care about for work (matrix multiplication), it’s even much faster. We have a Thunderbolt Dock, 4k screen, keyboard and trackpad at each of these desks, so I plug in a single Thunderbolt cable and have my full working environment there. When I need to do heavy GPU training, I SSH into a work machine, but at least I don’t have a terribly noisy NVIDIA card next to me on or under the desk.
I believe this is the crux of it. It boils down to personal preference. There is no way I am suffering through the horrible experience of using a laptop just because it is not nice to disappear to the home office. If anything, it raises the barrier to being in front of a screen.
Your last paragraph is exactly my thoughts. Having a workstation is a great way to reduce lazy habits IMNSHO. Mobility that comes with a laptop is ultimately a recipe for neck pain, strain in arms and hands and poor posture and habits.
I have 3 places in which I use my computer (a laptop). In two of them, I connect it to an external monitor, mouse and keyboard, and I do my best to optimize ergonomics.
But the fact that I can take my computer with me and use it almost anywhere, is a huge bonus.
Hot take here, but I find Linux on the desktop to be insufferable. I want it to be good, I really do, but it just isn’t. I feel like windows has no place on a server, but for my desktop? I’ve been quite happy with Win11, Terminal app, and WSL2. I live in the best parts of Linux, and I can actually use my computer with the best parts of Windows.
Two main uses for my computer, MS Teams and Zoom, are unfortunately filled with gotchas on Linux. They’re both terrible.
MacOS is almost the best of both worlds too, but you’re locked into their hardware. Which was a liability until recently (M1).
I won’t dive into a certain yellow website’s discussion about this (even though it’s what prompted the article itself), so I don’t know if people there made the same points as me.
As part of the “target audience”, I roughly understand what I want from the machine, and the specs look totally fine. After all these years I know for sure that if I won’t “fit” into 16 GB of RAM, then even 64 gigs certainly won’t be enough, so 16 is just fine for me. Same for the rest of the specs.
My biggest disappointment with that laptop is actually its ergonomics. The half-height function key row is bad. Arrow keys that aren’t in an inverted-T layout are bad. The whole trend of abysmally giant touchpads is bad too – make it 0.75 or even 0.5 of the current size and it would be much better, because the main input device, the keyboard, would be bigger and better.
A touchpad shifted to the left side is bad – the keyboard doesn’t even have a numpad, yet they shift it (also ignoring the existence of left-handed people).
As an even more subjective thing, I would like to see a 4:3 or 3:2 display on a laptop aimed at developers; I think that makes sense.
I think you hit on one of the primary problems with laptop design here: many of us regard the touchpad as an annoying distraction from main input but what seems like the vast majority of the market look at the touchpad as primary input. It seems that the laptop designers either don’t understand that a small but generally wealthy segment of the market has this attitude or simply don’t care.
Walk into any developer-focused conference, even one focused on Linux or Windows development, and 95% of the laptops will be MacBooks. And the rest will be clunky corporate machines which probably weren’t the owner’s choice. Dell and others have offered Linux as a supported option for years, and they’ve had few takers, so I don’t think this is likely to change anytime soon.
I think HN comment rants are just a long-form ‘dislike button’. The author here is right that there are valid complaints, and when a giant OEM releases this underwhelming device with last year’s tech and marketing terminology aimed at non-devs, it’s no wonder people feel knee-jerky about a device that seems to have had no input from what many devs actually want. The laptop space, unlike desktop, is not so friendly towards DIY, so people feel they have to choose the least bad option because they can’t customize exactly what they want. Maybe that’s the crux: a dev-oriented device needs to have many customizable specs up to the highest end (since dev salaries can afford nice things), if not straight-up skip some parts so you can pick your own after-market RAM/HDD/etc. Heck, I think the Linux crowd would rather laptops didn’t come with any pre-installed OS at all, instead of paying for something you’ll wipe immediately out of lack of trust.
Not sure what to make of this meta rant.
It’s just as if people (even if they’re all developers) have wildly differing needs. For example if you’d ask me what I like or dislike about my current laptop then the most important factor would be my current job and what I am actually doing, so I guess this has completely changed at least three times in the last 10 years. (Pretty stationary with a large ThinkPad until 2013, then very mobile with an x230 until 2017, then I had a severely underpowered i5 ThinkPad, now since 2021 it’s Dell with a worse keyboard, no trackpoint and perma-WFH - so I don’t actually mind).
And I’m saying this as someone who doesn’t usually even look for the ‘perfect’ machine, but if it has one KO criterion I feel like I am entitled to rant. Especially as many of us are stuck with the one brand that our employers let us choose, and sometimes we can’t even test it out beforehand. If you HAVE to use it for 3 years (kinda the standard here, sometimes 4, rarely 2) it better be near perfect.
Lenovo W520 user here: 2 laptops, one with 16 and the other with 32 GB of RAM. Both have the same 2nd-gen Intel i7 CPU. I think they were made around 2012; I bought them for about 120 bucks each a few years back and upgraded the memory + SSD (so that was additional $$).
I use laptops because I need to do video/audio conf calls, develop, and move between sitting/standing positions throughout the week. Otherwise, during months where I work lots of hours, elbows, shoulders and wrists start hurting, so relocating where I sit periodically solves that problem for me, which is why I prefer laptops.
What is not ideal about laptops is that you are looking down on screen, and that reduces ability to focus/concentrate (you need to be looking slightly up).
Where possible, I utilize XRDP and connect to an actual faster workstation with an RDP client. But the only system where I can get XRDP to work with audio is FreeBSD; however, FreeBSD does not support Android dev, so I cannot host my Android dev environment there (it works well for backend/Ansible work, however).
What made me finally pull the trigger and upgrade laptops is Android + React-Native + Expo-AV (the RN audio/video wrapper). For every iteration of the fix-test cycle (especially when a bug shows up in release mode only), on my Lenovo it would now take, consistently, 6 minutes. It’s not the Java compiler doing it; it’s RN + Expo-AV packaging resources, shrinking JS code, etc. This crept up over time: in 2014–15, when I started using React Native, it was bearable. But now it is just not workable. As a side note, the Expo folks responsible for the dev/build setup change their minds every 6 months, so upgrading to a new release of Expo-AV is basically a very stressful and painful ‘tooth-pulling’ kind of experience that requires a slow try-error-retry cycle (which takes 6 minutes each) (and if you do not have to use RN Expo, do not use it …).
Otherwise, I would be fine with Lenovo W520.
For proper ergonomics you should not be looking up. A laptop screen likely doesn’t qualify since it is too low, but you definitely should not be looking up.
That’s quite an old system, and yes, you can actually notice the compile times. Personally I have a slightly different problem: an i7-4790k from about 2014-2015. That’s still fast, it crunches stuff normally, no problems. On par with today’s laptops. So basically I should be good. But I know it could go quite a bit faster, and I want that performance. But I can’t justify it :(
Yeah, I think I like the screen/keyboard combo on those laptops so as long as the CPU speed was sufficient, I was ok.
In general, I like the approach to laptop design that the frame.work folks are taking. With their design I am supposed to be able to keep the screen/keyboard and many other laptop components, while they let me replace the CPU module (including moving from one generation to another).