1.  

    I’m curious about the particulars of why Objective-C is a dreaded language, and what proportion of the responses is “I’d rather be doing Swift.” I’ve worked in a lot of languages, and Objective-C is one I really wouldn’t mind having to deal with again. Objective-C messages do look weird compared to most other languages, and the big sharp corner to be aware of is that sending messages to nil is allowed. However, when I last worked in it (around the time Swift came out), the Xcode tooling and ecosystem around it were incredible.
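
    (For anyone who hasn’t written Objective-C: messaging nil quietly returns nil/zero rather than crashing. The closest everyday analog is optional chaining; here’s a rough TypeScript sketch of the same idea, with made-up names:)

    ```typescript
    // A rough analog of Objective-C's nil messaging ("bottom propagation"):
    // optional chaining lets a whole call chain quietly evaluate to undefined
    // when any link is missing, instead of throwing.
    // AppWindow and frontmostWindow are invented names for illustration.
    interface AppWindow {
      title(): string;
    }

    function frontmostWindow(): AppWindow | undefined {
      return undefined; // pretend no window is currently open
    }

    const title = frontmostWindow()?.title(); // no crash; title is just undefined
    console.log(title ?? "(no window)");
    ```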

    1.  

      Probably because Objective-C is really unlike almost any other language (it’s both high level and low level, and it’s verbose), and for most people, it’s not their home language.

      Plus, like Perl, it has a high legacy code factor.

      That being said, I know someone who coded it as their first major language, and these days, prefers it over Swift, due to the incredibly unstable nature of Swift’s tooling.

      1.  

        I always liked writing Objective-C over C++ – it has a much clearer distinction between the object and the C worlds, and I appreciated that. Plus, bottom propagation is actually pretty cool, although it takes some getting used to.

        1.  

          I would think because a lot of developers wanted to (or were required to) write an iOS app, and found themselves having to use Obj-C. In other words, they didn’t get to choose the language, and they resent it. This is less prevalent with Swift taking over, but there’s so much existing Obj-C code and documentation out there that you’re still bound to run into it and need to learn to understand it.

          Beyond that, it’s got a weird syntax and some people never get over that.

          1.  

            I think this is right. I have a different relationship with it – I started writing it for real on the Mac, before the iPhone was released, and it was just so sensible there. And, the community of people writing native Mac software in 2003 was, uh, self selecting? so we were all kooks together.

          2.  

            They specifically state that the methodology measures “people with experience who do not wish to continue using it,” which is loosely correlated with “disliked” but clearly not the same thing. In this case it seems pretty likely that everything Objective-C does is done better by Swift, so why would you keep using Objective-C?

          1. 3

            Stop supporting and embracing Electron apps, please.

            1. 4

              Serious question: what’s wrong with Electron apps?

              1. 15

                As someone who just spent a little time attempting a port of an Electron app to FreeBSD, only to quit in disgust, I have a few opinions.

                1. Electron apps are huge. Really, really, really big with a gigantic web of dependencies. Think an 18,408 line Yarn lockfile.

                2. Those dependencies are JavaScript libraries. To put it mildly, there is not a large intersection between the JavaScript community and users of non-mainstream OSs (e.g. FreeBSD). And those libraries tend not to be written in a portable fashion. This example (admittedly from a few years ago now) of a library disregarding $PATH is just one.

                3. Platform support in Electron is a gigantic steaming pile of bogosity based upon the wrong set of abstractions. Instead of learning from the autotools people who were doing this decades ago, they detect platforms, not features. So when a new platform comes along (say, FreeBSD) you can’t just specify which features it has and let it compile. No, you have to create a gigantic patch that touches a bazillion files, everywhere those files check for which platform it’s compiling on. (There’s a small sketch of the platform-versus-feature difference at the end of this comment.)

                4. Once compiled and running, they’re still huge (up to 1GiB of RAM for an IM client!). And - although perhaps this is a reflection of the apps themselves, not the framework - many are sluggish as hell. Neither is an attractive prospect for resource-limited Linux machines, like PinePhones.

                I had thought, prior to attempting a port of an Electron app, that people were unfairly criticizing it. Now having peeked under the covers, I don’t think people are criticizing it enough.
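
                To make the platform-versus-feature point concrete, here’s a made-up sketch of the two styles of check. This isn’t Electron’s actual build code; the flag lists and probe names are invented for illustration.

                ```typescript
                // Platform detection: enumerate OS names, so an unlisted platform
                // (FreeBSD, say) fails even when it has everything the code needs.
                function linkFlagsByPlatform(platform: string): string[] {
                  switch (platform) {
                    case "linux":
                      return ["-ldl", "-lpthread"];
                    case "darwin":
                      return ["-lpthread"];
                    default:
                      // FreeBSD lands here and the build breaks until someone adds a case.
                      throw new Error(`add a case for ${platform}`);
                  }
                }

                // Feature detection (the autotools approach): probe for each capability
                // and branch on the probe, so an unknown platform can still build.
                function linkFlagsByFeature(probe: { hasLibDl: boolean; hasPthreads: boolean }): string[] {
                  const flags: string[] = [];
                  if (probe.hasLibDl) flags.push("-ldl");
                  if (probe.hasPthreads) flags.push("-lpthread");
                  return flags;
                }
                ```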

                1. 6

                  As someone who isn’t an Electron hater: Electron apps are slow to load and memory hogs, which is something you might live with if you are talking about your IDE or Slack, but starts getting really old when it’s a utility application that should load quickly or spends most of its time in your icon tray. Worse yet: poorly written Electron apps can become CPU hogs as well, but I guess the same goes for all software.

                  1. 3

                    I agree that lots of Electron apps have issues with poor performance and high memory usage. That said, a well written Electron app can perform well. For example, I’m a heavy user of the Joplin desktop application and in my experience it performs well and has fairly low memory usage (currently under about 200MB) and doesn’t seem to have the issues that plague the Slack client. Admittedly the Slack client is a lot more complex…

                    1. 2

                      Oh, I agree that there are great, performant Electron apps. VSCode is one of my favorite examples of that. Spotify is another one.

                      One of my biggest gripes with Electron is that - because of the nature of how it’s embedded in binaries - you usually end up with several full copies of the whole framework in memory. If you are using KDE or Gnome, most of the processes in your desktop are sharing a significant amount of memory in the form of shared libraries. This tends to be fine in systems with 16GB+ of memory and a fast CPU, but for people with more meager resources… it’s a drag.

                    2. 2

                      I’m sure performance issues will be addressed in time.

                      1. 13

                        Electron has been around since 2013, and typing in Slack still has noticeable latency (which drives me crazy). I also have to restart it once a day or so to keep it from becoming more and more laggy.

                        Meanwhile, ripcord was developed by a single indie developer in Qt. It has most of Slack’s functionality, only uses a fraction of the memory, and is lightning fast. Oh, and it is multi-platform.

                        People (not you) claim that it is only possible to write cross-platform applications in Electron. Nothing could be further from the truth: people have been writing cross-platform apps in Qt literally for decades. (And it’s not hard either.)

                        1. 2

                          I’m not sure that I would consider Slack a stellar example of an Electron app. Slack is slow even by Electron standards. VS Code’s latency is indistinguishable from typing in the Lobsters comment box in Chromium on my middle-of-the-road desktop machine. Discord is a much better Electron-based chat app from a performance standpoint, in my experience.

                          People (not you) claim that it is only possible to write cross-platform applications in Electron. Nothing could be further from the truth: people have been writing cross-platform apps in Qt literally for decades. (And it’s not hard either.)

                          For commercial software, the more important part is not whether it’s possible (or “hard”), but whether it’s commercially viable. Without any hard data one way or another, I’d say that writing Electron apps is much less expensive than writing native Qt apps for most companies (especially since web technology experience is much easier to come by).

                          1. 1

                            I don’t mind Electron, but even VS Code drops 1-2 frames on keypress on my Threadripper desktop (and Chrome/Firefox do not). So far I’m putting up with it for the language server integration.

                        2. 4

                          I’m sure once the performance issues are addressed the complaints about performance issues will subside.

                          1. 1

                            I’m looking forward to the day that systems like Electron will compile everything to WebAssembly as a build step. In a way, I think Gary Bernhardt might have been more correct than I gave him credit for in his famous The Birth & Death of JavaScript presentation.

                        3. 3

                          There are the utilitarian critiques (they are big and slow) and there’s also the sort of Mac critique (they are not in any way native) and there’s my weird “I HATE THE WEB” critique that is probably not widely shared. I have a couple of them that I use daily, but I really, really, really wish I didn’t.

                      1. 6

                        It makes my heart ache that interesting ideas like these have largely been extinguished under a tsunami of ’70s era Unix sludge. Oh well.

                        1. 5

                          I am puzzled why these even exist. What is the point? To have the browser be an OS?

                          1. 9

                            Yes, the dream of a PWA revolution requires the browser to have access to everything the underlying OS does, but that will never happen, because it’s too easy to make malicious PWAs when there’s no central store/authority to police them.

                            I want freedom too, but the world is full of idiots who still click on “your computer is infected” popups and voluntarily install malware.

                            1. 4

                              They exist to allow web pages controlled and sandboxed access to resources otherwise only available to black-box native apps, which also happen to award Apple 30% of their revenue, so, me personally, I’m taking that privacy argument with a grain of salt.

                              1. 11

                                Web apps are just as black-box as native apps. It’s not like minified JavaScript or WebAssembly is in any reasonable way comprehensible.

                                1. 8

                                  I would somewhat agree if Apple was the only vendor who doesn’t support these APIs, but Mozilla agrees with Apple on this issue. That indicates that there’s some legitimacy to the privacy argument.

                                  1. 2

                                    The privacy reason doesn’t seem too opaque: the standard way of identifying you is creating an identifier from your browser data. If you have some hardware attached and exposed, it makes identification more reliable, doesn’t it?
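
                                    As a rough sketch of what that looks like in practice (a toy example; real fingerprinting scripts combine many more signals than these):

                                    ```typescript
                                    // Toy fingerprint: combine signals the browser already exposes and hash
                                    // them into a stable identifier. Every extra exposed capability adds bits.
                                    async function roughFingerprint(): Promise<string> {
                                      const signals = [
                                        navigator.userAgent,
                                        navigator.language,
                                        `${screen.width}x${screen.height}`,
                                        String(navigator.hardwareConcurrency ?? 0),
                                        Intl.DateTimeFormat().resolvedOptions().timeZone,
                                      ].join("|");
                                      const bytes = new TextEncoder().encode(signals);
                                      const digest = await crypto.subtle.digest("SHA-256", bytes);
                                      return Array.from(new Uint8Array(digest))
                                        .map((b) => b.toString(16).padStart(2, "0"))
                                        .join("");
                                    }
                                    ```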

                                    1. 2

                                      Apple led the way early on in adding APIs to make web apps work like native mobile apps — viewports and scrolling and gesture recognition — and allowing web apps to be added as icons to the home screen.

                                      1. 2

                                        Originally iPhone apps were supposed to be written in HTML/JS only, but then the App Store model became a cash cow and that entire idea went down the drain in favor of letting people sharecrop on their platform.

                                        1. 9

                                          I mean, too, the iOS native ecosystem is much, much, much richer and produces much better applications than even the modern web. So, maybe it’s more complicated?

                                          1. 1

                                            Agreed. I think that native apps were the plan all along; progressive-ish web app support was just a stop-gap measure until Apple finalized developer tooling. Also, given that most popular apps (not games) are free and lack in-app purchases, odds are that the App Store isn’t quite as huge of a cash cow as it is made out to be. The current top 10 free apps are TikTok, YouTube, Instagram, Facebook, Facebook Messenger, Snapchat, Cash App, Zoom, Netflix, and Google Maps. The first six make money through advertisements. Cash App (Square) uses transaction fees and charges merchants to accept money with it. Zoom makes money from paid customers. Netflix used to give Apple money but has since required that subscriptions be started from the web (if I remember correctly). Google Maps is free and ad-supported.

                                    2. 1

                                      The browser already is an OS. The point of these is to have it be a more capable and competitive OS. Just so happens that at present there’s only one player who really wants that… but they’re a big one, and can throw their weight around.

                                    1. 2

                                      One of my teams is a remote, contract one, and they’re having trouble with code quality and engineering discipline, so that’s an unfun conversation to start having. Otherwise, my wife is taking the kids camping this weekend so I took Friday and Monday off and am going to just have four days at home by myself, for the first time in five years.

                                      1. 1

                                        Would you mind giving an example regarding your team’s “having trouble”? I’m curious most about the “engineering discipline” issues.

                                        1. 2

                                          So it’s not entirely on them; the company has some pretty primitive systems in place around, e.g., releases; certain parts of the system require coordination between teams and a very specific set of steps for deployments to succeed. This team has been rocky on those systems – so I have to talk about it with the team and reiterate the necessity of following the playbooks to the letter. They’re good guys and not bad engineers, but their team had sort of fallen into a management blind spot before I took them over.

                                      1. 1

                                        How would that be enforced exactly?

                                        1. 4

                                          As I understand it the ruling is “Storing customer data in the US is not compatible with GDPR compliance”, so it would be enforced using the existing GDPR enforcement regime.

                                          1. 6

                                            Sure, but where can you store a chat conversation between European and US citizens?

                                            1. 4

                                              In Europe

                                              1. 3

                                                On their own devices. Use end-to-end encryption while you still can (but that’s a good question in general)

                                              2. 2

                                                The CLOUD Act seems to be removing the distinction between data stored in the USA versus data stored abroad when it comes to US companies. As far as I understand it, the act in a way extends American jurisdiction to every country where the server of an American company is located, so perhaps a more important thing EU states can do in this regard is not entering CLOUD Act agreements with the US at all? I’m only partially trolling.

                                              3. 0

                                                Why, by giving EU States complete access to their data feeds, of course.

                                                I wonder if I’m being paranoid by seeing this as a subtle play for warrantless surveillance?

                                                1. 11

                                                  I think it’s far more likely that it will be enforced with the possibility of outlandish fines or loss of market access if found to be in violation of the law. That would (roughly) align with how other data privacy regulations are established in the EU.

                                                  A gross expansion of warrantless surveillance seems quite unlikely in the EU, as there is a cultural belief that data about one’s self belongs to one’s self which is in contrast to the American culture where data about one’s self is typically viewed as belonging to whoever collected the data.

                                                  1. 20

                                                    In case anyone’s wondering what the deal is here: lots of European countries, especially in Eastern and Central Europe, but also some Western European countries (e.g. Germany) have a bit of a… history with indiscriminate data collection and surveillance. Even those of us who are young enough not to have been under some form of special surveillance are nonetheless familiar with the concept, and had our parents or grandparents subjected to it. (And note that the bar for “young enough” is pretty low; I have a friend who was regularly tailed when he was 12). And whereas you had to do something more or less suspicious to be placed under special surveillance (which included things like having bugs planted in your house and phones being tapped), “general” surveillance was pretty much for everyone. You could generally expect that conversations in your workplace, for example, would be listened to and reported. With the added bonus that recording and surveillance equipment wasn’t as ubiquitous and cheap as it is today, so the reporting was usually done by informers.

                                                    Granted, totalitarian authorities beyond the Iron Curtain largely employed state agencies, not private companies for their surveillance operations – at least on their own territory – but that doesn’t mean the very few private enterprises, limited in scope as they were, couldn’t be coopted into any operation. And, of course, the Fascist regimes that flourished in Western Europe for a brief period of time totally partnered with private enterprises if they could. IBM is the notorious example but there were plenty of others.

                                                    Consequently, lots of people here are extremely suspicious about these things. Those who haven’t already experienced the consequences of indiscriminate surveillance have the cautionary tales of those who did, at least for another 20-30 years. If someone doesn’t express any real concern, it’s often either because a) they don’t realize the scope of data collection, or b) they’ve long come to terms with the idea of surveillance and are content with the fact that any amount of data collection won’t reveal anything suspicious. My parents fall in the latter category – my dad was in the air force so it’s pretty safe to assume that we were under some form of surveillance pretty much all the time. Probably even after the Iron Curtain fell, too, who knows. But most of us, who were very quickly hushed if we said the wrong thing at a family dinner or whatever because “you can’t say things like that when others are listening”, aren’t fans of this stuff at all.

                                                    Edit: Basically, it’s not just a question of who this data belongs to – it’s a pretty deeply-ingrained belief that collecting large swaths of data is a bad idea. The commercial purpose sort of limits the public response but the only reason why that worked well so far is that, politically, this is a hot potato, so there’s still an overall impression that the primary driving force behind data collection is private enterprise. As soon as there’s some indication that the state might get near that sort of data, tempers start running hot.

                                                    1. 5

                                                      For more details on this, Wikipedia’s entry on Stasi, the security service of East Germany, is a great read. Stasi maintained detailed files (on paper!) on millions of East Germans. Files were kept on shelves, and shelves were >100 kilometers(!) long when East Germany fell.

                                                      It is easy to imagine why Facebook’s data collection reminds people of Stasi files.

                                                      1. 1

                                                        There were some amazing stories floating around in 1989 – like, the Stasi were sneaking across the border into the West to buy shredders, because they couldn’t shred the documents fast enough; and the army of older ladies who have been painstakingly reassembling the bags and bags and bags of shredded documents.

                                                      2. 3

                                                        To be fair: with power shifting, companies consolidating, individuals having the same money (and thereby power) as whole governments, individual companies or partnerships no longer working in just one sector, governments outsourcing more and more of their functions (infrastructure, both IT and non-IT, security, etc.), and corporations creating pretty much whole towns for their employees and often their families, they overall become more similar to governments, but usually with fewer of the guarantees provided by things like constitutions.

                                                        1. 2

                                                          Absolutely. There’s been talk of a “minimal state” for decades now, but no talk of a “minimal company”. Between their lack of accountability, the complete lack of transparency, and the steady increase of available funds, I think the leniency we’re granting private enterprises is short-sighted. But that’s a whole other story :).

                                                    2. 5

                                                      The US actually claims the right to warrantless surveillance of non-US citizens, through FISA. Additionally, through the CLOUD act, they claim the right to request personal information from US companies, even if this information is not stored on US soil.

                                                      Looking at the political side of things, many EU lawmakers are perfectly fine with engaging in a little protectionism for European IT companies, and if EU privacy law makes life difficult for FAANG, that’s perfect. On the other hand, the US is trying to use the world dominance of its IT companies as a way to extend the reach of its justice and surveillance system.

                                                      Then there are FAANG-paid lobbyists, who keep pushing for treaties that claim the US extends protections to EU citizens’ data, even though it clearly doesn’t. They don’t last long once they get taken to court. This is why some US tech companies, like Salesforce, are now lobbying for a data protection regime in the US - this would be one way to reconcile this difference.

                                                      This is a trade war, and the victims are smaller US companies that shy away from doing business in the EU.

                                                  1. 18

                                                    The essay doesn’t cover the most important way to improve your technical writing: have someone else read a draft. You can’t see your own blind spots.

                                                    1. 8

                                                      Not everyone can do that. Especially for young people, or when you’re just starting to write, it’s hard to even get anyone to care, let alone think enough about what you’re writing to improve it.

                                                      In my experience the next best thing is to forget about it for a week, and then re-read the text.

                                                      1. 8

                                                        Given the comments left by @healeycodes and @jakob here, perhaps there’s merit to starting a lobste.rs “proofreading group”? I know we have several authors here that regularly post their articles; perhaps they would benefit from having them reviewed. I worry that the generous proofreading offers in this thread may eventually be forgotten or go unnoticed by the people who’d benefit from them.

                                                        1. 5

                                                          What about a “What are you writing?” thread? It could be created on a weekly/monthly basis, where people share drafts and ask other members to comment.

                                                          1. 1

                                                            I like this idea of a thread that you and @rustybolt suggested! In my opinion, something weekly would be good. Perhaps it can be posted on Mondays, so that people writing over the weekend can get feedback soon after?

                                                          2. 3

                                                            I’d like to say I’d also like to read and review technical writing. I also write occasionally – mostly quite mathy stuff.

                                                            Also, maybe it’s a good idea to start a weekly “who’s writing?” thread where we can all post stuff and review each other’s. Alternatively we can start an old-school mailing list (or even just a group mail would work, where everyone just replies to all on the latest one, then everyone could send feedback in private – this has the disadvantage that it wouldn’t be visible on lobste.rs so it would be hard for new people to join).

                                                          3. 6

                                                            I agree that it’s hard (especially when starting out) to find people to read your technical writing before publishing it.

                                                            To anyone reading this comment: I’m happy to review your technical writing and provide feedback.

                                                            (You can find my email via my website’s about page.)

                                                            The last time I posted this on here, I read about some really cool topics and concepts, as well as new experiences, that I would not have read about otherwise!

                                                            In my experience the next best thing is to forget about it for a week, and then re-read the text.

                                                            This helps me. However, it’s hard to step away from a piece of writing that is exciting for you!

                                                            1. 5

                                                              That’s a kind offer, and I’m happy to see it being made. I feel quite indebted to the wonderful community here, so I will offer the same. I am also happy to review your technical writing and provide feedback. Similarly, my email address can be found on my website.

                                                              1. 3

                                                                Same!

                                                          4. 2

                                                            An alternative I’ve found that works (for me at least) is to wait a couple of weeks, print the page, and read it very carefully.

                                                            1. 1

                                                              I was going to post the same thing! You can be your own affordable second set of eyes. Time is a safe way to clear your head. A lesser alternative is to work on another significant mental problem while not thinking of the sitting draft, so that when you look back at your draft, your mind is somewhat clear and you can review.

                                                          1. 3

                                                            There is nothing stopping you from doing the same thing and running Emacs as a launchd service, for those who still want the Mac but also want a persistent Emacs server in the background.

                                                            1. 3

                                                              Yeah, you can do it, but using the client is not as convenient - e.g. you need to use Automator to create some wrapper around emacsclient -c so you can invoke this from Spotlight. (https://stackoverflow.com/questions/39464975/launch-emacsclient-with-gui-from-dock-on-mac-os-x) And, coming from Linux, launchd is not the most appealing service manager. :-) This is one area where there’s plenty of room for improvement in macOS.

                                                              1. 3

                                                                This is probably Stockholm Syndrome from my days on the farm, but I always really liked launchd.

                                                                1. 1

                                                                  Managing launchd with Nix is a breeze.

                                                                2. 2

                                                                Homebrew includes the launchd definition (brew services start emacs). Invaluable for Spacemacs.

                                                                1. 11

                                                                  Why not fork etcd using the last commit before the gRPC changes? Surely, if this author is right, there is still a market for a simple, easy to use, consensus driven database with an HTTP API.

                                                                  1. 11

                                                                    On the one hand, sure, decent idea, but on the other hand, I think the point of this article is that there’s much more of a market for complex, overengineered solutions which stroke your ego by telling you that your situation is just like Google’s and that it justifies putting up with a great deal of tedium.

                                                                    Following the market is how we got into this mess in the first place.

                                                                    1. 5

                                                                      I believe that the ecosystem is getting the behavior it incentivizes. :)

                                                                      It seems that there is a lot more money to be made playing ball with this than there is in actually engineering things efficiently.

                                                                      1. 7

                                                                        Full agreement except I’d replace “money to be made” with “money to be milked from credulous investors”

                                                                        1. 11

                                                                          My hot take is basically that investors are going to wise up to this when remote work becomes more common and when some sufficiently large chunk of tech gets serious about unionizing. That day there will be a reckoning and our compensation will be readjusted to be more in line with other white-collar (or possibly even blue-collar) professions.

                                                                          So, make hay while the sun shines and try to think of responses to the future generations who are annoyed that they can’t make as much money as we did.

                                                                          1. 3

                                                                            “You had to be there, yo.”

                                                                            1. 1

                                                                              I think that thought is deserving of a bit more than a ‘hot take’. Our entire industry seems to be marrying increased expectations to diminishing returns.

                                                                          2. 1

                                                                            It seems that there is a lot more money to be made playing ball with this than there is in actually engineering things efficiently.

                                                                            “Here’s a pile of dirt, build your own solution.” – “But, I don’t have the tools for this…” – “That’s OK… I’m building a new shovel that will allow you to dig new holes to shovel dirt into.”

                                                                            The market cap in complex tools is unstoppable.

                                                                          3. 1

                                                                            A simple solution can usually only cover a limited set of use cases. A more complex solution can often cover a wider range of use cases, and while it may arguably be more difficult (or “worse”) for various simpler use cases, it rarely makes them impossible. In that sense, a complex solution is “better”. I think this explains most of the drift towards complexity, and you see this in many projects not necessarily because of the market or whatnot, but just because a project wants to be useful for many cases. It can be kind of a difficult trade-off to make.

                                                                            At any rate, I don’t think that a complicated etcd really takes away anything from a “simple etcd”. Like apg said, just fork the old etcd or some such and let etcd be etcd.

                                                                          4. 2

                                                                            If the public API has changed, now for every client or consumer you want to use, you potentially have to fork those too, and now you have to maintain them. It’s not really the same thing if you have to give up the ecosystem, and there’s probably a lot in the ecosystem that expects the newer stuff.

                                                                            1. 1

                                                                              I am pretty sure curl was an acceptable etcd client back in the day.
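
                                                                              From memory, the old v2 API looked roughly like this, and any HTTP client could drive it (a sketch, not checked against current docs):

                                                                              ```typescript
                                                                              // Sketch of the old etcd v2 HTTP API, from memory: plain key-value reads
                                                                              // and writes over HTTP, so curl/fetch/anything was a perfectly good client.
                                                                              // Assumes an etcd instance listening on 127.0.0.1:2379.
                                                                              const base = "http://127.0.0.1:2379/v2/keys";

                                                                              async function putKey(key: string, value: string): Promise<void> {
                                                                                await fetch(`${base}/${key}`, {
                                                                                  method: "PUT",
                                                                                  headers: { "Content-Type": "application/x-www-form-urlencoded" },
                                                                                  body: new URLSearchParams({ value }).toString(),
                                                                                });
                                                                              }

                                                                              async function getKey(key: string): Promise<string> {
                                                                                const res = await fetch(`${base}/${key}`);
                                                                                const doc = await res.json();
                                                                                return doc.node.value; // v2 responses wrap the value in a "node" object
                                                                              }
                                                                              ```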

                                                                              1. 1

                                                                                Sure, but at that rate why bother with forking an old version of etcd at all? If you don’t care about the ecosystem at all you might as well just build your own thing at that point.

                                                                                1. 3

                                                                                  Because it solved the hard part (consensus). Why would you build that again and again?

                                                                                  1. 1

                                                                                    The hard part was figuring out Raft. That’s been figured out. You don’t have to start over from Lamport’s papers and figure out Raft from scratch. You can import etcd’s Raft implementation as a package if that’s all you want out of etcd. Indeed, many projects do exactly that. That’s a totally fine conclusion.

                                                                                    A lot of the value of using something that’s widely used like etcd is that someone new comes along and when you say “we use etcd” they can say “I’m familiar with its features and with building things with etcd” or “I have a good library that uses that” or “I built a tool for that” and get up and running quickly but, no, surprise, whether their knowledge or techniques or libraries or tools work on your forked version is dependent on whether or not they worked for the version that you’ve decided is “good”. If all of those tools still work on your forked version and this isn’t a problem, then the argument for forking is pretty weak because the old APIs that you like are still supported.

                                                                                    1. 1

                                                                                      Then we’re not disagreeing I think. Of course etcd uses Raft, that’s what I meant. It does what it does and uses a well-known working implementation for consensus.

                                                                                      I just don’t see a point in reinventing 90% of etcd using Raft as well AND having a different api. Unless it’s so different it brings something new to the table.

                                                                                      1. 1

                                                                                        Just a nitpick, Paxos was invented by Lamport. Raft was invented as an alternative to Paxos by Ongaro and Ousterhout (Stanford University): https://raft.github.io/raft.pdf

                                                                                        1. 1

                                                                                          yes, I know, that’s … the same paper that I linked. When I say “figure out raft” I didn’t mean “implement raft from the whitepaper”, I meant “author the original raft whitepaper”. I can see how my original statement was kinda ambiguous though.

                                                                                          1. 1

                                                                                            ah, sorry, should’ve read it more closely. I agree with you, tho – the hard part was Raft, that was done elsewhere. One can now add any functionality to it (which, ironically, is what the original post complains about). However, lately I’ve been thinking that specialized/custom software wins over any general, popular software, just for the sake of simplicity and understanding.

                                                                                            1. 1

                                                                                              have you read Fast key-value stores: An idea whose time has come and gone? It basically argues that point, it’s a really good read. I work on a stateful server at work that keeps its state in memory and replicates it to other nodes in its cluster with CRDTs, it’s a lot of fun and it works! But also it’s a multiplayer game server so I don’t really have to persist anything which makes the problem a lot easier.

                                                                                    2. 2

                                                                                      if you don’t care about the ecosystem at all you might as well just build your own thing at that point.

                                                                                      The author was happy with etcd before they went and made it all complicated. No reason to make something new. Reuse what was previously good.

                                                                                2. 1

                                                                                  Then I will instead go with Consul which, while being a little bit more than etcd, keeps a simple HTTP API.

                                                                                  1. 0

                                                                                    Why even do that?

                                                                                    etcd still has an HTTP API

                                                                                  1. 1

                                                                                    I mean, there are enough Linux distributions trying to be Solaris already. A different take would be welcome.

                                                                                    1. 9

                                                                                      If we’re gonna open that can of worms I’d also assert that rather than creating new distributions it’d be awesome if folks would pour their efforts into improving one of the umpteen skittledezillion that already exist.

                                                                                      However, that’s the beauty of open source right?

                                                                                      1. 5

                                                                                        The beauty and the tragedy of open source.

                                                                                        1. 4

                                                                                          I don’t like distro proliferation, either.

                                                                                          But this doesn’t fit the “New DE with distro centered around it” pattern nor the “Let’s make another Debian/Arch derivative and market it as user friendly so that some suckers donate to us” pattern.

                                                                                          This one seems to have some merit to it, by going llvm/libc++/musl.

                                                                                          1. 1

                                                                                            Why not, say, an Ubuntu variant though? Or a spin of Fedora?

                                                                                            SOMETHING to help these new developments broaden an existing community.

                                                                                        2. 2

                                                                                          Which ones? I’m curious.

                                                                                          1. 1

                                                                                              That was poorly phrased. I was thrashing around for the idea that there are a lot of distributions trying to do the same thing, not so much that they’re trying to be like Solaris.

                                                                                        1. 6

                                                                                          Very nice to see the fully-LLVM thing, especially libc++.

                                                                                          1. 10

                                                                                            Why’s that nice? I have nothing against LLVM, but I don’t feel it’s better than GNU either, so I’m curious about your reasoning.

                                                                                            1. 12

                                                                                              I use FreeBSD, so my main selfish reason: I want Linux people to adopt libc++ more so that they stop writing software that fails on libc++! Often that’s due to silly stuff like missing includes (relying on libstdc++’s incidental transitive includes).

                                                                                              1. 1

                                                                                                I thought they fixed that in gcc 10?

                                                                                                1. 1

                                                                                                  That’s a cool change, but I’m not sure stdexcept is the only such thing. And of course not every developer has tested everything on this version of libstdc++ yet..

                                                                                              2. 4

                                                                                                I like to see software built with many different kinds of compilers, linkers, assemblers, kernels, libcs, and other libs… as a way to reveal bugs or unportable features.

                                                                                                Also, I like to see multiple implementations of some essential components, partly as an indicator that a good abstraction was found, since it is easy to re-implement.

                                                                                                1. 2

                                                                                                  LLVM has nicer debugging tools for certain things, a C++ interpreter among them :)

                                                                                                  1. 1

                                                                                                    Not the OP, but the fact that there are plenty of other distributions stressing the GNU toolchain means that LLVM gets short shrift, to some degree.

                                                                                                1. 8

                                                                                                  Can someone please explain how it is possible that one malfunctioning SDK can break the entire app? IIUC this is due to Facebook login but still, why can’t the app continue to function regularly?

                                                                                                  Or is it broken only for whoever actually logged in with Facebook in the first place?

                                                                                                  1. 24

                                                                                                  This is due to Facebook’s idiosyncratic engineering practices.

                                                                                                    At least last time this happened (https://github.com/facebook/facebook-ios-sdk/issues/1373), two months ago*, just including the SDK was enough to bring your app down before it had even initialised because Facebook ran code in the Objective-C class load method, which no sane person would override, let alone do network calls in.

                                                                                                  That’s the idiosyncratic part, but it spells disaster when combined with Facebook’s amateurish development practices: The SDK will load objects from the FB backend, parsing them with brute force, expecting them to always be well-formed. This, combined with no working internal CI/CD system, leads to situations like this, where anyone making a configuration error on a backend service can bring down millions of clients without them even calling the code in question.

                                                                                                    1. 10

                                                                                                      “Idiosyncratic” seems like an exceedingly polite way to put it.

                                                                                                      1. 5

                                                                                                      Realized today that I’m more dismayed by the tech public’s reaction to this than the failures of engineering displayed. Specifically, a bunch of people over at the orange site believe it isn’t fully preventable when the fix is simply to see bad engineering decisions for what they are: static initialization is dangerous and has no place in an SDK deployed at FB’s scope. Full stop.

                                                                                                        1. 2

                                                                                                        Unfortunately, there are tons of SDKs out there that do static initialization. When I brought this issue up with another big-name SDK during integration, the response was along the lines of: we need to do some initialization, and relying on SDK users to call the “init” method properly is a no-go based on what we experienced. That sounds plausible, but the solution is to make sure you have initialized upon the first call into your library, and this can be solved through tooling (e.g., adding a custom LLVM pass that calls your init method if it wasn’t already called through your public API).
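
                                                                                                        A rough sketch of that “guard on first call” idea, with invented names (the real fix in an iOS SDK would live in Objective-C/Swift, and the tooling-based variant would insert the guard automatically):

                                                                                                        ```typescript
                                                                                                        // Sketch of "initialize on first call" instead of static initialization.
                                                                                                        // All names here (loadRemoteConfig, publicApiCall) are invented for illustration.
                                                                                                        let initialized = false;
                                                                                                        let config: Record<string, unknown> = {};

                                                                                                        function ensureInitialized(): void {
                                                                                                          if (initialized) return;
                                                                                                          initialized = true;
                                                                                                          try {
                                                                                                            // Do the fragile work lazily, and never let it throw into the host app.
                                                                                                            config = loadRemoteConfig();
                                                                                                          } catch {
                                                                                                            config = {}; // fall back to safe defaults instead of crashing the caller
                                                                                                          }
                                                                                                        }

                                                                                                        export function publicApiCall(): void {
                                                                                                          ensureInitialized(); // every public entry point guards itself
                                                                                                          // ... actual SDK work using `config` ...
                                                                                                        }

                                                                                                        function loadRemoteConfig(): Record<string, unknown> {
                                                                                                          // placeholder for fetching and parsing server-delivered configuration
                                                                                                          return {};
                                                                                                        }
                                                                                                        ```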

                                                                                                        2. 2

                                                                                                          Do you have a citation for "no working internal CI/CD system"?

                                                                                                          1. 2

                                                                                                            They have a CircleCI instance, but if it did continuous integration and delivery, it would not be all-green.

                                                                                                          2. 2

                                                                                                            The SDK will load objects from the FB backend, parsing them with brute force, expecting them to always be well-formed

                                                                                                            I spit water out of my mouth as I read it.

                                                                                                          3. 9

                                                                                                            It’s just shoddy code from Facebook, and it’s a hot mess, because their SDK calls home in the initializer, IIRC, and a “bad” result that causes it to throw will blow up any app that includes the SDK, even if the app doesn’t use FB login. It’s a total catastrophe.

                                                                                                            1. 3

                                                                                                              To add my speculative point to the good substantive answers above…

                                                                                                      Another part of the problem is our industry has never really adopted REST as a way of architecting big public APIs. So instead of accessing them over HTTP using common patterns, every API is expected to have a client library implementing the API’s custom, closed RPC-over-HTTP architecture.

                                                                                                              (I don’t have any solutions, I’m just lamenting.)

                                                                                                              1. 1

                                                                                                                Which industry are you referring to? Serious question.

                                                                                                                1. 2

                                                                                                                  Software. Or that part of it that creates Web APIs.

                                                                                                              2. 4

                                                                                                                I’m guessing this is why a bunch of apps on my iPhone weren’t working this morning (GroupMe, Spotify, etc.). I wasn’t logged in with FB, for what it’s worth.

                                                                                                              1. 0

                                                                                                                I love this thing so much.

                                                                                                                1. 2

                                                                                                                  I mean, the answer is basically “path dependence”, no?

                                                                                                                  1. 2

                                                                                                                    My laptop (running Linux) died the other day. My boss has been pressuring me to get a Mac for forever and since I needed a new machine right now (in the middle of a big project) and Apple could courier a Mac to me in two hours (I’d prefer to not go to a store what with COVID-19 and all)…I got a Mac.

                                                                                                                    So far so good, since it’s UNIX under the hood. My only complaint is that when I hook up an external display it gets real hot and real loud…way hotter and louder than it has any right to be for a mostly-idle system. Apparently this is a known issue with the MacBook Pro 16”…

                                                                                                                    1. 8

                                                                                                                      It’s depressing how macOS is worse at power management and external monitors than Linux these days.

                                                                                                                      1. 9

                                                                                                                        I know what you mean, but macOS still does mixed-DPI way better than Linux, which is important. Wayland is making significant improvements in this space. I hope that the ARM transition helps with MacBook Pro thermals (the real issue in this particular case).

                                                                                                                        1. 1

                                                                                                                          Are they going to drop Nvidia for their home grown GPUs?

                                                                                                                          1. 2

                                                                                                                            They haven’t been using NVIDIA for years now. I’m guessing that they will be dropping AMD graphics for their home-grown ones on most of the Mac line. We’ll have to wait and see whether this holds true for the Mac Pro.

                                                                                                                      2. 7

                                                                                                                        Try using the right-hand ports. Seriously.

                                                                                                                        1. 1

                                                                                                                          Not that I want to turn lobste.rs into a support forum, but for the record, I’ve got just the external display plugged in and nothing else (not even the power supply); it’s plugged in on the right-hand side. The system is 96-99% idle. With the external display plugged in, the CPU temperature bounces between 60 and 75 degrees C. From what I’ve been able to find, at idle it should be no more than about 45 degrees C. Doing anything even moderately intense (e.g. listening to music) gets the machine hot enough that the fans are blasting and the CPU is getting throttled.

                                                                                                                          Searching around online, this is apparently a universal problem: using an external display with the MBP 16” causes huge power draw and heat problems, regardless of the resolutions/refresh rates/whatever involved. It’s kinda to the point that I feel like I was lied to. I don’t feel like this Mac is truly capable of being used with an external display. It feels like false advertising.

                                                                                                                          I’m very strongly considering returning the machine. I might consider getting the 13” model (which doesn’t have the Radeon GPU that is apparently the source of the problem) if it turns out it doesn’t have the same thermal problems…

                                                                                                                          Anyway, rant over. Sorry.

                                                                                                                          1. 1

                                                                                                                            Oh, no worries. I just discovered the right hand side thing myself.

                                                                                                                        2. 6

                                                                                                                          MacBooks have bad thermal issues when plugging in a monitor and a charge cable both on the left-hand side. Consider putting the two cables on opposite sides of the machine.

                                                                                                                          https://apple.stackexchange.com/a/363933/349651

                                                                                                                          1. 2

                                                                                                                             Apparently there is a possible workaround if your monitor supports DisplayPort. Maybe worth a shot? It apparently uses the iGPU and runs much cooler – presumably a bad bug with the dedicated GPU and an external display.

                                                                                                                            1. 1

                                                                                                                              Apple seems to always let things get hotter than I would like.

                                                                                                                            1. 10

                                                                                                                              He makes some points about why Python mightn’t be the best choice but… SCALA? REALLY?

                                                                                                                              It’s clearly an amazingly capable language, but to quote Christian Beedgen “Scala is a VERY sharp tool.”.

                                                                                                                              I honestly don’t understand how a language with so many potential pitfalls for the unwary could be a good choice for a first language?

                                                                                                                              1. 2

                                                                                                                                I think it’s a great choice, but I agree it’s an imperfect razor sharp tool. What language would you choose instead?

                                                                                                                                1. 4

                                                                                                                                  Thinking about this, this is the ULTIMATE bikeshed question, is it not? :)

                                                                                                                                  So, let me first admit that I am poorly qualified to make that distinction. I am not a teacher and by most measures I’m not even a computer scientist.

                                                                                                                                  I’m a sysadmin -> release engineer -> software developer who was raised by wolves, cobbling together process-oriented automations using shell, Perl, Ruby and Python as my skill set evolved, in that order :)

                                                                                                                                  The thing I keep coming back to, and where I part ways with most better qualified folks on this topic is: I like abstraction, and I know for myself that my interest in programming only REALLY blossomed when I encountered languages that presented themselves to me at a high level of abstraction.

                                                                                                                                  So, I can see where Scala could actually be nice, in that you could teach basic OOP which is a nice easy conceptual framework for students to absorb, but get into some of the more ascetic levitation and flying functional stuff later as they develop their chops.

                                                                                                                                  As to what I would pick? I’d pick Python. I don’t personally feel stunted for life from learning other languages because of my preference for it, and I think that giving students a tool that will allow them to experience high velocity development is a great way to start.

                                                                                                                                  1. 2

                                                                                                                                    I agree that Scala is a sharp tool, but I think that’s the point. Those pitfalls typically show themselves right away in the form of compile errors. When speaking of languages like Scala or Haskell, people often say: if it compiles, it’s probably correct. I can definitely see that feedback being valuable for teaching students. And because Haskell’s errors can be quite arcane, Scala seems like a better choice.
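
                                                                                                                                    To make that concrete, here’s a tiny sketch of my own (not from the course material, and the names are made up): the commented-out line below would be rejected before a student ever ran the program.

                                                                                                                                    ```scala
                                                                                                                                    object TypeCheckSketch {
                                                                                                                                      def describe(xs: List[Int]): String = xs match {
                                                                                                                                        case Nil    => "empty"
                                                                                                                                        // case x :: _ => x              // rejected at compile time: found Int, required String
                                                                                                                                        case x :: _ => s"starts with $x" // the compiler insists on a String here
                                                                                                                                      }

                                                                                                                                      def main(args: Array[String]): Unit =
                                                                                                                                        println(describe(List(1, 2, 3))) // prints "starts with 1"
                                                                                                                                    }
                                                                                                                                    ```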

                                                                                                                                    Anecdotally, my partner went to Trinity. The author of this post was her professor, and she learned Scala. Though she only took that one class, and hasn’t written any code in the years since, she still understands concepts like types, OO, and functional programming when I talk to her about them.

                                                                                                                                    1. 1

                                                                                                                                      You’re actually supporting my bikeshed argument :)

                                                                                                                                      Given a superlative teacher you can derive tremendous value whatever the choice of language. Scala is probably an excellent choice for this person and their students for all the reasons you outlined.

                                                                                                                                      The question is - for everyone who won’t have the benefit of Professor Amazing at Trinity, is Scala the right first language?

                                                                                                                                      The answer may well be yes. I really don’t know.

                                                                                                                                  2. 1

                                                                                                                                    I don’t think the language per se makes any difference; what matters is the tooling, and the instructor. I did SICP at Chicago back in the dark ages, but aside from leaving me with a deep fondness for Lisps, the fact that it was Scheme and not, say, Pascal was almost irrelevant.

                                                                                                                                1. 17

                                                                                                                                  Honestly, I don’t get it. Why does it matter what the text looks like as long as it’s satisfactory?

                                                                                                                                  1. 28

                                                                                                                                    Different people have different thresholds for “satisfactory”, I guess?

                                                                                                                                    1. 5

                                                                                                                                      I don’t really buy this, it’s not satisfaction but habit. Sure, you realize there’s a difference when you change, it’s not like your brain has changed. It’s just the inverse effect of upgrading and admiring what’s better – but after a while you get used to it. Just like you’re not inhibited by this initial admiration, you won’t be by the initial annoyance.

                                                                                                                                      In the end, it’s not pixels you’re looking at, but like art tells us, whatever we are looking at is in our head. And we’ve long passed the point where this kind of consumer scaring is necessary.

                                                                                                                                      1. 2

                                                                                                                                        I don’t really buy this, it’s not satisfaction but habit. Sure, you realize there’s a difference when you change, it’s not like your brain has changed.

                                                                                                                                        What is “habit”, if not your brain changing to optimize itself for a particular use case?

                                                                                                                                        1. 2

                                                                                                                                                Fair enough – my point is that this change isn’t permanent, and all it takes for someone to forget what resolution the screen is is a week or two (unless it actually inhibits your work, of course).

                                                                                                                                        2. 1

                                                                                                                                          But what is satisfaction if not informed by habit?

                                                                                                                                        3. 1

                                                                                                                                          Something inexplicably obvious about it just doesn’t occur to me, it seems.

                                                                                                                                          1. 1

                                                                                                                                            …. which is fine! My wife can’t see the difference, either.

                                                                                                                                        4. 15

                                                                                                                                          After using retina and 4k displays for several years, when forced to use a 1080p, 96dpi monitor I find I no longer consider any text on it “satisfactory”. To me, it all looks painfully bad now that I’m accustomed to a sharper, higher quality experience. The eye strain after 8 hours of staring at fuzzy, low res fonts takes a real toll.

                                                                                                                                          But others would be happy with a super low-res vt100, I’m sure. Everybody’s satisfactory is different.

                                                                                                                                          1. 6

                                                                                                                                                  Doesn’t the VT100 use a bitmap font? That’s the actual solution for sharp fonts on a low-res display – just use bitmaps at the correct size.

                                                                                                                                            1. 4

                                                                                                                                              The original VT100 is quite low-res and fuzzy. Later VT terminals used higher-resolution screens which looked better.

                                                                                                                                              1. 3

                                                                                                                                                There’s this fascinating story about how DEC used the fuzz to good effect in their bitmap font, as a primitive form of anti-aliasing. ‘Dot stretching’, phosphor response curves… well worth a quick read!

                                                                                                                                                1. 2

                                                                                                                                                  This is wild. Thanks for the link!

                                                                                                                                              2. 2

                                                                                                                                                Bitmap fonts will avoid loss of sharpness due to antialiasing, but they’re not going to make an extremely low resolution screen any less low res, so I don’t know that I’d call 5 pixels arranged in a vaguely “e”-shape exactly “sharp”.

                                                                                                                                                1. 1

                                                                                                                                                        There are bitmap fonts that are much higher-res than 5 pixels per “e”. Check out stuff like atarist for alternatives.

                                                                                                                                                  1. 2

                                                                                                                                                    We’re talking about the vt100. You can have high resolution bitmap fonts, but you can’t fix a low resolution screen with a high res bitmap font.

                                                                                                                                              3. 5

                                                                                                                                                This reads to me like advice to avoid 4K as long as possible. If there’s no significant quantitative difference in efficiency/eyestrain between 4K and 1080p, and I’m currently happy with 1080p, switching to 4K will only make it more unpleasant for me to use perfectly useful 1080p monitors, pushing me to needlessly purchase more expensive monitors to replace those that I already have, and increasing consumerism.

                                                                                                                                                1. 2

                                                                                                                                                  You’re certainly free to stick with what you’re accustomed to. I have no compunctions about spending a lot of money to get the absolute most comfortable experience possible out of something I’m probably going to spend a year or more of my life, cumulatively, staring at. It’s one of the cheapest possible investments in the pleasantness of my career on a dollars-per-hour-used basis.

                                                                                                                                                  1. 3

                                                                                                                                                    Explained that way, I understand where you’re coming from. Even if there’s no objective benefit to upgrading your monitor, and you already feel “perfectly comfortable”, making work slightly more pleasant is desirable.

                                                                                                                                                    Now, you still need to make the decision as to whether the benefit gained from the monitor upgrade is worth the money you’re spending on it, but that’s much more personal. Thanks for sharing your perspective!

                                                                                                                                                  2. 2

                                                                                                                                                          The eyestrain is already there; you are just accustomed to it.

                                                                                                                                                    1. 1

                                                                                                                                                      Citation needed.

                                                                                                                                                2. 2

                                                                                                                                                  I concur. To my eyes text with a 1.5x scaled 4K looks better than text with a 2x scaled 4K. I think the psychovisual system is complex and subjective enough to warrant “if you like it then it’s good”.

                                                                                                                                                  1. 1

                                                                                                                                                          Some people fetishize fonts, font rendering, font shapes, dithering, smoothing and more such visual trickery. The author of this piece has published a programming font, so I assume he puts more weight on font-related things than the average font consumer. Other people have other fetishes; my own is to cram as much onto the screen as I possibly can while still being able to distinguish what is written from the fly poop which partially conceals some characters on the screen. This means I always have to guffaw a bit when I see people lamenting the bad state of high-DPI support in Linux, since the first thing I end up doing is turning all that stuff off so I can get 16 terminals on a 15” display. To each his own, I guess…

                                                                                                                                                  1. 5

                                                                                                                                                    Y’all are spending more than $200 on monitors?

                                                                                                                                                    1. 31

                                                                                                                                                      My grandfather, god rest him, was a frugal man. But he often said the two things you should spend extra on are shoes and mattresses, because “when you ain’t in one you’re in the other!” Maybe monitors are shoes for programmers.

                                                                                                                                                      But the ridiculously high end strikes me as maybe a bit much: my quality of life (legit, less squinting and headaches) improved with a 27” 4K monitor, but that was in the $300s.

                                                                                                                                                      1. 14

                                                                                                                                                              I am a cheapskate. I am loath to spend money.

                                                                                                                                                        My office chair costs $750. I sit in it for a minimum of eight hours a day.

                                                                                                                                                        1. 3

                                                                                                                                                          I feel this way about monitors and keyboards. I’ll pay much more for good input/output devices, because that’s how I interact with the computer.

                                                                                                                                                          Personally, I would want a 60hz 5k at 27”, or a 60hz 8k at 32-34”. It annoys me that the screen on my computer (16” MBP) is better than any external monitor I could reasonably hope to use.

                                                                                                                                                          1. 1

                                                                                                                                                            The Dell UP3218K is 31.5” and 8K, but it’s also $3,300, and only works with computers that support DisplayPort Multi Stream Transport over two DisplayPort ports.

                                                                                                                                                            1. 1

                                                                                                                                                              Yeah, it’s going to be a few years.

                                                                                                                                                          2. 1

                                                                                                                                                            Yeah, came here to mention you can get 4K for way less than any of the monitors suggested in the post. I got a matte LG for $250 a while back.

                                                                                                                                                            I have to admit I thought a game running at 30hz felt “smooth” so I’m not sure I could see 60 vs. 120 without a slow-motion camera. YMMV, of course.

                                                                                                                                                            1. 2

                                                                                                                                                              Things with a lot of motion will appear smoother than things sitting completely still. A 30Mhz desktop, with most things not moving (wallpaper, etc) will flicker like crazy since there’s no movement to mask the flicker.

                                                                                                                                                              1. 2

                                                                                                                                                                      A 30Hz desktop, that’d be; we’re still a few centuries away from a 30MHz refresh rate.

                                                                                                                                                                1. 2

                                                                                                                                                                  pft Your monitor takes longer than Planck time to draw a full frame? n00b.

                                                                                                                                                                  1. 3

                                                                                                                                                                    Nah, mine is so fast the photons end up in a traffic jam trying to work their way out of the screen, talk about red shift. Or maybe the CCFT is going bad, who knows…

                                                                                                                                                                2. 1

                                                                                                                                                                  Interesting. FWIW, I wasn’t trying to say anyone should go down to 30Hz (or up to 30MHz heh) just that I, personally, probably wouldn’t feel much benefit from 120, given I was able to mix up lower refresh rates.

                                                                                                                                                                3. 2

                                                                                                                                                                  You will notice running a desktop at 30Hz. When I got my 4k monitor a few years ago, it turned out my USB-C <-> HDMI adapter could only do 4k@30Hz. It was disturbing ;).

                                                                                                                                                                  1. 2

                                                                                                                                                                    Oh, yeah, I wasn’t arguing for actively downgrading to 30hz, just saying I probably wouldn’t feel much benefit from going to 120 given my rough perception of smoothness. I see how it reads different.

                                                                                                                                                              2. 5

                                                                                                                                                                      I spend >$200 on frying pans, for the same reason others mention shoes and beds. It’s something I use every day, and the slight increase in cost per use is well worth it for a tool I enjoy using.

                                                                                                                                                                      Edit: I’d also like to add that I’m in an economic situation that allows me to consider $200 purchases as “not a huge deal”. I do remember a time of my life when this was emphatically not the case.

                                                                                                                                                                1. 4

                                                                                                                                                                        Funny that – I also care about things like that… which is why I got mine for free from abandoned houses and even, once, from a ditch by the roadside. That is where you’ll find old rusting cast-iron skillets in need of just a bit of TLC with a rotary steel brush, a coat of oil and a bake in the oven. The one I found in the ditch was quite fancy albeit rusty, a large Hackman with a stainless steel handle. How it ended up in that ditch in the Swedish countryside I have no idea; I never saw any mention of an unsolved murder case lacking its evidence in the form of the obviously heavy blunt object used to bash in the skull of the unfortunate victim. It was slightly pitted, but the steel brush made it almost like new. I use these on a wood-fired stove, just what they’re made for.

                                                                                                                                                                        Beds I always made myself (including high wall-mounted, rope-suspended, sailing-ship-inspired ones with retractable ladders which you’d be hard-pressed to find elsewhere), shoes occasionally (basic car-tyre sandals). I find it far more satisfying to spend some time making something from raw or basic materials (beds, sandals) or reviving something from abandonment (cookware, computing equipment, electronics, etc.) than to just plunk down more money. Another advantage is that stuff you made yourself can usually be fixed by yourself as well, so it lasts a long time.

                                                                                                                                                                2. 3

                                                                                                                                                                  I just upgraded my home office monitor for about $30. Suffice to say it’s not 4k, IPS or any of these things considered ‘essential’ for developers. Fourteen years old, it is, however, significantly better and sharper than the monitors most programmers worked on until the 1990s. And they did better work than I ever did.

                                                                                                                                                                  If you like spending money on monitors, be my guest, but if you write a blog insisting others should do the same, I think we should call this article out for what it is: promotion of conspicuous consumption.

                                                                                                                                                                  1. 2

                                                                                                                                                                            If you write graphical applications or websites it makes sense to have something reasonably good, at least with a high pixel density, because if you work on a website only with a loDPI display and try it on a hiDPI display later you will likely be surprised!

                                                                                                                                                                            It doesn’t have to be top notch; the idea is just to get reasonably close to what Apple calls “Retina”. I can find IPS, 27” 4K displays for around €400 on the web.

                                                                                                                                                                    Also, it’s not exactly the same use case but a lot of entry-level phones and tablets have very nice displays nowadays.

                                                                                                                                                                    1. 1

                                                                                                                                                                              if you work on a website only with a loDPI display and try it on a hiDPI display later you will likely be surprised!

                                                                                                                                                                      does this not work both ways?

                                                                                                                                                                      1. 1

                                                                                                                                                                        Not really in my experience. CSS units (px, em) are density-aware and scale nicely, browsers take care to draw lines thinner than one pixel properly, and even downscaling a raster image isn’t always a big deal given how fast modern computers are.

                                                                                                                                                                        1. 3

                                                                                                                                                                          i can only speak for myself, but using a 1024x768 screen for the web has been a pretty poor experience in recent years. a lot of the time fonts are really large and there is a lot of empty space, making for extremely low information density and often requiring scrolling to view any of the main content on a page. sometimes 25-50% of the screen is covered by sticky bars which don’t go away when you scroll. it makes me think web developers aren’t testing their websites on screens like mine.

                                                                                                                                                                          1. 1

                                                                                                                                                                            Some websites really suck, no doubts about it. But web standards are carefully designed to handle different pixel densities and window sizes properly, even if they can’t ensure that websites don’t suck.

                                                                                                                                                                            For example, many bad websites change the default size of the text to something ridiculously big or small. This is a really bad practice. Better websites don’t change the default font size much, or (even better) don’t change it at all, and use em and rem CSS units in order to make everything relative to the font size so the whole website scales seamlessly when zooming in and out.

                                                                                                                                                                            Note that if your browser/operating system is not aware of the pixel density of your display, everything will be too big or too small by default. Basically, zooming in/out is the way to fix it. If you want to test your setup with a reference, well-designed and accessible website you can use some random page on the Mozilla docs.

                                                                                                                                                                            1. 1

                                                                                                                                                                              Some websites really suck, no doubts about it. But web standards are carefully designed to handle different pixel densities and window sizes properly, even if they can’t ensure that websites don’t suck.

                                                                                                                                                                              And you’re saying this means a web designer with a high DPI display can rest assured that his website will look good on a low DPI display, as long as he follows certain practices?

                                                                                                                                                                              Why doesn’t the same apply in the reverse case, where the designer has a low DPI display and wants their website to be usable on a high DPI display?

                                                                                                                                                                              I have to say even the MDN site wastes a lot of space, and the content doesn’t begin until half way down the page. There’s a ton of space wasted around the search bar and the menu items in the top bar, and around the headers and what appear to be <hr>’s.

                                                                                                                                                                              1. 1

                                                                                                                                                                                And you’re saying this means a web designer with a high DPI display can rest assured that his website will look good on a low DPI display, as long as he follows certain practices?

                                                                                                                                                                                Yes I think so. In fact Chrome and Safari have low DPI simulators in their dev tools.

                                                                                                                                                                                Why doesn’t the same apply in the reverse case, where the designer has a low DPI display and wants their website to be usable on a high DPI display?

                                                                                                                                                                                Well it does to some extent, but typically you have to be careful with pictures. Raster images won’t look sharp on high DPI displays unless you’re using things like srcset. Of course it’s absolutely not a deal breaker but it is something to have in mind if you do care about graphics.

                                                                                                                                                                                        In any case, I think the vast majority of web designers are using high DPI displays nowadays.

                                                                                                                                                                                        I have to say even the MDN site wastes a lot of space, and the content doesn’t begin until half way down the page. There’s a ton of space wasted around the search bar and the menu items in the top bar, and around the headers and what appear to be <hr>’s.

                                                                                                                                                                                        Indeed, and the header also wastes a lot of space (though not half the page) on my high DPI 13” display. It’s a bit funny because I didn’t notice it earlier: when I’m looking for something on this website, my eyes just skip the large header and I start searching or scrolling immediately.

                                                                                                                                                                                        But this “big header” effect is less pronounced in “desktop mode”, so you should try to zoom out if the font size isn’t too small for you. I’ve tested it with the device simulator in Safari at about 1220x780 and it does not look that bad to my eyes.

                                                                                                                                                                                1. 1

                                                                                                                                                                                  Well it does to some extent, but typically you have to be careful with pictures. Raster images won’t look sharp on high DPI displays unless you’re using things like srcset. Of course it’s absolutely not a deal breaker but it is something to have in mind if you do care about graphics.

                                                                                                                                                                                  Yeah I guess this is the one area where low DPI displays could be easier to target without personally testing with one. A large image shrunk will look fine, while a small image enlarged will look like dog shit.

                                                                                                                                                                                  The use of high DPI displays by most web designers probably explains why modern sites look so shitty on low DPI displays. But that also means you won’t get fired for making a site that looks shitty on low DPI displays. It also makes sense from a corporate perspective, as high DPI displays are more likely to be used by wealthier people who will be a larger source of revenue, even if low DPI displays are still in widespread use.

                                                                                                                                                                    2. 1

                                                                                                                                                                            A decent monitor lasts a good 3-5 years, possibly longer, but let’s say 3 and be pessimistic. What is a $1,000 monitor worth, as a percentage of your salary over three years? More to the point, what is it as a fraction of the total cost of employing you for three years? According to Glassdoor, the average salary for a software developer in the USA is around $75K. Including all overheads, that likely means that it costs around $150K a year in total to employ a developer. Over three years, that’s $450K. Is a $1,000 monitor going to make a developer 0.2% more productive over three years than a $200 monitor? If so, it’s worth buying.
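
                                                                                                                                                                            As a rough sanity check (a quick sketch of my own using the same ballpark figures, which are estimates rather than real data), the break-even productivity gain comes out just under 0.2%:

                                                                                                                                                                            ```scala
                                                                                                                                                                            object MonitorMath {
                                                                                                                                                                              def main(args: Array[String]): Unit = {
                                                                                                                                                                                val costPerYear    = 150000.0       // rough fully loaded cost (~2x a $75K salary)
                                                                                                                                                                                val years          = 3.0
                                                                                                                                                                                val monitorPremium = 1000.0 - 200.0 // nicer monitor vs. a $200 one
                                                                                                                                                                                val breakEven      = monitorPremium / (costPerYear * years)
                                                                                                                                                                                println(f"break-even productivity gain: ${breakEven * 100}%.2f%%") // ~0.18%
                                                                                                                                                                              }
                                                                                                                                                                            }
                                                                                                                                                                            ```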

                                                                                                                                                                    1. 5

                                                                                                                                                                      I will wait until there is some affirmative reason for me to switch that has to do with my use of software applications on the platform – if I’m not going to switch from Mac to Windows, I’m going to need some strong reason that’s not “we have partly reimplemented Windows 2000s user experience”. I admit that there is a strong reason – and that’s the principle of open/free software. But it’s not strong enough for me.

                                                                                                                                                                      1. 1

                                                                                                                                                                        They had one of these at the Museum of Questionable Medical Devices in Minneapolis. That place was great.