Threads for dvk

    1. 2

      Pizza! Trying out a new dough recipe. Also dealing with my first cold and consequent sinus issues in well over 8 months, had a good run.

      Maybe spend some evening time on my static site generator “Gozer” too. Such a fun input -> output project.

      1. 1

        Trying out a new dough recipe

        What’s the recipe? And did it work?

        1. 2

          I’ll know in 3 hours! Basically, before I was always using an overnight pre-fermentation phase (poolish) coupled with relatively short rise and rest times (~6h) for the final dough. This time I didn’t use pre-fermentation but went for a slow rise (~30h) of the final dough right away.

    2. 3

      How to fix my damn sinus issues. I have vasomotor rhinitis and as long as I don’t get a cold or the flu, it’s fine.

      As soon as I catch either of those, I am unable (literally unable, like zero) to sleep for at least a few nights. It is driving me to the edge.

      1. 3

        I had the same problem. I’ve managed to solve it via:

        • breathing (Wim Hof method, pranayama, running / jogging) – effects were immediate, I could breathe easily for days
        • ear and body candles (no English articles, perhaps you can translate this: http://talp-masszazs.hupont.hu/31/ful-es-testgyertya) – it took me two years to completely clear up my head, ears and face; however this can easily take just weeks or months
    3. 4

      Amazing, I wonder what caused the sudden jump?

      I moved from OSX to Ubuntu + Gnome in 2016, then Arch + Gnome a few years later and then Arch + Sway recently. Been very pleased with the overall experience. Running all this on a €900 Lenovo Yoga with AMD Ryzen 7 and it blows my previous MacBook (which was more than 3x as expensive) out of the water on all fronts.

      1. 16

        It’s simply because it’s not the same initial question from previous years.

        In the past, you could only select one main OS, while in 2022 you could select more than one.

      2. 4

        I wonder what caused the sudden jump?

        I have two bets: WSL2 getting really usable with a working GUI, and the Steam Deck.

        I’m running NixOS on Ryzen 7 5850U, with Gnome.

        1. 1

          I also agree, WSL isn’t exactly “Linux desktop”.

          Anyway, next year I’m going to contribute in the other direction: the new company offers me either Windows or Mac, so count one Linux less :(

          1. 1

            Both run full screen Linux VMs quite well ;) I’m actually not joking. VMware if you can pay, or something like UTM (https://mac.getutm.app/) works great.

            This of course assumes you’re not bound by some OS-specific tools, but if there’s a Win/Mac choice, that’s unlikely.

      3. 1

        I wonder too, it’s a really big jump. Perhaps people gave up on their COVID sourdough starters and started a Linux install instead?

    4. 13

      The best tool to move IMAP accounts is https://imapsync.lamiral.info/ .

      Actual working example, with the host names changed:

      imapsync \
        --host1 outlook.office365.com \
        --ssl1 \
        --user1 user1@example.com \
        --passfile1 "${CONFDIR}/user1-examplecom-passfile" \
        --host2 mail.example.net \
        --ssl2 \
        --user2 user1@example.net \
        --passfile2 "${CONFDIR}/user1-examplenet-passfile" \
        --delete2 \
        --addheader \
        --exclude "(?)Calendar" \
        --exclude "(?)Sync Issues" \
        --exclude "(?)Deleted Items" \
        --exclude "(?)Conversation History" \
        --exclude "(?)RSS Feeds" \
        --exclude "(?)RSS Subscriptions" \
        --noemailreport1 --noemailreport2 \
        --logdir "${LOGDIR}" \
        --logfile "o365-sync.log"
      
      1. 3

        imapsync has never failed me. Used it to transfer my backlog of 20 years worth of emails and it did not miss a single one.

        It seems the author was confused over its price, which I understand looking at the website. For me it always was available in my distro’s package manager, so is the price listed really only for support and perhaps people on Windows then?

        1. 1

          Officially, a license for the software costs €60. However, you don’t have to look too far on the website to find that there’s a copy sitting in /dist/. And like you say, some package managers (including brew) include it.

          The software is released under the No Limit Public License, and there’s this note in the source code:

          Gilles LAMIRAL earns his living by writing, installing, configuring and sometimes teaching free, open, and often gratis software. Imapsync used to be “always gratis” but now it is only “often gratis” because imapsync is sold by its author, your servitor, a good way to maintain and support free open public software tools over decades.

    5. 24

      In the “Why PHP?” Section:

      The answer is simple: because it was there. I’m self-taught, and I don’t have much in the way of formal training. Except maybe for the occasional online course I’ve taken, I have no piece of paper with a stamp on it from a prestigious university that says I can tell computers what to do.

      This is the crux of it, and there’s a lot of implicit things going on in these sentences. First off, there is a clear jab at people who do have degrees and formal training. “Prestigious” is used pejoratively and sarcastically here. This wasn’t the author’s path, so they resent people who did take that path. Of course when you are self taught, you skew towards any tool that can get you up and running the easiest and quickest. Note how I said “up and running” - it’s not the tool that is best in the long run, it’s the tool that gets you a picture on the screen the quickest. By the way, there’s value in that too, but I wouldn’t base all of the dimensions of my evaluation just on something “being there.” Availability is valuable, but it’s not the only valuable quality.

      This is a viewpoint that’s very common in the industry; I personally meet a lot of people who share this mindset. I don’t think it’s entirely wrong, but I think it’s a very limited way of thinking, and the people who hold it tend to be self-righteous like this. I understand that PHP might have been your path, and it might have worked for you. But a tone like this reeks of criticizing and minimizing other people’s path. I get that they feel defensive because people are attacking PHP, but I don’t think that’s what’s causing this philosophy; I think this is many people’s true philosophy under the hood, and this was just an excuse to write about it.

      Where does this philosophy come from though - can anyone name any popular programming language that was designed for “CS graduates?” Python? Java? Javascript? These are the most pragmatic and un-academic languages on Earth. The pragmatists and proudly un-educated have won, why are they claiming to be the ones that are being persecuted?

      Btw, if it’s important to anyone, I don’t have a CS degree, I studied Electrical Engineering. I definitely took CS electives, but I also consider myself mostly self-taught in terms of actual programming and CS. But I don’t knock the academic side of CS, on the contrary I think it’s responsible for every single good idea that makes its way into “practical” programming languages.

      1. 12

        I don’t necessarily have a problem with people using something that gets content on a screen quickly. But I don’t accept the excuse that self-taught means you can’t or shouldn’t grow beyond that. I’m self taught. I don’t even have a college degree and yet I learned Haskell. I can write Rust code. I started out in PHP but I outgrew it eventually. Anyone who can become an expert in PHP can do the same.

        It’s fine to get paid to write PHP. There is code out there that needs maintaining in PHP. But PHP earned its reputation as a deeply unsafe language to develop in, and even with the improvements the language has made, much of that unsafe core still remains.

        1. 5

          While the author of the article is being contemptuous to those with more educated backgrounds, I think you’re doing a bit of the converse here. Programming is as wide as humanity itself; if there’s a way for a computer (for some definition of computer) to accept input and provide output, I can guarantee you that someone will have probably programmed it. There doesn’t need to be a single, good path to programming. Whether your path involves writing PHP, unsafe C, or Haskell, it doesn’t really matter.

          1. 3

            It’s difficult in a comment forum to give an appropriately nuanced take on stuff like this. I didn’t intend to come off as contemptuous. If you get hired to help maintain a PHP codebase then the responsible appropriate thing to do is to work on PHP code. There is no shame or condemnation for it.

            Sometimes though I think people get stuck or pigeonholed as “PHP developer” or “Python developer” and never learn other tools or approaches. I want to encourage those people that they can be more than just $LANG developer. There are better tools than PHP out there that you can use when you get the opportunity. Learning to protect yourself from the flaws of a given language is a valuable skillset. There is no shortage of work for people who became experts in avoiding the pitfalls.

            But, when you have the opportunity, it is hugely valuable to be able to choose a language with fewer pitfalls. Where the defensive programming is less about defending against the language itself and more about defending against the environment your software has to run in. Being able to choose those languages is also a valuable skillset that no one should feel is out of their reach.

            1. 1

              But, when you have the opportunity, it is hugely valuable to be able to choose a language with fewer pitfalls. Where the defensive programming is less about defending against the language itself and more about defending against the environment your software has to run in. Being able to choose those languages is also a valuable skillset that no one should feel is out of their reach.

              Great explanation of this idea.

            2. 1

              It’s difficult in a comment forum to give an appropriately nuanced take on stuff like this. I didn’t intend to come off as contemptuous. If you get hired to help maintain a PHP codebase then the responsible appropriate thing to do is to work on PHP code. There is no shame or condemnation for it.

              I figured which is why I tried to keep my reply soft. I agree with everything you just said.

      2. 11

        Also, a CS degree doesn’t teach you programming anyway. It’s not meant to. It teaches you CS (or at least tries to). You probably self-teach some programming along the way, but it’s hardly a focus of the coursework.

        1. 5

          I don’t think that’s universally true. The first two years of required classes for a CS degree at the universities around me (US) were heavily focused on programming (Java… C++…), and failing any of those would have meant no CS degree.

      3. 8

        But a tone like this reeks of criticizing and minimizing other people’s path.

        I don’t entirely agree with your interpretation, but I will note for sake of irony that this is more or less how un-credentialed (in the sense of not having a degree in CS or other “relevant” field) developers feel for pretty much their entire careers. There’s a huge and powerful trend in tech hiring of prioritizing people who have a degree from one of the handful of trendy top universities, and a feedback loop wherein people who work at major companies help develop and teach “how to pass our interview” courses at those universities. The result is that if you are not someone who has a CS (or other “relevant”) degree you are constantly the odd one out and near-constantly being reminded of it.

        I still feel this coming up on 20 years in to the industry and with a résumé that largely lets me avoid a lot of the BS in interviewing/hiring.

        1. 4

          I still feel this coming up on 20 years in to the industry and with a résumé that largely lets me avoid a lot of the BS in interviewing/hiring.

          (I’m hoping this comment isn’t too off-topic.) I certainly agree. I myself come from one of those trendy elite CS universities (though it’s been a good while at this point) and am well credentialed, but I’ve started to treat a large number of junior engineers fresh out of good schools as a light negative signal when I’m applying to a company. The culture of hiring in software is such that companies often overselect for credentials while ignoring effectiveness, at least in my opinion.

        2. 2

          There’s a huge and powerful trend in tech hiring of prioritizing people who have a degree from one of the handful of trendy top universities

          This depends entirely on your experience. I’ve never seen this trend, I’ve seen quite the opposite - a large chunk of people I work with don’t come from a CS or engineering background. Nor do I see anyone being hired over someone else because of where they went to school.

          1. 3

            The fact that you’ve never seen FAANGs go on-campus at certain universities (but not others) to recruit, help develop “how to pass the interview” curriculum to be taught at certain universities (but not others), etc., doesn’t mean that it doesn’t happen or that it doesn’t have an effect both on their resulting workforce/company culture and on everyone who emulates their hiring (which is unfortunately a large chunk of the industry).

            1. 1

              Let me repeat what I said:

              This depends entirely on your experience.

      4. 7

        can anyone name any popular programming language that was designed for “CS graduates?”

        My glib answer: yes. Go. From Rob Pike, the creator: “The key point here is our programmers are Googlers, they’re not researchers. They’re typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They’re not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt.”

        More seriously, I think this “self-taught persecution” attitude is related to imposter syndrome and the increasing credentialism in society. A former coworker of mine tended towards a self-defense attitude and I chalk that up to his not having a CS degree as a developer (he did, however, have formal training as a jazz musician).

        It’s also not as if PHP is alone in this hate—C is probably hated just as much, if not more, than PHP.

        1. 3

          Yet if you know C well, you’re probably seen as a much better “programmer” than if you knew PHP well. But yeah, perceptions.

        2. 3

          Rent seeking isn’t hate. C dominates everything. So if you invent a new programming language, there’s only two paths to being successful. You can either (1) appeal to the new generation of young people to learn your language instead of C. That’s how PHP and Node did it. Or (2) you can run PR campaigns vilifying C until you’ve convinced enough VPs to convert their departments. That’s how Java did it.

      5. 4

        I agree with everything you said here. I even share your perception that many people seem to be insecure and/or defensive about not having a CS background or not really understanding some of the popular concepts we read about in blog posts, etc. Like you said: The “pragmatists” won- what are they worrying about?

        In some ways, computer programming is very much like engineering– engineering jobs/tasks/people fall on a wide spectrum of “how academic” from “glorified assembly line work” all the way to “indistinguishable from pure science research.” And attitudes similarly span that spectrum. That’s not a bad thing–just an observation.

        What’s extra frustrating to me is that it’s now turned a corner where if you do criticize a language like PHP, you’re seen as an irrational troll who just thinks you’re smarter than everyone else. It’s become anti-cool to shit on PHP, even though it’s still literally the least capable backend language being used today besides maybe Python (no async, no threads, no generics in its tacked-on type system, object equality is not customizable and will crash your program if you have a ref cycle, etc).

        1. 3

          As a developer that has experience building web backends in PHP but also some experience with Go, Python, Ruby (w/ Rails) and Rust, I wonder what language you deem capable as a backend language that still provides the same level of abstraction as, say, Laravel or Symfony. Rails obviously comes to mind, but what else is there (if Python w/ Django is out of the question)?

          1. 3

            Keep in mind that I’m not trying to evangelize “the one true backend language” or anything like that. My main point was just to gripe that there are still (what I consider to be) legitimate criticisms of PHP as a (backend) programming language, but whenever I point them out as reasons that I might suggest someone avoid starting a project in PHP, I’m met with more skepticism and defensiveness than I think is justified.

            The secondary point is that I truly believe that PHP is strictly inferior to other backend programming languages. What I mean by that is that we all work differently and have different preferences: some people will like dynamically typed languages, some will like statically typed; some like OOP, some like FP. That’s all great. Scala is a very different language than Go, which is a very different language than Clojure. But if you look at “modern” PHP projects and “best practices” advice, it’s almost literally the same thing as the “best practices” for Java 7 in 2010, except that PHP doesn’t have good data structures in its standard library, has no threading or async, no generics, etc. And if you compare modern PHP with a recent version of Java… oh, boy! Java now has ADTs, Records, Streams, and a few other really nice additions that make writing code a little less painful.

            So, it’s not just that I think PHP is a bad language, it’s that it’s, IMO, a less capable subset of another language. If it were actually different, then I might just shrug it off as a preference thing. I mean, PHP even stole the class semantics right from Java (single inheritance, interfaces, static methods, abstract classes, final keyword, etc).

            I know people love Laravel, but my contention isn’t that Laravel isn’t good (I don’t honestly have much experience with Laravel, but I have worked with Symfony). But Laravel is not PHP. I’m talking about the language. If there’s something written in PHP that’s so valuable to your project that you choose to work with it, that’s great. But in my opinion, the sane attitude should be one of “Well, I guess it’ll be worth it to deal with PHP so that we can use XYZ.” and not “PHP is a good language and @ragnese is just an anti-PHP troll. What backend language could possibly want async IO or easy fire-and-forget tasks, anyway?”

          2. 3

            I don’t like rails either for slightly different reasons. I would rather develop in Go if I’m creating a backend web service than Python, Ruby, or PHP. I don’t find that I usually need much more abstraction than say gorilla most of the time. And Go doesn’t suddenly surprise me with strange behavior as often.

            1. 2

              The person they’re asking definitely doesn’t use Go though, because of:

              no generics in its tacked-on type system

          3. 3

            Phoenix in Elixir is pretty good — very batteries-included, high-velocity, and extremely good with async/concurrency/parallelism. I don’t know if “object equality is not customizable” applies, it’s immutable with value-based equality at all points, so…

            The tacked-on type system is pretty awful (Dialyzer is well worse than, say TypeScript, mypy, or Sorbet) but I don’t think it’s strictly worse, considering what you get from the framework/library.

          4. 2

            Based on their complaints (“no async, no threads, no generics in its tacked-on type system, object equality is not customizable and will crash your program if you have a ref cycle, etc”), and assuming they’re not using a really esoteric language, I’m willing to bet they use Scala, Java, C#, or Rust. I’m definitely curious about this too.

            Whether or not those have a web framework that competes with Laravel, I’m not sure. That’s also not the measure of success though, for example if people are working with a lot of services a monolithic web framework isn’t as important or even desired. That’s one other thing to consider here - I think it’s accurate to say that PHP is synonymous with building monolithic web apps, and not everyone builds monolithic web apps.

    6. 14

      PHP isn’t dead. That much is obvious if you look around the industry a little bit. I’ve personally never said PHP was dead.

      However PHP lost my trust years ago. I’ll never write another line of PHP nor will I willingly work on maintaining an existing PHP codebase. Fool me once shame on me. Fool me twice… Well you know how that quote ends.

      1. 7

        I would love to hear more about how you lost trust in a language that has grown considerably in the last couple of years (decade?) to be a more “mature” language with a pretty in-depth feature set. I’ve built a career working in PHP (and other languages too, but mostly PHP), so statements like this always pique my interest.

        1. 7

          This is probably in part because it lost my trust several decades ago. There are so many great languages that I can be employed to write that don’t have any of that historical baggage to deal with. Java, Go, Rust, C#. Why go back to a language that, due to backwards compatibility, will always have footguns waiting in the codebase for me? Layering on safer ways to do things can improve a language over time, but unless you also remove the highly unsafe bits as you go, the language will always fundamentally be working against you.

          I have the luxury of not having to choose PHP ever. I’m grateful for it.

          1. 3

            That’s understandable. I began my career with PHP in the early 2000s, migrated towards C/C++, C#, node and eventually Go before coming back to PHP.

            The PHP of today is very much not the language it was ten years ago*, doubly true for 20 years ago. While it’s still my daily driver I do enjoy working in Golang, TypeScript and both React and Vue on the front end.

          2. 1

            It’s very true that you can look at some legacy PHP codebases and wince in terror at all the bad stuff. I agree as well that they need to seriously look at removing some legacy from the codebase - but you have to admit it’s kind of impressive how well they have managed to keep a lot of backward compatibility. I always recommend every PHP developer, especially new ones, read A Fractal of Bad Design to understand how PHP got the reputation it has. But then to also read PHP The Right Way so they can contribute to our larger application codebase efficiently.

            Like everyone else has said though - the language and ecosystem has evolved quite a bit, and when you look at it as, say, middleware between a client (browser) and other services (databases, queues, whatever), you see it fits nicely as a solution for certain problems. If I’m thinking about building a website, or webapp, or simple RESTful service, I will always grab PHP first, but I would never say someone not using it is wrong. I don’t care what brand of hammer you use, as long as we’re hitting nails.

        2. 2

          Seconded. I’ve been working with PHP (but not exclusively PHP) for over a decade and have never experienced anything that would cause me to put down such a statement. Very much interested to hear what happened.

      2. 8

        I don’t do web programming in general, but I won’t even work for a company that uses PHP. I take that as a sign the company does not make good decisions. Cheers.

    7. 5

      Planting some chillies. Looks like they’re going to take a few months to grow, that’s even longer than it takes to compile Firefox from source which I didn’t know was possible :’(

      1. 1

        I wonder, is this due to Servo being written in Rust or was it always slow to begin with?

      2. 1

        Worst case is around 80 minutes on my Lenovo x series laptop. You can get it down to 10 with a regularly updated tree, a decent machine or an sccache cluster.

        Good luck with your chili project. Sounds like a better plan than compiling Firefox in a weekend, for sure :-)

      3. 18

        This just isn’t true. Within the next four weeks, there will be a million cases at this rate.

        The panic is entirely rational. https://bedford.io/blog/ncov-cryptic-transmission/

        The virus isn’t the problem. The problem is that our social structures are at risk of breaking down under the strain of trying to care for everyone.

        1. 19

          Again, you’re not understanding that the risk isn’t the virus. The risk is everything you take for granted breaks down.

          Grocery stores. Hospitals. Public transit.

          Let me ask you this. Who do you think is going to take care of a million patients globally? Are there even enough beds? And when there’s not, then what?

          And that doesn’t even get into the question of what we should do with people who have the virus. Put ‘em in isolation, okay. Then what? People in isolation need food. Who’s going to bring it to them? When are they safe to release? Are they ever safe to release? If not, what then?

          Work out the exponential growth.

          It’d be one thing if the virus was only targeting certain subsets of the population. But it’s targeting all of us. Anyone can be a carrier, even if they’re not at risk of dying.

          If you don’t believe how fast this thing spreads, look at this: https://twitter.com/luisferre/status/1235235076193112070

          Merely driving someone to the hospital was enough. Close proximity = infection.

          1. 10

            My wife said something along the lines of “I have a conference in the summer, I guess we have all had the virus by then and life can go on as usual”.

            I did a quick back-of-the-envelope calculation to show that this is probably not what we want. I saw recent estimates that during a pandemic (which would probably happen if no further measures are taken) an expected ~60% of the population would be infected. I think the current statistics state that ~20% of the ill need to be hospitalized. The Netherlands has ~17 million inhabitants, so roughly ~2 million people would need to be hospitalized. Even if the 20% somewhat overestimated (because cases where COVID-19 is like a mild cold are not reported), this is going to be absolutely brutal. For reference, NL had 37.753 hospital beds in 2017 [1]. Of course, most hospital rooms are not going to be prepared for this scenario (fitted with proper fans, etc.). Besides that, this would also result in ~200,000 deaths (at the 2% mortality rate).

            Everything possible should be done to slow down this corona virus. First to have headway to scale up capacity and to have time to study existing cases to see what therapies are effective and what not (a vaccine is probably still too far away to be helpful).

            At least the officials in our country have been honest about what to expect. They try to avoid panic, but they have also stated in very clear terms that they believe that scenarios where a significant chunk of the population will be infected are realistic.

            [1] https://www.staatvenz.nl/kerncijfers/ziekenhuisbedden

            1. 1

              Besides that, this would also result in ~200,000 deaths (at the 2% mortality rate).

              Nah, way higher. It’s 2% when those who need emergency care do get hospitalized. When the amount of available beds is a rounding error of the amount of needed ones, you’re in deep trouble wrt mortality rate.

            2. 15

              Since the serious cases require respirators and oxygen, there’s a threshold where mortality will increase sharply: when there are no more respirators available doctors are going to have to start making hard decisions.

              If we manage to keep the rate of new cases low enough, this will be not much worse than a normal flu. If the rate is too high, things look bleak for those over 70 years old.

              But it’s not like it’s an airborne ebola with an R0 rate of 5. The current media frenzy (social and traditional media alike) is overblown, IMHO.

      4. 1

        The number of active cases is on a downward trend or stable at ~40,000. Certainly no trend towards a million. https://www.worldometers.info/coronavirus/coronavirus-cases/#active-cases

        1. 6

          That’s because the majority of cases so far have been in China, which appears to have gotten the disease under control through strict lockdown/quarantine policies. If you look at the charts for the rest of the world on the same page, they tell a very different story: lagging behind China in absolute number of cases but a clear exponential growth rate.

    8. 7

      I mean, you’ve already established in past threads that you place a high value on your travel-centric lifestyle.

      Wash your hands, comply with local public health officials, and tell the truth about where you’ve been during border screenings.

      1. 0

        Sitting in SEA for three months doesn’t feel particularly travel-centric. I’m not backpacking. I’m renting an apartment and living here for an entire quarter.

        Also, I’m not sure if the command at the end of your comment was directed specifically at me. If it was, then that’s a pretty weird thing to say to a stranger.

        1. 5

          Ah, I think the ending that said “that’s the best any frequent traveler can do” was eaten by a grue.

          1. 2

            Ah, sorry. Yes, I think you’re right.

            Sorry for being defensive.

    9. 1

      I wonder, do you still hold this opinion?

      1. 0

        Yes. I still hold this opinion.

        I am in Vietnam, where they have done a relatively good job of containing the virus. People here are not self-isolating, and there is absolutely no panic-buying. There’s a seemingly endless supply of toilet paper and hand sanitiser.

        People here are wearing face masks in public places like supermarkets, but not at cafés or the beach.

        People are still working. People are still going into offices every day. We are going into an office every day.

        The sensationalism prevalent in Western society — and apparently also among the members of this forum — is the reason why people are panic-buying and our grandparents are put in a compromised position where they may have to do without basic supplies. I still think many people on this forum are very stupid. I won’t point fingers, but I will say again that the implication that COVID-19 is a life-long affliction is one of the stupidest things I’ve read in this community. We’re all meant to do some kind of scientific work, aren’t we? Doesn’t that mean measuring and looking at real data? Don’t we already have a huge amount of data on the recovery rate? Even if you [not you specifically] do think the panic is rational, surely nobody’s case is helped by just making shit up?

        Naturally I expect to get downvoted into oblivion again, as people are awfully touchy here.

  1. 2

    Battling this flu that is currently still very much winning. Finding someone that is both skilled enough at PHP and JavaScript and willing to sit in my place and take over development and support for my WordPress plugins. The first part on its own is easy, but combining it with the latter makes it incredibly hard. Even though I’m willing to pay (really) well. It strengthens my resolve for getting out of this golden cage, because if others are “picky”, why not me?

  2. 4

    Finalizing my tree-walking interpreter for the Monkey language (written in C, code here: https://github.com/dannyvankooten/monkey-c-monkey-do) from the wonderful “Writing an interpreter in Go” book by Thorsten Ball. Had so much fun building this, I can’t wait to get started on his second book in which the project continues to build a bytecode VM.

    1. 2

      Totally love the name! Now you need to figure out which companion tools you can call “see-no-evil” and “hear-no-evil” :)

  3. 13

    I found this to be a lovely 30-minute read on C’s motivation, history, design, and surprising success. I marked down over 50 highlights in Instapaper.

    If you haven’t written C in a while, you should give it a try once more! Some tips: use a modern build of clang for great compiler error messages; use vim (or emacs/vscode) to be reminded C “just works” in editors; use a simple Makefile for build/test ergonomics.

    In writing loops and laying out data in contiguous arrays and structures, remind yourself that C is “just” functions, data atoms, pointers, arrays, structs, and control flow (plus the preprocessor!)

    Marvel at the timeless utility of printf, a five-decade-old debugging Swiss army knife. Remember that to use it, you need to #include <stdio.h>. As Ritchie laments here, C regrettably did not include support for namespaces and modules beyond a global namespace for functions and a crude textual include system.

    Refresh your memory on the common header files that are part of the standard and needed for doing common ops with strings, I/O, or dynamic heap allocation. You can get a nice refresher on those in your browser here:

    https://man.cs50.io/

    Overall, you’ll probably regain some appreciation for the essence of programming, which C represents not due to an intricate programming language design with an extensive library, but instead due to a minimalist language design not too far from assembly, in which you simply must build your own library, or rely directly on OS system calls and facilities, to do anything useful. There is something to be said for languages so small they can truly fit in your head, especially when they are not just small, but also fast, powerful, stable, ubiquitous, and, perhaps, as timeless as anything in computing can be.

    1. 5

      I don’t think that a lack of namespaces is really something to lament. _ is prettier than ::. For all the handwringing you see about them, I’ve literally never seen a symbol clash in C ever.

      1. 6

        I love C and it continues to be my first language of choice for many tasks, but namespaces are the first thing I’d add to the language if I could. Once programs get beyond a certain size, you really need a setting between “visible to one file” and “visible EVERYWHERE”. (You can get some of the same effect by breaking your code up into libraries, but even then the visibility controls are either external to the language or non-portable compiler extensions!)

        And for the record, I’d prefer an overload of the dot operator for namespaces. Or maybe a single-colon operator – some languages had that before the double-colon became ubiquitous.

      2. 2

        I tend to agree that this isn’t a huge issue in practice, especially since so very many large and well-organized C programs have been written (e.g. CPython, redis, nginx, etc.), and the conventions different teams use aren’t too far apart from one another. As you noted, they generally just group related functions together into files and name them using a common function namespace prefix, like ns_. But, clashes are possible, and it has meant C is used much more as a starting point for bespoke and self-contained programs (again, CPython, redis, and nginx are great examples), rather than as a programming environment to wire together many different libraries, as is common in Python, or even Go.
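
        To make that concrete, here’s a minimal sketch of the convention (the “net_” prefix and the function names are made up for illustration): a shared prefix stands in for the missing namespace, and static gives you the “visible to this file only” level dmr mentions below.

            /* The "net_" prefix acts as a poor man's namespace; `static` hides
               file-local helpers from every other translation unit. */
            #include <stdio.h>

            static int net_resolve(const char *host) {  /* internal: invisible outside this .c file */
                return host ? 1 : 0;                    /* stand-in for a real lookup */
            }

            int net_connect(const char *host, int port) {  /* external: the prefix avoids clashes */
                printf("connecting to %s:%d (resolved=%d)\n", host, port, net_resolve(host));
                return 0;
            }

            int main(void) {
                return net_connect("example.com", 80);
            }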

        As dmr describes it in the OP, this is just a “smaller infelicity”.

        Many smaller infelicities exist in the language and its description besides those discussed above, of course. There are also general criticisms to be lodged that transcend detailed points. Chief among these is that the language and its generally-expected environment provide little help for writing very large systems. The naming structure provides only two main levels, ‘external’ (visible everywhere) and ‘internal’ (within a single procedure). An intermediate level of visibility (within a single file of data and procedures) is weakly tied to the language definition. Thus, there is little direct support for modularization, and project designers are forced to create their own conventions.

        1. 3

          I don’t really think that namespaces are the reason people don’t use C for gluing together lots of other C programs and libraries. I think people don’t do that in C because things like Python and Bash are a million times more suitable for it in a million different ways, only one of which is namespaces.

          Large systems don’t need to all be linked together with one big ld call. Large systems should be made up of small systems interacting over standardised IPC mechanisms, each of which of course have their own independent namespaces.

          There’s also the convention we see of lots of tiny files, which is probably not actually necessary today. It made more sense in the days of centralised version control and global file locking, when merging changes from multiple people was difficult or impossible and one person working on a file meant nobody else could. But today, most modules should probably be one file. Why not?

          OpenBSD drivers, for example, are usually a single .c file, and they recommend that people porting drivers from other BSDs merge all the files for that driver into one. I actually find this easier to understand: it’s easier for me to navigate one file than a load of files.

    2. 4

      If you haven’t written C in a while, you should give it a try once more! Some tips: use a modern build of clang for great compiler error messages; use vim (or emacs/vscode) to be reminded C “just works” in editors; use a simple Makefile for build/test ergonomics.

      I am going through the Writing An Interpreter In Go book but in C (which is totally new to me, coming from a JavaScript background) and it’s been the most fun I had in years. I’m actually starting to get quite fond of the language and the tooling around it (like gdb and valgrind).

      1. 2

        I recommend you take a look at http://craftinginterpreters.com as well, if you want something similar for C. The book is in two parts: the first part a very simple AST-walking interpreter written in Java, the second part a more complex interpreter that compiles the language to bytecode and has a VM, closures, GC, and other more complicated features, written in C. If you’ve already read Writing An Interpreter In Go you can probably skip the Java part and just go straight to the C part.

        1. 3

          Thanks, I will (after I’m done with this). I actually really liked that the book is in Go but my implementation is in C, as it made it a bit more exciting for me to think about how I would structure things in C and see what the tradeoffs are versus doing it in Go. Otherwise I’d be tempted to skip entire chapters and just re-use the author’s code, which obviously doesn’t help if my goal is to learn how it’s actually done.

    3. 4

      so small they can truly fit in your head

      Very true. One thing I’ve noticed, going to C from Rust and especially C++, is how little time I spend now looking at language docs, fighting the language or compiler itself, or looking at code and wondering, “WTF does this syntax actually mean?”

      There’s no perfect language though. I do pine sometimes for some of the fancier language features, particularly closures and things which allow you to express concepts directly in code, like for(auto i : container_type) {...} or .map(|x| { ...}).

      1. 1

        One thing I’ve noticed, going to C from Rust and especially C++, is how little time I spend now looking at language docs, fighting the language or compiler itself, or looking at code and wondering, “WTF does this syntax actually mean?”

        It’s also really nice being able to type:

        man $SOME_FUNCTION
        

        to get the documentation for any function in the standard library (and many others not in the standard library). I do a lot of my development on the train (without access to the internet) and man pages are my best friend.

        On the topic of “wtf does this syntax actually mean” I do think C has some pain points. East const vs west const is still a point of confusion for many, and C’s function pointer syntax will melt your brain if you stare at it too long.

        At one point I wrote a C backend for a compiler I was working on and needed to really understand how declarations worked. I found that this article does a really good job explaining some of the syntax insanity.
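
        For anyone who hasn’t hit these yet, a small sketch of both pain points (variable and function names are made up): the two const spellings that mean the same thing, the one that doesn’t, and the typedef trick that usually tames function-pointer declarations.

            #include <stdio.h>

            /* East vs west const: the first two declarations mean the same thing. */
            const char *a = "hi";   /* pointer to const char          */
            char const *b = "hi";   /* also a pointer to const char   */
            char buf[]   = "hi";
            char *const  c = buf;   /* const pointer to mutable char  */

            static int by_value(const void *x, const void *y) {
                return *(const int *)x - *(const int *)y;
            }

            int main(void) {
                /* Raw function-pointer syntax... */
                int (*raw)(const void *, const void *) = by_value;

                /* ...and the typedef that makes it readable. */
                typedef int (*cmp_fn)(const void *, const void *);
                cmp_fn cmp = by_value;

                int lo = 1, hi = 2;
                printf("%s %s %s -> %d %d\n", a, b, c, raw(&lo, &hi), cmp(&lo, &hi));
                return 0;
            }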

    4. 4

      If anyone is looking to give modern C a try I would recommend reading How to C. It’s a short article that bridges the gap between C from K&R Second Edition and C in $CURRENT_YEAR. The article doesn’t cover the more nuanced details of “good” C programming, but I think that K&R + How to C is the best option for people who are learning the C language for the first time.
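
      For flavour, here’s the kind of style that sort of advice tends to push you towards (my own sketch, not an excerpt from the article): fixed-width types from <stdint.h>, bool from <stdbool.h>, and variables declared where they’re used.

          #include <stdbool.h>
          #include <stddef.h>
          #include <stdint.h>
          #include <stdio.h>

          /* "Modern" C flavour: explicit widths, bool, declarations at point of use. */
          static bool all_positive(const int32_t *xs, size_t n) {
              for (size_t i = 0; i < n; i++) {
                  if (xs[i] <= 0) {
                      return false;
                  }
              }
              return true;
          }

          int main(void) {
              int32_t xs[] = {3, 1, 4, 1, 5};
              printf("all positive: %s\n", all_positive(xs, sizeof xs / sizeof xs[0]) ? "yes" : "no");
              return 0;
          }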

      1. 2

        Awesome recommendation! As someone who is picking up C for some programming fun & tasks again after a decade-long hiatus (focused on other higher-level languages), this is super useful for me. I have been re-reading K&R 2nd Ed and been looking for something akin to what you shared in “How to C”.

        I also found these two StackOverflow answers helpful. One, on the various C standards:

        https://stackoverflow.com/questions/17206568/what-is-the-difference-between-c-c99-ansi-c-and-gnu-c/17209532#17209532

        The other, on a (modern) set of useful reference books:

        https://stackoverflow.com/questions/562303/the-definitive-c-book-guide-and-list/562377#562377

  4. 4

    Helping my brother renovate his home and hacking away on a static site generator I’m building in C. In between the usual parental obligations that come with having 2 toddlers to care for.

    Current working name for the site generator is Coconut, unless I can think of something better.

    Enjoy your weekends everyone!

  5. 0

    It perhaps is easier for you as a developer, but I’d argue that a lot of web applications should be local software that can be used without an internet connection and GBs of RAM. Let’s not forget what the web is doing with regards to surveillance capitalism (i.e. messing up democracies because it makes some people money) and climate change.

    1. 6

      Isn’t that less a problem with the web platform and more of a problem with specific web sites? Additionally, surveillance conducted on the user of a web site is much easier to see, simply by dint of the developer tools built into all browsers. The effort you need to put into seeing what a desktop application is phoning home about is much greater.

    2. 5

      In some cases, that’s completely true. But for some other use cases, I refuse to install software when the service could equivalently be used as a website (this especially applies to mobile apps). Also, be aware that applications are as capable of surveillance (if not more so) as web apps. The app that hosts all others for most people, Windows, is actually a spying machine.

  6. 2

    Probably Rust, although I wish the compiler could be (a lot) faster.

  7. 6

    Learning C, and a game of paintball with friends because one of them is becoming a dad, followed up by an Indian restaurant which will probably end with me spending my Sunday on the toilet. :-)

    1. 4

      There’s a resurgence of interest in C apparently, looking through this thread. Kinda inspiring me to rehash it myself. After I’m done with Haskell, maybe I should go back to the kilo project I abandoned. How are you doing your learning? Do you have a structure in mind, or like a book or something?

      PS - As for the Indian restaurant, apart from asking for it to be less spicy, I suggest getting a generous helping of “ghee” and finishing your meal with a “dahi” dish. The restaurant should know those words. That’s how my folks keep the burn in check :)

      1. 3

        That’s how it feels to me too, my Mastodon feed is full of people picking up C (again). Personally I just want to broaden my understanding and having recently picked up Rust, better see what it attempts to improve on. I’m going by the book, doing the basic exercises and then hopefully a small toy project.

        Thank you for the tips! I actually really do like spicy (Vindaloo is my favorite) and normally the afterburn isn’t that bad for me, but this particular restaurant somehow had that effect last time. It’s a restaurant in The Netherlands so my guess is that they’ve toned down all their dishes a lot compared to “the real deal”. Definitely going to try the dahi finisher!

        1. 2

          You’re welcome :) and btw it’s been a long time since I read that book, but I remember it being a really good one - so compact, simple and well written! It’s like the programming equivalent of The Elements of Style. Wish you well on your endeavor!!

  8. 5

    Is this motherfuckingwebsite.com clocking in at 5 kB in total really that bad in comparison? I don’t think so.

    You’ve got the number wrong:

        <!-- yes, I know...wanna fight about it? -->
        <script>
          (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
          (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
          m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
          })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
        
          ga('create', 'UA-45956659-1', 'motherfuckingwebsite.com');
          ga('send', 'pageview');
        </script>
    

    This is something I fail to understand. He doesn’t even use CSS to prevent the text from spreading over the entire width of the screen, but then happily references a JS blob that amounts to 44 KB to spy on the users. Káže vodu, pije víno (“he preaches water, drinks wine”).

    1. 3

      Haha, shoot. The script was triple blocked: first by uBlock, then by uMatrix, then by my Pi Hole. So it did not show up on my browser’s Network tab…

      And yes, I’m with you there. 44 kB… Nearly 10 times the size of the rest of that site. For what I think is vanity.

      1. 5

        What I found particularly amusing about this was the comment. It’s like if someone saw him get into his private airplane a few minutes after he gave an emotional talk about why people should take extreme measures to lower their carbon footprint, and he just stood there with a guilty face: “I know, I know…”.

        It creates an illusion that there’s nothing we could do, practically speaking. We could theoretically build better websites, but not even the strongest advocates do actually bother.

        I don’t actually think using GA is that bad. I don’t like it on a personal/ideological level, but I’m not fanatical about it and can see myself using it too in some scenarios. Here, it’s all about the contrasts: no styling, the page that’s to be perceived as ugly and boring by many; everything is as minimalistic as it can be. And then bum, let’s load 44 KB worth of some JS blob.

        Had the objective been to criticize the worst-of-worst bloated websites that take hundreds of milliseconds to load on a modern computer with a decent connection for seemingly no reason and demonstrate that things can be simpler (on something people could actually imagine using; such as a news portal or magazine, which commonly contain the most bloatware), then it wouldn’t be such a big deal to add some extra 40 KB. But taking extreme measures only to throw any advantage away a few seconds later doesn’t make much sense.

        Oh, and an interesting article of yours (I forgot to mention).

  9. 11

    I’m very skeptical of the numbers. A fully charged iPhone has a battery of 10-12 Wh (not kWh), depending on the model. You can download more than one GB without fully depleting the battery (in fact, way more than that). The 2.9 kWh per GB is totally crazy… Sure, there are towers and other elements to deliver the data to the phone. Still.
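
    To put a rough number on how far off that is for the endpoint alone (battery capacity here is my assumption of ~11 Wh):

        #include <stdio.h>

        int main(void) {
            const double battery_kwh = 0.011;  /* ~11 Wh, one full phone charge   */
            const double claim_kwh   = 2.9;    /* the contested kWh-per-GB figure */

            /* How many full phone charges would a single GB cost at that rate? */
            printf("1 GB at %.1f kWh/GB = ~%.0f full phone charges\n",
                   claim_kwh, claim_kwh / battery_kwh);
            return 0;
        }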

    The referenced study doesn’t show those numbers, and even their estimation of 0.1 kWh/GB (page 6 of the study) is taking into account a lot of old infrastructure. On the same page they talk about numbers from 2010, but even then the consumption using broadband was estimated at 0.08 kWh/GB and only 2.9 kWh for 3G access. Again, in 2010.

    Using that figure for 2020 consumption is totally unrealistic to me… It’s probably a factor of at least 30 times less… Of course, this number will keep going down as more efficient transfers are rolled out, which seems to be happening already, at an exponential rate.

    So don’t think that shaving a few kbytes here and there is going to make a significant change…

    1. 7

      I don’t know whether the numbers are right or wrong, but I’m very happy with the alternative direction here, and another take at the bloat that the web has become today.

      It takes several seconds on my machine to load the website of my bank, a major national bank used by millions of folks in the US (Chase). I looked at the source code, and it’s some sort of encrypted (base64-style, not code minimisation style) JavaScript gibberish, which looks like it uses several seconds of my CPU time each time it runs, in addition to making the website and my whole browser unbearably slow, prompting the slow-site warning to come in and out, and often failing to work at all, requiring a reload of the whole page. (No, I haven’t restarted my browser in a while, and, yes, I do have a bunch of tabs open — but many other sites still work fine as-is, but not Chase.)

      I’m kind of amazed how all these global warming people think it’s OK to waste so many of my CPU cycles on their useless fonts and megabytes of JavaScript on their websites to present a KB worth of text and an image or two. We need folks to start taking this seriously.

      The biggest cost might not be the actual transmission, but rather the wasted cycles from having to rerender complex designs that don’t add anything to the user experience — far from it, make it slow for lots of people who don’t have the latest and greatest gadgets and don’t devote their whole machine to running a single website in a freshly-reloaded browser. This also has a side effect of people needing to upgrade their equipment on a regular basis, even if the amount of information you require accessing — just a list of a few dozen of transactions from your bank — hasn’t changed that much over the years.

      Someone should do some math on how much a popular bank contributes to global warming with its megabyte-sized website that requires several seconds of CPU cycles to see a few dozen transactions or make a payment. I’m pretty sure the number would be rather significant. Add to that the amount of wasted man-hours of folks having to wait several seconds for the pages to load. But mah design and front-end skillz!

      1. 3

        Chase’s website was one of two reasons I closed my credit card with them after 12 years. I was traveling and needed to dispute a charge, and it took tens of minutes of waiting for various pages to load on my smartphone (Nexus 5x, was connected to a fast ISP via WiFi).

        1. 2

          The problem is that Chase, together with AmEx, effectively have a monopoly on premium credit cards and travel rewards. It’s very easy to avoid them as a bank otherwise, because credit unions often provide a much better product, and still have outdated-enough websites that simply do the job without whistling at you all the time, but if you’re into getting the best out of your travel, dealing with the subpar CPU-hungry websites of AmEx and Chase is often a requirement for getting certain things done.

          (However, I did stop using Chase Ink for many of my actual business transactions, because the decline rate was unbearable, and Chase customer service leaves a lot to be desired.)

          What’s upsetting is that with every single redesign, they make things worse, yet the majority of bloggers and reviewers only see the visual “improvements” in graphics, and completely ignore the functional and usability deficiencies and extra CPU requirements of each redesign.

    2. 9

      Sure, there are towers and other elements to deliver the data to the phone. Still.

      Still what? If you’re trying to count the total amount of power required to deliver a GB, then it seems like you should count all the computers involved, not just the endpoint.

      1. 4

        “still, is too big of a difference”. Of course you’re right ;-)

        The study estimates the consumption as 0.1 kWh in 2020. The 2.9 kWh is an estimation in 2010.

        1. 2

          I see these arguments all the time about “accuracy” of which study’s predictions are “correct” but it must be known that these studies are predictions of the average consumption for just transport, and very old equipment is still in service in many many places in the world; you could very easily be hitting some of that equipment on some requests depending on where your data hops around! We all know an average includes many outliers, and perhaps the average is far less common than the other cases. In any case, wireless is not the answer! We can start trusting numbers once someone develops the energy usage equivalent of dig

      2. 3

        Yes. Let’s count a couple.

        I have a switch (an ordinary cheap switch) here that’ll receive and forward 8Gbps on 5W, so it can forward 3600000 gigabytes per kWh, or 0.0000028kWh/GB. That’s the power supply rating, so it’ll be higher than the peak power requirement, which in turn will be higher than the sustained, and big switches tend to be more efficient than this small one, so the real number may have another zero. Routers are like switches wrt power (even big fast routers tend to have low-power 40MHz CPUs and do most routing in a switch-like way, since that’s how you get a long MTBF), so if you assume that the sender needs a third of that 0.1kWh/GB, the receiver a third, and the networking a third, then… dumdelidum… the average number of routers and switches between the sender and receiver must be at least 10000. This doesn’t make sense.

        The numbers don’t make sense for servers either. Netflix recently announced getting ~200Gbps out of its new hardware. At 0.03kWh/GB, that would require 22kW sustained, so probably a 50kW power supply. Have you ever seen such a thing? A single rack of servers would need 1MW of power.

        1. 1

          There was a study that laid out the numbers, but the link seems to have died recently. It stated that about 50% of the energy cost for data transfer was datacenter costs, and the rest was spread out thinly over the network to get to its destination. Note that datacenter costs do not just involve the power supply for the server itself, but also all related power consumption like cooling, etc.

          1. 2

            ACEEE, 2012… I seem to remember reading that study… I think I read it when it was new, and when I multiplied the numbers in that with Google’s size and with a local ISP’s size, I found that both of them should have electricity bills far above 100% of their total revenue.

            Anyway, if you change the composition that way, then at least 7000 routers/switches on the way, or else some of the switches must use vastly more energy than the ones I’ve dealt with.

            And on the server side, >95% of the power must go towards auxiliary services. AIUI cooling isn’t the major auxiliary service, preparing data to transfer costs more than cooling. Netflix needs to encode films, Google needs to run Googlebot, et cetera. Everyone who transfers a lot must prepare data to transfer.

    3. 4

      I ran a server at Coloclue for a few years, and the pricing is based on power usage.

      I stopped in 2013, but I checked my old invoices and monthly power usage fluctuated between 23.58kWh and 18.3kWh, with one outlier at 14kWh. That’s quite a difference! This is all on the same machine (little Supermicro Intel Atom 330) with the same system (FreeBSD).

    This is from 2009-2014, and I can’t go back and correlate this with what the machine was doing, but fluctuating activity seems the most logical explanation? Would be interesting if I had better numbers on this.

    4. 2

      With you on the skeptic train: would love to see where this estimate:

      Let’s assume the average website receives about 10.000 unique visitors per month

      it seems way high. We’re probably looking at a Pareto distribution, and I don’t know if my intuition is wrong, but I have the feeling that your average WordPress site sees far fewer visitors than that.

      Very curious about this now, totally worth some more digging

  10. 6

    This has never crossed my mind before, thank you for getting me (and others) thinking about it :)

    Now I wonder what the power overhead of interpreting PHP is over a language that gets turned into native code AOT.

    1. 6

      You may enjoy this article (Lobsters discussion) on the energy usage of various languages. PHP isn’t the best but it isn’t the worst, either.

    2. 5

      That’s awesome - I’m glad it was of value! Me neither. I stumbled upon a number that said a GB of data costs about 5 kWh to transfer (about half of it was datacenter, rest spread out across the network) and it blew my mind. If my home network was that inefficient it would mean an hour of streaming House of Cards in Ultra HD is just as bad as spending that same time in a moving gasoline car…. Luckily, that number seems way too high nowadays and fixed broadband connections are a lot more efficient.

      And yeah, Rasmus Lerdorf gave a talk a few years ago about the CO2 savings if the entire planet updated to PHP 7. Here’s a link to the relevant section. TLDR: at 100% PHP 7 adoption, 7.5B kg less CO2 would be emitted.

      1. 5

        I stumbled upon a number that said a GB of data costs about 5 kWh to transfer

        Anyone else think this sounds wildly implausible? The blog in question estimates 2.9 kWh based on 3G, and even that seems absurd imo.

        Per here, 2017 IP traffic amounted to 1.5ZB, or about 171,100,000 GB per hour. At 5 kWh per GB that would work out to ~7500 TWh, or about 1/3 of 2017’s energy consumption being spent on data transfer alone.

        This would be more energy than we spend on all transportation of people and goods, worldwide combined (about 26% of energy consumption worldwide, per the EIA here).

        It’s hard to directly find data breaking down energy usage by segment to the point where you could directly pin a number on “IP data transfer” by itself (which in and of itself raises questions about where this 5 kWh came from), but just looking at the breakdowns I can find, 5 kWh doesn’t seem to pass the smell test.
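
        For what it’s worth, the smell test is easy to put into code (the 1.5 ZB and 5 kWh/GB figures are the ones quoted above; the ~22,000 TWh of world electricity generation in 2017 is my own rough figure):

            #include <stdio.h>

            int main(void) {
                const double traffic_gb       = 1.5e12;   /* 1.5 ZB of IP traffic in 2017, in GB  */
                const double claim_kwh_per_gb = 5.0;      /* the figure being sanity-checked      */
                const double world_elec_twh   = 22000.0;  /* rough 2017 world electricity, in TWh */

                double hourly_gb = traffic_gb / (365.0 * 24.0);          /* ~171 million GB/hour */
                double total_twh = traffic_gb * claim_kwh_per_gb / 1e9;  /* kWh -> TWh           */

                printf("~%.0fM GB/hour; implied %.0f TWh/year, i.e. %.0f%% of world electricity\n",
                       hourly_gb / 1e6, total_twh, 100.0 * total_twh / world_elec_twh);
                return 0;
            }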

        1. 6

          That was my reaction too, but I think the main thing is that this study was old (2007 or so). It was this study, although the link seems to have died very recently.

          I actually just found another study that seems more up to date and seems credible: Electricity Intensity of Internet DataTransmission. Main line (according to me):

          This article derives criteria to identify accurate estimates over time andprovides a new estimate of 0.06 kWh/GB for 2015. By retroactively applying our criteria toexisting studies, we were able to determine that the electricity intensity of data transmission(core and fixed-line access networks) has decreased by half approximately every 2 yearssince 2000 (for developed countries), a rate of change comparable to that found in theefficiency of computing more generally.

          1. 2

            Oh nice, thanks for the link! That makes a lot more sense to me just squinting at power consumption of various parts of the chain.

  11. 27

    Was anyone else surprised Bram works on Google Calendar?

    1. 14

      I’ve been using Vim for almost 20 years and had absolutely no idea (1) what Bram looked like or (2) that he worked for Google.

    2. 3

      Definitely.

      Though I shouldn’t be, it seems like they hired a ton of the previous generation of OSS devs: thinking of things like vim, afl (uncertain, though the repo is under Google’s name now), kismet, etc.

      1. 2

        It’s just not what I would’ve guessed would be the highest and best use of his talents.

        I’m not saying I believed he was working on vim, I know better than that. I’m just surprised it was something so…ordinary and corporate.

    3. 3

      Yes! And that he sounds as if Google is still a start-up and not one of the biggest companies in the world. Had to check the date of the article. Of course it doesn’t feel like a startup, Bram…

      1. 2

        Maybe he means Google Zurich, which seems to have expanded by a lot lately?

    4. 2

      Me, honestly.

  12. 4

    Challenge 7-25 (not necessarily all this week) of adventofcode.com, in Rust.