1. 18

    From the License Agreement:

    You may install and use any number of copies of the software only with Microsoft Visual Studio, Visual Studio for Mac, Visual Studio Code, Azure DevOps, Team Foundation Server, and successor Microsoft products and services (collectively, the “Visual Studio Products and Services”) to develop and test your applications

    Nice to see that Microsoft has gone full circle with LSP: from proposing to solve the NM problem, to admitting that it is NM+M, to now saying N = 1, and it’s VSCode. Guess that is just another way to approach the issue: instead of the famous EEE, this is Create, Extend, Extinguish.

    1. 5

      In what way is LSP NM+M? It still is N+M in my book…

      1. 3

        Not sure what zge meant, but a fun observation about M+N is that VS Code itself has a dedicated plugin for every language. There’s no universal LSP support in VS Code per se. To use Rust, you need one extension, to use Dart, you need another, etc. These extensions are not trivial.

        1. 4

          I wrote a language server for a niche language and wrote an extension for it for VS Code. The bare extension was very easy to write: I copy pasted their example code and changed the name of the executable.

          The cool thing was that I was able to then extend the VS Code extension and add many features to it over the Language Server.

          The code is here: https://github.com/rabix/benten/tree/master/vscode-client

          1. 3

            Yeah I agree it’s super weird that the flagship editor for LSP doesn’t have one-line config for it. I think it’s to make sure there are as many extensions as possible because it looks better, and it encourages people to add custom features to their extensions (“I already have an extension, why not add to it?”). Meanwhile in nvim it takes me one line to add an LSP server for a niche language.

            1. 3

              I don’t find this strange, I think that’s the right setup to get the best user experience. Languages are different, you can lowest-common-denominator 80% of the things, but the other 20% matter for people who are working with the tool extensively.

              What VS Code does is make this task embarrassingly parallel: folks who care about support for language X can implement a perfect experience for that language with zero upstream coordination.

        2. 1

          What do NM, NM+M, and N=1 mean here?

          1. 5

            They have some unescaped asterisks turning part of the text italic. They refer to N*M meaning N code editors have to implement support for M languages, so there ends up being N*M implementations. Not sure about N*M+M.
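
A quick sketch of the arithmetic the notation stands in for (the numbers here are made up for illustration):

```python
# N editors, M languages.
editors, languages = 10, 20

# Without a shared protocol: one integration per (editor, language) pair.
without_lsp = editors * languages

# With LSP as pitched: N clients plus M servers.
with_lsp = editors + languages

print(without_lsp)  # 200
print(with_lsp)     # 30
```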

            1. 2

              It’s funny how this still works even with markdown eating the asterisks because multiplication is often written as just two variables together without any punctuation :D

        1. 25

          It needs to be said that the people behind Soapbox and Spinster are TERFs. Alex Gleason is a TERF, and if I need to I can pull up plenty more receipts, but I’d rather not link to their stuff.

          Now is a really bad time to be a trans person, their basic civil rights are under attack, it’s getting hard for them to use the bathroom (i.e. go outside) or get basic medical care in part because of people like this coordinating harassment campaigns. We really can’t support people who spend their time creating forums to attack trans people, and using or promoting their software is definitely part of that.

          1. 4

            That’s sad. I keep seeing Pleroma recommended as an alternative to the bloat and operational complexity of Mastodon, but whenever I’ve looked, the install documentation is just abysmal.

            This is such a tricky problem space. I initially had the thought that “Oh hey it’s FLOSS, somebody who wasn’t a hatemonger could fork a new project and inherit their work” but is it fruit from the tainted tree at that point?

            ¯\_(ツ)_/¯

            1. 8

              How about Honk instead?

              Even lighter than Pleroma, and I do not know anything to the author’s detriment.

              EDIT: Also, someone could do a hard fork of Soapbox, and maybe that would be ok, but this is promoting the project itself, not a hard fork thereof.

              1. 4

                Looks neat but… No stars? That’s… Unfortunate. I don’t love the idea of not being able to say “Hey I really like your post!”.

                I mean, I get it, it removes the whole negative dopamine release loop but… Hrrm.

                It DOES look deliciously light and easy to set up!

                1. 4

                  I don’t love the idea of not being able to say “Hey I really like your post!”.

                  You can always say that directly, usually means a lot more than just a number.

                  1. 4

                    You know that’s a REALLY good point.

                    If you REALLY like the post, that warrants a comment, and might spur some good conversation which wouldn’t happen if you just starred and moved on.

                    I DO think removing that quick dopamine hit from the “like” response could lead to consuming the content in a more thoughtful, meaningful way.

                    I’ll add setting Honk up to my projects list :)

                    1. 1

                      Yeah it’s quite easy; if you use NixOS at all, I have a module set up to configure/enable it.

                      1. 1

                        I’d be interested in that module if you could upload it somewhere

                        1. 1

                          Yeah sure, let me go paste it to my GitHub.

              2. 1

                Its configuration isn’t great. I have a Dockerfile that runs it and SoapBoxUI:

                https://github.com/sumdog/bee2/tree/master/dockerfiles/SoapBox

                Mastodon is easier to scale up with zero-downtime releases (you can launch more web or sidekiq containers) but it’s a hog and not as robust as Pleroma. Pleroma has some better rules engines, but they’re more complex to configure. I hope with the SoapBox fork that Alex tries to do standard release containers and make the configuration easier. He’s done a lot of work.

            1. 12

              I agree that Ada is very much benefiting from the recent new interest in systems programming. A tag sounds reasonable.

              1. 5

                …Ada is very much benefiting from the recent new interest in systems programming.

                This is a fantastic point, thanks for making it. My personal impression has been exactly this, and it’s identifiable as such now that you’ve articulated it with clarity. Hopefully the heritage of Ada will continue to have a positive influence even as Ada itself continues to grow and thrive.

                1. 2

                  recent new interest in systems programming

                  In what sense is it “recent”? But yes, a tag would be good.

                  1. 5

                    There’s a new wave of systems programming languages in the last 5-10 years: Go as a starter, but then Swift, Rust, Nim, Zig and such, all serious contenders in that space. Before that, systems programming was the thing you did only if you must; now it’s becoming a much stronger field.

                    Before that happened, there was no way past C or C++, but now there’s a space where people actually consider switching away from those. I know of at least one place where that evaluation was actually “Ada vs. Rust”, and that speaks to how well Ada is doing in that situation.

                    Essentially: Rust makes people remember Ada.

                1. 2

                  I really like the idea of Guix and the GNU Operating System, but does it have an option to use the HURD kernel instead of Libre Linux?

                  I like that QEMU images are available and that it’s multi-arch, so I may give it a test this weekend.

                  1. 3

                    I really like the idea of Guix and the GNU Operating System, but does it have an option to use the HURD kernel instead of Libre Linux?

                    Of course :) https://guix.gnu.org/en/blog/2020/deprecating-support-for-the-linux-kernel/ (note when it was published).

                  1. 7

                    Doom has an active and helpful Discord. It’s actually the best place to talk about Emacs in general that I’ve found.

                    This is kind of sad to read. Is it related to Discord or the specific community?

                    1. 14

                      Hi I’m a moderator in the doom discord community. If it’s any consolation I have been trying to steer people toward using other resources available, particularly the mailing list.

                      I think we’ve grown the way we have because we’re not very strict about what the topic at hand is, and understand that most users coming to Doom Emacs (and I expect Spacemacs as well) are coming not from Emacs but from vim or vim-likes, and like to talk about lots of things besides Emacs. This includes other text editors, games, operating systems, and more. Being off-topic is on topic in a lot of cases. Our community is also younger (at 33 I’m probably one of the oldest in the server), with lots of college and high school kids whose online social life has been won by Discord. We’re simply more approachable to them than most other Emacs communities.

                      A lot of credit for the helpfulness can also go to Henrik who is both patient and gracious with new users and eager to help wherever he can. He sets a very friendly tone and I don’t think I’ve ever seen him troll, lighthearted or otherwise (thinking back I don’t think I’ve ever registered a swear word from him).

                      1. 5

                        If it’s any consolation I have been trying to steer people toward using other resources available, particularly the mailing list. […] Our community is also younger […] with lots of college and high school kids whose online social life has been won by Discord. We’re simply more approachable to them than most other Emacs communities.

                        That is nice to know. As someone who is just around the “Discord generation” (22), I fear that I would have gotten caught up in that development. Emacs in particular was essential to my appreciation of Free Software, which is why I care about this. The other reason is that the dominance of Discord is something I often resent, as I get excluded from communities I could be interested in participating in because of my own principles.

                        1. 2

                          The other reason is that the dominance of Discord is something I often resent, as I get excluded from communities I could be interested in participating in because of my own principles.

                          In the linked comment you mentioned your hesitation to use an Electron application. Have you considered trying the Discord-Matrix bridge?

                          1. 2

                            I could use it (even if Matrix is a bit too slow for my taste), but my point there was not the specific server, as I don’t use Doom, but the general culture of organizing communities around Discord, a platform I would like to have nothing to do with in itself.

                            1. 2

                              The discord-matrix bridge allows you to communicate with a community using Discord while giving you the option of using a matrix client, which is better than being forced to use the discord client. But this requires cooperation from the moderators of the discord guild, and still doesn’t solve the problem of Discord interfering with communications on what is fundamentally their platform.

                              1. 1

                                I read that using a third party client can get your account banned for life since it is against the terms of service.

                            2. 2

                              That matches my experience. Henrik is amazingly patient for someone whose project blew up into this huge thing. I wish I could match that.

                            3. 2

                              As someone who uses customize almost exclusively to configure my Emacs environment, I’m curious why none of the popular Emacs enhancement frameworks use it. I found this comment in the Doom repo—does anyone have insight into what they mean here?

                              ;; Doom doesn't support `customize' and it never will. It's a clumsy interface
                              ;; that sets variables at a time where it can be easily and unpredictably
                              ;; overwritten. Configure things from your $DOOMDIR instead.
                              
                              1. 1

                                AFAIK a lot of people do not like that customize writes code into your init file, which makes it (slightly) harder to version. Doom is opinionated, so I guess they decided they weren’t interested in preserving that mode of configuring.
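
                                For context, this is roughly the kind of block `customize` appends to your init file (the variables here are just examples):

```elisp
;; Generated by Emacs's `customize' interface.  Hand edits inside
;; this form can be silently overwritten the next time customize saves.
(custom-set-variables
 '(column-number-mode t)
 '(tab-width 4))
```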

                              2. 1

                                Not sure - but I will say that the community on the Doom Emacs Discord is very friendly and very helpful, and also active. So if you ask a question you’ll probably get an answer fairly quickly.

                                1. 7

                                  That might be the case, my issue is mainly that it is organized on Discord, which IMO shouldn’t be used for free software projects.

                                  1. 4

                                    I honestly, actively do not care about that criterion.

                                    1. 1

                                      What do you mean? The usage of Discord per se or the usage of Discord by Free Software communities?

                                      1. 1

                                        I’m not going to derail this thread with my opinion on that.

                                        1. 1

                                          Ok :/

                                2. 1

                                  I don’t know, but I have yet to find as friendly a group to ask dumb newbie questions as that one. And some of my questions are, sadly, still dumb newbie questions :)

                                  1. 2

                                    Out of curiosity, have you ever tried the help-gnu-emacs mailing list? There are all kinds of questions posted there all the time, from total beginners to Elisp developers.

                                1. 19

                                  Going to write a blog post on my personal tech stack. OpenBSD, relayd, Golang, and YAML. Also known as the ORGY stack.

                                  1. 20

                                    If you make a distributed solution on top of that, would it then become a clusterfuck?

                                    1. 2

                                      Distributed solution to flat files, has to be NFS right? shudder

                                    2. 4

                                      I’m interested in reading it, do you mind sharing the link to your post once you publish it?

                                      1. 2

                                        I’ve posted it here

                                        1. 1

                                          Thanks!

                                        2. 2

                                          Yeah of course, will do

                                          Edit: I actually finished the example application at least; I’ll publish the code for it too, of course, but if you’re interested in a semi-early-bird-alpha-preview it’s here

                                        3. 3

                                          Please post the link when you’re finished. I’d love to read this implementation.

                                          1. 1

                                            I’ve posted it here; sorry I missed tagging you.

                                            1. 1

                                              No problem. Thank you very much.

                                          2. 2

                                            What are you using Relayd for?

                                            1. 2

                                              TLS termination and other proxy stuff

                                              1. 1

                                                other proxy stuff

                                                I’m intending to switch (back) to OpenBSD once my current VPS contract expires, and one of the things I want to figure out is how to implement a reverse HTTP proxy using Relayd. So all requests to foo.example.com are directed to localhost:9090, and all requests to bar.example.com are directed to localhost:9091. I didn’t manage to get it working last time I tried, so I hope your posts might give me some hints.

                                                1. 2

                                                  Yep, 100% possible with not that much config. This specific post will show how to do that, but I am also planning a “how everything on this website hangs together” post which will be more in-depth on that subject. I’ll ping you once I’ve written it
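
                                                  For what it’s worth, an untested relayd.conf sketch of that name-based setup (the address macro, table names, and protocol name are made up):

```
ext_ip = "192.0.2.1"    # your public address (example value)

table <foo> { 127.0.0.1 }
table <bar> { 127.0.0.1 }

http protocol "sites" {
	# route on the Host header; each match forwards to its table
	match request header "Host" value "foo.example.com" forward to <foo>
	match request header "Host" value "bar.example.com" forward to <bar>
}

relay "https" {
	listen on $ext_ip port 443 tls
	protocol "sites"
	forward to <foo> port 9090
	forward to <bar> port 9091
}
```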

                                                  1. 1

                                                    Great, looking forward to that!

                                                    1. 2

                                                      I’ve posted it here

                                                      1. 1

                                                        Thank you!

                                            2. 2

                                              I always thought that Go on OpenBSD would be solid. Do you use pledge or unveil in your Go application? Or does relayd pretty much take care of that?

                                              P.S. I’m pretty green with OpenBSD, etc.

                                              1. 2

                                                Go has support for it (unveil, pledge), and there are additional packages that wrap it, such as suah.dev/protect, that allow transparent use on non-OpenBSD systems.

                                              2. 1

                                                I assume if you mess things up, it becomes the GORY stack.

                                              1. 5

                                                About two years ago I had a fight with my Linux laptop and decided that was it. I got a MacBook Air, an iPhone, signed up for iCloud, and I have had effectively zero computer or phone troubles since. Maybe it’s just that I’m old now, but I have literally no patience for “fiddling” with my computer any more.

                                                1. 2

                                                  That is the complete opposite of what I have experienced. I am forced to use OSX at work and I often feel like throwing the damned iThing out of the window. So many glitches, so many idiosyncrasies. So many bugs and things that don’t work. So much crap people accept because Apple says so. It’s just absurd.

                                                  If I need to provision a new workstation I can install Linux Mint in 10 min, pull my very basic setup script, and be ready to go in 15 min, including installing my preferred tools and my configs.

                                                  1. 2

                                                    Well, that is their deal, isn’t it? Resign your (software) freedoms and we will take care of you.

                                                    1. 5

                                                      Yeah, I guess I’m just no longer certain I gave up anything of value. Like, thinking back, I’m not sure I ever actually benefited from running a FOSS operating system. I suppose I benefited (and still do) from the existence of desktop Linux, to the extent that it provides a check on Microsoft and Apple, but even then, I’m uncertain.

                                                      1. 4

                                                        I’m not sure I ever actually benefited from running a FOSS operating system

                                                        That is a real problem, I think. Free Software operating systems should make it as easy and accessible as possible to use the freedoms they formally grant their users, but sadly a lot of the software infrastructure does not encourage it. You end up with all the downsides of FOSS (lack of resources for “professional” development), and none of the up-sides.

                                                        1. 1

                                                          Part of the problem, I think, is fairly mundane: extension points have become the norm for software (and to some extent, at least in the Unix world, always were). I don’t need to modify the source code when I can simply write a script of some kind to scratch the itch I want scratched. In this sense, the FOSS world has started to fall behind, with, for example, Gnome changing its extension APIs all the time (at least for a while there, not sure what the situation is like now). Apple, on the other hand, has maintained a pretty impressive level of backward compatibility in both Automator and AppleScript, in addition to shipping standard tools like Bash.

                                                  1. 10

                                                    Interesting that the discussion around this all sort of seems to assume that the maintainers are choosing not to fix bugs and therefore must be bad at running their projects.

                                                    That’s probably true pretty often. But I think it’s also often true that open-source projects have a small, more-or-less-fixed time budget for maintenance that does not expand at the same rate as the growth of the user base, whereas the volume of bug reports does expand along with the user base. Over the years I’ve frequently been surprised to learn that some piece of software I use constantly was the product of one person working solo and it wasn’t even their day job.

                                                    If I publish something I wrote in my spare time to scratch an itch, and it happens to get popular enough to generate more bug reports than I can handle in the few hours a week I’m able to spend on the thing, what are my options, other than quitting my job and working full-time on bug fixing? Do I delete the popular repo that people are finding useful enough to file lots of bug reports about, in the name of reducing the number of unhealthy projects in the world? Do I just let the bug list expand to infinity knowing I’ll never have time to clear the backlog? Do I spend my project time reviewing historical bug reports rather than working on the code?

                                                    In theory, of course, a project that gets bug reports should have lots of people volunteering to fix the bugs. But the quantity and quality of community contributions seems to vary dramatically from one project to the next.

                                                    1. 23

                                                      It’s not about not fixing bugs. That’s expected, normal, fine. The maintainer does not owe me labour. Closing issues that aren’t fixed is just an FU to the users, though. Let the issue stay until someone fixes it, hopefully someone affected by it.

                                                      1. 7

                                                        Exactly this. I was raised in a world where SFTW or RTFM was an acceptable response, and so put effort into providing good issues. Having a bot drive by and close/lock my still open issue is a slap in the face with a wet fish.

                                                        1. 1

                                                          SFTW

                                                          I’ve never seen this acronym before? What does it mean?

                                                          1. 1

                                                            It was meant to be STFW, but I can’t type.

                                                            1. 2

                                                              Search The Fucking Web

                                                              (I had still never seen this one and had to look it up … by searching the web)

                                                              1. 3

                                                                Ironically this becomes more difficult as the GitHub bug pages show up first and are locked/closed due to being old.

                                                      2. 8

                                                        What’s the downside of letting the open bug list expand?

                                                        1. 14

                                                          Eh. Caveats first:

                                                          1. I’m not a fan of simple auto-closing stale bots (roughly: I think the incentives are misaligned, and ~stale is a not-quite-right proxy of a few real problems, and it is trying to work around some limitations of using the GH issue system for triage)

                                                          2. but I am a big fan of having a way to punt unactionable issues back to the reporter, and let automation close the issue if the reporter never responds or return it to the triage queue if they do

                                                          That out of the way: in a busy long-term project, a stack of open bug reports comes with ongoing logistical overhead. GH’s UI/X exacerbates this, but some of it just comes with the territory.

                                                          • This cost is an aggregate of: maintainer time spent managing the requests, the opportunity-cost of that time, the motivation it saps from the maintainers as they spend more time doing paperwork and less time making electrons dance, and the despair that comes from feeling obliged to play Sisyphus on an ever-growing mountain of reports (or the guilt that comes with having to ignore the reports).
                                                          • Each person helping with bug triage will be paying a pretty similar cost unless you have tools/processes for de-duplicating logistical effort (unlikely for most volunteer efforts).
                                                          • This cost is somewhat proportional to the amount of time each triager is able to invest. If you have 100 contributors doing 1 hour of triage a week, there’s going to be a lot more duplicate effort in the reports they read than if you have 2 people doing 50 hours of triage a week.
                                                          • As the pile grows, efficiently pairing people with reports that are the best use of their time will get both harder and more important to your success.
                                                          • At scale, the high-leverage maintainers most-likely to have the authority (whether literally the permission, or just the reputational capital) to go around closing reports manually will usually have more critical things to invest their time in. Those the task actually falls to aren’t as likely to be empowered to weed the garden.
                                                          • Unless you’re willing to declare bug-bankruptcy or use automation (i.e., ideally #2 above–though obviously also a stale-bot), weeding out old dead-weight reports with the same care/consideration you’d usually run triage with can be (ironically?) a massive ongoing timesink in its own right.

                                                          Reports don’t have to be bad or obsolete to contribute to this cost. 10 or 20 brilliant suggestions may be an asset if you can find time to implement 1 a year for the next 5 years. 1000 brilliant suggestions may be an albatross if the extra logistical overhead fritters away the time and motivation you need to implement one.

                                                          It doesn’t, of course, apply to all reports equally. It’s better to have a single long-lived “darkmode pls” issue with gobs of comments and reactions than to deal with someone opening a new one every few days.

                                                          1. 7

                                                            Similar to not cleaning up your house. Increased cognitive load and ugliness.

                                                            1. 8

                                                              This. I am constantly surprised how many people dismiss the paralyzing effect of 2.4k open issues, even if they all make sense in some way. Resources are limited.

                                                              1. 8

                                                                Wouldn’t auto-closing issues be like just hiding everything under the bed in this analogy?

                                                                This all looks like the consequence of bad tooling to me.

                                                                1. 1

                                                                  Yes. The issue trackers we have need the following features.

                                                                  1. accept issues
                                                                  2. comments on issues
                                                                  3. occasional attachments to issues (need? Maybe not)
                                                                  4. assigning people to issues
                                                                  5. lifecycles as makes sense for your team and organization (open, closed, or on the other extreme: open, ready, in work, in testing, merged, in production, closed)
                                                                  6. tags on issues for classifications
                                                                  7. filtering on reporter, tag, assignee, and status, including default filtering.

                                                                  You do that, you fit 99% of people’s needs.

                                                                Imagine if GitHub had a tag “maintainers will not address” and the automatic filter (currently set to show only issues whose lifecycle is at “open”) instead showed “open & not(tag:will-not-address-but-might-merge)”

                                                                  People would be happier to tune that automatic filter for their own needs, and this would allow a tag to banish the issue from a maintainer’s headspace.
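
                                                                For what it’s worth, GitHub’s existing issue search can approximate that default filter today, assuming maintainers apply such a label consistently (the label name here just mirrors the hypothetical one above):

```
is:issue is:open -label:will-not-address
```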

                                                              2. 5

                                                                Mostly that there will be a certain number of old bugs that get fixed or rendered irrelevant as a side effect of some other change, and searching through the bug database will become less useful if a sizable portion of the bug reports reflect an obsolete reality.

                                                                I’ve occasionally run into this on existing projects, in fact: I will search for some error message I’m seeing, find a bug report with some discussion that includes workarounds people used, and only after wasting time trying those workarounds and digging into why they’re not working for me do I discover that the bug they were talking about is in a part of the code that no longer even exists.

                                                                Of course, in those kinds of cases, issues being auto-closed by a stale bot wouldn’t have helped much; I’d have still found them and probably still tried the obsolete workarounds. I don’t have a perfect answer for this either!

                                                                1. 8

                                                                  I haven’t heard a good case for a stale-bot. In the article it says “to reduce the number of open tickets”, but it doesn’t explain why that is a good thing – the number could be reduced to zero by not having a bug tracker at all!

                                                                  Perhaps it is a UI thing, and Github should display the number of “active” open tickets, for some activity metric.

                                                                  1. 4

                                                                    I turned a stale bot on for a popular project because I thought it would be a more honest communication of the likely fate of old GitHub reports - “this is stale and we aren’t planning to triage or address it further”. Few others perceived it that way and, after a few months, the bot was removed.

                                                                    The triage work is substantial - we spend easily 120 hours a week on community question/support/defect/feature-request triage. I’d love to do more but of course there are fixed dollars and hours to allocate across many priorities.

                                                                    Anyway, that was my reasoning, which turned out to be incorrect.

                                                            1. 4

                                                              This is a time of change for the internet and for Mozilla. From combatting a lethal virus and battling systemic racism to protecting individual privacy — one thing is clear: an open and accessible internet is essential to the fight.

                                                              Literally none of that is the job of the browser. The job of the browser is to render web pages. The job of the open source browser is to render web pages using a completely open stack that anyone can modify and distribute without requiring permission. Individual privacy is a nice-to-have, but one that’s best solved in addons or simple patches. Something that Mozilla has made between impossible and very hard for third parties to do.

                                                              Mozilla is killing Firefox.

                                                              Looking at what’s happened to it over the last 10 years is like giving a rock guitar to your father when you leave for university and coming back to a banjo.

                                                              People completely out of touch, making decisions about products they don’t understand, because the people who they want to work for did it that way. I don’t understand the parts of open source that want to be corporate, just go work for chrome if you want to be them so badly.

                                                              1. 7

                                                                Literally none of that is the job of the browser.

                                                                What you quoted was a statement about Mozilla, which has a broader societal mission than just “render web pages”, and historically has used the browser as one way to advance that mission.

                                                                1. 5

                                                                  The fact that they spun out the Rust foundation but not the Firefox foundation should tell you all you need to know about their societal mission. After all, one was making over 95% of their net revenue over the past decade; the other was a rounding error.

                                                                  Mozilla is a zombie org propped up by Google cash for the sole purpose of being pointed at when Google gets asked about being a monopoly in the browser space.

                                                                  The rest of it is culture war bullshit to give the people involved plausible deniability for running one of the crown jewels of open source into the ground for their 30 pieces of silver.

                                                                  1. 2

                                                                    It mostly just implies, to me, that they finally got around to making it official after trying as hard as they could to imply it for years. Here’s what Graydon Hoare said about it back in 2011:

                                                                    So far I’ve stated a preference for an independent branding / identity since programming languages tend to do best when they have the broadest possible community of their own, not a single promoting organization or company. If this is problematic I’m happy to discuss further, but this is why (for example) we have independent domain names (rust-lang.org and such) rather than sub-brands of “mozilla”.

                                                                    A lot of people simply won’t trust languages until they’re clearly outside the control (or even direct-affiliation) of a single organization; say, subject to an international standard (ISO/ECMA or such). We don’t brand JS as “Javascript (by mozilla)”, and rightly so: everyone would steer clear of it. Even if we own the trademark. Choosing to write code in a language is a big bet on the language’s continuity and strategic development, and requires the perception (and reality) of a community larger than a single organization.

                                                                    IOW, it’s essential to the project’s longevity that mozilla ceases to control it.

                                                                    1. 2

                                                                      Everything does better when it is run by an org specifically dedicated to it.

                                                                      Linux is better than XNU because one has the Linux foundation. GCC is worse than LLVM because one is part of the FSF.

                                                                      Firefox with a Firefox foundation would be a better browser and might even still exist in 10 years time. Firefox as part of Mozilla, less so.

                                                                2. 3

                                                                  It always puzzles me when people complain about Firefox chasing Chrome, as if that were somehow a new development. Firefox has been chasing the flavour-of-the-month popular browser since before Chrome existed and they were copying Internet Explorer 6. That’s the whole point of Firefox — to be like the popular browser but more wholesome, to give users a practical choice.

                                                                  If you ever thought the point of Firefox was something else, you were misled.

                                                                  1. 5

                                                                    There is a difference between copying features and copying corporate press releases. I don’t see why 2/3rds of the staff of Mozilla seem to be hired for the latter.

                                                                    1. 5

                                                                      Also, seriously, not all features are worth copying. Back in the pre-Chrome days, and even afterwards, Firefox gained lots of users because it had all of the good parts of IE (and some of the good parts of Opera :-( ) and none, or very few, of the bad parts. Getting the bad parts in might make amateur product managers happy, because the feature checklists now look the same, but it won’t make the software better.

                                                                      If you just try to copy all of Chrome, without the kind of money that funds Chrome development, all you get is a cheap Chrome knock-off.

                                                                    2. 3

                                                                      In the days of IE, Firefox was faster, more stable, and had Adblock and other extensions. It was the popular browser, not like a popular browser.

                                                                      1. 3

                                                                        The difference was that IE was holding the web back, while Chrome is “leading” the innovation. Firefox used to be pulling ahead, while now they are struggling to keep up.

                                                                      2. 2

                                                                        Yet Firefox remains the only major browser platform with good and reliable ad/javascript blockers that do not get gimped by the host platform.

                                                                        1. 3

                                                                          Yet it’s worse at it than it was a decade ago.

                                                                          Since it’s worse at stopping ads, has a smaller market share, and is in terminal decline, it sounds like Google got its money’s worth over the last ten years. And made the web more malvertising-friendly.

                                                                      1. 14

                                                                        This is pretty interesting, but I think the real test for “society-changing” applications of cryptography like this is if normal people can use it without making mistakes. In particular, the interface for restaurant owners probably needs to be basically seamless— no restaurant owner wants to have to learn the details of public key cryptography, they want to make and serve food.

                                                                        also, it doesn’t look like there’s any details on how deliveries are going to work? all I can see is a somewhat overengineered protocol for placing an order with a restaurant, for which a great decentralized system already exists: a landline phone. How would you prevent a delivery driver from jacking your food, or someone who claims to be a delivery driver from taking food?

                                                                        1. 4

                                                                          There are countless stories of people losing money because they’ve accidentally deleted their Bitcoin wallet key, or they’ve sent money to someone else and it turned out that they were a fraudulent entity. Much as I like decentralisation, centralised solutions have one big advantage: accountability. In the case of fraud, my bank can reverse transactions. If they do something illegal, they have a registered address that the police can visit and a load of recorded assets that can be seized. As a result of this, ‘know your customer’ legislation can be enforced, which makes it much easier to find the perpetrators of fraud and helps shift the liability towards banks that enable it. These things are really hard to replicate in a decentralised system.

                                                                          These aggregators do provide a few bits of value to restaurants, only some of which are captured by this:

                                                                          • They provide a single place to browse for a load of different things, which helps discovery. I’ve tried a bunch of takeaways via Deliveroo and Just Eat that I’d never have heard of otherwise.
                                                                          • They allow restaurants to outsource delivery. Unless you’re doing a lot of delivery business, paying people to deliver for you can be expensive. If you have slack times, you’re paying them anyway, whereas outsourcing it means that other restaurants can take up some of that slack.
                                                                          • They provide a reputation system. Deliveroo’s differentiating feature at launch was that they were selective in the restaurants that they’d sign up. If a restaurant gets too many bad reviews, they’re kicked off. Even Just Eat, which accepts pretty much anyone onto the platform, tracks reviews and knows that the person leaving the review actually ordered (and paid for) the food (and it was picked up by a delivery person who wasn’t affiliated with the restaurant), which makes it much harder to scam.
                                                                          • They handle refunds. I had a pizza delivery person accelerate too hard on his scooter so that my pizza was smushed into one end of the box. Just Eat handled the refund immediately. Again, knowing that I won’t have problems with refunds increases my confidence and makes it much lower risk for a customer to try a new take-away.
                                                                          • They handle all of the payments. Most restaurants can handle credit card payments in person, but doing so online requires more infrastructure. Outsourcing this reduces costs.

                                                                          The big problems with these companies are that they’re abusive to their delivery workers (Deliveroo recently had an IPO and their share price tanked immediately, in a large part due to the fact that they’re expected to be taken to court soon and end up having to pay their riders more) and they take a disproportionate cut of the price.

                                                                          In a decentralised system, there are a bunch of other questions:

                                                                          • How is the data handled privately?
                                                                          • Who is liable in case of a GDPR violation (is it the individual restaurants who opt in?)
                                                                          • How do I know a restaurant is legitimate / of decent quality?
                                                                          • If it handles matching riders with restaurants, how does it comply with employment law?
                                                                          1. 4

                                                                            Accountability is just the flip side of power abuse. If you can prevent people from signing up, reverse transactions, delete their accounts without traces, etc. you might as well do a good thing when someone asks you kindly.

                                                                            Distributed and decentralized solutions are usually created to avoid giving some instance power, and this is one example of the downside that this position grants you, but it is to a certain degree unavoidable – unless you regress on the central principle of being distributed and/or decentralized.

                                                                            That doesn’t mean it has to be all bad. Different approaches can be taken. Maybe you could have “trustworthiness indexes”, where some food critic you trust (or pay) publishes how good a restaurant is, comparable to block-lists for ad-blockers. Maybe you could have a gossip system where friends can recommend or advise against visiting a restaurant? It is difficult, but without a central authority, there is no “definitive” knowledge. But then again, “real life” is also a distributed network of humans and their relations that suffers from the same problem.

                                                                          2. 1

                                                                            “ …great decentralized system already exists: a landline phone…”

                                                                            That does not offer automation or elasticity/scale. Not trying to trivialize your assertion, but I think automation is needed by any small business with a razor-thin margin.

                                                                            “…How would you prevent a delivery driver from jacking your food, or someone who claims to be a delivery driver from taking food?”

                                                                            From the readme, they seem to rely on strong digital identity of each service provider:

                                                                            “Identity creation for resource providers is made costly with a computational proof-of-work mechanism based on the partial preimage discovery first employed by Hashcash…. Free Food requires resource providers to supply a photographic proof of identity which includes a unique symbol which is mathematically bound to the proof-of-work associated with their public key.”

                                                                            That does not prevent a verified provider from doing bad things, of course. But we can assume that bad behavior, if it happens at all, would happen only once. So the outcomes would be no different than in centralized solutions.

                                                                            “… cryptography like this is if normal people can use it without making mistakes…”

                                                                            Evolution of technologies.

                                                                            Personal hardware cryptowallets (which also incorporate password wallets) already exist. In the future they may incorporate more things (like health records, employment records, education records, asset records) – digital personalization without centralization is likely to become a thing, assuming governments allow us to have it.

                                                                            It should not be that difficult for this project to approach 2 or 3 hardware cryptowallet providers and ask them to allow their solutions to store the identity and authorization tokens for libfood-based systems. These, in a way, are tokens of trust, and have value across locations, countries, decentralized networks, etc. I hope more and more such things are done. As a person goes through their life, accumulating a good ‘verifiable resume’ is of enormous value (regardless of the industry).

                                                                            Overall this service is looking for ways to decentralize (and therefore, if you accept the leap of faith, democratize?) the discovery and integrated delivery part of the restaurant business.

                                                                            I can certainly see how this would work way beyond their modest goals, in many other businesses.

                                                                            1. 1

                                                                              also, it doesn’t look like there’s any details on how deliveries are going to work?

                                                                              Lots of local restaurants have, or could hire and staff, drivers; pizza places have been doing it for decades. This gives the restaurant owner/manager more choice and accountability, in comparison to Grubhub-type services.

                                                                              a landline phone

                                                                              I personally love the landline phone, BUT, I think the benefit over a landline phone is pretty clear to anyone who has worked a rush hour (it’s been a long time, but I remember). In this case, you’re avoiding a lot of things:

                                                                              If you have integrated payment processing (like he mentions), you’re avoiding transcription errors around card numbers. But most importantly, you’re saving time, and queueing in a different part of the system. When I call my local pizza place on a Friday night (on their landline), I sometimes can’t get through until the third or fourth try (it’s good pizza). Also, the system avoids mis-ordering or entry errors on the part of the order-taker. Finally, it gives the buyer an opportunity to double-check a placed order, or add impulse items ;). I think it’s interesting, and it’s just my perspective. Thoughts?

                                                                            1. 12

                                                                              Saint Florian (Latin: Florianus; 250 – c. 304 AD) was a Christian holy man, and the patron saint of Linz, Austria; of chimney sweeps; of soapmakers; and of firefighters. His feast day is 4 May.

                                                                              1. 9

                                                                                Dave Brubeck Day

                                                                                In case you suspected:

                                                                                In the United States, May 4 is informally observed as “Dave Brubeck Day”. In the format most commonly used in the U.S., May 4 is written “5/4”, recalling the time signature of “Take Five”, Brubeck’s best known recording.

                                                                                1. 8

                                                                                  Apparently it is also:

                                                                                  • Anti-Bullying Day
                                                                                  • Bird Day
                                                                                  • Dave Brubeck Day
                                                                                  • International Firefighters’ Day
                                                                                  • World Naked Gardening Day

                                                                                  among other things.

                                                                                  1. 7

                                                                                    These are also interesting, and none of them is a free advertisement by proxy for a megacorporation.

                                                                                    Also you may spot the origin of the Firefighters’ day in my original post.

                                                                                    1. 2

                                                                                      Also you may spot the origin of the Firefighters’ day in my original post.

                                                                                      Ah yes, didn’t notice that.

                                                                                    2. 5

                                                                                      World Naked Gardening Day

                                                                                      I’d give it a shot but it’s around 10 degrees C here and we have snow forecast…

                                                                                      Edit: I re-potted a store-bought spice plant, but I did it fully clothed …

                                                                                  1. 5

                                                                                    A related rabbit hole that goes deep

                                                                                    1. 5

                                                                                      That is more of a parallel path, as BSD awks usually don’t implement these GNU awk extensions.

                                                                                    1. 16

                                                                                      I don’t understand why people are voting this as off topic.

                                                                                      1. 11

                                                                                        People don’t realize that software is made of people? :-)

                                                                                        1. 5

                                                                                          I’m guessing it is a reaction to the language and/or mentality presented in the article, which can be perceived as “HR”/buzzword-y. I can relate to it, even if I don’t think it is off-topic.

                                                                                          1. 2

                                                                                            It uses words like “toxic” and discusses developers’ emotional states, which to some people is a signifier of the ways of the Bad Tribe™.

                                                                                          1. 4

                                                                                            Thank you for this! I totally missed out the transition, because I have an older version of Go installed, and now it seems that it’s assumed knowledge.

                                                                                            1. 1

                                                                                              Didn’t know that GitHub allowed embedding mp4 files into READMEs.

                                                                                              1. 2

                                                                                                First-class packages are the most underrated feature of Lisp. AFAIK only Perl offers them fully, but it uses a very bad syntax: globs. Most macros merely suppress evaluation, and this can be done using first-class functions. Here is my question for lispers: if you can use lex/yacc and can write a full-fledged interpreter, do you really need macros?

                                                                                                1. 7

                                                                                                  Most macros merely suppress evaluation and this can be done using first-class functions.

                                                                                                  I strongly disagree with this. Macros are not there to “merely suppress evaluation.” As you point out, they’re not needed for that, and in my opinion they’re often not even the best tool for that job.

                                                                                                  “Good” macros extend the language in unusual or innovative ways that would be very clunky, ugly, and/or impractical to do in other ways. It’s in the same vein as asking if people really need all these control flow statements when there’s ‘if’ and ‘goto’.

                                                                                                  To give some idea, cl-autowrap uses macros to generate Common Lisp bindings to C and C++ libraries using (cl-autowrap:c-include "some-header.h"). Other libraries, like “iterate” add entirely new constructs or idioms to the language that behave as if they’re built-in.
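                                                                                                  To make the “extend the language” point concrete, here is a minimal sketch of my own (a hypothetical `with-timing` macro, not from cl-autowrap or iterate) of a construct that is clunky to express with plain functions, because it has to wrap arbitrary body forms at the call site:

                                                                                                  ```lisp
                                                                                                  ;; WITH-TIMING wraps any body forms and reports how long they took.
                                                                                                  ;; A plain function can't do this without forcing every caller to
                                                                                                  ;; wrap the body in a lambda.
                                                                                                  (defmacro with-timing ((label) &body body)
                                                                                                    (let ((start (gensym "START")))
                                                                                                      `(let ((,start (get-internal-real-time)))
                                                                                                         (multiple-value-prog1
                                                                                                             (progn ,@body)
                                                                                                           (format t "~&~a took ~,3f s~%"
                                                                                                                   ,label
                                                                                                                   (/ (- (get-internal-real-time) ,start)
                                                                                                                      internal-time-units-per-second))))))

                                                                                                  ;; Usage: the body reads like built-in syntax.
                                                                                                  (with-timing ("summing")
                                                                                                    (loop for i below 1000000 sum i))
                                                                                                  ```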

                                                                                                  Here is my question for lispers: if you can use lex/yacc and can write a full-fledged interpreter, do you really need macros?

                                                                                                  Lex/Yacc and CL macros do very different things. Lex/Yacc generate parsers for new languages that parse their input at runtime. CL macros emit CL code at compile time which in turn gets compiled into your program.

                                                                                                  In some sense your question is getting DSLs backwards. The idea isn’t to create a new language for a special domain, but to extend the existing language with new capabilities and operations for the new domain.
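                                                                                                  The compile-time nature is easy to see at the REPL: `macroexpand-1` shows the code a macro emits before the compiler ever sees it. A toy example of my own, a hypothetical `my-unless`:

                                                                                                  ```lisp
                                                                                                  ;; A macro rewrites source into other source at compile time.
                                                                                                  (defmacro my-unless (test &body body)
                                                                                                    `(if ,test nil (progn ,@body)))

                                                                                                  ;; Asking the system what the macro expands into:
                                                                                                  (macroexpand-1 '(my-unless done (print "working")))
                                                                                                  ;; => (IF DONE NIL (PROGN (PRINT "working")))
                                                                                                  ```

                                                                                                  No external lexer or parser generator is involved; the input and output are both ordinary Lisp data.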

                                                                                                  1. 1

                                                                                                    Here are examples of using lex/yacc to extend a language:

                                                                                                    1. Ragel compiles state machines to multiple languages
                                                                                                    2. Swig, which does something like autowrap
                                                                                                    3. The Babel compiler uses parsing to add features on top of older JavaScript, like async/await.

                                                                                                    I am guessing all these use lex/yacc internally. Rails uses scaffolding and provides helpers to generate JS code at compile time, something like Parenscript.

                                                                                                    The basic property of a macro is to generate code at compile time. Granted, most of these are not built into the compiler, but nothing is stopping you from adding a new pre-compile step with the help of a makefile.

                                                                                                    Code walking is difficult in Lisp as well. How would I know if an expression is a function or a macro? If I wanted to write a code highlighter in vim that highlights all macros differently, I would have a difficult time doing this by parsing alone, even though Lisp is an easy language to parse.

                                                                                                    1. 5

                                                                                                      Code walking is difficult in Lisp as well. How would I know if an expression is a function or a macro?

                                                                                                      CL-USER> (describe #'plus-macro)
                                                                                                      #<CLOSURE (:MACRO PLUS-MACRO) {1002F8AB1B}>
                                                                                                        [compiled closure]
                                                                                                      
                                                                                                      Lambda-list: (&REST SB-IMPL::ARGS)
                                                                                                      Derived type: (FUNCTION (&REST T) NIL)
                                                                                                      Documentation:
                                                                                                        T
                                                                                                      Source file: SYS:SRC;CODE;SIMPLE-FUN.LISP
                                                                                                      ; No value
                                                                                                      CL-USER> (describe #'plus-fn)
                                                                                                      #<FUNCTION PLUS-FN>
                                                                                                        [compiled function]
                                                                                                      
                                                                                                      Lambda-list: (A B)
                                                                                                      Derived type: (FUNCTION (T T) (VALUES NUMBER &OPTIONAL))
                                                                                                      Source form:
                                                                                                        (LAMBDA (A B) (BLOCK PLUS-FN (+ A B)))
                                                                                                      ; No value

                                                                                                      You underestimate the power of the dark side Common Lisp ;)

                                                                                                      In other words … macros aren’t an isolated textual tool like they are in other, less powerful, languages. They’re a part of the entire dynamic, reflective, homoiconic programming environment.

                                                                                                      1. 2

                                                                                                        I know that, but can you do the same by parsing alone, without using the Lisp runtime?

                                                                                                        1. 3

                                                                                                          I’m not sure where you’re going with this.

                                                                                                          In the Lisp case, a tool (like an editor) only has to ask the Lisp environment about a bit of syntax to check if it’s a macro, function, variable, or whatever.

                                                                                                          In the non-Lisp case, there’s no single source of information, and every tool has to know about every new language extension and parser that anybody may write.

                                                                                                          1. 1

                                                                                                            I believe their claim is that code walkers can provide programmers with more power than Lisp macros. That’s some claim, but the possibility of it being true definitely makes reading the article they linked ( https://mkgnu.net/code-walkers ) worthwhile.

                                                                                                          2. 2

                                                                                                            Yes. You’d start by building a Lisp interpreter.

                                                                                                            1. 1

                                                                                                              … a common lisp interpreter, which you are better off writing in lex/yacc. Even if you do that, each macro defines new ways of parsing code, so you can’t write a generic highlighter for loop-like macros. If you are going to write a language interpreter and parser, why not go the most generic route of lex/yacc and support any conceivable syntax?

                                                                                                              1. 5

                                                                                                                I really don’t understand your point, here.

                                                                                                                Writing a CL implementation in lex/yacc … I can’t begin to imagine that. I’m not an expert in either, but it seems like it’d be a lot of very hard work for nothing, even if it were possible, and I’m not sure it would be.

                                                                                                                So, assuming it were possible … why would you? Why not just use the existing tooling as it is intended to be used???

                                                                                                                1. 2

                                                                                                                  That’s too small of a problem to demonstrate why code walking is difficult. How about this then,

                                                                                                                  1. Count number of s-expression used in the program
                                                                                                                  2. Shows the number of macros used
                                                                                                                  3. Show number of lines generated by each macro and measure line savings
                                                                                                                  4. Write a linter which enforces stylistic choices
                                                                                                                  5. Suggest places where macros could be used for minimising code
                                                                                                                  6. Measure code complexity, coupling analysis
                                                                                                                  7. Write a lisp minifier, obfuscator
                                                                                                                  8. Find all places where garbage collection can be improved and memory leaks can be detected
                                                                                                                  9. Insert automatic profiling code for every s-expression and list out where the bottlenecks are
                                                                                                                  10. Write code refactoring tools.
                                                                                                                  11. List most used functions in runtime to suggest which of them can be optimised for speed

                                                                                                                  Ironically, the above is much easier to do with assembly.

                                                                                                                  My point is simply this: lisp is only easy to parse superficially. Writing the above will still be challenging. Writing lexers and parsers is better for code generation, and hence they are macros in the most general sense. If you are looking for power, then code walking beats macros, and that’s also doable in C.
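                                                                                                                  For what it’s worth, item 1 on that list is mechanically doable from parsing alone. A toy sketch in Python (purely illustrative, with hypothetical helper names; it says nothing about the harder items, which really do need the running image):

```python
import re

def parse_sexprs(text):
    """Parse a string of s-expressions into nested Python lists."""
    tokens = re.findall(r"\(|\)|[^\s()]+", text)

    def read(i):
        # Recursively read one form starting at token index i.
        if tokens[i] == "(":
            form, i = [], i + 1
            while tokens[i] != ")":
                sub, i = read(i)
                form.append(sub)
            return form, i + 1
        return tokens[i], i + 1

    forms, i = [], 0
    while i < len(tokens):
        form, i = read(i)
        forms.append(form)
    return forms

def count_sexprs(forms):
    """Count every compound s-expression (i.e. every list) in the tree."""
    return sum(1 + count_sexprs(f) for f in forms if isinstance(f, list))

print(count_sexprs(parse_sexprs("(defun plus-fn (a b) (+ a b))")))  # -> 3
```

                                                                                                                  Item 2 already stops being a pure parsing problem: counting macro *uses* requires knowing which symbols name macros, which is where the runtime comes back in.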

                                                                                                                  1. 1

                                                                                                                    While intriguing, it would be nice if the article spelled out the changes made with code walkers. Hearing that a program ballooned 9x isn’t impressive by itself. Without knowing about the nature of the change it just sounds bloated. (Which isn’t to say that it wasn’t valid, it’s just hard to judge without more information.)

                                                                                                                    Regarding your original point, unless I’m misunderstanding the scope of code walkers, I don’t see why it needs to be an either/or situation. Macros are a language-supported feature that do localized code changes. It seems like code walkers are not language-supported in most cases (all?), but they can do stateful transformations globally across the program. It sounds like they both have their use cases. Just as lispers talk about using macros only if functions won’t cut it, maybe you only use code walkers if macros won’t cut it.

                                                                                                                    BTW, it looks like there is some prior art on code walkers in Common Lisp!

                                                                                                                    1. 1

                                                                                                                      Okay, I understand your argument now.

                                                                                                                      I’ll read that article soon.

                                                                                                                      1. 6

                                                                                                                        “That’s two open problems: code walkers are hard to program and compilers to reprogram.”

                                                                                                                        The linked article also ends with something like that. Supports your argument given macros are both already there in some languages and much easier to use. That there’s lots of working macros out there in many languages supports it empirically.

                                                                                                                        There’s also nothing stopping experts from adding code walkers on top of that. Use the easy route when it works. Take the hard route when it works better.

                                                                                                                        1. 6

                                                                                                                          Welcome back Nick, haven’t seen you here in a while.

                                                                                                                          1. 4

                                                                                                                            Thank you! I missed you all!

                                                                                                                            I’m still busy (see profile). That will probably increase. I figure I can squeeze a little time in here and there to show some love for folks and share some stuff on my favorite, tech site. :)

                                                                                                              2. 1

                                                                                                                That kind of is the point. Lisp demonstrates that there is no real boundary between the language as given and the “language” its users create by extending it with new functions and macros. That being said, good lisp usually follows conventions that let you recognize whether something is a macro (eg. with-*) or not.

                                                                                                            2. 1

                                                                                                              Here are examples of using lex/yacc to extend a language

                                                                                                              Those are making new languages, as they use new tooling, which doesn’t come with existing tooling for the language. If someone writes Babel code, it’s not JavaScript code anymore - it can’t be parsed by a normal JavaScript compiler.

                                                                                                              Meanwhile, Common Lisp macros extend the language itself - if I write a Common Lisp macro, anyone with a vanilla, unmodified Common Lisp implementation can use them, without any additional tooling.

                                                                                                              Granted most of these are not built into the compiler but nothing is stopping you adding a new pre-compile step with the help of a make file.

                                                                                                              …at which point you have to modify the build processes of everybody that wants to use this new language, as well as breaking a lot of tooling - for instance, if you don’t modify your debugger, then it no longer shows an accurate translation from your source file to the code under debugging.

                                                                                                              If I wanted to write a code highlighter in vim that highlights all macros differently I would have a difficult time doing this by parsing alone even though lisp is an easy language to parse.

                                                                                                              Similarly, if you wanted to write a code highlighter that highlights defined functions differently without querying a compiler/implementation, you couldn’t do it for any language that allows a function to be bound at runtime, like Python. This isn’t a special property of Common Lisp, it’s just a natural implication of the fact that CL allows you to create macros at runtime.
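                                                                                                              A quick, purely illustrative sketch of that Python point (the name `plus` is hypothetical, not from any real codebase):

```python
# A function bound at runtime: its name never appears in a `def`
# statement, so a purely static highlighter has no way to know that
# `plus` names a function without actually executing the module.
import operator

name = "plus"
globals()[name] = operator.add  # the binding only exists at runtime

print(plus(2, 3))  # -> 5
```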

                                                                                                              Meanwhile, you could capture 99.9%+ of macro definitions in CL (and function definitions in Python) using static analysis - parse code files into s-expression trees, look for defmacro followed by a name, add that to the list of macro names (modulo packages/namespacing).
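                                                                                                              A minimal sketch of that static scan, written in Python to stay implementation-neutral (the `plus-macro`/`plus-fn` names just echo the describe example upthread):

```python
import re

def find_defmacro_names(source):
    """Rough static pass over Lisp source text: tokenize parens and
    symbols, then record the symbol following any `defmacro` token.
    It misses macros created at runtime and ignores packages, i.e.
    exactly the small remainder conceded above."""
    tokens = re.findall(r"\(|\)|[^\s()]+", source)
    return [tokens[i + 1]
            for i, tok in enumerate(tokens)
            if tok.lower() == "defmacro" and i + 1 < len(tokens)]

src = "(defmacro plus-macro (a b) `(+ ,a ,b)) (defun plus-fn (a b) (+ a b))"
print(find_defmacro_names(src))  # -> ['plus-macro']
```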

                                                                                                              tl;dr “I can’t determine 100% of source code properties using static analysis without querying a compiler/implementation” is not an interesting property, as all commonly used programming languages have it to some extent.

                                                                                                              1. 1

                                                                                                                If you can use lex / yacc and can write a full fledged interpreter do you really need macros ?

                                                                                                                I don’t know why you’d think they are comparable. The amount of effort to write a macro is way less than the amount of effort required to write a lexer + parser. The fact that macros are written in lisp itself also reduces the effort needed. But most importantly, one is an in-process mechanism for code generation and the other one involves writing the generated code to a file. The first mechanism makes it easy to iterate on and modify the generated code. Given that most of the time you are maintaining, hence modifying, code, I’d say that is a pretty big difference.

                                                                                                                The babel compiler uses parsing to add features on top of older javascript like async/await.

                                                                                                                Babel is an example of how awful things can be when macros happen out of process. The core of Babel is a macro system + a pluggable reader.

                                                                                                                I am guessing all these use lex/yacc internally.

                                                                                                                Babel certainly doesn’t. When it started it used estools which used acorn iirc. I think nowadays it uses its own parser.

                                                                                                                Rails uses scaffolding and provides helpers to generate js code compile time. Something like parenscript.

                                                                                                                I have no idea why you think scaffolding is like parenscript. The common use case for parenscript is to do the expansion on the fly, not to generate the initial boilerplate.

                                                                                                                Code walking is difficult in lisp as well.

                                                                                                                And impossible to write in portable code, which is why most (all?) implementations come with a code-walker you can use.

                                                                                                                1. 1

                                                                                                                  If syntax is irrelevant, why even bother with Lisp? If I just stick to using arrays in the native language, I can also define functions like this and extend the array language to support new control flow structures:

                                                                                                                  ["begin",
                                                                                                                      ["define", "fib",
                                                                                                                          ["lambda", ["n"],
                                                                                                                              ["cond", [["eq", "n", 0], 0],
                                                                                                                                       [["eq", "n", 1], 1],
                                                                                                                                       ["T", ["+", ["fib", ["-", "n", 1]], ["fib", ["-", "n", 2]]]] ]]],
                                                                                                                      ["fib", 6]]
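
                                                                                                                  For what it’s worth, that array encoding is runnable with a small evaluator. A toy Python sketch (illustrative only, error handling omitted), which of course just re-implements a tiny Lisp interpreter without macros:

```python
def evaluate(expr, env):
    """Evaluate the JSON-array Lisp encoding from the comment above."""
    if isinstance(expr, int):          # numeric literal
        return expr
    if isinstance(expr, str):          # variable reference
        return env[expr]
    op = expr[0]
    if op == "begin":                  # sequence, return the last value
        result = None
        for form in expr[1:]:
            result = evaluate(form, env)
        return result
    if op == "define":                 # bind a name in the current env
        env[expr[1]] = evaluate(expr[2], env)
        return None
    if op == "lambda":                 # build a closure over env
        params, body = expr[1], expr[2]
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    if op == "cond":                   # first true clause wins
        for test, branch in expr[1:]:
            if evaluate(test, env):
                return evaluate(branch, env)
        return None
    fn = evaluate(op, env)             # ordinary function application
    return fn(*(evaluate(a, env) for a in expr[1:]))

env = {"T": True,
       "eq": lambda a, b: a == b,
       "+": lambda a, b: a + b,
       "-": lambda a, b: a - b}

fib_program = ["begin",
    ["define", "fib",
        ["lambda", ["n"],
            ["cond", [["eq", "n", 0], 0],
                     [["eq", "n", 1], 1],
                     ["T", ["+", ["fib", ["-", "n", 1]],
                                 ["fib", ["-", "n", 2]]]]]]],
    ["fib", 6]]

result = evaluate(fib_program, env)
print(result)  # -> 8
```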
                                                                                                                  
                                                                                                                2. 1

                                                                                                                  Well, if your question is “Would you prefer a consistent, built-in way of extending the language, or a hacked together kludge of pre-processors?” then I’ll take the macros… ;-)

                                                                                                                  Code walking is difficult in lisp as well. How would I know if an expression is a function or a macro? If I wanted to write a code highlighter in vim that highlights all macros differently, I would have a difficult time doing it with pure code walking alone, even though lisp is an easy language to parse.

                                                                                                                  My first question would be whether or not it makes sense to highlight macros differently. The whole idea is that they extend the language transparently, and a lot of “built-in” constructs defined in the CL standard are macros.

                                                                                                                  Assuming you really wanted to do this, though, I’d suggest looking at Emacs’ Slime mode. It basically lets the CL compiler do the work. It may not be ideal, but it works, and it’s better than what you’d get using Ragel, Swig, or Babel.

                                                                                                                  FWIW, Emacs, as far as I know (and as I have it configured), only highlights symbols defined by the CL standard and keywords (i.e. :foo, :bar), and adjusts indentation based on cues like “&body” arguments.

                                                                                                                  1. 1

                                                                                                                    Btw there is already a syntax highlighter that uses a code walker and treats macros differently. The code walker may not be easy to write, but it can hardly be said that it is hard to use.

                                                                                                                    https://github.com/scymtym/sbcl/blob/wip-walk-forms-new-marco-stuff/examples/code-walking-example-syntax-highlighting.lisp

                                                                                                              2. 5

                                                                                                                Yes, you absolutely want macros even if you Lex/Yacc and interpreters.

                                                                                                                Lex/Yacc (and parsers more generally), interpreters (and “full language compilers”), and macros all have different jobs at different stages of a language pipeline. They are complementary, orthogonal systems.

                                                                                                                Lex/Yacc are for building parsers (and aren’t necessarily the best tools for that job), which turn the textual representation of a program into a data structure (a tree). Every Lisp has a parser, for historical reasons usually called a “reader”. Lisps always have s-expression parsers, of course, but often they are extensible so you can make new concrete textual notations and specify how they are turned into a tree. This is the kind of job Lex and Yacc do, though extended s-expression parsers and lex/yacc parsers generally have some different capabilities in terms of what notations they can parse, how easy it is to build the parser, and how easy it is to extend or compose any parsers you create.

                                                                                                                Macros are tree transformers. Well, M4 and C-preprocessor are textual macro systems that transform text before parsing, but that’s not what we’re talking about. Lisp macros transform the tree data structure you get from parsing. While parsing is all about syntax, macros can be a lot more about semantics. This depends a lot on the macro system – some macro systems don’t allow much more introspection on the tree than just what symbols there are and the structure, while other macro systems (like Racket’s) provide rich introspection capabilities to compare binding information, allow macros to communicate by annotating parts of the tree with extra properties, or by accessing other compile-time data from bindings (see Racket’s syntax-local-value for more details), etc. Racket has the most advanced macro system, and it can be used for things like building custom DSL type systems, creating extensible pattern matching systems, etc. But importantly, macros can be written one at a time as composable micro-compilers. Rather than writing up-front an entire compiler or interpreter for a DSL, with all its complexity, you can get most of it “for free” and just write a minor extension to your general-purpose language to help with some small (maybe domain-specific) pain point. And let me reiterate – macros compose! You can write several extensions that are each oblivious to each other, but use them together! You can’t do that with stand-alone language built with lex/yacc and stand-alone interpreters. Let me emphatically express my disagreement that “most macros merely suppress evaluation”!

                                                                                                                Interpreters or “full” compilers then work after any macro expansion has happened, and again do a different, complementary job. (And this post is already so verbose that I’ll skip further discussion of it…)

                                                                                                                If you want to build languages with Lex/Yacc and interpreters, you clearly care about how languages allow programmers to express their programs. Macros provide a lot of power for custom languages and language extensions to be written more easily, more completely, and more compositionally than they otherwise can be. Macros are an awesome tool that programmers absolutely need! Without using macros, you have to put all kinds of complex stuff into your language compiler/interpreter or do without it. Eg. how will your language deal with name binding and scoping, how will your language order evaluation, how do errors and error handling work, what data structures does it have, how can it manipulate them, etc. Every new little language interpreter needs to make these decisions! Often a DSL author cares about only some of those decisions, and ends up making poor decisions or half-baked features for the other parts. Additionally, stand-alone interpreters don’t compose, and don’t allow their languages to compose. Eg. if you want to use 2+ independent languages together, you need to shuttle bits of code around as strings, convert data between different formats at every boundary, maybe serialize it between OS processes, etc. With DSL compilers that compile down to another language for the purpose of embedding (eg. Lex/Yacc are DSLs that output C code to integrate into a larger program), you don’t have the data shuffling problems. But you still have issues if you want to eg. write a function that mixes multiple such DSLs. In other words, stand-alone compilers that inject code into your main language are only suitable for problems that are sufficiently large and separated from other problems you might build a DSL for.

                                                                                                                With macro-based embedded languages, you can sidestep all of those problems. Macro-based embedded languages can simply use the features of the host language, maybe substituting one feature that it wants to change. You mention delaying code – i.e. changing the host language’s evaluation order. This is only one aspect of the host language out of many you might change with macros. Macro extensions can be easily embedded within each other and used together. The only data wrangling at boundaries you need to do is if your embedded language uses different, custom data structures. But this is just the difference between two libraries in the same language, not like the low-level serialization data wrangling you need to do if you have separate interpreters. And macros can tackle problems as large as “I need a DSL for parsing” like Yacc down to “I want a convenience form so I don’t have to write this repeating pattern inside my parser”. And you can use one macro inside another with no problem. (That last sentence has a bit of ambiguity – I mean that users can nest arbitrary macro calls in their program. But also you can use one macro in the implementation of another, so… multiple interpretations of that sentence are correct.)

                                                                                                                To end, I want to comment that macro systems vary a lot in expressive power and complexity – different macro systems provide different capabilities. The OP is discussing Common Lisp, which inhabits a very different place in the “expressive power vs complexity” space than the macro system I use most (Racket’s). Not to disparage the Common Lisp macro system (they both have their place!), but I would encourage anyone not to come to conclusions about what macros can be useful for or whether they are worthwhile without serious investigation of Racket’s macro system. It is more complicated, to be certain, but it provides so much expressive power.

                                                                                                                1. 4

                                                                                                                  I mean, strictly, no - but that’s like saying “if you can write machine code, do you really need Java?”

                                                                                                                  (Edited to add: see also Greenspun’s tenth rule … if you were to build a macro system out of such tooling, I’d bet at least a few pints of beer that you’d basically wind up back at Common Lisp again).

                                                                                                                  1. 2

                                                                                                                    First-class packages are the most underrated feature of lisp. AFAIK only perl offers it fully

                                                                                                                    OCaml has first-class modules: https://ocaml.org/releases/4.11/htmlman/firstclassmodules.html

                                                                                                                    I’m a lot more familiar with them than I am with CL packages though, so they may not be 100% equivalent.

                                                                                                                    1. 2

                                                                                                                      I’m not claiming to speak for all lispers, but the question

                                                                                                                      Here is my question for lispers, If you can use lex / yacc and can write a full fledged interpreter do you really need macros ?

                                                                                                                      might be misleading. Obviously you don’t need macros, and everything could be done some other way, but macros are easy to use while also being powerful, and they can be created dynamically or restricted to a lexical scope. I’ve never bothered to learn lex/yacc, so I might be missing something.

                                                                                                                    1. 2

                                                                                                                      Slightly unfortunately named: https://github.com/jordansissel/fpm

                                                                                                                        1. 2

                                                                                                                          If you ask me, “Fortran” seems more intuitive than “Effing”.

                                                                                                                        1. 2

                                                                                                                          TIL Common Lisp has aliases for car and cdr: first and rest, respectively. I guess this was added as a minor change to be more approachable? Seems like kind of a pointless feature, to be honest.

                                                                                                                          1. 5

                                                                                                                            I’ve often heard that first (second, third, …) and rest should be used by default and that c[ad]*r are just part of CL’s heritage, and are provided as legacy functions.

                                                                                                                            1. 8

                                                                                                                              After seeing how much my TLA+ students struggled with using /\ instead of &&, I’ve come to the opinion that any unnecessary naming differences should be ruthlessly purged.

                                                                                                                              1. 3

                                                                                                                                That struggle only lasts for a fraction of a generation — later generations may be confused why there’s & and && instead of and.

                                                                                                                                1. 1

                                                                                                                                  As someone who somewhat-recently was a TLA+ student myself: the difference between those two is not unnecessary, because it avoids the classic student problem of “these symbols look similar, therefore the ideas must be similar” - /\ in TLA+ is different than && in JavaScript and C. It was very helpful for me to use the former instead of the latter.

                                                                                                                                2. 8

                                                                                                                                  Not really. first, rest, etc. should be used when dealing with lists, but cons cells have more uses than constructing lists (deques, trees, alists, etc.). In those scenarios it is preferred to use car/cdr.

                                                                                                                              1. 14

                                                                                                                                Would have been nice to mention the type builtin, at least for bash, that helps newcomers distinguish between different kinds of commands:

                                                                                                                                $ type cat
                                                                                                                                cat is /usr/bin/cat
                                                                                                                                $ type cd
                                                                                                                                cd is a shell builtin
                                                                                                                                $ type ls
                                                                                                                                ls is aliased to `ls -Fh'
                                                                                                                                
                                                                                                                                1. 5

                                                                                                                                  Wow, I’ve been using Unix for most of my computing life (30 years?) and I didn’t know about type.

                                                                                                                                  1. 1

                                                                                                                                    It is great for finding duplicates in your PATH: type -a <name> shows you all the places where it exists.
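A minimal sketch of that duplicate check, assuming bash and that cat is installed somewhere on PATH:

```shell
# List every match for a name in lookup order: aliases, functions,
# builtins, then each hit on $PATH. Two PATH entries for the same
# name means one binary is shadowing another.
type -a cat
```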

                                                                                                                                  2. 2

                                                                                                                                    I use which as opposed to type and it seems to do the exact same thing.

                                                                                                                                    1. 9

                                                                                                                                      You should use type instead. More than you ever wanted to know on why:

                                                                                                                                      https://unix.stackexchange.com/questions/85249/why-not-use-which-what-to-use-then
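As a small illustration of one difference between the two, assuming bash (the alias is just an example):

```shell
#!/usr/bin/env bash
shopt -s expand_aliases   # aliases are off by default in scripts

alias ls='ls -Fh'

# type consults the shell's own alias/function/builtin tables first,
# so it reports what will actually run:
type ls

# which is usually an external binary that only searches $PATH,
# so it misses the alias and reports the bare executable:
which ls
```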

                                                                                                                                      1. 1

                                                                                                                                        Interesting. As a long time DOS user, I expected type to behave like cat. I typically use which as if it is just returning the first result from whereis, e.g. xxd $(which foo) | vim -R -. I didn’t know about the csh aliases, because the last time I used csh was in the nineties when I thought that since I use C, surely csh is a better fit for me than something whose name starts with a B, which clearly must be related to BCPL.

                                                                                                                                        1. 1

                                                                                                                                          I did not know about type and after knowing about it for 15 seconds I almost completely agree with you. The only reason you could want to use which is to avoid complicating the readlink $(which <someprogram>) invocation on Guix or NixOS systems. That is, which is still useful in scripts that intend to use the path; type produces output of the form <someprogram> is <path to someprogram>.

                                                                                                                                          Edit: OK I followed a link from the article to some stackoverflow that goes through the whole bonanza of these scripts and I think whereis <someprogram> is probably better than readlink $(which <someprogram>).

                                                                                                                                          1. 3

                                                                                                                                            @ilmu type -p will return just the path.

                                                                                                                                            1. 2

                                                                                                                                              Two problems with whereis: 1) it’s not available everywhere, and 2) it can return more than one result, so you have to parse its output. So for that use case I’ll probably stick with which until someone points me at a simple program that does the same thing without the csh aliases.
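For what it’s worth, POSIX specifies command -v as a shell builtin that does roughly this; a sketch (the symlink-resolution step assumes GNU readlink):

```shell
# Print the path of the first match on $PATH, like which, but as a
# builtin mandated by POSIX, so it exists everywhere and involves
# no csh aliases. (For functions and builtins it prints the name.)
command -v cat

# Resolve symlinks to the real file; readlink -f is GNU coreutils
# and is not available on stock macOS.
readlink -f "$(command -v cat)"
```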

                                                                                                                                        2. 1

                                                                                                                                          Interesting. In fish shell, type gives you the full definition of the function for built-ins that are written in fish, and builtin -n lists all the builtins. There’s a surprising amount of fish code around the cd builtin.

                                                                                                                                        1. 64

                                                                                                                                          Except that, as far as I can tell, Firefox isn’t produced by a malicious actor with a history of all sorts of shenanigans, including a blatantly illegal conspiracy with other tech companies to suppress tech wages.

                                                                                                                                          Sure, if your personal threat model includes nation states and police departments, it may be worthwhile switching to Chromium for that bit of extra hardening.

                                                                                                                                          But for the vast majority of people, Firefox is a better choice.

                                                                                                                                          1. 13

                                                                                                                                            I don’t think we can meaningfully say that there is a “better” choice; web browsers are in a depressing technical situation where every decision has significant downsides. Google is obviously nefarious, but they have an undeniable steering position. Mozilla is more interested in privacy, but depends on Google, and cannot break the systems created to track and control users, because most non-technical users perceive the lack of DRM as something being broken (“Why won’t Netflix load?”). Apple and Microsoft are suspicious for other reasons. Everything else lacks the manpower to keep up with Google and/or the security situation.

                                                                                                                                            When I’m cynical, I like to imagine that Google will lead us into a web “middle age”, that might clean the web up. When I’m optimistic, I like to imagine that a web “renaissance” would manage to break off Google’s part in this redesign and result in a better web.

                                                                                                                                            1. 19

                                                                                                                                              Mozilla also has a history of doing shady things and deliberately designed a compromised sync system because it is more convenient for the user.

                                                                                                                                              Not to mention, a few years ago I clicked on a Google search result link and immediately had a malicious EXE running on my PC. At first I thought it was a popup, but no, it was a drive-by attack with me doing nothing other than opening a website. My computer was owned, only a clean wipe and reinstallation helped.

                                                                                                                                              I’m still a Firefox fan for freedom reasons but unfortunately, the post has a point.

                                                                                                                                              1. 12

                                                                                                                                                a few years ago I clicked on a […] link and immediately had a malicious EXE

                                                                                                                                                I find this comment disingenuous, because every browser on every OS had, or still has, issues with a similar blast radius. Prominent examples include hacking game consoles and other closed operating systems via the browser, all of which ship some version of the WebKit engine. Sure, those hacks were used to “open up” the system, but they could have been (and usually are) abused in exactly the way you described here.

                                                                                                                                                Also, I’m personally frustrated by people holding Mozilla to a higher standard than Google, when it really should be the opposite, given how much Google knows about each individual compared to Mozilla. Yes, it would be best if some of the linked issues could be resolved so that Mozilla can’t intercept your bookmark sync, but I gotta ask: is that a service people should really be worried about? Meanwhile, Google boasts left, right and center how your data is secure with them, and we all know what that means. Priorities, people! The parent comment is absolutely right: Firefox is a better choice for the vast majority of people, because Mozilla as a company is much more concerned about all of our privacy than Google is. Google’s goal always was and always will be to turn you into data points and make a buck off that.

                                                                                                                                                1. 1

                                                                                                                                                  your bookmark sync

                                                                                                                                                  It’s not just bookmark sync. Firefox sync synchronizes:

                                                                                                                                                  • Bookmarks
                                                                                                                                                  • Browsing history
                                                                                                                                                  • Open tabs
                                                                                                                                                  • Logins and passwords
                                                                                                                                                  • Addresses
                                                                                                                                                  • Add-ons
                                                                                                                                                  • Firefox options

                                                                                                                                                  If you are using these features and your account is compromised, that’s a big deal. If we just look at information security, I trust Google more than Mozilla with keeping this data safe. Of course Google has access to the data and harvests it, but the likelihood that my Google data leaks to hackers is probably lower than the likelihood that my Firefox data leaks to hackers. If I have to choose between leaking my data to the government or to hackers, I’d still choose the government.

                                                                                                                                                  1. 1

                                                                                                                                                    If I have to choose between leaking my data to the government or to hackers, I’d still choose the government.

                                                                                                                                                    That narrows down where you live, a lot.

                                                                                                                                                    Secondly, I’d assume that any data leaked to hackers is also available to Governments. I mean, if I had spooks with black budgets, I’d be encouraging them to buy black market datasets on target populations.

                                                                                                                                                    1. 1

                                                                                                                                                      I’d assume that any data leaked to hackers is also available to Governments.

                                                                                                                                                      Exactly. My point is that governments occasionally make an effort not to be malicious actors, whereas hackers who exploit systems usually don’t.

                                                                                                                                                2. 6

                                                                                                                                                  I clicked on a Google search result link

                                                                                                                                                  Yeah, FF is to blame for that, but also lol’d at the fact that Google presented that crap to you as a result.

                                                                                                                                                  1. 3

                                                                                                                                                    Which nicely sums up the qualitative difference between Firefox and Google. One has design issues and bugs; the other invades your privacy to sell the channel to serve up .EXEs to your children.

                                                                                                                                                    Whose browser would you rather use?

                                                                                                                                                  2. 3

                                                                                                                                                    Mozilla also has a history of doing shady things and deliberately designed a compromised sync system because it is more convenient for the user.

                                                                                                                                                    Sure, but I’d argue that’s a very different thing, qualitatively, from what Google has done and is doing.

                                                                                                                                                    I’d sum it up as “a few shady things” versus “a business model founded upon privacy violation, a track record of illegal industry-wide collusion, and outright hostility towards open standards”.

                                                                                                                                                    There is no perfect web browser vendor. But the perfect is the enemy of the good; Mozilla is a lot closer to perfect than Google, and deserves our support on that basis.

                                                                                                                                                  3. 8

                                                                                                                                                    These mitigations are not aimed at nation-state attackers; they are aimed at people buying ads that contain malicious data that can compromise your system. The lack of site isolation in Firefox means that, for example, someone who buys an ad on a random site that you happen to have open in one tab, while another tab shows your Internet banking page, can use Spectre attacks from JavaScript in the ad to extract all of the information (account numbers, addresses, last transaction) displayed in the other tab. This is typically all that’s needed for telephone banking to do a password reset if you phone the bank and say you’ve lost your credentials. These attacks are not possible in any other mainstream browser (and are prevented by WebKit2 for any obscure ones that use it, because Apple implemented the sandboxing at the WebKit layer, whereas Google hacked it into Chrome).

                                                                                                                                                    1. 2

                                                                                                                                                      Hmmmm. Perhaps I’m missing something, but I thought Spectre was well mitigated these days. Or is it that the next Spectre, whatever it is, is the concern here?

                                                                                                                                                      1. 11

                                                                                                                                                        There are no good Spectre mitigations. There’s speculative load hardening, but that comes with around a 50% performance drop so no one uses it in production. There are mitigations on array access in JavaScript that are fairly fast (Chakra deployed these first, but I believe everyone else has caught up), but that’s just closing one exploit technique, not fixing the bug and there are a bunch of confused deputy operations you can do via DOM invocations to do the same thing. The Chrome team has basically given up and said that it is not possible to keep anything in a process secret from other parts of a process on current hardware and so have pushed more process-based isolation.